A Generational Leap in Graphics [2020] *Spawn*

So you agree that RT does add positively to the game's visuals, then. It seems we're in agreement.
RT adds positively. It just doesn't seem like an effective use of resources at the moment, past some mild threshold of utilization.
Like rendering at native 4K adds positively compared to, say, 1800p + temporal reconstruction, but the latter is a much more effective use of resources.
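To put rough numbers on that analogy (this is just arithmetic, not a claim about any particular game's actual frame cost; 1800p is assumed to mean 3200x1800):

```cpp
// Back-of-the-envelope pixel counts for the native-4K vs 1800p comparison.
#include <cstdio>

int main()
{
    const double px4k    = 3840.0 * 2160.0; // ~8.29 million pixels
    const double px1800p = 3200.0 * 1800.0; // ~5.76 million pixels

    // Native 4K shades ~44% more pixels than 1800p before reconstruction.
    std::printf("4K shades %.0f%% more pixels than 1800p\n",
                (px4k / px1800p - 1.0) * 100.0);
    return 0;
}
```

So native 4K shades roughly 44% more pixels; whether that buys a proportionate gain in perceived quality is exactly the cost/benefit question being raised about RT.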

In the case of Cyberpunk, it would look a lot better and more immersive if there had been more effort put into e.g. NPC character modelling, instead of enabling a full suite of raytracing options.
Of course, the academic exercise of observing the advantages of raytracing would be lost, but that should obviously never be the developer's end goal.
CDPR must sell a videogame, and it looks like that part isn't going very well at the moment.


In what way? No one's suggested that Lumen and Nanite won't work on modern consoles because they require impractically large assets, only that this particular demo used impractically large assets, so expecting something that looks just like this across an entire game may be unrealistic.
How large were the assets in the demo, in GB? Have you seen official numbers, or is this just assumption on your part?


No one's trying to bully anyone; I'm not sure that's a particularly helpful statement. If you're telling me your interest in talking down the graphics of this game has nothing to do with it looking its best on platform/IHV hardware that differs from your preferred platform/IHV hardware, then I'll happily accept that at face value.
You know you already poisoned the well with the suggestion and accusations. Instead of trying to argue my original point (which no one has, BTW) you went straight to "fanboy warring", to the point of accusing me of writing stuff I didn't write.
I wish you hadn't. Perhaps this would have been a much more productive conversation from the start.


It means the console CPUs are dogshit Jaguar cores.
DF shows that there seems to be some inconsistencies with texture sizes and LODs on the base consoles (i.e. they're using IQ settings they shouldn't, on those GPUs).
The dogshit Jaguar cores are already responsible for the scarcity of NPCs in the streets, but they shouldn't be used to justify all the performance bugs that people have started to encounter.


Wtf are you talking about? As far as you understand the game only has RT because of Nvidia? Where are you getting this from?
RT in CP77 is only available for RTX graphics cards. You think the game would have RT at launch if CDPR hadn't made a cross-promotion deal with Nvidia?


This thread is people who haven't seen it themselves telling those who have seen it that they're mistaken about its impact.
This thread is people assuming too much.


Perhaps a dumb question, but wasn't Ark 1 considered to have a pretty terrible graphics engine? IQ was good but performance was pretty terrible across the board.
 
...
DF shows that there seems to be some inconsistencies with texture sizes and LODs on the base consoles (i.e. they're using IQ settings they shouldn't, on those GPUs).
The dogshit Jaguar cores are already responsible for the scarcity of NPCs in the streets, but they shouldn't be used to justify all the performance bugs that people have started to encounter.

RT in CP77 is only available for RTX graphics cards. You think the game would have RT at launch if CDPR hadn't made a cross-promotion deal with Nvidia?
...

The Jaguar cores are awful at this point. They were pretty bad when the consoles launched. Apple's M1 dumpsters them, so it shouldn't be surprising that M1 performance is OK. As for the game itself, yeah, it has a ton of problems, including performance issues. Better hardware can mask those problems.

As for RT support, I don't really like to get into hypothetical discussions. We don't know what they would have done without the Nvidia deal. The game may have launched with ray tracing anyway, if they had an interest in doing it. Your guess is as good as mine. The game is made with DXR, nothing proprietary. I don't know why it doesn't run on AMD GPUs. Maybe they're working on DXR 1.1 support now that the API is available. Not sure if AMD cards work better with one version of the API over the other. Again, I'm just guessing. Not sure anyone has real info there.
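For what it's worth, DXR itself is vendor-agnostic. A minimal sketch of how an engine can ask D3D12 which ray tracing tier a GPU supports, just to illustrate the capability query (not a claim about what CDPR's renderer actually does):

```cpp
// Query the DXR tier exposed by a D3D12 device.
// Tier 1.0 = TraceRay via ray generation shaders and shader tables;
// Tier 1.1 adds inline ray tracing (RayQuery usable from any shader stage).
#include <d3d12.h>

D3D12_RAYTRACING_TIER QueryRaytracingTier(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS5, &options5, sizeof(options5))))
    {
        return options5.RaytracingTier; // NOT_SUPPORTED, TIER_1_0 or TIER_1_1
    }
    return D3D12_RAYTRACING_TIER_NOT_SUPPORTED;
}
```

Whether a renderer then takes the 1.0 shader-table path or the 1.1 inline RayQuery path is a separate engineering decision, which is presumably where all the guessing above comes in.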
 
Outside of just giving developers an incentive to add more collect-a-thon and fetch quests, I'm hard pressed to see what I'm missing out on by not being able to go into every room in a high rise.

You are not missing anything. Devs aren't building "leisure exploration" open-world simulators; they are building games. A significant portion of open worlds is just fluff anyway. No aspect of development exists in a vacuum. Time and resources spent adding more to the fluff take away from investment in actual gameplay mechanics and the level building of story missions.

It's like producing a movie that offers an interactive camera and editor and then wasting a significant portion of the movie budget to build out fully realized movie sets. Unless moviegoers actually base their purchases on the ability to move around a set, like looking in the closet to see what clothes are hanging up, there is nothing to warrant spending production costs on such a feature.

It's not about what's nice or neat, but rather whether value is added for the majority of people who will buy and play the game.

Think how much cost is added to flesh out the inside of a highrise. Procedural generation may reduce the cost of design. But is there procedural game testing to reduce that cost as well?
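As a purely hypothetical sketch of why procedurally furnished interiors can be cheap on both authoring and storage (the room archetype, prop palette and seeding scheme below are all made up for illustration, not how any shipped game does it):

```cpp
// Furnish a high-rise room on demand from a deterministic seed: no per-room
// authoring and nothing stored on disk beyond the prop palette itself.
#include <cstdint>
#include <cstdio>
#include <random>
#include <string>
#include <vector>

struct Prop { std::string name; float x, y; };

// The same (building, floor, room) triple always yields the same seed,
// so a given apartment is generated identically every time it's entered.
uint64_t RoomSeed(uint32_t building, uint32_t floor, uint32_t room)
{
    return (uint64_t(building) << 40) ^ (uint64_t(floor) << 20) ^ room;
}

std::vector<Prop> FurnishRoom(uint32_t building, uint32_t floor, uint32_t room)
{
    static const char* palette[] = { "bed", "sofa", "table", "lamp", "shelf", "plant" };
    std::mt19937_64 rng(RoomSeed(building, floor, room));
    std::uniform_int_distribution<int>    count(3, 6);     // how many props
    std::uniform_int_distribution<int>    pick(0, 5);      // which prop
    std::uniform_real_distribution<float> pos(0.5f, 7.5f); // inside an 8x8 m room

    std::vector<Prop> props;
    for (int i = 0, n = count(rng); i < n; ++i)
        props.push_back({ palette[pick(rng)], pos(rng), pos(rng) });
    return props;
}

int main()
{
    for (const Prop& p : FurnishRoom(12, 34, 5))
        std::printf("%s at (%.1f, %.1f)\n", p.name.c_str(), p.x, p.y);
    return 0;
}
```

Deterministic seeding at least makes every generated room reproducible, which helps with the testing question (a bug report can name the exact room), even if it doesn't eliminate the cost of testing.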
 
RT in CP77 is only available for RTX graphics cards. You think the game would have RT at launch if CDPR hadn't made a cross-promotion deal with Nvidia?

Weren't the 6000 series released three weeks prior to Cyberpunk? And isn't the issue with the 6000 series that it doesn't accelerate as much of the RT pipeline as RTX does?

I imagine CDPR was more interested in stomping out as many bugs as possible to reduce the shitshow that's now reality instead of adding on top of it by trying to throw in RT support for AMD's newly released cards.
 
In the case of Cyberpunk, it would look a lot better and more immersive if there had been more effort put into e.g. NPC character modelling, instead of enabling a full suite of raytracing options.


How large were the assets in the demo, in GB? Have you seen official numbers, or is this just assumption on your part?

First point: do you think thousands of hours of professional character artist time went into implementing the raytracing? You can't just switch those out, they require different staffing and are suitable at different times of the production. Additionally RT was probably relatively cheap to add in terms of development time (compared to the rest of the effects in their renderer) -- at a certain point, the core renderer works great and you have x engineers sitting around and y time to implement techniques using partially solved problems that "just work" if you can feed them the right data and optimize.

Second point: They're commercially available megascans. Anybody who's ever seen a compressed 3D model can eyeball the size, but additionally, you can track down the individual files (IIRC they said they used cinematic presets without optimization). As shown, that demo would definitely be the biggest game ever shipped if it scaled up to a real game size -- although, as I said earlier in the thread, I don't expect that to be an unsolvable problem for them and I'm sure real Nanite games when the engine comes out will be a reasonable size.

I'd ballpark something like 40-60 gigs for the scene we saw, from memory, but that number could be like 50% off since I don't really remember how many unique assets we saw in the demo.
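Just to show the kind of back-of-the-envelope math behind a ballpark like that (every per-asset figure below is an assumption for illustration, not an official Epic or Quixel number):

```cpp
// Rough estimate of why "cinematic preset" scans add up to tens of GB.
#include <cstdio>

int main()
{
    const double texels8k     = 8192.0 * 8192.0;          // one 8K map
    const double bytesPerMap  = texels8k * 1.0;           // ~1 byte/texel after BC7 (assumed)
    const double mapsPerAsset = 4.0;                      // albedo, normal, roughness, displacement (assumed)
    const double geometry     = 1000000.0 * 3.0 * 16.0;   // ~1M tris, ~16 B per vertex (assumed)
    const double perAsset     = mapsPerAsset * bytesPerMap + geometry;
    const double uniqueAssets = 150.0;                    // wild guess for the demo

    std::printf("~%.0f MB per asset, ~%.0f GB for %.0f unique assets\n",
                perAsset / (1024.0 * 1024.0),
                uniqueAssets * perAsset / (1024.0 * 1024.0 * 1024.0),
                uniqueAssets);
    return 0;
}
```

Swap in different map counts or unique-asset counts and the total swings wildly, which is why a 50% error bar on that 40-60 GB guess is fair.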
 
RT adds positively. It just doesn't seem like an effective use of resources at the moment, past some mild threshold of utilization.
Like rendering at native 4K adds positively compared to, say, 1800p + temporal reconstruction, but the latter is a much more effective use of resources.

So your completely honest and unbiased opinion is that the difference between RT off and RT on in CB2077 is comparable to the difference between 1800p and 2160p?

In the case of Cyberpunk, it would look a lot better and more immersive if there had been more effort put into e.g. NPC character modelling, instead of enabling a full suite of raytracing options.

That's an interesting opinion. Personally I'd say there's nothing wrong with the NPCs in CB2077 for an open world game that can potentially have dozens of them on screen at the same time. Compare it to a similar game that was praised last gen for its graphics:

https://cramgaming.com/cyberpunk-2077-boasts-some-awesome-looking-npcs-55451/

People expecting NPC models akin to those in much smaller-scale games like FIFA, or non-open-world story-driven single-player games that may only have 3 or 4 on screen at once, are being unrealistic.

Of course, the academic exercise of observing the advantages of raytracing would be lost, but that should obviously never be the developer's end goal.

So we're back to RT being worthless outside of the "academic exercise of observing the advantages", are we?

CDPR must sell a videogame, and it looks like that part isn't going very well at the moment.

The PR hasn't been good but the sales have been spectacular. Even with a high percentage of returns they're likely to be making a bundle on this. And bear in mind that the majority of pre-orders, and thus likely overall sales, have been on PC, where users have no incentive to return what is by all accounts a spectacular game.

https://gameworldobserver.com/2020/...-sells-15-million-copies-pre-orders-included/
https://www.gamesindustry.biz/artic...gest-retail-launch-of-the-yer-uk-boxed-charts

How large were the assets in the demo, in GB? Have you seen official numbers, or is this just assumption on your part?

I thought I'd read in the thread about the demo here that the assets filled the majority of the HDD. I may be mistaken but it doesn't really matter since it's irrelevant to the core point that it's a tech demo - why are you comparing it to a released game?

You know you already poisoned the well with the suggestion and accusations. Instead of trying to argue my original point (which no one has, BTW) you went straight to "fanboy warring", to the point of accusing me of writing stuff I didn't write.
I wish you hadn't. Perhaps this would have been a much more productive conversation from the start.

Perhaps it could have been more productive but you seem quite keen to carry on this line of conversation so here we are.

Firstly, your original point that the majority of the RT doesn't look any better on than off has been disputed numerous times. cwjs answered you directly in this post. Others (who've actually played the game) have stated in here that the RT effects both improve the game's look significantly and are easily noticeable during normal gameplay - not just in specific scenes, but in general gameplay. Even subtle effects can have a macro impact on the overall realism of the scene, even if you have to have those subtle effects pointed out to you - just like ambient occlusion, for example, which I don't think anyone would argue is worthless.

As to "accusing you of stuff you didn't write", that is a lie. I asked "why do you seemingly care so much whether some see CB2077 as a generational leap or not?". You are in a thread about whether or not Cyberpunk 2020 is a generation leap or not (check the title) making statements like "But a handful of scenes like those don't turn the game into OMG next-genz material" are you not?
 
First point: do you think thousands of hours of professional character artist time went into implementing the raytracing? You can't just switch those out, they require different staffing and are suitable at different times of the production. Additionally RT was probably relatively cheap to add in terms of development time (compared to the rest of the effects in their renderer) -- at a certain point, the core renderer works great and you have x engineers sitting around and y time to implement techniques using partially solved problems that "just work" if you can feed them the right data and optimize.

...

This is one of those things that people tend to get wrong when they look at game budgets. You can't just divert money from ray tracing and put it towards animation. The people who worked on the ray tracing are the same people who worked on the rest of the renderer. Cutting out ray tracing doesn't free up resources for anything other than maybe different features in the renderer.
 
Outside of just giving developers an incentive to add more collect-a-thon and fetch quests, I'm hard pressed to see what I'm missing out on by not being able to go into every room in a high rise.

Well depending on the game, you might not be missing anything. But I think there are a few cases where procedurally generating - or at least procedurally furnishing and setting out - things like houses might be useful.

In the case of a game like Morrowind there are a lot of houses with no story or fetch quest elements, that exist just for world-building and robbing purposes. The more you can automate elements of setting up that world, the more developer resources you can move to where they'll do the most good. So you have lots of houses for gameplay or atmosphere reasons, but you spend very little time on them.

Additionally, it could give you a lot of already functional areas for mission scripters / builders to tie side quests, side stories, incidental events, random events, and relatively "hidden" or rewardingly obscure stuff to, without requiring a lot of additional developer time. So where in something like GTA someone shouts at you from a street corner, here it could happen as you walk past someone's house. I can think of a few other potential reasons too that might be worth it if the labour costs were low enough. I guess the payoff all depends on the game and the player.

You could apply the same idea to forests, caves, rooms on a spaceship, or whatever I guess. Might be good for replayability for some types of game too I suppose.

In the case of Shenmue 2, I think it's simply part of an unfinished element of the game. The population of Kowloon were probably supposed to have houses they went back to (like everyone in Shenmue 1) and there were probably events and side stuff that were supposed to happen there. A lot got cut from the games - particularly the second - because Sega couldn't afford big delays. In the end, it didn't help.
 
Well depending on the game, you might not be missing anything. But I think there are a few cases where procedurally generating - or at least procedurally furnishing and setting out - things like houses might be useful.

In the case of a game like Morrowind there are a lot of houses with no story or fetch quest elements, that exist just for world-building and robbing purposes. The more you can automate elements of setting up that world, the more developer resources you can move to where they'll do the most good. So you have lots of houses for gameplay or atmosphere reasons, but you spend very little time on them.

Additionally, it could give you a lot of already functional areas for mission scripters / builders to tie side quests, side stories, incidental events, random events, and relatively "hidden" or rewardingly obscure stuff to, without requiring a lot of additional developer time. So where in something like GTA someone shouts at you from a street corner, here it could happen as you walk past someone's house. I can think of a few other potential reasons too that might be worth it if the labour costs were low enough. I guess the payoff all depends on the game and the player.

You could apply the same idea to forests, caves, rooms on a spaceship, or whatever I guess. Might be good for replayability for some types of game too I suppose.

In the case of Shenmue 2, I think it's simply part of an unfinished element of the game. The population of Kowloon were probably supposed to have houses they went back to (like everyone in Shenmue 1) and there were probably events and side stuff that were supposed to happen there. A lot got cut from the games - particularly the second - because Sega couldn't afford big delays. In the end, it didn't help.


Such as

https://www.guerrilla-games.com/read/gpu-based-procedural-placement-in-horizon-zero-dawn

the GPU based procedural placement system that dynamically creates the world of Horizon Zero Dawn around the player. Not limited to just rocks and trees, the procedural system assembles fully-fledged environments while the player walks through them, complete with sounds, effects, wildlife and game-play elements.

Next gen might take this to a new level with VRS (variable rate stuff).
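A rough CPU-side sketch of the density-map idea, loosely in the spirit of the Guerrilla article linked above (their system runs on the GPU and is far more sophisticated; the hash, thresholds and asset IDs here are illustrative assumptions):

```cpp
// Deterministic, density-driven placement: the same world cell always yields
// the same props, so the world can be generated around the player on the fly.
#include <cstdint>
#include <cstdio>
#include <vector>

struct Placement { float x, z; uint32_t assetId; };

// Integer hash -> [0,1). Same inputs, same result, no storage required.
float Hash01(uint32_t x, uint32_t z, uint32_t seed)
{
    uint32_t h = x * 374761393u + z * 668265263u + seed * 2246822519u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return (h ^ (h >> 16)) / 4294967296.0f;
}

// density(x, z) in [0,1] would come from an artist-painted density map.
std::vector<Placement> PlaceProps(int gridSize, float cellSize, uint32_t seed,
                                  float (*density)(float, float))
{
    std::vector<Placement> out;
    for (int z = 0; z < gridSize; ++z)
        for (int x = 0; x < gridSize; ++x)
        {
            const float wx = x * cellSize, wz = z * cellSize;
            if (Hash01(x, z, seed) < density(wx, wz)) // keep or cull this cell
                out.push_back({ wx + Hash01(x, z, seed + 1) * cellSize,    // jitter
                                wz + Hash01(x, z, seed + 2) * cellSize,
                                uint32_t(Hash01(x, z, seed + 3) * 4.0f) }); // pick 1 of 4 assets
        }
    return out;
}

float MeadowDensity(float, float) { return 0.25f; } // flat 25% fill for the example

int main()
{
    const std::vector<Placement> props = PlaceProps(64, 4.0f, 1337u, MeadowDensity);
    std::printf("placed %zu props\n", props.size());
    return 0;
}
```

The point is that nothing about the placement needs to be stored or hand-authored; it can be recomputed wherever the player happens to be.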
 
I imagine CDPR was more interested in stomping out as many bugs as possible to reduce the shitshow that's now reality instead of adding on top of it by trying to throw in RT support for AMD's newly released cards.

Agreed. In which case, stability and performance on consoles >>>>>>>>>>>> RTX >> AMD RT.
Though if we were to think of total available market, PS5+SeriesS+SeriesX are probably worth a lot more than RTX at this point.
Of course it's feasible that CDPR didn't have access to the new-gen devkits in time for an RT implementation, but that didn't stop Ubisoft Toronto from using RT in Watch Dogs Legion. I don't remember if Codemasters got RT shadows working on the new-gen consoles for Dirt 5, either.


First point: do you think thousands of hours of professional character artist time went into implementing the raytracing? You can't just switch those out, they require different staffing and are suitable at different times of the production. Additionally RT was probably relatively cheap to add in terms of development time (compared to the rest of the effects in their renderer) -- at a certain point, the core renderer works great and you have x engineers sitting around and y time to implement techniques using partially solved problems that "just work" if you can feed them the right data and optimize.
Is RT that cheap to implement though? At the very least, I'd say you need to manually define each light source, the reflection distance and LOD for each reflective surface, etc.
How cheap would that be compared to e.g. applying subsurface scattering to the street NPC skins?
I wonder if you'd really need thousands of hours of character artist time to make those NPCs look ridiculously better.


I'd ballpark something like 40-60 gigs for the scene we saw, from memory, but that number could be like 50% off since I don't really remember how many unique assets we saw in the demo.
Not many. For the most part the character walks around tight spaces, the statues were all the same, and the flying scene at the end seems to be hiding a lot of lower-polygon-count geometry by using motion blur.


Firstly, your original point that the majority of the RT doesn't look any better on than off has been disputed numerous times.
No, I never said that.
Your whole crusade is based on a bunch of wrong assumptions.

My point has always been that RT is nice, but particularly in CP77 its advantages are undercut by the elephant in the room: the low-quality NPCs (the anonymous ones at least) and a bunch of inconsistent assets throughout the game. And the bugs, so many bugs.
That's why the emperor might be wearing a nice gold-plated shirt but he's missing his pants.

As for RT in general, my opinion is simply that higher degrees of its utilization bring diminishing returns.


As to "accusing you of stuff you didn't write", that is a lie.
And the dishonesty continues.
Here:
You've invested a lot of your time to try persuade people that's not the case (citing PlayStation exclusive games that you think look better in an entirely unpredictable twist)
Where did I say there's a PlayStation exclusive game I think looks better?
You're so focused on this ridiculous pitchfork pursuit that you didn't even realize you're quoting the wrong person.
And after I called you out on that mistake, not only do you not own it but then you even double down on it.
If you can't at the very least be honest and mature in your comments then this conversation isn't worth having.



@pjbliverpool They announced sell-through of 13 million copies after returns, so they're doing fine.

https://www.cdprojekt.com/en/wp-content/uploads-en/2020/12/current-report-no-672020-1.pdf
Refunds are still ongoing and we don't know if they'll have to pay their investors as part of a class-action suit.
Note: I really hope CDPR comes out on top after all of this.


Base UE4 + OpenWorld
Still... I've been playing Days Gone on the PS5 and it has excellent graphics + great performance on all consoles. It's also a UE4 open-world game.
Perhaps the differentiator here is it being UE4 + open world + multiplayer.
 
In the case of Cyberpunk, it would look a lot better and more immersive if there had been more effort put into e.g. NPC character modelling, instead of enabling a full suite of raytracing options.

Even more immersive? Yes please :p Seriously, this game is a true generational leap; it's the best-looking game out there as a total package. Even without ray tracing it's a really good-looking game.
Going from Spider-Man to Miles Morales, that's what I would call not being that leap. CP2077 is designed with PC in mind, ignoring consoles at first. Hence the bad performance on them; it comes with the package. I'm glad they did it, sales are through the roof (thanks to PC), and we got the leap we got.

deal with Nvidia?

I still don't think there is any conspiracy going on.

consoles >>>>>>>>>>>> RTX >> AMD RT

Nah, sales/pre-orders were higher on PC.

PS5+SeriesS+SeriesX are probably worth a lot more than RTX at this point.

Nope. Since RT is subpar on them, they couldn't deliver their vision on those. Also, there are people playing without ray tracing on PC, and it still looks bonkers.
People have a choice there.

My point has always been that RT is nice, but particularly in CP77 its advantages are undercut by the elephant in the room: the low-quality NPCs (the anonymous ones at least) and a bunch of inconsistent assets throughout the game. And the bugs, so many bugs.
That's why the emperor might be wearing a nice gold-plated shirt but he's missing his pants.

Then you have a totally different take than most here, and in particular DF; they call it a game changer. I think I'll side with them :)

Note: I really hope CDPR comes out on top after all of this.

I'm sure you do ;)

Still... I've been playing Days Gone on the PS5 and it has excellent graphics + great performance on all consoles.

Hm, my personal opinion is that it isn't all that great of a looker. HZD and Death Stranding are much, MUCH better-looking open-world games. Perhaps not a fair comparison since those use their own custom engines from high-quality AAA studios, but still.
 
Judging by the leaked ConsoleEarlyNextGenQuality configuration, it appears like there will be no RT in the next gen patch for the consoles.

I mean it can change but I doubt it. Frame budget is way too tight for this even without RT.
 
Judging by the leaked ConsoleEarlyNextGenQuality configuration, it appears like there will be no RT in the next gen patch for the consoles.

I mean it can change but I doubt it. Frame budget is way too tight for this even without RT.

Hm, maybe just for the reflections? Would be boring and maybe not worth the performance hit though.
 
Hm, maybe just for the reflections? Would be boring and maybe not worth the performance hit though.
Nope, no reflections, no shadows and no diffuse lighting from what I saw. The Quality config cranks up post-processing effects, LOD, SSR, shadow maps, etc., but unfortunately no RT.

Could be that it just doesn't affect RT settings for the PC version, but I doubt it; the framerate seems pretty in line with how I would expect the consoles to perform.

By the way, I don't think reflections are boring. They make a huge difference because they also change how light appears on the surface of materials, indirectly affecting the game's lighting as well. Roads, for example, look a lot more natural with reflections, even when they are not reflective at all!
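That lines up with the Fresnel effect: with Schlick's approximation, even a rough dielectric like asphalt (F0 of roughly 0.04 is a common assumption) reflects a large fraction of incoming light at grazing angles, which is why reflections change how "non-reflective" roads read. A tiny sketch of the formula:

```cpp
// Schlick's Fresnel approximation: F(theta) = F0 + (1 - F0) * (1 - cos(theta))^5
#include <cmath>
#include <cstdio>

float SchlickFresnel(float cosTheta, float f0)
{
    return f0 + (1.0f - f0) * std::pow(1.0f - cosTheta, 5.0f);
}

int main()
{
    const float f0 = 0.04f; // typical dielectric reflectance (asphalt, concrete)
    std::printf("head-on (cos = 1.0): %.2f\n", SchlickFresnel(1.0f, f0)); // ~0.04
    std::printf("grazing (cos = 0.1): %.2f\n", SchlickFresnel(0.1f, f0)); // ~0.61
    return 0;
}
```

So a road viewed at a shallow angle, which is exactly how roads are usually seen while driving, reflects over an order of magnitude more light than the head-on value suggests.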
 
Still... I've been playing Days Gone on the PS5 and it has excellent graphics + great performance on all consoles. It's also a UE4 open-world game.
Perhaps the differentiator here is it being UE4 + open world + multiplayer.

The difference is Days Gone has drastically revamped portions of the engine, whereas ARK 1 is base UE4.
 
While not a real game for sure, it was still more than a classic non-interactive tech demo: that UE5 demo was played by someone who had control over the character and the camera, and the demo used AI for the bats, physics for the character, water, destruction, even dynamic animations for the climbing part; it's all said in the video of the demo.
It's running under the same conditions as a game like, say, Tomb Raider; it's just a small portion, so they could crank up the detail and use a lot of data.
No doubt it is a real-time demo, but vertical slices don't have the code bloat that full games have when producers request all these features, especially an open-world game where many things outside of graphics can tank the frame rate. And I have to repeat: let's see how well UE5 games run on lower-end PCs and consoles.
Not directed toward you, Karamazov, but if the general conversation here is about opinions on the better way to use limited resources, it's going to go in circles, because you know what they say about opinions. In truth, I've never fully liked CDPR's art direction, even in The Witcher, but that's not so much a technical analysis. 2077 + RT requires next-gen hardware, so it's next gen. And it's not just RT that makes it so. Someone posted Crysis running with RT on the Xbox One X because there was enough headroom for software RT on an older console.
 
Let's just agree to disagree on that.



I didn't mean the absolute best. I simply meant a standard-bearer for the console generation's graphics. Most people would still consider UC4 one of the best-looking games on PS4, even if there are a few that are better looking. But all those games you mention came much later in the generation than the first 3 months.

When a new console launches, we generally get a slew of cross-gen titles or remasters that look like shinier versions of last-gen titles before we start getting games that look like they could only run on next-gen hardware, at least in their intended form. Other posters' shaky claims to the contrary, I believe that CB2077 maxed out is the first example of such a game that we've had this gen (with the probable exception of FS2020). But those early next-gen games are always exceeded later in the generation by the true top-tier standard-bearers for the whole generation. I believe UC4 falls into that category, even though graphics have been even further refined since.
I guess to my eyes Cyberpunk veers closer to a shinier last-gen title than to the latter. Not completely, just more so. It punches higher than something like Black Flag/Ghosts but below Ryse/Shadow Fall.
 