A Generational Leap in Graphics [2020] *Spawn*

Once we start seeing ground-up PS5 and Series X games, we're going to be far more impressed and Cyberpunk will be all but forgotten.

Yes, every new graphical benchmark in history has eventually been left behind by newer games, regardless of how big a jump it is. That's not really relevant to what you can play today though, is it?

Honestly, I think many of the detracting posts in this thread have more to do with the posters not being happy that their chosen platform/IHV can't make the game look like this, rather than with genuine concerns about the game's graphics.
 
It's not even a fair comparison to run the 3090 at 972p, since cost per pixel scales very non-linearly vs the console version due to raytracing (and all of the other dialed-up effects, like volumetrics).
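
For illustration, here's a rough back-of-envelope sketch (every number in it is made up, it's only meant to show the shape of the argument): once RT and the other maxed-out effects are in play, the work done per pixel is far higher and there are per-frame costs that barely scale with resolution, so "performance per pixel" comparisons against the console version stop meaning much.

```python
# Toy model only: all costs below are hypothetical, chosen just to illustrate
# why per-pixel cost isn't comparable between an RT-maxed PC frame and a
# console frame without RT, and why it doesn't scale linearly with resolution.

def frame_time_ms(pixels, rt_on,
                  raster_ns_per_px=4.0,   # hypothetical raster shading cost per pixel
                  rt_ns_per_px=9.0,       # hypothetical added RT cost per pixel
                                          # (reflections + GI + shadows, all dialed up)
                  fixed_ms=2.5):          # per-frame work that barely scales with resolution
                                          # (BVH refit, denoiser setup, CPU submission, ...)
    t = fixed_ms + pixels * raster_ns_per_px * 1e-6
    if rt_on:
        t += pixels * rt_ns_per_px * 1e-6
    return t

px_972p = 1728 * 972
px_4k = 3840 * 2160

for label, px, rt in [("no RT, 972p (console-ish)", px_972p, False),
                      ("RT maxed, 972p",            px_972p, True),
                      ("RT maxed, 4K",              px_4k,   True)]:
    t = frame_time_ms(px, rt)
    print(f"{label:26s}: {t:6.1f} ms  ({t / (px / 1e6):4.1f} ms per megapixel)")
```
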
So am I the only one here who looks at this video and thinks that, in about 75% of the examples @Dictator shows to highlight the benefits of RT-ON, the emperor is kind of missing his pants and wearing polka-dot underwear?

look at your own cherrypicked screen dude -- sure the NPC quality isn't great, but even in this really awkward, unflattering shot the benefits of RT are obvious on the NPCs. Every NPC in the RT-off shot has some kind of fresnel rim-light glow, to varying extents, because of the inaccurate shading. Little things like that add up a lot in motion (not to mention how much better NPCs look in more flattering cases)
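
(If it helps to pin down what I mean by the fresnel rim glow: below is a tiny illustrative sketch of Schlick's approximation, not CDPR's actual shader, just the standard term. Reflectance shoots up toward 1 at grazing angles, i.e. exactly along character silhouettes, so any ambient/specular contribution that isn't properly occluded turns into that bright rim.)

```python
# Minimal sketch of Schlick's Fresnel approximation, to show why un-occluded
# ambient/specular lighting turns into a rim glow along silhouettes:
# the reflectance F climbs toward 1.0 as the view angle gets more grazing.

def schlick_fresnel(cos_theta: float, f0: float = 0.04) -> float:
    """f0 ~= 0.04 is a typical dielectric reflectance at normal incidence."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

# cos_theta = 1.0 means the surface faces the camera; ~0 is the silhouette edge.
for cos_theta in (1.0, 0.7, 0.4, 0.2, 0.05):
    print(f"cos(theta) = {cos_theta:4.2f} -> F = {schlick_fresnel(cos_theta):.3f}")
```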

edit: btw, I played the game on a 2060 Super with heavy DLSS (probably running at or below the resolution the consoles are, and at 25-40 fps) -- definitely not ideal, but a good "like for like" comparison with the last-gen console versions in res and performance, except the game looked completely amazing instead of like shit. You should try that config out if you want to make comparisons with PS4/Xbox One
 
True, though to be fair, the new PlayStation is running, in backwards-compatibility mode, a game that was optimised for last-gen machines rather than for it. So while, yes, the PC has a significantly better spec, we shouldn't be so quick to judge a BC game on the PlayStation and thereby assume we've seen its maximum capability.

I wouldn't judge a modern GPU by its capacity to run the OG Crysis. :)

I agree, I wasn't intending that to be a judgement on the PS5's overall performance, simply an explanation of why the PS5 looks to be running faster in that video than the 3090 and why using that as a basis for comparison is wrong.
 
some facts:
CP2077 came out more than 7 years after the release of the PS4 and Xbox One

7 years before Crysis 1, the PS2 had only just launched, the GameCube hadn't shipped yet and the Xbox did not exist on the market.
Just for one moment, imagine Crysis 1 also running on the N64 and Sega Dreamcast, and people needing to point out specific moments in comparison videos to show the difference between a 3000-euro PC and a 7+ year old low-budget console?

If that were the case, nobody would have called Crysis 1 a generational leap. Crysis 1 was a generational leap because it could not be done on then-current-generation hardware that had come out 1 year earlier, let alone hardware from more than 7 years before.

Considering the kind of comparisons you make now, I think if you were posting back then you'd have been saying that Crysis wasn't that impressive because Perfect Dark had crisper reflections on the N64.
 
Yes, every new graphical benchmark in history has eventually been left behind by newer games, regardless of how big a jump it is. That's not really relevant to what you can play today though, is it?

Honestly, I think many of the detracting posts in this thread have more to do with the posters not being happy that their chosen platform/IHV can't make the game look like this, rather than with genuine concerns about the game's graphics.

I think you're one of the most reasonable posters on this forum. Not sure that you're right with either of the last two comments though.

I haven't been a PC gamer since 2000 (and actually I hardly play games anymore - Spelunky 2 is the last game I loved). I purchased a PS5 for my kids' Xmas.

For me, the technology and the art is what I'm interested in these days. I like B3D because the majority of the posters are reasonable and intellectual people. It's refreshing and often challenging.

I've raised criticisms of 2077, not because the technology is bad, but because the implementation doesn't do it many favours. The first two screenshots I posted earlier in this thread were genuinely chosen to be a near match; I even used both from the PC.

My genuine feeling is that 2077 isn't as strong visually as some other games are, not because it has a weak feature set (it doesn't!), but because the eventual image isn't as pleasing.

I looked again at the same two videos from Alex and Watch Dogs Legion consistently looks better to me. If I were to choose a game to show my partner, because I'm amazed at how modern games look, it wouldn't be 2077, no matter how much I talked about how ray tracing can finally be done in real time.
 
And people needing to point out specific moments in comparison videos to show the difference between a 3000-euro PC and a 7+ year old low-budget console?


I think this is the best point made about CP77's claims of it being a generational leap (not between console and PC, but between PC RT-OFF and PC RT-ON).
If one needs to watch a video that serves the specific purpose of pointing out and explaining the differences between RT-ON and RT-OFF to be made aware of the RT-ON advantages, then the RT-ON version is definitely not looking next-gen enough to be considered as such.


On the other hand, when I show this video to my wife and friends, I don't need to make any explanation to impress them:


Everyone I've shown this to simply acknowledges that it's completely unlike anything we have right now, and we'll be in the next-gen when games look even remotely like it.
And much to the dismay of many (me included, at the time), this is running entirely with zero raytracing, on a $500 console or what should turn out to be the equivalent of a mid-range GPU coming in early 2021 and without resorting to AI-based upsampling techniques.




Honestly, I think many of the detracting posts in this thread have more to do with the posters not being happy that their chosen platform/IHV can't make the game look like this, rather than with genuine concerns about the game's graphics.
To summarize, the ones disagreeing with you are mostly jealous and biased?
An odd comment to make, especially coming from someone who just warned against straying into trolling territory.
I think there are enough arguments out there to "allow an informed discussion without bringing this kind of rubbish here".
 

I am not trolling, I am being honest: in this video I initially assumed the PC version was running on a low-end (Intel) CPU and a 2060-class GPU because of the low frame rate, and I also thought it was an unfair comparison because it did not really look like RT was turned on at all. Not to the extent of the DF video, where the non-RT version did not have any shadows in most cases.

It must be the video compression then because there is no way in hell that is a 3090 in the video.

In the same way that the hair in FIFA 21 is a generational leap while the game itself is not, I believe the lighting in CP2077 is a generational leap.

I didn't mean to offend anyone, just stating the facts.



I think this is the best point made about CP77's claims of it being a generational leap (not between console and PC, but between PC RT-OFF and PC RT-ON).
If one needs to watch a video that serves the specific purpose of pointing out and explaining the differences between RT-ON and RT-OFF to be made aware of the RT-ON advantages, then the RT-ON version is definitely not looking next-gen enough to be considered as such.


On the other hand, when I show this video to my wife and friends, I don't need to make any explanation to impress them:


Everyone I've shown this to simply acknowledges that it's completely unlike anything we have right now, and we'll be in the next-gen when games look even remotely like it.
And much to the dismay of many (me included, at the time), this is running entirely with zero raytracing, on a $500 console or what should turn out to be the equivalent of a mid-range GPU coming in early 2021 and without resorting to AI-based upsampling techniques.





To summarize, the ones disagreeing with you are mostly jealous and biased?
An odd comment to make, especially coming from someone who just warned against straying into trolling territory.
I think there are enough arguments out there to "allow an informed discussion without bringing this kind of rubbish here".

Fully agree. UE5 will probably also run on iPhone, PS4 and Switch, but the video you linked will be impossible on last-generation hardware; for me, that means a generational leap as well. You will see it even in the thumbnail, even when squinting or far away from the display. It does not require a documentary or a 200% zoom to point it out; the difference between that UE5 demo and everything that came before it is simply there.

PC is much, much more powerful than even the PS5, Series X and Series S combined, so it would run the UE5 demo even better (the PS5 demo was at 1440p?)

Nobody is denying that PC is the most powerful platform ever
 
And much to the dismay of many (me included, at the time), this is running entirely with zero raytracing, on a $500 console or what should turn out to be the equivalent of a mid-range GPU coming in early 2021 and without resorting to AI-based upsampling techniques.

Yeah, UE5 looks amazing, but it also has tons of visual defects you could cherrypick, just like good RT -- there's an incredibly long accumulation time for lighting changes, there's a total lack of transparent surfaces(!), foliage, reflections(!), there are light-bleed artifacts, etc. More importantly: we won't see games shipping with it for years. It's not really fair to compare games that started development six years ago with games that will start development a year from now.

Edit: Also, for your premise -- if you let players walk around in a game with rt on vs off for 5 minutes the difference is incredibly obvious. The only people who need comparisons are stubborn gamers who haven't actually tried the content.

I think all the bickering in this thread is nonsense. Graphics (even sound) are very subjective. Personally, I think CP2077 is a gorgeous game with RT; without it, it looks no better than GTA V, IMHO. But that's just my personal opinion.

Please remember folks, there is always the ignore feature.

As a whole they're subjective, but a large component is not subjective and this is ostensibly a technical forum.
 
I think this is the best point made about CP77's claims of it being a generational leap (not between console and PC, but between PC RT-OFF and PC RT-ON).

It's not the RT on vs. off that makes this game a generational leap in some people's views. It's the RT on top of what is already a visually stunning game that can't really run on the previous-generation consoles.

If one needs to watch a video that serves the specific purpose of pointing out and explaining the differences between RT-ON and RT-OFF to be made aware of the RT-ON advantages, then the RT-ON version is definitely not looking next-gen enough to be considered as such.

That's not the purpose of the video, and it's disingenuous of you to suggest that it is. Alex is explaining the technology behind the RT, not pointing out where to look for it. The differences are obvious at a macro level.

On the other hand, when I show this video to my wife and friends, I don't need to make any explanation to impress them:

Your implication being that people looking at a maxed-out Cyberpunk 2077 would not be impressed without having its RT elements explained to them?

I think that's a statement that most reasonable people would find unreasonable.


Everyone I've shown this to simply acknowledges that it's completely unlike anything we have right now, and we'll be in the next-gen when games look even remotely like it.
And much to the dismay of many (me included, at the time), this is running entirely with zero raytracing, on a $500 console or what should turn out to be the equivalent of a mid-range GPU coming in early 2021 and without resorting to AI-based upsampling techniques.

As amazing as it looks, it's a demo made using assets that would be impossible to use in a shipping game due to their size. We may well get final games that look comparable to that in a few years, but I don't see why that should detract from the likes of CP2077 now if they show a clear uplift over currently available titles.

To summarize, the ones disagreeing with you are mostly jealous and biased?
An odd comment to make, especially coming from someone who just warned against straying into trolling territory.
I think there are enough arguments out there to "allow an informed discussion without bringing this kind of rubbish here".

So why do you seemingly care so much whether some see CP2077 as a generational leap or not? You've invested a lot of your time trying to persuade people that it isn't (citing PlayStation exclusive games that you think look better, in an entirely unpredictable twist), so I'm curious as to why this seems to matter to you so much?
 
As a whole they're subjective, but a large component is not subjective and this is ostensibly a technical forum.

Yes, technical discussions that usually devolve into e-penis wagging and platform bickering over someone's personal opinion. But since we're talking technical... why does CP2077's NPC AI look outdated when compared to other open-world games? As if the NPCs are drone-like.
 
Yes, technical discussions that usually devolve into e-penis wagging and platform bickering over someone's personal opinion. But since we're talking technical... why does CP2077's NPC AI look outdated when compared to other open-world games? As if the NPCs are drone-like.

Well, this is a graphics technical forum so I'm a bit out of my depth, but I'll take a swing:
Probably some combination of -
1. Lack of development focus on AI, since the game is first and foremost a linear, story-based RPG
2. Development chaos or a lack of talented designers, due to the high-crunch, low-pay atmosphere that devs are known to flee
3. A large backlog of critical bugs preventing the bug fixes or feature improvements required for cutting-edge AI
4. Technical challenges with pathfinding or other key systems that interact with their engine/world, exacerbated by the requirement to target Jaguar CPUs

As amazing as it looks, it's a demo made using assets that would be impossible to use in a shipping game due to their size. We may well get final games that look comparable to that in a few years, but I don't see why that should detract from the likes of CP2077 now if they show a clear uplift over currently available titles.

Seeing as UE5 is middleware, I think it's fair to concede this one point -- I don't think they would be selling the product if it had a huge, impossible-to-solve blocker like having to ship 1 TB games. That will probably be solved by release.
 
How many games last gen ran at 20-30 fps, with drops as low as 15 fps, at 720p on the PS4? Can you name one? The PS4 is blatantly not capable of running even a massively cut-down version of the game to the usual standard of "last gen" games. That alone suggests that the game, even without RT, goes beyond everything that's come before.
DF have pointed out that both the XBone and PS4 versions are apparently running with the same assets and shadow resolutions as their mid-gen upgrades (PS4 Pro and One X), meaning there has been very little optimization for the 2013 consoles.
Also, both the PS5 and Series X versions are just running the code for their respective mid-gens, at higher framerates or dynamic resolutions. The RDNA2 consoles are probably just running the Polaris code in compatibility mode.

In the end, this points to CDPR having focused their console efforts on the PS4 Pro and One X first and foremost, and then planning to scale the game up for the 2020 consoles and down for the 2013 consoles (but they never got enough time for that last part).
Which... seems like a pretty terrible decision from the get-go, I think?

Nonetheless, I think CDPR is probably right when they say they're going to make the game look great on the base 2013 consoles, and what we're seeing right now isn't at all representative of those consoles' limitations.



look at your own cherrypicked screen dude
It's a cherrypicked screen taken from a video that itself consists of cherrypicked scenes chosen to show the advantages of enabling raytracing in the game.
In the end, I'd say that at worst it's a balanced sum of cherrypicking that cancels itself out.


Yeah, UE5 looks amazing, but it also has tons of visual defects you could cherrypick, just like good RT -- there's an incredibly long accumulation time for lighting changes, there's a total lack of transparent surfaces(!), foliage, reflections(!), there are light-bleed artifacts, etc. More importantly: we won't see games shipping with it for years.
None of which matters to the point I made. The UE5 demo just looks much better than anything out there, and whatever engine limitations it might still have, I'm sure devs will have come up with tricks to hide them by the time the games made on it start shipping.
What matters is that the demo looks next-gen to the untrained eye. And the market is driven / games are purchased mostly by untrained eyes, not experts.


Edit: Also, for your premise -- if you let players walk around in a game with rt on vs off for 5 minutes the difference is incredibly obvious. The only people who need comparisons are stubborn gamers who haven't actually tried the content.
How about we just try to communicate our own opinions based on our arguments, instead of making broad attacks and crude generalizations towards everyone who disagrees?
It's just a game. There's no need to get so worked up over other people's opinions.
 
Well, this is a graphics technical forum so I'm a bit out of my depth, but I'll take a swing:
Probably some combination of -
1. Lack of development focus on AI, since the game is first and foremost a linear, story-based RPG
2. Development chaos or a lack of talented designers, due to the high-crunch, low-pay atmosphere that devs are known to flee
3. A large backlog of critical bugs preventing the bug fixes or feature improvements required for cutting-edge AI
4. Technical challenges with pathfinding or other key systems that interact with their engine/world, exacerbated by the requirement to target Jaguar CPUs

While this is a graphics-centric forum... AI/DL topics are plentiful.

Anyhow, great response... I guess studios such as Rockstar are far more capable of creating more convincing NPCs (GTA V, RDR 2, etc.), even on dated Jaguar hardware and with budgets under $270M.
 
Kind of a bigger theme I find frustrating with forums like this, and which makes the side-by-side comparisons unproductive: the trend in graphics has always been toward effects that are more accurate, more flexible, more universal. Typically, steps in that direction involve (very) considerable performance hits, so the advancement might be coupled with an overall decrease in resolution or precision, or a reduced scope in games.

Every time one of those steps gets taken, if gamers can notice, they throw a fit and make a bunch of comparisons about how, I don't know, say how we lost MSAA when we jumped to deferred rendering (somehow nobody noticed we lost transparency, iirc, which is mind-boggling to me). PBR brought muddier lighting and "glowing" artifacts. SSAO darkened images -- real GI techniques lacked the sharpness and precision of SSAO. The next generation of shaders and graphics cards brought a lot of complex effects, area lights, volumetrics, etc., but along with them came temporal anti-aliasing and temporal upscaling, which permit stochastic rendering techniques (big compromises that are necessary to run these effects at anywhere near practical framerates).
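
(To make the trade-off concrete, here's a toy, purely illustrative accumulation loop, not any particular engine's TAA, just the exponential-blend idea behind it: the same history blending that hides per-frame noise from stochastic effects is what produces the lag and ghosting people complain about. The blend weight and noise level are made up.)

```python
# Toy temporal accumulation: each frame yields a noisy estimate of some shaded
# value; an exponential blend against the history smooths the noise away, but
# the same blending makes the result lag behind sudden changes (ghosting /
# long "accumulation time" for lighting changes).

import random

random.seed(0)
history = 0.0
alpha = 0.1          # hypothetical blend weight: lower = smoother but laggier
true_value = 1.0     # the value we're trying to estimate (e.g. incoming light)

for frame in range(60):
    if frame == 30:
        true_value = 0.2                                   # sudden lighting change
    noisy_sample = true_value + random.uniform(-0.5, 0.5)  # stochastic per-frame estimate
    history = alpha * noisy_sample + (1.0 - alpha) * history
    if frame % 10 == 9:
        print(f"frame {frame + 1:2d}: accumulated = {history:.2f}, target = {true_value}")
```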

Raytracing has performance hits and flickering. Lumen and Nanite will have their own drawbacks. What matters is that both bring us huge steps forward in what kind of content games can represent, how fast and flexibly artists can work, and how accurate the results are.

Engineers and game developers aren't wasting resources or trying to rip you off; they're trying to solve huge open problems in rendering good graphics, and you've just gotten used to the side effects of the current state of the art.


While this is a graphics-centric forum... AI/DL topics are plentiful.

Anyhow, great response... I guess studios such as Rockstar are far more capable of creating more convincing NPCs (GTA V, RDR 2, etc.), even on dated Jaguar hardware and with budgets under $270M.

This reads kinda sarcastic (not sure if it's intended to be), but I think it's literally true: almost all of GTA's (and therefore RDR's) design is based around responsive AI. They're clearly at the top of the pack in the world at this. They've iterated on the same systems for several games rather than switching genre and focus. They've invested a lot of money into behavior-related middleware, like whatever that physics tech they first used in GTA IV was. Also, GTA V, GTA IV and RDR 2 all clearly share tech and cost a combined what, like five hundred million dollars? (although GTA IV had better AI than Cyberpunk does, right out of the gate)

They also have problems with crunch, but you usually see other parts of their games suffer (there's bad UV stretching in like a billion parts of RDR 2's environment, but nobody cares -- probably a much wiser choice of where to cut corners than the ones CDPR made).
 
This reads kinda sarcastic (not sure if it's intended to be), but I think it's literally true: almost all of GTA's (and therefore RDR's) design is based around responsive AI. They're clearly at the top of the pack in the world at this. They've iterated on the same systems for several games rather than switching genre and focus. They've invested a lot of money into behavior-related middleware, like whatever that physics tech they first used in GTA IV was. Also, GTA V, GTA IV and RDR 2 all clearly share tech and cost a combined what, like five hundred million dollars? (although GTA IV had better AI than Cyberpunk does, right out of the gate)

They also have problems with crunch, but you usually see other parts of their games suffer (there's bad UV stretching in like a billion parts of RDR 2's environment, but nobody cares -- probably a much wiser choice of where to cut corners than the ones CDPR made).

No, your response was great. :yep2:

My point about R* being able to use the outdated hardware (Jaguar) more effectively than CDPR when it comes to AI in GTA V and RDR 2 was simply an observation following your response, not sarcasm.
 
By some logic, GTA V on PC was also a generational leap, because it ran the game better than the PS3, on which it had launched two years earlier.
 

I guess you don't agree with the following, then?

"while it is true that CP2077 on Xbox One 2013 runs the exact same assets as PC, it doesn't really count because the frame rate and resolution is low!! So PC represents a generational leap"
 