A Generational Leap in Graphics [2020] *Spawn*

The Witcher 3 was scaled down to accommodate the lesser performing consoles...and people on PC (CDPR's biggest customer segment) complained and expressed disappointment.
Witcher 3's PC sales were 31%.

But it kinda broke the marketing of consoles as "next gen"...and now we have ruffled feathers, sour grapes and bruised e-peen in a certain crowd. Marketing vs. reality :devilish:

There aren't versions of the game for next-gen consoles; they're running the previous mid-gen console versions with some settings unlocked.

Let's not pretend that a lot of PC owners are also struggling to get good performance from this game even on Nvidia's previous generation graphics cards. Wander over to the Cyberpunk thread in the console forum which is crammed with PC owners swapping tips and hex edits to get better performance.
 
I don't expect people that consider lower resolution / lower FPS / lower image quality / lower settings "equal" to alter their minds one bit.
But I am not speaking to them, rather to the passive readers, so they avoid getting suckered in by fallacies.

I don't tilt at windmills, but I find this launch funny.
The Witcher 3 was scaled down to accommodate the lesser performing consoles...and people on PC (CDPR's biggest customer segment) complained and expressed disappointment.
CDPR apologized and vowed that this would not happen with CP2077.
And they kept their word.
And that revealed just how MUCH consoles are holding back games under the usual approach (design for the consoles, then just increase resolution/FPS on the PC).
CDPR are trying to scale the game DOWN to the consoles (not holding back progress) and it is obviously a very hard thing to do.

But it kinda broke the marketing of consoles as "next gen"...and now we have ruffled feathers, sour grapes and bruised e-peen in a certain crowd.

Marketing vs. reality :devilish:
I’m not sure what any of this has to do with my post.
 
Witcher 3's PC sales were 31%.



There aren't versions of the game for next-gen consoles; they're running the previous mid-gen console versions with some settings unlocked.

Let's not pretend that a lot of PC owners are also struggling to get good performance from this game even on Nvidia's previous generation graphics cards. Wander over to the Cyberpunk thread in the console forum which is crammed with PC owners swapping tips and hex edits to get better performance.

What I believe also plays a part in all this: PC gamers realising that on their machine, no matter how much it cost, this game cannot hold 60 at native resolution with everything maxed out.

Aside from maybe this game, every multiplatform title is developed for 2013 consoles, which means that any current PC from 600 euro and up can run those games at 60fps minimum.
Because PCs have always been running the exact same console games but with higher resolution, frame rate and/or some extra effects, the PC gamer consensus is "anything below 60fps is unplayable!!! It hurts my eyes! It makes me ill!"

However, this does not apply to CP2077 because... yeah, even the 1500 euro GPUs cannot run this game at a stable 60.

I'd be pretty pissed as well to be honest. Without upscaling this game cannot even run at 30 fps lol.

Because their PCs in 2007 also could not run Crysis at native res 30 fps, they believe this game is some technical revolution.
 
What I believe also plays a part in all this: PC gamers realising that on their machine, no matter how much it cost, this game cannot hold 60 at native resolution with everything maxed out.

Aside from maybe this game, every multiplatform title is developed for 2013 consoles, which means that any current PC from 600 euro and up can run those games at 60fps minimum.
Because PCs have always been running the exact same console games but with higher resolution, frame rate and/or some extra effects, the PC gamer consensus is "anything below 60fps is unplayable!!! It hurts my eyes! It makes me ill!"

However, this does not apply to CP2077 because... yeah, even the 1500 euro GPUs cannot run this game at a stable 60.

I'd be pretty pissed as well to be honest. Without upscaling this game cannot even run at 30 fps lol.

Because their PCs in 2007 also could not run Crysis at native res 30 fps, they believe this game is some technical revolution.

I would rather pay €1500 than suffer the console graphics/performance (or lack thereof).
I have around €1800 per month labelled "for fun"...high-end PC gaming costs are not a problem for me...and it is the cheapest of my hobbies...by a LARGE margin.
Besides, CDPR have stated that modding will come to CP2077...meaning we will be able to crank it up from "11" to "12"...on the PC...while the consoles are at about "5".
In ~2 years' time I will upgrade/replace my PC, meaning the "next gen" consoles will have fallen even further behind the curve than they already ARE compared to my current PC.

Like anything in life, the cost of higher quality is always exponential (e.g. PC vs. console gaming).
So keep trying to be snide about the costs; it will do you nothing...except make you seem like the "sour grapes" kinda guy :p

I pay more...but I also get more (with no regrets)...welcome to life.
 
"i'm rich enough to buy high end PCs so that gives me the right to piss on console peasants and mock them, because some how they Hurt my feelings by liking those shitty boxes"

You won't last long here with that attitude.

I am simply responding to stuff like this:

"However, this does not apply to CP2077 because... yeah, even the 1500 euro GPUs cannot run this game at a stable 60.

I'd be pretty pissed as well to be honest. Without upscaling this game cannot even run at 30 fps lol."

Not the first time he tried that "angle"...I am simply responding...but I guess you don't care about that? ;)
 
Well, regarding CP2077 and "there's a reason it looks like a PS3/PS4 game :)": as a matter of fact, CP2077 is a PS4 game.

Well, a 720p, sub-30fps PS4 game with reduced settings, yes.

And by similar logic, Demon's Souls is a PS3 game. I would never describe it as such, because it would be entirely unreasonable to do so. But what you're doing above isn't much better than that, and it highlights the fact that just because the same game can run on an older platform (in this case 2 console generations older) does not mean that the best version of that game should be considered technically or visually equivalent to the worst version.

Nvidia partnered with them to put their version of RT in the game first.
Cyberpunk isn't an "RT game", it's an "RTX game".
If they really wanted to make their game as good as possible they should have worked on a version that would work on all RT-enabled hardware.

Still, I don't understand what people really see as so next gen in this game; removing the RT differential, it fails to meet expectations even for the past gen.

Cyberpunk started development 7 years ago. Nvidia released RT hardware 2 years ago. AMD released RT hardware 4 weeks ago. I trust you get my point?

Almost no one has said there is no difference / that RTX on isn't better. We just don't all consider the final visual result a generational improvement.

If the difference between Cyberpunk ultra and ultra+RT was equivalent to the level of visual improvement achieved in Sony 1st party PS5 exclusive games, do you think people's views would really change?

I don't think anyone's really saying that the difference between RT on and off is the generational difference. Potentially transformative to the graphics perhaps, but on its own, not a generational difference. It's the entire package. The base game is gorgeous even by non-massively-dense open world standards, but exceptionally so for that genre. And when you add in the RT, then some people, including myself, are arguing that it's the first game that represents a true "taste" of what the next generation of games will look like.

Obviously this is entirely subjective so no-one can be right or wrong here, but to my eyes, those videos I posted earlier in the thread for example are a clear step above everything else I've seen to date (although as some have already mentioned there are games which can compete in some ways like RDR2 or FS2020). Of course it will be surpassed soon enough. But just as the likes of Killzone Shadow Fall were considered a clear step above what had come before (at least in the console space) when the PS4 launched, with the full knowledge that it would be rapidly exceeded, I don't see why CP2077 can't be shown the same consideration.


Something I consider an example of generational improvement.

Yes but that's a best to best comparison. Cyberpunk is really a best to worst comparison. i.e. it's naturally being compared to the best of the outgoing generation, but being the (claimed) first visually "next gen" game, it's going to be sitting towards the bottom of the pile at the end of the gen. That doesn't mean it can't still be considered representative of next gen graphics though. Otherwise we'd have to say there is no such thing as next gen graphics until the very best examples are launched, several years into the console cycle. Some may well say that of course, and they wouldn't necessarily be wrong, because as I said, this is entirely subjective.


It's 44%

https://comicbook.com/gaming/news/the-witcher-3-sales-platform-breakdown-which-system-sold-most/

Let's not pretend that a lot of PC owners are also struggling to get good performance from this game even on Nvidia's previous generation graphics cards. Wander over to the Cyberpunk thread in the console forum which is crammed with PC owners swapping tips and hex edits to get better performance.

In fairness, the game runs surprisingly well on older PC hardware without RT. The threading issue affecting 6-core and lower Ryzen processors has been patched now, but even then I don't believe it was resulting in sub-30fps gameplay, which for me at least is the line between "performance issues" and "no performance issues".

Here it is breaking a 30fps average on the GTX 1060, which is a 2-generation-old (4 years), $249-at-launch GPU. That's at medium quality and 1080p, which is more than reasonable for a mid-range GPU of that age.

What I believe also plays a part in all this: PC gamers realising that on their machine, no matter how much it cost, this game cannot hold 60 at native resolution with everything maxed out.

Aside from maybe this game, every multiplatform title is developed for 2013 consoles, which means that any current PC from 600 euro and up can run those games at 60fps minimum.
Because PCs have always been running the exact same console games but with higher resolution, frame rate and/or some extra effects, the PC gamer consensus is "anything below 60fps is unplayable!!! It hurts my eyes! It makes me ill!"

However, this does not apply to CP2077 because... yeah, even the 1500 euro GPUs cannot run this game at a stable 60.

I'd be pretty pissed as well to be honest. Without upscaling this game cannot even run at 30 fps lol.

Because their PCs in 2007 also could not run Crysis at native res 30 fps, they believe this game is some technical revolution.

I don't think you understand PC gamers. I would imagine very few of us are happy to not see high end hardware getting pushed to its limits, and beyond. That's exactly why Crysis was so celebrated in its day, and the same for Doom 3, Half-Life 2 and Far Cry before them.

We want games that will push the highest end hardware to its limits because that's what PC gaming is all about: the ability to customise your experience based on your own preferences, including pushing beyond the limits of console graphics if you have the money to do so.

But anyway, you base your argument above on a faulty premise of "native resolution". Firstly, what is native resolution? Someone with high end hardware and a 1440p monitor for example can run this game acceptably at maxed out settings without enabling DLSS. Drop that down to 1080p and you can do the same on modern mid range hardware.

And secondly, why would you? DLSS offers virtually indistinguishable image quality from native-resolution output while having a huge performance benefit, and it is available to every single gamer who is able to turn RT on in this game. When considering the game's performance, why would you artificially handicap it to force it to run at lower frame rates than necessary? It'd be a bit like saying Horizon Zero Dawn can't run very well on the PS4 Pro if you turned off CBR. The reasonable response would be, "yeah, but why would you do that? It does use CBR, and it looks and runs great".
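
For a rough sense of where that performance benefit comes from, here is some back-of-the-envelope arithmetic. The scale factors are the commonly quoted DLSS 2.x internal resolutions, not anything measured in this game, so treat the numbers as a sketch:

Code:
# Rough pixel-count arithmetic for DLSS upscaling to a 4K output.
# Assumed per-axis internal scale factors (commonly quoted for DLSS 2.x,
# not confirmed for Cyberpunk specifically): Quality ~2/3, Balanced ~0.58, Performance 0.5.
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}
out_w, out_h = 3840, 2160  # native 4K output

for name, scale in modes.items():
    w, h = round(out_w * scale), round(out_h * scale)
    ratio = (out_w * out_h) / (w * h)
    print(f"DLSS {name:11s} renders {w}x{h} internally: ~{ratio:.2f}x fewer pixels shaded than native 4K")

So the "handicap" being argued about is roughly a 2-4x difference in shaded pixels before the temporal upscale ever runs, which is why insisting on native-resolution numbers paints such a different picture of the game's performance.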
 
In Driveclub you could see the other cars reflected onto your own car. Outside of screen space even. So SSR was impossible. Maybe the headlights and a basic shape of the car were all it took to have believable reflections on your car.

Now, 6-7 years later, GT7 is going to do that as well, but with RT. It will probably be more accurate, but certainly more pixelated as well, especially now that, since the PS4 generation, GT is designed for a locked 60.

What I am saying is, people will look at GT7 and go: this is next gen, seeing the other cars reflected on your own car. Whereas that effect, or at least a very good approximation of it, was possible even in the previous generation, albeit at 30fps instead of 60.

That would make you wonder how good a DC2 would look on PS5 hardware, if they employ similar tricks instead of brute forcing it.

here is a video of the ground reflections which look simply perfect to me:

This is what they could achieve more than 6 years ago without RT
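
For what it's worth, here is the kind of trick that sentence is hinting at (my speculation, not Evolution Studios' actual implementation): render cheap proxy shapes of the nearby cars into a small dynamic environment/cube map each frame, then in the car shader just reflect the view ray and sample that map. The sampling side is only a few lines of math; a rough sketch in Python with made-up inputs:

Code:
import numpy as np

def reflect(view_dir, normal):
    # Standard mirror reflection of the view ray about the surface normal: r = v - 2(v.n)n
    v = np.asarray(view_dir, dtype=float)
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    return v - 2.0 * np.dot(v, n) * n

def cube_face_uv(direction):
    # Map a direction vector to a cube-map face plus [0,1] UVs (OpenGL cube-map convention).
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        face, u, v, ma = ('+x', -z, -y, ax) if x > 0 else ('-x', z, -y, ax)
    elif ay >= az:
        face, u, v, ma = ('+y', x, z, ay) if y > 0 else ('-y', x, -z, ay)
    else:
        face, u, v, ma = ('+z', x, -y, az) if z > 0 else ('-z', -x, -y, az)
    return face, 0.5 * (u / ma + 1.0), 0.5 * (v / ma + 1.0)

# Made-up example: a view ray hitting the car roof (normal pointing up) reflects upward,
# so the shader would sample the +y face of the per-car environment map.
r = reflect(view_dir=[0.0, -0.7, 0.7], normal=[0.0, 1.0, 0.0])
print(cube_face_uv(r))   # ('+y', 0.5, 1.0)

No ray tracing involved; the cost sits in refreshing that little environment map every frame, which is why low-detail proxies (headlights and a rough car shape) would be enough to sell the effect.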
 
In Driveclub you could see the other cars reflected onto your own car. Outside of screen space even. So SSR was impossible. Maybe the headlights and a basic shape of the car were all it took to have believable reflections on your car.

Now, 6-7 years later, GT7 is going to do that as well, but with RT. It will probably be more accurate, but certainly more pixelated as well, especially now that, since the PS4 generation, GT is designed for a locked 60.

What I am saying is, people will look at GT7 and go: this is next gen, seeing the other cars reflected on your own car. Whereas that effect, or at least a very good approximation of it, was possible even in the previous generation, albeit at 30fps instead of 60.

That would make you wonder how good a DC2 would look on PS5 hardware, if they employ similar tricks instead of brute forcing it.

here is a video of the ground reflections which look simply perfect to me:

This is what they could achieve more than 6 years ago without RT

Those are obviously Screen Space Reflections on the ground; I see objects disappear unnaturally, like here:
https://ibb.co/fq42dpv
 
Nice red herring...I adjust my tone to the "argumentation" I address.


So as long as you ignore that, but point at me...I will ignore your posts :)

That was my honest thought process:
1) I thought, is RT even turned on?
RT was running at the highest setting...
2) then I thought, well, it must be a crappy low-end PC then, because even the PS5 runs it smoother...
turns out it is the most expensive PC you can currently buy
3) then I thought, well, it must be running without DLSS then...
but the choppy frame rate was with DLSS turned on.

I wanted to give the PC the benefit of the doubt, but at least in that video, with the differences being so minimal outside of specific scenes, and the performance of the PC being so low (even lower than the PS5, which a lot of people would shit on as it is only 10 TF or something), my only logical reaction was to either ROFL or LMFAO.
I mean, people on PS5 would be playing this at higher frame rates than most self-proclaimed 'master racers' who, out of pride, only play at 4K ultra

Is the PC superior to PS5? In almost every way it is. Nobody will ever argue against that so in a way I am on your side
 
That was my honest thought process:
1) I thought, is RT even turned on?
RT was running at the highest setting...
2) then I thought, well, it must be a crappy low-end PC then, because even the PS5 runs it smoother...
turns out it is the most expensive PC you can currently buy
3) then I thought, well, it must be running without DLSS then...
but the choppy frame rate was with DLSS turned on.

I wanted to give the PC the benefit of the doubt, but at least in that video, with the differences being so minimal outside of specific scenes, and the performance of the PC being so low (even lower than the PS5, which a lot of people would shit on as it is only 10 TF or something), my only logical reaction was to either ROFL or LMFAO.
I mean, people on PS5 would be playing this at higher frame rates than most self-proclaimed 'master racers' who, out of pride, only play at 4K ultra

Is the PC superior to PS5? In almost every way it is. Nobody will ever argue against that so in a way I am on your side

But this is an apples-to-screwdrivers comparison, so if your intention isn't a simple troll then I'm not sure what point you're trying to prove?

A high end PC has worse performance than a PS5 when running the same game at massively higher settings and resolution.

So what?
 
Those are obviously Screen Space Reflections on the ground; I see objects disappear unnaturally, like here:
https://ibb.co/fq42dpv

Good catch. Honest question: imagine a hypothetical sequel. Should they use the generational increase in performance to render reflections that don't disappear near the edges, or should they use it for other stuff? What would have the most visual impact?
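
For anyone wondering why SSR behaves like that in the first place: the reflection ray is marched purely against the depth buffer, i.e. against what is already visible on screen, so as soon as the ray (or the thing it should be reflecting) falls outside the frame there is simply nothing to sample and the reflection has to fade out or pop. A toy sketch of the idea in Python over a fake depth buffer (just the concept, nothing engine-specific):

Code:
import numpy as np

def ssr_march(depth, origin, step, max_steps=64):
    # March a reflection ray in screen space against a 2D depth buffer.
    # 'origin' is the pixel being shaded, 'step' is (dx, dy, dz) per iteration.
    # Returns the hit pixel, or None when the ray exits the screen before
    # hitting anything -- which is exactly when SSR has to drop the reflection.
    h, w = depth.shape
    x, y = float(origin[0]), float(origin[1])
    z = float(depth[int(y), int(x)])
    for _ in range(max_steps):
        x, y, z = x + step[0], y + step[1], z + step[2]
        if not (0.0 <= x < w and 0.0 <= y < h):
            return None                      # left the screen: no data to reflect
        if depth[int(y), int(x)] < z - 1e-4:
            return int(x), int(y)            # ray passed behind a visible surface: hit
    return None

# Toy scene: flat ground everywhere, plus a closer "pillar" near the right edge of the screen.
depth = np.full((64, 64), 10.0)
depth[:, 60:] = 3.0

print(ssr_march(depth, origin=(32, 50), step=(1.0, -0.5, -0.1)))   # (60, 36): pillar found on screen
print(ssr_march(depth, origin=(32, 50), step=(-1.0, -0.5, -0.1)))  # None: ray marches off-screen

RT reflections don't have that failure mode because the ray can query geometry that was never rasterised on screen at all, which is really what the trade-off in the question above comes down to.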
 
But this is an apples-to-screwdrivers comparison, so if your intention isn't a simple troll then I'm not sure what point you're trying to prove?

A high end PC has worse performance than a PS5 when running the same game at massively higher settings and resolution.

So what?

Remember Battlefield 3 on PS3 compared to a high-end PC. No pausing or zooming of any video was required to see the difference.

or this:

 
In Driveclub you could see the other cars reflected onto your own car. Outside of screen space even. So SSR was impossible. Maybe the headlights and a basic shape of the car were all it took to have believable reflections on your car.

Now, 6-7 years later, GT7 is going to do that as well, but with RT. It will probably be more accurate, but certainly more pixelated as well, especially now that, since the PS4 generation, GT is designed for a locked 60.

What I am saying is, people will look at GT7 and go: this is next gen, seeing the other cars reflected on your own car. Whereas that effect, or at least a very good approximation of it, was possible even in the previous generation, albeit at 30fps instead of 60.

That would make you wonder how good a DC2 would look on PS5 hardware, if they employ similar tricks instead of brute forcing it.

here is a video of the ground reflections which look simply perfect to me:

This is what they could achieve more than 6 years ago without RT

Driveclub was built with magic, so not a fair comparison to any competitor.
 
You really do not have to pause Cyberpunk 2077 to see the difference that all the RT options bring 0_o

If you were to pause the video when the player got in the car with Dex, would you have been able to see the difference?

As for HLJ: if you were to run CP2077 at the PS4 settings you would get 200fps easily on your system, and that is without DLSS, so all is good!
 
I'm catching up on Death Stranding; the game looks awesome and the characters' faces are in a league of their own. The environment is empty and repetitive, so maybe that's why it looks so good, but yeah, it's a better looking game than Cyberpunk at first glance with zero RT (though with RT it would look even better).
 