Well, regarding CP2077 and "there's a reason it looks like a PS3/PS4 game": as a matter of fact, CP2077 is a PS4 game.
Well, a 720p, sub-30fps PS4 game with reduced settings, yes.
And by similar logic, Demon's Souls is a PS3 game. I would never describe it as such, because it would be entirely unreasonable to do so, but what you're doing above isn't much better than that. It highlights the fact that just because the same game can run on an older platform (in this case two console generations older) does not mean that the best version of that game should be considered technically or visually equivalent to the worst version.
Nvidia partnered with them to put their version of RT in the game first.
Cyberpunk isn't an "RT game", it's an "RTX game".
If they really wanted to make their game as good as possible, they should have worked on a version that would run on all RT-enabled hardware.
Still, I don't understand what people see as so next-gen in this game; removing the RT differential, it fails to meet expectations even for the past gen.
Cyberpunk started development 7 years ago. Nvidia released RT hardware 4 years ago. AMD released RT hardware 4 weeks ago. I trust you get my point?
Almost no one has said there is no difference or that RTX on isn't better. We just don't all consider the final visual result a generational improvement.
If the difference between Cyberpunk ultra and ultra+RT were equivalent to the level of visual improvement achieved in Sony first-party PS5 exclusives, do you think people's views would really change?
I don't think anyone's really saying that the difference between RT on and off is the generational difference. Potentially transformative to the graphics perhaps, but on its own, not a generational difference. It's the entire package. The base game is gorgeous even by the standards of games that aren't massively dense open worlds, and exceptionally so for that genre. And when you add in the RT, then some people, including myself, are arguing that it's the first game that represents a true "taste" of what the next generation of games will look like.
Obviously this is entirely subjective so no-one can be right or wrong here, but to my eyes, those videos I posted earlier in the thread, for example, are a clear step above everything else I've seen to date (although as some have already mentioned there are games which can compete in some ways, like RDR2 or FS2020). Of course it will be surpassed soon enough. But just as the likes of Killzone Shadow Fall were considered a clear step above what had come before (at least in the console space) when the PS4 launched, with the full knowledge that it would be rapidly exceeded, I don't see why CP2077 can't be shown the same consideration.
Something I consider an example of generational improvement.
Yes but that's a best to best comparison. Cyberpunk is really a best to worst comparison. i.e. it's naturally being compared to the best of the outgoing generation, but being the (claimed) first visually "next gen" game, it's going to be sitting towards the bottom of the pile at the end of the gen. That doesn't mean it can't still be considered representative of next gen graphics though. Otherwise we'd have to say there is no such thing as next gen graphics until the very best examples are launched, several years into the console cycle. Some may well say that of course, and they wouldn't necessarily be wrong, because as I said, this is entirely subjective.
It's 44%
https://comicbook.com/gaming/news/the-witcher-3-sales-platform-breakdown-which-system-sold-most/#:~:text=Another interesting takeaway is that,while the Switch has 1%.
Let's not pretend that a lot of PC owners are also struggling to get good performance from this game even on Nvidia's previous generation graphics cards. Wander over to the Cyberpunk thread in the console forum which is crammed with PC owners swapping tips and hex edits to get better performance.
In fairness, the game runs surprisingly well on older PC hardware without RT. The threading issue affecting Ryzen processors with 6 or fewer cores has been patched now, but even then I don't believe it was resulting in sub-30fps gameplay, which for me at least is the line between "performance issues" and "no performance issues".
Here it is breaking a 30fps average on the GTX 1060, a two-generation-old (4-year-old) GPU that launched at $249. That's at medium quality and 1080p, which is more than reasonable for a mid-range GPU of that age.
What I believe also plays a part in all this: PC gamers realising that on their machine, no matter how much it cost, this game cannot hold 60fps at native resolution with everything maxed out.
Aside from maybe this game, every multiplatform title is developed for 2013 consoles, which means that any current PC from 600 euros and up can run those games at 60fps minimum.
Because PCs have always run the exact same console games, just with higher resolution, frame rate and/or some extra effects, the PC gamer consensus is "anything below 60fps is unplayable!!! It hurts my eyes! It makes me ill!"
However, this does not apply to CP2077 because... yeah, even the 1500 euro GPUs cannot run this game at a stable 60.
I'd be pretty pissed as well, to be honest. Without upscaling this game cannot even run at 30 fps lol.
Because their PCs in 2007 also could not run Crysis at native res and 30 fps, they believe this game is some technical revolution.
I don't think you understand PC gamers. I would imagine very few of us are happy not to see high-end hardware getting pushed to its limits, and beyond. That's exactly why Crysis was so celebrated in its day, and the same goes for Doom 3, Half-Life 2 and Far Cry before it.
We want games that will push the highest-end hardware to its limits, because that's what PC gaming is all about: the ability to customise your experience based on your own preferences, including pushing beyond the limits of console graphics if you have the money to do so.
But anyway, you base your argument above on the faulty premise of "native resolution". Firstly, what is native resolution? Someone with high-end hardware and a 1440p monitor, for example, can run this game acceptably at maxed-out settings without enabling DLSS. Drop that down to 1080p and you can do the same on modern mid-range hardware.
And secondly, why would you? DLSS offers image quality virtually indistinguishable from a native-resolution output while providing a huge performance benefit, and it's available to every single gamer who is able to turn RT on in this game. When considering the game's performance, why would you artificially handicap it and force it to run at lower frame rates than necessary? It'd be a bit like saying Horizon Zero Dawn can't run very well on the PS4 Pro if you turned off CBR. The reasonable response would be, "yeah, but why would you do that? It does use CBR, and it looks and runs great".