Current Generation Games Analysis Technical Discussion [2024] [XBSX|S, PS5, PC]

I don’t think we can stipulate what it should cost to run the latest graphics because that is an undefined workload that varies significantly from game to game. Maybe reviewers should back off their obsession with ultra max settings.

What we can expect is that games should look good and run well on affordable cards, and for the most part they do.
I think we can have pretty set expectations here, based on the existence of similar, fixed-spec consoles that games are ultimately being built around.

Of course not every game will run the same, but in 2024 it should not cost $600+ to run a new game at a better-than-PS4/XB1 resolution just to have ray tracing and 60fps. This isn't because of some huge shift in relative demands over the years; it's because the $600 GPU people need now would have cost around $330 not so long ago. That's the real issue. People would be far more accepting of ray tracing if they could have it without sacrifices that essentially make it a sidegrade, because they can't also hold a level of resolution that most people already expected back in 2013.
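Put in plain numbers (both figures are the ones from the paragraph above):

```python
# The card needed for the same experience moved from ~$330 to ~$600+.
then_price, now_price = 330, 600
print(f"x{now_price / then_price:.2f}")  # x1.82, an ~82% price increase
# for roughly the same place in the product stack.
```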
 
I'm not talking about static images, unless you don't actually play games?



No, it's not; it's all part of the same discussion.



Yes, they do. Maybe pick one up and try it.



It does; the remaster's TAA is very blurry, even at 4K.

The original 2007 release with 4x MSAA + 4x TrSSAA absolutely trashes the remaster in terms of image quality.



*Blurrier.



They rely on that because they ideally need to be 1:1 pixel mapped with the display.

That's not a problem on a CRT.
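
To put the 1:1 point in toy numbers (the widths below are just examples, not from any benchmark): a fixed-pixel panel only gets a clean nearest-neighbour mapping when the scale factor is an integer, while a CRT just draws whatever scanlines it's fed.

```python
# Which render widths map cleanly onto a fixed 3840-pixel-wide panel?
# Example numbers only; a CRT has no fixed pixel grid, so none of this
# applies to it.
PANEL_W = 3840

for render_w in (1280, 1920, 2560, 3440):
    scale = PANEL_W / render_w
    note = ("integer scale, can stay sharp" if scale.is_integer()
            else "non-integer scale, interpolation blur")
    print(f"{render_w:>5} -> {PANEL_W}: x{scale:.2f} ({note})")
```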



You really should go and try a CRT.
I'm 40 years old. I grew up on nothing but CRTs.

You can't just pretend that static image quality doesn't matter. It does. And CRTs are naturally softer (or BLURRIER, as you insist) than LCDs in this respect. Not everybody is always playing Doom Eternal, in constant rapid motion basically the entire time. It's a tradeoff even at a given resolution, and besides, there are no real 4K CRTs in the first place.

And no, CRTs have no special technology that helps them deal with the kinds of aliasing we regularly deal with nowadays, especially complex shader aliasing.

Also, it's obviously very dishonest to suggest that a game running with 4x supersampling is running at 1080p. Come on now.
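
For anyone keeping score, here's the back-of-envelope arithmetic, assuming 4x means four samples per output pixel:

```python
# 4x supersampling at a 1080p output shades four samples per pixel,
# which is exactly the sample count of native 4K.
out_w, out_h = 1920, 1080
samples_per_pixel = 4

supersampled = out_w * out_h * samples_per_pixel
native_4k = 3840 * 2160
print(supersampled, native_4k)  # 8294400 8294400
```

Calling that "1080p" undersells the actual shading cost by a factor of four.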
 
[Chart: STALKER 2: Heart of Chornobyl, GeForce RTX desktop GPU performance at 2560x1440 with NVIDIA DLSS 3]


A 4090 is 1fps faster than a 4080 Super. Bang-up job!
 
That says laptop 4090.
Don't focus on the picture; focus on Sebbi's comment. The desktop 4090 also doesn't reach 120fps at native 1080p (only 85fps!). And the performance difference between the 4090 and 4080 is very small even at 4K, suggesting a CPU or engine bottleneck in the game.

 

The game scales terribly. The 4090 is just 49% faster than the 4070 when it's usually close to double the performance at 4K.
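
A quick way to frame it, using just the two ratios from the posts above (nothing else here is measured data):

```python
# How much of the 4090's usual advantage actually shows up here?
observed_gain = 1.49  # "just 49% faster" than the 4070 in this game
typical_gain = 2.0    # the usual ~2x gap over the 4070 at 4K

print(f"{observed_gain / typical_gain:.1%} of the expected scaling")
# ~74.5% -- when a much wider GPU leaves a quarter of its advantage on
# the table, the frame time is capped by something other than GPU
# throughput (CPU, engine, streaming, etc.).
```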
 
Lego Horizon launched with software Lumen only, yet it barely maintains 60fps at 4K DLSS Quality. Horizon Forbidden West runs measurably faster while also looking better.
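
For scale, 4K DLSS Quality is nowhere near shading native 4K; the Quality preset renders at 2/3 scale per axis before upscaling:

```python
# DLSS Quality uses a 2/3 render scale per axis, so "4K Quality" shades
# a 2560x1440 internal image, about 44% of native 4K's pixel count.
scale = 2 / 3
internal = (round(3840 * scale), round(2160 * scale))
print(internal)                                            # (2560, 1440)
print(f"{internal[0] * internal[1] / (3840 * 2160):.0%}")  # 44%
```

So "barely maintains 60fps at 4K DLSS Quality" really means barely 60fps at an internal 1440p.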



Looking better is debatable, as Forbidden West has a lot of obvious issues such as light leakage, which can be quite severe in places.

Lego Horizon looks way more coherent visually.
 