Current Generation Games Analysis Technical Discussion [2024] [XBSX|S, PS5, PC]

I don’t think we can stipulate what it should cost to run the latest graphics because that is an undefined workload that varies significantly from game to game. Maybe reviewers should back off their obsession with ultra max settings.

What we can expect is that games should look good and run well on affordable cards and they do for the most part.
I think we can have pretty set expectations here based on the existence of similar, fixed-spec consoles that games are ultimately being built around.

Of course not every game will run the same, but in the year 2024, it should not cost $600+ to run some new game at a >PS4/XB1 level of resolution just to have ray tracing and 60fps. This isn't because of some huge shift in relative demands over the years, it's because that $600 GPU people need now would have only cost like $330 not so long ago. That's the real issue. People would be way more glad and accepting of ray tracing if they could have it without making sacrifices that essentially turn it into a sidegrade, because they can't run a level of resolution that most people already expected back in 2013.
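Quick back-of-envelope on that, since people always bring up inflation (the ~30% cumulative figure below is my own rough assumption, not official CPI data):

```python
# Rough sketch: does inflation alone explain the jump from ~$330 to $600?
# Assumes ~30% cumulative US inflation from the mid-2010s to 2024
# (my own approximation for illustration, not official CPI data).
old_price = 330        # what this GPU tier cost not so long ago
inflation = 1.30       # assumed cumulative inflation multiplier
current_price = 600    # what the equivalent tier costs now

adjusted = old_price * inflation
print(f"${old_price} then is roughly ${adjusted:.0f} in 2024 dollars")
print(f"asking price now: ${current_price}, i.e. {current_price / adjusted:.2f}x over inflation")
```

Even with generous inflation assumptions, the current price lands well above the adjusted figure.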
 
I'm not talking about static images, unless you don't actually play games?



No it's not, it's all part of the same discussion.



Yes they do, maybe pick one up and try it.



It does, the remaster's TAA is very blurry, even at 4K.

The original 2007 release with 4xMSAA+4xTrSSAA absolutely trashes the remake in terms of image quality.



*Blurrier.



They rely on that because they ideally need to be 1:1 pixel-mapped with the display.

That's not a problem on a CRT.
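To make the 1:1 mapping point concrete, here's a minimal sketch with a hypothetical 4K panel; the resolutions are just illustrative:

```python
# Minimal sketch: a fixed-pixel panel has a hard native grid, so any
# render resolution that doesn't divide it evenly gets resampled and
# smeared across pixel boundaries. A CRT has no fixed grid, so the
# 1:1 mapping requirement simply doesn't exist there.
PANEL = (3840, 2160)  # hypothetical 4K LCD's physical pixel grid

def scale_factor(render_res, panel=PANEL):
    rx, ry = panel[0] / render_res[0], panel[1] / render_res[1]
    return rx, ry, rx.is_integer() and ry.is_integer()

for res in [(3840, 2160), (1920, 1080), (2560, 1440)]:
    rx, _, integer = scale_factor(res)
    verdict = "1:1/integer scale, stays sharp" if integer else "non-integer scale, blurs"
    print(f"{res[0]}x{res[1]} -> {rx:.2f}x: {verdict}")
```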



You really should go and try a CRT.
I'm 40 years old. I grew up on nothing but CRTs.

You can't just pretend that static image quality doesn't matter. It does. And CRTs are naturally softer (or BLURRIER, as you insist) than LCDs in this respect. Not everybody is always playing Doom Eternal, where you're in constant rapid motion basically the entire time. It's a tradeoff even at a given resolution, plus there are no real 4K CRTs in the first place.

And no, CRTs have no special technology that helps them deal with the kinds of aliasing we regularly deal with nowadays, especially complex shader aliasing.

Also, it's obviously very dishonest to suggest that a game running with 4x supersampling is running at 1080p. Come on now.
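The arithmetic makes it obvious; a quick sketch (plain sample counting, nothing engine-specific):

```python
# Quick sample-count arithmetic: 4x supersampling at 1080p shades the
# same number of samples as native 4K, so calling it "1080p" undersells it.
base = 1920 * 1080   # samples shaded at plain 1080p
ssaa = base * 4      # 4x supersampling: four shaded samples per pixel
uhd = 3840 * 2160    # pixel count of native 4K, for comparison

print(f"plain 1080p:     {base:>10,} samples")
print(f"1080p + 4x SSAA: {ssaa:>10,} samples")
print(f"native 4K:       {uhd:>10,} pixels")
assert ssaa == uhd   # identical shading workload to native 4K
```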
 
[Image: STALKER 2: Heart of Chornobyl, GeForce RTX desktop GPU performance at 2560x1440 with NVIDIA DLSS 3]


A 4090 is 1fps faster than a 4080 Super. Bang-up job!
 
That says laptop 4090.
Don't focus on the picture, focus on Sebbi's comment. The desktop 4090 also doesn't reach 120fps at native 1080p (only 85fps!). And the difference between the 4090 and 4080 in performance is very small even at 4K, suggesting a CPU or engine bottleneck in the game.

 
Don't focus on the picture, focus on Sebbi's comment. The desktop 4090 also doesn't reach 120fps at native 1080p. And the difference between the 4090 and 4080 in performance is very small even at 4K, suggesting a CPU or engine bottleneck in the game.

The game scales terribly. The 4090 is just 49% faster than the 4070 when it's usually close to double the performance at 4K.
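Rough sketch of that scaling argument (the 2.0x expected ratio is a ballpark assumption, not a measured figure):

```python
# Back-of-envelope for the scaling claim. The 2.0x "expected" ratio is a
# ballpark assumption for 4090 vs 4070 at 4K, not a measured figure.
expected_ratio = 2.00   # rough typical 4090-over-4070 gap at 4K
observed_ratio = 1.49   # the ~49% gap quoted above

efficiency = observed_ratio / expected_ratio
print(f"GPU scaling efficiency: {efficiency:.0%}")  # well under 100%

if efficiency < 0.9:
    print("top cards fall far short of their usual gap -> likely CPU/engine-bound")
```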
 
According to an unnamed source, S2 runs amazingly well on the Series X. There's actually a ton of console-specific code.
 
Lego Horizon launched with software Lumen only, yet it barely maintains 60fps at 4K DLSS Quality. Horizon Forbidden West runs measurably faster while also looking better.



Looking better is debatable, as the game has a lot of obvious issues, such as light leakage, which can be quite severe in places.

Lego Horizon looks way more coherent visually.
 
Lego Horizon launched with software Lumen only, yet it barely maintains 60fps at 4K DLSS Quality. Horizon Forbidden West runs measurably faster while also looking better.


As usual, ray tracing isn't the be-all, end-all. Eschewing ray tracing frees up a lot of overhead that lets you push other areas quite strongly instead.

Obviously ray tracing is generally superior, all else being equal, but all else is not equal. There's an opportunity cost for it when designing your visuals with it as a base feature.

Looking better is debatable, as the game has a lot of obvious issues, such as light leakage, which can be quite severe in places.

Lego Horizon looks way more coherent visually.
This place may disagree, cuz this place is very weird about this stuff, but I guarantee like 19 out of 20 ordinary gamers would say Forbidden West is so very clearly the better looking game.
 
You literally started out the argument saying that it was 'debatable' that Forbidden West looks better.

Which it is, as it's a subjective topic.

Not sure why you don't understand that.

And where's this argument you claimed I started? I don't see people arguing, only people discussing or debating.

Which very much implied you didn't agree.

So highlighting the obvious implied that I don't agree?
 
I think we can have pretty set expectations here based on the existence of similar, fixed-spec consoles that games are ultimately being built around.

We can, and those expectations shift as PC GPUs advance over time vs. a fixed console baseline.

Of course not every game will run the same, but in the year 2024, it should not cost $600+ to run some new game at a >PS4/XB1 level of resolution just to have ray tracing and 60fps. This isn't because of some huge shift in relative demands over the years, it's because that $600 GPU people need now would have only cost like $330 not so long ago. That's the real issue. People would be way more glad and accepting of ray tracing if they could have it without making sacrifices that essentially turn it into a sidegrade, because they can't run a level of resolution that most people already expected back in 2013.

I don’t know how you determine the “fair” price level for ray tracing at a given resolution. The consoles certainly aren’t giving us ray tracing on the cheap. A more useful metric would be graphics card manufacturer margins. If Nvidia is making a ton more profit on a $300 4060 than they did on a $300 1060, then we would have some objective reason to complain.
 
Sure, it sounds like it performs kind of slow even all things considered, but the Lego game doesn't look unambitious to me. Not sure where you guys are getting "it should run on low end" from; I haven't seen a game with lighting, fog, reflections, etc. like that run well on low-end devices at high resolutions.

Were those high-end graphical choices right for a Lego game? Well, I don't know, but this is a technical forum.
 
I never said Lego Horizon was a better looking game, I said it looks more coherent.
It’s a gorgeous game. It has an animated-movie-like quality; HFW hasn’t broken through to that barrier despite being geometrically more complex.

I would agree that it can indeed be considered the better looking game. It may come down to what you value, of course. But what I’ve seen of it reminds me a lot of the many Nvidia tech demos out there.
 
Which it is, as it's a subjective topic.

Not sure why you don't understand that.

And where's this argument you claimed I started? I don't see people arguing, only people discussing or debating.



So highlighting the obvious implied that I don't agree?
You, by responding to that other person with a disagreement, started an argument about what looks better. :/

And yes, it's all subjective, and I'm sure you'd have no problems if I said that Half Life 1 was a better looking game than Cyberpunk 2077, no siree. I'd definitely expect you to just swallow it and nod because it's 'subjective'. smh

All these Lego games have typically had some nice graphics to them, but again, if you asked 100 people, almost all of them would still say Forbidden West looks better, and I think it's pretty obvious why.

We can, and those expectations shift as PC GPUs advance over time vs. a fixed console baseline.



I don’t know how you determine the “fair” price level for ray tracing at a given resolution. The consoles certainly aren’t giving us ray tracing on the cheap. A more useful metric would be graphics card manufacturer margins. If Nvidia is making a ton more profit on a $300 4060 than they did on a $300 1060, then we would have some objective reason to complain.
I've already been banned here for trying to argue anything pricing-related, by somebody who insisted that there's literally no relationship between what it costs to make something and what it costs us, none whatsoever, and who made it clear I'm not allowed to disagree or make any common-sense arguments against it or its obvious relation to value expectations and whatnot with modern hardware.

I'd have to make my arguments based around these things, and it's been made clear to me I'm simply not allowed to.
 