Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

Yeah, as frustrated as I am with the GPU pricing situation, I've struggled to decide whether to just get a PS5 instead, because the value proposition of consoles has gotten quite a bit worse nowadays too: raised game prices, the push for digital, the need for a paid subscription to play multiplayer, increasingly expensive controllers that are still as unreliable as ever, fewer actual console-exclusive titles, and no price drops over time (heck, prices are even going up!).

So this keeps me in the PC gaming sphere for the time being, even though I'm loath to actually upgrade just yet. The pricing of Lovelace GPUs in particular is one of the most scumbag things I've ever seen in this industry. Thankfully I'm very patient and have tons of existing games I can still play and enjoy for a good while yet. If I have to be super late to the party to get any kind of decent value, that's what I'll do.
I dunno, I think paid subscriptions only apply if you play non-F2P multiplayer games; it's not a mandatory add-on, that's for sure. People argue that the total cost of ownership is similar, but for me that's pretty much false. I guess it depends on the individual. Seeing as I paid $2000+ for my 4090 alone, the TCO will never be even close. On PC I use an Xbox Series controller. I have two of them and they seem to have lasted a while, since I play FPS games with mouse and keyboard. Even Steam sales, which people rave about, are mostly worse than sales on PSN; those happen more frequently and often have better discounts. The good thing on PC is that you have multiple marketplaces, but if you leave Steam, you lose your ability to get a refund.

All in all, I think both have their uses depending on the individual, but I don't think the TCOs are even remotely similar between the two. I'd argue they don't need to be, as PC offers more utility, ranging from games to productivity and beyond. As a result, it doesn't matter if it costs more.
 
Steam sales haven't been great in years. PSN or Xbox Live sales are often on par or better. Long gone are the days of a 1-year-old game going for $8 in a flash sale.

Still, despite the terrible value proposition of PC hardware, it's the only platform where you can play Starfield, BG3, Rift Apart, TLOU Part I, Forza Horizon/Motorsport, Flight Sim 2020, Total War, Cities: Skylines (for better or for worse), CS2, and a ton of actual PC exclusives on top of that. You'd need to get a Series X and a PS5 to match PC's catalog, and even then you'll fall short.
 

Obviously, when people talk about total cost of ownership being comparable, they're talking about a similar average performance level over time, not bleeding-edge performance on the PC.

On the games front, if you buy direct from Steam or any other major game store then you're always going to pay more. If you buy from online game key sites like CD Keys, though, then more often than not the PC version will be cheaper - although not by a massive amount these days, so the impact is limited.
 
That's fair, but if anyone were buying today to match consoles, I'd tell them to buy a 4070/7800 XT. While those GPUs are more powerful, the extra overhead is needed to account for bad ports. Those GPUs alone already cost more than a PS5 Digital and about the same as the Series X. Then you need to build the rest of the computer: CPU, mobo, case, RAM, PSU, etc.

Maybe there was a time when the TCO was lower. Nowadays, given the economic conditions, it's much more difficult. We don't seem to be getting those generational price/performance improvements in the mid-range to low end.
 

You don't need either of those GPUs for parity with consoles, bad ports or not. A 3060 Ti/4060 Ti/6700 XT will give you better performance in the vast majority of titles, very similar performance in a few, and worse in fewer titles than I can count on one hand.
 
The 3060 Ti 8GB is a solid no, and the same goes for the 4060 Ti 8GB. Even Nvidia realized this and released the 4060 Ti 16GB, and they're also rumored to be releasing a Super refresh of the lineup. The name of the game is longevity, and all three of those GPUs will age like milk against the consoles. We're just finally starting to see a steady stream of next-gen games being released. If I were a betting man, I would not bet on any of the GPUs you recommended. I consider that to be bad advice, no offence.
 

For higher-end PC gaming, sure, the VRAM will be a big limit. But for console-settings-level gameplay there are very few, if any, games where you would be unable to at least match the console performance with those GPUs, and pretty much the biggest compromise you would have to make in the ones where you can't would be to drop texture settings down a notch. Meanwhile, there are literally hundreds of other cross-platform games that will perform better on those GPUs. If that doesn't qualify for the definition of 'similar average performance', then I don't know what does.

Also, the 6700 XT doesn't have that VRAM limitation.
 
This isn't actually true if you're comparing like for like. Yes, you can suck down more power when shooting for the moon, but for example I've been playing Lies of P lately at max settings at 1440p with a frame cap of 90fps on a 4090 system, and the power draw for the GPU is 110W while the CPU tops out at 90W. Even including whatever else the rest of the system is using, it wouldn't be over 240W, and this is at settings slightly above console with a 50% higher framerate.
I bet under real load for such hardware, power consumption would be a little different ;) But OK, I get you - even a 100W difference does add up with today's electricity prices.
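
As a rough back-of-the-envelope sketch of what an extra 100W actually costs over a year (the electricity price and daily gaming hours below are assumptions, not anyone's real numbers):

```python
# Rough yearly cost of an extra 100 W of draw while gaming.
# The rate and hours are assumed figures, purely for illustration.
extra_watts = 100
price_per_kwh = 0.30      # assumed electricity price, $/kWh
hours_per_day = 3         # assumed gaming time per day

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
# ~110 kWh/year -> roughly $33/year under these assumptions
```

Not nothing, but it scales linearly with how many hours you actually game and what you pay per kWh.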

For higher-end PC gaming, sure, the VRAM will be a big limit. But for console-settings-level gameplay there are very few, if any, games where you would be unable to at least match the console performance with those GPUs, and pretty much the biggest compromise you would have to make in the ones where you can't would be to drop texture settings down a notch. Meanwhile, there are literally hundreds of other cross-platform games that will perform better on those GPUs. If that doesn't qualify for the definition of 'similar average performance', then I don't know what does.

Also, the 6700 XT doesn't have that VRAM limitation.

Oh come on, like we didn't learn generation after generation what happens with GPUs that early on appeared "similar" in performance but had a fraction of the consoles' RAM. Heck, those cards already have VRAM problems with cross-gen games and remasters. I think this generation the VRAM pressure on such GPUs will be even greater, for two reasons:

1. Games with native RT need to allocate more VRAM for the BVH.
2. Console SSD solutions let VRAM be used where it's truly needed instead of playing nanny for a laptop HDD. If you look at the DF Insomniac Spider-Man 2 interview, for example, they stream things like BVH data or fighting animations from the SSD just in time for the next frame. I bet almost nobody on PC would want to take such a risky approach and wait for unforeseen issues; they'd rather just dump it all into VRAM and/or RAM, further increasing the VRAM footprint.
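
To illustrate the trade-off being described (this is purely a hypothetical sketch; the names and structure are invented and have nothing to do with Insomniac's actual engine), the difference is roughly "keep everything resident" versus "only keep what the next frame needs":

```python
# Hypothetical sketch of the two residency strategies discussed above.
# AssetCache, preload_everything, etc. are invented names for illustration.

class AssetCache:
    def __init__(self):
        self.resident_mb = {}  # asset id -> size in MB

    def preload_everything(self, assets_mb):
        # The "just dump it to VRAM/RAM" approach: simple and hitch-free,
        # but the footprint is the sum of everything that *might* be needed.
        self.resident_mb = dict(assets_mb)
        return sum(self.resident_mb.values())

    def stream_just_in_time(self, assets_mb, needed_next_frame):
        # The console-style approach: only next frame's data stays resident,
        # the rest lives on the SSD. Much smaller footprint, but a late read
        # means a visible hitch - the risk most PC ports would rather avoid.
        self.resident_mb = {a: assets_mb[a] for a in needed_next_frame}
        return sum(self.resident_mb.values())


assets = {"bvh_city": 900, "anim_fight": 300, "textures_far": 2500}
cache = AssetCache()
print(cache.preload_everything(assets))                 # 3700 MB resident
print(cache.stream_just_in_time(assets, ["bvh_city"]))  # 900 MB resident
```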
 
Oh come on, like we didn't learn generation after generation what happens with GPUs that early on appeared "similar" in performance but had a fraction of the consoles' RAM.

To be fair, we're 3 years into this gen; it's not as if we're having this convo in late 2020. I certainly think 8GB is not going to age well for any card over $250, but even 3 years in it's still a relative handful of games that will exhaust it at console-equivalent settings, and many would still provide superior image quality with DLSS (and just stuff like 16x aniso!).

There is definitely a frustrating gap in the lineup between the $300 4060/7600 XT and the 7800 XT/4070, and the sticking point is indeed that VRAM. But if, as BitByte said, you're going to 'need' a 7800 XT/4070 to be console-equivalent because of a handful of 'poor ports' (and the ones that are subpar are not necessarily subpar due to rendering load anyway), then it's just as fair to note that the 8GB of a $300 4060 is also only going to be truly restrictive in a minority of titles, at least to the point where you're significantly lowering graphical detail below the console versions. I mean, for a 4070/7800 XT, with extremely GPU-hungry titles like TLOU you'll still get superior image quality/performance, but then there's the vast majority of titles that deliver significantly better performance on those cards too. That doesn't make a $500 GPU a 'bargain' of course, but the gap is large more often than not.

I'm not letting Nvidia/AMD off the hook on the 8GB cards, mind you, especially Nvidia - as you note, the features it hypes, frame gen and RT, require more VRAM! But let's also rein in expectations here: 8GB does not mean these cards are useless or that all games will be showing TLOU1 pre-patch texture quality. If PS5/Series X owners can stomach sub-900p native starting resolutions for UE5 titles, PC gamers can accept some compromises in the sub-$400 category too. Yeah, historically the fact that we have to make these compromises at all is shit, but those GPUs can still deliver a decent experience.

It should also be noted that while the rendering load of UE5 games is quite demanding, one area it has actually been quite good at managing is VRAM. Every UE5 game I've tried barely touches 8GB on my 4060 at 4K with DLSS. Maybe this will change in the future, but so far UE5 seems quite adaptable on VRAM.
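
For what it's worth, anyone who wants to sanity-check this kind of claim on their own card can poll the driver's allocation counter; here's a minimal sketch using the NVML Python bindings (note this reports allocated VRAM, not what the game strictly needs, and it obviously only covers NVIDIA cards):

```python
# Minimal sketch: poll VRAM allocation on an NVIDIA GPU via NVML.
# Requires the pynvml package (pip install pynvml). Reports allocation,
# which is an upper bound on what the game actually needs.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # GPU 0
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM: {mem.used / 1024**2:.0f} / {mem.total / 1024**2:.0f} MB")
        time.sleep(5)
finally:
    pynvml.nvmlShutdown()
```

Run it alongside the game and watch whether the number pushes up against the 8GB ceiling at the settings you care about.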

If you look at the DF Insomniac Spider-Man 2 interview, for example, they stream things like BVH data or fighting animations from the SSD just in time for the next frame. I bet almost nobody on PC would want to take such a risky approach and wait for unforeseen issues; they'd rather just dump it all into VRAM and/or RAM, further increasing the VRAM footprint.

Spider-Man 2 will not be on PC for at least 2 years though. Going by the VRAM demands of just Spider-Man Remastered, yeah - I'd say when it comes, it will very likely suffer on 8GB cards.
 
To be fair, we're 3 years into this gen; it's not as if we're having this convo in late 2020. I certainly think 8GB is not going to age well for any card over $250, but even 3 years in it's still a relative handful of games that will exhaust it at console-equivalent settings, and many would still provide superior image quality with DLSS (and just stuff like 16x aniso!)....
We are 3 years into this generation and we already have titles that go sub-1080p (native). Yes, there have always been such titles that (technically) totally underdelivered, but as game production gets more and more complicated, I guess we will see such things more often.
And yes, there are and will be games like Spider-Man 2 which definitely show what the hardware is capable of, if the game gets optimized for it. But I must say that while the game looks really good, the jump it made from the last-gen entries seems really small. 8x the power of the last generation (well, not 8x the bandwidth) gets used up really fast. Just think what will happen if the next generation again "only" has 8x the power...
The steps really get smaller and smaller. But on the other hand, I'm still OK with the graphics standard of most games since the late PS360 generation; they don't look as rough on big TVs as PS1/PS2 games do (as long as the gameplay holds up).
 
I feel that debating the nuances of the 8GB issue actually underscores the overarching issue. Three years in with previous generations, we had zero-compromise, zero-worry console+ GPUs (really console+++; console+, as you say, already existed a year into last gen) at sub-console prices.

That was great for consumers, but on the flip side it was bad for the companies, as it lowered the incentive to upgrade (and again, I would not be surprised if there was some market segmentation planning with this in mind). They certainly don't want a situation in which everyone feels there is no need to upgrade from Pascal (or whatever) until well into the next-gen consoles.

Especially with the gen-on-gen cost slowdown, raw perf/$ gains going forward are likely going to be much less enticing as an upgrade driver compared to gens past. The need to upgrade is going to have to come from things like feature set, VRAM, or clearing performance thresholds.
 
So it appears that the specs put out by Remedy do not align with the actual benchmarks? TechPowerUp seems to suggest that this is the case.
[Attached image: TechPowerUp benchmark chart]
 
WCCFTECH tested Alan Wake 2 with path tracing at 4K with DLSS/FSR Quality.

The 4090 does 82fps here, making it 2.7x faster than the 7900 XTX; the 4080 is also 2x faster, and even the 4060 Ti is faster, which might explain why it was omitted from the requirements.

Activating DLSS Ray Reconstruction also adds 10% more performance to NVIDIA GPUs.

 
I read it still uses the upscaler even when you don't lower the rendering resolution.

Looking at these ray tracing benchmarks, this performs really well. Direct lights and reflections are the hardest part, and you get over 60 FPS at native 1440p on a 4090. UE5 can't even produce the same image quality. GI isn't even worth it when you don't have a day/night cycle; it makes more sense to use a faster solution and save ray tracing for the parts of the image which can't be replicated by other approaches.
 
The image quality for AW2 on consoles is very bad imo. There's way too much shimmering and graining, and the shadow quality is also quite bad; it's hugely detrimental to the visuals. I guess FSR 2 is to blame, but it's not good at all.
 

This is Quality mode:

[Attached image: Alan Wake 2 console Quality mode screenshot]

Man alive. It almost looks like they're not doing post-processing after reconstruction, or FSR2 is just this bad in spots. For an internal resolution of 1270p, though, that's worse than most FSR2 tests I've done; I'm wondering what is going on there. It looks like it's not working with specular highlights at all: you've got highlights blinking in and out of existence, like you would see with a DLSS mod for a game that never had reconstruction.

I suspect if it's this bad on consoles, even with DLSS Quality, the PC version will also show higher than normal artifacting in these spots too. It really shouldn't be this bad.
 
FWIW, TechPowerUp's conclusion supports this theory.

"The settings menu of Alan Wake 2 has a long list of options for performance tuning, but there will be a ton of drama around the forced upscaling. Yup, you can't render at native. There's only options for "DLSS" and "FSR"—nothing else. Both upscalers come with the option to render at native resolution, but will still use the image enhancement techniques of the upscaler. Not sure why Remedy made such a choice—it will just antagonize players. What makes things worse is that the sharpening filter that's part of both upscalers is disabled, but can be enabled manually with a config file edit. Still, we want native, just as an option, even if it comes with a performance hit. The good thing is that you can enable this in settings manually, by changing m_eSSAAMethod to 0 or 1. During gaming, even with DLAA enabled, and Motion Blur and Film Grain disabled, I noticed that at sub-4K resolutions the game looks quite blurry, like there was a hidden upscaler at work. I played with all the settings options—no improvement. After digging through the config file I noticed that there's several important settings that aren't exposed in the settings menu, no idea why. Once I set m_bVignette, m_bDepthOfField and m_bLensDistortion to "false," the game suddenly looked much clearer. If you plan on playing Alan Wake 2 definitely make those INI tweaks manually. You can also change the field of view here (m_fFieldOfViewMultiplier). I found the default too narrow and prefer to play with a setting of 1.3."

 