Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

An RTX 3090 is incapable of doing current day RT at native 4k/60fps without having to resort to something like DLSS, and that is with current gen games.

In what world does not being able to run games at native 4k/60fps mean "not being fast enough"? Do you consider current generation consoles to have sufficient performance? Remind me how many games run at native 4k/60fps again?

And having to resort to 4K DLSS Quality makes a game unplayable in your estimation?

Heck, it can't even lock CP2077 or Dying Light 2 to 60fps at native 1440p with full RT on, so your claim that it's demonstrably untrue is false.

So having to use DLSS Balanced at 1440p/60fps = unplayable to you?

Still nowhere near as niche as an RTX 3090 is.

But why the fixation on the RTX 3090? A 3080 would be around 2.6x faster than the PS5 in RT-enabled games. A 3070 would be around 2x faster. How much faster were the 780 or the 770 over the PS4? According to TPU, 1.92x and 1.63x faster respectively.
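Putting those multipliers side by side (a rough sketch; the Ampere numbers are the ballpark RT-enabled estimates above and the Kepler ones are TPU's relative-performance figures, so treat them all as approximations):

# Rough side-by-side of the single-GPU advantage over the launch console, per generation.
last_gen = {"GTX 780 vs PS4": 1.92, "GTX 770 vs PS4": 1.63}
this_gen = {"RTX 3080 vs PS5 (RT)": 2.6, "RTX 3070 vs PS5 (RT)": 2.0}

for label, multiplier in {**last_gen, **this_gen}.items():
    print(f"{label}: {multiplier:.2f}x faster")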

Err... no it's not, it's nowhere close to being that.

I think you took my quote a little out of context there. I was simply referencing your original argument that PC gamers are expecting too much of a performance advantage this generation because they were expecting a performance advantage similar to last generation's, which you argue was greater in GPU terms, and I argue was not.
 
It's probably somewhere between the two, as the PS4 was 1.84TF vs the 7850's 1.76TF and the 265's 1.89TF, and both of those PC GPUs are GCN 1.0 whereas the PS4 was GCN 1.1. It doesn't change the equation anyway, as the 265 is only 9% faster than the 7850 according to TPU.
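If anyone wants to sanity check those TF figures, they fall out of the usual 2 ops/clock x shader count x clock formula; the shader counts and clocks below are the commonly quoted specs, so treat them as assumptions:

# Sanity check of the quoted FP32 figures: TFLOPS = shaders * clock (GHz) * 2 ops/clock / 1000.
def tflops(shaders, clock_ghz, ops_per_clock=2):
    return shaders * clock_ghz * ops_per_clock / 1000.0

print(f"PS4 GPU : {tflops(1152, 0.800):.2f} TF")  # 18 CUs @ 800 MHz -> ~1.84 TF
print(f"HD 7850 : {tflops(1024, 0.860):.2f} TF")  # 16 CUs @ 860 MHz -> ~1.76 TF
print(f"R7 265  : {tflops(1024, 0.925):.2f} TF")  # 16 CUs @ 925 MHz -> ~1.89 TF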



That's an ironic statement considering the demo that spawned this conversation makes great use of hardware RT. Would you buy a new GPU today with no hardware RT capability?



That's hardly an apples to apples comparison. From a technology perspective, the PS5 GPU is aging more quickly than the PS4 GPU did. If you need to compare the performance of two GPUs to one GPU in order to change that result, then it kind of proves the point. The point which spawned this discussion was your assertion that PC gamers were used to getting a bigger performance advantage over consoles last gen than they're getting this gen, and that there's thus a placebo-like effect causing people to complain about PC performance not being good enough. So are you now saying that this general PC performance expectation was based solely on SLI setups? Was the average PC gamer running an SLI setup last gen?
I would say using SLI is fair as the price of multiple GPUs then was cheaper than a single GPU now. At PS4 launch, a 780ti was 2.5x faster on average over a wealth of games covering a myriad of scenarios. At PS5 launch a 3090 is 2x faster using the same criteria and 3x faster if you limit the comparison to specific implementations of specific effects. That is not the same.
 
I would say using SLI is fair as the price of multiple GPUs then was cheaper than a single GPU now. At PS4 launch, a 780ti was 2.5x faster on average over a wealth of games covering a myriad of scenarios. At PS5 launch a 3090 is 2x faster using the same criteria and 3x faster if you limit the comparison to specific implementations of specific effects. That is not the same.
Given the example of dual Radeon 7950's, I think the MSRP on most of those was about $500. So two of them was about $1,000, which would put you at about an RTX 3080 right now, unless you adjust for inflation. In which case, are we normalizing to an average inflation rate, or something like the price of gasoline, or the price of anything at Dollar Tree?
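As a back-of-envelope illustration of that inflation point (the ~25% cumulative CPI figure for 2013 to 2022 is an assumption on my part, and the MSRPs are launch prices rather than street prices):

# Hypothetical normalization of "two 7950s then" vs "one GPU now".
CPI_2013_TO_2022 = 1.25            # assumed ~25% cumulative US inflation
dual_7950_2013_dollars = 2 * 500   # two cards at the ~$500 quoted above

adjusted = dual_7950_2013_dollars * CPI_2013_TO_2022
print(f"Dual 7950 in 2022 dollars: ~${adjusted:.0f}")
print("For reference: RTX 3080 MSRP $699, RTX 3090 MSRP $1,499")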
 
At PS5 launch a 3090 is 2x faster using the same criteria and 3x faster if you limit the comparison to specific implementations of specific effects. That is not the same.

Even larger difference when factoring in reconstruction tech (DLSS).
Why would tessellation, streaming etc. not matter anymore btw?

100% yes. In fact, as I've stated multiple times before, I'd run most games that have RT with it disabled even if I had an RTX 3080 or 3090, just because the RT isn't yet good enough (IMO) to justify the performance hit or the IQ hit (DLSS) needed to run it at an acceptable (to me) framerate.

There are only a very few games where DLSS Quality doesn't result in (to me) unacceptable hits to image quality. Keep in mind that those hits aren't necessarily things that other people feel are bad or that they will notice during gameplay, so don't take this as me saying that DLSS is bad. Anything below DLSS Quality is basically worthless to me.

Also, I hope that people don't take this to mean that I'm against RT. I'm really looking forward to RT eventually being usable by me at better quality than what current hardware is capable of. Well, that assumes that GPUs capable of that aren't priced into the stratosphere where I can't justify the cost. :p

Just like how I loved the idea of good quality shadows in games, but up until just a few years ago games rarely had shadows good enough (too many artifacts, or weird looking, or distracting) to be worth taking the performance hit to enable.

I'd love to have a new GPU better than my GTX 1070 for the general improvements in rasterization and compute. RT is just a bonus doodad that I can look at and say ooh and then promptly disable so that I can play the game. Basically what I do when I occasionally play games on a friend's PC with a 12900KS and an RTX 3090Ti.

Said friend's PC makes me REALLY want an HDMI 2.1 graphics card ASAP for locked 120 Hz gaming. 60 Hz unfortunately now feels to me like 30 Hz used to, and 30 Hz is 100% unplayable now and ugly as all hell to look at. Luckily moving up to 240 Hz on his setup doesn't have the same effect (120 Hz still looks and feels good to me), so I'm fairly confident that 120 Hz will be good enough for me until the day I die. :p

If I had to choose between 120 Hz or RT? 120 Hz all the time, every time.

Regards,
SB

I take it you skip consoles this gen?
 
The obsolescence of console hardware doesn't depend on the state of the PC market, but on the stage of development of graphics technology.
At the PS4's launch, graphics technology was already poised to move significantly ahead, but it ran into the obstacle of console hardware and was rolled back.
Now the situation is reversed. Graphics engines are still only in the process of being developed and optimized for RT, MS, SSDs and so on.

And for the record: not a single game developed natively around RT has been released yet.
 
Even larger difference when factoring in reconstruction tech (DLSS).
Why would tessellation, streaming etc. not matter anymore btw?


I take it you skip consoles this gen?

I’m not sure what you’re asking WRT streaming and tessellation. Reconstruction is not an apples to apples comparison.

Given the example of dual Radeon 7950's, I think the MSRP on most of those was about $500. So two of them was about $1,000, which would put you at about an RTX 3080 right now, unless you adjust for inflation. In which case, are we normalizing to an average inflation rate, or something like the price of gasoline, or the price of anything at Dollar Tree?
The 7950 launched in January 2012 for $450. In July it dropped to $400. By the time the PS4 launched it was sub-$300. 10 months after the PS4, a 980 was $550, offering 2.7x the performance across the board.
 
I’m not sure what you’re asking WRT streaming and tessellation. Reconstruction is not an apples to apples comparison.

Neither is SLI because it was widely understood to introduce frame pacing issues.

The discussion here was about how well console GPUs are aging this gen vs last gen relative to their PC counterparts.

If you're arguing they're aging better this generation but have to caveat that with a) only when current gen PC GPUs aren't being fully utilised (RT and ML), and b) only when comparing to 2x GPUs in SLI last generation, then I don't see that as a particularly strong position.

But hey, if that's the position you want to take, then yes, I concede that if you take only scenarios where current generation PC GPUs aren't fully utilised and compare that to 2, 3 or 4 GPUs running in SLI last generation then yes indeed, the performance delta to consoles is much smaller this generation for the moment.

That said, in a couple of years even those caveats won't help, because SLI became next to useless later in the PS4 generation's life (meaning performance effectively went down from the perspective of your argument) around the Turing launch, while I expect some form of RT (which includes hardware Lumen) to be the default in most big games within the next couple of years.
 
Neither is SLI because it was widely understood to introduce frame pacing issues.

The discussion here was about how well console GPUs are aging this gen vs last gen relative to their PC counterparts.

If you're arguing they're aging better this generation but have to caveat that with a) only when current gen PC GPUs aren't being fully utilised (RT and ML), and b) only when comparing to 2x GPUs in SLI last generation, then I don't see that as a particularly strong position.

But hey, if that's the position you want to take, then yes, I concede that if you take only scenarios where current generation PC GPUs aren't fully utilised and compare that to 2, 3 or 4 GPUs running in SLI last generation then yes indeed, the performance delta to consoles is much smaller this generation for the moment.

That said, in a couple of years even those caveats won't help, because SLI became next to useless later in the PS4 generation's life (meaning performance effectively went down from the perspective of your argument) around the Turing launch, while I expect some form of RT (which includes hardware Lumen) to be the default in most big games within the next couple of years.
I think last gen consoles were in a worse position even without SLI. SLI just makes it a complete non-comparison. Getting into what constitutes full GPU usage is a very grey area. Last gen PC GPUs had plenty of features that saw little to no use; if they had been used, performance advantages would have increased considerably. By this point into the generation a 980ti was just under 3.5x faster than a PS4 for $650, across the board, without games using any of the features it offered over consoles. It actually offered less capability than the consoles due to DX11 not exposing much.
 
I think last gen consoles were in a worse position even without SLI. SLI just makes it a complete non-comparison. Getting into what constitutes full GPU usage is a very grey area. Last gen PC GPUs had plenty of features that saw little to no use; if they had been used, performance advantages would have increased considerably. By this point into the generation a 980ti was just under 3.5x faster than a PS4 for $650, across the board, without games using any of the features it offered over consoles. It actually offered less capability than the consoles due to DX11 not exposing much.

The 980ti was still a couple of months away at this point in the previous generation. If you want to compare to that, then we should probably be comparing to the 40x0 PC generation GPUs due in a few months. And according to TPU it was no more than 3x faster (using the 7850 comparison point) than the PS4. So right where the 3090 was over 18 months ago in RT-enabled games.

Comparing unexposed hardware features of Maxwell to RT, which is used today in multiple games, is disingenuous. There are games on the market right now, and many more coming, where the 3090 will see a real-world 3x performance increase over the PS5. That is more than you could have ever achieved with the time-equivalent single GPUs of the previous generation (780Ti and 980) over the PS4. And that gap is about to get much wider within 7 months.
 
The 980ti was still a couple of months away at this point in the previous generation. If you want to compare to that, then we should probably be comparing to the 40x0 PC generation GPUs due in a few months. And according to TPU it was no more than 3x faster (using the 7850 comparison point) than the PS4. So right where the 3090 was over 18 months ago in RT-enabled games.

Comparing unexposed hardware features of Maxwell to RT, which is used today in multiple games, is disingenuous. There are games on the market right now, and many more coming, where the 3090 will see a real-world 3x performance increase over the PS5. That is more than you could have ever achieved with the time-equivalent single GPUs of the previous generation (780Ti and 980) over the PS4. And that gap is about to get much wider within 7 months.
We are 1 month away from when the 980ti would have launched. It certainly isn't a guarantee that we will see the 4000 series GPUs this year. TechPowerUp 980ti launch review:

[TechPowerUp relative performance chart at 2560x1440 from the 980 Ti launch review]


A 7850 would be at 29%. That puts the 980ti at 3.37x faster. There were instances of games using some Maxwell features: HFTS, VXAO, etc. If consoles tried to run them, the performance differences would be far greater than 3.37x. There are more games using RT than Maxwell features for sure, but not all RT games scale performance up to that 3x number. And if history is any indication, those will not become the majority of games released. How many non Nvidia sponsored titles make heavy enough use of RT to cripple AMD GPUs?
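For anyone following the arithmetic: TPU's chart expresses each card as a percentage of the review's headline card, so the speedup between any two entries is just the ratio of their percentages. The ~97.7% I'm using for the reference 980ti below is my reading of the chart, so treat it as an assumption:

# Converting TPU relative-performance percentages into an "x times faster" figure.
def speedup(card_pct, baseline_pct):
    return card_pct / baseline_pct

# 7850 at 29% (from the chart above); reference 980 Ti at ~97.7% (my assumption).
print(f"980 Ti vs 7850: {speedup(97.7, 29.0):.2f}x")  # ~3.37x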
 
So I strongly suspect many devs will just go with the flow and won't bother with baking some traditional raster effects anymore. This means RT is going to be an integral part of the game engine and you won't be able to turn it off anymore.

I’m not sure it’ll happen this generation. Will wait to see what Avatar and the next Metro game do on consoles but there may not be enough raw HW performance on the consoles to make RT a baseline requirement. Someone mentioned earlier that RT democratizes high quality lighting. That is so true and the greatest impact will be seen for smaller PC devs who no longer need to spend an inordinate amount of time and money baking light assets.

I am always baffled when I see people still downplaying HW-RT.

Honestly we have so many examples now of RT significantly improving IQ that you can safely ignore the “RT doesn’t matter” crowd. The problem is that there are also many examples of RT not significantly improving IQ that those folks can use to justify their position. Essentially cherry picking the worst examples. Doesn’t really matter though as RT is obviously here to stay and will (soonish) be as omnipresent as triangle rasterization is today. Why would anyone choose to do non-RT reflections, shadows or GI on PS6? They won’t.

@Silent_Buddha if your target is 1% lows above 120fps you will probably never see RT in your lifetime :LOL:
 
How many non Nvidia sponsored titles make heavy enough use of RT to cripple AMD GPUs?

How do you determine whether a game is sponsored vs “neutral”? I wish more games had the option for crippling RT effects so that they actually scale on next generation hardware. Barely noticeable IQ improvements will still be barely noticeable on future hardware.
 
How do you determine whether a game is sponsored vs “neutral”? I wish more games had the option for crippling RT effects so that they actually scale on next generation hardware. Barely noticeable IQ improvements will still be barely noticeable on future hardware.
Ok, it looks like Geforce.com no longer has a list of directly sponsored titles, so it's harder to differentiate. Is the Nvidia splash logo still a thing only in games they helped on?
 
How many non Nvidia sponsored titles make heavy enough use of RT to cripple AMD GPUs?
Just read the GPU Ray Tracing Performance topic, any title that uses RT to do a single effect properly will cripple AMD GPUs significantly. AMD GPUs get a pass only when they half ass the RT implementation to a very low resolution or do the majority of the implementation in screen space.
 
Just read the GPU Ray Tracing Performance topic, any title that uses RT to do a single effect properly will cripple AMD GPUs significantly. AMD GPUs get a pass only when they half ass the RT implementation to a very low resolution or do the majority of the implementation in screen space.
Metro EE doesn't cripple AMD GPUs and IMO is still one of the best RT use cases.
 
Metro EE doesn't cripple AMD GPUs and IMO is still one of the best RT use cases.
It cripples them HARD; a 3090 has 2x the performance of a 6900XT in Metro.

Overclock3D's latest review shows the 2080Ti being 10% faster than the 6900XT in Metro Exodus Enhanced Edition, Control and Cyberpunk @4K.

As for the situation with the 3090Ti, it showed the same huge gaps vs the 6900XT in games with multiple RT effects at 4K resolution.

Metro Exodus Enhanced: 3090Ti is 2.2x faster
Control: 3090Ti is 2x faster
Cyberpunk 2077: 3090Ti is 2x faster
Watch Dogs Legion: 3090Ti is 60% faster

https://www.overclock3d.net/reviews/gpu_displays/gigabyte_rtx_3090_ti_oc_review/6

PurePC reached almost the same conclusion as well: the 2080Ti is faster than the 6900XT in Control and Cyberpunk, and ties it in Metro Exodus Enhanced Edition @4K.

As for the situation with the 3090Ti, it continues the same story vs the 6900XT @4K.

Control: 3090Ti is 2x faster
Cyberpunk 2077: 3090Ti is 2x faster
Metro Exodus Enhanced: 3090Ti is 80% faster

https://www.purepc.pl/test-karty-graficznej-nvidia-geforce-rtx-3090-ti?page=0,19

The 3080 12GB is 200% faster than 6900XT in Metro Exodus Enhanced Edition!






Updated Ray Tracing testing using the 3090Ti vs 6900XT (the regular one) done by TPU.

https://www.techpowerup.com/review/msi-geforce-rtx-3090-ti-suprim-x/34.html


Cyberpunk 2077: 3090Ti is 2.2x as fast @2160p and 2x as fast @1440p as the 6900XT
Control: 3090Ti is 90% faster @2160p, and 75% faster @1440p
Metro Exodus: 3090Ti is 70% faster @2160p, and 50% faster @1440p
Doom Eternal: 3090Ti is 65% faster @2160p, and 45% faster @1440p
Deathloop: 3090Ti is 55% faster @2160p, and 50% faster @1440p
Resident Evil Village: 3090Ti is 40% faster @2160p, and 25% faster @1440p
Watch Dogs Legion: 3090Ti is 50% faster @2160p, and 25% faster @1440p
F1 2022: 3090Ti is 35% faster @2160p, and 25% faster @1440p
Far Cry 6: A draw
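Side note, since "% faster" and "times as fast" keep getting conflated in these charts, here's the conversion I'm using (the fps numbers are made up purely to illustrate):

# "% faster" vs "times as fast": if card A renders fps_a and card B renders fps_b, then
#   times_as_fast  = fps_a / fps_b
#   percent_faster = (times_as_fast - 1) * 100
fps_a, fps_b = 44.0, 20.0  # made-up numbers, purely illustrative
times_as_fast = fps_a / fps_b
percent_faster = (times_as_fast - 1) * 100
print(f"{times_as_fast:.1f}x as fast = {percent_faster:.0f}% faster")
# -> 2.2x as fast = 120% faster; so "220% faster" and "2.2x as fast" are not the same thing.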


We can then double down on that with testing from Sweclockers.

https://www.sweclockers.com/test/34...0-ti-snabbt-dyrt-och-laskigt-effekttorstigt/5


Metro Exodus: 3090Ti is 70% faster @2160p than the 6900XT
Control: 3090Ti is 90% faster @2160p
Battlefield V: 3090Ti is 65% faster @2160p


Some more tests from KitGuru.

https://www.kitguru.net/components/...ss/nvidia-rtx-3090-ti-review-ft-msi-palit/22/

https://www.kitguru.net/components/...ss/nvidia-rtx-3090-ti-review-ft-msi-palit/21/

https://www.kitguru.net/components/...ss/nvidia-rtx-3090-ti-review-ft-msi-palit/20/


Cyberpunk 2077: 3090Ti is 2x as fast @2160p and @1440p as the 6900XT
Metro Exodus Enhanced: 3090Ti is 2.2x as fast @2160p and 95% faster @1440p
Resident Evil Village: 3090Ti is 30% faster @2160p and @1440p



Some more tests from Golem.

https://www.golem.de/news/geforce-rtx-3090-ti-im-test-nvidias-ampere-brechstange-2203-164068-2.html


Cyberpunk 2077: 3090Ti is 2.3x as fast @2160p as the 6900XT
Metro Exodus Enhanced: 3090Ti is 85% faster @2160p
Lego Builder's Journey: 3090Ti is 90% faster @2160p
Riftbreaker: 3090Ti is 48% faster @2160p
 
Metro EE runs very well on every DXR capable GPU.

I still think it's one of the best showings of ray tracing to date, not only because the GI looks fantastic but also because it's so performant.

I still wish UE5's Lumen would be closer to the incredible solution 4A Games built here.

Like many other people, I'm afraid running Lumen at 60 FPS is next to impossible, given that even that little multiplayer demo (Lyra, I believe it's called) is very heavy.
 
Again, NOTE, that while I don't find DLSS quality to be adequate compared to DLSS off in most games,
I guess you are going to be out of luck this gen then; if the UE5 demos are any indication, many games will rely on TSR to deliver their next gen visuals. And TSR is often inferior to DLSS. Console gaming is out of luck for you too, as upscaling is prevalent there in almost EVERY title, with far inferior results to DLSS.
Please don't use a single link vs the many links I provided above. If you test a relatively light and small area in Metro, as Computerbase did, then AMD GPUs do fine; if you test any area with heavy use of RT GI and reflections (as done by other smart sites) then they are crippled hard.
 
Some more Metro EE tests for you.
A new ray tracing benchmark from BaseMark has been released, called Relic Of Life. At 4K, the 3090Ti is almost 3 times faster than the 6900XT.

https://www.comptoir-hardware.com/a...-test-nvidia-geforce-rtx-3090-ti.html?start=5

Some other game tests from the same site at 4K resolution, focusing on the 3090Ti vs 6900XT.

Minecraft: 3x faster than the 6900XT
Quake 2 RTX: 2.5x faster
Cyberpunk 2077: 2x faster
Dying Light 2: 2x faster
Doom Eternal: 65% faster
Watch Dogs Legion: 62% faster
Ghostwire Tokyo: 60% faster
F1 2021: 58% faster
Metro Exodus Enhanced: 54% faster
Control: 51% faster
Deathloop: 48% faster
Resident Evil Village: 15% faster

https://www.comptoir-hardware.com/a...-test-nvidia-geforce-rtx-3090-ti.html?start=4
They tested six ray tracing games: Bright Memory Infinite, Control Ultimate Edition, Cyberpunk 2077, Fortnite, Metro Exodus Enhanced, and Minecraft. The 3090Ti is essentially 2 times faster than 6900XT in these games.
https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html
Updated Ray Tracing testing using the 3090Ti vs 6900XT LC (liquid cooled and overclocked), done by PCGH. Note: testing is done using FSR Ultra Quality where applicable, and ReBAR is enabled.

Dying Light 2 (FSR): 3090Ti is twice as fast @2160p and @1440p as the 6900XT LC
Cyberpunk 2077 (FSR): 3090Ti is 80% faster @2160p, and 65% faster @1440p
Guardians Of The Galaxy: 3090Ti is 70% faster @2160p, and 65% faster @1440p
Lego Builder's Journey (FSR): 3090Ti is 70% faster @2160p, and 55% faster @1440p
Doom Eternal: 3090Ti is 60% faster @2160p, and 45% faster @1440p
Ghostwire Tokyo (FSR): 3090Ti is 50% faster @2160p, and 40% faster @1440p
Metro Exodus Enhanced: 3090Ti is 50% faster @2160p, and 38% faster @1440p
Riftbreaker (FSR): 3090Ti is 50% faster @2160p, and 60% faster @1440p
F1 2022 (FSR): 3090Ti is 35% faster @2160p, and 30% faster @1440p
Far Cry 6: A draw

https://www.pcgameshardware.de/Gefo...UVP-Release-Benchmark-Specs-kaufen-1391700/2/
 