Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

True on the CPU front, but as noted above, the 3090 can achieve 3x the real-world performance of the PS5 in games that make decent use of RT. I'm not sure we had any GPU that came close to that at the PS4's launch.

RT wasn't around back then, so that's an unfair comparison, but...

At PS5's launch the fastest GPU was the RTX 3090, which offered 2x PS5's raster performance; 15 months after PS5 released, the fastest GPU was still the RTX 3090, offering 2x the raster performance.

At PS4's launch the fastest GPU was the GTX 780 Ti, which offered 2.3x PS4's performance; 15 months after PS4 released, the fastest GPU was the GTX 980, which offered 2.6x the performance... and the GTX 980 Ti wasn't far behind, offering 3x PS4's performance.

And that's without going to Crossfire and SLI, where you could get 7x the performance or more with quad-GPU setups at PS4's launch.

PS4's GPU performance aged at a much faster rate than PS5's is aging now.
 
RT wasn't around back then, so that's an unfair comparison, but...

At PS5's launch the fastest GPU was the RTX 3090, which offered 2x PS5's raster performance; 15 months after PS5 released, the fastest GPU was still the RTX 3090, offering 2x the raster performance.

At PS4's launch the fastest GPU was the GTX 780 Ti, which offered 2.3x PS4's performance; 15 months after PS4 released, the fastest GPU was the GTX 980, which offered 2.6x the performance... and the GTX 980 Ti wasn't far behind, offering 3x PS4's performance.

And that's without going to Crossfire and SLI, where you could get 7x the performance or more with quad-GPU setups at PS4's launch.

PS4's GPU performance aged at a much faster rate than PS5's is aging now.

If you pretend RT doesn't exist, sure. Convenient for the argument, but not so much for reality. When fully utilised (and I'm ignoring the tensor cores here), at PS5's launch the 3090 was 3x faster; 15 months later, it was still 3x faster. According to TPU, the 780 Ti was 2.2x faster than the R7 265, which was basically the PS4 GPU on PC, and the 980 was 2.44x faster after 15 months.

19 months later the 980 Ti was 2.8x faster. 30 months later the 1080 was 3.7x faster.

Within 6 months from now, or ~24 months on from the PS5's launch, we'll have the 40x0 generation, which is rumoured to be up to twice as fast as the 30x0 generation. Being conservative and assuming only a 50% uplift still gives us something that's 4.5x faster after 24 months, versus 3.7x faster after 30 months last gen.

So no, I'd say the PS4's GPU performance aged at a noticeably slower rate than PS5's is, in large part due to the current gen's reliance on the underwhelming RDNA2 RT capabilities. CPU performance, and certainly IO performance, should fare much better though.
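
Spelling out the 4.5x projection above with those same figures (and taking the rumoured 40x0 uplift conservatively at 1.5x rather than 2x):

\[
3.0 \;(\text{3090 vs PS5 in RT today}) \times 1.5 \;(\text{conservative 40x0 uplift}) = 4.5\times \text{ after } {\sim}24\text{ months, vs } 3.7\times \text{ after } 30\text{ months last gen (GTX 1080).}
\]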
 
RT wasn't around back then, so that's an unfair comparison, but...

At PS5's launch the fastest GPU was the RTX 3090, which offered 2x PS5's raster performance; 15 months after PS5 released, the fastest GPU was still the RTX 3090, offering 2x the raster performance.

At PS4's launch the fastest GPU was the GTX 780 Ti, which offered 2.3x PS4's performance; 15 months after PS4 released, the fastest GPU was the GTX 980, which offered 2.6x the performance... and the GTX 980 Ti wasn't far behind, offering 3x PS4's performance.

And that's without going to Crossfire and SLI, where you could get 7x the performance or more with quad-GPU setups at PS4's launch.

PS4's GPU performance aged at a much faster rate than PS5's is aging now.

Seems it's the other way around; the PS4 actually stood up better, I think. CPU-wise we're better off now, but then the 8th gen actually got a downgrade compared to the 7th gen. Still, a downclocked, cut-down 2019 Zen 2 isn't all that... I mean, if UE5 is anything to go by. Zen 3 was around at launch. RAM-wise, the PS4 had more.
The SSDs are fast but got outpaced very quickly. We were at 7GB/s raw before the consoles were.
 
If you pretend RT doesn't exist, sure. Convenient for the argument, but not so much for reality. When fully utilised (and I'm ignoring the tensor cores here), at PS5's launch the 3090 was 3x faster; 15 months later, it was still 3x faster. According to TPU, the 780 Ti was 2.2x faster than the R7 265, which was basically the PS4 GPU on PC, and the 980 was 2.44x faster after 15 months.

19 months later the 980 Ti was 2.8x faster. 30 months later the 1080 was 3.7x faster.

Within 6 months from now, or ~24 months on from the PS5's launch, we'll have the 40x0 generation, which is rumoured to be up to twice as fast as the 30x0 generation. Being conservative and assuming only a 50% uplift still gives us something that's 4.5x faster after 24 months, versus 3.7x faster after 30 months last gen.

So no, I'd say the PS4's GPU performance aged at a noticeably slower rate than PS5's is, in large part due to the current gen's reliance on the underwhelming RDNA2 RT capabilities. CPU performance, and certainly IO performance, should fare much better though.

PS4 was closer to an HD 7850 due to its bandwidth and clock speed.

And I am pretending RT doesn't exist because, while its adoption has picked up, it's still very much a tick-box feature.

But let's add SLI into the mix, shall we? It was very much a thing that worked in 2013, meaning you could easily build a PC with 7x (or more) the performance of PS4 before PS4 even launched.

With SLI and multi-GPU support in the state it's currently in, you don't get that massive boost you used to get in 2013.

So it still stands that PS4's performance aged much faster than PS5's is, as you could build a PC with a real-world performance advantage you simply can't build now.
 
Interesting that when the demo arrived I read here that we shouldn't read too much into benchmark results because the demo isn't very optimized on consoles; now we can't read too much into benchmark results because apparently the demo is only optimized on consoles ;d
To be fair, I don't think it's well optimized for gameplay. It looks good and everything, it just isn't geared towards performance like a shipping game is. I don't think anyone here knows exactly what The Coalition did specifically to optimize the demo for shipping (I know there are quotes about memory footprint and whatnot; I'm talking under-the-hood type stuff), but we know they did something important enough for them to be involved and recognized for it after the fact. I've also seen enough of the "I clicked compile, made myself some tea and a sandwich, watched some TV, and now I'm in the Matrix demo" videos to know that they didn't have a talented team of experienced game developers do a once-over on those builds to tune them up for presentation.

We know the demo was optimized for console. We know at least one of the teams involved. But Epic didn't ship the demo on PC; they shipped the assets. And people are building something that's analogous to that demo, but it still isn't at the same level of polish.
 
We know the demo was optimized for console. We know at least one of the teams involved. But Epic didn't ship the demo on PC; they shipped the assets. And people are building something that's analogous to that demo, but it still isn't at the same level of polish.

DF said this exact same thing ;)
 
Sorry, I'm an oaf, but is it possible to have Lumen but not Nanite?
Or is Lumen dependent on Nanite?
And vice versa.

Obviously disabling Lumen is easier than disabling Nanite, I assume.
 
PS4 was closer to an HD 7850 due to its bandwidth and clock speed.

It's probably somewhere between the two, as PS4 was 1.84TF vs the 7850's 1.76TF and the 265's 1.89TF, and both of those PC GPUs are GCN 1.0 whereas PS4 was GCN 1.1. It doesn't change the equation anyway, as the 265 is only 9% faster than the 7850 according to TPU.

And I am pretending RT doesn't exist because, while its adoption has picked up, it's still very much a tick-box feature.

That's an ironic statement considering the demo that spawned this conversation makes great use of hardware RT. Would you buy a new GPU today with no hardware RT capability?

But let's add SLI into the mix, shall we? It was very much a thing that worked in 2013, meaning you could easily build a PC with 7x (or more) the performance of PS4 before PS4 even launched.

With SLI and multi-GPU support in the state it's currently in, you don't get that massive boost you used to get in 2013.

So it still stands that PS4's performance aged much faster than PS5's is, as you could build a PC with a real-world performance advantage you simply can't build now.

That's hardly an apples-to-apples comparison. From a technology perspective, the PS5 GPU is aging more quickly than the PS4 GPU did. If you need to compare the performance of two GPUs to one GPU in order to change that result, then it kind of proves the point. The point which spawned this discussion was your assertion that PC gamers were used to getting higher performance vs consoles last gen than they're getting this gen, and thus there's a placebo-like effect happening here causing people to complain about PC performance not being good enough. So are you now saying that this general PC performance expectation was based solely on SLI setups? Was the average PC gamer running an SLI setup last gen?
 
Sorry, I'm an oaf, but is it possible to have Lumen but not Nanite?
Or is Lumen dependent on Nanite?
And vice versa.

Obviously disabling Lumen is easier than disabling Nanite, I assume.

You can have both, one or the other, or neither (Fortnite). In the very first PS5 demo they toggled Lumen off at the beginning of the presentation to show the difference.
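
For anyone who wants to poke at this themselves: the two features are controlled by separate console variables, so they can be flipped independently of each other. A minimal UE5 C++ sketch, assuming these cvar names are still current in your engine version (they're from memory, so double-check them; the helper function is just for illustration):

#include "CoreMinimal.h"
#include "HAL/IConsoleManager.h"

// Helper: look a console variable up by name and set it only if it exists in this build.
static void SetCVarInt(const TCHAR* Name, int32 Value)
{
    if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(Name))
    {
        CVar->Set(Value, ECVF_SetByCode);
    }
}

void ApplyLumenAndNaniteIndependently()
{
    SetCVarInt(TEXT("r.Nanite"), 1);                          // Nanite geometry on/off
    SetCVarInt(TEXT("r.DynamicGlobalIlluminationMethod"), 1); // 1 = Lumen GI, 0 = none, 2 = SSGI
    SetCVarInt(TEXT("r.ReflectionMethod"), 1);                // 1 = Lumen reflections, 2 = SSR
    SetCVarInt(TEXT("r.Lumen.HardwareRayTracing"), 0);        // 0 = software Lumen, 1 = use HW RT
}

The same variables can also be typed into the in-game console if you just want to eyeball the difference.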
 
Would you buy a new GPU today with no hardware RT capability?

Happily, as I can easily live without RT until the hardware is fast enough to actually handle RT.

And especially so if not having RT hardware means I get a cheaper GPU.

That's hardly an apples-to-apples comparison. From a technology perspective, the PS5 GPU is aging more quickly than the PS4 GPU did. If you need to compare the performance of two GPUs to one GPU in order to change that result, then it kind of proves the point. The point which spawned this discussion was your assertion that PC gamers were used to getting higher performance vs consoles last gen than they're getting this gen, and thus there's a placebo-like effect happening here causing people to complain about PC performance not being good enough. So are you now saying that this general PC performance expectation was based solely on SLI setups? Was the average PC gamer running an SLI setup last gen?

It's very much an apples-to-apples comparison; we're comparing high-end gaming PCs of the time to consoles of the time, and PCs containing multiple GPUs were very much a thing in 2013.

I myself was running Crossfire HD 7950 Boost editions clocked to 1.25GHz, which was close to 3x the performance of PS4, and I had this system before PS4 even released.

That ability made the consoles age very, very quickly, and SLI/Crossfire systems were likely much more common back then than RTX 3090 owners are now.

According to Steam, the average PC gamer is still rocking a GTX 1060 with a quad-core.
 
Happily, as I can easily live without RT until the hardware is fast enough to actually handle RT.

And especially so if not having RT hardware means I get a cheaper GPU.



It's very much an apples-to-apples comparison; we're comparing high-end gaming PCs of the time to consoles of the time, and PCs containing multiple GPUs were very much a thing in 2013.

I myself was running Crossfire HD 7950 Boost editions clocked to 1.25GHz, which was close to 3x the performance of PS4, and I had this system before PS4 even released.

That ability made the consoles age very, very quickly, and SLI/Crossfire systems were likely much more common back then than RTX 3090 owners are now.

According to Steam, the average PC gamer is still rocking a GTX 1060 with a quad-core.

Most console gamers are still stuck on 2013 hardware. Instead of SLI we have gone to single-GPU solutions. A 3090 24GB is comparatively a better solution than 2x 680/780 was back then.
 
Happily, as I can easily live without RT until the hardware is fast enough to actually handle RT.

You do not seem to understand the implications of RT HW-acceleration in modern GPUs. Like, at all.

In the near future, RT will replace some traditional raster effects such as SSGI and SSR in next-generation games. Why? Because it looks much better and, more importantly, it saves devs a ton of time and money: they don't have to bake lighting anymore and can change their lighting in real time, which in turn means much greater interactivity with the game world.

So I strongly suspect many devs will just go with the flow and won't bother with baking some traditional raster effects anymore. This means RT is going to be an integral part of the game engine and you won't be able to turn it off anymore.

To avoid alienating the huge chunk of the PC market without HW-RT GPUs, the games will still run, but in software mode. This means that if you have a card capable of HW-RT, the games will automatically run MUCH faster compared to cards without HW-RT. And you sure care about performance, right? A card with RT acceleration will perform a lot better than one without HW-RT if RT is always on. The next-gen Avatar game by Ubisoft is already confirmed to do exactly that: RT for certain effects only, with a slower software-RT mode for cards that do not have HW-RT. There may also be games that straight up won't run at all without RT hardware (see Minecraft RTX and Metro Exodus Enhanced Edition).
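
In engine terms that fallback is usually nothing more exotic than a capability check when the renderer starts up: the effect itself stays on, only the backend that traces the rays changes. A rough, self-contained C++ sketch of the idea (hypothetical names, not any particular engine's API):

#include <cstdio>

enum class RTBackend { Hardware, Software };

struct GpuCaps
{
    bool bHasRayTracingHardware;  // e.g. RT cores / ray accelerators present
};

// The ray-traced GI/reflection pass always runs; only the tracing backend differs.
static RTBackend ChooseRayTracingBackend(const GpuCaps& Caps)
{
    // Dedicated RT hardware traces rays far faster; everything else
    // falls back to a slower compute-shader path.
    return Caps.bHasRayTracingHardware ? RTBackend::Hardware : RTBackend::Software;
}

int main()
{
    const GpuCaps Caps{ /*bHasRayTracingHardware=*/ false };
    const RTBackend Backend = ChooseRayTracingBackend(Caps);
    std::printf("Ray-traced lighting path: %s\n",
                Backend == RTBackend::Hardware ? "hardware (fast)"
                                               : "software fallback (slower)");
    return 0;
}

The real decision obviously happens inside the renderer, but the shape is the same: the effect never turns off, only the cost of producing it changes.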

UE5 is a bit different though; their software Lumen solution is just as fast as HW-RT, but I think that's going to be an exception for next gen. And even with UE5, if you have an RT-capable card you basically get much improved reflections and GI for free (on the GPU).

HW-RT is so incredibly important. I am always baffled when I see people still downplaying HW-RT.
 

"The versions tested were 1.000.005 on PS5 consoles and 2.0.0.5 on the Xbox Series consoles. These are the latest versions on each platform and are taken from the respective console dashboards.

PS5 and Xbox Series X both render at a native resolution of 2560x1440. PS5 and Xbox Series X both appear to use 4xMSAA and both render the UI at 3840x2160.

Xbox Series S renders at a native resolution of 1920x1080 and renders the UI at 1920x1080.

Frame pacing issues can happen during FMVs on all three consoles https://bit.ly/36EcZ6S

Stats: https://bit.ly/37xmuFD
Frames Pixel Counted: https://bit.ly/36DZeFc"

 
Would you buy a new GPU today with no hardware RT capability?

100% Yes. In fact, as I've stated multiple times before, I'd run most games that have RT with it disabled even if I had an RTX 3080 or 3090, just because the RT isn't yet good enough (IMO) to justify the performance hit, or the IQ hit (DLSS) needed to run it at an acceptable (to me) framerate.

There are only a very few games where DLSS Quality doesn't result in (to me) unacceptable hits to image quality. Keep in mind that those hits aren't necessarily things that other people feel are bad or that they will notice during gameplay, so don't take this as me saying that DLSS is bad. Anything below DLSS Quality is basically worthless to me.

Also, I hope that people don't take this to mean that I'm against RT. I'm really looking forward to RT being eventually useable by me at better quality than what current hardware is capable of. Well, that assumes that GPUs capable of that aren't priced into the stratosphere where I can't justify the price of them. :p

Just like I loved the idea of good-quality shadows in games, but up until just a few years ago games rarely had shadows that were good enough (too many artifacts, or weird-looking and distracting) to be worth the performance hit of enabling them.

I'd love to have a new GPU better than my GTX 1070 for the general improvements in rasterization and compute. RT is just a bonus doodad that I can look at, say "ooh" at, and then promptly disable so that I can play the game. Basically what I do when I occasionally play games on a friend's PC with a 12900KS and an RTX 3090 Ti.

Said friend's PC makes me REALLY want an HDMI 2.1 graphics card ASAP for locked 120 Hz gaming. 60 Hz unfortunately now feels to me like 30 Hz used to, and 30 Hz is 100% unplayable now and ugly as all hell to look at. Luckily moving up to 240 Hz on his setup doesn't have the same effect (120 Hz still looks and feels good to me), so I'm fairly confident that 120 Hz will be good enough for me until the day I die. :p

If I had to choose between 120 Hz or RT? 120 Hz all the time, every time.

Regards,
SB
 
I'm glad Xbox and PS are dipping their toes into RT this generation, but we won't see serious RT until next generation, IMO. I definitely think it should be more of a focus than 8K going forward.
 
Happily, as I can easily live without RT until the hardware is fast enough to actually handle RT.

So on the one hand you argue that console GPUs are aging better this generation than they did last, while on the other hand you claim that even Ampere, with its vastly higher RT performance compared to those consoles, is still "not fast enough to actually handle RT" (despite this being demonstrably untrue).

So where does that leave your first argument? There are over two dozen games on the current gen consoles right now that support RT - including some of the biggest names on those systems. There will be many, many more to come as we transition out of the cross-gen period and UE5 in particular (with its hardware-RT-based lighting) becomes prevalent.

In a few months we'll have a new generation of GPUs in the PC space which even RT's most ardent detractors will be hard pressed to claim are "not fast enough to actually handle RT".

How well do you expect the swathes of RT-enabled games that we'll see over the next 3-4 years to perform on the current gen consoles vs the likes of Ampere and Lovelace? Or should we just ignore those?

It's very much an apples-to-apples comparison; we're comparing high-end gaming PCs of the time to consoles of the time, and PCs containing multiple GPUs were very much a thing in 2013.

Very much a niche thing. Their existence in no way suggests that PC gamers should have expected higher performance vs consoles last gen compared with this gen. The baseline for the average PC user's performance expectations has always been single GPUs, because that's what the average PC gamer was using. And it's the only real way to measure the actual technological gap.

Think about it: your argument is that PC gamers are being fooled into expecting more performance now vs the current consoles because they were getting more performance last generation vs those last-gen consoles. But you're saying that extra performance came from dual GPUs. So are you suggesting that your average PC gamer today is expecting the same performance premium over current-gen consoles from a single GPU as they were getting last generation from two GPUs? Because if that's your argument, then I agree that would be an unreasonable expectation.

According to Steam, the average PC gamer is still rocking a GTX 1060 with a quad-core.

I don't see the relevance to this?
 
I don't understand the claim that the hardware is not fast enough either. If you are not setting everything to ultra and running at native resolution, even a 2060 will give you great ray tracing performance, as demonstrated by Dictator's videos many times.

I've been running nearly every RT-supporting game there is on my RTX 2060 laptop (which uses just around 80 watts), and I struggle to come across a game that won't run at a locked 60 FPS with my personal optimized settings (usually around high base settings and medium ray tracing). I've been getting great-looking experiences using DLSS Performance at a stable 1440p60. RT, in games where it's implemented correctly, adds a ton to visual fidelity.

Cyberpunk is an exception though; that one is definitely not possible at 60 FPS with RT. Maybe it is at 1080p with DLSS Performance, but my CPU is too weak for that.
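
For context on the performance headroom there: DLSS Performance renders internally at roughly half the output resolution per axis (so about a quarter of the pixels), which for the 1440p60 figure above works out to approximately:

\[
2560 \times 0.5 = 1280, \qquad 1440 \times 0.5 = 720 \;\Rightarrow\; \text{a } 1280 \times 720 \text{ internal render upscaled to } 2560 \times 1440.
\]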
 
I don't understand the claim that the hardware is not fast enough either. If you are not setting everything to ultra and running at native resolution, even a 2060 will give you great ray tracing performance, as demonstrated by Dictator's videos many times.

I've been running nearly every RT-supporting game there is on my RTX 2060 laptop (which uses just around 80 watts), and I struggle to come across a game that won't run at a locked 60 FPS with my personal optimized settings (usually around high base settings and medium ray tracing). I've been getting great-looking experiences using DLSS Performance at a stable 1440p60. RT, in games where it's implemented correctly, adds a ton to visual fidelity.

Cyberpunk is an exception though; that one is definitely not possible at 60 FPS with RT. Maybe it is at 1080p with DLSS Performance, but my CPU is too weak for that.

Keep in mind that's with DLSS enabled, which for me isn't an option for most titles that support DLSS.

Here are my requirements for any game I'd run with a modern HDMI 2.1 graphics card. NOTE - this only applies to me and there are no implications other than that these are my requirements. Other people may or may not share similar requirements, but this particular list is just for me.
  • 120 Hz. This includes no more than about 1% of frames (frame times would be more accurate) being lower than 120 Hz.
    • Very rare exceptions for games that I REALLY like which can't hit that requirement. For example, Elden Ring.
  • Resolution of 3200x1800
    • If that isn't possible, then 2560x1440 as a compromise resolution for most titles.
I will adjust any and all other settings in order to achieve those requirements (obviously this includes RT). It's the same with my current GTX 1070 card, except it can only do 60 Hz rendering over HDMI at my display's native 4k resolution (games run in a window). So, I'm quite definitely not running anything at Ultra because I don't care enough about Ultra graphics to degrade my playing or viewing experience.

Since I use a 55" 4k display as my monitor, it means that at the distance I sit from the display it has a similar pixel density to a 30" 2560x1600 PC monitor. This means that detail loss is more visible than if I was using a 27" - 32" 4k monitor.

Thus DLSS Quality is only usable on a few titles, due to rendering anomalies introduced to the game versus having DLSS disabled. So there are only a few titles where I could use DLSS Quality in hopes of having RT at 1440p or 1800p run at a locked or near-locked 120 Hz, and right off the bat a lot of titles are just not going to be worth even trying to run with RT on. For example, Dying Light 2 had a pretty bad DLSS implementation at launch, which I'm not sure whether they've fixed yet or not. That's also a title that without DLSS can't even hit my framerate requirements at 1080p with RT enabled, much less 1440p or 1800p, with an RTX 3090. Granted, I have no idea what settings TechRadar are using other than RT enabled.

Dying Light 2 PC performance: a new benchmark in ray tracing | TechRadar

Even they acknowledge that many people might not find the quality of RT in that game to be worth the performance hit for enabling RT.

Basically, not all people will choose to disable or enable the same graphics IQ settings in order to achieve a playable experience. Some people might choose to sacrifice more quality options in order to have RT. Some people might choose to sacrifice RT in favor of other quality settings.

If I wanted to run at 1080p or lower, then RT certainly becomes more feasible. Alternatively, it becomes more feasible if I were willing to accept DLSS image quality compromises, especially the lower DLSS quality levels, to hit the resolutions I require.

Again, NOTE that while I don't find DLSS Quality to be adequate compared to DLSS off in most games, that doesn't mean I think DLSS is bad (I don't), and it certainly has no implications as to whether other people find DLSS quality to be good enough to be worth using all the time.

Regards,
SB
 
So on the one hand you argue that console GPUs are aging better this generation than they did last, while on the other hand you claim that even Ampere, with its vastly higher RT performance compared to those consoles, is still "not fast enough to actually handle RT" (despite this being demonstrably untrue).

An RTX 3090 is incapable of doing current-day RT at native 4K/60fps without having to resort to something like DLSS, and that is with current-gen games.

Heck, it can't even lock CP2077 or Dying Light 2 to 60fps at native 1440p with full RT on, so your claim that it's demonstrably untrue is false.

Very much a niche thing.

Still nowhere near as niche as an RTX 3090 is.

Think about it: your argument is that PC gamers are being fooled

Err... no it's not; it's nowhere close to being that.
 