What level of GPU are the consoles equivalent to? *spawn

About this, he conceded that the RTX 3080's average fps was about 25% higher, but that its lows were much lower than on the PS5, and he attributed this to the I/O and memory management subsystems. He might not be entirely wrong, because in my case my 2080 Ti performs much better on my 13900K with DDR5 than it does on my 9900K with DDR4. Apparently, R&C really likes bandwidth.

If I recall correctly, wasn't he measuring the first multi-portal transition sequence on a SATA SSD, attributing the fps lows during that sequence to whatever GPU he was using, and then concluding something like it would need a 3080 to match the PS5's lows? I remember doing at least a couple of fairly extensive posts on the ridiculousness of that analysis at the time, but God knows where they are now.
 
I think a more interesting question is what the hardware will be like for the PS6, etc.

In the past, I've found a lot of console zealots making outright silly claims of 4090-level GPUs for $500. The evolution of GPUs coming from Nvidia is on another level compared to the same evolution on AMD GPUs.

Back on topic!

I agree with most here: the only way to get a glimpse of how PC and console hardware compare is to look at the data DF provides.
 
In the past, I've found a lot of console zealots making outright silly claims of 4090-level GPUs for $500. The evolution of GPUs coming from Nvidia is on another level compared to the same evolution on AMD GPUs.
The 4090 will be over 5 years old by the time the PS6 comes out. Why wouldn’t it sport a faster GPU?
 
In the past, I've found a lot of console zealots making outright silly claims of 4090-level GPUs for $500. The evolution of GPUs coming from Nvidia is on another level compared to the same evolution on AMD GPUs.
If Sony and Microsoft are going with AMD next gen, AMD will have a very good architecture bankrolled by both of them.
Maybe behind Nvidia on efficiency and features by one or two years, just like RDNA 2 (which was very good at release).
A PS6 will be as powerful as a 4090 or more; if mid-range GPUs in 2029 are weaker than a GPU from six years prior, then goodbye 👋
 
The 4090 will be over 5 years old by the time the PS6 comes out. Why wouldn’t it sport a faster GPU?

Because the GPU tier/price point consoles typically go for doesn't advance in performance anywhere near as fast as the top tier does.

Unless Sony are willing to add a couple of hundred dollars onto the PS6's cost to put towards a larger GPU, they'll be lucky to get to 4090 level.

And that's not even factoring in being able to power and cool such a GPU, as consoles don't seem to want to go above 230W of power draw.
 
Because the GPU tier/price point consoles typically go for doesn't advance in performance anywhere near as fast as the top tier does.
1. That's completely untrue. The jump from the 2080>3080>4080 is larger than the jump from 2080 Ti>3090>4090. There was some major stagnation with the 60 series but this is the exception, not the norm. The performance differentials from 960>1060>2060>3060 were quite good. It wasn't until the 4060 that it fell apart but I'm not expecting a repeat of this.
2. We're talking 3-4 years from now, late 2027/early 2028. The 4090 will be as old as the 2080 Ti is now. We'll be at the RTX 60 series and the 4090 will be two generations removed from its top tier status. Do you think a console released today for $500 would have a GPU weaker than the 2080 Ti? The PS5 Pro will be released at the end of the year and we're expecting it to have a GPU on the level of a 3080/4070. If that comes to pass but the PS6 is still weaker than a 4090, then it means it won't even be twice the performance of the PS5 Pro which I find extremely unlikely.
Unless Sony are willing to add a couple of hundred dollars onto the PS6's cost to put towards a larger GPU, they'll be lucky to get to 4090 level.
No, they won't. A $500 console released in 2028 will have a GPU that outperforms the top PC one from the year 2022. It would be ridiculous if it didn't. Look at when the PS5 was released, it managed to be close to a 2080 just two years later.
And that's not even factoring in being able to power and cool such a GPU, as consoles don't seem to want to go above 230W of power draw.
Which is nice and all but in 3-4 years from now, there will be significant performance/watt improvements. If the trend continues, the PS6 will have a GPU on the level of a 5070 but will be out against the 6060/6070. I'm fully expecting it to be not only faster than the 4090 but to also have a much more advanced feature set. That's in gaming only though. It won't come all that close in compute.

Don't be blinded by the current way the 4090 crushes everything. It looks mighty impressive but it'll be old in 2028. The 1080 Ti also seemed unstoppable, yet the consoles managed to have better GPUs less than 4 years later.
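As a rough sanity check of the ratio argument in point 2, here's a minimal sketch in Python. The index values are assumptions for illustration only: the PS5 Pro is normalised to a 3080/4070-class 1.0, and the 4090 is taken as roughly 2x that in raster, which is the premise of the argument rather than a measured figure.

```python
# Minimal sanity check of the "less than twice the PS5 Pro" argument.
# Assumed, illustrative performance indices (not measurements):
ps5_pro = 1.0    # assumed 3080/4070-class baseline
rtx_4090 = 2.0   # assumption: roughly 2x a 3080/4070 in raster

# If the PS6 lands anywhere below the 4090 on this scale...
for ps6 in (1.6, 1.8, 1.95):
    print(f"PS6 at {ps6:.2f}: {ps6 / ps5_pro:.2f}x the PS5 Pro "
          f"({'below' if ps6 < rtx_4090 else 'at or above'} the 4090)")

# ...its gain over the Pro is by construction under 2x, which is exactly
# the sub-2x generational jump being argued as unlikely above.
```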
 
It just depends on whether or not you believe Nvidia can't sell faster chips or lower prices because their margins would be untenable. Today's market must be an unavoidable side effect of transistor cost.
 
That's completely untrue. The jump from the 2080>3080>4080 is larger than the jump from 2080 Ti>3090>4090.

Those tiers are above what Sony and Microsoft go for, so irrelevant.

There was some major stagnation with the 60 series but this is the exception, not the norm.

It's the norm, and this is the tier Sony and Microsoft typically aim for.

The performance differentials from 960>1060>2060>3060 were quite good. It wasn't until the 4060 that it fell apart but I'm not expecting a repeat of this.

I am.

We're talking 3-4 years from now, late 2027/early 2028. The 4090 will be as old as the 2080 Ti is now. We'll be at the RTX 60 series and the 4090 will be two generations removed from its top tier status. Do you think a console released today for $500 would have a GPU weaker than the 2080 Ti?

Yes, I suggest you go and look at AMD's power consumption numbers for RDNA3.

For the ~180w power budget they have, they're not getting a 2080ti.

The PS5 Pro will be released at the end of the year and we're expecting it to have a GPU on the level of a 3080/4070. If that comes to pass but the PS6 is still weaker than a 4090, then it means it won't even be twice the performance of the PS5 Pro which I find extremely unlikely.

The closest GPU to the 3080/4070 from RDNA3 is the 7800XT, which according to Techpowerup consumes 250w during gaming.

So unless Sony are willing to pull 300w and also able to cool it you won't get 3080/4070 as they'll need to downclock to reduce power draw, which will reduce performance.

And even then, in RT, the 3080/4070 (especially the 4070) will batter the 7800XT.

No, they won't. A $500 console released in 2028 will have a GPU that outperforms the top PC one from the year 2022. It would be ridiculous if it didn't. Look at when the PS5 was released, it managed to be close to a 2080 just two years later.

And in ray tracing (which will be PS5 Pro's and PS6's main focus) the 2080 batters PS5.

And unless AMD massively increase their RT performance, PS6 will arguably still struggle to match a 4090.

Which is nice and all but in 3-4 years from now, there will be significant performance/watt improvements.

AMD have yet to show this; RDNA3 was a step backwards.

If the trend continues, the PS6 will have a GPU on the level of a 5070 but will be out against the 6060/6070.

Actually, if the current trend continues, PS6 will be nowhere close to that due to pricing; GPU costs are going up, not down.

Heck, they might only be able to afford a 5050.

I'm fully expecting it to be not only faster than the 4090 but to also have a much more advanced feature set. That's in gaming only though. It won't come all that close in compute.

Don't be blinded by the current way the 4090 crushes everything. It looks mighty impressive but it'll be old in 2028.

Again, unless AMD massively increase their RT performance (which again, will be next gens focus) then they'll struggle to match the 4090.

The 1080 Ti also seemed unstoppable, yet the consoles managed to have better GPUs less than 4 years later.

Looking at TechPowerUp, the 1080 Ti is at 6600 XT/2070 Super performance in raster, so no, the consoles didn't manage to have better GPUs less than 4 years later.

The argument could easily be made that they're slightly slower than the 1080 Ti.

You need to come back down to earth a little bit; GPUs are getting more and more expensive, and next gen is going to be purely about RT performance, maybe even PT performance.

And in RT/PT loads the 4090 is so far ahead of the 7900 XTX and AMD's other efforts it's not even funny; unless AMD manage to triple RT performance in RDNA4's mid-range offerings, you're not getting 4090 performance in PS6.

This should put into context how far behind Nvidia AMD are in next-gen RT/PT... notice that the 2080 Ti you've been talking about beats AMD's current best GPU? The PS5 in RT is the 6600 XT... at 0.5 fps.

[Attachment: performance-pt-2560-1440.png — path tracing performance chart at 2560x1440]
 
It's the norm, and this is the tier Sony and Microsoft typically aim for.
No, it absolutely is not the norm. 960>1060 is a 70% performance improvement. 1060>2060 is a 50% performance improvement and added ray tracing.
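Compounding those two generational gains (the thread's rough percentages, not benchmark data) is a quick way to see how fast that tier used to move:

```python
# Compound the per-generation gains quoted above: +70% (960 -> 1060)
# and +50% (1060 -> 2060). Figures are the thread's, not benchmarks.
gains = [("960 -> 1060", 0.70), ("1060 -> 2060", 0.50)]

cumulative = 1.0
for step, gain in gains:
    cumulative *= 1.0 + gain
    print(f"{step}: +{gain:.0%}, cumulative {cumulative:.2f}x over the 960")

# 1.70 * 1.50 = 2.55x in two generations at that cadence.
```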
And you'll be wrong.
Yes, I suggest you go and look at AMD's power consumption numbers for RDNA3.
Because they operate in a PC environment with unbridled power consumption.

Look at how the 7700 XT, which is faster than the 2080 Ti, operates at ~170W when power-limited:


Almost the same performance as stock, and that's a card that's already 14% faster than the 2080 Ti. You can absolutely get 2080 Ti-level performance on RDNA3 at 180W today.
For the ~180w power budget they have, they're not getting a 2080ti.
Yes, they are. Console clocks are fine-tuned to get the best performance/watt. Maintaining 200W total for a machine sporting a 2080 Ti-class GPU for $500 would be possible today.
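As a rough illustration of that perf/W point, here is a sketch using the numbers in this exchange: the 14% and ~170W figures are the power-limited 7700 XT result cited above, and 250W is roughly the 2080 Ti's rated board power.

```python
# Rough perf/W comparison from the figures cited above (illustrative only).
cards = {
    "RTX 2080 Ti (stock)":        {"perf": 1.00, "watts": 250},
    "RX 7700 XT (power-limited)": {"perf": 1.14, "watts": 170},
}

baseline_ppw = cards["RTX 2080 Ti (stock)"]["perf"] / cards["RTX 2080 Ti (stock)"]["watts"]

for name, c in cards.items():
    ppw = c["perf"] / c["watts"]
    print(f"{name}: {ppw:.4f} perf/W ({ppw / baseline_ppw:.2f}x the 2080 Ti)")

# ~1.68x the 2080 Ti's perf/W, which is why a ~180W console budget for
# 2080 Ti-class performance isn't a stretch on current silicon.
```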
The closest GPU to the 3080/4070 from RDNA3 is the 7800XT, which according to Techpowerup consumes 250w during gaming.

So unless Sony are willing to pull 300w and also able to cool it you won't get 3080/4070 as they'll need to downclock to reduce power draw, which will reduce performance.
The Pro will be RDNA 3.5/4 on a smaller node. It's expected to be derived from N48 with 60/64 CUs and offer performance on the level of a 6800 XT/4070/3080.
And even then, in RT, the 3080/4070 (especially the 4070) will batter the 7800XT.

And in ray tracing (which will be PS5 Pro's and PS6's main focus) the 2080 batters PS5.

And unless AMD massively increase their RT performance, PS6 will arguably still struggle to match a 4090.
RT is a different ballgame. Until AMD gets it together, I'm not expecting their offering to match NVIDIA pound-for-pound. And aren't you indirectly admitting here that the PS6 will be faster? You're saying that even if AMD doesn't massively improve their ray tracing, the PS6 will struggle to match a 4090. If it struggles to match it (but still comes reasonably close) in ray tracing, then you can bet it'll beat it in rasterization. You can bookmark/save these posts. I guarantee you without a doubt that the PS6 will easily beat the RTX 4090 in rasterization. Ray tracing is definitely not impossible but this is all contingent on AMD getting their shit together.
Actually, if the current trend continues, PS6 will be nowhere close to that due to pricing; GPU costs are going up, not down.

Heck, they might only be able to afford a 5050.
Just because NVIDIA overcharges doesn't mean that AMD will do the same to their biggest partners. We both know that the 4080 isn't worth anywhere near its asking price of $1200 (down to $1000 for the 4080S) and that NVIDIA has ludicrous profit margins. AMD won't scam Sony like NVIDIA rips off their customers. They rely on volume, not high margins.
Looking at TechPowerUp, the 1080 Ti is at 6600 XT/2070 Super performance in raster, so no, the consoles didn't manage to have better GPUs less than 4 years later.
Yes, they did. The PS5 beats the 6600 XT 100% of the time, especially at higher resolutions against the 6600 XT's paltry 256 GB/s bandwidth. The 2070S also beats the 1080 Ti comfortably more often than not these days. That's without bringing up newer features such as Mesh Shaders that aren't even supported on Pascal. The PS6 will have a much more advanced feature set than the 4090 which will be ancient by then.
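For the bandwidth point specifically, a quick comparison; the 256 GB/s figure is from the post above, and 448 GB/s is the PS5's commonly cited GDDR6 bandwidth (shared with the CPU, so not a pure GPU figure).

```python
# Raw memory bandwidth behind the "paltry 256 GB/s" point above.
rx_6600_xt_gbs = 256   # 6600 XT, 128-bit GDDR6 (figure from the post)
ps5_gbs = 448          # PS5 GDDR6, shared between CPU and GPU

print(f"PS5 has {ps5_gbs / rx_6600_xt_gbs:.2f}x the raw bandwidth of the 6600 XT,")
print("which is a big part of why the gap widens at higher resolutions.")
```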

The argument could easily be made that they're slightly slower than the 1080 Ti.

You need to come back down to earth a little bit; GPUs are getting more and more expensive, and next gen is going to be purely about RT performance, maybe even PT performance.

And in RT/PT loads the 4090 is so far ahead of the 7900 XTX and AMD's other efforts it's not even funny; unless AMD manage to triple RT performance in RDNA4's mid-range offerings, you're not getting 4090 performance in PS6.

This should put into context how far behind Nvidia AMD are in next-gen RT/PT... notice that the 2080 Ti you've been talking about beats AMD's current best GPU? The PS5 in RT is the 6600 XT... at 0.5 fps.

AMD cards are currently garbage at ray tracing but it's rumored that RDNA4 will have pretty significant advancements with their dedicated ray tracing units. For one, ray intersections will no longer be done on the TMUs. I don't think AMD will match NVIDIA blow-for-blow anytime soon in ray tracing, but they won't remain 2 generations behind forever either. Ray tracing until RDNA4 was just an afterthought, almost a bonus thrown in because they could do it. Starting with RDNA4 and moving forward, it'll be a much bigger focus. It's becoming increasingly important, so AMD will have to stop just repurposing their TMUs for RT tasks when they're clearly not up to snuff.

It would be nothing short of embarrassing for a 2027/28 console to not beat a 2022/23 GPU.
 
Isn't the Turing architecture from 2018 more advanced than RDNA 3 from 2023?
If a console is to come close to the performance of an RTX 4090 in path tracing, AMD really has to catch up.

Intel Arc, on the other hand, was able to catch up from a standing start.

The GTX 1080 Ti isn't a great comparison point. The GTX 1080 Ti only draws around 250 W while an RTX 4090 ends up at around 400 W. The RTX 4090 also cost twice as much as a GTX 1080 Ti when it was released.

If a console is to achieve RTX 4090 path tracing performance at 200 W and $600, a lot still needs to be done. Very unlikely in 2026.
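With the units straightened out, the comparison is really just two ratios. A minimal sketch; the launch prices of $699 and $1,599 are the cards' US MSRPs (added here, not from the post), and the wattages are the figures above.

```python
# Power and launch-price ratios between the two cards discussed above.
# Performance is deliberately left out: the only arithmetic point is the
# bar a 4090 has to clear to actually win on perf/W and perf/$.
gtx_1080_ti = {"watts": 250, "launch_usd": 699}
rtx_4090    = {"watts": 400, "launch_usd": 1599}  # ~400 W figure used above

power_ratio = rtx_4090["watts"] / gtx_1080_ti["watts"]
price_ratio = rtx_4090["launch_usd"] / gtx_1080_ti["launch_usd"]

print(f"The 4090 draws {power_ratio:.2f}x the power and launched at {price_ratio:.2f}x the price,")
print(f"so it needs to be more than {power_ratio:.2f}x faster to improve perf/W "
      f"and more than {price_ratio:.2f}x faster to improve perf/$.")
```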

Also when I look at DLSS it has effectively doubled the performance of the Nvidia graphics cards. I would rather take an RTX 4080 with DLSS than an RTX 4090 without DLSS.

It would be nothing short of embarrassing for a 2027/28 console to not beat a 2022/23 GPU.
Let's hope that the PlayStation 5 Pro beats an RTX 2080 Ti in path tracing then. There are 6 years in between.
 
The GTX 1080 Ti only draws around 250 W while an RTX 4090 ends up at around 400 W. The RTX 4090 also cost twice as much as a GTX 1080 Ti when it was released.

If a console is to achieve RTX 4090 path tracing performance at 200 W and $600, a lot still needs to be done. Very unlikely in 2026.
The PS6 is coming out most likely in 2028, not 2026. The PS3 was released in November 2006. The PS4 in November 2013, and the PS5 in November 2020. Exactly 7 years between each new console so the PS6 with a release date around November 2027 (almost 2028) is very likely. Obviously, a PS6 released in 2026 would have no chance of outperforming a 4090.
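The cadence arithmetic from those release dates, just spelled out:

```python
# Release cadence from the dates above (Nov 2006, Nov 2013, Nov 2020).
release_years = {"PS3": 2006, "PS4": 2013, "PS5": 2020}

years = list(release_years.values())
gaps = [b - a for a, b in zip(years, years[1:])]   # [7, 7]
avg_gap = sum(gaps) / len(gaps)

print(f"Average gap: {avg_gap:.0f} years -> PS6 projected around November {years[-1] + round(avg_gap)}")
# 2020 + 7 = 2027, i.e. a late-2027/early-2028 window rather than 2026.
```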

You've also seen how utterly irresponsible the power delivery of the 4090 is out of the box. You can massively undervolt it and retain performance close to stock. It's far beyond the optimal efficiency curve.

[Attachment: 4090 Powerscaling-scaled.jpg — RTX 4090 power scaling chart]

81% of the power consumption for 98% of the performance. It varies a bit but in general, you can maintain around 90-95% of the 4090's performance for around 80% of the TDP.
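In perf/W terms, those chart numbers work out as follows; only the 81%/98% figures quoted above are used.

```python
# Perf/W gain implied by the power-limited 4090 numbers quoted above.
stock_perf, stock_power = 1.00, 1.00       # normalised stock 4090
limited_perf, limited_power = 0.98, 0.81   # 98% performance at 81% power

gain = (limited_perf / limited_power) / (stock_perf / stock_power)
print(f"Power-limited 4090: {gain:.2f}x the stock perf/W")

# 0.98 / 0.81 ~= 1.21, i.e. roughly 21% better perf/W just from pulling
# the card back toward its efficiency sweet spot.
```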
Also when I look at DLSS it has effectively doubled the performance of the Nvidia graphics cards. I would rather take an RTX 4080 with DLSS than an RTX 4090 without DLSS.

Let's hope that the PlayStation 5 Pro beats an RTX 2080 Ti in path tracing then. There are 6 years in between.
We'll have to see what N48 does because I don't think there will be any sort of path tracing on the PS5 Pro. It's not viable on the vast majority of GPUs, including ones significantly stronger than whatever the PS5 Pro will have such as the 7900 XTX. Without frame generation, path tracing is only really playable on a 4080-class GPU and above.
 
