Next-gen console versus PC comparison *spawn

Well, since a 7870-class GPU still tags along for the most part, I assume any mid-range AMD 6000-series GPU will tag along just fine too. Especially since they now actually carry as much RAM as the entire consoles do.
 
So, what GPU (AMD variant), CPU, RAM etc. would one want to match the PS5?

My take would be that for the next 2-4 years a 3700X + 6800 would be overkill. After that, as driver support drops off and games take more advantage of low-level access on the consoles, there's a chance you might need more, at least on the CPU front. I imagine 16GB of system memory should last all generation if you also have 16GB of VRAM, and you'll obviously need a very fast SSD too. I'd be going for one of the 7GB/s models, as we still don't know how decompression is going to work on the AMD side, so we can't necessarily rely on it.
 
So, what GPU (AMD variant), CPU, RAM etc. would one want to match the PS5?
CPU should be at least 8c/16t, and 16GB of RAM should be enough. The GPU will be tricky. I think a 12TF GPU (extra overhead to account for the console being more optimized) should be enough to at least match the PS5, but that means a 2080 Ti (and it is a different architecture, so it might or might not be enough; I think it will be perfect). On the AMD side there is no 12TF GPU: you have a 16TF GPU in the 6800, and the rumored 6700XT is around 10TF, so there is a big empty space between the 6700XT and the 6800. Of course there is the 3060, but Ampere flops don't scale that well for gaming, so a 13TF 3060 is probably not going to perform as well as a 10TF Navi part. So if you're after a next-gen GPU to match the PS5 (or XsX), you either undershoot or overshoot. Of course VRAM is also important, and even if you wanted to go with the 3060, its VRAM is only 6GB.
Personally, my ideal GPU to match the PS5 would be a 12TF RDNA2 or a 15TF Ampere; neither exists (rough numbers are sketched below).
For me personally, I'm not aiming to match the PS5 or XsX. I'm aiming to play next-gen games at 1080p at PS5/XsX settings (not XsS settings), so a 6700 or 6700XT should be good enough for me as long as it has 8GB (preferably more) of VRAM. The rumor is that those will have 12GB of VRAM, so that should be perfect for my needs. Again based on the rumor, the bigger one will only be 150W (not sure if that is TDP or TBP), so hopefully a single 8-pin power connector is enough.
I do use a 4K TV as my monitor, but from where I sit when I'm gaming, 4K is almost useless (and I sit less than 2m from the TV), so I don't really care about 4K.
If a 6700 is within my budget, then my worry shifts from GPU to CPU, since I'm currently on a Ryzen 3600. I'm afraid I will get less-than-ideal performance in next-gen games with my CPU being the biggest bottleneck. If devs start to target PS5/XsX levels of CPU performance, I have no doubt I will be CPU limited at 1080p even if I only want a stable 60fps. I don't expect devs to ignore 6c/12t CPUs, but there will surely be games that run at less than a stable 60fps at 1080p regardless of GPU because of CPU limitations. Of course you can always turn stuff off to make it more stable, but that means you're wasting GPU performance.
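For reference, here is a minimal back-of-the-envelope sketch (in Python) of the theoretical FP32 figures being thrown around above, computed as shader ALUs × 2 FLOPs per clock (FMA) × clock, using the published shader counts and boost/peak clocks. As noted above, raw TF don't translate directly into game performance across architectures, so treat these as ballpark values only.

```python
# Rough theoretical FP32 throughput: shader ALUs x 2 ops per clock (FMA) x clock.
# Clocks are the advertised boost/peak values; raw TF is not directly
# comparable across architectures.

def tflops(shaders: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS = shaders * 2 FLOPs per cycle * clock (GHz) / 1000."""
    return shaders * 2 * clock_ghz / 1000.0

gpus = {
    "PS5 (36 CU RDNA2)":     (36 * 64, 2.23),   # 2304 ALUs, up to 2.23 GHz
    "XsX (52 CU RDNA2)":     (52 * 64, 1.825),  # 3328 ALUs, fixed 1.825 GHz
    "RX 6800 (60 CU RDNA2)": (60 * 64, 2.105),  # 3840 ALUs at boost clock
    "RTX 2080 Ti (Turing)":  (4352,    1.545),  # often quoted ~13.4 TF at reference boost
}

for name, (shaders, clock) in gpus.items():
    print(f"{name:<24} ~{tflops(shaders, clock):.1f} TF")
```

By these raw numbers the 6800 overshoots the PS5 while the rumored ~10TF 6700XT undershoots it, which is exactly the gap described above.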
 

Thanks for your constructive post. Makes a lot of sense; my 7870 and 7950 still perform at least as well as the consoles at the same settings, so that looks good. I would say go for a 6700XT at the least, but preferably a 6800/XT for that extra power to enjoy a major step up over the PS5.
 
"At least as well."


To be fair, he did say a 7870, which is quite a lot faster than a 7850. And the game does require 4GB minimum, so there's definitely going to be an impact there. 2GB GPUs at the start of last gen were doubtless hampered by their VRAM size regardless of their higher core power, which doesn't look like it will be a problem this generation.

The ratio between the PS5 and the RX 6800 is more akin to the 7950 in comparison to the PS4 - if the 7950 had 8GB of VRAM. I'd expect such a GPU to be more than capable of matching or exceeding base PS4 performance in WD:L.
 

But the two GPUs (7870 and 7850) are VRAM limited. That is what we said. We aren't talking about the RX 6800. Multiple times now he has said something false and repeated it like a robot. We don't care about a hypothetical 7950 or 7870 with more RAM; that product was never available.

At least this time, all GPUs apart from probably the 3070 with its 8 GB of VRAM will probably not be VRAM limited.

EDIT:
Makes a lot of sense; my 7870 and 7950 still perform at least as well as the consoles at the same settings, so that looks good

false
 
To be fair, he did say a 7870, which is quite a lot faster than a 7850. And the game does require 4GB minimum, so there's definitely going to be an impact there. 2GB GPUs at the start of last gen were doubtless hampered by their VRAM size regardless of their higher core power, which doesn't look like it will be a problem this generation.

The ratio between the PS5 and the RX 6800 is more akin to the 7950 in comparison to the PS4 - if the 7950 had 8GB of VRAM. I'd expect such a GPU to be more than capable of matching or exceeding base PS4 performance in WD:L.
The 7870 is 15% faster. It's not coming close to bridging the gap: 960x540 is less than half the pixel count of the lowest resolution the PS4 hits, 1440x810. The PS4 also has much higher visual settings while having a minimum framerate 67% higher.
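For anyone who wants to check the arithmetic, a quick sketch of the pixel counts behind that claim, using only the two resolutions already quoted above:

```python
# Pixel counts for the two resolutions quoted above.
pc_low = 960 * 540      # 518,400 pixels
ps4_low = 1440 * 810    # 1,166,400 pixels

print(f"PC at 960x540:   {pc_low:,} px")
print(f"PS4 at 1440x810: {ps4_low:,} px")
print(f"Ratio: {pc_low / ps4_low:.2f}x")  # ~0.44, i.e. less than half
```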
 
But the two GPUs (7870 and 7850) are VRAM limited. That is what we said. We aren't talking about the RX 6800. Multiple times now he has said something false and repeated it like a robot. We don't care about a hypothetical 7950 or 7870 with more RAM; that product was never available.

At least this time, all GPUs apart from probably the 3070 with its 8 GB of VRAM will probably not be VRAM limited.


I was referring back to the earlier question of what you'd need to last out this generation at PS5 settings and my response around the 6800. I thought PSman was using the 7870/7950 performance as a proxy, so my point was that the 7950 would be the better proxy but you'd still have to give it more VRAM.

There was actually a version of the 7970 with 6GB of RAM which would probably be a reasonable proxy for the 6800XT today. It'd be interesting to see how that fares in current games.

The 7870 is 15% faster. It's not coming close to bridging the gap: 960x540 is less than half the pixel count of the lowest resolution the PS4 hits, 1440x810. The PS4 also has much higher visual settings while having a minimum framerate 67% higher.

But the 7870 had 45% more shader throughput than the 7850, so it depends what the bottleneck is in this game (outside of VRAM). And besides, this is hardly an apples-to-apples comparison. We have no idea how the settings compare across the two versions, and while the PC version doubtless looks worse overall on the lowest settings, it's entirely possible that some settings don't scale as low as the consoles' (this is fairly common) and those settings could be bottlenecking performance.

Better to try to match settings as much as possible, including resolution, and then just measure frame rate differences. Don't get me wrong, the 7850 would still come out horribly in that comparison, but at least it would be a reasonable basis for comparison. That said, the lack of VRAM makes such a comparison almost worthless because increasing the settings increases the pressure on VRAM too.
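One crude way to do that kind of settings-matched comparison when the captures end up at different resolutions is to normalise frame rate by pixel count. This is only a rough sketch: it assumes a GPU-bound workload that scales roughly linearly with resolution, and the frame-rate numbers below are placeholders, not measurements.

```python
# Normalise frame rate by pixel count as a rough GPU-throughput proxy.
# Assumes a GPU-bound, resolution-scaling workload; the fps values here
# are placeholders for illustration, not real benchmark results.

def pixel_rate_mpix_s(width: int, height: int, fps: float) -> float:
    """Millions of pixels rendered per second at this resolution and frame rate."""
    return width * height * fps / 1e6

gpu_7850 = pixel_rate_mpix_s(960, 540, 30)    # placeholder 7850 result
ps4      = pixel_rate_mpix_s(1440, 810, 30)   # placeholder PS4 result

print(f"7850: {gpu_7850:.0f} Mpix/s, PS4: {ps4:.0f} Mpix/s, "
      f"PS4 advantage: {ps4 / gpu_7850:.2f}x")
```

With matched settings this at least puts both results in the same units, though as noted above, VRAM pressure can still skew the outcome.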
 

You'd definitely need to compensate for the IO too. I don't think it's simply a case of taking one GPU and saying another GPU roughly matches it on performance and VRAM. An equivalent PC would need to be much more performant in flops, have a similar VRAM allocation (will 8GB really be enough for the whole gen?) and be able to be fed new data quicker than the PS5's IO can, in order to make up for other deficits.

I think we can all agree that the 7850/7870 are not comparable to the PS4 when you consider all parameters and not just one dimension. PSMan is suggesting that they are and using it as a means to show how you'd need to apply the same logic here, which is patently false and prevents further informed discussion.
 
But the 7870 had 45% more shader throughput than the 7850, so it depends what the bottleneck is in this game (outside of VRAM). And besides, this is hardly an apples-to-apples comparison. We have no idea how the settings compare across the two versions, and while the PC version doubtless looks worse overall on the lowest settings, it's entirely possible that some settings don't scale as low as the consoles' (this is fairly common) and those settings could be bottlenecking performance.

Better to try to match settings as much as possible, including resolution, and then just measure frame rate differences. Don't get me wrong, the 7850 would still come out horribly in that comparison, but at least it would be a reasonable basis for comparison. That said, the lack of VRAM makes such a comparison almost worthless because increasing the settings increases the pressure on VRAM too.
Did you watch the video? It actually looks like an Xbox 360 game at these settings. I'd say it's incredibly unlikely that some higher-than-console settings are bottlenecking performance.
 
You'd definitely need to compensate for the IO too. I don't think it's simply a case of taking one GPU and saying another GPU roughly matches it on performance and VRAM. An equivalent PC would need to be much more performant in flops, have a similar VRAM allocation (will 8GB really be enough for the whole gen?) and be able to be fed new data quicker than the PS5's IO can, in order to make up for other deficits.

I think we can all agree that the 7850/7870 are not comparable to the PS4. PSMan is suggesting that they are and using it as a means to show how you'd need to apply the same logic here, which is patently false and prevents further informed discussion.

I think 10 GB is the minimum; this is the amount of fast RAM in the Xbox Series X. Below that, you risk being VRAM limited in the long term.
 

Even then you'd need to have a fast IO system flushing that data constantly. The next gen is going to see significant changes in how data are managed.

I don't think we're going to see huge numbers of games that simply load 8-16GB for a level and do no further loading of data during the time spent on that level. Fighting games perhaps, but those are few and far between these days, and even those could potentially have warping and changing levels that'd load data constantly at rates exceeding what the best HDD can deliver.
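To put some rough numbers on that, here is a small sketch of how long it takes to (re)fill a level-sized working set at different raw transfer rates. The rates are typical sequential figures (the PS5 number is Sony's quoted raw rate); decompression, seeks and file-system overhead are ignored, so treat these as ballpark values.

```python
# Time to stream a level-sized working set at typical raw sequential rates.
# Ignores decompression, seeks and file-system overhead; ballpark only.

rates_gb_s = {
    "HDD (~100 MB/s)":         0.1,
    "SATA SSD (~550 MB/s)":    0.55,
    "PCIe 3 NVMe (~3.5 GB/s)": 3.5,
    "PS5 raw (5.5 GB/s)":      5.5,
    "PCIe 4 NVMe (~7 GB/s)":   7.0,
}

for working_set_gb in (8, 16):
    print(f"--- {working_set_gb} GB working set ---")
    for name, rate in rates_gb_s.items():
        print(f"{name:<26} {working_set_gb / rate:6.1f} s")
```

Even a modest working set is a minute or more of reloading on an HDD but only a second or two on the fast NVMe drives discussed here.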
 

I agree, a good PCIe 3 NVMe SSD will be mandatory.
 
I think we can all agree that the 7850/7870 are not comparable to the PS4 when you consider all parameters and not just one dimension.

Watch Dogs is considered a terrible port; the same goes for HZD according to many (depending on configuration).

Yes, the 7870 still tags along fine. You can still game on it, pretty close to base One/PS4 settings. Actually, settings on the base consoles can be lower than low on PC. Also, it's not like the base PS4 never drops any frames (seen RDR2, for example?); it's kinda bad at times, especially in modern games.
Looking at test videos, for what it is the 7870 2GB isn't that far off from what the base PS4 is doing, despite it being a 2012 mid-range GPU with a puny 2GB of RAM and on drivers that are almost certainly nowhere near optimized for a 7870 anymore.
Someone who bought a 7870 back in 2012 had the option to use it for the whole generation; you have to go lower on settings as it ages, but so did the PS4.
You have to look at the whole seven years as well: in the beginning that 7870 was actually outperforming the PS4, and as time passed the 7870 had to go lower and lower in settings.

Seeing how it went for the comparable GTX 660... AMD GPUs have aged much better (due to compute, the consoles, etc.?).

A 3700X/6800 (XT) with a fast PCIe 3 or 4 NVMe and DirectStorage will last a generation, just like a 7870 is still serviceable even today.
All in all, I'm surprised a 7870 2GB from 2012 can even still run anything. It's probably the first time hardware has lasted this long, especially the lower-end stuff.

Consider that a 6800 or its XT variant sports 16GB of VRAM dedicated to the GPU; where the PS4 in 2013 had a RAM advantage, that now goes to the AMD GPU if we're talking 6800 or higher. The PS5 is not going to use all of its 16GB of RAM just for VRAM.
Aside from that, those GPUs are roughly twice as capable in raw TF (17 and 20 TF respectively), and probably more advanced due to their huge Infinity Cache pool, dedicated bandwidth and full RDNA2 feature set. Almost double the CUs at close to PS5 clocks.

Even then you'd need to have a fast IO system flushing that data constantly. The next gen is going to see significant changes in how data are managed.

Assuming NV, MS etc. aren't lying, we will see 14GB/s down the line on PC. For now, we will have to make do with 7GB/s PCIe 4 NVMe SSD solutions.
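Those headline figures are effectively raw SSD bandwidth multiplied by an assumed compression ratio. A minimal sketch, using the vendor-quoted ballpark ratios (these are marketing claims, not measurements):

```python
# Effective streaming rate = raw SSD rate x assumed compression ratio.
# The ratios are vendor-quoted ballparks (assumptions, not measurements).

def effective_rate(raw_gb_s: float, compression_ratio: float) -> float:
    return raw_gb_s * compression_ratio

configs = {
    "PS5 (5.5 GB/s raw, ~1.6x Kraken/Oodle)": (5.5, 1.6),
    "PC PCIe 4 (7 GB/s raw, ~2x claimed)":    (7.0, 2.0),
}

for name, (raw, ratio) in configs.items():
    print(f"{name:<42} ~{effective_rate(raw, ratio):.1f} GB/s effective")
```

That is where the 14GB/s figure comes from: 7GB/s of raw PCIe 4 bandwidth at an assumed 2:1 compression ratio.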
 
You'd definitely need to compensate for the IO too. I don't think it's simply a case of taking one GPU and saying another GPU roughly matches it on performance and VRAM. An equivalent PC would need to be much more performant in flops, have a similar VRAM allocation (will 8GB really be enough for the whole gen?) and be able to be fed new data quicker than the PS5's IO can, in order to make up for other deficits.

Yeah, I did address that in my earlier post. My suggestion was that you'd want an RX 6800 along with as fast a PCIe 4.0 SSD as you can lay your hands on. The RX 6800 I suspect will be overkill during its "useful PC life" while it's still supported in drivers and by developers. But once it falls off that treadmill (like the first-gen GCN GPUs did some time ago) it'll have to rely on raw horsepower to make up the efficiency deficit. Whether it has enough of that to last the entire generation is up for debate, but my suspicion is that in 95%+ of cases it will.

On the IO side, we're still yet to find out exactly what Direct Storage does on the PC and whether AMD GPUs will handle decompression like Nvidia's do with RTX IO. If they will, then you probably won't need to go bleeding edge with the SSD, but I would anyway just in case GPU-based decompression isn't a default part of the Direct Storage standard.

Did you watch the video? It actually looks like an Xbox 360 game at these settings. I'd say it's incredibly unlikely that some higher-than-console settings are bottlenecking performance.

Oh yeah, it looked terrible. But often you will find that increasing settings from low to medium has a relatively low impact on performance, so it may only be a handful of settings that you need to bump up, with a minor performance impact, to equal the base console settings. Ditto with the resolution: the res used was ridiculously low, so I wouldn't be surprised if increasing it to something still quite low like 840p had only a small impact. It's possible that resolution isn't the bottleneck and the tester is simply turning everything down to minimum to try and maximise frame rates, rather than looking for an optimal balance of visual quality vs frame rate.

I think 10 GB is the minimum; this is the amount of fast RAM in the Xbox Series X. Below that, you risk being VRAM limited in the long term.

I'm not entirely convinced of this at the moment. While 10GB will definitely limit you at maxed-out PC settings (it already can in some games), it might be enough to last the generation at console settings. This was looked at in another thread recently with the 4GB GTX 980 acting as a proxy (much more core power but half the VRAM of the current-gen consoles), and the result seemed to be that the 980 was never bottlenecked by its VRAM at console settings, with no examples found where it didn't perform better (usually significantly so) than the PS4. Of course the fast IO could change that dynamic this gen though.

Even then you'd need to have a fast IO system flushing that data constantly. The next gen is going to see significant changes in how data are managed.

I don't think we're going to see huge numbers of games that simply load 8-16GB for a level and do no further loading of data during the time spent on that level. Fighting games perhaps, but those are few and far between these days, and even those could potentially have warping and changing levels that'd load data constantly at rates exceeding what the best HDD can deliver.

Yes, I agree with this. Worth noting though that PCs make things less straightforward to compare, as they also have a system RAM pool. System RAM can be used as a fast cache (faster than the consoles' SSDs) to store data outside of VRAM for fast retrieval without having to rely on fast IO. Granted, it can't replace some aspects of fast IO, but insofar as fast IO acts as a VRAM multiplier because you have to pre-cache less far ahead into the future, system RAM could fulfil a similar role. As a very simplified example, you could cache the next 30 seconds in 16GB of VRAM, and then another 30 seconds after that in system RAM, before having to go back to the IO (see the sketch at the end of this post).

I still see fast IO as essential this gen for a console matching experience though.
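To make that 30-seconds-per-tier example concrete, a minimal sketch of the lookahead each memory tier buys you. The sustained demand for new asset data and the spare-capacity budgets are assumed placeholder figures, not measurements.

```python
# How far ahead each memory tier can pre-cache, given an assumed sustained
# demand for *new* asset data while playing. All figures are illustrative
# assumptions, not measurements.

new_data_rate_gb_s = 0.5   # assumed demand for new assets during play
vram_budget_gb = 16        # spare VRAM used as the first streaming cache
sysram_budget_gb = 16      # spare system RAM as a second-level cache

vram_lookahead = vram_budget_gb / new_data_rate_gb_s
sysram_lookahead = sysram_budget_gb / new_data_rate_gb_s

print(f"VRAM covers                 ~{vram_lookahead:.0f} s of lookahead")
print(f"System RAM covers a further ~{sysram_lookahead:.0f} s before the SSD is needed")
```

At these assumed numbers each 16GB tier buys roughly half a minute of lookahead, which is where the "30 seconds" figure above comes from.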
 
This was looked at in another thread recently with the 4GB GTX 980 acting as a proxy (much more core power but half the VRAM of the current-gen consoles), and the result seemed to be that the 980 was never bottlenecked by its VRAM at console settings, with no examples found where it didn't perform better (usually significantly so) than the PS4.

Mind you, the comparison is to 5GB of title memory on the consoles, so 4GB ought to be more than enough to cover the GPU/VRAM allocation at similar settings.
 