NVidia Ada Speculation, Rumours and Discussion

In fact, you could build a PC with similar or better performance at the same cost just a few months after their release (750 Ti + cheap i3).
That sort of setup was only ever 'comparable' with a PS4 back in like 2014 when people were comparing early cross-gen titles.

I still agree with your overall point, it just bugs me that people still believe this whole '750Ti + i3' thing. If anything, I think this really demonstrates that consoles tend to be more capable than people initially suspect. I remember Gamers Nexus did a video last year assessing that a PS5 was really only as powerful as a GTX 1070 or 2060 or some nonsense. Devs will get a lot out of these new systems, and meanwhile more powerful PCs will get put through harder workouts than these current cross-gen games, so they'll get put to good use. Definitely nothing 'depressing' about the situation.
 
That sort of setup was only ever 'comparable' with a PS4 back in like 2014 when people were comparing early cross-gen titles.

I still agree with your overall point, it just bugs me that people still believe this whole '750Ti + i3' thing. If anything, I think this really demonstrates that consoles tend to be more capable than people initially suspect. I remember Gamers Nexus did a video last year assessing that a PS5 was really only as powerful as a GTX 1070 or 2060 or some nonsense. Devs will get a lot out of these new systems, and meanwhile more powerful PCs will get put through harder workouts than these current cross-gen games, so they'll get put to good use. Definitely nothing 'depressing' about the situation.
Yeah, you don't really need an 8C CPU and a 2070S to play modern console games at pretty much console settings either. If not for COVID and crypto, it would probably be very similar to the previous gen from a PC cost perspective.
 
Actually it's not that bad this time, because the last console generation leap was smaller and the hardware was weaker, to the point that contemporary mid-range GPUs could outperform the PS4/XBO at very affordable prices before the consoles even launched. In fact, you could build a PC with similar or better performance at the same cost just a few months after their release (750 Ti + cheap i3). Besides, games continued to push PC hardware as they always have. This time around the console CPUs get a decent upgrade as well, so it will be interesting to see how that translates into CPU usage in upcoming games.

Lots of developers are still targeting last-gen consoles, so once they move to a single hardware generation target we'll really see a difference. Even last gen we had massive jumps in visual fidelity within the same series from the start of the gen to the end (COD Ghosts > COD MW, BF4 > BFV, AC4 > AC Unity, GTA V > RDR2). Basically, what I'm saying is that as time goes on we will see a significant jump in visual fidelity, as we have in every console generation.

The PS3>PS4 leap was larger than PS4>PS5 in hardware terms. The 750 Ti was and still is a weird comparison to begin with; a 7870-class GPU holds up much better today. Kepler wasn't really going the direction it should have been going. Mid-range GPUs from 2020 onwards are actually outperforming the consoles too. GPU-wise, the consoles are at the lower end now (RTX 3060 or RX 6600 XT ballpark); in RT they're below that.
CPU-wise, they're well below a Zen 2 3700X (a 2019 CPU).


I think this really demonstrates that consoles tend to be more capable than people initially suspect.

They have been performing quite close to what the specifications indicate. If people expected much below that then they were indeed expecting too little.
 
They have been performing quite close to what the specifications indicate. If people expected much below that then they were indeed expecting too little.
Again, judging what the consoles can do based on cross-gen titles is silly. They will be capable of some pretty amazing things that GPUs like a 2070 or whatever will ultimately start struggling with down the line.
 
Again, judging what the consoles can do based on cross-gen titles is silly. They will be capable of some pretty amazing things that GPUs like a 2070 or whatever will ultimately start struggling with down the line.

There's no obvious reason for that to be the case. They're a match feature-set-wise, and there doesn't seem to be any specific architectural strength in RDNA2 that could put it ahead of Turing if focused on. That differs from the previous generation, where GCN 1.0 had a clear advantage over Kepler in GPGPU/compute capability, which was heavily used as the generation moved on.

If anything, things have gone the opposite way this generation, with Turing being considerably more capable in ray tracing, which should get more focus throughout the generation. So it's quite possible we'll see the 2070 and similar cards performing relatively better against the consoles as this generation progresses.
 
There's no obvious reason for that to be the case. They're a match feature-set-wise, and there doesn't seem to be any specific architectural strength in RDNA2 that could put it ahead of Turing if focused on. That differs from the previous generation, where GCN 1.0 had a clear advantage over Kepler in GPGPU/compute capability, which was heavily used as the generation moved on.

If anything, things have gone the opposite way this generation, with Turing being considerably more capable in ray tracing, which should get more focus throughout the generation. So it's quite possible we'll see the 2070 and similar cards performing relatively better against the consoles as this generation progresses.
This isn't about AMD vs Nvidia, this is about developers always being able to squeeze more out of a fixed spec device over time, especially when they are usually the target baseline for development to begin with.

But yes, if we're talking ray tracing workloads, obviously RTX stuff will change the equation given RDNA2's weakness there.
 
Again, judging what the consoles can do based on cross-gen titles is silly.

'Next-gen' games taking advantage of the console RDNA1.5 hardware will take advantage of RDNA/Turing etc. hardware on PC equally as much.

They will be capable of some pretty amazing things that GPUs like a 2070 or whatever will ultimately start struggling with down the line.

An RTX 2070 will keep up perfectly fine and then some. It's probably going to be the other way around: a 2070-class GPU has around the same raster performance but has the upper hand in ray tracing and AI reconstruction. More often than not that 2070 will be paired with something more capable than the consoles' CPUs, which are basically downclocked/cut-down 2019 Zen 2 3700X parts. Turing GPUs have a compute advantage too, hence FSR2 actually runs faster on NV hardware.

this is about developers always being able to squeeze more out of a fixed spec device over time, especially when they are usually the target baseline for development to begin with.

That didn't really show last generation. Yes, if we compare to Kepler, but as mentioned by another user, that's a different matter. GCN GPUs of equal class/power to the consoles (7870 etc.) actually hold up very well even today compared to the base consoles.
 
'Next-gen' games taking advantage of the console RDNA1.5 hardware will take advantage of RDNA/Turing etc. hardware on PC equally as much.
XSX|S is full-blown RDNA2; Infinity Cache isn't in all of AMD's own RDNA2 parts either. Only Sony stuck to the older frontend.
 
XSX|S is full-blown RDNA2; Infinity Cache isn't in all of AMD's own RDNA2 parts either. Only Sony stuck to the older frontend.

XSX is much closer to full RDNA2 indeed, but as I've understood it, it's not the full PC RDNA2 either, mainly missing Infinity Cache, which the 12TF dGPUs seem to sport. I think it was DF that mentioned this.
 
Again, judging what the consoles can do based on cross-gen titles is silly. They will be capable of some pretty amazing things that GPUs like a 2070 or whatever will ultimately start struggling with down the line.

That was true back when PC APIs were a lot fatter. Now with DX12 and Vulkan there isn't much difference.

The myth of consoles doing more with less is also partially due to the fact that consoles typically target lower resolutions and frame rates. DLSS/FSR etc. are closing that gap as well.
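To put rough numbers on the resolution point (my own arithmetic, not anyone's benchmark): native 4K is 3840×2160 ≈ 8.3M pixels, while 1440p is 2560×1440 ≈ 3.7M pixels, so a console rendering internally at 1440p and reconstructing to 4K shades roughly 8.3/3.7 ≈ 2.25x fewer pixels per frame than a PC running native 4K. Once the PC uses DLSS/FSR at the same internal resolution, that ~2.25x "advantage" evaporates.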
 
That was true back when PC APIs were a lot fatter. Now with DX12 and Vulkan there isn't much difference.

The myth of consoles doing more with less is also partially due to the fact that consoles typically target lower resolutions and frame rates. DLSS/FSR etc. are closing that gap as well.
I think the narrative is perpetuated by shoddily-optimized PC ports.

But let's be honest, we will continue to get more such garbage ports. So in practice it's sadly true that PCs will continue to punch a little below their weight (see what I did there?). But of course they can easily make up for it due to much more rapid iteration.
 
I can't believe we still have people who deny the advantages of fixed-spec optimization. Knowing exactly what architecture you are building for and the level of resources on offer lets you build around that in order to get better overall results.

Focusing on the GPU side also ignores critical memory and CPU optimization advantages.

I mean, even reading/watching devs talk about how they've made some normally quite demanding games work on Switch makes it abundantly clear that it wasn't all about just recklessly cutting everything down; it was about knowing the Switch's specific capabilities that could be targeted and optimized for, and basing their cuts on that. Knowing the Switch has 3GB of usable RAM, for instance, is useful so you can optimize around it, making precisely the necessary changes, no more, no less. In the end, you get a better overall result than you would if you just gutted everything for some nebulous lower-end hardware of varying architectures, core counts, RAM, cache setups, storage capabilities, etc. The same concept applies to building a game from scratch. It's no surprise that Sony's first-party developers tend to achieve better results than multiplatform developers on average. That isn't just because their studios are so much more talented; knowing exactly the hardware they are targeting helps a lot.
 
I can't believe we still have people who deny the advantages of fixed-spec optimization. Knowing exactly what architecture you are building for and the level of resources on offer lets you build around that in order to get better overall results.

Focusing on the GPU side also ignores critical memory and CPU optimization advantages.

I mean, even reading/watching devs talk about how they've made some normally quite demanding games work on Switch makes it abundantly clear that it wasn't all about just recklessly cutting everything down; it was about knowing the Switch's specific capabilities that could be targeted and optimized for, and basing their cuts on that. Knowing the Switch has 3GB of usable RAM, for instance, is useful so you can optimize around it, making precisely the necessary changes, no more, no less. In the end, you get a better overall result than you would if you just gutted everything for some nebulous lower-end hardware of varying architectures, core counts, RAM, cache setups, storage capabilities, etc. The same concept applies to building a game from scratch. It's no surprise that Sony's first-party developers tend to achieve better results than multiplatform developers on average. That isn't just because their studios are so much more talented; knowing exactly the hardware they are targeting helps a lot.

I don’t see anyone denying the benefits of targeting a single fixed spec. Fact is though that the number of exclusives is dwindling and most games today target multiple platforms. And all those platforms are basically PCs. So those targeted optimizations aren’t so platform specific anymore like they were in prior gens.
 
There is nothing special about the last two generations. They're using off-the-shelf x86 CPUs and standard desktop GPUs.
Close, but not quite. They use pretty much "off the shelf" architectures, but the implementations are unique, and in Sony's case there isn't anything else with a Vega-derived frontend on an otherwise-RDNA2 GPU.
Also, no CPU out there uses GDDR6; the consoles do.
And even if they used bog standard Ryzen 7 3800X and RX 6700 (as examples) Devs would still get the benefit of fixed spec(s) with fixed capabilities to target.
 
And even if they used bog standard Ryzen 7 3800X and RX 6700 (as examples) Devs would still get the benefit of fixed spec(s) with fixed capabilities to target.
These benefits are being severely overestimated generally though. They are there, but we're talking about ~20% additional performance at best when compared to similar PC parts. And while that was somewhat of a big number back in the days of PS3, now it's not that much - it's basically the gap between PS5 and XSX - and does that translate into much of a visible advantage for the latter?

I agree that the narrative here should be more about badly optimized PC releases (and the well optimized ones too, of which there are plenty) and less about console h/w having magical performance capabilities because of its fixed nature.

Also I feel that the 6700 XT and 3060 will be two interesting PC performance points for any future comparison to console h/w. The 2070/S may eventually run into VRAM limitations.
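For reference, the paper-spec arithmetic behind that ~20% figure: XSX advertises 12.15 TF against the PS5's 10.28 TF, and 12.15 / 10.28 ≈ 1.18, i.e. roughly an 18% gap - right in that band.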
 
Also, no CPU out there uses GDDR6; the consoles do.

Fortunately PCs don't. GDDR6 doesn't play all that nicely in non-gaming tasks when teamed with a CPU; its latency does hurt certain workloads.
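To make the latency point measurable, here's a minimal pointer-chasing sketch (my own illustration; the constants are arbitrary and to be tuned). Dependent loads defeat prefetchers and out-of-order execution, so the loop time approximates raw memory latency, which is exactly where GDDR6 gives ground to DDR4/DDR5 on CPU workloads:

```cpp
// Pointer-chase microbenchmark: each load's address depends on the
// previous load's result, so throughput collapses to memory latency.
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <utility>
#include <vector>

int main() {
    const std::size_t N = std::size_t{1} << 24; // 16M entries (~128 MiB), far bigger than any cache
    std::vector<std::size_t> next(N);
    std::iota(next.begin(), next.end(), std::size_t{0});

    // Sattolo's algorithm builds one N-long cycle, so the chase can't
    // settle into a short loop that fits in cache.
    std::mt19937_64 rng{42};
    for (std::size_t i = N - 1; i > 0; --i) {
        std::uniform_int_distribution<std::size_t> pick(0, i - 1);
        std::swap(next[i], next[pick(rng)]);
    }

    std::size_t p = 0;
    const auto t0 = std::chrono::steady_clock::now();
    for (std::size_t i = 0; i < N; ++i)
        p = next[p]; // serialized, latency-bound loads
    const auto t1 = std::chrono::steady_clock::now();

    const double ns = std::chrono::duration<double, std::nano>(t1 - t0).count();
    std::printf("~%.1f ns per dependent load (sink=%zu)\n", ns / N, p);
}
```

Run it on a DDR4/DDR5 desktop versus any GDDR-backed system and the per-load figure is the number that moves.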

These benefits are being severely overestimated generally though. They are there, but we're talking about ~20% additional performance at best when compared to similar PC parts. And while that was somewhat of a big number back in the days of PS3, now it's not that much - it's basically the gap between PS5 and XSX - and does that translate into much of a visible advantage for the latter?

I agree that the narrative here should be more about badly optimized PC releases (and the well optimized ones too, of which there are plenty) and less about console h/w having magical performance capabilities because of its fixed nature.

Also I feel that the 6700 XT and 3060 will be two interesting PC performance points for any future comparison to console h/w. The 2070/S may eventually run into VRAM limitations.

This. Looking back, a 7850/7870 does play PS4 exclusives quite well, the main drawbacks being VRAM limitations and, to some extent, driver support. There's basically no extra performance from the PS4 over matching 2012 hardware. VRAM won't be a disadvantage for most GPUs this time around though. Driver support windows seem to have grown too (10 years from Nvidia for last generation's GPUs).

It's indeed more about how ports are done these days, but remember that PC ports were in much worse shape 10 to 20 years ago; those days are long gone. Also worth adding: Sony's ports are generally excellent, and Sony is practically the only platform holder whose console still sees true exclusives these days, bar the Switch.

For the last part, Turing is a 2018 product launched well before the consoles (timeline-wise more in line with last gen's premium consoles). Ampere and RDNA2 will indeed be interesting comparison points. So far, two years in, the RX 6600 XT and 3060 do very, very well.
 
I can't believe we still have people who deny the advantages of fixed-spec optimization. Knowing exactly what architecture you are building for and the level of resources on offer lets you build around that in order to get better overall results.
I don't think anyone is denying it, we're just saying that the reality is messier due to several factors.

Only a handful of dev houses have the will, capability and budget to squeeze every last ounce of performance from those fixed specs. Naughty Dog and Insomniac are the exemplars. But then you have id and Nixxes on the PC side as well.

The vast majority of developers have to target a sliding window of multiple console platforms along with PC. Today you have PS4-Pro, PS5, Xbox One X, Xbox Series S, Xbox Series X (Switch is a special case). By the time the previous generation is abandoned, we will begin to see PS5 and XBS "pro" refreshes. Yeah this space is still 10x smaller than the space for PCs but it's not quite a clean fixed spec either, and very few dev houses have the means and impetus to optimize that hard.

It's just that most games are console-first (with good reason), and PC ports are often half-hearted efforts because the devs know that higher-specced PC hardware can compensate for sloppy ports by sheer brute force. So yes, the phenomenon does exist, though to me it's less of consoles punching above their weight and more of Class-X hardware being prioritized over Class-Y hardware.

On a side note, I'm not sure the DX12 and VK direction has worked out as intended. Exposing PC bare metal to devs only works if they are willing to take advantage of it. I'm not sure that's actually happening. DX11 allowed the hardware vendors to work their secret sauce under the covers.
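To give one concrete example of what losing the "fat" API means in practice (a minimal sketch, assuming D3D12; the helper name is hypothetical): the application now owns CPU/GPU synchronization via fences, something a D3D11 driver handled implicitly behind Map/Present.

```cpp
// Explicit CPU/GPU sync in D3D12: submit work, signal a fence from the
// queue, and block the CPU only if the GPU hasn't reached that value yet.
#include <windows.h>
#include <d3d12.h>

void SubmitAndWait(ID3D12CommandQueue* queue,
                   ID3D12CommandList* const* lists, UINT count,
                   ID3D12Fence* fence, UINT64& fenceValue, HANDLE event) {
    queue->ExecuteCommandLists(count, lists);  // GPU starts consuming work
    const UINT64 target = ++fenceValue;
    queue->Signal(fence, target);              // GPU writes 'target' when done
    if (fence->GetCompletedValue() < target) { // CPU waits only if necessary
        fence->SetEventOnCompletion(target, event);
        WaitForSingleObject(event, INFINITE);
    }
}
```

Get any of that ordering wrong and you stall or corrupt frames, which is exactly the kind of per-engine effort that only happens if devs are willing to invest in it.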

... Switch ...
Switch is a special case and fixed-spec optimization is certainly at play here, firstly out of sheer necessity and secondly because it actually is a single fixed spec (kinda). But I'm struggling to extrapolate the observations from this special case to the much higher-spec new Sony/MS consoles.

... CPUs ...
I suppose this is also true; my argument was definitely more GPU-oriented. I'll bet there were huge CPU optimizations in the prior gen due to (a) the severe suckage of the Jaguar CPUs and (b) their narrow, weak cores, which reward careful hand optimization.

However, I think the situation is going to be dramatically different for the current gen. Those Zen2 CPUs are very capable, and they are modern superscalar out-of-order architectures with immensely capable hardware schedulers, branch predictors and cache prefetchers that you really don't have to manhandle, and the macroscopic code optimizations (tiling etc.) are already built into modern compilers. Basically it's a "welcome to 1995" situation for console CPUs (I'm being facetious (but only partially)).
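As a footnote on the "tiling etc." remark, here's what such a macroscopic optimization looks like - a minimal cache-blocking sketch (illustrative only; the function is hypothetical and the tile size is a guess you'd tune per CPU):

```cpp
// Tiled matrix transpose: visiting the matrix in BxB blocks keeps both
// the read and write working sets inside L1/L2, instead of striding
// through a full row/column and thrashing the cache.
#include <cstddef>

void transpose_tiled(const float* in, float* out, std::size_t n) {
    constexpr std::size_t B = 64; // tile edge, sized to cache (a guess here)
    for (std::size_t ii = 0; ii < n; ii += B)
        for (std::size_t jj = 0; jj < n; jj += B)
            for (std::size_t i = ii; i < ii + B && i < n; ++i)
                for (std::size_t j = jj; j < jj + B && j < n; ++j)
                    out[j * n + i] = in[i * n + j];
}
```

The point being: this is the sort of transformation devs hand-rolled for Jaguar-era consoles, and that modern compilers and big out-of-order cores now mostly absorb on their own.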
 