AMD: Navi Speculation, Rumours and Discussion [2019-2020]

I've already told him that it scales both wide and fast, see consoles, yet he tried to akshually me.
Mobile is NOT desktop; different power, heat, and noise characteristics. Polaris was shown to be pretty power efficient when scaled low enough, Vega was the same thing, yet when you pushed their clocks and core counts things went south quickly.
 
Mobile is NOT desktop
Yes it is.
Goddammit, Max-Q is literally big desktop GPUs at lower clocks because it's nice (if expensive).
Polaris was shown to be pretty power efficient when scaled low enough, Vega was the same thing
Neither was even remotely close to nVidia in mobile; it wasn't even a competition.
The 5000M parts are like the first usable AMD mobile GPUs since, what, 7k mobile?
when you pushed their clocks and core counts things went south quickly.
PS5 pushes the clock alright, and it is essentially a beefy mobile config from a power-target POV (think MUSCLEBOOK instead of something 20-ish mm thin).
 
Polaris was on GF's 14nm process, which had notorious issues with clock scaling; Navi 2x is on TSMC's latest 7nm process, which scaled exceptionally well with Vega 20's latest iteration in both mobile and desktop, with very low power consumption at the same clocks as the 5700 XT. I think this argument is quite invalid as it stands at the moment. And frankly, these latest posts read more like an e-penis measuring contest on both sides than an actual contribution.
 
PS5 pushes the clock alright, and it is essentially a beefy mobile config from a power-target POV (think MUSCLEBOOK instead of something 20-ish mm thin).
Consoles have ZERO relevance to desktop chips; the performance profile and power characteristics are different. The Xbox One X Polaris GPU had zero relevance to either Polaris or Vega dGPUs. The max known console spec is 52 CUs @ 1825 MHz at UNKNOWN power; how this scales to 80 CUs @ 1825 MHz is another matter. PS5 clocks are variable, not fixed, because it's "power" limited.
Goddammit, Max-Q is literally big desktop GPUs at lower clocks because it's nice (if expensive).
Downclocked to hell and beyond to achieve a good power and acoustics profile; the best Max-Q 2080 chip is below desktop 2070 performance.
 
Consoles have ZERO relevance to desktop chips
Lol dude they're desktop-grade IP put on a single die and binned like shit because they have to take *everything*.
the Xbox One X Polaris GPU had zero relevance to either Polaris or Vega dGPUs
Dude what?
It was Polaris.
Much the same way PS4p GPU was Polaris.
Or like PS4 GPU was GCN2.
PS5 clocks are variable, not fixed, because it's "power" limited
They're variable in case of a funny power-virus scenario, à la 8c running a fancy 900 GFLOPS worth of SIMD.
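For anyone checking that ~900 GFLOPS figure, here's the napkin math. It assumes Zen 2 cores with two 256-bit FMA pipes each, and the clock is just illustrative:

```python
# Napkin math behind the ~900 GFLOPS SIMD "power virus" figure.
# Assumption: Zen 2 cores with two 256-bit FMA pipes each.
lanes_fp32 = 8                         # 256-bit wide / 32-bit floats
flops_per_cycle = lanes_fp32 * 2 * 2   # 2 ops per FMA, 2 pipes -> 32 per core per cycle
cores = 8
clock_ghz = 3.5                        # illustrative all-core SIMD clock

print(cores * clock_ghz * flops_per_cycle)  # 896 GFLOPS, i.e. roughly the quoted ~900
```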
Downclocked to hell and beyond to achieve a good power and acoustics profile; the best Max-Q 2080 chip is below desktop 2070 performance.
Guess what N22 is!
 
There were also actual figures (later deleted), around 130-140W (probably referring to the whole APU without the RAM).
The whole APU with DRAM should account for ~200W, since the CPU part has its power statically allocated, and it's a pretty infernal 3.66GHz Renoir config (why did they clock it so high?).
 
Let's keep the discussion focused on how AMD will scale compared to AMD.
Also, enough with the one-liners; they're detrimental to open discussion, pure noise with no signal.

That way posts won't need to be moved to the other topic or removed entirely. That's a win-win for everyone.
 
An alternative viewpoint: you can ballpark around 2.3x a 5600 XT if the concern is memory bandwidth scaling (384x16 / 192x14).
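Spelled out, using the rumoured Big Navi bus config and the 5600 XT's 14 Gbps spec from the formula above:

```python
# The bandwidth-scaling ballpark from the post above: 384x16 vs 192x14
big_navi_gbs = 384 * 16 / 8    # 768 GB/s (rumoured 384-bit bus @ 16 Gbps)
rx5600xt_gbs = 192 * 14 / 8    # 336 GB/s (192-bit bus @ 14 Gbps)

print(big_navi_gbs / rx5600xt_gbs)  # ~2.29 -> the "around 2.3x" figure
```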

Yeah, that is actually kind of what I was getting at myself. And this is without any architectural improvements. While not the best possible, I think it should be good enough.

One would hope they could with Navi 3x. I'd hate to see them using vanilla G6 for the next gen.
I mean, you could easily have debated with a "no, I did not see the desktop marketing material, but the very same people who showed it to me, and various other sources I have, told me that the same holds on desktop too" instead of mocking the interlocutor, and your contribution would be much more appreciated by everyone.

Agreed. Some articulation or rebuttals with details would go a long way.
Mobile is NOT desktop; different power, heat, and noise characteristics. Polaris was shown to be pretty power efficient when scaled low enough, Vega was the same thing, yet when you pushed their clocks and core counts things went south quickly.

Polaris and Vega aren't really comparable IMHO; they are completely different chips and on perhaps inferior processes (we don't know how GF's 14nm compares to Samsung/TSMC). Vega 20/Renoir on TSMC 7nm is a far more apt comparison if you want to make one.
There is zero evidence that Navi scales.

Conversely, is there evidence Navi doesn't scale?
Consoles have ZERO relevance to desktop chips; the performance profile and power characteristics are different. The Xbox One X Polaris GPU had zero relevance to either Polaris or Vega dGPUs. The max known console spec is 52 CUs @ 1825 MHz at UNKNOWN power; how this scales to 80 CUs @ 1825 MHz is another matter. PS5 clocks are variable, not fixed, because it's "power" limited.

Yet again, I don't get your logic here. You claimed earlier that 2x a 5700 XT would be "barely faster" than a 2080 Ti. Still haven't heard back from you on that one. Of course the consoles have relevance this gen, as they're the exact same architecture and on the same process as the desktop chips. The XSX power is NOT UNKNOWN at all, and we have a fair idea of the power consumption since they compared it directly to the Xbox One X. Are you saying you can't reasonably extrapolate from 52 CUs to 80 CUs? Obviously no one can tell you exact numbers, but we can get a fair idea.
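To make the extrapolation concrete, here's a naive FP32 sketch. The 52 CUs @ 1825 MHz figure is the announced XSX config; the 80 CU part and the assumption that the clock holds are pure speculation:

```python
# Naive FP32 throughput extrapolation from the known XSX GPU to a rumoured 80 CU part.
def tflops(cus, clock_mhz):
    # 64 FP32 lanes per CU, 2 ops per clock (FMA)
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

xsx = tflops(52, 1825)       # ~12.15 TF, matches the announced XSX number
big = tflops(80, 1825)       # ~18.7 TF, assuming (speculatively) the clock holds

print(xsx, big, big / xsx)   # the ratio is simply 80/52 ~= 1.54
```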
Context matters. This 1x GPU power involves everything, i.e. VRS, RTA units, etc.

Which is even better, right? It shows that even with all the added features, it's still 1x power. Or are you implying something different?
 
The 5600 XT is already on the verge of losing perf due to bandwidth. With the numbers above, Big Navi has +142% bandwidth, but the CU count alone is already +122% - without µarch improvements and (way?) higher clocks.

Why do you say the 5600 XT is on the verge of losing performance due to B/W? The 5700, with 33% more B/W, is only faster by approx. 6-8%. If anything, that shows it has plenty of bandwidth.
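Putting numbers on it (stock shipping configs; the 6-8% is the typical review delta, not my own measurement):

```python
# Bandwidth delta between the 5700 and 5600 XT at stock
bw_5700 = 256 * 14 / 8     # 448 GB/s (256-bit @ 14 Gbps)
bw_5600xt = 192 * 14 / 8   # 336 GB/s (192-bit @ 14 Gbps)

print(bw_5700 / bw_5600xt - 1)  # ~0.33 -> the "33% more B/W" above
# Against a ~6-8% real-world perf gap, that's a lot of spare bandwidth.
```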
 
You sure that's not normalized to power in order NOT to have to give that number away? Because going from 1080p to 8K is 16x more pixels. True, they could be referring to the 9x pixel fillrate, but they could be throwing smoke just as well.
Nah, it fits all too well adjusted for higher CPU and mem power.
I doubt they're pretending 8K is the primary target for XSX during the HC brief, of all things.
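For reference, the raw pixel math behind the 16x figure mentioned above:

```python
# 8K vs 1080p pixel counts
print((7680 * 4320) / (1920 * 1080))  # 16.0 -> 8K is 16x the pixels of 1080p
```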
 