AMD: Navi Speculation, Rumours and Discussion [2019-2020]

I suspect the XSX only actually needs that bandwidth for XB1X BC.
- the XB1X had 326 GB/s of bandwidth (vs. a paltry 218 GB/s on the PS4 Pro) - napkin math below.
- a focus of the console (or the PR focus, anyway) is 'better BC': doubling title performance to 60/120 fps, etc.
- if RDNA2 is using cache/compression to make up for bandwidth, that probably isn't going to provide much of a reliable boost in BC mode.
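For reference, a quick sketch of where those bandwidth numbers come from, assuming the publicly quoted bus widths and per-pin data rates (the helper function is just mine, nothing official):

```python
# Peak memory bandwidth from bus width and per-pin data rate.
# Assumed configs: GDDR5 @ 6.8 Gbps for the last-gen consoles,
# GDDR6 @ 14 Gbps on the Series X's fast 10 GB pool.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """GB/s = (bus width in bits / 8 bytes) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

print(f"XB1X    (384-bit GDDR5 @ 6.8 Gbps): {bandwidth_gbs(384, 6.8):.1f} GB/s")  # ~326.4
print(f"PS4 Pro (256-bit GDDR5 @ 6.8 Gbps): {bandwidth_gbs(256, 6.8):.1f} GB/s")  # ~217.6
print(f"XSX     (320-bit GDDR6 @ 14 Gbps):  {bandwidth_gbs(320, 14):.1f} GB/s")   # ~560.0
```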

The XSX also uses an APU; the CPU needs bandwidth too.
 
That's consistent with what ROPs have always done for AMD/ATI.

Do ROPs have internal caches? A 256-bit GDDR6 memory bus seems low for 128 ROPs, but if ROPs already have internal caches they probably don't need as much external bandwidth. Not to mention advances in colour compression, etc.
 
Do ROPs have internal caches? A 256-bit GDDR6 memory bus seems low for 128 ROPs, but if ROPs already have internal caches they probably don't need as much external bandwidth. Not to mention advances in colour compression, etc.
From the GCN whitepaper, 16KB color and 4KB depth per RBE. I didn't see a reference in the RDNA whitepaper for the capacity, but there's a diagram that mentions an RB cache as a client of the L1.
The ROP caches hold tiles of pixels being worked on or soon to be worked on, like a sort of conveyor belt that can burn a significant amount of bandwidth to cover for memory latency.
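To put the 16 KB per-RBE colour cache from the GCN whitepaper in perspective, here is a rough illustration of how many pixels it can keep resident; the tile shapes are illustrative only, not the actual hardware tiling:

```python
# How many pixels a 16 KB per-RBE colour cache can hold at a given
# framebuffer format. Tile dimensions in the comments are just examples.

CACHE_BYTES = 16 * 1024

def pixels_resident(bytes_per_pixel: int) -> int:
    """Pixels that fit in the colour cache for one render-backend."""
    return CACHE_BYTES // bytes_per_pixel

print(pixels_resident(4))   # 4096 pixels, e.g. a 64x64 tile of RGBA8
print(pixels_resident(8))   # 2048 pixels, e.g. a 64x32 tile of RGBA16F
```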
 
The XSX also uses an APU; the CPU needs bandwidth too.
From a quick Google, the memory bandwidth of a random Zen 2 processor is only around 60 GB/s. Noticeable, but dwarfed by the bandwidth requirements of the GPU (where I think Navi 21 should require ~1.8x the bandwidth of the XSX?).
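A rough sketch of where a ~1.8x figure could come from, if GPU bandwidth demand scales roughly with compute; the XSX CU count and clock are the official figures, while the 80 CU / ~2.2 GHz Navi 21 configuration is just the rumoured one, i.e. an assumption for the sake of the ratio:

```python
# FP32 throughput from CU count and clock, used as a proxy for how much
# bandwidth each part "wants". XSX config is official; Navi 21 is assumed.

def tflops(cus: int, clock_ghz: float) -> float:
    """FP32 TFLOPS = CUs * 64 lanes * 2 FLOPs/clock * clock (GHz) / 1000."""
    return cus * 64 * 2 * clock_ghz / 1000

xsx    = tflops(52, 1.825)   # ~12.1 TFLOPS
navi21 = tflops(80, 2.2)     # ~22.5 TFLOPS (rumoured config)
print(f"ratio: {navi21 / xsx:.2f}x")   # ~1.85x
```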
 
That might be a bit misleading, as it might be limited by the software that was available to run. I still hold out reservations until we're a year into next-gen console titles, or we have specific PC titles using AMD RT-RT.

Yes, utilization and power consumption will go up over the console's lifespan. Still impressive stuff.
 
Here they compare the same game (Red Dead Redemption 2) running on the old One X and the new Series X:
180 W on the former and 128 W on the latter.
Real next-gen games should tax the CPU/GPU harder, but that's quite impressive for RDNA2 / Zen 2 on 7 nm already.
English translation.

Does S3 sleep really draw 28W on the XSX, or does it go into a lower-power sleep mode after a while? Maybe I'm out of the loop and that's just how much 16GB of GDDR6 draws.

Edit: Maybe the Google-translated “Sleep Mode” means idle rather than standby? AnandTech measured ~50W idle (which matched that Dutch article) but ~15W standby on X1X.
 
Well then, fine: if the XSX is really that power efficient, I'd believe it possible that the benchmarks we have are from the cut-down die.

72ish CUs and 2ish GHz matches up pretty well with what's expected. Could be that with Nvidia pushing the 3080 so hard, AMD decided to take the same tack. Especially if they want the full die to sell for $1-1.5k, having a $600-700 GPU as the headliner feels more consumer friendly.
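Using the same CU math as above, here is what "72ish CUs at 2ish GHz" versus a rumoured 80 CU full die would imply in raw FP32 throughput; both configurations are speculation, not confirmed specs:

```python
# Speculative throughput for a cut-down vs. full Navi 21 die.

def tflops(cus: int, clock_ghz: float) -> float:
    """FP32 TFLOPS = CUs * 64 lanes * 2 FLOPs/clock * clock (GHz) / 1000."""
    return cus * 64 * 2 * clock_ghz / 1000

print(f"cut-down die (72 CU @ 2.0 GHz): {tflops(72, 2.0):.1f} TFLOPS")  # ~18.4
print(f"full die     (80 CU @ 2.2 GHz): {tflops(80, 2.2):.1f} TFLOPS")  # ~22.5
```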
 
Is any console data actually relevant? If it's not under the target FPS, it's probably clock-gating itself to hell (and I guess old-gen games are not that demanding, considering they have to run on a gimped 7850), so the power consumption data would be rather meaningless (unless you can repeat the same sequence with the same optimisation/settings on a PC).
 
but ~15W standby on X1X.
https://ec.europa.eu/info/energy-cl...roducts/mode-standby-and-networked-standby_en

Network-connected standby devices
Modern appliances are increasingly connected to the internet during standby mode, consuming higher amounts of energy. This includes networked televisions and decoders, printers, game consoles and modems.

  • Specific requirements for network-connected standby devices were introduced in 2013.
  • Since January 2017 networked standby devices must not consume more than 3 to 12 Watts depending on the product. This compares to 20 to 80 Watts previously.
 
Is any console data actually relevant? If it's not under the target FPS, it's probably clock-gating itself to hell (and I guess old-gen games are not that demanding, considering they have to run on a gimped 7850), so the power consumption data would be rather meaningless (unless you can repeat the same sequence with the same optimisation/settings on a PC).

Absolutely relevant. As stated, Dirt 5 has already been tried: a 60 fps 4K cross-gen title that isn't even out yet, and it can already cause dynamic resolution scaling below 4K at the highest settings on Series X. While Series X goes relatively wide and a tad slow, and thus is presumably efficient in the power-draw-to-performance department, and since the CPU and other components are no doubt not being maxed out for Dirt 5, it's not an easy comparison. But it's certainly enough to infer a general idea from.
 
Absolutely relevant. As stated, Dirt 5 has already been tried: a 60 fps 4K cross-gen title that isn't even out yet, and it can already cause dynamic resolution scaling below 4K at the highest settings on Series X. While Series X goes relatively wide and a tad slow, and thus is presumably efficient in the power-draw-to-performance department, and since the CPU and other components are no doubt not being maxed out for Dirt 5, it's not an easy comparison. But it's certainly enough to infer a general idea from.
Clocks and voltage are very low compared to what you'll see in the PC space, however. Something larger than the Series X with a boost clock above 2200 MHz is likely going to draw double the power of the Series X when it's boosting up there.
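A first-order sketch of why that's plausible, using P ∝ CUs × f × V²; the Series X figures are official, while the desktop CU count, clock, and the ~15% voltage bump needed to reach it are assumptions purely for illustration:

```python
# First-order dynamic power scaling: P ~ CU count * frequency * voltage^2.
# XSX: 52 CUs @ 1.825 GHz (official). Desktop part: 80 CUs @ 2.2 GHz with
# ~15% more voltage (assumed, just to illustrate the trend).

def relative_power(cus: int, clock_ghz: float, volt_rel: float) -> float:
    """GPU power relative to a 1-CU, 1-GHz, nominal-voltage baseline."""
    return cus * clock_ghz * volt_rel ** 2

xsx     = relative_power(52, 1.825, 1.00)
desktop = relative_power(80, 2.20,  1.15)   # assumed config and voltage
print(f"desktop / XSX GPU power: {desktop / xsx:.1f}x")   # ~2.5x
```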
 