AMD: Navi Speculation, Rumours and Discussion [2019-2020]

Oh boy, seeing the hype train about AMD not teasing with the top end...

I'm really looking forward to reveal of the memory configuration used.

Pricing for sure shouldn't be painted rosy - AMD competes internally for TSMC 7nm capacity, which weighs the margins of a 500+mm² Navi against chiplet-based EPYC CPUs.
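To put very rough numbers on that trade-off, here's a quick Python sketch of candidate dies per wafer and yield. The die sizes, defect density, and the simple Poisson yield model are all illustrative assumptions on my part, not AMD or TSMC figures:

    # Rough wafer-economics sketch; all numbers are illustrative assumptions.
    import math

    WAFER_DIAMETER_MM = 300
    DEFECT_DENSITY_CM2 = 0.1  # assumed defects per cm^2 for mature 7nm

    def dies_per_wafer(die_area_mm2):
        # Classic approximation: gross wafer area over die area,
        # minus a term for partial dies lost at the wafer edge.
        r = WAFER_DIAMETER_MM / 2
        return int(math.pi * r ** 2 / die_area_mm2
                   - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

    def poisson_yield(die_area_mm2):
        # Simple Poisson model: yield = exp(-area * defect density).
        return math.exp(-(die_area_mm2 / 100.0) * DEFECT_DENSITY_CM2)

    for name, area in [("hypothetical 505 mm^2 Navi", 505),
                       ("~74 mm^2 Zen 2 chiplet", 74)]:
        candidates = dies_per_wafer(area)
        good = candidates * poisson_yield(area)
        print(f"{name}: {candidates} candidates, ~{good:.0f} good dies/wafer")

On these made-up assumptions a big monolithic die yields far fewer good chips per wafer than the chiplets do, which is exactly the internal margin competition being described.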
 
People also need to take AMD's numbers with a grain of salt. They presumably chose those three games because they do well in them. And there's little reason they wouldn't show their top part now that Nvidia has already released cards above what they're showing.
 
People also need to take AMD's numbers with a grain of salt. They presumably chose those three games because they do well in them. And there's little reason they wouldn't show their top part now that Nvidia has already released cards above what they're showing.
They showed slides with a lot more than 3 games seeing an uplift.

Edit: oops wrong thread. Was Zen 3 perf in games, not Navi
 
People also need to take AMD's numbers with a grain of salt. They presumably chose those three games because they do well in them. And there's little reason they wouldn't show their top part now that Nvidia has already released cards above what they're showing.
By the selection of games, I would assume the opposite. Gears of War 5 is an Unreal Engine title, which has (traditionally) run better on NVidia cards.
 
They presumably chose those three games because they do well in them.
No presumption needed. BL3 DX12 for example gimps NV h/w performance considerably compared to DX11 while providing about zero benefits.

Gears of War 5 is an Unreal Engine title, which has (traditionally) run better on NVidia cards.
Gears 5 is a rare UE4 title which actually fares a bit better on AMD h/w - likely because of its first-party Xbox origins. But it's the closest in the list to being an "average type" title, yeah.
 
By the selection of games, I would assume the opposite. Gears of War 5 is an Unreal Engine title, which has (traditionally) run better on NVidia cards.
Gears uses a custom path optimized for GCN and AMD hardware, and the path also isn't gimped with DX12.
 
I don't see why these wouldn't be from the fastest GPU in their lineup. Historically aren't these types of vague performance tidbits a sign of a slower product?
 
Gears uses a custom path optimized for GCN and AMD hardware, and the path also isn't gimped with DX12.
Erm, what now? You're saying there's some separate API-less renderer for AMD hardware (which isn't even GCN anymore) when you select DirectX 12?
 
By the selection of games, I would assume the opposite
Sure, a nice unbiased selection of titles. One game is an AMD title where they did god knows what, so that the game ends up 15% slower with the DX12 renderer than with the DX11 renderer at the same settings on the competitor's GPUs - https://tpucdn.com/review/nvidia-ge...rs-edition/images/borderlands-3-3840-2160.png
On top of that, the game seems to be slightly broken in DX12 - "we used DirectX 11 because the DirectX 12 renderer has extremely long loading times and some instability."

Another title seems to be limited by CPU performance to some extent; some reviews with faster CPUs show slightly higher performance for the 3080 than others - https://tpucdn.com/review/nvidia-geforce-rtx-3080-founders-edition/images/gears-5-3840-2160.png
That's a 17% perf difference compared with TechSpot's results from their "Ryzen 9 3950X test system" - https://www.techspot.com/review/2099-geforce-rtx-3080/
What a coincidence! What a nice fit for the best gaming CPU. But wait, why are they showing a gaming GPU in a CPU-limited scenario? :rolleyes:
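The gap being pointed at is just the relative delta between two reviews of the same card at the same settings. A quick sketch of the arithmetic; the FPS figures below are placeholders chosen to reproduce the ~17% gap, not the actual review numbers:

    # Cross-review sanity check: percent gap between two results for the
    # same GPU at the same settings. A large gap hints at a platform limit
    # (CPU, RAM, driver) rather than the GPU itself.
    def rel_delta_pct(a_fps, b_fps):
        return (a_fps - b_fps) / b_fps * 100

    # Placeholder numbers, not measured data:
    print(f"{rel_delta_pct(82, 70):.0f}% gap between the two reviews")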

Also, Gears 5 is first and foremost an Xbox title which, unlike other UE4 titles, was optimized quite well for both GCN and RDNA
 
People also need to take AMD's numbers with a grain of salt. They presumably chose those three games because they do well in them. And there's little reason they wouldn't show their top part now that Nvidia has already released cards above what they're showing.

Well shoot, AMD might as well cancel the event they have lined up on the 28th of October then. You should apply to AMD's marketing team.
I don't see why these wouldn't be from the fastest GPU in their lineup. Historically aren't these types of vague performance tidbits a sign of a slower product?

Well I'm sure you're gonna mention Vega at some point so I'll do it for you. Yes, yes it is.
 
Also, Gears 5 is first and foremost an Xbox title which, unlike other UE4 titles, was optimized quite well for both GCN and RDNA

My memory is a bit hazy - which prior Xbox used RDNA again? And are you basing this optimized-for-GCN/RDNA bit on anything other than your own comment?
 
My memory is a bit hazy - which prior Xbox used RDNA again? And are you basing this optimized-for-GCN/RDNA bit on anything other than your own comment?
Both architectures share enough common stuff that one can benefit from optimizations for the other - the same command processors, the same async microcontrollers, a similar ISA, the same intrinsics, etc. Lots of low-level optimizations are portable between the two.
Gears 5 is a first-party Xbox title, so what makes you think it is any less optimized for GCN than any other first-party title?
 
Both architectures share enough common stuff that one can benefit from optimizations for the other - the same command processors, the same async microcontrollers, a similar ISA, the same intrinsics, etc. Lots of low-level optimizations are portable between the two.
Gears 5 is a first-party Xbox title, so what makes you think it is any less optimized for GCN than any other first-party title?

If it was so well optimized for GCN, you'd think a Radeon VII would soundly beat a 2070, wouldn't you?
 
It's a serialized step in the post-process chain, so I don't see why it would be "stolen". There are just a lot more theoretical TOPS with the number of tensor cores packed on-chip.

Yes, but surely it would have at least some impact on frame times? That post-processing will take some time for sure, whereas with tensor cores the DLSS pass may happen while the normal cores are processing the next frame?
 
Yes, but surely it would have at least some impact on frame times? That post-processing will take some time for sure, whereas with tensor cores the DLSS pass may happen while the normal cores are processing the next frame?
I had the impression the compute units were idle during DLSS, though I could be wrong. Can't remember!
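A toy frame-time model of the two scheduling assumptions in this exchange. The 12 ms / 1.5 ms costs are made up purely for illustration; nothing here is a measured DLSS figure:

    # Serialized vs. (hypothetically) overlapped upscaling pass.
    SHADE_MS = 12.0  # assumed shading work per frame
    DLSS_MS = 1.5    # assumed cost of the upscaling pass

    # Serialized in the post-process chain: the pass adds directly
    # to every frame's time.
    serial_ms = SHADE_MS + DLSS_MS

    # Fully overlapped: tensor cores upscale frame N while the shader
    # cores work on frame N+1, so steady-state frame time is the longer
    # stage, not the sum (at the cost of a frame of extra latency).
    overlap_ms = max(SHADE_MS, DLSS_MS)

    for label, ms in [("serialized", serial_ms), ("overlapped", overlap_ms)]:
        print(f"{label}: {ms:.1f} ms/frame -> {1000 / ms:.0f} fps")

Either way the pass costs GPU time somewhere; the question is only whether it lands on the critical path of every frame.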
 
Also, Gears 5 is first and foremost an Xbox title which, unlike other UE4 titles, was optimized quite well for both GCN and RDNA
There are only GCN2 and GCN4 consoles out there so far. Considering how well Pascal GPUs compete against GCN4 / Polaris in this game (e.g. the GTX 1060 going shoulder-to-shoulder with the RX 580), when in other recent games Polaris pulled significantly ahead, I wonder where this assumption is coming from.



By the way, neither Gears 5 nor Borderlands 3 seems to especially favor RDNA1 GPUs. In Gears 5 the 5700 XT seems to average about 5% above the RTX 2070, and in Borderlands 3 it's a draw.
If AMD wanted to show games that favor RDNA, I think they'd be better off showing e.g. Battlefield V.
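For what it's worth, an "averages about 5% above" figure is usually computed as a geometric mean of per-game FPS ratios rather than an arithmetic mean, since ratios compose multiplicatively. The ratios below are invented just to show the mechanics:

    # Geometric mean of per-game performance ratios (5700 XT fps / 2070 fps).
    # Ratios are made up for illustration, not measured data.
    import math

    ratios = {
        "Gears 5": 1.05,
        "Borderlands 3": 1.00,  # "a draw"
        "Battlefield V": 1.12,  # a title that does favor RDNA
    }
    geomean = math.prod(ratios.values()) ** (1 / len(ratios))
    print(f"5700 XT vs RTX 2070: {(geomean - 1) * 100:+.1f}% on average")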
 