AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

The Hawaii GPU doesn't support framebuffer compression, which could be one of the reasons for the poor performance scaling in this case.

Why?

If anything, a GPU that has lower effective bandwidth should gain more out of a 20% increase in raw bandwidth, not less.
What I think is that the R9 290, being a ~4.8 TFLOP GCN2 GPU, has very little to gain from increasing bandwidth from 320 GB/s to 384 GB/s.
AMD kept the same 512-bit bus from the full 290X while cutting compute back by roughly 9% (2560 vs. 2816 shaders), so the result was a GPU with more bandwidth than it actually needs.

Furthermore, rendering at higher resolutions on that card may not bottleneck on memory bandwidth either, because it has only 4 GB of VRAM. Fiji didn't age well for the same reason (though it might be an excellent 1080p card at this point).
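The balance argument above can be sanity-checked with a quick bytes-per-FLOP calculation (the clocks and shader counts below are the commonly quoted stock figures, used here purely for illustration):

```python
# Rough bytes-per-FLOP comparison for the two Hawaii cards.
# Stock figures, commonly quoted; real boost behaviour varies by board.
def gflops(shaders, clock_mhz):
    # 2 FLOPs per shader per clock (FMA)
    return shaders * 2 * clock_mhz / 1000.0

def bytes_per_flop(bandwidth_gbs, gf):
    return bandwidth_gbs / gf

r9_290  = bytes_per_flop(320, gflops(2560, 947))   # ~0.066 B/FLOP
r9_290x = bytes_per_flop(320, gflops(2816, 1000))  # ~0.057 B/FLOP

print(f"R9 290:  {r9_290:.3f} bytes/FLOP")
print(f"R9 290X: {r9_290x:.3f} bytes/FLOP")
```

The cut-down 290 ends up with more bandwidth per unit of compute than the full 290X, which is consistent with it gaining little from a memory overclock.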
 
Yeah, that's what I meant: the GPU was completely ROP-bound or compute-bound. Hawaii was probably the last "evened-out" architecture from ATi/AMD; later we got Fiji and Vega, compute-heavy monstrosities with the same number of ROPs. And AMD still went for overkill: that huge ~500 GB/s of bandwidth never really came into play unless you went for 4K (and even so I only get about a 5% improvement in Superposition Extreme on Vega 56 from a 945 -> 1085 MHz HBM2 overclock with tuned timings). 4K is highly dubious for Fiji GPUs, and now for Vega as well: Doom Eternal simply can't run at 4K with everything maxed out because it needs more than 8 GB of VRAM, and dropping texture quality by just one notch gives a 10-15 fps improvement and makes the game much smoother.
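A crude way to read the Vega 56 overclocking result above: assume some fraction of frame time scales inversely with memory bandwidth and the rest doesn't, then solve for that fraction. This is a back-of-the-envelope model, not a real profiler result:

```python
# Crude estimate of how bandwidth-bound a workload is, from an OC result.
# Model: a fraction f of frame time scales inversely with bandwidth, the
# rest is unaffected. Then speedup p = 1 / ((1 - f) + f / b), where b is
# the bandwidth ratio. Solving for f:
def bandwidth_bound_fraction(bw_ratio, speedup):
    return (1 - 1 / speedup) / (1 - 1 / bw_ratio)

# Numbers from the post above: HBM2 945 -> 1085 MHz (~15% more bandwidth)
# for ~5% more performance in Superposition Extreme.
f = bandwidth_bound_fraction(1085 / 945, 1.05)
print(f"~{f:.0%} of frame time bandwidth-bound")
```

Under this simple model, only about a third of the frame time was limited by bandwidth, which fits the "compute/ROP-bound" reading.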

What's even more fun is that AMD marketed the R9 290 as a "4K Gaming" GPU (it says so on the box of my reference card), and I'd say we're not quite there yet even with the current generation of chips from both vendors.
 

Well, the card released in late 2013, when most game engines were still optimized for the 512 MB gen7 consoles, so at that point it ran a number of games at 4K pretty decently.
The launch of the PS4 and Xbox One increased VRAM footprints by a lot, which is why AMD relaunched the cards as the R9 390 series with 8 GB instead.
 
Could that be why Nvidia stuck with 384-bit and pushed speeds instead?
I'm going by commentary from der8auer during a PCB analysis at one point, indicating that the requirements for high-end memory speeds were nearing the limit of what was practical at 384 bits.
I'm not sure whether the speeds were pushed because a broader bus was too difficult to implement, or whether pushing the speeds is what created the limit.
Nvidia's use of somewhat niche memory types, like the X variants of GDDR5 and GDDR6, could also come with other motivations, like needing to use a partner company's memory if that partner was involved in the design effort. AMD's initial HBM GPUs may have had an element of that, since the original HBM in Fury didn't see much adoption outside of it.
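For reference, peak bandwidth is simply bus width times per-pin data rate, which shows why pushing speeds can substitute for a wider bus. The data rates below are headline figures for those memory types, not claims about any specific board:

```python
# Peak memory bandwidth = bus width (bytes) x per-pin data rate (Gbps).
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

# Illustrative configurations:
print(bandwidth_gbs(512, 5.0))   # Hawaii-style: 512-bit GDDR5  -> 320.0 GB/s
print(bandwidth_gbs(384, 19.5))  # GA102-style: 384-bit GDDR6X  -> 936.0 GB/s
print(bandwidth_gbs(256, 16.0))  # Navi 21-style: 256-bit GDDR6 -> 512.0 GB/s
```

A 384-bit bus at high GDDR6X speeds comfortably exceeds what Hawaii's 512-bit bus delivered, without the routing and board-complexity cost of 512 traces.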

Then if a giant cache loses efficiency at 4K, whether from cache overflow despite being 128 MB or from some pass needing a lot of main-memory accesses (has it been figured out which it is?), what's the solution to bandwidth scaling? Hoping HBM becomes cheaper doesn't seem the most likely option. How cheap is Intel's EMIB-style interconnect supposed to make it? I've seen claims that it's supposed to be cheaper, but no numbers.
I'm not sure there are any good answers, hence the giant cache despite a generally poor hit rate as far as giant caches go.
I haven't seen EMIB numbers, although the interposer itself may not be the deal-breaker so much as the fact that HBM serves markets willing to pay far more than can be asked of consumer graphics components.
 
That VRAM creep is why Nvidia's "8K!" marketing is hilarious bullshit, but it seems to have been effective. Look, we can run just about the most well-optimized 60 fps current-gen game in the world at 8K, mostly. Meanwhile a 3090 needs upscaling to hit 4K in Watch Dogs: Legion, and now apparently Cyberpunk 2077 as well.

Still, it's hard to blame the PR guys at either company. People see "high number!", think "good", and pay the company for a product. It's their job and they're doing well at it.
 
That's just how the video card advertising thing works: more resolution at more speed, plus magical imaginary future-proofing vibes.
 
OcUK says it has about 100 RX 6900 XTs but has decided not to put them on sale until some random time when it feels like it.
 
The 6900 XT really doesn't seem to have any reasonable use case that justifies the price. The 3090 has a very small niche of professional users but is otherwise a rip-off; the 6900 XT just seems to be a very expensive minor performance boost.
 
RX 6900 Reviews

4Gamer AMD Radeon RX 6900 XT Reference Card
Benchmark AMD Radeon RX 6900 XT Reference Card
Bitwit AMD Radeon RX 6900 XT Reference Card
ComptoirHardware AMD Radeon RX 6900 XT Reference Card
ComputerBase AMD Radeon RX 6900 XT Reference Card
Coreteks AMD Radeon RX 6900 XT Reference Card
Eteknix [video] AMD Radeon RX 6900 XT Reference Card
GamerMeld AMD Radeon RX 6900 XT Reference Card
GamersNexus AMD Radeon RX 6900 XT Reference Card
Geeknetic AMD Radeon RX 6900 XT Reference Card
Golem AMD Radeon RX 6900 XT Reference Card
Guru3D AMD Radeon RX 6900 XT Reference Card
Forbes AMD Radeon RX 6900 XT Reference Card
HardwareBattle AMD Radeon RX 6900 XT Reference Card
HardwareLuxx AMD Radeon RX 6900 XT Reference Card
HardwareUnboxed AMD Radeon RX 6900 XT Reference Card
HardwareUpgrade AMD Radeon RX 6900 XT Reference Card
HKEPC AMD Radeon RX 6900 XT Reference Card
Hot Hardware AMD Radeon RX 6900 XT Reference Card
Hexus AMD Radeon RX 6900 XT Reference Card
igor’sLAB [video] AMD Radeon RX 6900 XT Reference Card
JayzTwoCents AMD Radeon RX 6900 XT Reference Card
Joker Productions AMD Radeon RX 6900 XT Reference Card
KitGuru [video] AMD Radeon RX 6900 XT Reference Card
Lab501 AMD Radeon RX 6900 XT Reference Card
LinusTechTips AMD Radeon RX 6900 XT Reference Card
Noticias3D AMD Radeon RX 6900 XT Reference Card
MadBoxPC AMD Radeon RX 6900 XT Reference Card
Optimum Tech AMD Radeon RX 6900 XT Reference Card
Overclock3D AMD Radeon RX 6900 XT Reference Card
Paul’s Hardware AMD Radeon RX 6900 XT Reference Card
PC Games Hardware [video] AMD Radeon RX 6900 XT Reference Card
PCWatch AMD Radeon RX 6900 XT Reference Card
PCWorld AMD Radeon RX 6900 XT Reference Card
SweClockers AMD Radeon RX 6900 XT Reference Card
Tech Critter AMD Radeon RX 6900 XT Reference Card
TechPowerUP AMD Radeon RX 6900 XT Reference Card
Techtesters AMD Radeon RX 6900 XT Reference Card
Tech YES City AMD Radeon RX 6900 XT Reference Card
TweakTown AMD Radeon RX 6900 XT Reference Card
Tom’s Hardware AMD Radeon RX 6900 XT Reference Card
UNIKO’s hardware AMD Radeon RX 6900 XT Reference Card
Wccftech AMD Radeon RX 6900 XT Reference Card
XFastest Taiwan AMD Radeon RX 6900 XT Reference Card

Thanks to Videocardz!
 
The 3080 and 6800 XT are where it's at; the value of the 3090 and 6900 XT isn't worth it over those two. Then you're left with the decision between a 30x0 and one of the RDNA2s: if you prefer 1440p and don't care about RT, I'd take the latter. If you want RT and like 4K, go for Ampere. The reviewers noted AMD is a whole gen behind in RT, and DLSS too if that's considered.

Competitive gamers would probably go for something like a 6800 XT, and that's a big market.
 
Really, the 6900 XT is nowhere in the charts. You're much better off with a custom 6800 XT with an OC, and much cheaper once the MSRP gets real.

So 128 MB of Infinity Cache is not suitable for 4K gaming...
 
I still really want to see Apex, Fortnite, COD Warzone, COD Cold War, Valorant, and CS:GO, all at competitive settings. I really think the 6800 XT/6900 XT should crush those games. All the competitive gamers buy Nvidia, but maybe that's not the play anymore. Then again, it's very possible these are all CPU-limited anyway.
 