AMD Navi Product Reviews and Previews: (5500, 5600 XT, 5700, 5700 XT)

Hmmm, that Sapphire 5600 XT is pretty impressive.

Compared to the 2060
  • Basically the same speed.
  • Lower idle power.
  • Lower load power.
  • Lower noise.
  • Lower price.
Even with the factory-overclocked BIOS. The potential drawback is no RT. But considering the only RT title that currently shows off RT as an overall benefit (to me) is Control, that's fine. So as long as you don't need RT or overclocking headroom at this performance level, the 5600 XT is just better. Obviously for people where those things matter, the 2060 is likely the better choice.

If it drops down to $250 or less, I might pick one up to stick in my SFF. Alternatively, I may get one to replace the 1070 in my main machine, as it's constantly driving me nuts with little things it doesn't like.

The 1070 and NV's drivers just suck arse with the display coming out of sleep. The HDMI display constantly flickers coming out of sleep until I power it on and off a few times. It's fine on DP, thankfully, but it hates displays coming out of sleep on HDMI. I have no idea why Intel, which supposedly sucks at drivers, is so much better than NV at this. No problem there with waking the display from sleep.

Regards,
SB
 
Personal aside: while I get why AMD made the last-minute BIOS changes that they did, I am rather concerned that it's going to make a mess of things in the long run. Not all RX 5600 XT cards are getting the 14Gbps factory overclock on the memory. Those cards perform a lot better, but they are well above what AMD actually guarantees as far as performance and specifications are concerned.

(The TechSpot graph embedded above is a great example of why this is a potential problem)
 
Do you know which 5600 XT models are not getting the upgrade to 14Gbps?
 
So Navi has VRS, mesh shaders and HDMI VRR support?
Sadly there has not been any news on new features with Navi. (Or at least I haven't heard of them.)
So it's most likely missing features found in the PS4 Pro or the newer consoles, or anything similar to what is in Turing.
 
Which makes no difference. Xbox One S/X was the first with the pseudo HDMI 2.1 VRR, not Turing, and it also supported FreeSync, which covered pretty much all the bases. It outputs HDMI VRR when fed into LG 2019/2020 TVs, and it outputs FreeSync when fed into select FreeSync monitors and Samsung TVs. With the news of LG 2020 OLEDs supporting FreeSync in the near future too, it matters not whether AMD does it through HDMI VRR or FreeSync. Both are pretty much identical, as proven by Xbox One X and Turing. Both have 1440p @ 120Hz limitations, both support HDR, and both are destined to be relegated to the garbage bin once true HDMI 2.1 VRR hits. Wake me up when there is a GPU that supports 4K @ 120Hz over HDMI, not this unofficial patchwork.
 
HDMI VRR can officially be added to HDMI 2.0 devices. The difference between FreeSync-over-HDMI and HDMI VRR is pretty simple: the latter is an industry standard which most display devices with HDMI 2.0/2.1 inputs will support by default, while the former is AMD's own HDMI extension which needs to be implemented separately - and not many will do that just to support variable refresh rate on older AMD cards which won't be able to run anything in 4K anyway.
 
Can anybody point me to video encode benchmarks of Navi utilising the built-in ASIC? Ever since the RX 5700 launched I've never been able to get any of the 5000 series cards to hit the advertised mark of 360 FPS in 1080p H.264 and HEVC. For me, performance goes up and down after major driver updates, sometimes quite significantly, but it's never what it's supposed to be. AMD went silent on my questions, and now I suspect that video encoding on Navi is half-broken in much the same way that VP9 decoding was on later GCN parts. Still, I wonder if it is instead a software idiosyncrasy between the Navi driver (GCN chips hit their performance targets just fine) and FFmpeg, which I use for benchmarks. Is there any benchmark data out there that would clarify this?
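For what it's worth, this is roughly how I measure it: feed FFmpeg a synthetic 1080p source, run with `-benchmark`, and divide the final frame count by the reported wall time. The helper below is my own sketch (the parsing function and the sample log line are illustrative, not an official AMD or FFmpeg tool), and the encoder name depends on platform (`h264_amf` on Windows, `h264_vaapi` on Linux):

```python
import re

def encode_fps(ffmpeg_log: str) -> float:
    """Effective encode FPS from the output of an ffmpeg run with -benchmark.

    Takes the frame count from the last 'frame=' progress line and the wall
    time from the 'bench: ... rtime=' line that -benchmark prints at the end.
    """
    frames = [int(m) for m in re.findall(r"frame=\s*(\d+)", ffmpeg_log)]
    rtime = re.search(r"rtime=([\d.]+)s", ffmpeg_log)
    if not frames or not rtime:
        raise ValueError("log does not look like ffmpeg -benchmark output")
    return frames[-1] / float(rtime.group(1))

# The kind of command I benchmark with (a sketch, not a verified Navi result):
#   ffmpeg -f lavfi -i testsrc2=size=1920x1080:rate=60 -t 60 \
#          -c:v h264_amf -b:v 8M -benchmark -f null -
# Hypothetical log excerpt for illustration only:
sample = "frame= 3600 fps=340 q=-1.0 size=N/A\nbench: utime=9.8s stime=0.4s rtime=10.0s"
print(encode_fps(sample))  # 3600 frames / 10.0 s -> 360.0
```

Encoding to `-f null -` keeps disk I/O out of the measurement, so the number reflects the ASIC (plus driver overhead) rather than storage speed.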
 