AMD Navi Product Reviews and Previews: (5500, 5600 XT, 5700, 5700 XT)

False. The LG 9-series OLED TVs required a firmware update to support G-Sync over HDMI VRR. This year's Sony HDMI 2.1 TVs don't support G-Sync at all, and while Vizio does support G-Sync, their TVs support FreeSync too. I'm telling it like it is: it's still a hack, no different from FreeSync. The only truly universal HDMI VRR device as of now is the Xbox One S/X, as it can output HDMI VRR to every HDMI 2.0/2.1 TV, but even that didn't go without a hitch in the beginning. (LG and Microsoft needed to do a lot of workarounds together)

I'd put the current FreeSync/G-Sync implementations in the same bucket as the HDR hacks of the PS4/Pro era. HDMI 1.4 chipsets usually don't support HDR, which is why Microsoft needed to refresh the Xbox One as the HDMI 2.0-supporting Xbox One S. Sony, however, managed to get away with the PS4's HDMI 1.4 chipset because it was a special custom type co-developed with Panasonic that was future-proof enough to support HDR. But because it was an unofficial hack, many HDR TVs did not play nicely with it, dwarfing any compatibility hell MS has suffered thus far.
 
Potential drawback is no RT
Also, the 2060 still enjoys a vastly superior video decoder, plus compatibility with DXR, VRS, DirectML, and Mesh Shaders (if those ever become a thing), and VR compliance. The 2060 also runs every VRR gaming monitor (whether G-Sync or FreeSync) and drives OLED screens thanks to its HDMI 2.1 VRR compliance.

That's on top of all those always-forgotten NVIDIA features, like CUDA, GameWorks/PhysX, Ansel, Freestyle, and Highlights, for those who care. The 2060 is a much more complete package compared to the bare-bones 5600 XT offering. IMO the 5600 XT needs to be significantly cheaper than the 2060 to be considered an equal.
 
Turing wasn't the first time NVIDIA showed off on OLED TVs, though. They also supported Dolby Vision during the Pascal era. It's a b**** to run now, but with a GeForce GTX 1080 Ti and the right set of drivers, Dolby Vision-supporting games like Mass Effect: Andromeda run amazingly on Dolby Vision-capable LG OLED TVs, with a vastly increased Rec. 2020 palette to sample from. Too bad the license did not continue through Turing. The 1080 Ti is not quite up for 4K 120 Hz, though.

I will still make my next HTPC's GPU RDNA2 or 3, though. Mark Rejhon's future BFI pattern is currently AMD-exclusive.
 
Latest RX 5600 XT BIOS information ... what a mess! :LOL:

Last Thursday I had finished up most reviews, then an email from AMD arrived: AMD would allow the OC SKUs, the factory-tweaked versions, to run a TGP (total graphics power) of 160 W and/or 180 W. AMD supplied BIOS updates for the cards it distributed (mostly Sapphire) and urged media to test with the new settings.

The Sapphire Pulse OC model we tested took the BIOS update well, no problems; the same goes for the Gigabyte Gaming OC. The ASUS and MSI BIOS updates, however, took longer to arrive, the final ones as late as Monday for ASUS and Tuesday for MSI. ASUS immediately indicated that the STRIX OC model we received would get the TGP increase but not the memory increase, for reasons unexplained. A new SKU, the STRIX TOP, would get both the TGP increase and the memory at 14 Gbps. A new SKU means more product segmentation, and, well, they can add a tenner to the pricing. They have not, however, explained why they needed to create a new SKU.

MSI, on the other hand, we've been talking with ever since Friday. Their internal testing showed that while the Gaming X model supplied with a TGP + memory tweak would suffice for review purposes, they did expect issues long term: running the cards through internal test software indicated stability issues with GDDR6 at 14 Gbps over long stretches. So long term they fear RMA issues, which they want to prevent.

MSI made the call to leave the Gaming X model as it is: it will get the TGP increase, but GDDR6 stays at 12 Gbps. The Gaming Z model is to be released with both the TGP (and thus clock) increase to 160 W and 14 Gbps GDDR6. However, this product is not yet available, as it requires a hardware revision, a B-spec board. That board is going into production, and only the Gaming Z models will get the 14 Gbps GDDR6 bump, as only then can MSI guarantee a long lifespan and absolute stability. So there will be a new revision of the card, the B spec, and the Gaming Z will end up being the only 5600 XT from MSI running its GDDR6 at 14 Gbps. We can only assume this logic applies to that ASUS TOP SKU as well.
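For context on what the 12 vs 14 Gbps decision actually costs, memory bandwidth is just the per-pin data rate times the bus width; a quick sketch, assuming the RX 5600 XT's 192-bit GDDR6 bus:

```python
# Effective memory bandwidth: per-pin data rate (Gbps) x bus width (bits),
# divided by 8 to convert bits to bytes.
def gddr6_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int = 192) -> float:
    """Return memory bandwidth in GB/s for a given GDDR6 data rate and bus width."""
    return data_rate_gbps * bus_width_bits / 8

print(gddr6_bandwidth_gbs(12))  # 288.0 GB/s at the stock 12 Gbps
print(gddr6_bandwidth_gbs(14))  # 336.0 GB/s with the 14 Gbps BIOS bump
```

So the 14 Gbps BIOS is worth roughly 17% more raw memory bandwidth, which is why the vendors fought over it despite the stability risk.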

So what does this all mean SKU-segmentation wise? Well, all brands aside from MSI and ASUS apparently run 14 Gbps fine. A side note: we've seen both the Gaming X and STRIX OC selling for $339, which is just too much. If you can find one at a non-inflated price, grab a Sapphire Pulse regular or OC for $279 or $289 respectively. It'll get the job done just as well, nicely and silently.

https://www.guru3d.com/news-story/b...-xt-sku-at-gddr6-with-14-gbps-we-explain.html
 
What a shit show. Who at AMD would have thought Nvidia would adjust their pricing to pressure a new AMD release? I mean, that never happens, right?
 
Of course they were prepared for many counters, but possibly not that big of a counter. Its original performance at the original price would have been just fine if NVIDIA had cut the RTX 2060 price to something like 329 or 339 instead of 299.
 
Not really, 5600XT's original positioning was rather lackluster too, as can be seen from Techspot's cost per frame graph.
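Cost per frame, as used in charts like Techspot's, is simply price divided by average frame rate. A minimal sketch of the metric; the prices and fps figures below are illustrative placeholders, not Techspot's benchmark data:

```python
# Cost per frame: dollars spent per average fps delivered (lower is better).
def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    return price_usd / avg_fps

# Hypothetical example cards: (price in USD, average fps) -- placeholder numbers.
cards = {
    "RX 5600 XT": (279, 80.0),
    "RTX 2060":   (299, 82.0),
}
for name, (price, fps) in sorted(cards.items(),
                                 key=lambda kv: cost_per_frame(*kv[1])):
    print(f"{name}: ${cost_per_frame(price, fps):.2f} per fps")
```

The metric makes the pricing argument concrete: a price cut on either card shifts its bar directly, which is why Nvidia's $299 move hit the 5600 XT's original positioning so hard.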
 
Maybe the larger price cut by Nvidia also had something to do with some 2060s now being TU104 salvage chips, which brought costs down by not having to make as many TU106 dies.
 
There are no reference models of the Radeon RX 5600 XT, and several OCed models will be available at MSRP. There is no rational reason not to compare them. Why should we compare the GeForce RTX 2060 FE to a hypothetical product with reference clocks that will never be available, at least at retail?
 
I see the 5600 XT's price going down quickly. The BIOS update is not ideal, but it sets the performance target for the MSRP. Slower models will have to drop in price once the market saturates.

It probably would have been cleaner for AMD to slash card prices following Nvidia's price reduction, but changing the cards' spec was the more profitable option, as they will still sell at the $279 price point until enough 2060s hit that $300 price mark.

I'm kind of glad that we are back to seeing launch prices go down soon after release, unlike the mining years, when cards launched at inflated prices that just kept going up and up afterwards.

I see myself upgrading my Radeon VII once the next-gen AMD and NV cards launch! Yes, I want RT from my next GPU, and I would like 16 GB of VRAM ;)
 
Apart from countering with the 2060 FE, the cheapest 2060s had been at 300 euros for over a month prior to the RX 5600 XT's launch. If AMD did not see that, their competitive analysis simply made an error.

The other possibility is that it is hard for AMD to get to that price point in the first place. The RX 5600 XT combines two expensive components right now: GDDR6 memory and a 7 nm GPU.
 
Some issues spotted on the XFX 5600XT.
Looks okay.
MG_2353-1024x683.jpg


Closer inspection.
MG_2358-1024x683.jpg

Here we see the power stages previously mentioned, but one thing you may also notice is the inductor/choke, which is cracked. Three of the eight primary VGPU inductors are cracked, and this means one of two things: either the components used were of low quality, or the inductor was damaged or bad and has cracked from the inherent vibration of the coil, which can lead to the coil/inductor whine issues that can plague a GPU.

XFX-5600-XT-Thicc-II-Pro-cracked-inductor-1024x814.png

Here we get an even closer look at the inductor damage. I cannot say for certain that a poor-quality part caused this, but I can say that it does not look like any exterior force damaged them. The reason I make this assertion is that there is apparent outward bulging, as you can see in the image, which tells me the failure came from internal stresses.
https://bjorn3d.com/2020/01/amd-rx-5600-xt-launch-new-vbios/3/#split_content
 
Probably just a bad part, or we'll be hearing a lot more popping soon.
 