AMD Navi Product Reviews and Previews (5500, 5600 XT, 5700, 5700 XT)

From the Digital Foundry review:
But in the here and now, whether the RX 5700 series can truly challenge Nvidia remains to be seen. The pricing adjustment was sorely needed, but AMD needed to come out of the gate at E3 with something much more ambitious. Even the cut we have is not really disruptive and that's surprising bearing in mind that the firm knows how to challenge the status quo effectively after its experiences in the CPU market. Ryzen has demonstrated that to truly challenge an incumbent, a radical strategy is required, and as solid as they are, the Navi products do not provide that.

And similar to Radeon 7, the price of getting AMD competitive with Nvidia's upper mid-tier products is daunting. The RX 5700 offers a performance bump over RTX 2060 and consumes a little more energy - so Team Red is basically competitive on efficiency, but it has required a jump in process technology here to get the job done, similar to prior GPU generations. Nvidia is using what amounts to a refined 16nm production technology and hasn't needed to drop down to 7nm. On top of that, AMD's problems in challenging at the top-end in performance terms haven't gone away either - the firm has no price-appropriate answer to RTX 2070 Super, while the RTX 2080 Ti remains king of the hill.

So there are positives and negatives to the Navi products: if you're not interested in DXR features and you're willing to bet that ray tracing won't be an important part of the gaming generation to come, you can't argue that the RX 5700 is anything other than a very nice deal - it's obviously faster than the RTX 2060 and those two extra gigs of GDDR6 may have limited real world use in 1080p and 1440p gaming, but they do help with future-proofing and certainly sweeten the deal. The RX 5700 XT isn't quite as attractive competitively, but certainly takes the fight to the RTX 2060 and delivers good results. The lack of Ryzen-style disruption remains a bit of a shame, but the value here is solid enough.
https://www.eurogamer.net/articles/...5700-xt-review-head-to-head-with-nvidia-super
 
PSA: V-Ray GPU is EOL. It has been replaced by V-Ray Next GPU (benchmark here), which is unsurprisingly CUDA-only because $$$ (V-Ray GPU's OpenCL path was 6 years too late and totally useless, with literally no development effort put into it after the initial release).
Also, the V-Ray benchmark should never have been used as a tool to compare Nvidia vs AMD GPUs given the state of its OpenCL support (it doesn't matter anymore given that V-Ray has been replaced by V-Ray Next, which is CUDA-only):
(attached screenshot: Annotation-2019-07-08-151449.jpg)
 
With regard to the Eurogamer review, if it's beating a 2060 Super now, that lead is only going to increase going forward with improved drivers in general and game-specific optimization.

When Nvidia releases its 7nm chips you can re-evaluate its price/performance. By that time big Navi should be on the horizon if not already out, the 5700 XT will have been reduced in price to remain competitive, and I wouldn't be surprised to see it outperforming the 2070 Super by then.
 
And so will a 2060/super. The only difference is that with that card you get the choice whether you want to sacrifice performance for arguably prettier pixels.
Exactly; I don't know why some people seem to be ignoring the ability to choose: on one card you can have pure performance if you want, or locked fps with extra graphics quality if you want. There's also potentially even more performance from Variable Rate Shading, DirectML and Mesh Shaders (once their DX12 support is finalised).
 
And so will a 2060/super. The only difference is that with that card you get the choice whether you want to sacrifice performance for arguably prettier pixels.

As with any other GPU in the history of GPUs. If you wanna sell the idea that the 2060 is viable for RT, that's fine; just don't expect everyone to buy into it. The 2080 Ti is borderline capable if you like 1080p; the 2060 is not and never will be.
 
Navi in Forza Horizon 4 :oops:
I've been waiting for that type of benchmark for a long time now as some sort of proof that software optimized specifically for AMD's strengths will eventually let AMD outperform Nvidia - or at least reason enough to get rid of the concept that Nvidia flops != AMD flops, which keeps coming up all over the place.
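For context, some rough peak-FP32 math (figures from memory of the spec sheets, so treat them as ballpark only): peak FP32 ≈ shader count × boost clock × 2 ops (one FMA per clock).
Vega 64: 4096 × ~1.55 GHz × 2 ≈ 12.7 TFLOPS, yet it usually traded blows with the ~8.9 TFLOPS GTX 1080.
RX 5700 XT: 2560 × ~1.9 GHz × 2 ≈ 9.75 TFLOPS, and it lands between the ~7.9 TFLOPS RTX 2070 and the ~9.1 TFLOPS 2070 Super in most reviews.
So the per-TFLOP gap has already narrowed a lot with Navi, and results like the Forza Horizon 4 one suggest it can close completely when the software plays to AMD's strengths.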
 
Exactly; I don't know why some people seem to be ignoring the ability to choose: on one card you can have pure performance if you want, or locked fps with extra graphics quality if you want. There's also potentially even more performance from Variable Rate Shading, DirectML and Mesh Shaders (once their DX12 support is finalised).
Because we know we won’t use it?
I know what my priorities are when playing with a mouse and keyboard. I want a high, consistent frame rate on my main monitor, which only supports 60Hz. Then I strongly prefer to run at native resolution, which is 2560x1440. I always turn all the "fake film" processing off, such as distortion, chromatic aberration, grain filters, motion blur and DOF effects other than in cut scenes.
In the unlikely event that I need to turn anything off after that, the first to get lowered quality is shadows and reflections because they just don’t matter much to how I perceive the world, real or rendered.

There is no room there for lighting calculations that aren’t performant. Particularly if they specifically deal with the less important aspects of a scene.

The Nvidia 2060 Super seems like a decent product; I could just as easily go for that as for the equivalent AMD product if I wanted to burn some money pointlessly. But over the two decades plus that I've been buying graphics cards, since the first S3 graphics decelerator, there have been a bunch of features available that were never relevant when they were introduced. Some fell by the wayside, some became part of the mainstream 5-10 years down the road. I know for a fact that RTX won't mean diddly squat for my gaming enjoyment, and I get more out of reading papers or watching lectures on the subject than actually playing with it enabled.

My tech nerdery is distinct from my gaming enjoyment. And unless you get massive pleasure out of enabling RT and then standing around looking at scenes searching for artifacts, I just don’t think it’s worthwhile. IMHO, obviously. Each to their own.
 
I've been waiting for that type of benchmark for a long time now as some sort of proof that software optimized specifically for AMD's strengths will eventually let AMD outperform Nvidia - or at least reason enough to get rid of the concept that Nvidia flops != AMD flops, which keeps coming up all over the place.

In what way is Forza Horizon specifically optimized for AMD?
 
In what way is Forza Horizon specifically optimized for AMD?
It was designed specifically for the X1X first before being ported down to the XBO. Thinking out loud, I would design the game to work with the Xbox hardware to ensure that the relatively weak consoles performed well.
 
I've been waiting for that type of benchmark for a long time now as some sort of proof that software optimized specifically for AMD's strengths will eventually let AMD outperform Nvidia - or at least reason enough to get rid of the concept that Nvidia flops != AMD flops, which keeps coming up all over the place.
What are those strengths in Navi though, compared to GCN? One would assume that a title taking advantage of AMD's strengths would utilize compute a lot more, but Navi seems to be much weaker in compute compared to pre-Navi GCN.
 
What are those strengths in Navi though, compared to GCN? One would assume that a title taking advantage of AMD's strengths would utilize compute a lot more, but Navi seems to be much weaker in compute compared to pre-Navi GCN.
Off the top of my head: a lot more use of compute and async compute - increasing the kinds of load the Xbox was good at, decreasing the ones it wasn't.
 
Yeah, that one has been bugging me. Unfortunately I don't have a good explanation right now. I consistently get those results. It may be another driver oddity.

I recall Vantage's pixel fillrate being memory bandwidth dependent; maybe the GDDR6 memory controller isn't good enough for now?
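A quick sanity check with rough numbers (assuming the 5700 XT's 64 ROPs, ~1.9 GHz clock and 448 GB/s of GDDR6 bandwidth, so ballpark only): 64 ROPs × 1.9 GHz ≈ 122 Gpixels/s of theoretical colour fill, and at 4 bytes per RGBA8 write that's ≈ 486 GB/s of write traffic, which already exceeds the 448 GB/s the memory system can deliver before any blending or read traffic. A pure fill test ending up bandwidth-bound wouldn't be surprising.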

Also, now that the reviews are out, can AMD do 3 shader engines and 96 ROPs? :?:

I'm also wondering if AMD is willing to do an XTX and go up against the 2080, considering that Hardware Unboxed got the 5700 XT to stay north of 2 GHz, often hitting 2.1 GHz, with more power and 100% fan (11m30s).

 
I'm not even sure it should be compared with the 20x0 cards considering it lacks feature parity. I wish it had finally targeted 1080 Ti level. It looks like a safe mid-range design though, which may be the smart call for them.
 
As much as I'm happy to see AMD has cards that can compete on price/perf again, I do wonder why you would buy a 5700/5700 XT, for two reasons.

1. The blower cooler. I understand there will be no custom cards anytime soon?
2. No raytracing. It will be in consoles next year. It will be in Navi cards next year. It already is on Nvidia cards. Essentially you're buying a card that doesn't perform much differently, nor is a lot cheaper than the competition, but does lack features you know will be used by many games come next year. I assume that price/perf is very important in the mid-range market, so why wouldn't you pay a couple of bucks more for a 2060 Super knowing you can play with all the bells and whistles turned on for the next couple of years?
1. August.
2. That's over a year from now; general gamers don't think that far ahead when planning purchases. And it's not like Nvidia's RTRT performance is anything to write home about - until we see what the consoles (and RDNA2 and Nvidia's next gen) can do on that front, we have no clue whether the current RTRT implementation is even relevant when console RT games appear. (And both parties have features the other lacks; I don't think there has ever been a point in the video card industry where one manufacturer offered everything everyone else does and then some.)
 