AMD Navi Product Reviews and Previews: (5500, 5600 XT, 5700, 5700 XT)

RTX won't be "usable" with a 2060 Super... Hell, even a 2080 Ti is only good for about 1080p right now. And it's not a shot at Nvidia, it's just the consequence of being first to a new tech (for gaming). First-gen hardware is usually not fast enough.

So, I'm cool with the 5700 XT not having RT. They're mid-level cards anyway. I wonder how they will produce a bigger GPU with this "bad" power efficiency...
 
Not sure why hardware-accelerated ray tracing is important in a midrange product in 2019. They will get the hang of the technology through the consoles and then release new cards next year that incorporate all that know-how.
 
Trick question: Which card needs another 25% in perf/watt to nullify the node difference?
They are made on the same node, after all, but their performance/W is a fair distance apart.
Both of them. The 5700 XT is similar in performance and wattage to an RTX 2070, and the same goes for the 5700 and the 2060. So in both cases Nvidia is more efficient in perf/watt given the node difference (25-30%?).
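For what it's worth, the gap is easy to sanity-check from the launch board-power specs. A minimal sketch: the power figures below are the official spec-sheet numbers, and treating each AMD/Nvidia pair as roughly equal in average gaming performance is my assumption, not measured data.

```python
# Rough perf/watt comparison from launch board-power specs.
# Assumption (mine): average gaming fps of each pair is roughly equal,
# so the perf/watt gap reduces to the ratio of board powers.
BOARD_POWER_W = {
    "RX 5700 XT": 225,  # AMD total board power spec
    "RTX 2070":   175,  # Nvidia spec
    "RX 5700":    180,
    "RTX 2060":   160,
}

def efficiency_gap(amd: str, nvidia: str) -> float:
    """Percent better perf/watt for the Nvidia card, assuming equal fps."""
    return (BOARD_POWER_W[amd] / BOARD_POWER_W[nvidia] - 1) * 100

print(f"2070 vs 5700 XT: {efficiency_gap('RX 5700 XT', 'RTX 2070'):.1f}%")
print(f"2060 vs 5700:    {efficiency_gap('RX 5700', 'RTX 2060'):.1f}%")
```

On these spec numbers the XT/2070 gap lands near the quoted 25-30% while the 5700/2060 gap comes out smaller; measured power draw in reviews will shift both figures either way.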
 
As much as I'm happy to see AMD has cards that can compete on price/perf again, I do wonder why you would buy a 5700/5700 XT, for two reasons.

1. the blower cooler. I understand there will be no custom cards anytime soon?
I’m not sure this is the forum for these discussions really, but I’ll put on my PC gamer hat for a moment.
Blowers have the advantage that they dump generated heat outside the cabinet. They have the disadvantage that they can’t get rid of a lot of heat through the small PCI-slot holes without high air pressure and noise.
If you have a PC cabinet that focuses on being quiet you will typically have covered front/sides/top to reduce the sound escaping from hard drives and fans. Your airflow will be low.
A blower cooler suits this scenario as the system will still be very quiet under all conditions apart from max graphics card load. And even when gaming these cards are rarely running full out.
I’ve been using a Vega 56 blower for almost two years now, and apart from running benchmarks, I rarely hear it. (When I do, though, I find it irritating and would appreciate it being quieter.) I don’t even consider overclocking or running higher-heat-output cards with my setup.

2. No ray tracing. It will be in consoles next year. It will be in Navi cards next year. It already is on Nvidia cards. Essentially you're buying a card that doesn't perform much differently, nor is a lot cheaper than the competition, but does lack features you know will be used by many games come next year. I assume that price/perf is very important in the midrange market, so why wouldn't you pay a couple of bucks more for a 2060 Super, knowing you can play with all bells and whistles turned on for the next couple of years?
You believe that the 2060 Super will allow you to “play with all bells and whistles turned on for the next couple of years”? This requires you to have predictive abilities that would put Nostradamus to shame, unless you are talking about 720p at 30Hz.
The fact of the matter is that we simply don’t know what titles will be produced, or when, or which lighting calculations RT will be used for, or what benefits and drawbacks that will bring.

What we do know is that all titles offered on PC will run not only on currently available hardware but on hardware a few years old and a couple of steps down the market ladder. The main benefit of the PC as a platform is that it typically allows you to adjust the rendering parameters to suit your preferences. So an RX 5700 or my Vega will be able to play all new titles for the next several years.

Not everyone who is in the market for these cards is a rendering-technology enthusiast. As a game player I don’t care at all how the graphics rendering is achieved. Why should I? I care about results, performance, and the price I need to pay for the privilege of chasing pixels on my screen. Regardless of my future card’s abilities, I doubt I will use ray tracing for much of anything unless efficiency improves drastically. It simply doesn’t enter into my purchase decisions at all.
 
 
https://www.techspot.com/review/1759-ray-tracing-benchmarks-vol-2/

Not well enough for me, right now... It can be good in some games/situations, but it can go south of 60 fps too... 1080p is the safer bet, imo.
It's been said a thousand times, but there is life outside of max/ultra settings. Reviewers are so lazy that people tend to forget this :???:
Usually the visual difference between high and ultra is nearly non-existent and you get 20-40% more fps. Lower a few settings here and there by a notch or two and everything is playable, even with RT, at 1080p 60 fps on an RTX 2060S (in non-CPU-limited scenarios, of course).
 
I think it's even more interesting that AMD is able to fit 2560 stream processors into 10.3 billion transistors, while Nvidia needs 10.6 billion transistors for 2304 stream processors. If we subtract the transistors needed for the RT cores (3%), the resulting numbers (~10.3 billion for both GPUs) are even closer, which makes the comparison even more valid.

Anyway, I believe AMD could have been a bit bolder and used the same configuration as Hawaii: 2816 stream processors (+64 ROPs), which would be a bit more balanced.
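The transistor arithmetic is easy to check. Note that the figures below are the ones quoted in the post above, and the 3% RT-core share is that post's estimate, not an official breakdown:

```python
# Transistors-per-shader comparison, using the figures from the post
# (the 3% RT-core share is an estimate, not an official number).
navi10_transistors = 10.3e9  # 2560 stream processors
tu106_transistors  = 10.6e9  # 2304 shader cores
tu106_without_rt   = tu106_transistors * (1 - 0.03)

print(f"TU106 minus RT cores: {tu106_without_rt / 1e9:.2f} B")
print(f"Navi 10 per shader:   {navi10_transistors / 2560 / 1e6:.2f} M")
print(f"TU106 per shader:     {tu106_without_rt / 2304 / 1e6:.2f} M")
```

Even after removing the estimated RT-core budget, Navi 10 still comes out denser per shader on these numbers, which supports the point being made.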
 
Going from GCN-level TFLOPS efficiency to Turing-level TFLOPS efficiency is really an achievement.
In gaming. Because GCN has always been on par with, and sometimes better than, Nvidia's architectures in compute workloads (minus the use of Tensor Cores). From the looks of it, Navi switched it up (better at gaming workloads and worse at compute).
 
In gaming. Because GCN has always been on par with, and sometimes better than, Nvidia's architectures in compute workloads (minus the use of Tensor Cores). From the looks of it, Navi switched it up (better at gaming workloads and worse at compute).

We don't really know anything about RDNA's performance in compute workloads right now.
 
So an RX 5700 or my Vega will be able to play all new titles for the next several years.

And so will a 2060/Super. The only difference is that with that card you get the choice of whether you want to sacrifice performance for arguably prettier pixels.

Again, as there is little between the AMD and Nvidia cards when it comes to pricing and performance, I wonder why anybody looking for a mid range card would pick AMD over Nvidia.

RT performance looks decent enough in the handful of games that support it, no? Over time, even a steady 30 fps at 1080p wouldn't be that horrible for single-player games. And if you want that 60 fps, turn RT off.

Seems this card should have launched last year.
 