AMD Navi Product Reviews and Previews: (5500, 5600 XT, 5700, 5700 XT)

Seems underhanded to do this for only one vendor and not the other.
The cost of the cooler adds to the product cost, and their comparison is against similarly priced products. IMO the review would have been more interesting if it had also compared overclocks of similarly priced products.
 
I haven't watched the video, but I think that for a comparative review to make sense, it has to be made at iso-something. It can be iso-price, iso-power, iso-performance, iso-noise, but some variable has to be the same on both sides, or you're just comparing things that can't be compared.

So comparing GPU A + custom cooler with GPU B with default cooler makes sense if, for example, the price of GPU A+cooler is the same as that of GPU B alone. Then, you either compare both at stock settings, or you overclock both, to the extent that each GPU will let you. Obviously, GPU A's custom cooler may provide more headroom, but that's fair, because the overall price is still the same.
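
To make the iso-price idea concrete, here is a minimal sketch with hypothetical prices and performance numbers (none of these figures come from any review):

```python
# Minimal sketch of an iso-price comparison. All numbers are hypothetical.
# The idea: only rank configurations by raw performance when their total
# cost is (roughly) equal; otherwise fall back to perf-per-dollar.

configs = [
    # (name, gpu_price_usd, cooler_price_usd, avg_fps_after_tuning)
    ("GPU A + custom cooler", 400, 100, 120.0),
    ("GPU B, stock cooler",   500,   0, 115.0),
]

TOLERANCE = 0.05  # allow ~5% price spread and still call it "iso-price"

prices = [gpu + cooler for _, gpu, cooler, _ in configs]
iso_price = max(prices) - min(prices) <= TOLERANCE * min(prices)

if iso_price:
    # At iso-price, raw performance is the only axis left to compare.
    for name, gpu, cooler, fps in configs:
        print(f"{name}: ${gpu + cooler} total, {fps:.1f} fps")
else:
    print("Totals differ too much; compare perf-per-dollar instead:")
    for name, gpu, cooler, fps in configs:
        print(f"{name}: {fps / (gpu + cooler):.3f} fps per dollar")
```

If the totals match within tolerance, raw performance is a fair ranking; otherwise perf-per-dollar (or some other iso-variable) is the honest metric.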
 
The video is just an overclocking test to see if they can push the RX 5700 XT to RTX 2080 performance with some kind of v-table modding. The end result is: no, they can't. It's not a value comparison, just a test of what overclocking results are possible. There's nothing deceptive or unfair about it.
 
I'm looking for a review showcasing power draw for video playback: H.264, HEVC, Blu-ray, etc., preferably with MadVR as the video renderer. Reviews seem to mostly skip this nowadays. Any assistance would be appreciated!
 
TechPowerUp always tests power draw for media playback. Doesn’t fulfill all your criteria, but still.
 
Found that ComputerBase and PCGH have 4K 60 fps wattage figures:
https://www.computerbase.de/2019-07...st/3/#abschnitt_messung_der_leistungsaufnahme
https://www.pcgameshardware.de/Rade...ase-Benchmark-Preis-Kaufen-Vega-64-1293229/4/
https://www.techpowerup.com/review/amd-radeon-rx-5700-xt/31.html

TechPowerUp doesn't list its settings for media playback, but its figures are lower than the above, which probably means something less demanding: Full HD 60 fps, or 4K 30 fps? Anyway, the results are good.

I guess MadVR power draw has to be derived from gaming (3D) power draw for now, since most of what it does is shader-related.
 
I thought Navi's primitive shaders were on. When I see that activating Game Culling (instead of GPU Culling) is faster, I can't believe it.

https://www.computerbase.de/2019-07/wolfenstein-youngblood-benchmark-test/

Dude, we really need a front-end benchmark tool...
When you let the hardware cull rather than compute, all parts are faster. That means you can't conclude anything about primitive-shader culling from this experiment. Performance characteristics have seemingly changed since the developer recommended enabling GPU culling (compute) on AMD hardware.
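
For context, a compute "GPU culling" pre-pass mostly performs cheap per-triangle rejection tests that the fixed-function front end can otherwise do itself. A toy Python illustration of the screen-space back-face test (purely illustrative; not how Wolfenstein or any driver actually implements it, and real implementations work on projected clip-space vertices in bulk):

```python
# Toy illustration of per-triangle back-face culling, the kind of cheap
# rejection test a compute culling pre-pass performs before the
# fixed-function front end ever sees the triangle.

def signed_area_2d(v0, v1, v2):
    """Twice the signed area of a screen-space triangle.

    Positive for counter-clockwise winding (front-facing, by convention),
    negative for clockwise (back-facing), zero for degenerate triangles.
    """
    return ((v1[0] - v0[0]) * (v2[1] - v0[1])
            - (v2[0] - v0[0]) * (v1[1] - v0[1]))

def cull_backfaces(triangles):
    """Keep only front-facing, non-degenerate triangles."""
    return [t for t in triangles if signed_area_2d(*t) > 0.0]

tris = [
    ((0, 0), (1, 0), (0, 1)),  # CCW: front-facing, kept
    ((0, 0), (0, 1), (1, 0)),  # CW: back-facing, culled
    ((0, 0), (1, 1), (2, 2)),  # degenerate (collinear), culled
]
print(f"{len(cull_backfaces(tris))} of {len(tris)} triangles survive")
```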
 
I would wait for the custom models. The 5700 XT seems to have some OC potential with a better cooler.
TechSpot tested performance of the 5700 XT with an EK water block at 2.1 GHz. I'd expect the AIB custom models' performance to land somewhere between the reference 5700 XT and the tested water block version using Igor's Lab method to bypass the frequency limits.
For what was about a 7% boost in performance on average you’re looking at an almost 40% increase in power consumption as the GPU power draw increased from just 186 watts right up to a Vega-like 258 watts (!).

Looking at total system consumption we see a 25% increase in power draw for the overclocked 5700 XT, making it slightly more power hungry than Vega 64 and around 30 watts more than the RTX 2080 Ti. In other words, the 2.1 GHz overclock destroys RDNA’s efficiency.
...
By simply looking at the power consumption of the otherwise very efficient 7nm Navi architecture, we've learned why AMD has put the restrictions on the frequency these cards can run at.
https://www.techspot.com/review/1883-overclocking-radeon-rx-5700/
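
The efficiency conclusion follows directly from the quoted numbers; a quick sanity check of the perf-per-watt math, using only the figures above:

```python
# Perf-per-watt check from the quoted TechSpot figures:
# ~7% average performance gain for 186 W -> 258 W GPU power draw.

perf_gain = 1.07
power_stock, power_oc = 186.0, 258.0

power_increase = power_oc / power_stock - 1.0            # ~0.387
efficiency_ratio = perf_gain / (power_oc / power_stock)  # OC perf/W vs stock

print(f"Power increase: {power_increase:.1%}")                 # ~38.7%
print(f"Perf/W at 2.1 GHz vs stock: {efficiency_ratio:.2f}x")  # ~0.77x
```

So the 2.1 GHz overclock delivers roughly 0.77x the stock perf-per-watt, i.e. about a 23% efficiency loss, which matches the article's conclusion.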
 
So if I got this right: Navi is basically an unfucked version of Vega on 7nm, with working primitive shaders, or am I talking BS?
 
There is a Navi architecture thread that goes into more detail on what has changed. The changes go well beyond primitive shaders and fixes to unspecified Vega issues.
 