Using the same cooler, what do the RTX 2080 (EOL) and RTX 2080 Super gain?
Seems underhanded to only do one vendor but not the other.
> What's the CPU used?
i9-9900K @ 4.9 GHz.
> Are you testing in Geothermal Valley?
It's Shadow of the Tomb Raider, not Rise. I've given it one more try, BTW. The FPS gain is still within the 0–2% range, considerably worse than in any other game.
> Seems underhanded to only do one vendor but not the other.
The cost of the cooler increases the product cost, and their comparison is against similarly priced products. IMO the review would have been more interesting if it had also compared overclocks of similarly priced products.
Why is that underhanded? They're testing to see if you can overclock the RX 5700 XT to reach stock RTX 2080 performance.
The video is just an overclocking test to see if they can push the RX 5700 XT to RTX 2080 performance with some kind of v-table modding. The end result is no, they can't. It's not a value comparison; it's just a test of what overclocking results are possible. There's nothing deceptive or unfair about it.
The video has this in its title: "Radeon RX 5700 XT Overclocked, RTX 2080-Like Performance".
> I'm looking for a review showcasing power draw for video playback (H.264, HEVC, Blu-ray, etc.), preferably with madVR as the video renderer. Reviews seem to skip this mostly nowadays. Any assistance would be appreciated!
TechPowerUp always tests power draw for media playback. Doesn't fulfill all your criteria, but still.
> I thought that Navi's primitive shaders are on. When I see that activating Game Culling (instead of GPU Culling) is faster, I can't believe it.
> https://www.computerbase.de/2019-07/wolfenstein-youngblood-benchmark-test/
When you let the hardware cull rather than compute, all parts are faster. That means you can't conclude anything about primitive-shader culling from this experiment. Performance characteristics have seemingly changed since the developer made the recommendation to enable GPU culling (compute) on AMD hardware.
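For anyone unfamiliar with the terms being thrown around: "culling" just means skipping draw work for geometry that can't be visible. Engines can do this in a compute pass ("GPU Culling" in the game's menu) or let fixed-function/primitive-shader hardware reject triangles ("Game Culling"). A minimal CPU-side sketch of the visibility test itself (illustrative only, not the game's actual code):

```python
# Hypothetical sketch of a sphere-vs-frustum visibility test, the kind of
# check a compute culling pass runs per object before issuing a draw.

def visible(center, radius, planes):
    """An object survives culling if its bounding sphere is not entirely
    behind any frustum plane. Each plane is ((nx, ny, nz), d) with a
    normalized normal pointing into the visible volume."""
    for (nx, ny, nz), d in planes:
        dist = nx * center[0] + ny * center[1] + nz * center[2] + d
        if dist < -radius:  # sphere is completely outside this plane
            return False
    return True

# One plane facing +x through the origin: anything far enough in -x is culled.
planes = [((1.0, 0.0, 0.0), 0.0)]
print(visible((5, 0, 0), 1, planes))   # in front of the plane -> drawn
print(visible((-5, 0, 0), 1, planes))  # behind the plane -> culled
```

The debate above is only about *where* this test runs (compute shaders vs. the primitive-shader hardware path), not whether culling helps.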
> I would wait for the custom models. The 5700 XT seems to have some OC potential with a better cooler.
TechSpot tested the performance of the 5700 XT with an EK water block at 2.1 GHz, using Igor's Lab method to bypass the frequency limits. I'd expect the AIB custom models' performance to land somewhere between the reference 5700 XT and the tested water-block version.
https://www.techspot.com/review/1883-overclocking-radeon-rx-5700/
> For what was about a 7% boost in performance on average, you're looking at an almost 40% increase in power consumption, as the GPU power draw increased from just 186 watts right up to a Vega-like 258 watts (!).
> Looking at total system consumption we see a 25% increase in power draw for the overclocked 5700 XT, making it slightly more power hungry than Vega 64 and around 30 watts more than the RTX 2080 Ti. In other words, the 2.1 GHz overclock destroys RDNA's efficiency.
> ...
> By simply looking at the power consumption of the otherwise very efficient 7nm Navi architecture, we've learned why AMD has put the restrictions on the frequency these cards can run at.
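The quoted figures can be sanity-checked with a little arithmetic (all input values come straight from the TechSpot excerpt above):

```python
# Recomputing the efficiency hit from the numbers quoted by TechSpot.
stock_power = 186   # W, GPU-only power draw at stock (from the quote)
oc_power    = 258   # W, GPU-only power draw at the 2.1 GHz overclock
perf_gain   = 0.07  # ~7% average FPS uplift (from the quote)

power_increase = oc_power / stock_power - 1
print(f"power increase: {power_increase:.1%}")  # ~38.7%, i.e. "almost 40%"

# Relative performance per watt: performance ratio over power ratio.
perf_per_watt = (1 + perf_gain) / (oc_power / stock_power)
print(f"relative perf/W: {perf_per_watt:.2f}")  # ~0.77, a ~23% efficiency loss
```

So the ~7% gain costs roughly a quarter of the card's performance-per-watt, which is why the quote concludes the overclock "destroys RDNA's efficiency".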
I thought that Navi's primitive shaders are on. When I see that activating Game Culling (instead of GPU Culling) is faster, I can't believe it.
https://www.computerbase.de/2019-07/wolfenstein-youngblood-benchmark-test/
Dude, we really need a front-end benchmark tool...
> When you let the hardware cull rather than compute, all parts are faster. That means you can't conclude anything about primitive-shader culling from this experiment. Performance characteristics have seemingly changed since the developer made the recommendation to enable GPU culling (compute) on AMD hardware.
So if I got this right: Navi is basically an unfucked version of Vega on 7nm, with working primitive shaders, or am I talking BS?
https://forum.beyond3d.com/threads/amd-navi-speculation-rumours-and-discussion-2019.61042/
There is a Navi architecture thread that goes into more detail on what has changed. There are more substantial changes than primitive shaders and fixes to unspecified Vega issues.
Yes. While there's mostly speculation at the start, later updates in the thread start sourcing from LLVM and driver changes, and then from AMD's announcements. Within the last day or so, there's a link to a slide presentation outlining a decent amount of the CU and architectural modifications.