AMD Vega Hardware Reviews

Performance/watt is IMO one of the best performance metrics, since it lets you compare how much useful work each chip does, from the end user's perspective, per watt. You can then easily compare those figures to the theoretical numbers, too.
It's a metric that wants to be useful, but unless you equalize performance or wattage it's meaningless. A large chip could have the same perf/watt as a small one, yet clocked a bit lower be way out in front in both performance and perf/watt. Cut Vega down even further than Power Saving and they'd likely be ahead of Pascal.
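A toy illustration of that point (all the numbers below are made up): two cards can post the exact same perf/watt figure while being nowhere near each other in absolute performance, which is why the metric only means something once performance or power is held equal.

```python
# Hypothetical sketch: identical perf/watt, very different absolute performance.
# All figures are invented for illustration, not measurements.
cards = {
    "small chip": {"fps": 60.0, "watts": 120.0},
    "big chip, downclocked": {"fps": 100.0, "watts": 200.0},
}

for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['watts']:.2f} fps/W at {c['fps']:.0f} fps")

# Both print 0.50 fps/W, yet the larger chip is ~67% faster.
```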

Another interesting side effect of the LN2 is that Vega's power draw fell off a cliff. Like 100W less than air cooled at +50% power limit. So the GPU core evidently is very happy with LN2 and could probably go well in excess of 2GHz with more voltage (I did all my testing at stock, which is 1.2V).

https://cxzoid.blogspot.co.uk/2017/08/first-impressions-of-vega-fe-on-ln2.html?m=1
Definitely some leakage issues; I don't suppose anyone has tested with a thermal limit as opposed to a power limit?
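For intuition on why the draw collapses under LN2, the usual hand-wavy split is dynamic power plus a leakage term that grows roughly exponentially with die temperature. A minimal sketch, with every constant invented purely for illustration (none of this is measured Vega data):

```python
import math

def total_power_w(v, f_ghz, temp_c, c_eff=0.12, leak_25c_w=40.0, k=0.015):
    """Toy GPU power model: dynamic (C*V^2*f) plus temperature-dependent leakage.
    Every constant is a made-up illustration, not measured Vega data."""
    dynamic = c_eff * v**2 * f_ghz * 1000.0               # switching power
    static = leak_25c_w * math.exp(k * (temp_c - 25.0))   # leakage rises ~exponentially with temp
    return dynamic + static

print(total_power_w(1.2, 1.6, temp_c=85.0))    # hot, air-cooled die: ~375 W in this toy model
print(total_power_w(1.2, 1.6, temp_c=-50.0))   # sub-ambient die: ~289 W, the leakage term has collapsed
```

A thermal limit rather than a power limit would be the cleaner way to separate the two terms, since it pins the leakage contribution while letting clocks float.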
 
While it's not published (since that architecture section wasn't done in time), I did run that same benchmark. I'm getting much better numbers than that on Vega 64. About 20% better, to be precise.

Thank you for providing that, good to see they've fixed the apparent regression on that test.

Yep, that's been fixed. Same here on time constraints. Should I take it as a coincidence that Vega 64's effective texture bandwidth appears to have improved by roughly the same ~60 GB/s that the raw memory bandwidth did?

I would like to point out, though, that the AIDA GPU benchmarks shown below were part of our Vega FE review (I guess?) and not originally present in the RX Vega article. I've added them since, with the new and improved numbers for RX Vega - can't re-bench the FE at the moment, though.

Thanks to you as well. Do either of you have any ideas for why Vega 64 still shows a significant regression in effective texture bandwidth vs. Fiji? Or to what extent primitive shaders, primitive culling and DSBR are actually working in drivers at the moment (which may be a closely related question)?
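For reference, the raw figures in that comparison fall straight out of the published memory configurations (Fiji: 4096-bit HBM at 500 MHz, Vega 64: 2048-bit HBM2 at 945 MHz), while the "effective" number is just whatever fetch throughput the benchmark actually achieves against that ceiling. A quick sketch of the raw side:

```python
def raw_hbm_bandwidth_gbs(bus_width_bits: int, mem_clock_mhz: float) -> float:
    """Raw DRAM bandwidth: bus width x double-data-rate clock, in GB/s."""
    return bus_width_bits * (mem_clock_mhz * 2) * 1e6 / 8 / 1e9

print(raw_hbm_bandwidth_gbs(4096, 500))   # Fiji / Fury X: 512 GB/s
print(raw_hbm_bandwidth_gbs(2048, 945))   # RX Vega 64:   ~484 GB/s

# The benchmark's "effective" texture bandwidth is measured throughput, so it can
# move with driver changes even while the raw ceiling above stays put.
```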
 
The most relevant power numbers are at the default setting (Balanced, I guess), where most users will remain. Once you start saying "but it can do this and this if you do this and this," well, guess what, you can do that stuff on other cards too. Since this is an infinitely deep rabbit hole, reviewers have to stick with the out-of-box experience barring some extraordinary circumstances.
 
Because who would buy a high-end GPU and then neuter performance?
At the very least, the same people who keep nagging about power consumption.



It's not like power saving puts the Vega 64 in a different performance bracket. The 3-4% performance difference doesn't make Vega 64 perform substantially lower than the 1080, at least not in the titles where they're close.
 
The most relevant power numbers are at the default setting (Balanced, I guess), where most users will remain. Once you start saying "but it can do this and this if you do this and this," well, guess what, you can do that stuff on other cards too.
There's an easily accessible power saving mode toggle in Nvidia drivers for Pascal cards?
 
AdoredTV benchmarked the Liquid Cooled edition and said he'd likely stick to Power Saving mode, not even Balanced. As with most reviews there's not much information yet other than standard benchmarks, due to AMD only giving everyone two days to test. In Prey, Vega was pulling 200W more in Turbo over Power Saving, at 25% higher core frequency with 10% higher performance.

IMO the decision to stay with the same four compute engines and ROP ratio as Fiji has killed Vega and makes any new GCN part DOA as well.

 
The most relevant power numbers are at the default setting (Balanced, I guess), where most users will remain. Once you start saying "but it can do this and this if you do this and this," well, guess what, you can do that stuff on other cards too. Since this is an infinitely deep rabbit hole, reviewers have to stick with the out-of-box experience barring some extraordinary circumstances.
The issue is that the low-power modes have little to no effect on performance. Balanced is roughly Turbo minus 50W. Power Saving was costing 3fps while taking consumption even lower. Take a title Vega is strong in, and both the 1080's perf/watt and equivalent performance likely exist.
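As a rough sanity check on those numbers: only the +200W / +10% relationship comes from that review; the baseline figures below are hypothetical, but they show how badly the last 10% of performance costs in efficiency terms.

```python
# Hypothetical Power Saving baseline; deltas per the quoted Prey numbers.
power_saving = {"fps": 90.0, "watts": 165.0}                 # assumed baseline
turbo = {"fps": 90.0 * 1.10, "watts": 165.0 + 200.0}         # +10% perf, +200 W

for name, m in (("Power Saving", power_saving), ("Turbo", turbo)):
    print(f"{name}: {m['fps'] / m['watts']:.2f} fps/W")

# Power Saving: 0.55 fps/W, Turbo: 0.27 fps/W - roughly half the efficiency
# for the last ~10% of performance in this toy example.
```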
 
The slide said it was heavily used by the resolve pass, but the context was FP16 optimizations in general,
Nope, they specifically use FP16 just for the checkerboarding; the base PS4 game doesn't use it due to lack of support. And the slide's context was 4K checkerboard as well.
There are many other areas to use FP16 not covered in that presentation that would be passing data to that shader. Not all of them would have been covered.
They were not outlined in that presentation; even though it covered several other optimization techniques, they only mentioned FP16 within the context of their upscaling methodology. So that's 30% more performance for their upscaler only. The upscaling itself is only a small part of a full frame, so saving 30% of its rendering time might only translate to a tiny fraction of performance improvement in the full frame.
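That intuition is just Amdahl's law. A minimal sketch, where the 30% saving is the figure from the presentation and the resolve pass's 10% share of the frame is purely an assumption:

```python
# Amdahl-style estimate: speeding up only the resolve pass barely moves the frame.
frame_ms = 33.3            # 30 fps frame budget
resolve_share = 0.10       # assumption: resolve is 10% of the frame
resolve_speedup = 0.30     # FP16 makes the resolve ~30% cheaper (per the talk)

saved_ms = frame_ms * resolve_share * resolve_speedup
print(f"{frame_ms:.1f} ms -> {frame_ms - saved_ms:.1f} ms "
      f"({saved_ms / frame_ms * 100:.1f}% whole-frame gain)")
# ~1 ms saved, i.e. about a 3% improvement on the whole frame.
```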
 
It is a shame, but I wouldn't get too attached to VP9. Only YouTube uses it and it's not at all scalable for the future; AV1 should be replacing it... (waits impatiently for the colossal slowness that is AV1...)

YouTube's kind of a big deal though.....

It's going to be a while before AV1 gets into hardware decoders for GPUs. By that time I expect that there will be a mainstream GPU that can do 4k@60fps with HDMI 2.1 & VRR support, so I think I'll be able to feel good about buying a new card by then.
 
There's an easily accessible power saving mode toggle in Nvidia drivers for Pascal cards?
Yep, under Manage 3D Settings. Power Management Mode (Optimal Power (default), Adaptive, or Prefer Max. Performance).
 
It's a metric that wants to be useful, but unless you equalize performance or wattage it's meaningless. A large chip could have the same perf/watt as a small one, yet clocked a bit lower be way out in front in both performance and perf/watt. Cut Vega down even further than Power Saving and they'd likely be ahead of Pascal.


Definitely some leakage issues; I don't suppose anyone has tested with a thermal limit as opposed to a power limit?

Reminds me of Fiji. Cooler = less power draw, in a noticeable way. My Fury X loves my custom loop: big decrease in temps, even compared to the AIO it came with, and less power draw. LN2 is extreme, but I'm curious to see the card under a custom loop with an XSPC or EK waterblock.
 
It's clearly not a priority for them, which is a shame. I hope they have a better solution by the time they launch Raven Ridge.

As an aside, this was the tipping point that made me pull the trigger today on buying a custom OC'd 1080 @ $499 over waiting and trying to land a Vega 56 reference card (miner demand putting the cheapest 1070 @ $429 took that out of consideration). There were lots of factors that ultimately led to that decision, but Vega having hardware acceleration for VP9 profile 2 @ 4K would have been enough for me to be willing to wait. But adding nothing over Polaris when Nvidia have added both VP9 up to 8K (1060) and further added VP9 profile 2 up to 8K (1050/1030) to products released since was just one fail too many.
I don't get why people care about 4K DXVA when madVR + LAV offers image quality far superior to any DXVA solution.
 
Yep, under Manage 3D Settings. Power Management Mode (Optimal Power (default), Adaptive, or Prefer Max. Performance).
You mean Nvidia runs their cards on power saving by default.
And what are the performance and power differences between those modes?
 
https://www.overclock3d.net/news/gpu_displays/amd_s_rx_64_launch_pricing_was_only_for_early_sales/1

Below is a quote from Overclockers UK's Gibbo, who reported that AMD's £449.99 price tag for the RX Vega 64 was "launch only" pricing. This price only applied to the retailer's initial allotment of RX Vega 64 Black standalone GPUs, with AMD giving gamers a £100 discount as a form of "early adopter" incentive.

WTF. So the $499 price for Vega 64 was only a launch discount for under an hour and now the actual price is $599? That's disgusting.
 
I don't get why people care about 4K DXVA when madVR + LAV offers image quality far superior to any DXVA solution.

What are you talking about? You can (and I do) use DXVA decoding in LAV and output that via madVR just fine. Have you tried software decoding HEVC or VP9 at high bitrates? Not so great. Also, streaming video: I can watch even 8K (non-HDR) videos on YouTube on my 1060 without a single stutter. Try that with CPU decoding.
 
You mean Nvidia runs their cards on power saving by default.
And what are the performance and power differences between those modes?
From Reddit:
Prefer Max Performance: locks the GPU into a higher voltage and clock state; your GPU will stay at its '3D application/game' clocks in all situations and not lower itself into an idle state.
Adaptive: the GPU will reduce clock speeds and voltages when it isn't under heavy load, i.e. when browsing the web or watching a video.
Optimal Power: basically everything Adaptive does; however, if your GPU is doing nothing (i.e. sitting on the desktop), Optimal Power won't keep re-rendering the same frame over and over - if it's the same frame it'll just pull it from memory and display it.
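Nobody here has posted measured differences between those modes, but if anyone wants to, something like the sketch below could log average board power and SM clock while a game runs under each setting. It assumes a Pascal card with a recent driver; the nvidia-smi query fields are standard ones, and the 30-second window is arbitrary.

```python
import subprocess
import time

def sample_power(seconds: int = 30, interval: float = 1.0) -> None:
    """Poll board power and SM clock via nvidia-smi; run once per Power Management Mode."""
    samples = []
    end = time.time() + seconds
    while time.time() < end:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=power.draw,clocks.sm",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        watts, sm_mhz = (float(x) for x in out.strip().split(", "))
        samples.append((watts, sm_mhz))
        time.sleep(interval)
    avg_w = sum(w for w, _ in samples) / len(samples)
    avg_clk = sum(c for _, c in samples) / len(samples)
    print(f"avg {avg_w:.1f} W, avg SM clock {avg_clk:.0f} MHz over {seconds} s")

sample_power()
```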
 