AMD Vega Hardware Reviews

Vega FE performs a lot better undervolted than at stock settings, and power consumption decreases as well.
It is difficult to precisely pinpoint the issue causing HBM2's brutal downclocking by 445MHz, but we have seen it happen routinely, on multiple systems and in multiple environments.
It seems AMD's firmware team will never learn. It's always been like a bunch of juniors with no QA.

They've got brilliant engineers like Mr. Naffziger inventing all the shiny power/AVFS stuff: adaptive clocking, fine-grained sensors, voltage adaptation at boot, silicon aging probing/calibration, etc. On the other hand, they have Vega running its HBM2 about 0.5GHz lower than spec, the horrible Ryzen launch AGESA, Kaveri dropping its speed to exactly 3.0GHz any time you put any load on the GPU, etc.
 
Notice that they said for absolute stability, they needed 1130mV. A 70mV safety band is pretty tame. I wouldn't call Vega overvolted. NVIDIA has an even larger safety band from what I've seen but you don't see anyone running around blaming them for overvolting at stock.

Simply put, asking for lower voltage is unrealistic.
 
NVIDIA has an even larger safety band from what I've seen but you don't see anyone running around blaming them for overvolting at stock.
First of all, NVIDIA has done a much better physical layout (see perf/power on GP108) and also has way better binning. Better binning = lower voltage.
 
What CRT has HDMI 2.1 support?
Nice way to miss the point, but at least you seem to have company.

There are other display technologies that have the same behavior: plasma, some OLED, some LCD backlights, ... Anything where there is some kind of pulsing involved. And since this is an active field of research, who knows what the future will bring.

HDMI is an interconnect standard and the core spec is about being able to push large amounts of pixels.

The job of HDMI is not to enforce the presence of specialty features. Variable refresh rate is a specialty feature, just like, for example, audio or 3D stereo are specialty features.

It'd be ridiculous to enforce a specialty feature that can't even be supported by all displays.

Furthermore, variable refresh rate is useless for, say, a future 8K workstation monitor that only needs BW.

It makes total sense to separate a core minimum spec from side features that are not needed in many cases.

Do you expect the HDMI spec to contain a rule like "variable refresh rate is not required, except for LCDs with the following backlight; for those, support is mandatory"?
 
It'd be ridiculous to enforce a specialty feature that can't even be supported by all displays.

Furthermore, variable refresh rate is useless for, say, a future 8K workstation monitor that only needs BW.

Who says the displays have to require it? It could merely be required for all output devices to support it. So if you want to have a device that can output 2.1, it has to be able to support VRR; it's not that VRR is required for all screens.
 
I'm a little confused when he talks about power, because he says something like power target limits the amount of current (amps) to the GPU, but power is watts.
Surely if the voltage is a set value, reducing the power will reduce the current.
P = I x V or
I = P / V
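For example, with some hypothetical round numbers (not measured Vega values), at a fixed 1.2 V core voltage:
I = 220 W / 1.2 V ≈ 183 A
I = 180 W / 1.2 V = 150 A
So yes, lowering the power limit at a fixed voltage lowers the current the card is allowed to draw.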
 
Surely if the voltage is a set value, reducing the power will reduce the current.
P = I x V or
I = P / V

Yah, I get that.

There's just some terminology confusion here. They talk about increasing power target while "under-volting" the card, but also conclude it draws less W in that state ... so "power target" really is current? Current should increase when you lower voltage, but then power is the wrong word. Just strange, because they use power to mean watts or current interchangeably.

Or I guess the load is fixed in this case, so if you lower voltage current will drop?
 
Power target is a setting in Wattman, and it defines the GPU's consumption in watts. If you fix the core voltage V, then per the P = V x I formula, setting the power P is done by changing the current I.
If V is constant, then dP = V x dI
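As a rough illustration (made-up numbers, not anything AMD publishes): with V fixed at 1.2 V, trimming the power target by 24 W means
dI = dP / V = 24 W / 1.2 V = 20 A
Conversely, undervolting while keeping the same power target raises the implied current, since I = P / V:
200 W / 1.2 V ≈ 167 A vs. 200 W / 1.07 V ≈ 187 A
That would also square with the earlier observation that an undervolted card can draw less power overall: the power target is only a cap in watts, and a real load can sit below it.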
 
Who says the displays have to require it? It could merely be required for all output devices to support it. So if you want to have a device that can output 2.1, it has to be able to support VRR; it's not that VRR is required for all screens.
So an IPTV set-top box that only does an MPEG decode at constant rate would also need to support variable refresh rate?
 
Nice way to miss the point, but at least you seem to have company.

There are other display technologies that have the same behavior: plasma, some OLED, some LCD backlights, ... Anything where there is some kind of pulsing involved. And since this is an active field of research, who knows what the future will bring.

I'm confused. Why would variable refresh exclude constant refresh? Isn't the latter a subset of the former?
 
Never! The poor thing is a reference edition running at pitifully low clocks, it's well behind a 1070 even
All the cards in that chart seem to be reference so I'm not sure what you're getting at. I guess you could argue that the 980Ti overclocks very well (especially compared to AMD cards) and as such will come out better in a comparison of overclocked cards.

But my goodness, I was not expecting this level of suck from Vega. It makes R600 look like a resounding success. What the hell happened?
 
Who says the displays have to require it? It could merely be required for all output devices to support it. So if you want to have a device that can output 2.1, it has to be able to support VRR; it's not that VRR is required for all screens.
How would it make sense to require all output devices to support variable refresh rate when the vast majority of source content isn't variable refresh and probably never will?

How would you even test for it?

"Hey 8K BlueRay player, produce some variable refresh rate content out of thin air, otherwise you don't get the HDMI stamp!"
 
Why would variable refresh exclude constant refresh? Isn't the latter a subset of the former?
It wouldn't. All I'm saying is that it'd be dumb for HDMI to require devices to support variable refresh rate in order to get certified.
 
But my goodness, I was not expecting this level of suck from Vega. It makes R600 look like a resounding success. What the hell happened?
Well, that is what all the debate is about, isn't it? Just expanding Polaris to 64 CUs and 512-bit GDDR5, and running it at 1400MHz, would be a bit smaller than Vega, wouldn't draw more power, and would perform much better than we have seen so far. So why Vega, and why more than a year later?
I'm not comfortable with the assumption that AMD's graphics folks are totally incompetent, however. I hope AMD introduces Vega properly, explaining their design goals, the features of the new design, et cetera, giving us something better than just a few puzzling benchmark scores.
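For what it's worth, a back-of-the-envelope check on that hypothetical scaled-up Polaris (assuming the usual GCN 64 shaders per CU at 2 FLOPs/clock FMA, and 8Gbps GDDR5; both are my assumptions, not anything AMD has stated about such a part):
64 CUs x 64 shaders x 2 FLOPs x 1.4 GHz ≈ 11.5 TFLOPS
512 bit x 8 Gbps / 8 = 512 GB/s
That's roughly the same ballpark of raw throughput as Vega 10, which is exactly why the "why Vega" question is so puzzling.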
 