AMD Vega Hardware Reviews

Stable max before was 960 MHz (might be a bit more, but 980 definitely was not). GPU-Z has been updated lately to (more?) correctly detect the HBM manufacturer. I have not tested mining efficiency at the same clock speeds, but I am getting 38.5 MH/s as well at roughly the clocks you stated: 995/1100 MHz, 71/77/81 °C for the three temps (HBM obviously being the hottest), at 2,000-2,100 rpm fan speed in a rather small desktop case. GPU-Z tells me 0.875 V, though it might be higher; I did not measure directly at the card here at home. The system pulls 205 W at the wall, GPU-Z says GPU-only power draw is 116 W, and normal idle for this system is around 35 W.

GPU-Z 2.4.0 tells me my GPU power is 174 W at 1.0188 V, where before it was 128 W at 0.918 V. This pretty much matches what my Kill-A-Watt is showing: an increase of about 20 W at the wall, with a total of 452 W mining at 38.5 + 29.7 MH/s (Vega 56 + Fury) compared to 430 W mining at 34.7 + 29.7 MH/s before flashing. Same GPU clocks; the only differences are HBM now at 1100 MHz versus 930 MHz before, and a higher vGPU floor. It is still worth it, as the extra ~3.8 MH/s costs me only around 22 W of power.
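For reference, a quick sanity check of those figures (plain Python; the numbers are taken straight from the readings above, and the ~3.8 MH/s / ~22 W split is just the difference between the two wall measurements):

```python
# Wall power and hashrates before/after flashing the 64 BIOS (figures from the post above)
before_w, before_mhs = 430, 34.7 + 29.7   # Vega 56 on stock BIOS + Fury
after_w,  after_mhs  = 452, 38.5 + 29.7   # Vega 56 on 64 BIOS (HBM @ 1100 MHz) + Fury

# Overall mining efficiency at the wall
print(f"before: {before_mhs / before_w * 1000:.0f} kH/s per W")
print(f"after:  {after_mhs / after_w * 1000:.0f} kH/s per W")

# Marginal cost of the extra hashrate gained by the flash
extra_mhs = after_mhs - before_mhs   # ~3.8 MH/s
extra_w   = after_w - before_w       # ~22 W at the wall
print(f"flash gains {extra_mhs:.1f} MH/s for about {extra_w} W extra")
```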

The only problem I see on my side compared to your results is temps! I need a 4200 RPM fan speed to keep HBM2 below the 85 °C throttling point, as hashing drops to 2x MH/s as soon as it reaches 87 °C.
This is after 1 h of mining: [GPU-Z sensor screenshot]


I suspect bad assembly on my card and lack of proper contact between the HBM modules and the heatsink. Air temperature coming off the vent is at 50 °C. I have a good gap between the Fury and the Vega, but will test later today without the Fury as I'm selling it.

PS. Which BIOS did you use? I loaded the PowerColor BIOS (my card's original manufacturer) from TPU, but there is also a slightly newer revision for Sapphire on their site.
 
I used the AMD reference VBIOS from TechPowerUp. I always have a gut feeling (meaning I've seen it, but of course cannot prove that it's always the case) that AIBs try to eke out 1-2% more performance by upping voltages/power budgets disproportionately - and for that, electricity is just too expensive where I live. That's why I went with the reference VBIOS. My card officially is from XFX, but that's just the box.

edit: What's important is air intake temp (basically case temperature on the inside). My case is a very low power build except for the Vega (and the darn idle mode, which is at least 10 watts too high), so the fan gets very cool air to work with.
 
So after tinkering with the voltages for the past couple of days, I have a few interesting observations. The card can run at around 1.5-1.6 GHz with 1 V, but the power consumption/stability/GPU usage in some applications makes it prohibitive, and those applications downclock the card pretty hard - so much so that the auto option outscores the custom settings. The Superposition benchmark is the biggest culprit, where the card downclocks to below 1.2 GHz unless the custom voltages are lower than 930 mV. Firestrike runs okay, but Ultra makes more demands on the card and it shows similar behaviour in the latter parts of the first test.

I still get 10% better scores than stock with a lower voltage that works everywhere and lower power consumption than stock, but on overclocking forums I see cards doing 20%, even 30% over this. So the 64 BIOS can help a great deal, but I'm leery of overvolting the memory, and I'm not sure whether the rest of the card's circuitry is the same as the 64's.

To reiterate the comment made before, Vega has very good potential, since it can touch these high core clocks regularly, and a revision of the silicon could easily yield AMD a ~20% performance boost. It could be much better competition for Pascal before Volta drops.

edit: Spoke too soon. Crysis 3 doesn't like the undervolt, but worse than that, the drivers/game seem to be fubar: GPU usage dips into the 60-70s and even goes down to 12% in some places with fps < 20. Look in another direction and it's over 100. :mad:
 
Open case, no side panel as I was mining, plus I have a big 300 mm fan blowing cold air on the cards ... even two R9 290Xs sandwiched ran happily in the 50-70 °C range before.
 
Oh, wow. Maybe it's the rather high core voltage. What did GPU-Z say - 1.0188 V? That's usually a sign that something with the undervolt has gone wrong and the voltage has jumped back up, I think.
 
Some early comparison between the 1070 and this card: VSR has fewer choices but looks sharper at all resolutions. On the Nvidia card I have to turn down the smoothing to look that crisp even at 4K, while the other resolutions are basically useless. The text on the desktop looks better too.

I'll use the Crysis trilogy for a start and check how well these cards get rid of jaggies and shader aliasing with the supersampling modes.

It's hit and miss. Besides the chip lottery, most results you see on the net are from people who didn't properly proof-test it against a wide range of workloads.

Indeed, but those benchmarks load the graphics card quite heavily. As I said, Superposition was dipping the clock speeds way down, and it's a long test. I think it's more that the load was jumping around; it's easier for the card to maintain stability in benchmarks with constant loads.
 
I have a similar experience to yours. Undervolted, everything seemed nice (except fan noise), but it seems that the card can start increasing the max voltage behind my back for reasons that are totally opaque to me.
Investigations tracking the voltages are still underway, but even though I can see that the card increases voltage => higher power draw => higher temps & throttling, I have no idea what to do about it. At some point it simply doesn't seem to respect the max voltages I've set in WattMan, for whatever reason. I'm stumped.
 
Water blocks.
:)

You're right on the money. I just now realized that it's throttling the clock speeds due to the temperature target (which I hadn't been thinking much about) but not pulling back the voltages, so it's maddening to see the card pull more watts than stock yet run lower clocks than stock. It bodes well for custom cards, though - except ASUS, who really like to mess up their AMD versions. I ought not to have been so impatient, but this was a pretty good deal.

Apparently the voltage control under memory is actually a core voltage floor, or at least that's what I read. So the 'manual' voltage set for the core will bounce around relative to the 'manual' voltage set under memory.
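If that's right, the interaction would look roughly like this (a toy sketch of the behaviour as I understand it, not WattMan's actual logic; the function and names are made up):

```python
def effective_core_voltage(core_setpoint_v: float, hbm_setpoint_v: float) -> float:
    """Toy model: the voltage entered under 'memory' in WattMan acts as a floor
    for the core rail, so under load the core never drops below it."""
    return max(core_setpoint_v, hbm_setpoint_v)

# Example: core undervolted to 0.95 V, but the HBM voltage left at 1.05 V
print(effective_core_voltage(0.95, 1.05))  # -> 1.05, the core sits at the HBM floor
```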


edit: DF's recent video shows the Vega 64 only 12% better than a Fury X in Crysis 3 (at 3:30 in the video).

 

I've tried loading the AMD BIOS for Vega 64 and the results are identical. It looks to me like each GPU has predetermined voltage characteristics set on the die after the binning process; the BIOS then applies the same voltage values to every card, but the final voltage is selected depending on the chip's pre-set configuration.
That would explain why different Vega cards seem to have different minimum voltages when there is load present.
BTW, my Vega idles at 0.7562 V.
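Purely to illustrate that idea (assumed structure and invented numbers, not the real BIOS format):

```python
# Toy model of the theory above: one voltage table shipped in the BIOS for all cards,
# plus a per-chip offset fused in at binning time that picks the final value.
BIOS_VID_TABLE_V = {0: 0.800, 4: 0.950, 7: 1.200}   # DPM state -> base voltage (invented values)

def final_voltage(dpm_state: int, fused_offset_v: float) -> float:
    """Final voltage = common BIOS table entry + chip-specific fused offset."""
    return BIOS_VID_TABLE_V[dpm_state] + fused_offset_v

# Two cards running the same BIOS end up at different voltages because of binning
print(f"{final_voltage(7, -0.05):.2f} V")  # a well-binned chip: 1.15 V
print(f"{final_voltage(7,  0.00):.2f} V")  # an average chip:    1.20 V
```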

PS. Temps while mining are better without the 2nd card in the system, and now my fan settles at ~2550 RPM with the settings from my previous post.
 
Looks like the mystery of the random throttling is solved: GPU-Z shows a hotspot temperature as well, and that easily goes over 100 °C with higher power limits; even at stock it's over 90 °C.

So while the card can be far away from the (core) temperature limit and the power limits, the throttling starts once this temperature goes into the 90s. The best I can find about this temperature is that it is collected from various temperature sensors and is likely the hottest VRM on the card. It might also be causing the random core voltage fluctuations.
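In other words, the behaviour looks roughly like this (a hypothetical sketch with assumed thresholds, based only on the observations above, not AMD's actual firmware logic):

```python
def should_throttle(edge_temp_c: float, hotspot_temp_c: float,
                    power_w: float, power_limit_w: float) -> bool:
    """Sketch of the observed behaviour: clocks drop once the hotspot sensor
    (likely the hottest VRM reading) crosses ~90 C, even while the edge/core
    temperature and the power draw are still well inside their limits."""
    HOTSPOT_LIMIT_C = 90   # assumed, from the observations above
    EDGE_LIMIT_C = 85      # assumed core temperature target for this sketch
    return (hotspot_temp_c > HOTSPOT_LIMIT_C
            or edge_temp_c > EDGE_LIMIT_C
            or power_w > power_limit_w)

# Edge temp and power look fine, but the 96 C hotspot alone triggers throttling
print(should_throttle(edge_temp_c=75, hotspot_temp_c=96, power_w=220, power_limit_w=260))  # True
```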

Whichever board partner tames it will have a 200-250 MHz advantage over the stock card.
 
Looks like 3rd-party Vega boards may be as late as some time in November now, according to rumors (hardware.fr).

Epically fucked-up product launch. *sigh* I really should just buy a 1080 and be done with it. :(
 
If I had to guess, the 3rd parties are trying to wait for AMD to release a driver for Vega that enables DSBR / primitive shaders in auto-mode, so that their aftermarket versions compare better to competitor cards when they get reviewed. That way they can justify a price higher than the current SEP.

If I had to guess around the first guess, 3rd parties are definitely not happy with how long it's taking AMD to enable that. I know I'm not.
 
If I had to guess, the 3rd parties are trying to wait for AMD to release a driver for Vega that enables DSBR / primitive shaders in auto-mode, so that their aftermarket versions compare better to competitor cards when they get reviewed. That way they can justify a price higher than the current SEP.

If I had to guess around the first guess, 3rd parties are definitely not happy with how long it's taking AMD to enable that. I know I'm not.
Or it's AMD itself holding back the AIB cards, wanting a second, less disastrous round of reviews for RX Vega.
 
"Disastrous" isn't nearly accurate.
No review I know of ever called any Vega card "disastrous".
 
Or the supply situation for HBM2 is bad and the different packages caused further delay. But with AMD GPUs we are always waiting for the mystical moment that will enable the full potential and heal all problems.
 
DSBR is already enabled; it was shown as enabled in AMD's Vega marketing slides with an old driver branch, and that driver was released a long time ago. We are already on a newer driver branch. As for primitive shaders, I suggest we take whatever AMD has told us about them with a huge sack of salt: if they were even partially the game changer the evangelists want, AMD would have started with them before any other feature and released Vega with them enabled out of the door. AMD also never once talked about their potential performance uplift; they were happy to talk about a projected ~15% fps uplift from RPM, and even the limited DSBR uplift, but said nil about primitive shaders.
 
Shitposting aside, they've launched it with certain uarch features disabled.

Which is sad enough, but with each passing day I am more convinced that the performance gain in normal scenarios will be very limited, and maybe some features do not work correctly.
 