AMD Vega Hardware Reviews


If the HBM2 temperature throttling is the cause of the poor bandwidth results (on the B3D test suite), hopefully the 4-Hi stacks on RX Vega will have an easier job of dissipating heat.

Is it possible to underclock the memory on the FE? I wonder what bandwidth results it would give at half the frequency. IIRC, like GDDR5 (but unlike HBM1), effective performance can drop if the frequency is pushed too high, because of error-correction overhead.

(Although since AMD are comparing RX Vega to the GTX 1080 non-Ti at the roadshow, I don't expect any massive performance increases.)
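
If anyone with an FE on Linux wants to try it: I'm not sure whether WattMan exposes a memory underclock, but the amdgpu driver normally lets you force a lower memory state through sysfs. A rough sketch of what I mean (the card path, the state indices, and whether the FE's launch driver exposes these files at all are my assumptions):

[code]
#!/usr/bin/env python3
# Hedged sketch: force a lower HBM2 memory state via the Linux amdgpu
# sysfs interface (run as root). Whether Vega FE exposes these files
# at launch is an assumption, not something I've verified.

CARD = "/sys/class/drm/card0/device"  # adjust cardN for your system

def read(path):
    with open(path) as f:
        return f.read()

def write(path, value):
    with open(path, "w") as f:
        f.write(value)

# Switch to manual DPM control so a forced state actually sticks.
write(f"{CARD}/power_dpm_force_performance_level", "manual")

# List the available memory states, e.g. "0: 167Mhz" ... "3: 945Mhz *".
print(read(f"{CARD}/pp_dpm_mclk").strip())

# Force the lowest memory state (index 0), then re-run the bandwidth tests.
write(f"{CARD}/pp_dpm_mclk", "0")
print("Now at:", read(f"{CARD}/pp_dpm_mclk").strip())
[/code]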
 
If AMD is using support for VESA's adaptive sync standard as leverage to sell Vega for a lower price/performance ratio than the competition, then shit's gonna hit the fan.
At any moment nvidia could just enable adaptive sync in their drivers, and suddenly every FreeSync monitor with DP1.2a would support it.

nvidia is being an ass for not supporting it at the moment (or rather, by only supporting it in laptops, meaning the hardware capability is there), but AMD is painting a big target on their backs if they're counting on this being an irreversible decision.


In the end, AMD's marketing keeps doing the strangest things. Zero info on the RX Vega cards, but now they're showing them around in totally random places, inside sealed boxes.
So here's this sealed box and that sealed box. You probably won't notice any difference between these two sealed boxes using these games and settings we cherry-picked, and you'll totally have to trust us that our sealed box is cheaper.
What is this? A blind wine tasting of videocards? Because if it is, it's fucking ridiculous IMO.
Carrying these cards around in vans doing dodgy stuff instead of just giving them to reviewers makes AMD look dodgy as hell.

And they chose Budapest... do they have any idea what the minimum wage in Hungary is? Why not Berlin, London, or Paris?
 
Unfortunately, even with the cooler mod, which kept the HBM gen2 temperature (or whatever it is that HWiNFO64 is reporting under that label) at a maximum of 57 °C during a B3D-suite re-run, the bandwidth figures did not improve. The duration of the individual tests is rather short in terms of heating up an actively cooled ASIC.
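
If anyone wants to check the thermal angle over a longer, sustained run, here is a rough logger sketch that samples the GPU temperature and the active memory state once a second while the suite runs in another window; a heat-soak problem would show up as a dip in mclk or a climb in temperature over time. (The Linux/amdgpu sysfs paths are my assumption; on Windows, HWiNFO64's own logging would do the same job.)

[code]
#!/usr/bin/env python3
# Hedged sketch: log temperature and active memory state to a CSV once a
# second. amdgpu sysfs layout is assumed; adjust card0 for your system.
import glob
import time

CARD = "/sys/class/drm/card0/device"
TEMP = glob.glob(f"{CARD}/hwmon/hwmon*/temp1_input")[0]

def current_mclk():
    # pp_dpm_mclk marks the active state with a trailing '*'.
    for line in open(f"{CARD}/pp_dpm_mclk"):
        if line.strip().endswith("*"):
            return line.split(":")[1].strip(" *\n")
    return "?"

with open("vega_thermal_log.csv", "w") as log:
    log.write("time_s,temp_c,mclk\n")
    t0 = time.time()
    while True:  # Ctrl+C to stop
        temp_c = int(open(TEMP).read()) / 1000  # millidegrees -> degC
        log.write(f"{time.time() - t0:.0f},{temp_c:.1f},{current_mclk()}\n")
        log.flush()
        time.sleep(1)
[/code]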
 
If AMD is using support for VESA's adaptive sync standard as leverage to sell Vega for a lower price/performance ratio than the competition, then shit's gonna hit the fan. [...]
On the plus side, it would be a win for consumers if this pushes NVIDIA to support VESA Adaptive Sync. I'd get a Volta in that case :LOL:
 
Carrying these cards around in vans doing dodgy stuff instead of just giving them to reviewers makes AMD look dodgy as hell.

One of the videos I saw said that they can't say much because they are still under embargo. I'm sure we'll see actual reviews and stuff from people closer to or just after launch @ Siggraph (30th?)

On the plus side, it would be a win for consumers if this pushes NVIDIA to support VESA Adaptive Sync. I'd get a Volta in that case :LOL:

Exactly. It would be great for everyone if Nvidia would support Adaptive Sync. Hopefully they'll be forced to, with HDMI 2.1 requiring VRR, though I'd need a new monitor for that :p
 
In the end, AMD's marketing keeps doing the strangest things. Zero info on the RX Vega cards, but now they're showing them around in totally random places, inside sealed boxes. [...]
AMD marketing has always been the worst part of the company. I thought it was fixed, because the Ryzen and Epyc presentations were pretty good in my opinion, but this whole thing with Vega shows they still have a lot of work to do if they want to step up and challenge the competition.

If Vega is bad, just admit it's bad and move on; don't waste money and people's time and patience just to make them realize it's bad in the end. Every day Vega's image gets worse, and it had better be good with that 40% perf improvement or people will get angry.
 
One of the videos I saw said that they can't say much because they are still under embargo. I'm sure we'll see actual reviews and stuff from people closer to or just after launch @ Siggraph (30th?)



Exactly. It would be great for everyone if Nvidia would support Adaptive Sync. Hopefully they'll be forced to, with HDMI 2.1 requiring VRR, though I'd need a new monitor for that :p
Is it an actual requirement? As in NVIDIA needs to support it to claim 2.1?

AMD marketing has always been the worst part of the company. [...]
It's obvious AMD and RTG have completely different marketing people.
That said, it's hard to do good marketing for a product with no redeeming qualities. With Ryzen, AMD had a winner, so marketing was much easier.
 
Is it an actual requirement? As in NVIDIA needs to support it to claim 2.1?

I think something like a Blu-ray player definitely doesn't have to support Game Mode VRR to claim HDMI 2.1 compliance, because it doesn't need to send anything at a variable refresh rate. So I guess it's perfectly possible that nvidia won't support VRR either and will keep pushing the G-Sync nonsense.
 
Is it an actual requirement? As in NVIDIA needs to support it to claim 2.1?


It's obvious AMD and RTG have completely different marketing people.
That said, it's hard to do good marketing for a product with no redeeming qualities. With Ryzen, AMD had a winner, so marketing was much easier.
Good marketing can make sh** shine like gold; bad marketing can make a good product invisible (oh, hi HTC). When you have bad marketing and a not-so-good product (it's not like Vega is the worst GPU ever made, either), you're just asking for negative numbers.

I think even the FX had better marketing and execution than Vega...

One thing I can't understand is all that "poor Volta" stuff in the [strike]forgotten video[/strike] video they released about a year ago... It's not like Vega was a performance champion back then and then got worse; they knew its performance. So unless something in the hardware was broken that they thought would give them a huge advantage, and they thought they could fix it in time but couldn't, I can't understand why you'd play the "I'll shit on you" card instead of the "damage control" one when you know you'll have an inferior product.
 
Too bad they advertised ~2x better perf/watt than Polaris back in March 2016 - https://www.extremetech.com/wp-content/uploads/2016/03/AMDGPU.jpg. Now they have a 300+W power hog on their hands.

Not comparable, but most retail overclocked 1080 Tis hit between 310 and 350W (I'm not comparing performance with it, as we'll see what RX Vega gives in that sense later)... The same goes for a lot of overclocked 1080s, which run way higher than the "standard version" and are often the ones used in reviews (250W+).

Of course, it changes from one review to another... but well.
[chart: peak power consumption]
The question there is not whether they're comparable performance-wise, but rather that passing 300W is not so "incredible". Of course, the problem is if it performs like a 1080 and needs 300W+.
 
Is it an actual requirement? As in NVIDIA needs to support it to claim 2.1?

AFAIK yes; unlike Adaptive-Sync, which is optional in the DP 1.2a spec, VRR is a requirement for HDMI 2.1 output.

http://www.hdmi.org/manufacturer/hdmi_2_1/index.aspx
Blu-ray player

Wouldn't those just stay as 1.4 or 2.0 or w/e?

Not comparable, but most retail overclocked 1080 Tis [...]

Well, even most 1080s use 210W+

[charts: average power consumption]
And yeah, that's before a custom OC; that's just out of the box.

All GPUs end up using a lot more power than people think when pushed above their clock-efficiency range. I have a feeling that RX Vega at 1600MHz will have similar power usage to an OC'd 1080 while performing similarly.
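
As a back-of-the-envelope illustration of why (the numbers below are made up to show the shape of the curve, not measurements): dynamic power goes roughly as P ~ C * V^2 * f, and voltage has to climb with frequency, so a modest clock bump costs disproportionate power.

[code]
# Illustrative only: scale a baseline board power by the CMOS
# dynamic-power relation P ~ C * V^2 * f.

def dynamic_power(p_base, f_base, v_base, f_new, v_new):
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# Made-up Vega-ish operating points (assumptions, not measurements):
p0, f0, v0 = 220.0, 1400.0, 1.00   # W, MHz, V at the efficient point
print(dynamic_power(p0, f0, v0, 1600.0, 1.15))  # ~333 W for +14% clocks
[/code]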
 
Well, the whole point of redesigning an architecture is exactly to improve its efficiency, because you just can't scale the same design to infinity. From what we know, in order to get the same performance as a 1080 (in the middle between reference and customs) it eats more than 400W... I haven't done the math, but how much would it eat to get to the 1080 Ti? 500W?

The problem with Vega is that it eats close to 300W while performing close to a 1070, which is at or below 200W.

TDP alone doesn't tell you the whole story; TDP vs. performance does, and in that aspect Vega is just a "poor Volta".
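
Quick napkin math with those numbers, though (my own assumptions; pretending power scales linearly with performance, which is a best case):

[code]
# ~400 W for GTX 1080-class performance, 1080 Ti roughly 30% faster
# (both figures are the assumptions from the post above).
vega_watts_at_1080_perf = 400.0
ti_speedup = 1.30

# Linear scaling is a floor; real scaling is worse once voltage climbs.
print(vega_watts_at_1080_perf * ti_speedup)  # 520.0 W
[/code]

So the 500W guess is, if anything, generous.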
 
I've seen that before and tried to look elsewhere as well; however, I've only ever found Game Mode VRR listed as a "feature", with nothing specifying it as an actual requirement.
You can't require support for variable refresh rate because any technology that has some kind of flashing behavior (like CRTs) can't do variable refresh rate by definition.
 
There's a CRT with an HDMI port? :oops:

ok I think we've had enough fun :LOL:
 