AMD Vega Hardware Reviews

DigitalFoundry tested the liquid-cooled version at 4K; it's basically equal to a GTX 1080.


http://www.eurogamer.net/articles/digitalfoundry-2017-amd-radeon-rx-vega-64-performance-preview

Note: COD Infinite Warfare's performance is bad on NV hardware because of a bug in the latest driver:
Issues and updates:
[Call of Duty Infinite Warfare] Major FPS drop in Call of Duty Infinite Warfare after updating to R384 drivers [1955894]
https://forums.geforce.com/default/...lay-driver-feedback-thread-released-7-24-17-/
 
Did any reviewers look at mining benchmarks?

I've been betting Vega is worse than the 1070 at mining, which would give AMD an opportunity to grab gamer share if 1070 shortages persist and AMD's pricing is right.
http://www.phoronix.com/scan.php?page=news_item&px=RX-Vega-OpenCL-First

Really bad results there: highest power consumption and the lowest performance (less than an RX 560), so obviously some driver work is still needed. He's compiling the kernel and drivers from Git as well. The Linux gaming results were good for open source, though. Still a work in progress, as nothing has hit the stable branches yet.
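For anyone who wants to poke at the OpenCL stack the way Phoronix does, here's a minimal device probe in C. This is just a sketch: it assumes an installed OpenCL ICD/SDK and trims all error handling.

/* Minimal OpenCL device probe, roughly what clinfo reports.
 * Sketch only: assumes an OpenCL ICD is installed, no error checks. */
#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platforms[8];
    cl_uint nplat = 0;
    clGetPlatformIDs(8, platforms, &nplat);

    for (cl_uint p = 0; p < nplat; ++p) {
        cl_device_id devs[8];
        cl_uint ndev = 0;
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8, devs, &ndev);

        for (cl_uint d = 0; d < ndev; ++d) {
            char name[256];
            cl_uint cus;
            cl_ulong mem;
            clGetDeviceInfo(devs[d], CL_DEVICE_NAME, sizeof name, name, NULL);
            clGetDeviceInfo(devs[d], CL_DEVICE_MAX_COMPUTE_UNITS,
                            sizeof cus, &cus, NULL);
            clGetDeviceInfo(devs[d], CL_DEVICE_GLOBAL_MEM_SIZE,
                            sizeof mem, &mem, NULL);
            printf("%s: %u CUs, %lu MiB\n", name, cus,
                   (unsigned long)(mem >> 20));
        }
    }
    return 0;
}

A Vega 64 should report 64 compute units here; if the in-development stack can't even enumerate the card properly, that alone would explain bad compute results.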

Are you really being serious?
Unless the initial results were wrong, yes.

Where are the facts? I don't see them, link please!
First there were the magic drivers, then DSBR, and now of course FP16. Yeah, just like async compute; we all know how well that one turned out (hint: not well).
Ok, but they were linked already.

http://www.hwbattle.com/bbs/board.php?bo_table=hottopic&wr_id=7333&ckattempt=2

Who said anything about "magic" drivers? No magic involved, just time to get everything working. Most people don't refer to drivers fixing bugs as "magic". At least not any avid gamer or developer with even the slightest understanding of how things work.

And yeah, async compute panned out, even if devs stumbled over it at first. It takes time to work around the performance hit, but it did improve performance, as demonstrated in every(?) DX12 title using it.

DSBR is enabled everywhere and doesn't require specific dev intervention, as some suggested; Rys confirmed that on Twitter. AMD likely still needs to tune the binning, though. It took Nvidia a while to get theirs right.

FP16 was announced by AMD when they first showed Vega, so I'm not sure why it would only "now" be becoming relevant. It's been there the entire time. All of those will make a difference; that part has never changed. And all of those happen to be in your "magic" drivers.
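To make the FP16 point concrete, this is what packed math looks like from the kernel side. A sketch in OpenCL C, assuming the cl_khr_fp16 extension is available; the source itself is generic, and it's the hardware that decides whether half2 ops dual-issue at double rate:

// Packed FP16 arithmetic in an OpenCL C kernel (sketch).
// Requires the cl_khr_fp16 extension. On hardware with rapid
// packed math (e.g. Vega), each half2 op retires two FP16
// operations per 32-bit lane, doubling theoretical throughput.
#pragma OPENCL EXTENSION cl_khr_fp16 : enable

__kernel void fma_half2(__global const half2 *a,
                        __global const half2 *b,
                        __global half2 *acc)
{
    size_t i = get_global_id(0);
    // One fma on a half2 performs two FP16 multiply-adds.
    acc[i] = fma(a[i], b[i], acc[i]);
}

The catch, of course, is that only shaders actually written (or rewritten) with FP16 types benefit, which is why it depends on dev adoption.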
 
In current titles, sure, but what about upcoming titles that actually use the new features? With no other changes, FP16 alone would make them equal.
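For rough napkin math on peak rates (boost clocks, which neither card sustains evenly, and only the FP16-friendly portion of a frame actually doubles):

\begin{align*}
\text{Vega 64, FP32:} \quad & 4096 \times 2 \times 1.55\,\mathrm{GHz} \approx 12.7\ \mathrm{TFLOPS}\\
\text{Vega 64, packed FP16:} \quad & 2 \times 12.7 \approx 25.4\ \mathrm{TFLOPS}\\
\text{GTX 1080 Ti, FP32:} \quad & 3584 \times 2 \times 1.58\,\mathrm{GHz} \approx 11.3\ \mathrm{TFLOPS}
\end{align*}

GP102 runs FP16 at a tiny fraction of its FP32 rate, so it stays on FP32 either way. On paper the gap closes, but real frames are nowhere near 100% FP16-limited.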
We'll see how Wolfenstein 2 runs; with both intrinsics and FP16 it could be Vega 10's best-case scenario. I still think reaching the 1080 Ti is too optimistic.

As some people so conveniently forget, Vega 10 is fighting a major manufacturing-process deficit compared to GP102 (>20% worse efficiency at lower frequencies, and more than that at higher ones).
 
As some people so conveniently forget, Vega 10 is fighting a major manufacturing-process deficit compared to GP102 (>20% worse efficiency at lower frequencies, and more than that at higher ones).

It's AMD's choice, no? Are they forced to use GloFo? If so, well, that's a bad contract for them. If not, a very bad choice. Either way, it's their fault.
 
Vega 56 looks decent: power draw is similar to my AIB 1070, as is performance, and it will likely pull ahead in a couple of months. The price is a bit iffy.

I think it's the bandwidth holding them back; it's just too low to be compensated for by the architectural improvements they made. A 14nm+ Vega could do a lot better than what the 580 managed if they can get 1.2 GHz or so out of the HBM.
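Quick bandwidth math, assuming "1.2 GHz or so" means the HBM2 clock (so 2.4 Gbps/pin with DDR) on the same 2048-bit bus:

\begin{align*}
\text{stock (945 MHz):} \quad & 2048 \times 1.89\,\mathrm{Gbps} \,/\, 8 \approx 484\ \mathrm{GB/s}\\
\text{at 1.2 GHz:} \quad & 2048 \times 2.4\,\mathrm{Gbps} \,/\, 8 \approx 614\ \mathrm{GB/s}
\end{align*}

That's roughly 27% more bandwidth, which is exactly the kind of jump that would matter if the card really is bandwidth-bound.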
 
We'll see how Wolfenstein 2 runs; with both intrinsics and FP16 it could be Vega 10's best-case scenario. I still think reaching the 1080 Ti is too optimistic.
If the Linux drivers are anything like the Windows ones, AMD has some lingering issues and a lot of tuning left. Outside of FP16 and intrinsics (SM6, or the Vulkan equivalent once released), I think simple tuning alone will do a lot. I'm still wondering if Vega is a TBDR architecture and that just isn't implemented yet, resulting in Vega's clocks and power being pushed. It makes sense with the synthetics Anandtech posted: high texture rates, low fillrate. Still need some good in-depth articles or whitepapers to clear up some design choices, namely where all the cache went, unless it took massive instruction and forwarding buffers to achieve the new clocks.
 
Well, all that speculation, and here we are again: yet another barely-follow-the-leader launch with a chip pushed outside its power-efficiency envelope. Woohoo. Vega is spectacularly over- and under-engineered.

I'm sure the drivers are expected to save the day.
 
Stop what? Just because you don't like the facts doesn't make them wrong. What I said is indicated in the first review linked, and it doesn't take a genius to reach a similar conclusion, or at least not to reject the evidence.
You are certainly entitled to your opinion, but your posts are misleading at best and it makes the whole site look bad IMO. I don't think anyone here is willing to argue with you any more since your position is so hysterically biased and unreasonable. I find myself almost liking your posts just so I can unlike them.
 
I think if AMD were to do four stacks again, they'd need a bigger chip and interposer and would face the same or even worse problems than Fiji, HBM2 stacks being a bit bigger; that's why last year's leak of the 7nm Vega 20 had four stacks. It looks like AMD have scrapped those plans and are going for 14nm+, which would have the same problem.
 
From a materials-science perspective, the aluminum housing should dissipate a bit more heat.
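For the conduction part of that claim, with illustrative numbers only (equal wall area A, thickness d, and temperature difference assumed for both materials):

\[
Q = \frac{k\,A\,\Delta T}{d}, \qquad k_{\mathrm{Al}} \approx 205\ \mathrm{W/(m\cdot K)} \quad \text{vs.} \quad k_{\mathrm{ABS}} \approx 0.2\ \mathrm{W/(m\cdot K)}
\]

The aluminum wall conducts orders of magnitude better, so the wall itself is never the bottleneck; in practice surface-to-air convection is the limit, which is why the difference is only "a bit".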
 