AMD Vega Hardware Reviews

Well, that is what all the debate is about, isn't it? Just expanding Polaris to 64 CUs and 512-bit GDDR5, running at 1400 MHz, would be a bit smaller than Vega, wouldn't draw more power, and would perform much better than what we have seen so far.
Really? The RX 580 at 1400 MHz consumes ~210 watts in games (ComputerBase.de), so Polaris with 64 CUs at 1400 MHz would need over 400 watts for typical gaming, and likely 450-500 watts for heavy workloads. As we saw with the HD 7870 vs. 7970, or the R9 290X vs. Fury, performance of current GCN implementations doesn't scale well with the number of CUs. Also, how would AMD get into the HPC market with Polaris, which doesn't support fast Int8, FP16, or FP64?
 
Really? The RX 580 at 1400 MHz consumes ~210 watts in games (ComputerBase.de), so Polaris with 64 CUs at 1400 MHz would need over 400 watts for typical gaming, and likely 450-500 watts for heavy workloads. As we saw with the HD 7870 vs. 7970, or the R9 290X vs. Fury, performance of current GCN implementations doesn't scale well with the number of CUs. Also, how would AMD get into the HPC market with Polaris, which doesn't support fast Int8, FP16, or FP64?
Sorry, brain glitch. I calculated with a straight doubling of Polaris all around, giving 72 CUs, and then maintained clocks, letting the additional CUs compensate for the clock deficit (with a bit of rounding - we all know that small changes to frequency and voltage on this part of the curve make for large differences, where P is roughly O(f^3)).
My main point stands though - what we have seen so far of Vega doesn't impress much given both complexity and time to market. But rather than assume that this is the whole story, I'd like to have more information, preferably straight from AMD, regarding the architecture and product. They have been too quiet.
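The back-of-envelope in the posts above can be sketched numerically. This is only a toy model built on the thread's own assumptions (~210 W at 1400 MHz for the 36-CU RX 580, power linear in active CU count at fixed clocks, and roughly cubic in frequency on this part of the voltage curve); it ignores memory, uncore, and binning, so treat the outputs as rough orders of magnitude, not predictions.

```python
# Back-of-envelope: scale a 36-CU Polaris (RX 580) to a hypothetical big die.
# Assumptions come from the posts above, not from measurements: ~210 W at
# 1400 MHz for the RX 580, power ~linear in CU count at fixed clocks, and
# P ~ O(f^3) when clocks/voltage move along the upper end of the curve.

RX580_CUS = 36
RX580_POWER_W = 210.0
RX580_CLOCK_MHZ = 1400.0

def scaled_power(cus, clock_mhz):
    """Estimate power for a Polaris-like chip with `cus` CUs at `clock_mhz`."""
    cu_factor = cus / RX580_CUS                         # linear in active CUs
    clock_factor = (clock_mhz / RX580_CLOCK_MHZ) ** 3   # cubic in frequency
    return RX580_POWER_W * cu_factor * clock_factor

# 64 CUs at the same 1400 MHz: roughly 64/36 * 210 W
print(round(scaled_power(64, 1400)))             # ~373 W (core only)

# 72 CUs ("straight doubling") clocked down so throughput roughly matches
# 64 CUs at 1400 MHz: f = 1400 * 64/72 ~= 1244 MHz
print(round(scaled_power(72, 1400 * 64 / 72)))   # ~295 W
```

The gap between the two estimates is the whole point of the 72-CU argument: trading clocks for width moves you down the cubic part of the power curve.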
 
Anyone know what % of AMD's revenues are GPU-related? With an earnings update pending and a $14 share price, I'd wager the Polaris silence is more about share-price damage control. Lots of Threadripper, easy on the GPU PR.

Sure would've been nice to see something competitive. I read an article this week about how Ryzen revealed just how much Intel was fleecing customers, and the idea was that Vega would do the same (but to Nvidia), since the 1000 series is a small die, etc. Seems Vega can't possibly be competitive based on size and power, though, so no downward price pressure on the high end.
 
Well, that is what all the debate is about, isn't it? Just expanding Polaris to 64 CUs and 512-bit GDDR5, running at 1400 MHz, would be a bit smaller than Vega, wouldn't draw more power, and would perform much better than what we have seen so far. So why Vega, and more than a year later?
I'm not comfortable with the assumption that AMD's graphics folks are totally incompetent, however. I hope AMD introduces Vega properly, explaining their design goals, the features of the new design, et cetera, giving us something better than just a few puzzling benchmark scores.
The thought that AMD, perhaps as a cost-cutting measure, got rid of their actually good GPU people and replaced them with B-tier staff terrifies me. That would spell disaster for competition in the market.

Zen moments don't come often, but at this rate that's what it would take from Navi to rebound, if Vega turns out the way I fear.
 
It wouldn't. All I'm saying is that it'd be dumb for HDMI to require devices to support variable refresh rate in order to get certified.

Gotcha. Although older HDMI specs would still be grandfathered under their old certs, no? Kinda like PSU efficiency ratings.
 
The thought that AMD, perhaps as a cost-cutting measure, got rid of their actually good GPU people and replaced them with B-tier staff terrifies me. That would spell disaster for competition in the market.
This never happened, though there has been a lot of turnover in the last 5+ years as new players entered, or tried to enter, the market: Qualcomm, Apple, Samsung, etc.
 
Well, that is what all the debate is about, isn't it? Just expanding Polaris to 64 CUs and 512-bit GDDR5, running at 1400 MHz, would be a bit smaller than Vega, wouldn't draw more power, and would perform much better than what we have seen so far. So why Vega, and more than a year later?
I'm not comfortable with the assumption that AMD's graphics folks are totally incompetent, however. I hope AMD introduces Vega properly, explaining their design goals, the features of the new design, et cetera, giving us something better than just a few puzzling benchmark scores.


Can Polaris scale to 64 CUs?
 
Anyone know what % of AMD's revenues are GPU-related? With an earnings update pending and a $14 share price, I'd wager the Polaris silence is more about share-price damage control. Lots of Threadripper, easy on the GPU PR.

Sure would've been nice to see something competitive. I read an article this week about how Ryzen revealed just how much Intel was fleecing customers, and the idea was that Vega would do the same (but to Nvidia), since the 1000 series is a small die, etc. Seems Vega can't possibly be competitive based on size and power, though, so no downward price pressure on the high end.

The "chip X has 35% better per-wafer yield than chip Y, so it should be 35% cheaper!" analysis is cringeworthy, but it gets repeated over and over with regard to multiple semiconductor products. At the design complexities we are currently dealing with, anything that appears "easy to manufacture" had hundreds of millions, if not more, invested on the front end specifically to make it so. Throw in short product cycles, and the amount of R&D that needs to be amortized per chip is substantial, but it gets lost when people focus solely on per-wafer manufacturing costs and yields.
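The amortization point can be made concrete with a toy cost model. Every number below is invented purely for illustration (wafer cost, die count, yields, NRE, and volumes are not real figures for any product): per-chip cost is manufacturing cost plus up-front R&D spread over lifetime volume, and only the first term moves with yield.

```python
# Toy model of per-chip cost: manufacturing + amortized up-front R&D (NRE).
# All numbers below are made up for illustration, not real figures.

def per_chip_cost(wafer_cost, dies_per_wafer, yield_rate, nre, lifetime_volume):
    manufacturing = wafer_cost / (dies_per_wafer * yield_rate)  # $ per good die
    amortized_rd = nre / lifetime_volume                        # $ of R&D per chip
    return manufacturing + amortized_rd

# Chip Y: baseline yield
y = per_chip_cost(wafer_cost=8000, dies_per_wafer=200, yield_rate=0.60,
                  nre=300e6, lifetime_volume=5e6)
# Chip X: 35% better per-wafer yield, everything else identical
x = per_chip_cost(8000, 200, 0.60 * 1.35, 300e6, 5e6)

print(round(y, 2), round(x, 2))
# Manufacturing cost per good die drops ~26%, but the $60 of amortized
# R&D per chip doesn't move, so total cost falls far less than 35%.
```

With these made-up inputs the total per-chip cost drops by roughly 14%, not 35%, which is the whole objection to the naive yield-equals-price argument.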
 
The "chip X has 35% better per-wafer yield than chip Y, so it should be 35% cheaper!" analysis is cringeworthy, but it gets repeated over and over with regard to multiple semiconductor products. At the design complexities we are currently dealing with, anything that appears "easy to manufacture" had hundreds of millions, if not more, invested on the front end specifically to make it so. Throw in short product cycles, and the amount of R&D that needs to be amortized per chip is substantial, but it gets lost when people focus solely on per-wafer manufacturing costs and yields.

And yet, there's no way for Wall Street to comprehend anything more complex...
 
Anyone know what % of AMD's revenues are GPU-related? With an earnings update pending and a $14 share price, I'd wager the Polaris silence is more about share-price damage control. Lots of Threadripper, easy on the GPU PR.
How bad can the report be if Ryzen appears to have been well received and all the Polaris cards (along with practically every other generation) have been sold out thanks to mining, at potentially inflated prices?

Can Polaris scale to 64 CUs?
The XB1X is approximately Polaris with 40 CUs, and we've seen a 64-CU Fiji, so there's no reason to think otherwise.
 
How bad can the report be if Ryzen appears to have been well received and all the Polaris cards (along with practically every other generation) have been sold out thanks to mining, at potentially inflated prices?
That's what I'm wondering about. Even if Vega isn't up to Nvidia's best, can you really call it a failure if they can't keep cards on the shelves?

I know from a gaming perspective, yes and sure, with lots of room to argue... but from a financial perspective, aren't they mostly concerned about selling units? If the current mining craze continues they shouldn't have problems for the foreseeable future, so why all the doom-and-gloom talk?
 
I know from a gaming perspective, yes and sure, with lots of room to argue... but from a financial perspective, aren't they mostly concerned about selling units? If the current mining craze continues they shouldn't have problems for the foreseeable future, so why all the doom-and-gloom talk?
Because we generally want AMD to succeed on the merits of an excellent design, not by selling units solely to a market the product isn't designed for, one that only exists on the flimsiest of ideals.
 
Because we generally want AMD to succeed on the merits of an excellent design, not by selling units solely to a market the product isn't designed for, one that only exists on the flimsiest of ideals.
No argument from me on that one, but again, from a business perspective, why does everyone seem so doom-and-gloom? I really don't get that one, I guess.
 
I know from a gaming perspective, yes and sure, with lots of room to argue... but from a financial perspective, aren't they mostly concerned about selling units? If the current mining craze continues they shouldn't have problems for the foreseeable future, so why all the doom-and-gloom talk?
I have no idea about the doom and gloom, as the CPU side alone should be able to carry them if they make gains there. That seems to have been the goal when they cut high-end Polaris, and possibly low-end Vega, to save cash. My only guess is that the FE doesn't look to have good gaming performance, but we're also yet to see any valid gaming results with all the features enabled.

Because we generally want AMD to succeed on the merits of an excellent design, not by selling units solely to a market the product isn't designed for, one that only exists on the flimsiest of ideals.
Agreed, but it's too early to gauge success on the merits. We do, however, know that retailers are selling a lot of cards to miners, and it's hard to see a financial downside to that in the short term.
 
Because we generally want AMD to succeed on the merits of an excellent design, not by selling units solely to a market the product isn't designed for, one that only exists on the flimsiest of ideals.

Not only that, but the newer mining code is CUDA-optimized now, and GTX 1070s beat the RX 580 on both hashrate and power consumption. AMD cannot count on the mining demographic.
 
No argument from me on that one, but again, from a business perspective, why does everyone seem so doom-and-gloom? I really don't get that one, I guess.

Distributors pick up all the extra margin from the inflated prices, and ramping up manufacturing takes time and puts them at risk of being stuck with a pile of unmovable product that can't compete with dirt-cheap miner cards flooding the second-hand market once the bubble pops.

We've been here before.
 
Not only that, but the newer mining code is CUDA-optimized now, and GTX 1070s beat the RX 580 on both hashrate and power consumption. AMD cannot count on the mining demographic.

So a ~$250 MSRP card is beaten by a $350 one? That's hardly an issue; the demand is (was) there for both vendors' products to go out of stock.
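The price-class argument boils down to hashrate per dollar rather than absolute hashrate. The quick check below uses assumed figures (the ~27 and ~30 MH/s hashrates are illustrative, not benchmarks; prices are the MSRPs quoted above), so it only demonstrates the shape of the argument.

```python
# Hashrate-per-dollar comparison with assumed (not measured) figures:
# the argument above is about price class, so hashrate / price is what matters.

cards = {
    # name: (assumed hashrate in MH/s, MSRP-ish price in $, as quoted above)
    "RX 580":   (27.0, 250),
    "GTX 1070": (30.0, 350),
}

for name, (mh_s, price) in cards.items():
    print(f"{name}: {mh_s / price:.3f} MH/s per $")
```

Under these assumptions the cheaper card still wins on MH/s per dollar even with a lower absolute hashrate, which is why both kept selling out.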
 
If you are a miner and have to buy those cards, at the moment it's rather a question of availability: a smaller profit in the short term vs. no profit in the short term at all, depending on which make you can get your hands on.
 