When will Third Party Vega Boards be available?

When do you think Third Party Vega boards will be available?

  • Before New Year's 2017
    Votes: 0 (0.0%)
  • Before end of Second Quarter 2018
    Votes: 0 (0.0%)
  • Total voters: 25
lol 3rd parties have got to be asking themselves "why are we making these?". Terrible results for that Red Devil, worse than reference.

They're less noisy when people overclock Vega cards to near death.

I guess that counts a lot for today's enthusiasts.
 
All in all, AMD probably did not anticipate Ryzen selling this well.

A couple of things to think about (assuming AMD is capacity limited):
  • AMD priced Ryzen slightly under Intel chips at the start. Since then, AMD cut prices both in June and October. Why cut prices when you are already selling everything you can make?
  • Chips take about 6 months from starting the wafer to selling finished product. AMD chose the number of Q1 Vega wafer starts (to be sold in Q3) just as Ryzen launched, before AMD had any idea how popular Ryzen would be.
I'm leaning towards assuming that AMD is not capacity limited. Most of the evidence I see doesn't support it.
 
Since then, AMD cut prices both in June and October. Why cut prices when you are already selling everything you can make?

They're obviously not selling everything they make all that fast, because there's stock everywhere. That doesn't mean they're not selling very well, as we've been seeing Ryzen chips (especially the R5 1600) among the best-selling CPUs in Amazon listings.

They're cutting prices to try to flood the market as much as they can before Intel can ship large quantities of the 6-core Coffee Lake models, which will significantly reduce AMD's multi-threaded advantage while Intel keeps the same single-threaded advantage it already has with Kaby Lake.


Intel made mobo manufacturers really pissed with the "Coffee Lake paper-launching a quarter earlier" hat trick.

Fixed that for you.
 
Regarding overclock3d's Vega 64 Red Devil review:

[image: power consumption chart from overclock3d's review]


Is that correct?
A PC with a single custom Vega64 card uses slightly more power than a PC with two 1080s?
 
And Crimson Redux, yeah. I'm waiting for this baby, oh man. :D Here's hoping it'll scare some hidden horsepower out of Vega...
I still think Crimson Magic would have been better.

AMD priced Ryzen slightly under Intel chips at the start. Since then, AMD cut prices both in June and October. Why cut prices when you are already selling everything you can make?
Binning distribution issue as AMD used the same silicon in the entire lineup. Getting more Epycs could mean flooding the market with Ryzens while being capacity limited. Someone must have ordered a crazy number of Epycs or yields are iffy for fully scaled chips though.
 
I still think Crimson Magic would have been better.
Crimson FineWine?
Gotta go all-in with memes.
Binning distribution issue as AMD used the same silicon in the entire lineup. Getting more Epycs could mean flooding the market with Ryzens while being capacity limited. Someone must have ordered a crazy number of Epycs or yields are iffy for fully scaled chips though.
EPYC is a different silicon, being ZP-B2.
Regarding overclock3d's Vega 64 Red Devil review:

[image: power consumption chart from overclock3d's review]


Is that correct?
A PC with a single custom Vega64 card uses slightly more power than a PC with two 1080s?
Something is very, very off; just look at that German review.
 
Just a newer revision of the same thing, from everything I've read. Ryzen's the initial B1, but at some point they'd catch up, as the features should be the same. Just various errata fixed.
That's still a new silicon.
That's why EPYC has been missing for so long.
Lisa said something something cloud wins before the end of the year recently.
Anyway, that's unrelated to Vega.
Besides, where's Redux?
Maybe some teasers.
Or anything.
 
The computerbase.de review puts the Vega64 Red Devil at 285W minimum (silent mode), while the 1080 FE consumes 157W.
So it looks like overclock3d's measurements were correct after all.
The disparity in power efficiency between AMD and Nvidia is shocking.

My system has a 120W 10-core Ivy Bridge Xeon with 8 DDR3 modules, a bunch of SSDs and HDDs, an AiO water cooler on the CPU, and a vanilla Vega 64.

During gaming in power saving mode, my system pulls 350-360W at the wall.
The GPU is probably pulling 220W or less.
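
For anyone who wants to sanity-check that estimate, here's a minimal back-of-the-envelope sketch (Python). The PSU efficiency and the non-GPU component draws are my own assumptions for illustration, not measured values:

```python
# Rough GPU power estimate from a wall-socket reading.
# Every component figure below is an assumption for illustration,
# not a measurement from the system described above.

wall_draw_w = 360          # upper end of the 350-360W wall reading
psu_efficiency = 0.90      # assumed 80 Plus Gold-ish efficiency at this load

# Assumed draw of everything that isn't the GPU:
cpu_w = 100                # 120W-TDP Xeon, not fully loaded while gaming
rest_of_system_w = 35      # 8 DDR3 modules, SSDs/HDDs, AiO pump and fans

dc_power_w = wall_draw_w * psu_efficiency   # power the PSU actually delivers
gpu_estimate_w = dc_power_w - cpu_w - rest_of_system_w

print(f"DC side: {dc_power_w:.0f}W, GPU estimate: {gpu_estimate_w:.0f}W")
# -> DC side: 324W, GPU estimate: 189W (consistent with "220W or less")
```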


It's the OEM's fault that they decided to create a power hog just to appear 1% higher in some charts.
 
During gaming in power saving mode, my system pulls 350-360W at the wall.
The GPU is probably pulling 220W or less.
However, a GF1080 would only pull like 185ish watts, or something like that, and still be faster in many/most situations than Vega at full-speed, near-300W power levels. So the chip does have issues with power consumption for one reason or another, be it silicon process-related, architecture, bugs/errata or whatever.
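
To put rough numbers on that gap, here's a minimal perf-per-watt sketch (Python). The power figures are the ballpark numbers from this thread; the relative performance factor is a placeholder assumption, not a benchmark result:

```python
# Crude perf-per-watt comparison: GTX 1080 vs. an overclocked Vega 64.
# Power numbers are the ballpark figures from this thread; the performance
# ratio is an illustrative assumption, not a measured benchmark.

gtx1080_power_w = 185
vega64_oc_power_w = 295          # the "near-300W" full-speed figure above
perf_ratio_1080_vs_vega = 1.05   # assume the 1080 is ~5% faster (placeholder)

perf_per_watt = perf_ratio_1080_vs_vega * (vega64_oc_power_w / gtx1080_power_w)
print(f"GTX 1080 does ~{perf_per_watt:.2f}x the work per watt")
# -> ~1.67x; even at equal performance, the power gap alone would be ~1.6x
```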

Still, it's good that 3rd party custom boards are finally starting to appear by the looks of it. Just want that ASUS board to come out, dammit... :p
 
@Bondrewd
It's bigger, yes, but all those extra ALUs aren't doing MORE work, they're doing LESS, in almost every situation, even in a bunch of compute-only cases.

For sure. In practice we have a six-year-old architecture that begs for a real revamp. But I disagree on the power measurements of Nvidia vs. AMD cards. I really don't think the GTX 1080 consumes just ~150W.

 
Hmm, so overclock3d does not retest previous cards such as the 1080 FE and earlier custom models?
Not sure how useful it is to include cards that weren't retested; I'm looking at the 1070/1070 Ti FE/1080 FE for that conclusion, but in context this applies to all manufacturers: even custom GTX 1080s measure lower than the 1070 Ti FE, and it's not memory related, given the 1070 FE as a baseline.
Ah well, at least one can see a rough guideline of historical driver/patch improvements when a newly tested model is similar to a previously tested one.
 
However, a GF1080 would only pull like 185ish watts, or something like that, and still be faster in many/most situations than Vega at full-speed, near-300W power levels. So the chip does have issues with power consumption for one reason or another, be it silicon process-related, architecture, bugs/errata or whatever.

Still, it's good that 3rd party custom boards are finally starting to appear by the looks of it. Just want that ASUS board to come out, dammit... :p

Yeah, that is a pretty reasonable figure you give for the GF1080.
Tom's Hardware, with a scope and an isolated GPU, measured 173W in normal operation, and OC'd to 2.1GHz it hits an average of 206W (there's no point looking at instantaneous burst-cycle peaks and dips for a conclusion, which is something a lot of people misunderstood when looking at the scope results).
Now, that is a worst-case scenario at 4K resolution with what they feel is the most power-demanding game they have found, which tbh also applies to their measurements for all the GPUs they test, and also to PCPerspective (who also use a scope and isolate the GPUs).
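
The point about averaging vs. instantaneous peaks is easy to illustrate. Here's a minimal sketch with a made-up power trace; the samples are invented for illustration and are not Tom's Hardware data:

```python
# Why average power, not scope peaks, is the number that matters.
# This trace is fabricated for illustration; it is not Tom's Hardware data.

# Millisecond-scale samples of GPU board power (watts): mostly ~200W, with
# short transient spikes of the kind a scope resolves but a PSU barely notices.
samples_w = [198, 205, 202, 310, 199, 204, 201, 285, 200, 206]

average_w = sum(samples_w) / len(samples_w)
peak_w = max(samples_w)

print(f"average: {average_w:.0f}W, peak: {peak_w}W")
# -> average: 221W, peak: 310W; judging the card by the 310W spike would
#    overstate its sustained draw by roughly 40%.
```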
 