When will Third Party Vega Boards be available?

Discussion in 'Architecture and Products' started by Grall, Nov 26, 2017.


When do you think Third Party Vega boards will be available?

  1. Before Christmas 2017

    20.0%
  2. Before New Year's 2017

    0 vote(s)
    0.0%
  3. Before end of January 2018

    12.0%
  4. Before end of First Quarter 2018

    8.0%
  5. Before end of Second Quarter 2018

    0 vote(s)
    0.0%
  6. After Heat Death of the Universe [AKA Never]

    52.0%
  7. Some Other Time

    8.0%
  1. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,780
    Likes Received:
    4,431
    They're less noisy when people overclock Vega cards to near death.

    I guess that counts a lot for today's enthusiasts.
     
  2. DrYesterday

    Newcomer

    Joined:
    Jan 8, 2013
    Messages:
    32
    Likes Received:
    18
    A couple of things to think about (assuming AMD is capacity limited):
    • AMD priced Ryzen slightly under Intel chips at the start. Since then, AMD cut prices both in June and October. Why cut prices when you are already selling everything you can make?
    • Chips take about 6 months from starting the wafer to selling finished product. AMD chose the amount of Q1 Vega wafer starts (to be sold in Q3) just as Ryzen launched, before AMD had any idea how popular Ryzen would be.
    I'm leaning towards assuming that AMD is not capacity limited. Most of the evidence I see doesn't support it.
     
  3. Bondrewd

    Regular Newcomer

    Joined:
    Sep 16, 2017
    Messages:
    507
    Likes Received:
    231
    The faster you clean out the inventory, the better.
    Intel made mobo manufacturers really pissed with the "Coffee Lake launching a quarter earlier" hat trick.
     
  4. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,780
    Likes Received:
    4,431
    They're obviously not selling everything they make very fast, because there's stock everywhere. It doesn't mean they're not selling very well, as we've been seeing Ryzens (especially the R5 1600) among the best-selling CPUs in Amazon listings.

    They're cutting prices to try to flood the market as much as they can before Intel can ship large quantities of the 6-core Coffee Lake models which will significantly reduce AMD's multi-threaded advantage while keeping the same single-threading advantage they already have with Kaby Lake.


    Fixed that for you.
     
    Lightman likes this.
  5. Bondrewd

    Regular Newcomer

    Joined:
    Sep 16, 2017
    Messages:
    507
    Likes Received:
    231
    Whatever, it still means they have excess inventory of 200-series mobos.
    At least AMD was smart enough to throw some aggressive deals on the Ryzen 1000 series before launching the 2000 series.
     
  6. rcf

    rcf
    Regular Newcomer

    Joined:
    Nov 6, 2013
    Messages:
    370
    Likes Received:
    302
  7. Anarchist4000

    Veteran Regular

    Joined:
    May 8, 2004
    Messages:
    1,439
    Likes Received:
    359
    I still think Crimson Magic would have been better.

    Binning distribution issue as AMD used the same silicon in the entire lineup. Getting more Epycs could mean flooding the market with Ryzens while being capacity limited. Someone must have ordered a crazy number of Epycs or yields are iffy for fully scaled chips though.
     
    Grall likes this.
  8. Bondrewd

    Regular Newcomer

    Joined:
    Sep 16, 2017
    Messages:
    507
    Likes Received:
    231
    Crimson FineWine?
    Gotta go all-in with memes.
    EPYC is a different silicon, being ZP-B2.
    Something is very, very off; just look at that German review.
     
    el etro likes this.
  9. Anarchist4000

    Veteran Regular

    Joined:
    May 8, 2004
    Messages:
    1,439
    Likes Received:
    359
    Just a newer revision of the same thing, from everything I've read. Ryzen's the initial B1, but at some point they'd catch up, as the features should be the same. Just various errata fixed.
     
  10. Bondrewd

    Regular Newcomer

    Joined:
    Sep 16, 2017
    Messages:
    507
    Likes Received:
    231
    That's still a new silicon.
    That's why EPYC has been missing for so long.
    Lisa said something something cloud wins before the end of the year recently.
    Anyway, that's unrelated to Vega.
    Besides, where's Redux?
    Maybe some teasers.
    Or anything.
     
    #30 Bondrewd, Nov 29, 2017
    Last edited: Nov 29, 2017
  11. rcf

    rcf
    Regular Newcomer

    Joined:
    Nov 6, 2013
    Messages:
    370
    Likes Received:
    302
    The computerbase.de review puts Vega64 Red Devil at 285W minimum (silent mode) while the 1080 FE consumes 157W.
    So it looks like overclock3d's measurements were correct after all.
    The disparity in power efficiency between AMD and Nvidia is shocking.
     
    DavidGraham and Grall like this.
  12. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,780
    Likes Received:
    4,431
    My system has a 120W 10-core Ivy Bridge Xeon with 8 DDR3 modules, a bunch of SSDs and HDDs, an AiO water cooler on the CPU, and a vanilla Vega 64.

    During gaming in power saving mode, my system pulls 350-360W at the wall.
    The GPU is probably pulling 220W or less.
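
    The estimate above can be sanity-checked with some back-of-envelope arithmetic: take the wall reading, account for PSU efficiency, and subtract the rest of the system. All specific figures below (rest-of-system draw, PSU efficiency) are illustrative assumptions, not measurements from the post.

    ```python
    # Rough estimate of GPU board power from a wall-socket reading.
    # Assumed numbers are marked; only the ~355W wall figure comes from the post.

    def estimate_gpu_power(wall_watts, rest_of_system_watts, psu_efficiency=0.90):
        """Convert wall (AC) power to DC power delivered by the PSU,
        then subtract the assumed rest-of-system draw."""
        dc_total = wall_watts * psu_efficiency  # power the PSU actually delivers
        return dc_total - rest_of_system_watts  # what's left for the GPU

    # Assumptions: ~355 W at the wall, ~100 W for CPU/RAM/drives/fans under
    # gaming load, and a 90%-efficient PSU.
    gpu_estimate = estimate_gpu_power(355, 100)
    print(round(gpu_estimate))  # prints 220
    ```

    With these (assumed) inputs the arithmetic lands right on the ~220W figure, though the result is obviously sensitive to the rest-of-system guess.
    
    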


    It's the OEMs' fault that they decided to create a power hog just to appear 1% higher in some charts.
     
    Lightman and Grall like this.
  13. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,171
    Location:
    La-la land
    However, a GF1080 would only pull like 185-ish watts, or something like that, and still be faster in many/most situations than Vega at full-speed, near-300W power levels. So the chip does have issues with power consumption for one reason or another, be it silicon-process related, architecture, bugs/errata or whatever.

    Still, it's good that 3rd party custom boards are finally starting to appear by the looks of it. Just want that ASUS board to come out, dammit... :p
     
  14. Bondrewd

    Regular Newcomer

    Joined:
    Sep 16, 2017
    Messages:
    507
    Likes Received:
    231
    It's simply bigger.
    Much, much bigger.
    Like 1.5k ALUs more.
     
  15. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,171
    Location:
    La-la land
    @Bondrewd
    It's bigger yes, but all those extra ALUs aren't doing MORE work, they're doing LESS, in most every situation, even a bunch of compute-only cases.
     
    pharma and el etro like this.
  16. el etro

    Newcomer

    Joined:
    Mar 9, 2014
    Messages:
    95
    Likes Received:
    12
    For sure. In practice we have a six-year-old architecture that begs for a real revamp. But I disagree on the power measurements of Nvidia vs. AMD cards; I really don't think the GTX 1080 consumes just ~150W.

     
  17. CSI PC

    Veteran Newcomer

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    Hmm, so overclock3d do not retest previous cards such as the 1080 FE and earlier custom models?
    Not sure how useful it is to include cards that weren't retested; I'm looking at the 1070/1070 Ti FE/1080 FE for that conclusion, but in context this applies to all manufacturers. Even custom GTX 1080s come in lower than the 1070 Ti FE, and it's not memory related, since the 1070 FE serves as a baseline.
    Ah well, at least one can see a rough guideline of historical (driver/patch) improvements whenever a newly tested model version is similar to a previously tested one.
     
  18. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,780
    Likes Received:
    4,431
  19. CSI PC

    Veteran Newcomer

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    Yeah, that is a pretty reasonable figure you give for the GF1080.
    Tom's Hardware, measuring the isolated GPU with a scope, got 173W in normal operation and an average of 206W when OC'd to 2.1GHz (there's no point drawing conclusions from instantaneous burst-cycle peaks and dips, which is something a lot of people misunderstood when looking at the scope results).
    And that is a worst-case scenario at 4K resolution with what they feel is the most power-demanding game they have found; the same methodology applies to all the GPUs they test, as it does at PCPerspective (who also use a scope and isolate the GPU).
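    The average-versus-peak point above is worth making concrete: a scope captures millisecond-scale spikes that say little about sustained draw. A toy sketch, with an entirely made-up sample trace (none of these values are from any review):

    ```python
    # Synthetic per-sample power trace (watts); values are illustrative only.
    # Two short spikes (310, 290) sit on top of a ~170W sustained baseline.
    samples = [170, 172, 310, 168, 175, 290, 171, 174, 169, 173]

    average = sum(samples) / len(samples)  # sustained draw: what matters
    peak = max(samples)                    # instantaneous burst: mostly noise

    print(f"average: {average:.0f} W, peak: {peak} W")  # prints "average: 197 W, peak: 310 W"
    ```

    The average is the number relevant to efficiency comparisons and PSU sizing; reading the instantaneous peak off the scope trace nearly doubles the apparent draw here.
    
    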
     
    #39 CSI PC, Nov 30, 2017
    Last edited: Nov 30, 2017
  20. Bondrewd

    Regular Newcomer

    Joined:
    Sep 16, 2017
    Messages:
    507
    Likes Received:
    231