AMD: Speculation, Rumors, and Discussion (Archive)

Discussion in 'Architecture and Products' started by iMacmatician, Mar 30, 2015.

Thread Status:
Not open for further replies.
  1. Arzachel

    Newcomer

    Joined:
    Jul 23, 2013
    Messages:
    28
    Likes Received:
    22
    Haha, touché. I am slightly hopeful, since they're using Samsung's R&D.
     
  2. SimBy

    Regular

    Joined:
    Jun 21, 2008
    Messages:
    700
    Likes Received:
    391
In an ideal world you would release GPUs for all market segments at the same time. Since it's not an ideal world, they decided to prioritize the mainstream, where most of the volume is. Supposedly a smaller Vega is coming in October, but I wouldn't count on it.
     
  3. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
Hard to say if it was planned like that from the start, or if they initially believed HBM2 would be available sooner and had planned to release a "high end" Polaris with HBM2, or Vega and Polaris in a shorter time frame.

IF (and that's a big IF) GF 14nm can clock as high as the 1080, I don't really see any reason not to release a slightly bigger chip first (based on Vega or Polaris) with high clocks, as a gaming part. But if they had planned to use HBM2 only, well, that could explain the delay.
     
  4. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,245
    Likes Received:
    4,465
    Location:
    Finland
    The current rumours suggest Vega will be pushed out early, in October
     
  5. CSI PC

    Veteran

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    You touched on an aspect that concerns me from a business perspective.
You rightly mention AMD is, in effect, starting a price war at a time when the next die shrink is not expected to provide the same level of gains recently seen.
So what are the repercussions if a business sells its best, most improved product at a discounted price, one that will also compete fairly well against next-generation products?
You end up caught in a cycle of depressed prices: great for consumers, but not great for a manufacturing business.
Sure, they may shift a fair amount over the next few quarters, but this is offset against what could be achieved with higher margins; the real downside is that it acts as a price anchor.

History has shown this: AMD tried to raise the prices of the 390/390X products at launch compared to the heavily discounted earlier models near end of life.
And this situation could get nasty if Nvidia responds with their own competitive price corrections (not talking about the 1% enthusiast cards); unlikely, but if AMD sells well they will adjust the 1070 and lower cards IMO.
So future technologies need greater levels of investment in R&D to gain more performance and efficiency on the next 10nmFF node, potentially hurting the business even more if the above scenario plays out to some extent.
This would be applicable to both companies.
    This would be applicable to both companies.
    Cheers
     
  6. Orion

    Regular

    Joined:
    Feb 18, 2013
    Messages:
    355
    Likes Received:
    49
I'll believe what is shown on Jun 29, when the NDA lifts and everyone can publish.

But again, unless they made up the 6-pin+6-pin and 6-pin+8-pin configurations, I'm expecting some versions of these cards to do quite well.
     
  7. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,245
    Likes Received:
    4,465
    Location:
    Finland
I'm pretty sure we'll know the real deal after Apple has updated their machines with Polaris GPUs; then we'll see whether they're getting a 36 CU part, or whether it's like Tonga, where they got the first full-shader parts.
     
    kalelovil likes this.
  8. dskneo

    Regular

    Joined:
    Jul 25, 2005
    Messages:
    816
    Likes Received:
    298
If they are anything like the 290X: if you give them voltage, they will scale, and scale, and scale. Two of them at 1.5 Vcore can trip a 1200 W power supply. If you gave them a third 8-pin connector, they would use it. The problem is keeping them cool.

I don't see how the RX 480 (or any other video card) could be any different, unless the PCB components are not built like a tank (as the 290X's are).
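The voltage-scaling claim above can be sanity-checked with the usual first-order relation for CMOS dynamic power, P ∝ C·V²·f. A rough sketch, assuming illustrative baseline numbers (the 250 W / 1.15 V / 1000 MHz figures are assumptions for a 290X-class card, not measured values):

```python
def dynamic_power(base_w, base_v, base_mhz, new_v, new_mhz):
    """First-order CMOS dynamic power estimate: P scales with V^2 * f."""
    return base_w * (new_v / base_v) ** 2 * (new_mhz / base_mhz)

# Assumed baseline: ~250 W at 1.15 V / 1000 MHz; overvolted to 1.5 V / 1200 MHz.
p = dynamic_power(250, 1.15, 1000, 1.5, 1200)
print(round(p))  # roughly 510 W per card
```

Under these assumptions, a single card roughly doubles its power draw, so two of them approaching a 1200 W supply's limit is at least plausible.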
     
  9. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,426
    Likes Received:
    10,320
    Hmmm, I honestly can't find it again. It had both a reference 270 and reference 270x, IIRC. I'm guessing I saw it when I was looking up various other card combinations and found it strange. Then again that plays right into what Dave said in that there is much higher variability in the salvage chips than in the full chips used for the higher models.

Another example I ran across while looking for the article again: the R9 280 consumes 14 watts less than the R9 280X, and the R9 290 consumes 6 watts more than the R9 290X.

Yeah, my apologies. Either I thought I saw something that I didn't actually see, or it was one obscure video card review that I can't find again.

    Regards,
    SB
     
  10. CSI PC

    Veteran

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
Like you, I trawled various reviews, but they make it hard to get a true like-for-like comparison from a single reviewer :(
Custom AIB models make it so difficult.
I think it was possibly an obscure review and I just missed it.
And the situation is made worse by the fact that the only really ideal measurements are those taken at the card's power terminals and the PCIe slot.
    Thanks
     
  11. itsmydamnation

    Veteran

    Joined:
    Apr 29, 2007
    Messages:
    1,349
    Likes Received:
    470
    Location:
    Australia
Could also be a bandwidth limitation not making the full chip worth it yet. The question is whether Polaris supports GDDR5X; the obvious reason not to use GDDR5X would be supply at this stage. It looks like the goal of Polaris is the best possible perf per watt while sliding it right in between what would traditionally be the mainstream (128-bit memory bus) and "performance" (above 128-bit memory interface) segments. As a result it is a high-volume card and also ships with 4/8 GB; GDDR5X doesn't sound viable at this stage for that number of chips.

I think there is a 50/50 chance of more shaders + GDDR5X support. What we really need is Chipworks to get on this ASAP!
     
  12. eastmen

    Legend Subscriber

    Joined:
    Mar 17, 2008
    Messages:
    13,878
    Likes Received:
    4,727
Next year? You mean fall? Small Vega should hit in the fall, which would negate the impact of a 480X with 40 CUs. If it existed, I could see them announcing it in the fall alongside the Vega series.
     
  13. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,716
    Likes Received:
    2,137
    Location:
    London
    And don't forget cards of the same model vary substantially.
     
  14. majord

    Joined:
    Jun 15, 2016
    Messages:
    4
    Likes Received:
    0

Probably because the card is pitched towards affordable VR, and I think it will provide that, but at the same time people may have high expectations for high-resolution performance in general.

I have the distinct feeling this card will be ROP-limited at 4K and in VR, if the 6.3 score is any indication; perhaps limited even by raw bandwidth.

These will IMO shine at 1080p, but beyond that, maybe a bit iffy, and they won't compare so favourably to the old Hawaii cards.

As for old drivers: that's fine, but most of the data for Hawaii cards consistently hitting well over 7 is on old drivers too.
     
  15. xEx

    xEx
    Veteran

    Joined:
    Feb 2, 2012
    Messages:
    1,060
    Likes Received:
    543
Yes, but you don't really spend 200 dollars expecting to play at 4K, do you?
     
  16. renderstate

    Newcomer

    Joined:
    Apr 24, 2016
    Messages:
    54
    Likes Received:
    51
    Well, VR has really high requirements:

    4K at 30 Hz -> 3840 x 2160 x 30 -> ~250 MPixel/s

Oculus/Vive at 90 Hz -> 1080 x 1200 x 90 x 2 x 1.4 -> ~325 MPixel/s

Note: the 1.4x factor for VR comes from VR best practices, which call for supersampling to avoid excessive aliasing in the center of the image due to lens distortion (Pascal can get rid of this factor by using lens-matched shading).

In practice there are other tricks a VR app can use to avoid filling so many pixels per second, but I just wanted to show that the number of pixels a VR app needs to push per second is actually similar to or larger than what a 4K app at 30 Hz needs.
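The arithmetic above can be reproduced directly. A minimal sketch (the function name is made up for illustration; the resolutions, refresh rates, and 1.4x supersampling factor follow the figures in the post):

```python
def pixel_rate(width, height, hz, eyes=1, supersample=1.0):
    """Pixels a renderer must fill per second for a given display target."""
    return width * height * hz * eyes * supersample

rate_4k = pixel_rate(3840, 2160, 30)                           # 248_832_000, ~249 MPixel/s
rate_vr = pixel_rate(1080, 1200, 90, eyes=2, supersample=1.4)  # 326_592_000, ~327 MPixel/s
```

So the VR target's fill requirement comes out roughly 30% above 4K30, matching the post's point.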
     
    CSI PC likes this.
  17. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    4,080
    Likes Received:
    997
    Location:
    Planet Earth.
Good perf/€, good perf/W, and good performance overall (fast enough for 1080p@60 in most games); what else would I need?
(Maybe a little more if I upgrade my screen to a higher resolution, but I don't see the point atm; it would mean € for the screen, then more € for the GPU, then more € for electricity, and more heat, all for a dubious benefit...)
     
  18. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,541
    Likes Received:
    964
    Where is that "100W range" thing coming from?
     
  19. jacozz

    Newcomer

    Joined:
    Mar 23, 2012
    Messages:
    90
    Likes Received:
    23
    Alexko likes this.
  20. jacozz

    Newcomer

    Joined:
    Mar 23, 2012
    Messages:
    90
    Likes Received:
    23
I would like to see some tests of the supposed improvements in the geometry/tessellation engine in Polaris. I think this is key to combating the poor results in GameWorks titles, or at least to damage control ;)
     


  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.