Nvidia Pascal Announcement

Discussion in 'Architecture and Products' started by huebie, Apr 5, 2016.

  1. fellix

    Veteran

    Joined:
    Dec 4, 2004
    Messages:
    3,552
    Likes Received:
    514
    Location:
    Varna, Bulgaria
    60 SMs.
     
  2. fellix

    Veteran

    Joined:
    Dec 4, 2004
    Messages:
    3,552
    Likes Received:
    514
    Location:
    Varna, Bulgaria
    Looks like a straight GM200 replacement in a smaller TDP package.
     
  3. bridgman

    Newcomer Subscriber

    Joined:
    Dec 1, 2007
    Messages:
    62
    Likes Received:
    123
    Location:
    Toronto-ish
    Am I the only one who thinks that looks like Photoshop?

    In the second picture the angle of the bright white text seems off relative to the other text.

    EDIT... maybe it is just me. There's a bit of skew between dim and bright text in the first picture too, I guess; maybe it's just how oddly bright the lower text looks that made it seem off.
     
  4. fellix

    Veteran

    Joined:
    Dec 4, 2004
    Messages:
    3,552
    Likes Received:
    514
    Location:
    Varna, Bulgaria
    Nah, it's fine. The package size is also in line with the GM204's for a 256-bit BGA part.
     
  5. A1xLLcqAgt0qc2RyMz0y

    Veteran

    Joined:
    Feb 6, 2010
    Messages:
    1,589
    Likes Received:
    1,490
  6. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
    Better photo (if not already posted): GP104-400-A1 with GDDR5X.
    [image of the GP104-400-A1 package]
     
    Razor1, Ext3h, Jawed and 1 other person like this.
  7. Jupiter

    Veteran

    Joined:
    Feb 24, 2015
    Messages:
    1,583
    Likes Received:
    1,198
  8. McHuj

    Veteran Subscriber

    Joined:
    Jul 1, 2005
    Messages:
    1,613
    Likes Received:
    869
    Location:
    Texas
    I imagine the performance gulf between the 1080 and 1070 will be rather big now (in retail price as well).
     
  9. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,465
    Location:
    Finland
    Well, there have been rumours of 3 models; maybe it's the x70, x80 and x80 Ti, of which only the Ti comes with GDDR5X? (And that one should be released a bit later, since Micron still hasn't started mass-producing those memories.)
    Also, just getting more bandwidth doesn't automatically work miracles on performance if the chip already has sufficient bandwidth for most scenarios.
     
  10. Benetanegia

    Regular

    Joined:
    Sep 4, 2015
    Messages:
    394
    Likes Received:
    425
    Maybe the chip can clock considerably higher without losing too much efficiency, all the way up to 200-225 W or something like that (assuming an initial target of ~175 W).

    But IMO the chip was probably thought out as being paired with GDDR5X all along, and it's more a case of cut-down chips also offering "good enough" performance with GDDR5.
     
  11. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    Maybe Micron is rushing the first mass-production lots a little, but the regular time through fab and packaging is around 3 months. With stated availability in the June timeframe, there's little question that mass production must have started already.
     
  12. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    It's a safe bet to assume that GDDR5X support wasn't accidental. :wink:
     
    Benetanegia likes this.
  13. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,465
    Location:
    Finland
    Where did they state June availability for GDDR5X? Or where did NVIDIA say GP104+GDDR5X will be available in June?
    Micron's last statement was that they plan to start mass production in the summer; a bit earlier they had said August.
     
  14. Benetanegia

    Regular

    Joined:
    Sep 4, 2015
    Messages:
    394
    Likes Received:
    425
    Haha, yeah, that was a stupid sentence on my part.

    But seriously now, I meant the kind of bandwidth that GDDR5X offers in a 256-bit config. It could have been 384-bit GDDR5 when it was still on the drawing board.
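
    For reference on the bandwidth ballpark, here's a minimal sketch in Python (the 10 Gbps GDDR5X and 7 Gbps GDDR5 per-pin rates are assumptions for illustration, not confirmed specs):

        # Peak bandwidth = bus width (bits) * per-pin data rate (Gbps) / 8 bits per byte
        def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
            return bus_width_bits * data_rate_gbps / 8

        print(bandwidth_gbs(256, 10.0))  # 256-bit GDDR5X at 10 Gbps: 320.0 GB/s
        print(bandwidth_gbs(384, 7.0))   # 384-bit GDDR5 at 7 Gbps:   336.0 GB/s

    At those assumed rates the two configurations land within about 5% of each other, which would explain why either was plausible on the drawing board.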
     
  15. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    I've posted this quote from their March conference call already:
    Their current fiscal quarter ends on May 31st.

    They must have accelerated things. The conference call came after those early predictions.
    I can't blame you for thinking otherwise: everybody seems to be using GDDR5X availability as a crutch to claim that the GDDR5X version will arrive after the summer or in Q4, but that Micron statement throws the argument under the bus.
     
    pharma and Razor1 like this.
  16. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    I don't think 384 bits was ever on the table for this class of chip.

    GDDR5 at 8 Gbps is already ~14% faster than at 7 Gbps.

    I use the rule of thumb that an x% increase in memory clock speed results in an x/2% increase in performance, based on the overclocking table in Anandtech's GTX 980 review. And the first-order opposite holds as well: a reduction in bandwidth only costs half as much in performance (sketched in code at the end of this post).

    But this means that a lot of work segments are not memory-controller limited.

    A GTX 980 Ti has 336 GB/s. A GDDR5 GP104 would have 256 GB/s. That's about 24% less bandwidth, which by the rule above should cost only about 12% in performance. If you now significantly increase the performance of the compute-bound work segments, you should be able to make up for that 12% loss: going from 2816 cores at ~1.1 GHz to fewer cores at much higher clocks. According to that same overclocking table, an x% core clock increase improves performance by more than x/2%.

    Add in some architectural improvements and there you have it: 980 Ti-level performance with old-school GDDR5 at lower cost, a perfect replacement for GM204. The GDDR5X version is the cherry on the cake to make a true new performance leader, with no competitor in sight.

    Cue everybody complaining again about how Nvidia dares to ask high-end prices for a mid-range chip.
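
    A minimal sketch of that half-scaling rule of thumb in Python (the 0.5 factor is this post's own estimate from the overclocking table, not a measured constant):

        # Estimated relative performance change from a bandwidth change,
        # assuming performance moves by half the bandwidth delta.
        def perf_delta_from_bw(bw_old_gbs: float, bw_new_gbs: float,
                               scaling: float = 0.5) -> float:
            bw_change = bw_new_gbs / bw_old_gbs - 1.0
            return bw_change * scaling

        # GTX 980 Ti (336 GB/s) vs. a hypothetical GDDR5 GP104 (256 GB/s):
        print(f"{perf_delta_from_bw(336, 256):+.1%}")  # about -11.9%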
     
  17. Infinisearch

    Veteran

    Joined:
    Jul 22, 2004
    Messages:
    779
    Likes Received:
    146
    Location:
    USA
    What workloads were benchmarked?
     
  18. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    They showed different games with separate memory and GPU overclocking. Games got around a 0.5% increase per 1% frequency bump on memory, whereas they got a 0.7% increase per 1% GPU clock bump (more GPU-bound than memory-bound).
     
  19. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    Check for yourself.

    For a 22% core clock increase, they see a 13% performance increase, or a scaling of 0.59.
    For an 11% memory clock increase, they see a 5.2% performance increase, or a scaling of 0.47.

    So core is definitely a bigger limiter than memory.
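
    The scaling factors above are just the ratio of the observed gains; a quick check in Python:

        # scaling = performance gain / clock gain, from Anandtech's
        # GTX 980 overclocking numbers quoted above
        core_scaling = 0.13 / 0.22    # ~0.59
        mem_scaling = 0.052 / 0.11    # ~0.47
        print(round(core_scaling, 2), round(mem_scaling, 2))  # 0.59 0.47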

    Edit: bonus comment from Anandtech's GM200 review:
     
    #460 silent_guy, Apr 23, 2016
    Last edited: Apr 23, 2016
    Razor1 likes this.