Nvidia Pascal Announcement

Discussion in 'Architecture and Products' started by huebie, Apr 5, 2016.

    Veteran Newcomer

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    Unfortunately that would mean the same FP64 as the P100 and only slightly improved FP32/FP16.
    This is not what Nvidia needs for the next step down in the Tesla range, one that is also shared with the Quadro and Titan.

    Cheers
     
  2. CSI PC

    Veteran Newcomer

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    What to watch out for IMO is whether they give the Titan the same complement of SMs, at 56 or 60, support for FP16x2, and the same number of FP64 cores as the Quadro card.
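
    Back-of-the-envelope, with assumed numbers (GP100-style SMs with 64 FP32 lanes, FP16x2 packing, and a ~1.4 GHz boost clock; none of this is a confirmed Titan spec):

        #include <cstdio>

        // Peak FMA throughput in TFLOPS: SMs x FP32 lanes x 2 ops per FMA x clock (GHz) / 1000.
        static double tflops(int sms, int lanes_per_sm, double ghz)
        {
            return sms * lanes_per_sm * 2.0 * ghz / 1000.0;
        }

        int main()
        {
            const double ghz = 1.4;          // assumed boost clock
            const int sm_counts[] = {56, 60};
            for (int sms : sm_counts)
            {
                double fp32 = tflops(sms, 64, ghz);
                std::printf("%d SMs: ~%.1f TFLOPS FP32, ~%.1f TFLOPS FP16 via FP16x2\n",
                            sms, fp32, 2.0 * fp32);
            }
        }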
    Cheers
     
  3. CarstenS

    Legend Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,126
    Likes Received:
    2,599
    Location:
    Germany
    If they're going the BIG GPU route again, it would make sense to put the more traditional-style SMs with 128 ALUs in there (including FP16x2) and add an additional GPC or two instead of the FP64 cores. Instead of HBM2, you could use 512-bit GDDR5X. That way, you'd have a much more complementary line-up.

    If they want to cheap out, GP102 could just be GP104×1.5.
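
    For scale on the memory side, a quick sketch with assumed per-pin rates (10 Gb/s GDDR5X as on GP104 at launch, ~1.4 Gb/s for P100-class HBM2; GP102's actual configuration is unknown here):

        #include <cstdio>

        // Peak DRAM bandwidth in GB/s from bus width (bits) and per-pin data rate (Gb/s).
        static double peak_gb_per_s(int bus_bits, double gbps_per_pin)
        {
            return bus_bits * gbps_per_pin / 8.0; // bits -> bytes
        }

        int main()
        {
            std::printf("512-bit GDDR5X @ 10 Gb/s : ~%.0f GB/s\n", peak_gb_per_s(512, 10.0)); // ~640 GB/s
            std::printf("4096-bit HBM2 @ 1.4 Gb/s : ~%.0f GB/s\n", peak_gb_per_s(4096, 1.4)); // ~717 GB/s
        }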
     
  4. Erinyes

    Regular

    Joined:
    Mar 25, 2010
    Messages:
    668
    Likes Received:
    111
    Can you run in clamshell mode with GDDR5X? If so, you can do 8 GB on a 128-bit bus... but GDDR5X price and/or availability may be constraints. A 192-bit GDDR5 bus gives you similar bandwidth at lower cost.
    It may be enough today (that is also debatable), but what about a year from now? Two years? I wouldn't buy anything less than 4 or 6 GB, tbh. And besides, the margins on the 3 GB card will be lower and its main purpose will be to serve as a price anchor (e.g. the magic $199.99).

    But one thing to keep in mind is that if a particular card is popular, then devs will definitely keep its memory capacity in mind. Considering how popular the GTX 970 is (as per the Steam hardware survey at least), I'm sure many devs will aim for 3.5 GB ;-)
    384-bit. Q3.
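
    Rough numbers behind both points, assuming 8 Gb (1 GB) chips and per-pin rates of 10 Gb/s for GDDR5X and 7 Gb/s for GDDR5 (assumptions, not a confirmed configuration for the card being discussed):

        #include <cstdio>

        int main()
        {
            // Clamshell (x16) mode: two chips share each 32-bit channel, so a 128-bit bus
            // carries 8 chips; with 8 Gb (1 GB) devices that is 8 GB of capacity.
            const int chips = (128 / 32) * 2;
            std::printf("128-bit clamshell, 8 Gb chips: %d chips -> %d GB\n", chips, chips * 1);

            // Bandwidth parity between the two bus options.
            std::printf("128-bit GDDR5X @ 10 Gb/s: ~%.0f GB/s\n", 128 * 10.0 / 8.0); // ~160 GB/s
            std::printf("192-bit GDDR5  @  7 Gb/s: ~%.0f GB/s\n", 192 *  7.0 / 8.0); // ~168 GB/s
        }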
     
  5. CarstenS

    Legend Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,126
    Likes Received:
    2,599
    Location:
    Germany
    I think so, yes. But 5X probably carries another price premium, as you said, so...
     
  6. xpea

    Regular Newcomer

    Joined:
    Jun 4, 2013
    Messages:
    421
    Likes Received:
    461
    Got the same information. GP102 = big GP104. Ready for back to school.
     
  7. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,380
    Nice! Should give AMD plenty of time to set Vega clock speeds and update their power circuit!
     
    Lightman likes this.
  8. CSI PC

    Veteran Newcomer

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    I think a fair few will be disappointed if they removed the 64 CUDA cores per SM, but this is Nvidia so who knows :)
    While we can expect the GP102 to have fewer FP64 cores, it will also be interesting to see whether they provide the option they did back with the GK110 Titan, which enabled switching the ratio between 1/3 and (was it?) 1/24 (to do with loads).
    I am assuming they are going to put a moderate but usable amount of FP64 on the GP102 for the Tesla and Quadro, maybe a quarter of GP100 for now *shrug*.

    Cheers
     
  9. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,175
    Location:
    La-la land
    With it supposedly launching so soon after GP104, one has to wonder what kind of damage to your wallet it is going to cause in order to set itself apart from (the already quite expensive) GP104 products... because NV isn't going to push GP104 down in price; not so soon after launch. Ugh. It's gonna be the worst reaming of all time, I fear.
     
  10. CarstenS

    Legend Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,126
    Likes Received:
    2,599
    Location:
    Germany
    Just to be clear: they're gonna keep their mandatory 1/32 rate or so. I did not mean remove them completely. And those who want half-rate DP on a 10-TFLOP chip can still go for GP100 (or probably Vega).
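
    For the sake of the numbers (using the 10-TFLOP FP32 figure from above; the ratios are the usual 1/32 consumer rate and GP100's half rate):

        #include <cstdio>

        int main()
        {
            const double fp32_tflops = 10.0; // hypothetical 10-TFLOP FP32 chip from the post above

            std::printf("1/32 rate: ~%.0f GFLOPS FP64\n", fp32_tflops / 32.0 * 1000.0); // ~313 GFLOPS
            std::printf("1/2  rate: %.1f TFLOPS FP64\n", fp32_tflops / 2.0);            // 5 TFLOPS
        }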
     
  11. CSI PC

    Veteran Newcomer

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    It is possible, but it then has to be weighed against the fact that at some point they need to update the Keplers still around in the Tesla/Quadro range that have 'moderate' DP, and also that they need to compete against AMD in the pro-workstation space in terms of FP16 solutions.

    If they do not do a GK110-type die (that would be the K40/K80/K6000), it would mean another GP1xx card with DP, somehow positioned in the tiers, being designed to replace those.
    Ah well, time will tell.
     
    #1511 CSI PC, Jul 5, 2016
    Last edited: Jul 5, 2016
  12. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
    It makes more sense for this type of range than for any other... especially for people who don't have the money to buy one high-end GPU, but can buy one low-end-class card and then "upgrade" it a few months later.

    No miracle there; the first SLI-capable GPU was the 6600 GT SLI edition, not the 6800 Ultra.

    SLI and CFX have been sold like that since the start (and even with APUs): use low-end GPUs and upgrade them to match the performance of the high end. This is how AMD and Nvidia have sold it from the beginning. In fact, it is really surprising that Nvidia removed SLI capability from the 1060 (there's no reason for it, not even a bad one).
    I don't even understand why you think it is a good thing. If it were SLI capable and you didn't want to use SLI, it would make no difference to you.

    I'm a little bit emotional when we're speaking about SLI, as I was surely one of its first users. I still have the badges that Nvidia and DFI sent me at the time; we created the first Guru3D SLI team and the first real user guide, and most of the code running in SLI in that period was created by us... every day we were recoding the drivers and modding them to be used with SLI... It continued then with the Tweaksrus drivers (which were banned in court after an attack by Nvidia's lawyers)... but in all honesty we had prepared the way...


    They can't remove it completely; DX11 and DX12 ask for compliance of at least being able to do FP16 and FP64, so at 1/32 they are just "capable of doing it" and respect the mandatory minimum of DirectX.
     
    #1512 lanek, Jul 5, 2016
    Last edited: Jul 5, 2016
    Grall likes this.
  13. CarstenS

    Legend Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,126
    Likes Received:
    2,599
    Location:
    Germany
    That's why I wrote "just to be clear". But then, I am not so sure about FP16 right now. A lot of cards still report minimum precision as full precision at DX12 FL12_0/12_1.
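
    For reference, this is the kind of query involved; a minimal D3D12 sketch on the default adapter (double-precision shader ops and 16-bit minimum precision are both reported through CheckFeatureSupport):

        #include <d3d12.h>
        #include <wrl/client.h>
        #include <cstdio>

        using Microsoft::WRL::ComPtr;

        int main()
        {
            // Create a device on the default adapter; FL11_0 is enough for the query itself.
            ComPtr<ID3D12Device> device;
            if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device))))
                return 1;

            D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
            if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts))))
                return 1;

            // DoublePrecisionFloatShaderOps: FP64 in shaders. MinPrecisionSupport: whether the
            // driver exposes 16-bit (or 10-bit) minimum precision; NONE means min-precision types
            // are run at full 32-bit precision, which is the reporting behaviour mentioned above.
            std::printf("DoublePrecisionFloatShaderOps: %d\n", opts.DoublePrecisionFloatShaderOps);
            std::printf("16-bit min precision: %d\n",
                        (opts.MinPrecisionSupport & D3D12_SHADER_MIN_PRECISION_SUPPORT_16_BIT) != 0);
        }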
     
  14. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,286
    Likes Received:
    1,061
    Location:
    still camping with a mauler
    They shouldn't be wasting any resources on (or encouraging customers to go with) boner solutions like two 1060s when one 1070 or 1080 is such a better option for all involved. It's a very good move in my book. SLI/Crossfire should be reserved for the high end, or better yet simply not exist at all.
     
    Florin, pharma and Otto Dafe like this.
  15. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland

    Not everyone has the possibility, money-wise, to buy a 1080 or a 1070... but they can buy a 1050 one month out of their pay, and then upgrade to a second 1050 later. That is why SLI and CFX were created... it's not perfect by any means, but it works.

    I'm lucky enough to be able to buy a good hundred 1080s with my pay... but I have not forgotten the time when I was young and surely couldn't do it.
     
  16. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

    Well, with mGPU the need for SLI kind of goes down if developers are going to move towards that more; it looks like nV is guiding them towards it as well, or expecting it to happen in the normal course of this generation.
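
    As a sketch of what "mGPU" pushes onto the developer: with DX12 explicit multi-adapter the application enumerates the adapters itself and drives each one, instead of relying on an SLI/CFX driver profile. A minimal (unlinked-adapter) starting point:

        #include <d3d12.h>
        #include <dxgi1_4.h>
        #include <wrl/client.h>
        #include <vector>

        using Microsoft::WRL::ComPtr;

        // Create an independent D3D12 device on every hardware adapter in the system;
        // with explicit multi-adapter, the app then distributes work across them itself.
        std::vector<ComPtr<ID3D12Device>> CreateDevicePerAdapter()
        {
            std::vector<ComPtr<ID3D12Device>> devices;
            ComPtr<IDXGIFactory4> factory;
            if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
                return devices;

            ComPtr<IDXGIAdapter1> adapter;
            for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
            {
                DXGI_ADAPTER_DESC1 desc = {};
                adapter->GetDesc1(&desc);
                if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
                    continue; // skip WARP / software adapters

                ComPtr<ID3D12Device> device;
                if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                                IID_PPV_ARGS(&device))))
                    devices.push_back(device);
            }
            return devices;
        }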
     
  17. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,286
    Likes Received:
    1,061
    Location:
    still camping with a mauler
    But that is such a dumb thing to do when you can save for two months and get the 1070... less power and far fewer headaches. Buying two 1050s is the wrong way to do it and I'm glad NVIDIA has taken the option away. They never should have offered SLI on mid-to-low-end cards, and it seems they've finally realized that.
     
    pharma and BRiT like this.
  18. RedVi

    Regular

    Joined:
    Sep 12, 2010
    Messages:
    393
    Likes Received:
    42
    Location:
    Australia
    Eh, it's not exactly smart though. Obviously it's better to save. The first video card I bought was a Voodoo 2 for $250 back in '98. I was 11 and got something like $10 per week in pocket money. Ditto the GeForce4 Ti 4400 I bought for $450 when I was earning $8/hr. And again a 9800 Pro for $500 when I was not earning much more.
     
  19. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    3,736
    Likes Received:
    2,596
  20. Clukos

    Clukos Bloodborne 2 when?
    Veteran Newcomer

    Joined:
    Jun 25, 2014
    Messages:
    4,516
    Likes Received:
    3,873
    I don't know how relevant this is to the topic, but Nvidia recently updated their GeForce Experience software; it looks and runs much better than the previous iteration (or the current non-beta version). According to Nvidia, it also has a smaller CPU footprint (for the share and streaming functionality).
     
    #1520 Clukos, Jul 6, 2016
    Last edited: Jul 6, 2016