Nvidia Ampere Discussion [2020-05-14]

Discussion in 'Architecture and Products' started by Man from Atlantis, May 14, 2020.

  1. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,242
    Likes Received:
    3,405
    Scrambling as in "cancelling" something which was "announced" by the same sources who are now saying that it's "cancelled"?
    I mean, stuff like this happens in product planning more often than one can count, and - assuming this isn't a wholly made-up thing - the only difference here is that this particular instance got leaked to the public.
    As for the possible reasons - who knows? Maybe they haven't got double-capacity G6X chips from Micron on time, and launching only a 3070 16GB would look weird next to a 3080 10GB. Maybe they've decided to spend all the 1GB G6X chips they have on making twice as many 3080 10GB cards instead of wasting them on a 20GB model which would give them ~0% performance advantage over the 10GB one. Maybe there's some truth to the rumor of them moving to N7 next year - possibly with a new GA103 chip which would essentially substitute GA102 and provide an option of launching a "Super" upgrade for the 3080 in about half a year from now, in which case it would be best to leave the 20GB option until then. I can think of another hundred reasons off the top of my head, and any and all of them could be true - it's impossible to say which one actually is without knowing.
     
    Cuthalu, Lightman and PSman1700 like this.
  2. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,058
    Likes Received:
    3,116
    Location:
    New York
    All of that spitballing should’ve happened internally at Nvidia before asking AIBs to plan for 2x memory cards. It’s really late in the game to be making product decisions and jerking your partners around like that isn’t good business. In a week we’ll have more info to theorycraft with.

    The GA103 rumor is interesting. According to the tweeters who started it, that chip is also at Samsung. I'm not really sure how another, smaller Samsung chip helps, assuming Big Navi is already competing with GA102.
     
  3. arandomguy

    Regular Newcomer

    Joined:
    Jul 27, 2020
    Messages:
    256
    Likes Received:
    364
    Anything is going to be too speculative due to lack of information.

    It could just be something boring: given the heavy demand-to-supply ratio and ongoing logistics issues (due to global conditions), they may have decided to avoid further inventory fragmentation to keep logistics manageable at this time.

    A big question will be how cross platform ports end up being handled next gen in practice.

    I still think the big issue with respect to VRAM is that "how much is enough" is too open-ended/subjective a question. If I had to guess/forecast, I'd say 10GB will age roughly the way a 4GB-5GB card (the latter being hypothetical) would have this generation. But if you asked a bunch of people whether or not that was "enough", I don't think you'd get any consensus.

    The important point is that the scaling rate and cost factor are lower nowadays compared to years past, not that we aren't getting better absolute performance if you're willing to pay for it. If we had the same pace as the Xbox 360/PS3 generation, then 2x console performance would be available in the $200 price range two years from now, which is something I don't think is likely.

    Since the topic was more or less VRAM, it's notable that memory cost scaling has been the most stagnant. Remember that at the end of the Xbox 360/PS3 generation, common desktop graphics cards from both AMD and Nvidia carried 4x the VRAM of the Xbox 360/PS3's memory pool. We certainly aren't seeing the cost reductions in bits per dollar needed to even approach that these days.

    Strictly speaking, and it was my thought when that report first came out, 8nm does not equal 7nm.
     
  4. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,242
    Likes Received:
    3,405
    This is true. However, technically speaking, 4GB was "enough" for this whole generation, and you would be hard pressed to find a game now which has performance issues running on a 4GB card at 1080p/console-level settings. I expect something similar to be true for 8-12GB cards in the upcoming console generation - with the added bonus of consoles aiming at 4K resolution from the start. 4K isn't widely used in PC space yet, and there's a chance it will remain a distant third behind 1080p, 1440p and various ultrawides for the whole upcoming console generation - which in turn means that VRAM requirements at resolutions above the consoles' may not go up on PC as fast as they did during this console generation.

    Basically, anyone who's saying that "8GB isn't enough" or "10GB isn't enough" has literally no basis for saying that at the moment. It may turn out to be true, universally or to a degree, or it may turn out to be completely bull.
     
    Putas, PSman1700 and pjbliverpool like this.
  5. Because it's a significantly cheaper node. They complained about TSMC's N7 price on several occasions, bragged about what a great decision it was to postpone the transition from 12FFN to N7 when Turing launched, and about how great Samsung 8nm was once gaming Ampere launched.

    They.. #gasp#.. they lied?
    No.. that can't be!


    Power consumption is better than expected, considering their 12FFN was already pretty close to TSMC's own 10nm (edit: in performance) and Samsung's 8LPP has always been just an incremental evolution of their 10LPP.
    In the end, what they got was similar performance at much higher transistor density.
    Nvidia is getting exactly what they paid for, if not better. Though I reckon they might try to shift the blame onto Samsung if they realize they underestimated RDNA2.

    I was referring to performance, not VRAM amount.
     
    #2246 Deleted member 13524, Oct 22, 2020
    Last edited by a moderator: Oct 22, 2020
    Lightman likes this.
  6. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,242
    Likes Received:
    3,405
    "Terrible" compared to what? Comparing GA102 to TU102 there's an obvious gain in perf/watt.
    We don't know what power consumption the same Ampere chips would have had on N7/N7P, and chances are that we'll never find out.
    We don't know how Ampere's competition will fare in perf/watt yet either.
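    The perf/watt argument in this exchange is just simple arithmetic; a minimal sketch, using rough illustrative figures (a ~30% 4K lead for the 3080 over the 2080 Ti FE, and ~320W vs ~260W board power - approximations, not numbers taken from this thread):

    ```python
    # Hypothetical illustration of the perf/watt comparison being debated.
    # Performance indices and board powers are rough public figures, not
    # measurements from this thread.

    def perf_per_watt(perf_index: float, board_power_w: float) -> float:
        """Relative performance divided by board power draw."""
        return perf_index / board_power_w

    # Rough 4K performance indices (2080 Ti FE = 1.0) and typical board power.
    tu102 = perf_per_watt(1.00, 260)   # RTX 2080 Ti FE, ~260 W
    ga102 = perf_per_watt(1.30, 320)   # RTX 3080, ~320 W, ~30% faster at 4K

    uplift = ga102 / tu102 - 1.0
    print(f"perf/W uplift: {uplift:.0%}")  # → perf/W uplift: 6%
    ```

    With these assumed numbers the gain is real but modest (single-digit percent), which is consistent with both sides of the argument: there is "an obvious gain in perf/watt", and it is also "on the low side" for a node transition.
    
    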
     
    pharma and PSman1700 like this.
  7. Kyyla

    Veteran

    Joined:
    Jul 2, 2003
    Messages:
    1,109
    Likes Received:
    496
    Location:
    Finland
    >300W GPUs are terrible compared to just about anything I've used in the last 30 years of computers. Doesn't matter if perf/watt improved by a few tenths.
     
    Cuthalu and nutball like this.
  8. Erinyes

    Regular

    Joined:
    Mar 25, 2010
    Messages:
    808
    Likes Received:
    276
    Samsung 8nm has been in production for more than two years now, and is itself an evolution of their 10nm, which is more than three years old. Samsung has really screwed up if yields are still this bad at this point. Perhaps the bottleneck in Ampere availability is GDDR6X production, as others have speculated?
    Turing launched just as 7nm had entered production/commercial availability in mobile SoCs (September 2018). Given their planned transistor counts/die sizes, I think Nvidia had no choice but to go with the mature 12nm process at that point. It's the delayed transition to a newer node subsequently which I think we were all surprised by (A100 aside).
    The gain in perf/W is definitely on the low side for a new architecture plus a process jump that's between a half and a full node (10nm being a half node and 8nm a bit more). Maxwell and Pascal are perhaps outliers, but even the non-RTX Turing parts saw a bigger perf/W increase over Pascal, on an evolution of the same node, than what we're seeing with Ampere.

    Is there any compute benchmark which can be run on A100 and GA102 which is comparable? That should give us some idea of the difference.
     
  9. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,058
    Likes Received:
    3,116
    Location:
    New York
    Well that seems arbitrary. The FX 5800 had a 45w TDP. Have all the 150w and 250w cards since then also been terrible?
     
  10. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,242
    Likes Received:
    3,405
    Just about all AIB 2080Ti cards drew >300W. Those must have been terrible too, right?

    Also, you're about to get 2080Ti performance at 220W soon - is that also somehow terrible now?
     
    PSman1700 likes this.
  11. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,058
    Likes Received:
    3,116
    Location:
    New York
    PSman1700 likes this.
  12. PSman1700

    Legend

    Joined:
    Mar 22, 2019
    Messages:
    7,118
    Likes Received:
    3,092
    And we have a 350-watt console inbound from the east. I think wattage and power draw are not as important as I thought they were for gaming hardware.
    I gladly welcome high-wattage products if that means one of the largest jumps in performance yet in 2020 (already 100% over the 2080) and a 200% increase in ray tracing. I'm all for going green, but not in the gaming space.
     
  13. dskneo

    Regular

    Joined:
    Jul 25, 2005
    Messages:
    816
    Likes Received:
    298
    You have a 350W power supply in a console coming from the east.
     
  14. Kyyla

    Veteran

    Joined:
    Jul 2, 2003
    Messages:
    1,109
    Likes Received:
    496
    Location:
    Finland
    Yes, they were terrible! Am I supposed to applaud the turtle for its speed?
     
    Cuthalu likes this.
  15. Voxilla

    Regular

    Joined:
    Jun 23, 2007
    Messages:
    832
    Likes Received:
    505
    I for one agree, more than 300 watts is not great.
    Gaming PCs will now become furnaces producing half a kilowatt of heat.
    Many parts of the world get hotter by the year, and houses in the EU typically have no air conditioning.
     
    Cuthalu and Kyyla like this.
  16. Kyyla

    Veteran

    Joined:
    Jul 2, 2003
    Messages:
    1,109
    Likes Received:
    496
    Location:
    Finland
    Any line drawn on a continuous scale is arbitrary. Yet the frog still boils to death.
     
    Cuthalu likes this.
  17. PSman1700

    Legend

    Joined:
    Mar 22, 2019
    Messages:
    7,118
    Likes Received:
    3,092
    Made my day :)
     
  18. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,242
    Likes Received:
    3,405
    I'm still waiting on your comparison point which isn't "terrible" at the same performance.
     
    PSman1700 likes this.
  19. The launch PS3 had a 400W power supply, yet its power consumption hardly ever reached even half of that.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.