Nvidia Ampere Discussion [2020-05-14]

Discussion in 'Architecture and Products' started by Man from Atlantis, May 14, 2020.

  1. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,462
    Location:
    Finland
    That, or a bunch of fat heatpipes. The air circulated by the "rear fan" doesn't leave the case; only a small portion of the "front fan's" airflow exits via the I/O shield, and most of it is pushed back into the case.
     
  2. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,109
    Location:
    New York
    I was referring to the pics of the actual cards, not the renders. Of course it could all be bullshit, but it's hard to imagine someone going through the trouble of making such elaborate fakes.

    Our trusty Twitter leakers seem to think Ampere will drink a lot of watts. If true, it would explain the funky cooling.
     
    Malo likes this.
  3. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,109
    Location:
    New York
    Seems reasonable to me. The Ti would get a 20% wider bus and 20% more SMs. That would be in line with Turing, where the 2080 Ti is ~20% faster than the 2080 Super.
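
    A quick back-of-the-envelope check of that scaling logic (a minimal sketch; the SM counts and bus widths below are illustrative rumor-tier figures, not confirmed specs):

        # Assume performance tracks the smaller of the compute ratio (SM count)
        # and the bandwidth ratio (bus width), with clocks and memory speed
        # held equal. All figures below are hypothetical.
        def perf_ratio(sms_a, bus_a, sms_b, bus_b):
            """Estimate SKU A's performance relative to SKU B."""
            return min(sms_a / sms_b, bus_a / bus_b)

        # Rumored Ti vs. non-Ti: 20% more SMs, 20% wider bus.
        gap = perf_ratio(84, 384, 70, 320)
        print(f"Estimated Ti advantage: ~{gap - 1:.0%}")  # ~20%

    With both compute and bandwidth scaled by the same ~20%, neither becomes the bottleneck, so a ~20% gap is what you'd naively expect.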
     
  4. Malo

    Malo Yak Mechanicum
    Legend Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    8,929
    Likes Received:
    5,528
    Location:
    Pennsylvania
    Ah, OK, I missed those. Yeah, I guess if they're actually real, then it must be heatpipes in front of the end fan.
     
  5. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    4,027
    Likes Received:
    90
    Nvidia hasn't used the same silicon for the top-tier part and the minus-one SKU since Fermi.
    2080 Ti uses TU102
    2080/Super uses TU104
    1080 Ti uses GP102
    1080 uses GP104
    and on and on and on

    Couple that with GDDR6X coming out of left field, and this feels fake to me.
     
  6. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,109
    Location:
    New York
    Sure, but that was before RDNA and Big Navi. When was the last time AMD had both a competitive architecture and a big chip? The 780 Ti and 780 shared the same silicon, and there's a ton of precedent before that.

    The GDDR6X stuff is clearly nonsense, though.
     
    A1xLLcqAgt0qc2RyMz0y likes this.
  7. ninelven

    Veteran

    Joined:
    Dec 27, 2002
    Messages:
    1,742
    Likes Received:
    152
    Is Nvidia going to voluntarily sacrifice margins? No.
     
  8. Bondrewd

    Veteran

    Joined:
    Sep 16, 2017
    Messages:
    1,682
    Likes Received:
    846
    Between voluntary margin cuts and involuntary market-share loss, Nvidia always picks the former.
     
    A1xLLcqAgt0qc2RyMz0y likes this.
  9. ninelven

    Veteran

    Joined:
    Dec 27, 2002
    Messages:
    1,742
    Likes Received:
    152
    You don't matter.
     
  10. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,109
    Location:
    New York
    They certainly did that recently in response to the 5600 XT. I can't recall too many times recently when Nvidia was forced to lower prices, though. This could be uncharted territory.
     
  11. Bondrewd

    Veteran

    Joined:
    Sep 16, 2017
    Messages:
    1,682
    Likes Received:
    846
    Super SKUs are the obvious response to anything Navi.
    More GPU for less money, even with more VRAM here and there.
     
    w0lfram and A1xLLcqAgt0qc2RyMz0y like this.
  12. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    xpea, John Doe, nnunn and 3 others like this.
  13. Man from Atlantis

    Regular

    Joined:
    Jul 31, 2010
    Messages:
    960
    Likes Received:
    853
    A1xLLcqAgt0qc2RyMz0y and pharma like this.
  14. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    In any case, I hope they will be better than the coolers used on the Turing FEs; the 2080 Tis especially seem to suffer from an unnecessarily loud idle at 41% fan PWM.
     
  15. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,109
    Location:
    New York
    Wouldn’t 350W require beefy power delivery, and therefore a full-sized PCB?
     
  16. Man from Atlantis

    Regular

    Joined:
    Jul 31, 2010
    Messages:
    960
    Likes Received:
    853
    [Image: Zotac GTX 1080 Ti Mini]

    Not necessarily. This is the Zotac GTX 1080 Ti Mini: 250W TDP, 300W board power limit. I think Nvidia can choose higher-quality VRM components and deliver better efficiency.
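
    For a rough sense of why component quality matters more than raw PCB area here, a sketch of VRM phase-count sizing (the core voltage, efficiency, and per-stage current ratings below are assumptions for illustration, not from any datasheet):

        # Back-of-the-envelope VRM sizing. Core voltage, conversion efficiency,
        # and per-stage current ratings are assumed, not from a spec sheet.
        import math

        def phases_needed(board_power_w, core_voltage=1.0,
                          vrm_efficiency=0.9, stage_rating_a=50, derate=0.8):
            """Minimum phase count to feed the core rail at a given board power."""
            core_current = board_power_w * vrm_efficiency / core_voltage  # amps
            usable = stage_rating_a * derate  # leave thermal headroom per stage
            return math.ceil(core_current / usable)

        print(phases_needed(350, stage_rating_a=50))  # 8 phases with 50 A stages
        print(phases_needed(350, stage_rating_a=30))  # 14 phases with 30 A stages

    Under these assumptions, higher-rated power stages nearly halve the phase count for the same board power, which is how a short PCB like the Mini's can still handle 250-300W.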
     
    nnunn, DavidGraham, xpea and 4 others like this.
  17. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    15,134
    Likes Received:
    7,678
    If a 3080 is about the same computing power as a 2080 Ti, with maybe more RT cores and tensor cores, is it really expected to draw more power than a 2080 Ti when it’s on a smaller node?
     
    A1xLLcqAgt0qc2RyMz0y likes this.
  18. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,109
    Location:
    New York
    Rumor says same number of SMs but nothing about clocks. Maybe they’re pushing clocks and power consumption higher to fend off Navi.

    Rumors also say the 3080 is on Samsung 8nm. Who knows how that compares to TSMC 12nm.
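
    For context on how much a clock push can cost: dynamic power scales roughly as frequency times voltage squared, and voltage has to rise to sustain higher clocks (a sketch; the frequency/voltage points below are invented, not measured from any GPU):

        # Rough dynamic-power scaling: P ~ f * V^2. The (frequency, voltage)
        # pairs are invented for illustration, not measurements.
        def relative_power(f0, v0, f1, v1):
            """Dynamic power at (f1, v1) relative to baseline (f0, v0)."""
            return (f1 / f0) * (v1 / v0) ** 2

        # A ~10% clock bump that needs ~7% more voltage to hold:
        print(f"{relative_power(1800, 1.00, 1980, 1.07):.2f}x")  # ~1.26x power

    So a ~10% clock bump near the top of the curve can plausibly cost ~25% more power, which would line up with the rumored TDP jump even at the same SM count.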
     
    #258 trinibwoy, Jun 10, 2020
    Last edited: Jun 10, 2020
  19. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    15,134
    Likes Received:
    7,678
    I think the RTX 2080 was roughly in the same ballpark as the GTX 1080 Ti in terms of power consumption, so I'm hoping that trend continues and we don't see some massive ballooning of power draw this generation. I was hoping we'd get large performance increases from the node transition alone while keeping power consumption roughly the same.
     
    Cuthalu likes this.
  20. Bondrewd

    Veteran

    Joined:
    Sep 16, 2017
    Messages:
    1,682
    Likes Received:
    846
    Oh yeah, they do.
    But so does AMD; the age of the speed-demon GPU is nigh.
     