Nvidia Ampere Discussion [2020-05-14]

Discussion in 'Architecture and Products' started by Man from Atlantis, May 14, 2020.

  1. Kaotik

    Kaotik Drunk Member Legend

That, or a bunch of fat heatpipes. The air circulated by the "rear fan" doesn't leave the case; only a small portion of the air from the "front fan" exits via the I/O shield, and most of it is pushed back into the case.
     
  2. trinibwoy

    trinibwoy Meh Legend

I was referring to the pics of the actual cards, not the renders. Of course it could all be bullshit, but it's hard to imagine someone going to the trouble of making such elaborate fakes.

Our trusty Twitter leakers seem to think Ampere will drink a lot of watts. If true, it would explain the funky cooling.
     
    Malo likes this.
  3. trinibwoy

    trinibwoy Meh Legend

Seems reasonable to me. The Ti would get a 20% wider bus and 20% more SMs. That would be in line with Turing, where the 2080 Ti is ~20% faster than the 2080 Super. A quick back-of-the-envelope check is below.
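A minimal Python sketch of those ratios; the 384/320-bit bus widths and 82/68 SM counts are the rumored figures, not confirmed specs:

```python
# Rumored (unconfirmed) figures: the Ti part vs the non-Ti part below it.
bus_ti, bus_non_ti = 384, 320  # memory bus width in bits
sm_ti, sm_non_ti = 82, 68      # SM (streaming multiprocessor) counts

print(f"bus width ratio: {bus_ti / bus_non_ti:.2f}x")  # 1.20x -> 20% wider
print(f"SM count ratio:  {sm_ti / sm_non_ti:.2f}x")    # 1.21x -> ~20% more
```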
     
  4. Malo

    Malo Yak Mechanicum Legend Subscriber

    Ah ok, I missed those. Yeah I guess if they're actually real then it must be heat pipes in front of the end fan.
     
  5. ShaidarHaran

    ShaidarHaran hardware monkey Veteran

Nvidia hasn't used the same silicon for the top-tier part and the minus-one SKU since Fermi.
    2080 Ti uses TU102
    2080/Super uses TU104
    1080 Ti uses GP102
    1080 uses GP104
    and on and on and on

Couple that with GDDR6X coming out of left field, and this feels fake to me.
     
  6. trinibwoy

    trinibwoy Meh Legend

Sure, but that was before RDNA and Big Navi. When was the last time AMD had a competitive architecture and a big chip? The 780 Ti and 780 shared the same silicon, and there's a ton of precedent before that.

The GDDR6X stuff is clearly nonsense, though.
     
    A1xLLcqAgt0qc2RyMz0y likes this.
  7. ninelven

    ninelven PM Veteran

    Is Nvidia going to voluntarily sacrifice margins? No.
     
  8. Bondrewd

    Bondrewd Veteran

Between voluntary margin cuts and involuntary market-share loss, Nvidia always picks the former.
     
    A1xLLcqAgt0qc2RyMz0y likes this.
  9. ninelven

    ninelven PM Veteran

    You don't matter.
     
  10. trinibwoy

    trinibwoy Meh Legend

They certainly did that recently in response to the 5600 XT. Can't recall too many times recently when Nvidia was forced to lower prices, though. This could be uncharted territory.
     
  11. Bondrewd

    Bondrewd Veteran

    Super SKUs are the obvious response to anything Navi.
    More GPU for less money, even with more VRAM here and there.
     
    w0lfram and A1xLLcqAgt0qc2RyMz0y like this.
  12. CarstenS

    CarstenS Legend Subscriber

    xpea, John Doe, nnunn and 3 others like this.
  13. A1xLLcqAgt0qc2RyMz0y and pharma like this.
  14. CarstenS

    CarstenS Legend Subscriber

In any case, I hope they will be better than the coolers used on the Turing FEs; the 2080 Tis especially seem to suffer from unnecessarily loud idle at 41% fan PWM.
     
  15. trinibwoy

    trinibwoy Meh Legend

Wouldn't 350W require beefy power delivery and therefore a full-sized PCB?
     
16. [Image: Zotac GTX 1080 Ti Mini]

Not necessarily. This is the Zotac GTX 1080 Ti Mini: 250W TDP, 300W board power limit. I think Nvidia can choose higher-quality VRM components and deliver better efficiency. See the sketch below.
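To illustrate the VRM point with assumed numbers (the 90% and 95% conversion efficiencies are made-up figures for illustration, not measurements):

```python
# How much heat the VRM itself dumps onto the PCB for a given GPU draw,
# at two assumed conversion efficiencies. Illustration only.
def vrm_heat_w(gpu_power_w: float, efficiency: float) -> float:
    """Watts dissipated in the VRM: input power minus delivered power."""
    return gpu_power_w / efficiency - gpu_power_w

for eff in (0.90, 0.95):
    print(f"{eff:.0%} efficient VRM @ 350 W: {vrm_heat_w(350, eff):.0f} W of heat")
# 90% -> ~39 W lost in the VRM; 95% -> ~18 W. Better components roughly
# halve the heat the small PCB has to shed, which is why VRM quality can
# matter more than raw board area.
```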
     
    nnunn, DavidGraham, xpea and 4 others like this.
  17. Scott_Arm

    Scott_Arm Legend

If a 3080 has about the same compute power as a 2080 Ti, maybe with more RT cores and tensor cores, is it really expected to draw more power than a 2080 Ti when it's on a smaller node?
     
    A1xLLcqAgt0qc2RyMz0y likes this.
  18. trinibwoy

    trinibwoy Meh Legend

    Rumor says same number of SMs but nothing about clocks. Maybe they’re pushing clocks and power consumption higher to fend off Navi.

    Rumors also say the 3080 is on Samsung 8nm. Who knows how that compares to TSMC 12nm.
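For a sense of how expensive clock-pushing gets, here's a minimal sketch of the usual CMOS dynamic-power rule of thumb (P ∝ f·V²); the +10% clock and +8% voltage deltas are invented for illustration:

```python
# Dynamic power scales roughly with frequency times voltage squared.
def relative_power(freq_scale: float, volt_scale: float) -> float:
    return freq_scale * volt_scale ** 2

# A hypothetical +10% clock bump that needs +8% more voltage:
print(f"{relative_power(1.10, 1.08):.2f}x power")  # ~1.28x for ~10% more speed
```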
     
    Last edited: Jun 10, 2020
  19. Scott_Arm

    Scott_Arm Legend

I think the RTX 2080 was roughly in the same ballpark as the GTX 1080 Ti in terms of power consumption, so I'm hoping that trend continues and we don't see some massive ballooning of power consumption this generation. I was hoping we'd get large performance increases just from the node transition while keeping power consumption roughly the same.
     
    Cuthalu likes this.
  20. Bondrewd

    Bondrewd Veteran

Oh yeah, they do.
But so does AMD; the age of the speed-demon GPU is nigh.
     