Nvidia Ampere Discussion [2020-05-14]

Discussion in 'Architecture and Products' started by Man from Atlantis, May 14, 2020.

  1. arandomguy

    Regular Newcomer

    Joined:
    Jul 27, 2020
    Messages:
    256
    Likes Received:
    364
    Luckily vendors typically supply a wide range of graphics cards that fit power budgets all the way down to <100w.
     
    Lightman and PSman1700 like this.
  2. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,242
    Likes Received:
    3,405
    But these are apparently terrible too. The issue seems to be the wrong manufacturer, then?
     
    PSman1700 likes this.
  3. Putas

    Regular

    Joined:
    Nov 7, 2004
    Messages:
    738
    Likes Received:
    355
    The discussion was about manufacturing nodes, but you started talking about cards. It is not Samsung's fault that Nvidia pushed the chips that far.
     
  4. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,058
    Likes Received:
    3,116
    Location:
    New York
    I agree. You drew the line though.
     
    PSman1700 likes this.
  5. A1xLLcqAgt0qc2RyMz0y

    Veteran

    Joined:
    Feb 6, 2010
    Messages:
    1,589
    Likes Received:
    1,490
    Quad NVIDIA GeForce RTX 3090 Graphics Cards Tested In A Single System, Insane Performance With Up To 1700W Power Consumption

    https://wccftech.com/quad-nvidia-ge...ed-insane-performance-1700w-power-consumption

    Puget Systems (via Videocardz) have released benchmarks of four GeForce RTX 3090 graphics cards running in a test rig built around an ASUS WS C422 SAGE motherboard, an Intel Xeon W-2255 CPU, and 128 GB of DDR4-3200 memory, with a pair of EVGA 1600W power supplies feeding the four Gigabyte GeForce RTX 3090 Turbo cards along with the rest of the PC.

    Performance was measured in OctaneBench, V-Ray Next, Redshift, and Puget Systems's own PugetBench for DaVinci Resolve. The four GPUs did not have NVLink enabled and communicated only over the PCIe bus. The explicit multi-GPU support in professional applications lets users take full advantage of all four GPUs without any additional driver support.

    As for performance in these applications, all of the benchmarks report near-perfect scaling across the four GeForce RTX 3090 cards. Single-card performance is already notably higher than any previously released solution, but running more than one RTX 3090 can improve performance by a huge factor for users such as content creators who rely heavily on these applications.
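    "Near-perfect scaling" can be sanity-checked with a quick calculation. The scores below are made-up placeholders for illustration, not Puget's published results:

```python
def scaling_efficiency(single_score, multi_score, n_gpus):
    """Fraction of ideal linear scaling achieved (1.0 = perfect)."""
    return multi_score / (single_score * n_gpus)

# Hypothetical benchmark scores (not Puget's actual numbers):
# one RTX 3090 scoring 660, four of them scoring 2560 together.
eff = scaling_efficiency(660.0, 2560.0, 4)
print(f"{eff:.1%}")  # roughly 97% of ideal 4x scaling
```

    Anything above ~95% is usually called near-perfect; PCIe-only communication barely matters for renderers that split work into independent tiles or frames.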
     
    nnunn, Lightman and pharma like this.
  6. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,511
    Likes Received:
    24,411
    *ahem* Grow up before I start banning.
     
  7. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    In case your HDMI 2.1 home theatre only shows black screen, you might want to try it without the AV-receiver in order to isolate the error:
    https://forum.beyond3d.com/threads/hdmi-2-1-chip-used-in-many-av-receivers-probably-buggy.62083/

    This probably only concerns people with displays beyond 4K60, since it affects uncompressed Fixed Rate Link (FRL) signalling, where you can reach the full 48 Gbps. With compressed FRL, it seems to work.

    Thought I'd post that here, since the RTX 30 series is the first set of graphics cards likely affected by this.
     
    Lightman, BRiT and Jawed like this.
  8. Jubei

    Regular

    Joined:
    Dec 10, 2011
    Messages:
    559
    Likes Received:
    198
  9. Kyyla

    Veteran

    Joined:
    Jul 2, 2003
    Messages:
    1,109
    Likes Received:
    496
    Location:
    Finland
    I have no idea why you are waiting for something I haven't been talking about. Would you find the card not terrible at 400W? 500W? There is always a bit of extra performance to be had.
     
  10. Kyyla

    Veteran

    Joined:
    Jul 2, 2003
    Messages:
    1,109
    Likes Received:
    496
    Location:
    Finland
    Sorry thought this was the Ampere thread, not the manufacturing nodes thread. Agreed about Samsung though.
     
    Cuthalu likes this.
  11. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,714
    Likes Received:
    2,135
    Location:
    London
    Launches December?

    And, now, 3070Ti 10GB GDDR6 for $500? Launches early November?

    3080 10GB pushed down to $600? But only in December, alongside 3080Ti?
     
  12. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,058
    Likes Received:
    3,116
    Location:
    New York
    One thing is clear. Things aren’t going Nvidia’s way. Clearly they’re having issues making their own cards and now AMD drops a competitive architecture at the same time. Unless they can get volumes up they won’t have any room to maneuver.
     
    Cuthalu and Lightman like this.
  13. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,714
    Likes Received:
    2,135
    Location:
    London
    4K "Ultimate" settings (though I think anisotropic filtering is left at Medium because he runs into a problem at higher settings):

    At the start of gameplay it uses about 10GB of VRAM. At 2:35 it hits 12GB and goes higher. So it seems 10GB cards are going to struggle.
     
    Lightman likes this.
  14. Rootax

    Veteran

    Joined:
    Jan 2, 2006
    Messages:
    2,401
    Likes Received:
    1,845
    Location:
    France
    Well, if I'm not mistaken that's VRAM allocated, not actually "used"?

    But yeah, some games will definitely be smoother with a little more VRAM, even if it doesn't show up in benchmarks. The biggest benefit I got from moving to a Vega FE with 16GB was eliminating some stutters here and there in games that use a lot of VRAM, like FFXV with the HD texture pack, The Witcher 3 with some mods, etc. I guess swapping assets/textures in and out is not a smooth thing...
     
    Lightman likes this.
  15. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,058
    Likes Received:
    3,116
    Location:
    New York
    Not the VRAM allocated vs used thing again....
     
  16. Erinyes

    Regular

    Joined:
    Mar 25, 2010
    Messages:
    808
    Likes Received:
    276
    I think they're still in a reasonably good position tbh. Unless TSMC can significantly ramp up capacity in the short term, Nvidia should (if they can sort out their current issues) simply be able to produce more cards than AMD. Even if RDNA2 turns out to be superior to Ampere, AMD will not be able to meet demand. Given the current market scenario, both can afford to hold prices and still sell out all they can produce.
     
    PSman1700 and pharma like this.
  17. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,465
    Location:
    Finland
    One of their biggest customers has already moved to 5nm, and another is out completely. Is there anything solid saying there are capacity issues at 7nm?
     
  18. Ext3h

    Regular

    Joined:
    Sep 4, 2015
    Messages:
    428
    Likes Received:
    497
    No, nothing concrete. That said, capacity on leading-edge nodes is always in high demand, and you can be certain AMD has booked the capacity it expects to need, same as everyone else on that node right now.

    Which is also why NVidia is effectively forced to stick with Samsung until Q1 or Q2 2021, when some capacity is expected to free up due to the Huawei ban and the aftershocks of this year's lockdowns. You can't just cut in line; everything is booked.

    Before that, NVidia has only booked enough capacity for its professional-series chips and can't afford to serve consumer chips from that allocation. Not just because that would require rushing a tape-out, but because they have only booked so much capacity, and they can only use it for the highest-margin option.
     
    Lightman likes this.
  19. pharma

    Veteran

    Joined:
    Mar 29, 2004
    Messages:
    4,891
    Likes Received:
    4,539
    Do you have a link for Nvidia's TSMC booked capacity? The only statement made was from Jensen who indicated Samsung would be handling a small portion of Ampere capacity, and TSMC handling the major portion.

    I find it strange that leakers found many AIBs had listed 7nm TSMC instead of 8nm Samsung on their Ampere product listings. I would not doubt that Nvidia has 7nm capacity at this point.
     
  20. troyan

    Regular

    Joined:
    Sep 1, 2015
    Messages:
    605
    Likes Received:
    1,126
    He doesn't have any information. Nvidia chose Samsung because of the supply situation. They will book every free TSMC wafer for their GA100 and NVSwitch products.
     
    Ext3h likes this.