Nvidia Ampere Discussion [2020-05-14]

Discussion in 'Architecture and Products' started by Man from Atlantis, May 14, 2020.

  1. arandomguy

    arandomguy Regular Newcomer

    Luckily vendors typically supply a wide range of graphics cards that fit power budgets all the way down to <100w.
     
    Lightman and PSman1700 like this.
  2. DegustatoR

    DegustatoR Veteran

    But these are terrible too, apparently. The issue seems to be with the wrong manufacturer?
     
    PSman1700 likes this.
  3. Putas

    Putas Regular

    The discussion was about manufacturing nodes, but you started talking about cards. It is not Samsung's fault that Nvidia pushed the chips that far.
     
  4. trinibwoy

    trinibwoy Meh Legend

    I agree. You drew the line though.
     
    PSman1700 likes this.
  5. Quad NVIDIA GeForce RTX 3090 Graphics Cards Tested In A Single System, Insane Performance With Up To 1700W Power Consumption

    https://wccftech.com/quad-nvidia-ge...ed-insane-performance-1700w-power-consumption

    Puget Systems (via Videocardz) has released benchmarks of four GeForce RTX 3090 graphics cards running in a test rig built around an ASUS WS C422 SAGE motherboard, an Intel Xeon W-2255 CPU, and 128 GB of DDR4-3200 memory, with a pair of EVGA 1600W power supplies needed to feed the four Gigabyte GeForce RTX 3090 Turbo cards along with the rest of the PC.

    Performance was measured in OctaneBench, V-Ray Next, Redshift, and Puget Systems' own PugetBench for DaVinci Resolve. The four GPUs did not have NVLink enabled and communicated only over the PCIe bus. The explicit multi-GPU support in professional applications lets users take full advantage of all four GPUs without any additional driver support.

    As for performance in the said applications, all of the benchmarks below report near-perfect scaling across the four GeForce RTX 3090 graphics cards. Single-card performance is already notably higher than any previously released solution, but running more than one RTX 3090 can improve performance by a huge factor for users such as content creators who rely heavily on these applications.
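    The "near-perfect scaling" claim is easy to sanity-check with simple arithmetic. A minimal sketch; the scores below are made-up placeholders for illustration, not Puget's actual numbers:

```python
def scaling_efficiency(single_gpu_score: float, multi_gpu_score: float, num_gpus: int) -> float:
    """Fraction of ideal linear scaling achieved (1.0 = perfect)."""
    return multi_gpu_score / (single_gpu_score * num_gpus)

# Hypothetical OctaneBench-style scores, for illustration only.
one_card = 660.0
four_cards = 2570.0
print(f"{scaling_efficiency(one_card, four_cards, 4):.1%}")  # ~97% of ideal 4x scaling
```

    Anything above roughly 95% here is what reviewers usually call "near-perfect" scaling; render benchmarks of this kind scale well because each GPU works on independent tiles or frames with little inter-GPU traffic.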
     
    nnunn, Lightman and pharma like this.
  6. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■) Moderator Legend Alpha

    *ahem* Grow up before I start banning.
     
  7. CarstenS

    CarstenS Legend Subscriber

    In case your HDMI 2.1 home theatre only shows a black screen, you might want to try it without the AV receiver in order to isolate the error:
    https://forum.beyond3d.com/threads/hdmi-2-1-chip-used-in-many-av-receivers-probably-buggy.62083/

    This probably only concerns people with displays beyond 4K60, since it affects the uncompressed Fixed Rate Link modes, where the full 48 Gbps can be reached. With compressed FRL, it seems to work.
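    As a rough illustration of why only high-bandwidth modes are affected, raw uncompressed video bandwidth can be estimated as width × height × refresh × bits-per-pixel. This simple estimate ignores blanking intervals and FRL encoding overhead, so real link requirements are somewhat higher:

```python
def raw_bandwidth_gbps(width: int, height: int, refresh_hz: int, bits_per_pixel: int) -> float:
    """Approximate uncompressed video bandwidth in Gbit/s (ignores blanking/overhead)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(raw_bandwidth_gbps(3840, 2160, 120, 30))  # 4K120, 10-bit RGB: ~29.9 Gbps raw
print(raw_bandwidth_gbps(7680, 4320, 60, 30))   # 8K60, 10-bit RGB: ~59.7 Gbps, above 48
```

    4K60 modes fit comfortably within older HDMI 2.0 bandwidth, which is why only the faster FRL modes exercise the buggy path; 8K-class signals exceed even 48 Gbps raw and need Display Stream Compression.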

    Thought I'd post that here, since RTX 30 is probably the first graphics card series affected by this.
     
    Lightman, BRiT and Jawed like this.
  8. Jubei

    Jubei Regular

  9. Kyyla

    Kyyla Veteran

    I have no idea why you are waiting for something I haven't been talking about. Would you find the card not terrible at 400W? 500W? There is always a bit of extra performance to be had.
     
  10. Kyyla

    Kyyla Veteran

    Sorry thought this was the Ampere thread, not the manufacturing nodes thread. Agreed about Samsung though.
     
    Cuthalu likes this.
  11. Jawed

    Jawed Legend

    Launches December?

    And, now, 3070Ti 10GB GDDR6 for $500? Launches early November?

    3080 10GB pushed down to $600? But only in December, alongside 3080Ti?
     
  12. trinibwoy

    trinibwoy Meh Legend

    One thing is clear. Things aren’t going Nvidia’s way. Clearly they’re having issues making their own cards and now AMD drops a competitive architecture at the same time. Unless they can get volumes up they won’t have any room to maneuver.
     
    Cuthalu and Lightman like this.
  13. Jawed

    Jawed Legend

    4K "Ultimate" settings (though I think anisotropic filtering is left at Medium because he experiences a problem with higher):



    At the start of gameplay, it's about 10GB of VRAM used. At 2:35 it hits 12GB and goes higher. So, 10GB cards are going to struggle it seems.
     
    Lightman likes this.
  14. Rootax

    Rootax Veteran

    Well, that's VRAM allocated if I'm not mistaken, not VRAM actually "used"?

    But yeah, some games will be smoother with a little more VRAM for sure, even if it doesn't show up in benchmarks. I mean, the biggest benefit I got from going to a Vega FE with 16 GB was eliminating some stutters here and there in games using a lot of VRAM, like FFXV with the HD texture pack, W3 with some mods, etc. I guess swapping assets/textures in and out is not a smooth thing...
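    The allocated-vs-used distinction can be pictured with a toy pool allocator. This is a purely illustrative sketch, not how any real driver tracks VRAM: a game reserves a large pool up front (what monitoring tools report), while only a fraction of it holds live assets at any moment.

```python
class ToyVramPool:
    """Toy model: 'allocated' is the reserved pool, 'used' is live resource bytes."""
    def __init__(self, pool_bytes: int):
        self.allocated = pool_bytes   # reserved up front; what overlays typically show
        self.used = 0                 # bytes actually holding live textures/buffers

    def load_texture(self, nbytes: int) -> None:
        if self.used + nbytes > self.allocated:
            raise MemoryError("pool exhausted")
        self.used += nbytes

    def evict(self, nbytes: int) -> None:
        self.used = max(0, self.used - nbytes)

pool = ToyVramPool(12 * 1024**3)   # game reserves 12 GB
pool.load_texture(8 * 1024**3)     # only 8 GB of assets are actually live
print(pool.allocated // 1024**3, "GB allocated,", pool.used // 1024**3, "GB used")
```

    In this model a monitoring overlay would report 12 GB "in use" even though evicting cold assets would free a third of it without any visible hitching, which is why allocation numbers alone don't prove a 10 GB card would struggle.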
     
    Lightman likes this.
  15. trinibwoy

    trinibwoy Meh Legend

    Not the VRAM allocated vs used thing again....
     
  16. Erinyes

    Erinyes Regular

    I think they're still in a reasonably good position tbh. Unless TSMC can significantly ramp up capacity in the short term, Nvidia should (if they can sort out their current issues) simply be able to produce more cards than AMD. Even if RDNA2 turns out to be superior to Ampere, AMD will not be able to meet demand. Given the current market scenario, both can afford to hold prices and still sell out all they can produce.
     
    PSman1700 and pharma like this.
  17. Kaotik

    Kaotik Drunk Member Legend

    One of their biggest customers has moved over to 5nm already, another is out completely. Is there anything solid saying there are capacity issues at 7nm?
     
  18. Ext3h

    Ext3h Regular

    No, nothing. Well, capacities for the high end nodes are always high in demand. And you can be certain AMD has booked the capacities they expect to need, same as everyone else using that node right now.

    Which is also why NVidia is effectively forced to stick with Samsung until Q1 or Q2 2021, when some capacities are expected to be free due to Huawei bans and aftershock from this year's lockdown. You can't just cut the line, everything is booked.

    Before that, NVidia has only booked sufficient capacity for their professional-series chips, and could not afford to serve consumer chips from that allocation. Not just because that would require rushing a tape-out, but because they have only booked so much capacity, and they can only use it for the highest-margin option.
     
    Lightman likes this.
  19. pharma

    pharma Veteran

    Do you have a link for Nvidia's TSMC booked capacity? The only statement made was from Jensen who indicated Samsung would be handling a small portion of Ampere capacity, and TSMC handling the major portion.

    I find it strange that leakers found many AIBs had listed 7nm TSMC instead of 8nm Samsung on their Ampere product listings. I would not doubt that Nvidia has 7nm capacity at this point.
     
  20. troyan

    troyan Regular

    He doesn't have any information. Nvidia chose Samsung because of the supply situation. They will book every free TSMC wafer for their GA100 and NVSwitch products.
     
    Ext3h likes this.