Nvidia Ampere Discussion [2020-05-14]

Discussion in 'Architecture and Products' started by Man from Atlantis, May 14, 2020.

  1. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    15,134
    Likes Received:
    7,678
If you hit the power limit, your clock and voltage drop, but utilizing the full width of the GPU will still result in better performance.
     
    PSman1700 likes this.
  2. ninelven

    Veteran

    Joined:
    Dec 27, 2002
    Messages:
    1,742
    Likes Received:
    152
Yep, clock vs power is non-linear. Larger chips can thus be more efficient, up to a point, and depending on what the primary cap on performance is.
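The non-linearity can be sketched with the common first-order dynamic-power model P ≈ units · V² · f, where voltage has to rise roughly with frequency. All numbers below are made up purely for illustration, not measured from any real GPU:

```python
# Illustrative first-order model of dynamic GPU power: P ~ units * V^2 * f.
# Voltage must scale up with frequency, so power grows super-linearly with
# clock while performance grows only linearly. Numbers are invented.

def perf_and_power(units, freq_ghz, volt):
    perf = units * freq_ghz             # work per second (arbitrary units)
    power = units * volt**2 * freq_ghz  # dynamic power (arbitrary units)
    return perf, power

# Narrow, fast chip: fewer units pushed to high clock and voltage.
narrow_perf, narrow_power = perf_and_power(units=40, freq_ghz=2.0, volt=1.1)

# Wide, slow chip: twice the units at lower clock and voltage.
wide_perf, wide_power = perf_and_power(units=80, freq_ghz=1.2, volt=0.85)

print(f"narrow: perf={narrow_perf:.0f}, perf/W={narrow_perf/narrow_power:.2f}")
print(f"wide:   perf={wide_perf:.0f}, perf/W={wide_perf/wide_power:.2f}")
```

Under these toy numbers the wide chip delivers both more performance and better performance per watt, which is the "up to a point" trade being described.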
     
  3. OlegSH

    Regular

    Joined:
    Jan 10, 2010
    Messages:
    797
    Likes Received:
    1,622
    That's impressive.
    I see the most gains come from compute workloads.
    Mesh shaders have the same execution model as compute shaders but with direct interface to rasterizers.
    Here are other examples where Ampere scales almost linearly with flops, all compute workloads:
    https://www.ixbt.com/img//x1600/r30/00/02/33/56/d3d1210nbodygravity64k.png
    https://www.ixbt.com/img/r30/00/02/33/56/vray_668771.png
    https://www.pugetsystems.com/pic_disp.php?id=63679&width=800
    https://babeltechreviews.com/wp-content/uploads/2020/09/Sandra-2020.jpg
In the case of Sandra, image processing and many other kernels are all ~2x faster than the 2080 Ti.
     
    LeStoffer, nnunn, Jawed and 5 others like this.
  4. Frenetic Pony

    Regular

    Joined:
    Nov 12, 2011
    Messages:
    807
    Likes Received:
    478
    Thus showing what the arch is really for. It's really a great deal for anyone looking for rendering, or even tooling around with machine learning. Those matrix multiplication numbers are amazing, especially for $700, though maybe that's why the 24gb model costs $1500. Still a good deal, I can see AI researchers snapping them up when they're available.

    I mean, if the CDNA leaks are true it's going to have a hard time competing with this. Maybe it wasn't Nvidia's intention, but they seem to have produced a compute monster for mass market gaming prices.
     
    #1744 Frenetic Pony, Sep 20, 2020
    Last edited: Sep 20, 2020
    nnunn likes this.
  5. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    What else could it have been?
     
    Cuthalu likes this.
  6. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,240
    Likes Received:
    3,393
Who knows? Maybe they've got some inventory of 2GB modules for the 3090 launch already. We hadn't even heard about G6X a month ago.
This confirms it though - and also probably makes the 3080 20GB a 2021 product.
     
  7. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,462
    Location:
    Finland
    Why wouldn't they just clamshell it like 3090?
     
  8. pcchen

    pcchen Moderator
    Moderator Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    3,018
    Likes Received:
    581
    Location:
    Taiwan
It’s probably just because of cost. I’m not completely sure, but it’ll likely need more pins to connect more devices in x8 mode, as the command pins will double. NVIDIA probably doesn’t want to make a different package for the 3080 just for a 20GB SKU, and would prefer to wait for Micron to make 16Gb devices instead.
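The capacity arithmetic behind the clamshell-vs-denser-device trade-off can be sketched as follows (my own back-of-the-envelope figures for the 3080's launch configuration, not NVIDIA specifications):

```python
# Back-of-the-envelope GDDR6X capacity math (assumed figures, not specs).
BUS_WIDTH_BITS = 320      # RTX 3080 memory bus
DEVICE_WIDTH_BITS = 32    # one GDDR6X device in x16 mode
channels = BUS_WIDTH_BITS // DEVICE_WIDTH_BITS  # 10 channels

GB_PER_8GBIT_DEVICE = 8 / 8  # 8Gb devices available at launch = 1GB each

# One device per channel (x16 mode): 10 x 1GB = 10GB.
single_sided = channels * GB_PER_8GBIT_DEVICE

# Clamshell: two devices share each channel, each running in x8 mode = 20GB.
clamshell = 2 * channels * GB_PER_8GBIT_DEVICE

# Alternative: wait for 16Gb (2GB) devices, keep one per channel = 20GB.
denser = channels * (16 / 8)

print(single_sided, clamshell, denser)
```

Both routes reach 20GB; the post's point is that the clamshell route needs a different package/board, while the 16Gb route only needs Micron's denser parts.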
     
    Lightman, pharma and PSman1700 like this.
  9. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,240
    Likes Received:
    3,393
    Would add too much cost for the card to still be a "3080" IMO.
     
  10. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,502
    Likes Received:
    24,397
    Initial troubles with LG TVs and RTX Ampere cards?

    https://www.forbes.com/sites/johnar...st-nvidia-rtx-30-graphics-cards/#3c616208267a

    LG OLED TVs Having Issues With Latest Nvidia RTX 30 Graphics Cards
    It’s long been feared that the crazy complications of the latest HDMI 2.1 format coupled with the big forward leap in graphics quality being offered by the next generation of PCs and games consoles would cause serious compatibility issues. And unfortunately it seems that one of the most talked about next-gen combinations, Nvidia’s new RTX 30 graphics cards and LG’s 2019 and 2020 OLED TVs, has fallen at the first hurdle.​

    Owners of both LG 9 and X series OLEDs are reporting that the TVs aren’t handling the highest quality outputs properly from their new RTX 30 Series cards - even though, on paper at least, they should. In fact, LG has often talked up the gaming potential of its recent OLED TVs, as opened up by the high bandwidth support of the 48Gbps HDMIs on its 2019 9 series OLEDs, and the 40Gbps HDMIs on its 2020 X series OLEDs.

    The two main problems being reported appear to be as follows. First, users of both the LG OLED 9 and X series are reporting a complete loss of picture (a black screen) when attempting to apply Nvidia’s G-Sync variable refresh rate technology at 120Hz frame rates. This occurs regardless of which bit depth or resolution you choose.

    The second issue seems to be restricted to X series models, and finds the TVs reducing signals output in RGB/120Hz/4:4:4 to 4:2:2 chroma subsampling. This happens irrespective of whether you have G-Sync active or not, or which output resolution you have selected. And it results in notable image degradation - as shown in the examples I was kindly allowed to reproduce here by Twitter user Sixi82.

    ...​
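The bandwidth stakes in the article can be made concrete with rough uncompressed data-rate math. This counts active pixels only and ignores blanking intervals and HDMI 2.1 FRL encoding overhead, so the real link-rate requirements are somewhat higher:

```python
# Rough uncompressed video data-rate math (active pixels only; real HDMI
# links add blanking and FRL encoding overhead on top of these figures).

def data_rate_gbps(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 10-bit RGB / 4:4:4 carries three full-resolution components: 30 bpp.
rgb_444 = data_rate_gbps(3840, 2160, 120, 30)

# 10-bit 4:2:2 halves the chroma resolution: 20 bpp effective.
ycbcr_422 = data_rate_gbps(3840, 2160, 120, 20)

print(f"4K120 10-bit RGB/4:4:4: ~{rgb_444:.1f} Gbps")
print(f"4K120 10-bit 4:2:2:     ~{ycbcr_422:.1f} Gbps")
```

This shows why the forced drop to 4:2:2 matters: it cuts the payload by a third, which a 40Gbps or 48Gbps port should not need for 4K120.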
     
  11. RedVi

    Regular

    Joined:
    Sep 12, 2010
    Messages:
    407
    Likes Received:
    59
    Location:
    Australia
    So 18 months on there is still no video card that can properly drive my LG C9 65" with variable refresh rates at 4K...

LG's decision not to add support for AMD FreeSync on C9s was bad enough (since they were first advertised as an 'adaptive sync' capable TV, well before they were G-Sync certified).
     
  12. Rootax

    Veteran

    Joined:
    Jan 2, 2006
    Messages:
    2,400
    Likes Received:
    1,845
    Location:
    France
The C9 supports VRR, right? So it should work OK with AMD cards (even if VRR has other problems: https://www.forbes.com/sites/johnar...uesbut-cant-promise-a-quick-fix/#57fba4ef13af )
     
    DegustatoR likes this.
  13. Voxilla

    Regular

    Joined:
    Jun 23, 2007
    Messages:
    832
    Likes Received:
    505
    #1754 Voxilla, Sep 21, 2020
    Last edited: Sep 21, 2020
    CarstenS likes this.
  14. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    Does it say there explicitly it's full speed mixed precision? I didn't find any concrete evidence.
     
  15. Voxilla

    Regular

    Joined:
    Jun 23, 2007
    Messages:
    832
    Likes Received:
    505
    https://developer.nvidia.com/titan-rtx
Built on the Turing architecture, it features 4608 CUDA cores and 576 full-speed mixed precision Tensor Cores for accelerating AI
     
    Lightman and CarstenS like this.
  16. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    I'm blind it seems. Thanks.
     
  17. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,708
    Likes Received:
    2,132
    Location:
    London
I was under the impression that the 2000 series GPUs already had G-Sync working with the 2019 LG OLEDs.

    Actually this is AMD's fault, not LG's. AMD promised it would support VRR on HDMI. Freesync and VRR are not precisely the same thing.

    On the LG 2020 TVs, Freesync only works when you turn off some other feature on the TV (I can't actually remember what that feature is, sorry) because Freesync conflicts with the "flag" that is normally used for that feature.

Why consoles with AMD GPUs have VRR support, but PCs with AMD GPUs don't, seems to be down to AMD, not LG.
     
    RedVi likes this.
  18. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,240
    Likes Received:
    3,393
    There seems to be a driver bug which affects all VRR enabled GPUs right now. Hopefully it will be fixed on NV side and won't require firmware updates from LG.
     
  19. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,708
    Likes Received:
    2,132
    Location:
    London
    Damn, I'd totally forgotten about ixbt:

    https://www.ixbt.com/3dv/nvidia-geforce-rtx-3080-review-part1.html

    Really excellent graphics card reviews, not the weak sauce of pretty much all English language sites. Translation to English is so good these days, too!

Results with the Perlin noise test, which historically was very useful for pure compute comparisons, show a significant problem for Ampere: it's only 12% faster. It's a long shader with a fair amount of instruction-to-instruction dependency. At least that's what the 3DMark 06 version shows, which is about 500 instructions. I don't have the source code for the 3DMark Vantage Perlin Noise test, though...
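The dependency point can be illustrated with a toy issue model (entirely made-up numbers, not a simulator of either architecture): sustaining full issue requires enough independent instructions in flight, and a long serial dependency chain per thread supplies very few of them.

```python
# Toy issue-rate model (invented numbers, not a real GPU simulator).
# To sustain full issue, issue_width * latency instructions must be in
# flight; each independent instruction stream (a parallel dependency
# chain, e.g. from another resident warp) contributes at most one.

def achieved_ipc(issue_width, latency, independent_streams):
    in_flight = min(independent_streams, issue_width * latency)
    return in_flight / latency

# Narrow machine, limited independent work: fully saturated.
print(achieved_ipc(issue_width=1, latency=4, independent_streams=4))   # 1.0

# Doubled issue width, same limited work: no gain, the chains can't fill it.
print(achieved_ipc(issue_width=2, latency=4, independent_streams=4))   # 1.0

# Doubled issue width with enough independent streams: throughput doubles.
print(achieved_ipc(issue_width=2, latency=4, independent_streams=16))  # 2.0
```

Under this sketch, a doubled-FP32 design only pays off when a shader offers enough instruction-level parallelism or occupancy, which would be consistent with a long, dependent shader like Perlin noise scaling poorly.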
     
    sonen, Lightman and trinibwoy like this.