Nvidia's 3000 Series RTX GPU [3050, 3060, 3070, 3080, 3090 now with TIs]

Discussion in 'Architecture and Products' started by Shortbread, Sep 1, 2020.

  1. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,112
    Location:
    New York
    I often see people claiming Steam stats are flawed. What I never see is an explanation of why those flaws would favor one product over another in a large, random sample.
     
    Cuthalu, HLJ, pharma and 3 others like this.
  2. neckthrough

    Newcomer

    Joined:
    Mar 28, 2019
    Messages:
    138
    Likes Received:
    388
    Unintentional sampling bias is always possible. However, I agree with you that I have not seen any rigorous examination of these flaws or an evaluation of their statistical significance. Given that Steam's results are largely consistent with other studies, there are only three possibilities:
    (1) the flaws are statistically insignificant
    (2) the flaws are statistically significant but cancel each other out so perfectly that the results are aligned with other studies
    (3) all studies are flawed in statistically significant but correlated ways

    Of these, (1) seems to be the most likely explanation to me. The others are also possible (I've personally conducted experiments in which (2) has occurred in hilarious ways), but less likely.
     
    Cuthalu, pjbliverpool, HLJ and 2 others like this.
  3. techuse

    Veteran

    Joined:
    Feb 19, 2013
    Messages:
    1,424
    Likes Received:
    908
  4. HLJ

    HLJ
    Regular

    Joined:
    Aug 26, 2020
    Messages:
    529
    Likes Received:
    869
    This comment on the video made me lol:

    "Time to throw the 3090 in the trash and invest in RX 5700 XT"

    And if you pair an RTX 3090 with a Ryzen 5 1600X...well :lol:
     
    Lightman and PSman1700 like this.
  5. Lurkmass

    Regular

    Joined:
    Mar 3, 2020
    Messages:
    565
    Likes Received:
    711
    The D3D12 binding model causes some grief on Nvidia HW. Microsoft left STATIC descriptors out of Root Signature 1.0; that got fixed in RS 1.1, but hardly any developers use RS 1.1, so in the end Nvidia likely ship app profiles or game-specific hacks in their drivers. Mismatched descriptor types are technically undefined behaviour in D3D12, yet there are now cases in shipping games where shaders use sampler descriptors in place of UAV descriptors and somehow it works without crashing! No one has any idea what workaround Nvidia is applying.
     
    #605 Lurkmass, Mar 11, 2021
    Last edited: Mar 11, 2021
    fellix, Lightman, PSman1700 and 6 others like this.
  6. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,240
    Likes Received:
    3,397
    Has nothing to do with driver overhead and everything to do with the fact that these "low level APIs" happen to be in AMD sponsored titles. (Haven't watched the video yet.)
     
    PSman1700 likes this.
  7. HLJ

    HLJ
    Regular

    Joined:
    Aug 26, 2020
    Messages:
    529
    Likes Received:
    869
    When I skimmed it they were doing 1080p/1440p...an RTX 3090 at 1080p is doing it wrong ;)
     
    PSman1700 likes this.
  8. Lurkmass

    Regular

    Joined:
    Mar 3, 2020
    Messages:
    565
    Likes Received:
    711
    On Vulkan, the binding model isn't too bad on Nvidia HW. The validation layers on Vulkan would catch the mismatched descriptor types, preventing a lot of headaches for their driver team compared to D3D12 ...
     
    Lightman, PSman1700 and pjbliverpool like this.
  9. techuse

    Veteran

    Joined:
    Feb 19, 2013
    Messages:
    1,424
    Likes Received:
    908
    Except it happens in Watch Dogs Legion too, and in pretty much all low level API games they have tested.

    Nvidia has known about the DX12 specification for a decade, give or take. Whose fault is it at this point that their hardware has issues?
     
    #609 techuse, Mar 11, 2021
    Last edited: Mar 11, 2021
  10. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,240
    Likes Received:
    3,397
    Hence my remark about not watching the video yet.
     
  11. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,465
    Location:
    Finland
    Of course, because NVIDIA can never be at fault, right? :rolleyes:
    They tested in Horizon Zero Dawn and Watch Dogs Legion, the latter being an NVIDIA-sponsored title.
    Yes, you said you didn't watch the video yet, but you still had to go and claim it was a biased test just in case.

    edit: just for the sake of it, the difference is even more dramatic in Watch Dogs Legion, where even a 5600 XT beats the RTX 3070 on a Ryzen 5 1600X, 2600X and Core i3-10100 at 1080p medium settings (at 1080p ultra the 5600 XT becomes the bottleneck regardless of CPU)

    Not when you're examining CPU load between cards, and they used cards from other performance tiers too.
     
    CeeGee, Lightman and Wesker like this.
  12. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,240
    Likes Received:
    3,397
    If you get worse results in DX12 than in DX11, then it's the software's fault, since it is worse at managing the h/w than the DX11 driver is.
    So to answer your question: no, this isn't a "fault" of Nvidia.
     
    PSman1700 and DavidGraham like this.
  13. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,502
    Likes Received:
    24,399
    How does saying it's a software fault preclude there being possible driver issues? How do you rule out that possibility?
     
  14. HLJ

    HLJ
    Regular

    Joined:
    Aug 26, 2020
    Messages:
    529
    Likes Received:
    869
    So how do the results look at 4K, ultra settings?
    You know, how a person with an RTX 3090 will most likely play the game?
     
    PSman1700 likes this.
  15. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,465
    Location:
    Finland
    The CPU is unlikely to be the bottleneck in any way at those settings, but that's completely irrelevant to this discussion. It's not an RTX 3090 issue, not even an Ampere issue, since the RTX 2080 Ti behaved the same way.
     
    Cuthalu and Wesker like this.
  16. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,240
    Likes Received:
    3,397
    The DX12 driver is made precisely so that it won't be an issue. Considering the amount of attention low-level APIs get, and the fact that you need them for RT, I think it's rather unlikely to be a driver issue of such magnitude.

    As Lurkmass said above, this is likely a mismatch of h/w capabilities and s/w, which is the result of badly engineered s/w in the first place.
     
    PSman1700 likes this.
  17. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,465
    Location:
    Finland
    The issues with "badly engineered s/w" aren't present on AMD in this case, so clearly it is an NVIDIA-specific issue for now (until Intel arrives with their cards and we get results for those).
    NVIDIA designed their hardware and drivers to support the API knowing its faults, yet AMD isn't suffering from similar issues, which makes this either an NVIDIA driver or hardware issue.
     
    Wesker and Putas like this.
  18. Lurkmass

    Regular

    Joined:
    Mar 3, 2020
    Messages:
    565
    Likes Received:
    711
    It could go multiple ways ...

    Microsoft chose a binding model which favors AMD HW, so other vendors don't really have a choice. Similarly, Microsoft standardized dumb features like ROVs, which run badly on AMD. Sometimes Microsoft takes a "no compromise" approach to these things, so hardware vendors are bound to find some features difficult to implement ...

    Developers could also stop relying on "undefined behavior", because that's what they're doing, so it's not unreasonable to see performance pitfalls or even crashes ...

    Maybe Nvidia could very well be at fault, since they keep making hacks in their drivers instead of designing their HW to be more closely in line with AMD's bindless model ...

    Who really knows at this point since the ship has sailed ...
     
  19. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,240
    Likes Received:
    3,397
    Shocking news - AMD GPUs aren't the same as Nvidia's and vice versa.
    More shocking news for you - not everything AMD does in GPUs is great, and thus not everything can or should be copied.
    In other words - this means about nothing.

    Every h/w has faults, it's the developers who should be aware of them and avoid them.

    Nope, it makes it purely a s/w issue - since there are no such faults when using other APIs on the same h/w.
     
    PSman1700, DavidGraham and HLJ like this.
  20. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    15,134
    Likes Received:
    7,679
    Not really. 1080p 360 Hz monitors exist. Try pushing max frames in COD Warzone. You need a pretty high-end GPU paired with a high-end CPU to make it work.
     


  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.