Nvidia's 3000 Series RTX GPU [3050, 3060, 3070, 3080, 3090 now with TIs]

Discussion in 'Architecture and Products' started by Shortbread, Sep 1, 2020.

Tags:
  1. gamervivek

    Regular

    Joined:
    Sep 13, 2008
    Messages:
    805
    Likes Received:
    320
    Location:
    india
    So PCAT has been really useful for nvidia there, the total system power consumption difference from the second-highest power hog was around 80W.
     
  2. Shortbread

    Shortbread Island Hopper
    Legend

    Joined:
    Jul 1, 2013
    Messages:
    5,632
    Likes Received:
    4,920
Maybe so. But VRS is also available on Nvidia's Turing and Ampere architectures, and they don't seem to be leaning into it much, not like DLSS anyhow.
     
  3. Malo

    Malo Yak Mechanicum
    Legend Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    8,929
    Likes Received:
    5,529
    Location:
    Pennsylvania
    $/fps is insanely high for 3080 of course, only beaten by 5700XT.
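Price-per-frame is just MSRP divided by average benchmark fps. A quick sketch of that arithmetic, using launch MSRPs but made-up placeholder fps figures rather than real benchmark results:

```python
# Toy $/fps comparison. Prices are launch MSRPs; the fps numbers are
# invented placeholders for illustration, not measured results.
cards = {
    "RTX 3080":    {"price": 699,  "avg_fps": 100.0},
    "RTX 2080 Ti": {"price": 1199, "avg_fps": 77.0},
}

for name, c in cards.items():
    dollars_per_fps = c["price"] / c["avg_fps"]
    print(f"{name}: ${dollars_per_fps:.2f} per fps")
```

Lower is better, which is why a cheap card with decent throughput can top this metric.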
     
    digitalwanderer likes this.
  4. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    22,146
    Likes Received:
    8,533
    Location:
    ಠ_ಠ
    Should I pull the trigger? Still kinda concerned about 10GB.

    Or maybe I buy that rowing machine...
     
    digitalwanderer likes this.
  5. manux

    Veteran

    Joined:
    Sep 7, 2002
    Messages:
    3,034
    Likes Received:
    2,276
    Location:
    Self Imposed Exhile
I'm wondering the same. Maybe DirectStorage will help, or perhaps developers will start to optimize for 10GB now that there is a 4K-capable 10GB card. Worst case, run 1440p and use DLSS.
     
    Shortbread and TheAlSpark like this.
  6. xEx

    xEx
    Veteran

    Joined:
    Feb 2, 2012
    Messages:
    1,060
    Likes Received:
    543
I feel disappointed. Less than 30% performance improvement. Yeah, they're not as overpriced this gen, but I hoped for more. Also, 10GB probably won't be enough for 4K gaming in the coming years. That and the 400W OC :shock:
     
  7. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    9,044
    Likes Received:
    1,116
    Location:
    WI, USA
    I think I'm going to wait to see if AMD has anything competitive coming. There are no games that I'm stoked about that my current Pascal hardware will have trouble with anyway.

    10GB seems like a scheme to tempt you to buy that 3090. I'm not very excited about upgrading my 1080 Ti to a card with less RAM.
     
    CeeGee, Shortbread and TheAlSpark like this.
  8. arandomguy

    Regular Newcomer

    Joined:
    Jul 27, 2020
    Messages:
    251
    Likes Received:
    355
TFLOPs, at least to me, has never been a useful performance metric by itself. The listed numbers have always just been a theoretical peak derived from clock speed x FPU units x ops per unit (which is basically 2 across the board). So in essence, comparing TFLOPs is just comparing FPU units x clock speed. If you took a GPU and downclocked its memory to the minimum, its TFLOPs rating wouldn't even change (with how modern dynamic clocking works, it might actually go up, due to the increased power budget), yet we know that in reality, for any meaningful workload, its real throughput would drop significantly.

    Of course it's one of those useful technical marketing terms so it gets used.
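The formula described above is simple enough to write down directly; note that memory clock never appears in it, which is the point being made. The RTX 3080 figures used here (8704 CUDA cores, ~1.71 GHz boost) are Nvidia's published specs:

```python
# Theoretical FP32 TFLOPs = FPU units x clock (GHz) x ops per cycle,
# where ops per cycle is 2 (one fused multiply-add counts as two ops).
# Memory bandwidth is nowhere in this formula.
def peak_tflops(fpu_units, clock_ghz, ops_per_cycle=2):
    return fpu_units * clock_ghz * ops_per_cycle / 1000.0

# RTX 3080: 8704 CUDA cores at ~1.71 GHz boost
print(peak_tflops(8704, 1.71))  # ~29.8 TFLOPs, matching the marketing number
```

Sustained throughput on real workloads is bounded by memory bandwidth, occupancy, and power limits, none of which show up here.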
     
  9. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    15,134
    Likes Received:
    7,679
30% performance improvement over the 2080 Ti for, what, 60% of the price? I guess if you have a 2080 Ti the upgrade is not as clear-cut. For everyone else, it's an easy winner.
     
  10. eastmen

    Legend Subscriber

    Joined:
    Mar 17, 2008
    Messages:
    13,878
    Likes Received:
    4,724
Still worried about the RAM.
     
  11. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    15,134
    Likes Received:
    7,679
    It's about the only thing I'd worry about, but I'm not that worried.
     
  12. xEx

    xEx
    Veteran

    Joined:
    Feb 2, 2012
    Messages:
    1,060
    Likes Received:
    543
If the 2080 Ti was ridiculously overpriced, that doesn't mean Ampere is good; it just shows how bad Turing was. Of course we couldn't see it, because AMD offered no point of comparison.

I also don't understand why people are so happy with 5-10% better efficiency out of a new arch against a 2-year-plus-old one, plus a new node. We should see more gains than that from the node alone.

And this, kids, is why you should never pre-order a product based on a paid review that says it will be 80% faster.
     
  13. msxyz

    Newcomer

    Joined:
    May 5, 2006
    Messages:
    122
    Likes Received:
    54
    That and power consumption. It looks like notebook users will be stuck with cards like the GF1660.
     
  14. eastmen

    Legend Subscriber

    Joined:
    Mar 17, 2008
    Messages:
    13,878
    Likes Received:
    4,724
My Vega 56 is, what, just over 3 years old now. So I'm hoping to keep my new card 3 years.

I mean, just moving to the new tech would be helpful in the notebook world. However, we might see Ryzen with integrated Navi 2 graphics knocking on GF 1660 performance soon.
     
  15. troyan

    Regular

    Joined:
    Sep 1, 2015
    Messages:
    603
    Likes Received:
    1,123
Why? The RTX 3070 delivers 2080 Ti FE performance with 50W less. Nvidia has put a "2080 Super" into notebooks; the 3070 will deliver much more performance than that card.
     
    Florin, DavidGraham and pharma like this.
  16. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,708
    Likes Received:
    2,132
    Location:
    London
    Do you want to risk being "stuck" with 10GB until a worthwhile better card launches in 2 years' time? What happens to second-hand prices of 10GB cards if 20GB cards launch in November? Or even 12GB?

    Remember texture quality is what's really going to eat VRAM and that isn't improved by running at 1440p.

    Sampler feedback is the one technique that will really make a difference in the lifetime of a 10GB card. In 2 years' time it will probably be a widely-used technique, with games being worked on right now taking the time to use the technique. But it does require a serious re-work of existing game engines.

    One of the things that I'm still struggling to understand is the VRAM impact of ray tracing. It's likely to become more common in games over the next couple of years but I have no idea whether it's going to be significant.
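The sampler-feedback point above can be made concrete with a deliberately simplified sketch. This is not the D3D12 API, just the residency arithmetic behind it, with invented tile counts: if only the tiles the sampler actually touched are kept resident, VRAM use scales with what's visible rather than with the full mip level.

```python
# Conceptual sketch of sampler-feedback-driven texture streaming.
# The feedback set and texture dimensions are made up for illustration;
# the real mechanism is D3D12 sampler feedback plus tiled resources.
TILE_BYTES = 64 * 1024  # 64 KiB tiles, as in D3D12 tiled resources

def resident_bytes(feedback_tiles):
    """VRAM needed when only the tiles the sampler touched are resident."""
    return len(feedback_tiles) * TILE_BYTES

# A 16x16-tile mip level where the camera only saw a 4x4 region:
sampled = {(x, y) for x in range(4) for y in range(4)}
full = 16 * 16 * TILE_BYTES
print(resident_bytes(sampled), "vs full level", full)  # 1 MiB vs 16 MiB
```

The catch, as noted above, is that the engine's whole streaming and materials path has to be rebuilt around the feedback maps for that saving to materialize.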
     
  17. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    22,146
    Likes Received:
    8,533
    Location:
    ಠ_ಠ
    I somewhat recall Digital Foundry found that VRAM consumption increased quite a bit when they tested Battlefield V. Maybe that'll change a bit with more optimization, but I figured there'd be a non-trivial increase from the various buffers involved.

    Maybe this video:


I think they discussed texture settings needing to be reduced when DXR was on. (can't watch right now :embarrassed:)
     
    #97 TheAlSpark, Sep 16, 2020
    Last edited: Sep 16, 2020
    BRiT likes this.
  18. msxyz

    Newcomer

    Joined:
    May 5, 2006
    Messages:
    122
    Likes Received:
    54
    Notebook variants of the 20xx GPUs are heavily throttled to make them fit for a laptop. And even at the 80W limit of the MaxQ cards, the notebook cooling solutions required already start to be heavy and noisy. The most sensible solution would be a card capable of playing games maxed out at 720p, RT included, coupled with a high performance, high quality dedicated upscaling unit. But if the upscaling process has to run on the same shader units, as it happens now with Ampere, then the upscaling performance is tied to that of the whole GPU.
     
  19. Geeforcer

    Geeforcer Harmlessly Evil
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,320
    Likes Received:
    525
    I think it turned out pretty good, all things considered:

    upload_2020-9-16_12-49-11.png
     
    pjbliverpool, Cyan and Scott_Arm like this.
  20. manux

    Veteran

    Joined:
    Sep 7, 2002
    Messages:
    3,034
    Likes Received:
    2,276
    Location:
    Self Imposed Exhile
In case you are curious: I built my last computer in 2013 and I'm still using it. The only upgrades I did were a better SSD and a better GPU. I see PC technology development getting slower and slower. Not counting the GPU, my next planned PC build is slightly over $2000 in parts at the moment. I might get a 3080 or a 3090. I'm not expecting something in 2 years' time that would obsolete that build. Looking at the Turing-to-Ampere performance uplift, that alone would not provoke me to upgrade the GPU. At the earliest, maybe 4 years from now there will be something good enough to make me consider upgrading. Thinking about how good DLSS 2.0 already is compared to native 4K, it might be a 6-year upgrade cycle for the GPU; DLSS is only going to get better over time, and I can sneak by a few years with DLSS on to avoid upgrading.

I'm firmly in the boat of buying something so good that I rarely need to upgrade. I'll wait until winter though. I want something Zen 3 based, and perhaps by then the availability, and my understanding of what the good high-end buy is, will be clearer. Also, by that time Cyberpunk 2077 will have had enough patching that I dare to start playing it.

I was "forced" to upgrade my original GPU to a 1080 Ti because my old GPU just wasn't good enough for VR. The 1080 Ti I want to upgrade to get ray tracing for new games like Cyberpunk 2077 and Bloodlines 2 and so on. But I suspect my old 4-core CPU is just not going to cut it anymore for those games with ray tracing on. If there were no ray tracing, I would still be happy with the 1080 Ti.

edit: I wouldn't be surprised if developers started to optimize 4K for 10GB of memory now that there is a 4K-capable card with that spec. In the past that wasn't the case, and 4K was a 2080 Ti-or-better-only affair. The 3080 and 3070 are going to sell a ton of units and can in theory run optimized 4K well; the 3080 should be a desirable optimization target for developers.
     
    #100 manux, Sep 16, 2020
    Last edited: Sep 16, 2020

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.