Nvidia Turing Product Reviews and Previews: (Super, TI, 2080, 2070, 2060, 1660, etc)

Discussion in 'Architecture and Products' started by Ike Turner, Aug 21, 2018.

  1. vipa899

    Regular Newcomer

    Joined:
    Mar 31, 2017
    Messages:
    922
    Likes Received:
    354
    Location:
    Sweden
    Like Sony people paying for 15/20-year-old remasters with no graphics upgrade at all? When your PS4 (or any PS) launched, there weren't many games either, mostly multiplats with inferior gfx. Yes, the price wasn't that high, but you got modest hardware at best.

    If people want to enjoy RT now they can; more titles will follow. Your PS5 will have RT, so it's not a bad thing that devs can get their feet wet.
     
  2. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,564
    Likes Received:
    1,981
    Surely it's better to still have the promise of a new architecture incoming, while your existing architecture (after enjoying massive success) absorbs the glut of inventory the mining crash created, than to face the reality of your new architecture being a (sales) disappointment.
     
  3. BRiT

    BRiT (╯°□°)╯
    Moderator Legend Alpha Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    12,391
    Likes Received:
    8,605
    Location:
    Cleveland
    :confused:
     
  4. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,844
    Likes Received:
    4,456
    I don't live in a world where cheap PS1/PS2 remasters are considered system sellers for $350-$1200 PlayStations, so I have no idea what you're talking about.
     
  5. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,749
    Likes Received:
    2,516
    Professional applications take time to implement new features, then test and validate them, and then validate them again with NVIDIA drivers on the Quadro/GeForce lines. It's just a matter of time. The same thing happened with CUDA acceleration, OpenCL acceleration, and GPU browser acceleration. Things take time to implement. I don't get the obsession with counting months after the introduction of a totally new architecture with new features, like it's some sort of sprint race to the finish line. There is no finish line.

    The situation with the crypto bubble was bound to happen regardless. Pricing Turing competitively means stagnating Pascal sales, which means bigger losses. They needed to move both of them.

    I don't see the RT situation as any different from the T&L situation. In fact, the RT situation is much better: it's supported directly in DX and Vulkan right out of the gate, supported by major engines, and quite possibly by consoles as well. None of these things were present in the first days of T&L.

    NVIDIA most likely wants to make a bigger bang on 7nm, which means big dies, which means waiting for 7nm to be mature enough, which means waiting till Q4 2019. That would mean 15 months without a new product, on top of Pascal's 26-month cycle. That's too long without a new product.

    Usable at what quality level? Can they provide true reflections? Soft PCF shadows? Area shadows? Dynamic GI? Proper refractions? Nope. RT is an elegant solution that encompasses everything. See Quake 2 on Vulkan RTX for a proper demonstration of a complete path tracing solution.
    Highly unlikely, considering AMD doesn't even seem to have a concept for doing accelerated ray tracing. They don't have an architecture, nor fallback-layer drivers, nor anything really. All they do is talk about waiting for the proper circumstances to do RT, which quite frankly is AMD's way of saying they are not yet ready to do RT. Navi is highly unlikely to support DXR at this point; AMD isn't even teasing it. They wouldn't do a Radeon VII without DXR if Navi were to have DXR.
     
    OCASM, vipa899 and pharma like this.
  6. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,564
    Likes Received:
    1,981
    The introduction of hardware RT would have provided that at that time.
     
  7. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,749
    Likes Received:
    2,516
    It still needs big dies to do them with the proper shading power. Which means waiting further as well. They also can't allow AMD to have a leg up on them, or even match them.

    It's obvious this 7nm node isn't really the revolution we expected. The transition from 28nm to 16nm provided much bigger gains than this. If Intel's failed 10nm was significantly better than TSMC's 7nm, then the latter isn't really a true 7nm, not even a 10nm by the strict definitions. So any big jump in performance is going to need big dies, as big as Pascal's or more. Which means waiting for the process to be mature enough.

    NVIDIA most definitely weighed the options of using 12nm or waiting for 7nm, and they chose 12nm; heck, they practically co-developed it for Volta.
     
  8. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,564
    Likes Received:
    1,981
    The Radeon VII is 13.2B transistors @ 331mm2. How "big" would the 2080's 13.6B have been?
     
    no-X and pharma like this.
  9. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,749
    Likes Received:
    2,516
    Slightly larger, ~360mm2. Why take the 2080, though? Let's take the Titan RTX first: it would need ~520mm2 at 7nm, which is probably not feasible at this stage.

    And that's if NVIDIA wanted to be content with doing just a die shrink, which I don't think they do. They will do big dies, and they will advance RT and AI performance even further. They could be thinking about doing another 700mm2 die for the RTX 3000.
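    The naive scaling arithmetic behind these die-size estimates can be sketched as follows. This is a back-of-envelope sketch: it assumes uniform transistor density across the die, and the TU104/TU102 transistor counts (13.6B/18.6B) are the published figures. Real shrinks come out larger than the naive numbers because SRAM, analog, and I/O scale worse than logic, which is roughly the gap between the computed values and the ~360mm2/~520mm2 estimates above.

```python
# Back-of-envelope die-size estimate from transistor density.
# Assumes uniform density scaling, which real chips don't follow
# (SRAM, analog, and I/O shrink less than logic), so treat the
# results as optimistic lower bounds.

def die_size_mm2(transistors_b: float, density_mtr_per_mm2: float) -> float:
    """Die area in mm^2 for a transistor count (billions) at a density (MTr/mm^2)."""
    return transistors_b * 1000 / density_mtr_per_mm2

# Radeon VII (7nm): 13.2B transistors in 331 mm^2
density_7nm = 13.2 * 1000 / 331  # ~39.9 MTr/mm^2

# RTX 2080 (TU104, 13.6B transistors) shrunk at the same density:
print(round(die_size_mm2(13.6, density_7nm)))  # -> 341

# Titan RTX (TU102, 18.6B transistors) at the same density:
print(round(die_size_mm2(18.6, density_7nm)))  # -> 466
```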
     
  10. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,564
    Likes Received:
    1,981
    Because it was what they successfully did with Pascal, and launching the 2080/2070 at 1080/1070 price points but with 1080Ti/1080 performance plus RT would have been a "no-brainer" purchase for most. The bigger dies could have come later when, as you put it, the process was more mature.
     
    Silent_Buddha likes this.
  11. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,749
    Likes Received:
    2,516
    I don't think it would be that much different; people would still debate whether RT is worth it or not. And again, bigger dies can't come later, because NVIDIA needs a halo product at the top that is faster than the previous gen. On 12nm they had this with the 2080Ti/Titan RTX/Volta. 7nm wouldn't give them any of that in the current time frame.

    You also are not factoring in the cost of 7nm, which is still high on its own, regardless of the die size.
     
  12. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,564
    Likes Received:
    1,981
    They would debate it to a much lesser degree than they are now if it were a value-add on top of getting better performance. Bigger dies can and frequently have come later, especially around process transitions. And it literally just happened with Pascal. Why would it be such a disaster this time?
     
  13. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,749
    Likes Received:
    2,516
    RT needs the 2080Ti as the top-performing chip and as the proof of concept. If all you have out there is just the 2080, the case for RT is not really going to be that amazing.

    Yes, historically the middle dies have come first, but that's because they introduced a much bigger jump in performance compared to the previous gen. That isn't the case with Turing, hence why NVIDIA launched big Turing out of the door first.
     
  14. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,604
    Likes Received:
    11,024
    Location:
    Under my bridge
    It's no obsession. It's a discussion about nVidia's financials and underselling their expectations. What were their expectations and why? Could they have handled them better? The number of months (years) you have to wait until software supports your hardware feature directly affects its appeal, and therefore sales. Ergo the idea that nVidia should have pushed for pro software to have better support.

    True or false - if nVidia had ensured a couple of major applications had RTX acceleration at launch, interest and sales of RTX cards would be stronger than they are now?
     
    Silent_Buddha and ToTTenTranz like this.
  15. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,844
    Likes Received:
    4,456
    Can we even try to guess until these features are actually adopted on a wider scale?

    I initially thought DLSS sounded like a breakthrough of sorts: apparent wide adoption and "free" performance at minimal IQ cost. Considering the immense cost of native 4K rendering, it should be great for 4K TVs at least.
    But for all intents and purposes, DLSS is slowly entering vaporware status at the moment.
     
  16. keldor

    Newcomer

    Joined:
    Dec 22, 2011
    Messages:
    74
    Likes Received:
    107
    What are you suggesting? That developers try to write a complex RTX/DXR application without any hardware that can run it at a reasonable speed (no, Volta doesn't count)? That they build Turing and release it exclusively to a handful of selected developers with the vague promise of consumer hardware at some later date?

    It's a chicken and egg problem.
     
    OCASM, vipa899 and pharma like this.
  17. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,604
    Likes Received:
    11,024
    Location:
    Under my bridge
    That nVidia design the hardware, approach a couple of big applications offering to integrate RTX acceleration for them, and then release a product with software that actually uses it.

    Exactly: it's a chicken-and-egg problem if everyone waits for everyone else to do something. However, nVidia were in a position to solve that and ensure the chicken and the egg were developed at the same time.

    If that's not the case, then the lacklustre sales and lack of interest were inevitable the moment nVidia decided to release RTX as is. It meant knowingly releasing a costly product that could never sell at launch because there was no demand for it. I don't think that's true. I think the demand is there in the pro imaging sector and I think demand for hardware simply needs the software to actively use it, which is something nVidia should have addressed at launch instead of pushing gaming.
     
    entity279 likes this.
  18. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    7,778
    Likes Received:
    6,061
    Yes. Without a doubt hindsight is 20/20, and looking back, perhaps that's where Jensen now knows his mistake lies.

    Perhaps it's elsewhere entirely; I'm sure there is a post-mortem happening.

    But at the same time, this is all new for everyone. RT is new. Cryptomining is new. Moore's law is done. Leaders of companies can only learn from their mistakes and make the right decisions to bounce back.

    The realities of business demand failure from time to time despite how big you are; eventually a mistake is made, whether it was made back in Turing or in Pascal. You cannot constantly win forever; there's no proper playbook for this type of thing.

    Perhaps it was inevitable, and Jensen felt that turning this into the next generation of graphics could dominate the narrative. Sometimes it works (Steve Jobs). Sometimes it doesn't.
     
    mrcorbo likes this.
  19. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,844
    Likes Received:
    4,456
    I'd have suggested they publicly announce and sell the GeForce RTX line when the RT/DLSS-enabled games were ready, and do the same for Quadro RTX with applications.
    It's not like nvidia was quickly losing market share back in August 2018.

    What will happen now is, at best, games and applications will start supporting the RTX features long after the marketing push of the RTX announcement has faded away.
    The valid criticism people make about FineWine on AMD also applies here. Yeah, it's good that 4 years after launch AMD GPUs tend to outperform their contemporary competition, but from a marketing and sales perspective it's a wasted opportunity.

    Imagine if Apple or Samsung had announced features like A.I.-enhanced photos and a fingerprint reader for their 2015 flagships, but these weren't available for the first 6 months after release.
    A good proportion of their potential customers would probably have kept their older phones another year, or flocked to cheaper flagships from other companies.
     
  20. troyan

    Newcomer

    Joined:
    Sep 1, 2015
    Messages:
    120
    Likes Received:
    181
    They did. It takes time. nVidia is not waiting six months to release available hardware.
     
    OCASM and vipa899 like this.
  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.