Nvidia Turing Product Reviews and Previews: (Super, TI, 2080, 2070, 2060, 1660, etc)

Discussion in 'Architecture and Products' started by Ike Turner, Aug 21, 2018.

  1. keldor

    Newcomer

    Joined:
    Dec 22, 2011
    Messages:
    74
    Likes Received:
    107
    Turing RTX is not really oriented toward end users; it's oriented toward developers. Developers absolutely need hardware to run their code on, and unless Nvidia wants to somehow take half the AAA game industry in house, they have to release hardware for them. Likewise, the RTX announcement is aimed at developers, not end users. Nvidia needs to get *them* hyped so that they will actually target the hardware in the first place. There are also indie developers to think of, who tend to be the ones exploring novel techniques. They have to buy the hardware just like everyone else - they don't get special channels with Nvidia like AAA developers do.

    As for industry adoption, you already have things like Unreal Engine incorporating RTX/DXR, and UE4 is pretty huge in the industry. And last time I checked, BF5 was a AAA game. But for ray tracing to really live up to its potential, we will have to wait for the next generation of engines to be built around it, and that will take *years* from now until games using those engines ship. As it is, there are just too many bottlenecks in how ray tracing and existing engines interact. BF5 wasn't slow because of bad ray tracing hardware; it was slow because of all the extra work it had to do to accommodate a very intrusive system the engine was not designed for. Consider that there was almost no performance difference between low-quality DXR and ultra-quality DXR, but that enabling either massively slowed down the entire system.
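    To make that fixed-overhead point concrete, here is a hedged C++ sketch (not Frostbite/BFV code; the function name, parameters and helper assumptions are placeholders) of the per-frame shape of a DXR pass. The expensive steps - acceleration-structure refits, the ray dispatch itself, and denoising/compositing - run every frame regardless of the quality preset, which mostly just changes the ray budget.

    ```cpp
    #include <d3d12.h>

    // Hedged sketch of a per-frame DXR reflection pass. The quality preset
    // mainly scales rays per pixel; the surrounding work below is paid either
    // way, which is why "low" and "ultra" DXR can cost nearly the same.
    void TraceReflectionsDXR(ID3D12GraphicsCommandList4* cmdList,
                             ID3D12StateObject* rtPipeline,
                             const D3D12_BUILD_RAYTRACING_ACCELERATION_STRUCTURE_DESC& tlasRefit,
                             D3D12_DISPATCH_RAYS_DESC rays,  // shader tables prefilled elsewhere
                             UINT width, UINT height)
    {
        // 1) Refit the top-level acceleration structure for dynamic geometry.
        //    Cost scales with scene complexity, not with the quality slider.
        cmdList->BuildRaytracingAccelerationStructure(&tlasRefit, 0, nullptr);

        // 2) Trace rays. A lower preset shrinks the ray budget, but the
        //    engine-side work around this call stays the same.
        rays.Width  = width;
        rays.Height = height;
        rays.Depth  = 1;
        cmdList->SetPipelineState1(rtPipeline);
        cmdList->DispatchRays(&rays);

        // 3) Denoising and compositing back into the raster pipeline (separate
        //    compute passes, recorded elsewhere) are also fixed per-frame costs.
    }
    ```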
     
    OCASM likes this.
  2. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,907
    Likes Received:
    1,607
    True, and I think all IHVs basically follow the same principle. Vega's Rapid Packed Math feature did not suddenly appear in games overnight; it took developers months before it showed up in the three supported games.
     
  3. BRiT

    BRiT (╯°□°)╯
    Moderator Legend Alpha Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    12,383
    Likes Received:
    8,600
    Location:
    Cleveland
    Please do not turn this into a PC vs Console issue...
     
  4. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,564
    Likes Received:
    1,981
    RT needs the hardware to be in people's hands more than anything. If you can sell the hardware on other merits, the RT hardware's effectiveness isn't carrying that burden. BTW, you stating that RTX requires $1,200 hardware to impress people is one of the more damning statements I've seen about the tech.


    It isn't a jump because they chose to launch on the same process.
     
    Silent_Buddha likes this.
  5. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,597
    Likes Received:
    11,003
    Location:
    Under my bridge
    No, they didn't. They released the hardware before the software was ready. Now they're playing catch-up. nVidia worked with DICE to get RT into BFV, but not with Autodesk to get RT into Arnold.

    And if it takes time and the raytracing software can't be ready for raytracing hardware for a year after its launch, what's the point in buying RT hardware now? Wait a year for when the software is ready for it.

    I don't understand the counter-arguments here. People seem to be suggesting that the best move for nVidia was to release a product with no software for it, that they were right to expect people to spend huge amounts of money buying it ahead of any software appearing, and that missing their sales targets is a strange reaction from the market. "I don't know why people aren't buying RTX cards now - it's going to be really good once software comes out for it."
     
    #585 Shifty Geezer, Jan 31, 2019
    Last edited: Jan 31, 2019
  6. CarstenS

    Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    4,797
    Likes Received:
    2,056
    Location:
    Germany
    If not with someone outside the company, they could at least have primed their internal OptiX team so that support was ready at launch. But apparently OptiX 6 has no decided-upon launch date yet.
     
    Silent_Buddha likes this.
  7. Malo

    Malo Yak Mechanicum
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    6,973
    Likes Received:
    3,050
    Location:
    Pennsylvania
    Indeed. From everything we knew at launch, DICE had been working on their RT tech for 8 months before getting hardware, using Volta, and then updated their code to work with Turing in the 3 weeks before the RTX release.
     
  8. Ike Turner

    Veteran Regular

    Joined:
    Jul 30, 2005
    Messages:
    1,884
    Likes Received:
    1,756
    That's exactly what I was saying. People can't blame it on ISVs when it's Nvidia themselves who weren't even ready. Once OptiX 6 is deployed, every DCC application currently using it will be able to use the Tensor Cores in Turing GPUs to accelerate denoising.
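    As an illustration of what that means for DCC integrations, here is a hedged sketch (placeholder buffer formats and entry-point index, no error handling) of how the built-in "DLDenoiser" post-processing stage is wired up with the pre-OptiX-7 C++ wrapper that ships with OptiX 5.x. The expectation in this thread is that OptiX 6 runs this same stage on Turing's Tensor Cores rather than on the CUDA cores.

    ```cpp
    #include <optix_world.h>  // OptiX 5.x / 6.x C++ wrapper (pre-OptiX-7 API)

    // Hedged sketch: render one frame, then run the built-in AI denoiser over it.
    void RenderAndDenoise(optix::Context context,
                          optix::Buffer noisyBeauty,  // RT_FORMAT_FLOAT4 input
                          optix::Buffer denoised,     // RT_FORMAT_FLOAT4 output
                          unsigned width, unsigned height)
    {
        // Built-in AI denoiser stage shipped with OptiX since 5.0.
        optix::PostprocessingStage denoiser =
            context->createBuiltinPostProcessingStage("DLDenoiser");
        denoiser->declareVariable("input_buffer")->set(noisyBeauty);
        denoiser->declareVariable("output_buffer")->set(denoised);
        denoiser->declareVariable("blend")->setFloat(0.0f);  // 0 = fully denoised

        // Launch the ray-gen program, then append the denoiser pass.
        optix::CommandList commands = context->createCommandList();
        commands->appendLaunch(0 /* entry point index */, width, height);
        commands->appendPostprocessingStage(denoiser, width, height);
        commands->finalize();
        commands->execute();
    }
    ```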
     
    Silent_Buddha likes this.
  9. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,830
    Likes Received:
    4,451
    Geez that thread...
    All this time with the cards on the market and they can't even commit to a release window for supporting the RTX features in professional applications?
     
  10. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,907
    Likes Received:
    1,607
    I expected to see more views and replies in that link, but that could be because OptiX support is already working for clients. According to Nvidia, they are working with NDA partners on issues with the OptiX 6 SDK, which will take advantage of RTX acceleration. I can't see holding up a hardware launch that has a number of major platforms to support (Deep Learning, Data Center, Autonomous Machines, Healthcare, HPC, Self-Driving Cars, Gaming & Entertainment, Design & Pro Visualization) whose customers are not affected by the late OptiX 6 SDK. I'm guessing the hardware currently works with the OptiX 5 SDK, similar to Volta.

    Edit: Font
     
    #590 pharma, Jan 31, 2019
    Last edited: Jan 31, 2019
    vipa899 likes this.
  11. vipa899

    Regular Newcomer

    Joined:
    Mar 31, 2017
    Messages:
    922
    Likes Received:
    354
    Location:
    Sweden
    RTX for gaming has good support, which for me is the only important thing, but I can understand the pro market's concerns.
     
  12. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,749
    Likes Received:
    2,515
    True, but that's not how software adoption has worked over the last decade. As I stated above, CUDA, OpenCL and browser acceleration all came after the hardware was released, not before.
    It doesn't need $1,200 per se, but it does need big dies (for the shader power and the RT/Tensor cores), larger than is currently possible at 7nm.

    Then what do you think are the reasons for launching at 12nm?
     
    OCASM and vipa899 like this.
  13. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,564
    Likes Received:
    1,981
    You agreed that the 2080 wouldn't be that big at 7nm. So if you need a bigger die than that, you're talking about a product that at 12nm costs $1,200, or another one that costs even more.

    Because they misread the market and their place in it. When everything you touch has been turning to gold, you tend to think you can do no wrong, and you start thinking in terms of which products best fit your goals rather than which products will best appeal to the consumer.
     
  14. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    7,766
    Likes Received:
    6,045
    One can imagine that they didn't have many other options: send out something akin to Radeon VII, ship a consumer Volta derivative, ship Turing RTX, or ship nothing at all.
    Shipping nothing seems the worst of those options, and the first two seem like a waste of time.

    I don't think they did the wrong thing, but they are going to eat it until everyone else catches up. It's not like there are 15 TF Radeons out there eating their lunch, dominating the playing field and reversing years of goodwill. As far as I can see, aside from the missed projections, which is what we are debating, they can still safely continue this RT program.
     
    milk, DavidGraham and vipa899 like this.
  15. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,564
    Likes Received:
    1,981
    Edit: Why not, "ship nothing at all"?
     
    #595 mrcorbo, Jan 31, 2019
    Last edited: Jan 31, 2019
  16. BRiT

    BRiT (╯°□°)╯
    Moderator Legend Alpha Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    12,383
    Likes Received:
    8,600
    Location:
    Cleveland
    If they had known about the soon-to-be overabundance of Pascal products in the channel, they could have shipped nothing at all and ended up better off.
     
    mrcorbo likes this.
  17. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,166
    Likes Received:
    1,836
    Location:
    Finland
    Because it's too big and expensive.
     
  18. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    7,766
    Likes Received:
    6,045
    I suspect that shareholders, business, cash flow, timing and manufacturing issues are all factors at play here. We have to assume that there is a Turing or post-Turing derivative lined up for 7nm already. Shipping now may be costly, sure, but it sets up their 7nm product.

    It's going to take a long time for developers to start moving towards RT. A long time. There's no reason to delay the discussion if it gets developers working on it for the next 2 years, in time for a real RT launch. And when the competition actually comes into play, Nvidia would have its RT matured by 2 years of developer input and driver fixes.
     
    vipa899 likes this.
  19. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,564
    Likes Received:
    1,981
    I edited my post to clarify what I was questioning.
     
    Kaotik likes this.
  20. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    7,766
    Likes Received:
    6,045
    Hard to predict that ETH and the other GPU-mined cryptos would die off so quickly, resulting in a flood of cards on the market. If ETH had stayed above 400 USD, I wouldn't have sold my 1070s either and there wouldn't have been a flood. Crypto had a massive boom and bust all in a matter of 12 months; these cards are in development for much longer.
     
    DavidGraham and vipa899 like this.