Impact of nVidia Turing RayTracing enhanced GPUs on next-gen consoles *spawn

Discussion in 'Console Technology' started by vipa899, Aug 18, 2018.

Thread Status:
Not open for further replies.
  1. Mobius1aic

    Mobius1aic Quo vadis?
    Veteran

    Joined:
    Oct 30, 2007
    Messages:
    1,649
    Likes Received:
    244
    I don't necessarily think it'll be better, but knowing AMD's modus operandi, it'll be a more transparent and open standard. It's been quite pleasing to see Freesync get so much support while Gsync often gets the bird, but that extends well beyond just AMD into a territory that involves many other GPU makers in different devices.

    Nvidia's form of ray tracing, while still currently sophomoric, could in fact be laying the groundwork for the next few years of graphics architectures because of their de facto standards and the status quo in the professional space. I don't think it'll be the 8800GTX all over again, but it's obviously shaking things up immensely. Just look at all the buzz and conversation around it.
     
    Prophecy2k, milk, Lalaland and 2 others like this.
  2. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    10,853
    Likes Received:
    5,654
    Location:
    London, UK
    What's that have to do with, how did you put it, the "PR gaming nightmare for AMD being two generations late with consumer RT ready graphics cards"? Look at every successful technology company around, and guess what? They weren't first. Apple weren't first with a GUI personal computer, smartphone or MP3 player. Microsoft weren't first with a personal computer operating system, nor a GUI one. Google were way late to the search party. Sony only entered the console market with the fifth generation.

    Being first means you take all the risk while your competitors observe and learn from your missteps and follies.
     
  3. Shortbread

    Shortbread Island Hopper
    Veteran

    Joined:
    Jul 1, 2013
    Messages:
    3,685
    Likes Received:
    1,804
    So who was supposed to take the first steps towards forging (providing) the first consumer-level RT-ready GPUs? AMD? Intel? Because most of your statements are negative-nanny dogpiling on Nvidia for providing the first steps towards such needs. I'm not seeing your point at all.
     
    Heinrich04, vipa899 and HBRU like this.
  4. vipa899

    Regular Newcomer

    Joined:
    Mar 31, 2017
    Messages:
    922
    Likes Received:
    354
    Location:
    Sweden
    Raytracing/Nvidia discussion was moved to the graphics forum, yet it continues in both places :)
    Some really wish to have RT in next gen, I see; must be good tech.
     
    Heinrich04, HBRU and BRiT like this.
  5. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    10,853
    Likes Received:
    5,654
    Location:
    London, UK
    I thought my point was clear in the first post, but to re-iterate: for years GPU design has been honing ever-more flexible core hardware that can be used to assist any part of the graphics or compute pipeline. Paradigm-shifting back a decade to bespoke single-purpose hardware, which will see questionable use because these processor cores are only deployed on high-end hardware from one manufacturer, seems like a backwards step. I can fully see the opportunities augmented raytracing will bring, but not for a while. Nvidia are asking you to pay now for bespoke hardware on the promise of what it might be capable of if/when more games support it - assuming you haven't ditched your 2070/2080/Ti for the better gen 2 version in 18 months' time anyway.

    Getting back on track, I would be extraordinarily surprised to see any iteration of this technology in the new consoles. If it's present, it will be so watered down as to be next to useless for its intended purpose of enhancing graphics significantly. Not unless AMD have something very cool cooking that they've not shown.
     
  6. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,509
    Likes Received:
    10,882
    Location:
    Under my bridge
    Baked Alaska? o_O
     
  7. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    13,175
    Likes Received:
    3,586
    That's actually pretty significant. The new hardware is a substantial investment. They're all-in on RT, so to speak.
     
    Shortbread likes this.
  8. Shortbread

    Shortbread Island Hopper
    Veteran

    Joined:
    Jul 1, 2013
    Messages:
    3,685
    Likes Received:
    1,804
    I agree with you on a more unified general-purpose approach towards rendering (i.e., less latency, more efficient design, smaller possible shrinks, and of course, more developer friendly), without the need for specialized logic [cores] outside the primary rendering array. But the question becomes: why did Nvidia go this route?

    If we take Nvidia at their word that the Turing architecture was 10 years in the making (I know, possible PR fluff), somewhere in R&D they must have had multiple designs (possibly baked GPUs) for both approaches (one unifying rasterization and RT, and the current Turing architecture). If the unified approach had problems, what possible roadblocks stopped Nvidia from going down that path? Was it core size and complexity? TDP issues of such a core design? Yield issues? Current tech? Cost? A mixture of everything? Something had to push Nvidia towards a non-unified rendering design.
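
    For concreteness, here's a minimal sketch (plain C++, just the textbook Möller–Trumbore ray/triangle test, not anything Nvidia has published or confirmed) of the per-ray, per-triangle maths a fully unified design would burn shader ALU cycles on, and that a dedicated intersection unit takes off the shading array:

    ```cpp
    #include <cmath>
    #include <optional>

    struct Vec3 { float x, y, z; };

    static Vec3  sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static Vec3  cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
    static float dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }

    // Möller–Trumbore ray/triangle intersection: returns the hit distance t along
    // the ray, or nothing on a miss. A shader-based tracer runs this (plus BVH
    // traversal) for every candidate triangle of every ray; dedicated RT units do
    // the equivalent work in fixed function.
    std::optional<float> intersect(Vec3 orig, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2)
    {
        const float kEps = 1e-7f;
        Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
        Vec3 p  = cross(dir, e2);
        float det = dot(e1, p);
        if (std::fabs(det) < kEps) return std::nullopt;     // ray parallel to triangle
        float inv = 1.0f / det;
        Vec3 t0 = sub(orig, v0);
        float u = dot(t0, p) * inv;
        if (u < 0.0f || u > 1.0f) return std::nullopt;      // outside edge 1
        Vec3 q = cross(t0, e1);
        float v = dot(dir, q) * inv;
        if (v < 0.0f || u + v > 1.0f) return std::nullopt;  // outside edge 2
        float t = dot(e2, q) * inv;
        if (t <= kEps) return std::nullopt;                 // hit is behind the ray origin
        return t;
    }
    ```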

    Hopefully, AMD has figured out a more unified rendering approach.

    Let's be honest, this has always been Nvidia's approach: get gamers, and more specifically the premium consumers, excited enough to invest more and more into their brand, and see what sticks or fails. It's Nvidia's corporate culture of wanting to grow their (already colossal) market share beyond their competitors'.

    Disagree. If it's present but watered down, that would be a waste of die space and poor engineering judgement. If it is present and fully capable, then more than likely first-party developers (i.e., Naughty Dog, Guerrilla, SMS, 343 Industries, The Coalition, etc.) would jump at the chance of using RT.
     
    #108 Shortbread, Aug 24, 2018
    Last edited: Aug 24, 2018
    Heinrich04 likes this.
  9. Shortbread

    Shortbread Island Hopper
    Veteran

    Joined:
    Jul 1, 2013
    Messages:
    3,685
    Likes Received:
    1,804
    ...the unified yumminess. *drool*
     
  10. MrFox

    MrFox Deludedly Fantastic
    Legend Veteran

    Joined:
    Jan 7, 2012
    Messages:
    5,364
    Likes Received:
    3,854
    Generic ray casting acceleration would be pretty useful all over the game engine. It doesn't have to be a paradigm shift in rendering, and it doesn't need to be forced on devs. It would allow a much better lighting pass and GI as soon as the major engines support it. Hardware rays will be much more efficient regardless of the size of the chip, and it has a much better chance of being used on a fixed console than on PC.

    The same goes for tensor cores: Sony have been investing in sparse rendering resolvers and CB (checkerboard rendering), and it would equally help PSVR foveated and non-planar rendering too.

    It comes with a big compromise in transistor count per compute block, but on the surface the gain seems worth it. If AMD comes out with something competitive to the 2070, that could end up a reasonable size and clock on 7nm (or 7nm+) in 2020.

    I might be a bit optimistic but I don't see next gen without at least some helpers for both GI (be it RT cores or some other idea) and sparse rendering (be it tensor core or something else).
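
    To make the "useful all over the engine" point concrete, here's a rough sketch of the kind of generic ray-query interface an engine could expose; the names (Scene::castRay, hasLineOfSight, ambientOcclusion) are made up for the example, and the backend could equally be a compute-shader BVH walk today or hardware rays later. The same query serves gameplay logic and a crude lighting/GI-style gather:

    ```cpp
    #include <cmath>
    #include <cstdlib>

    struct Vec3 { float x, y, z; };
    struct RayHit { bool hit; float t; };

    struct Scene {
        // Placeholder backend: a real engine would route this to a BVH traversal
        // in compute, or to hardware ray queries where the GPU supports them.
        RayHit castRay(Vec3 /*origin*/, Vec3 /*dir*/, float /*maxT*/) const {
            return {false, 0.0f};
        }
    };

    static Vec3 normalize(Vec3 v) {
        float len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
        return {v.x/len, v.y/len, v.z/len};
    }

    // 1) Gameplay: AI line-of-sight check, no rendering involved at all.
    bool hasLineOfSight(const Scene& s, Vec3 from, Vec3 to) {
        Vec3 d{to.x - from.x, to.y - from.y, to.z - from.z};
        float dist = std::sqrt(d.x*d.x + d.y*d.y + d.z*d.z);
        return !s.castRay(from, normalize(d), dist).hit;
    }

    // 2) Lighting: crude ambient-occlusion estimate, firing a handful of short
    //    rays jittered around the surface normal (a stand-in for a real GI gather).
    float ambientOcclusion(const Scene& s, Vec3 p, Vec3 n, int samples = 16) {
        int blocked = 0;
        for (int i = 0; i < samples; ++i) {
            Vec3 d{n.x + ((float)std::rand() / RAND_MAX - 0.5f),
                   n.y + ((float)std::rand() / RAND_MAX - 0.5f),
                   n.z + ((float)std::rand() / RAND_MAX - 0.5f)};
            if (s.castRay(p, normalize(d), 2.0f).hit) ++blocked;
        }
        return 1.0f - (float)blocked / samples;   // 1 = fully open, 0 = fully occluded
    }
    ```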
     
  11. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    7,715
    Likes Received:
    6,006
    Considering how weak these consoles are for this generation, I’m a bit surprised by the commentary.

    Give me weak RT hardware support in 2020 rather than waiting until 2028 for real hardware support.
     
    Heinrich04, McHuj and Scott_Arm like this.
  12. Lalaland

    Regular

    Joined:
    Feb 24, 2013
    Messages:
    596
    Likes Received:
    265
    I'm not sure where the idea that these consoles are weak comes from; given the price envelope and available tech, the XB1/PS4 represent good value for the price point (excepting the XB1 at launch; it became a decent offering with the S). Sure, it would have been nice to have a stronger CPU core, but with Bulldozer being the only game in town (Intel doesn't do consoles, Ryzen was unavailable, and ARM, nope), 6-ish Jaguar cores were a good compromise. I don't think we've crossed that "PC for <$400 beats PS4 Pro" point yet, have we?

    If the new boxes get some sort of RT hardware, that would be lovely, but I don't see it as a deal breaker given the relative immaturity of H/W-accelerated RT in general.
     
    milk and eloyc like this.
  13. Xbat

    Regular Newcomer

    Joined:
    Jan 31, 2013
    Messages:
    799
    Likes Received:
    448
    Location:
    A farm in the middle of nowhere
    What's the chance of the next Xbox sticking with Vega? I seem to remember reading somewhere that AMD will have a 7nm Vega.
     
  14. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    13,175
    Likes Received:
    3,586
    7nm Vega won't be released as a general consumer product. It's for their pro line of cards.

    I think it's most likely that Navi will be the basis of next-gen consoles. We know as much for PlayStation 5.
     
  15. Xbat

    Regular Newcomer

    Joined:
    Jan 31, 2013
    Messages:
    799
    Likes Received:
    448
    Location:
    A farm in the middle of nowhere
    That's what I expect too, but it's that whole "2/3 of the Navi team working for Sony" thing which made me think about Vega. I still think Intel might be a long shot for Xbox.
     
  16. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    13,175
    Likes Received:
    3,586
    I'm not sure I believe Navi is going to be a product exclusive to Sony. It is their next GPU architecture for the general consumer. It's their late 2019/2020 roadmap, unless they plan on not releasing a new PC GPU until 2021.
     
    DSoup and Lalaland like this.
  17. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,723
    Likes Received:
    193
    Location:
    Stateless
    Well, it is not likely for big N ;) Though I really wonder what their options are going to be. Nvidia produces bigger and bigger SoCs that are anything but cheap, and there are no off-the-shelf SoCs in sight that they could use. It is not an emergency for the Switch, but the 3DS is ageing and the Switch SoC does not seem to be a good choice.

    As for Microsoft, I would not make any bet; imho the whole division needs to be rebooted. The XB1X has not had the expected effect, it is obvious they are not sure how to advertise it, they speak of an upcoming system, etc. Are they trying to kill it as they killed the decent XB1S? Saying that they are taking the long view is giving them quite some credit. It is a bunch of good communicators sitting on a pile of money and other resources that barely manages to keep the ship afloat (and arguably doesn't).
    MSFT needs to reconsider its approach to gaming and do so as a single entity. There are glaring gaps in their offering; the way Windows behaves on a TV comes to mind. You can have a nice gaming mini PC, but the software is not OK.

    Sony, well, they are in a pretty good spot now; as in the PS2 era, they can launch pretty much whenever they want, as long as the product is competitive from the get-go and backed by solid launch titles.
     
    egoless likes this.
  18. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,142
    Likes Received:
    1,830
    Location:
    Finland
    The "RT core" is actually 72 specialized units split into SM's
     
  19. Lalaland

    Regular

    Joined:
    Feb 24, 2013
    Messages:
    596
    Likes Received:
    265
    I could see Nintendo asking Qualcomm for a derivative of their SoC family. I like my Switch, but the battery life on that thing is even worse than my 3DS's. The key challenge is whether Qualcomm would be bothered to spend the R&D dollars to produce an SDK for a Switch 2 or whatever it ends up being. They make a lot of money right now just selling chips for Android and supporting that, without having to help build a whole new SDK environment.

    I am genuinely unsure how much longer Nvidia are going to be able to keep the SoC division going. In the most recent quarter they announced that they will be supplying DRIVE to Daimler/Bosch for next year, but $123 million in revenue for the last quarter makes it less than 3% of gross revenue, and it is presumably a bigger drain on R&D than the other three divisions, which can leverage the general GPU work.
     
  20. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,563
    Likes Received:
    1,980
    Don't dismiss the fact that their priority is their professional and other high-margin parts. So the bulk of their R&D is going to be spent on products that deliver the best performance for those markets (or that create new, lucrative ones). Deriving gaming designs from those products and finding ways to leverage their technology is going to be a lot cheaper than starting from the ground up with a design specifically targeting what provides the best performance for games today and over the marketing life of the product. That doesn't necessarily mean this route can't also deliver the best result for their gaming designs, but it's not a given that it was chosen because it was best for gaming.
     
    DSoup and Shortbread like this.