Next Generation Hardware Speculation with a Technical Spin [2018]

Discussion in 'Console Technology' started by Tkumpathenurpahl, Jan 19, 2018.

Thread Status:
Not open for further replies.
  1. The TU102 is a 7.5% smaller GPU than GV100, with 10% fewer SM / Tensor units and 25% fewer ROPs.
    A difference of 40 ROPs is certainly not nothing.

    Especially when you consider that the 2080 Ti's 68 RT units are the bare minimum to achieve those effects at 1080p60.
     
    #3061 Deleted member 13524, Oct 19, 2018
    Last edited by a moderator: Oct 19, 2018
  2. Ike Turner

    Veteran

    Joined:
    Jul 30, 2005
    Messages:
    2,110
    Likes Received:
    2,304
    I thought the subject at hand was ray tracing? In this case nobody but Nvidia is marketing DXR/RTX support for those games. What does having console marketing rights for those games have to do with what we were talking about? You lost me there...
    The Coalition, like all MS first-party studios (besides Turn 10 & 343i), is using UE4. They will "freely" get DXR support once it's implemented by Epic later in 2019 (the current internal branch developed with Nvidia is a total hack job done for R&D; UE4.22 will initially only support RT shadows next spring). All those studios are also PC devs, so once again we can't assume that the console versions of their games will have any kind of RT support just because of this.
     
  3. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,104
    Likes Received:
    16,896
    Location:
    Under my bridge
    No-one's shrugged it off. Some realistic objections have been presented but weren't addressed with reasonable comebacks, only with references to 'raytracing is coming.'

    As a pro-RT enthusiast looking forward to the RT'd future, I still have concerns about its economic viability in a console. If RTX cost $300 and rendered what we're seeing at 4K30 or 1080p60, we'd be having a very different conversation, with everyone agreeing it was the future based on the data. However, the data so far says that a massive piece of silicon can raytrace, just not amazingly quickly. From that we're left to speculate how things will move, and whether they'll move fast enough that rasterisation can't keep up and a console without RT hardware will date very quickly, particularly as regards using non-specialist hardware to raytrace to a 'good enough' standard; we already know raytracing as a software solution exists and will continue.
     
  4. iroboto

    iroboto Daft Funk
    Legend Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    14,833
    Likes Received:
    18,633
    Location:
    The North
    My bad. Let me try to do my best.
    It's a representation of chumminess. The game developers are coding against DXR, Nvidia releases the drivers that support DXR, and it works. Ultimately the games are being coded in DXR and not some weird Nvidia extension.
    Why this is important comes down to the nature of DXR: it's a capability flag. If the hardware has the capability, DXR uses the hardware path; if not, it takes another path entirely. This operates like any other feature in DirectX.
    If you're following me up to here, remember the way MS developed the X1X. They took existing games/code, simulated their performance, profiled them, and made modifications to the chip and observed the output before they started burning test silicon. This is how they got great performance at 4K without much guesswork or insane amounts of optimization on the developer side of things.

    Well, assume they want to do that exact same thing again for their next console. They can already profile 4K performance; X1X games provide all the data points they need there. But they haven't had any real DXR games to test against... and now they will. That DXR code will remain in console builds because it'll just be ignored at runtime, but in the simulation it doesn't have to be.

    They can then leverage it and profile accordingly.
    That is of course, if they were planning to have RT in a console.

    all speculation though. please ignore.
     
    turkey, OCASM and pharma like this.
  5. Jay

    Jay
    Veteran

    Joined:
    Aug 3, 2013
    Messages:
    4,032
    Likes Received:
    3,428
    I think pretty much everyone agrees (some form of) RT is the future; the question is how far in the future: next gen, or the gen after.

    The problem is that we have no idea what AMD has lined up:
    • Fully programmable and flexible
    • Fixed single use function
    • Separate accelerator
    • Nothing at all

    We all know that Nvidia won't be in either the next PlayStation or Xbox.
    Will AMD's implementation be like Nvidia's? Who knows.
    But if the next gen does have RT, it will be used, as long as its net effect isn't negative, which would be a huge cockup.
    If it's usable, it will be a very good thing, as it will make RT a reasonable baseline for games sooner.

    We should all be blaming AMD for us not knowing anything about their lineup :yep2:
     
    OCASM and BRiT like this.
  6. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,104
    Likes Received:
    16,896
    Location:
    Under my bridge
    Well, some of us don't believe nVidia's solution is at all optimal, being based on their AI work rather than a dedicated push for realtime RT for computer games. Fixed-function denoising, or compute-based intelligent denoising (using fat buffers instead of machine-learned 2D image comparison), could be a lot more efficient, saving on the Tensor cores. Ray intersect tests sound like something that could be added to compute units as a block?? Then you have the memory access, which might just need some advanced cache thingy (possibly a thingie instead).
     
    AstuteCobra and Silent_Buddha like this.
  7. troyan

    Regular

    Joined:
    Sep 1, 2015
    Messages:
    605
    Likes Received:
    1,126
    Like RT Cores?
     
    egoless, turkey, OCASM and 1 other person like this.
  8. Jay

    Jay
    Veteran

    Joined:
    Aug 3, 2013
    Messages:
    4,032
    Likes Received:
    3,428
    From a strictly selfish point of view, I find that a much more interesting conversation:
    other possible implementations for use in a console, etc.
    I.e., Sony or MS say "we want some form of RT acceleration"; how can they go about it without having to worry about other market considerations like pro cards, etc.?
     
  9. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,423
    Likes Received:
    10,317
    Yes, RT that's commercially feasible as a mass-market product (in devices below 400 USD) and fast enough at those price targets to be a net benefit is unlikely to look anything like NV's RTX series.

    I.e., neither AMD's nor NV's future DXR-performant, consumer-focused accelerators are likely to resemble RTX. Ike Turner is likely correct that this was NV trying to find a way to make their pro-market GPU appealing to PC gaming enthusiasts.

    Heck, as early as NV's next consumer GPU release we might see something wildly divergent from RTX, with fewer fixed-function units. RTX could just as well be NV's backup plan, something to ship while node transitions come slower than they'd like (i.e. a proper next-gen consumer product may have been reliant on earlier availability of a smaller silicon node).

    I wouldn't take RTX as some predictor or foreshadowing of how future GPUs with RT support will look.

    Regards,
    SB
     
  10. beyondtest

    Newcomer

    Joined:
    Jun 3, 2018
    Messages:
    58
    Likes Received:
    13
    Now that you mention 8K, I guess that won't be the target. If not RT, then maybe 60 FPS, where the next-gen Pro and X get a heavily improved CPU and a less improved GPU? That seems less likely unless VR, where higher framerate matters, really takes off; framerate is a rabbit hole console makers don't usually push, since more impressive graphics per generation are easier to market.
     
  11. OCASM

    Regular

    Joined:
    Nov 12, 2016
    Messages:
    921
    Likes Received:
    874
    Hybrid rendering will be dominant for at least the next decade.

    My bet is on fixed function. For the first gen, what matters is speed and ease of adoption. Flexibility comes later.

    OTOY uses AI denoising as well. RTRT and denoising go hand in hand. AI denoising makes use of fat buffers too.
     
    Heinrich4 and pharma like this.
  12. beyondtest

    Newcomer

    Joined:
    Jun 3, 2018
    Messages:
    58
    Likes Received:
    13
    Yikes.

    Edited.
     
  13. Allandor

    Regular

    Joined:
    Oct 6, 2013
    Messages:
    842
    Likes Received:
    879
    Yes, even game engines, but not actually to get it into the final product; rather to see how e.g. lighting conditions behave in real time at design time. RT just makes it easier to find the optimal settings, but this is only for design time.
    There are other hybrid "RT technologies" already in use on current consoles. Well, we could call them RT, since Nvidia promotes RTX as RT when it's really just a hybrid solution. There may be some lightweight hardware in the next consoles that could be used for something like lighting conditions, but nothing for reflections etc. like we've seen in the BF V demos; that is just too compute-intensive. But the technology can be used at design time to get e.g. better "textures" for screenspace-reflection-like effects.
    Yes, MS made an API for it, but that doesn't mean it will be used for games on consumer devices. MS makes many APIs, and right now RT(X) is just a buzzword that needs to be used everywhere, just like VR in the last two years, or like MS needed to promote cloud compute for the XB1.
     
    milk and egoless like this.
  14. Magnum_Force

    Newcomer

    Joined:
    Mar 12, 2008
    Messages:
    104
    Likes Received:
    70
    Probably not the right place for this, but it's worth pointing out that Nvidia's "RT Cores" are more accurately described as "BVH Accelerators" - calling them "RT Cores" is a bit like calling ROPs "Rasterization Cores".

    I think that makes sense ....
     
    AstuteCobra, turkey, vipa899 and 6 others like this.
  15. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,360
    Likes Received:
    1,377
    Why on God's green earth should that mean that "rasterisation has reached its limits"?
    It simply means that ray tracing is the new buzzword to try to sell new stuff to the yokels. Like VR or 3D, or... Surely you've seen this over and over through the years?
    Ray tracing has the same issue today as it always had: efficiency. And it actually has a harder battle to fight today, since raster shader approaches do a decent job now compared with, say, fifteen years back. The fact that the prospects for lithographic advances are pretty grim, and that computing has emphatically gravitated towards mobile devices, isn't helping its case either.
    The proof of the pudding will be in the eating. If ray tracing approaches produce a better result than spending the same resources elsewhere, then they will have a case. Otherwise not, particularly in gaming, which is all about providing entertainment value.
     
  16. OCASM

    Regular

    Joined:
    Nov 12, 2016
    Messages:
    921
    Likes Received:
    874
    Screen space reflections are garbage and should disappear as soon as possible.

    The rest of your post is just denial.


    Except that, thanks to reconstruction techniques, ray tracing took a massive leap forward in recent years, finally making it viable for realtime use.

     
  17. Tkumpathenurpahl

    Tkumpathenurpahl Oil Monsieur Geezer
    Veteran

    Joined:
    Apr 3, 2016
    Messages:
    1,910
    Likes Received:
    1,929
    Is there really any need to be so rude?

    Every engine worth its salt has had global illumination since before this generation began, and we can count the number of globally illuminated console games on one hand. One deformed hand, missing several fingers, at that.

    There's every chance that RTRT will see the same fate next generation: global illumination becomes the norm, and the occasional game - of a similar scope to The Tomorrow Children or Driveclub - knocks off everyone's eye-socks.

    So, before getting so salty over some rays, please just bear in mind that your arguments were applicable to GI only a few years ago, except that RTRT is a less known quantity.
     
    Allandor and London Geezer like this.
  18. VitaminB6

    Regular

    Joined:
    Mar 22, 2017
    Messages:
    279
    Likes Received:
    388
    RT may be a mid-gen thing for the next console cycle. The current mid-gen's main selling point was 4K, which leads me to believe they'll want an obvious selling point to check off for the next mid-gen refresh. With diminishing returns already somewhat apparent going from 1080p to 4K, and even more apparent beyond that, I could see RT as a great reason to upgrade to a mid-gen console. Also, I'm not sure AMD or Nvidia has tech that's ready or that makes sense for a console APU launching in 2019/20, but I'm certainly not an expert on that.
     
  19. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,511
    Likes Received:
    24,411
    We don't tolerate that sort of behavior in the console forums, so please check your attitude.
     
  20. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,104
    Likes Received:
    16,896
    Location:
    Under my bridge
    How does that equate to rasterisation having met its limits?


    Yay, raytracing's gonna solve all our shadowing problems. :p

    Kidding aside, raytracing is reliant on hacks to accelerate it, so the ideal, perfect renderer remains a ways off. Furthermore, that video shows the pursuit of really low-level raytracing, from before RTX existed. Good-quality lighting is being achieved with one sample per pixel, which is in the realms of doable as RT on compute in a next-gen console. If adding RT acceleration structures is cost-effective in silicon, their inclusion makes sense; but if it requires considerable compromise of raw shader power, RT hardware could be left out without games suffering too much, while maintaining maximum flexibility.
     
    Allandor likes this.