Impact of nVidia Turing RayTracing enhanced GPUs on next-gen consoles *spawn

Discussion in 'Console Technology' started by vipa899, Aug 18, 2018.

Thread Status:
Not open for further replies.
  1. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,570
    Likes Received:
    1,393
    Are we positive raytracing is going to take over?

    All rendering is, in a sense, a collection of tricks to fake the real thing more quickly. At what point can you fake 70, 80, 90 percent of what raytracing does, much faster, on traditional GPUs? And will people honestly care about the 10% difference RT makes? Are people going to be that worried about really good reflections if the competitor's card is twice as fast because it isn't dedicating massive die area to raytracing cores?

    It'll be interesting to see it play out, in my book anyway. If I'm AMD, I'm trying to use my die area to make my cards 2x as fast as Turing at traditional rendering and see what happens. Of course, it's AMD, who seem to intentionally cripple themselves (why in the hell they are still mucking around with HBM while Nvidia cleans their clock with GDDR is beyond me)
     
    Lalaland and eloyc like this.
  2. MrFox

    MrFox Deludedly Fantastic
    Legend Veteran

    Joined:
    Jan 7, 2012
    Messages:
    6,418
    Likes Received:
    5,819
    GDDR6 is an interim solution until HBM becomes the norm. There is a physical limit on energy per bit over an FR4 PCB (effectively a transmission line) and on the detectable threshold within the "eye" of the signal. The reason we have GDDR6 at all is because HBM keeps failing its cost/performance promises. Lots of challenges.

    As I understand it, 20 Gbps is the end of the road on a PCB without going to balanced (differential) signalling. Any effort to advance HBM will pay off in the future. So... I like that AMD tried, even if it hasn't paid off yet.
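    The trade-off above is narrow-and-fast (GDDR6 over PCB traces) versus wide-and-slow (HBM on an interposer). As a rough back-of-envelope sketch, using illustrative figures from typical 2018-era parts rather than exact specs:

```python
# Peak-bandwidth comparison: GDDR6 pushes few pins very fast over a PCB,
# HBM pushes many pins slowly over an interposer. Figures are illustrative.

def peak_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = bus width (bits) * per-pin rate (Gbit/s) / 8."""
    return bus_width_bits * pin_rate_gbps / 8

# GDDR6: e.g. a 256-bit bus at 14 Gbit/s per pin (RTX 2080 class)
gddr6 = peak_bandwidth_gbs(256, 14.0)

# HBM2: two 1024-bit stacks at ~2 Gbit/s per pin (Vega class)
hbm2 = peak_bandwidth_gbs(2 * 1024, 2.0)

print(f"GDDR6 256-bit @ 14 Gbps: {gddr6:.0f} GB/s")  # 448 GB/s
print(f"HBM2 2048-bit @  2 Gbps: {hbm2:.0f} GB/s")   # 512 GB/s
```

    Similar headline bandwidth, but the GDDR6 pins each run 7x faster, which is where the per-bit signalling energy cost on FR4 comes from.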
     
    #142 MrFox, Aug 25, 2018
    Last edited: Aug 25, 2018
    Michellstar and Tkumpathenurpahl like this.
  3. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,181
    Likes Received:
    1,175
    I co-wrote a molecular ray-tracer 30 years ago; you mean I will finally take over the world?
    AAA titles find most of their revenue on consoles, and that should tell you something about development directions right there. (Not to mention that mobile generates greater revenue than either consoles or PC.) And there are already methods that achieve the main things pure ray tracing solves using current GPU technology. It boils down to whether ray-tracing hacks, given better hardware support, can solve those problems better/faster/cheaper. If not, that transistor and power budget is better spent elsewhere. The jury is still out on that, of course, although the distinct lack of hard data at launch was a broad hint about the current state of affairs.
    Ray tracing is the room-temperature superconductor of rendering. We have been able to use it for decades and have done so where it makes sense, but no one has been able to make the approach efficient enough, even with dedicated hardware, for general gaming use. I doubt Nvidia's hybrid proposal is efficient enough now either; it's just implemented on hardware powerful enough that its margins allow some use of raytracing while still maintaining reasonable performance. But that is not good enough for the overall industry, where the push towards efficiency is total.
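    For readers unfamiliar with what the dedicated hardware actually accelerates: Turing's RT cores speed up BVH traversal and ray-triangle intersection. As the simplest illustration of the kind of intersection kernel at the heart of any ray tracer (a sketch, using a ray-sphere test rather than the ray-triangle test real hardware implements):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance t along a normalized ray to the nearest sphere hit,
    or None on a miss. Solves |o + t*d - c|^2 = r^2, a quadratic in t."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c      # coefficient a == 1 for a unit direction
    if disc < 0.0:
        return None             # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# Ray from the origin along +z toward a unit sphere centered at z = 5
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

    A full renderer fires millions of these tests per frame against millions of primitives, which is why acceleration structures, and now fixed-function hardware, matter so much.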
     
    Ike Turner and vipa899 like this.
  4. dobwal

    Legend Veteran

    Joined:
    Oct 26, 2005
    Messages:
    5,357
    Likes Received:
    1,382
    The 360 gave us first gen unified shaders.

    About the best thing that can happen for subsequent generations of RT hardware on PCs is for first gen RT hardware to show up in the next consoles.
     
    #144 dobwal, Aug 25, 2018
    Last edited: Aug 25, 2018
    egoless, Scott_Arm and Heinrich04 like this.
  5. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    12,416
    Likes Received:
    7,652
    Location:
    London, UK
    Exactly this. Nvidia know that the vast majority of their profits derive from workstation, server and datacentre purchasers. Their market segmentation slide suggests gaming is big, but they segment markets by product, and a metric ton of 'gaming' products (everything GeForce branded) go into small, medium and large server farms rather than gaming PCs, to the extent that Nvidia updated the end-user licence terms for GeForce products earlier this year to exclude data centre usage (not enforceable in many countries).

    5-6 years back, Nvidia's R&D was focussed on gaming GPUs and everything else was a byproduct, whereas now Nvidia's R&D is focussed on what we used to call 'big iron' and gaming GPUs are taking a back seat. The 2080 isn't getting RT because Nvidia think gamers want it; it's because this is what a lot of large-scale commercial server farms want in new server products.
     
    w0lfram, milk and Ike Turner like this.
  6. eloyc

    Veteran Regular

    Joined:
    Jan 23, 2009
    Messages:
    2,262
    Likes Received:
    1,374
    While I mostly agree with your points above, I don't think the last thing you mention would be such a misstep. Yes, picture one company launching with RT BUT the other showing off games with better resolution, AA, physics and whatnot, thanks to the current state and evolution of traditional graphics, backed by the additional power of new hardware that spends none of its resources on the computationally expensive process of RT. From a visual standpoint, the competition is there.

    I think the next generation is a transitional one, where RT will be resolved either by including it or by leaving it out and waiting for the tech to become more mature and cheap. So I'm sure that in the generation after that, RT will be a given.
     
    milk likes this.
  7. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    12,416
    Likes Received:
    7,652
    Location:
    London, UK
    Until RT technology is ubiquitous and all non-RT hardware platforms are dropped, RT is more expensive: you build the solution for everything that isn't RT (100% of all gaming platforms now) and then expend extra effort to support RT for the 0.01% (maybe) of the market that has RT hardware this year and next. And that assumes a better solution to the problem Nvidia's RT is aiming to solve isn't developed in the meantime, making the bespoke hardware model obsolete.

    Again, like I keep mentioning: PhysX hardware became obsolete in gaming platforms once compute moved to the core of conventional graphics hardware. If you want an idea of how technology will evolve in the next five years, you can gain valuable insights by looking back at the last fifteen, tracking the technology paradigms and seeing how they played out.
     
  8. Lalaland

    Regular

    Joined:
    Feb 24, 2013
    Messages:
    596
    Likes Received:
    266
    https://nvidianews.nvidia.com/news/nvidia-announces-financial-results-for-second-quarter-fiscal-2019

    The Pro market is the margin cream on top of the volume gaming market for Nvidia: gaming is $1.8 billion in revenue vs $0.281 billion for Pro vs $0.78 billion for datacentre. The gaming market pays the bills across the product range, and the fat margins are earned in the other lines. The difference between a Pro and a gaming card comes down to manufacturing quality (better PCB, solder fill, etc.) and drivers. Nvidia wins in the Pro space almost by default because their Pro drivers are better and have broader support. AMD has stepped up, but Nvidia's investment in CUDA and its tooling means that a lot of VFX tools, for example, still rely on CUDA over OpenCL. Hardware features have feck all to do with it.
     
  9. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    12,416
    Likes Received:
    7,652
    Location:
    London, UK
    No. :nope: For the reason I stated above: Nvidia report all non-Pro/data/server card sales under gaming, even if 20,000 1080 Ti cards go into a single data centre. This is a big problem for Nvidia who, as I stated above, don't want consumer cards being used in data centres and have updated their end-user licence in a bid to deter data centres from buying 'gaming' cards.
     
    Heinrich04 and mrcorbo like this.
  10. Ike Turner

    Veteran Regular

    Joined:
    Jul 30, 2005
    Messages:
    1,967
    Likes Received:
    1,988
    I can attest that most game & VFX studios I've personally dealt with use GeForces instead of Quadros.
    And, as stated earlier, having RT today in the GeForce line is a direct by-product of developing it principally for the professional market, where it was really needed, especially for content creation.
     
    mrcorbo and DSoup like this.
  11. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    12,416
    Likes Received:
    7,652
    Location:
    London, UK
    I manage a large farm and know a lot of people who also manage farms and this is also my experience.
     
  12. Lalaland

    Regular

    Joined:
    Feb 24, 2013
    Messages:
    596
    Likes Received:
    266
    Yup. As someone who used to sell Quadro into the digital media space, this is exactly what happened: for the same performance the Quadro card cost 4-5 times as much, so even though the GeForce cards burnt out faster it was still cheaper to buy spares. It was Adobe's shift to DirectX, and their ceasing to certify specific cards, that really started the rush to GeForce. The Quadro space is now mostly about CAD/CAM, which is largely still OpenGL and will be for a long time yet. Among the customers I worked with, real-time RT was seen as a feature in search of a use case: animation houses need better accuracy and thus preferred offline renders, and architects saw little need. It has been two years since I last sold into this space, but it wasn't driving much interest then.

    Oh, I agree Nvidia does not want GeForce in servers, as the dedicated server cards are larded to hell and back with margin, same as the Quadro series, which is why I describe them as "cream". GeForce is relatively low-margin compared to the rest, which is why Nvidia has always invested heavily in driver support and certifications in the Pro space to ensure they dominate there.
     
    mrcorbo and DSoup like this.
  13. vipa899

    Regular Newcomer

    Joined:
    Mar 31, 2017
    Messages:
    922
    Likes Received:
    354
    Location:
    Sweden
    PC gaming is bigger than ever, even bigger than console gaming in total, and Nvidia knows this. The 20 series is going to be popular, especially when cheaper variants arrive.
     
  14. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    12,416
    Likes Received:
    7,652
    Location:
    London, UK
  15. OCASM

    Regular Newcomer

    Joined:
    Nov 12, 2016
    Messages:
    921
    Likes Received:
    874
    Push towards wasteful resolutions you mean.
     
    vipa899 likes this.
  16. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,338
    Likes Received:
    9,957
    Location:
    The North
    This is true, unless you don't. Consoles play such a large role in determining base system requirements that, as long as the console manufacturers all support it, you can probably move forward with RT as a baseline and ignore any sort of rasterization hacks on PC. Developers will need to weigh the cost of rendering hacks, drivers, world builders, artists and everything else that goes into a perfectly hacked scene. I see RT being built into the engine, but perhaps I'm wrong. Aside from the capital investment into RT methods, I don't see it needing the per-scene orchestration that some faked scenes require.

    I suspect that by the time 2020 rolls around we will have seen the 2060 RTX and the models below it. AMD should have responded by then; it's not like DXR was announced last week.

    A majority of AAA titles don't run well on iGPUs either, and the games need to be cut down badly for them to operate at all.

    I get that all of this is new and the skepticism should be very present and up front. But we are also looking at consoles that will not launch for another 2 years.

    2 years is a lot of time for things in this landscape to change.
     
    vipa899 likes this.
  17. vipa899

    Regular Newcomer

    Joined:
    Mar 31, 2017
    Messages:
    922
    Likes Received:
    354
    Location:
    Sweden
    Exactly my thoughts too. By the time the PS5 releases, something like an RTX 2060 will be there, or even the next series, a 3070 etc. I guess console hardware will be in the same state as it was with the PS4/Xbox One: below mid-range PC hardware, but good enough graphics thanks to optimization etc.
    I'm thinking of AAA games like Cyberpunk, or the next game from CD Projekt, or perhaps the next Battlefield, or even Halo on PC using RT tech in some form.
     
  18. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    12,416
    Likes Received:
    7,652
    Location:
    London, UK
    Sure. Maybe you're already a dev millionaire and are content to market your game to a target audience of a few million people, with prospective sales measured in tens of thousands or, if you are exceedingly lucky, a couple of hundred thousand buyers, by limiting your game to technology in marginal usage.

    Can you take a quick look at games released this year on Steam and tell me how many have minimum hardware requirements less than five years old at the time of release? Let's see how many billionaire devs/publishers are out there who can afford to make such brave decisions.

    Greater than the risk of missing out on the latest tech is the risk of heavily investing in obsolete-by-next-year tech. No doubt if you were posting here in 2006 you would have been advocating for Sony and Microsoft to incorporate PhysX hardware in their consoles. How stupid would Microsoft and Sony have been to pass on that! :runaway:
     
    Lalaland likes this.
  19. Tkumpathenurpahl

    Tkumpathenurpahl Oil my grapes.
    Veteran Newcomer

    Joined:
    Apr 3, 2016
    Messages:
    1,543
    Likes Received:
    1,380
    Potentially, but there's still a chance of a launch next year.

    If the PS5's coming in 2019, the recent DXR demos won't have any bearing on its direction.
     
    BRiT likes this.
  20. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,338
    Likes Received:
    9,957
    Location:
    The North
    Right, I don't disagree with your points; perhaps there's a large degree of selfishness in my perspective. To be clear, I didn't really get into consoles until this gen. Prior to that it was mainly purchasing consoles for fighting games and the like; paying for Gold didn't make sense to me.

    But I do recall the pain of being a PC gamer during that period.
    When the Xbox 360 was first released, it shipped with DX9 plus some enhancements; DX9 came with a lot of new features, and the 360 shipped with unified shaders as well.
    Fast forward some years: DX10 is skipped, DX11 is released, and the majority of games are still using DX9 as their platform. Video card after video card with newer and newer features was largely ignored; support for those features was dismal.
    I bought one DX10 and three DX11 cards, and none of them were really utilized until after 2013, as far as I can see.
    And that's brutal considering compute shaders were ready to go years before the XBO and PS4 landed on the scene, and it still took an additional five-odd years for compute shaders to become a big part of rendering.
    But you can't blame that earlier generation for holding things up, because when those consoles shipped compute shaders weren't ready, the APIs weren't communicated, nothing was ready.

    Now we're in the _exact_ same position again: DXR hardware is released 2 years before the next console launch. Are platform holders really going to miss the boat on this one and hold us up for another 8 years? Because without console support, the PC space is going to see dismal movement.

    So I don't know how or what RT will bring to the market, I really don't. But I do know that if consoles miss the boat on it, RT isn't going to move forward at all, and we're stuck with basically 15+ years of DX11-feature titles, 17+ if you count from the DX11 release date. It would mean that RT in 2018 wouldn't ramp up to anything effective until nearly 2030, given how long it takes to ship a title.

    This whole discussion is ironic, honestly. Prior to Turing, I was a believer in soft transitions and rolling generations, and you believed in hard-cut generations to move new technologies forward.

    Now you and I appear to have switched positions entirely since the release of Turing.
     
    #160 iroboto, Aug 25, 2018
    Last edited: Aug 25, 2018