Next gen lighting technologies - voxelised, traced, and everything else *spawn*

So yeah, Turing's RT Cores go against Microsoft's DXR vision. But this just strengthens my belief that Turing is principally a pro-grade GPU aimed at conquering deep learning and, most importantly (compared to Volta), the CGI industry, by totally replacing CPU-based render farms in the long run (which is IMO the right way to go, and I fully support NVidia in this endeavour).


My thoughts exactly.

I think Turing will kill it in offline rendering, from smaller workstations to CPU-based render farms. Hardware dedicated to raytracing plus hardware dedicated to AI denoising is bound to make a huge difference.
I wish I had one (plus the necessary licenses for nvidia's algorithms...) because it would probably allow SolidWorks to keep a secondary 5-10 FPS window open with the raytraced object while you work on its geometry in the main program. Radeon Pro Render already allows that, but it's too slow to be usable on a Vega 64.


But for games... it's just not going to take off on Turing. The performance isn't there for real-time, and even the scripted demos we've seen so far have lots of shortcomings (low resolution and/or low framerate, reduced scene complexity, etc.), so the results in actual gameplay don't seem promising at all.

But Turing's non-RT features like DLSS and foveated rendering, OTOH, do seem promising.
That's assuming DLSS works properly in non-scripted gameplay, which we don't know yet either...
 
We're not getting path tracing right out of the gate, but you gotta start with something.
In the offline space. We had Silicon Graphics years before the N64, but Nintendo didn't try to shove a cut-down geometry pipeline into their second console, stating "we have to start somewhere." The 3D consoles that got going early are the likes of the Jaguar, which sucked because it was too little, too early. Those that did 3D at a decent and affordable level were the PS and N64, and they did very well. Same with VR - it couldn't realistically happen until now, when the tech made it possible; earlier attempts over the years failed and didn't advance much at all.

We don't need raytracing in a console; we need effective raytracing in an affordable console that isn't massively gimped in some aspects to fit an early-access feature.
 
PS1 3D quality was trash. And a lot of geometry back then had to be represented as 2D billboards because those consoles didn't have the power for full 3D graphics.

Besides, ray tracing isn't all or nothing. Hybrid rendering isn't even a compromise, since it has performance advantages over pure path tracing.
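
To illustrate the hybrid idea, here's a minimal sketch of a frame that rasterizes first and traces only selected effects; every function name here is an illustrative stub, not any real engine's API:

```python
# Minimal sketch of a hybrid frame: rasterize what rasterization is
# good at, trace rays only for the effects that need them, denoise the
# sparse results. All functions are illustrative stubs, not a real API.

def rasterize_gbuffer(scene, camera):
    # Stand-in for the usual raster pass (depth, normals, albedo).
    return {"depth": None, "normals": None, "albedo": None}

def trace_rays(gbuffer, scene, effect, rays_per_pixel):
    # Stand-in for the RT pass; hybrid renderers get away with a
    # fraction of a ray per pixel here instead of full path tracing.
    return f"noisy {effect} ({rays_per_pixel} rpp)"

def denoise(buf):
    # Stand-in for an AI or filter-based denoiser.
    return buf.replace("noisy", "denoised")

def render_frame(scene, camera):
    g = rasterize_gbuffer(scene, camera)
    shadows     = denoise(trace_rays(g, scene, "shadows", 1.0))
    reflections = denoise(trace_rays(g, scene, "reflections", 0.5))
    return g, shadows, reflections     # composited in final shading

print(render_frame(scene={}, camera={}))
```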

The demos we've seen so far are the earliest phase. No clever algorithms yet. Basic techniques implemented as an afterthought on games designed purely with rasterization in mind.

Next year is when we'll start to see the good stuff.
 
That's missing the issue. Whether hybrid is great or not (it probably will be), how do you get realtime raytracing into a $400 console?! Do you make the console $800, ship a totally gimped RT solution, or gimp the rest of the rendering pipeline? If it's physically not possible to fit RT into a console's size and economics, is it okay to create a console without RT? Yes. Will realtime RT suffer? No, it'll still see loads of investment. It'll still be featured on PC, and engines like UE will add it, as those engines are also used for professional visualisation etc.

However awesome hybrid renderers get on $1000 GPUs over the next few years, RT is not a necessity for the next iteration of consoles, unless they choose to wait until RT is a mainstream feature. People need to stop saying "we need raytracing because it's the future!" and start presenting realistic solutions for getting raytracing into a cheap box. And as I said, hopefully that can be done with better solutions than nVidia's, but none are on any roadmaps so far, and it will come as a happy surprise if it happens.
 
Console hardware design doesn't require RT to be gimped. Jaguar cores, anyone?

NVIDIA getting the ball rolling early in the PC space is good because it lets developers familiarize themselves with the technology, creates interest (and hopefully demand) for having it in next-gen consoles, and gives enough time to develop efficient algorithms that can be used in AAA games designed with RT in mind from the beginning. We should see some of that work by next year's GDC.

If consoles avoid RT, it will be relegated to an afterthought for the next generation, except maybe in eccentric PC indie games. Sure, visualization will make use of it, but performance isn't critical in that space, only quality, so you wouldn't see the push for the kind of crazy algorithms we regularly see used in games.

Of course, don't expect the same quality on consoles as you would on PC.

This conversation reminded me of this:

 
Again, you haven't addressed the issues. You parrot that you want RT. Yeah, it'd be great. Now explain how to get RT in a $400 box. ;) The RTX 2070 is ~450 mm² at 12 nm for 6 gigarays per second. Start from there for a console design that includes functional raytracing. Are you going to design a $600+ machine, or wait three years before releasing new hardware?
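
To put that figure in perspective, a quick back-of-the-envelope, taking the 6-gigarays marketing number at face value:

```python
# Back-of-the-envelope: how many rays per pixel per frame does
# "6 gigarays/s" buy at common targets? Marketing figure taken at
# face value; real throughput varies with scene and ray coherence.

GIGARAYS_PER_S = 6e9

for w, h, fps in [(1920, 1080, 30), (1920, 1080, 60), (3840, 2160, 60)]:
    rpp = GIGARAYS_PER_S / (w * h * fps)
    print(f"{w}x{h} @ {fps} fps: {rpp:5.1f} rays/pixel/frame")

# ~96 at 1080p30, ~48 at 1080p60, ~12 at 4K60 - before shading,
# BVH builds, or the raster workload claim their share of the frame.
```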
 

And even the RTX 2070 arguably may not be fast enough to provide a compelling experience without a lot of compromises.

And while I'm rooting for AMD because we need some competition in the graphics space, I'm doubtful they are currently capable of making something smaller AND more performant than the RTX 2070 when it comes to accelerating key parts of Microsoft's DXR.

Regards,
SB
 
I've just realised this is the Architecture thread, not the console thread! :oops:

I've no complaints with what nVidia is doing in the PC space. If they want to make a pro card available for gamers, and gamers want to buy it, that's all good.
 
Again, you haven't addressed the issues. You parrot that you want RT. Yeah, it'd be great. Now explain how to get RT in a $400 box. ;) The RTX 2070 is ~450 mm² at 12 nm for 6 gigarays per second. Start from there for a console design that includes functional raytracing. Are you going to design a $600+ machine, or wait three years before releasing new hardware?
Are we sure these cards couldn't have been cheaper and the current pricing isn't just NVIDIA jacking up the prices because they can?

RTX 2070 and above might be necessary for the current naive, "add-on style" RT implementations on PC, but I think an engine designed with RT in mind from the beginning would be much more efficient, especially on consoles. Just look at Claybook, and that's current gen.

Also, didn't Phil Spencer already mention RT for the next Xbox? We still don't know when the next-gen consoles will be released, so hardware could be cheaper by then.

Also, DXR and RTX aren't the same thing. For all we know consoles could adopt a completely different acceleration architecture.
 
PS1 3D quality was trash. And a lot of geometry back then had to be represented as 2D billboards because those consoles didn't have the power for full 3D graphics.
It amazes me how much some 3D veterans fall into the trap of calling an exciting new tech unnecessary and needless, all because they're too stuck in rigid old habits of thinking. Most developers are genuinely excited about ray tracing on PC; the reception is warm, which means adoption will be warm too. It's even part of DX12 now, as DXR, which means the basis for its spread is already in place.

It reminds me exactly of the old hardware T&L days, when journalists and old mummies tried to sway public opinion in the other direction, calling the tech "HYPE" and unnecessary! We all know how that turned out in the end.

Until I see that, I have to think that the T&L hype that nVidia is pushing is simply that, HYPE.
https://www.hardocp.com/article/2000/02/04/tnl_does_work_day_3/2

And here is 3dfx dismissing T&L completely in favor of more fill rate in "traditional" graphics:
https://www.anandtech.com/show/375

Now explain how to get RT in a $400 box. ;) The RTX 2070 is ~450 mm² at 12 nm for 6 gigarays per second.
Consoles are coming by 2020 on 7 nm; by that time, an RTX 2070-equivalent GPU will be much smaller and cheaper to produce. If RT is instead incorporated into the refresh consoles (say, by 2023), it will be cheaper still. Also, consoles don't have strict fps requirements; they will make do with 30 fps at 1080p just fine, and you can then upscale that 1080p output to whatever your heart desires. A 2070 should be capable of a minimum of 30 fps at that resolution, and by 2020 we should have a "3070" that's even more capable. Consoles can also involve their new, powerful Ryzen CPU cores in the RT process, making things easier.
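
For rough context on that scaling claim, a sketch of the numbers; the density factors are assumptions, since quoted logic-density gains vary and real chips shrink less than logic alone:

```python
# Rough scaling of the "~450 mm^2 at 12 nm" figure to 7 nm. The density
# factors are assumptions: quoted logic-density gains from 16/12 nm to
# 7 nm run around 2-3x, and SRAM/analog shrink far less than logic.

TU106_MM2 = 445                      # RTX 2070 die size

for density_gain in (2.0, 3.0):
    mm2 = TU106_MM2 / density_gain
    print(f"{density_gain:.0f}x density -> ~{mm2:.0f} mm^2 at 7 nm")

# ~150-220 mm^2 for 2070-class hardware: cheaper, but still a huge
# slice of a ~350 mm^2 console APU that also has to fit 8 CPU cores.
```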
 
Are we sure these cards couldn't have been cheaper and the current pricing isn't just NVIDIA jacking up the prices because they can?

Just look at Claybook, and that's current gen.
One can argue that Claybook shows RT hardware isn't necessary, as shaders alone can cover that side of rendering without taking up dedicated silicon. Compute-based RT shadows on an open GPU may be a better option for next-gen consoles.
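
For illustration, the core of that Claybook-style approach is sphere tracing a signed distance field toward the light in a compute shader. Here's a toy Python sketch of the idea, with a stand-in SDF scene (not Claybook's actual code):

```python
import math

# Toy sketch of compute-style raytraced shadows against a signed
# distance field - the kind of thing Claybook does in plain compute
# shaders with no RT hardware. Stand-in scene, not Claybook's code.

def scene_sdf(p):
    # Distance to a unit sphere at the origin (stand-in for a real scene).
    return math.sqrt(p[0]**2 + p[1]**2 + p[2]**2) - 1.0

def soft_shadow(origin, light_dir, k=8.0, max_t=10.0):
    # Sphere-trace from the shaded point toward the light; the closest
    # near-miss along the way yields a soft-shadow factor in [0, 1].
    shadow, t = 1.0, 0.02
    while t < max_t:
        d = scene_sdf([origin[i] + light_dir[i] * t for i in range(3)])
        if d < 1e-4:
            return 0.0                    # hit geometry: fully occluded
        shadow = min(shadow, k * d / t)   # penumbra estimate
        t += d                            # step by the safe distance
    return shadow

# A point below the sphere with the light straight above: occluded.
print(soft_shadow([0.0, -2.0, 0.0], [0.0, 1.0, 0.0]))   # -> 0.0
```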

Also, didn't Phil Spencer already mentioned RT for the next Xbox? We still don't know when the next gen consoles will be released so hardware prices could be cheaper by then.
For the console discussion, look at the die sizes. 450 mm² is the smallest RTX card, of questionable performance for realtime RT (maybe 900p instead of 1080p tracing?), at 12 nm. Console die budgets tend to be ~350 mm² including CPU and glue, although some PlayStations went epic.

Also, DXR and RTX aren't the same thing. For all we know consoles could adopt a completely different acceleration architecture.
I already mentioned RTX may not be the best solution and, fingers crossed, there'll be better options for consoles. RTX just isn't a great fit.
 
10x performance isn't great if, for the die size required, it only gets you to 1/4 of your target resolution. It's just a bad trade for any consumer-facing product right now. Professional markets are free to eat it all up.
 
Compare RTX to PowerVR. Wizard achieved 300 million rays per second in a mobile GPU. I can't find measurements, but it was a few mm² at 28 nm in 2014. The 2070 is 6 gigarays per second (although we've no idea if all rays are equal) on ~450 mm² at 12 nm. Twenty GR6500s could achieve the same in something well below half the size, not even factoring in improved process nodes. So probably a maximum of a quarter of the size, and quite possibly a tiny fraction.
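
Taking those vendor figures at face value (they may not count comparable rays), the arithmetic looks something like this; the per-block area is an assumption standing in for the "few mm²" guess above:

```python
# The comparison above, using the posters' own numbers. Both ray rates
# are vendor marketing figures and may not count comparable rays. The
# per-block area is an assumption standing in for "a few mm^2".

POWERVR_RAYS_PER_S = 300e6   # GR6500 Wizard, 28 nm, 2014
TURING_RAYS_PER_S  = 6e9     # RTX 2070, whole ~445 mm^2 GPU at 12 nm
GR6500_RT_MM2      = 5.0     # assumed RT-block area, illustrative only

blocks = TURING_RAYS_PER_S / POWERVR_RAYS_PER_S
print(f"{blocks:.0f} GR6500-class blocks -> ~{blocks * GR6500_RT_MM2:.0f} mm^2 at 28 nm")

# -> 20 blocks, ~100 mm^2 even on an old 28 nm node. Loose bound: the
# 2070's die also buys shaders, tensor cores, etc., not just RT.
```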

I don't think nVidia's approach to realtime raytracing acceleration is going to represent the best possible solution.
 
It amazes me how much some 3D veterans fall into the trap of calling an exciting new tech unnecessary and needless, all because they're too stuck in rigid old habits of thinking. Most developers are genuinely excited about ray tracing on PC; the reception is warm, which means adoption will be warm too. It's even part of DX12 now, as DXR, which means the basis for its spread is already in place.

It reminds me exactly of the old hardware T&L days, when journalists and old mummies tried to sway public opinion in the other direction, calling the tech "HYPE" and unnecessary! We all know how that turned out in the end.

https://www.hardocp.com/article/2000/02/04/tnl_does_work_day_3/2

And here is 3dfx dismissing T&L completely in favor of more fill rate in "traditional" graphics:
https://www.anandtech.com/show/375


Consoles are coming by 2020 on 7 nm; by that time, an RTX 2070-equivalent GPU will be much smaller and cheaper to produce. If RT is instead incorporated into the refresh consoles (say, by 2023), it will be cheaper still. Also, consoles don't have strict fps requirements; they will make do with 30 fps at 1080p just fine, and you can then upscale that 1080p output to whatever your heart desires. A 2070 should be capable of a minimum of 30 fps at that resolution, and by 2020 we should have a "3070" that's even more capable. Consoles can also involve their new, powerful Ryzen CPU cores in the RT process, making things easier.
Those links are great. The more things change, the more they stay the same :LOL:

One can argue that Claybook shows RT hardware isn't necessary, as shaders alone can cover that side of rendering without taking up dedicated silicon. Compute-based RT shadows on an open GPU may be a better option for next-gen consoles.

For the console discussion, look at the die sizes. 450 mm² is the smallest RTX card, of questionable performance for realtime RT (maybe 900p instead of 1080p tracing?), at 12 nm. Console die budgets tend to be ~350 mm² including CPU and glue, although some PlayStations went epic.

I already mentioned RTX may not be the best solution and, fingers crossed, there'll be better options for consoles. RTX just isn't a great fit.
You could do ray tracing on the CPU as well, but having dedicated hardware helps, especially in the dying days of Moore's Law:

https://spectrum.ieee.org/view-from...computer-architectures-and-software-languages

Who knows what level of RT hardware they will put in next-gen consoles; it could be inferior even to the RTX 2070's, but we also don't know how good RT can be with custom-tailored algorithms instead of the naive ones used today. For example, using texture-space shading for dynamic lightmaps that only update a few texels every frame, as sketched below.
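
As a sketch of that amortization idea (the helper names here are hypothetical, just to show the round-robin texel budget):

```python
import itertools

# Sketch of amortized texture-space lighting: trace rays for a small
# batch of lightmap texels each frame and let the rest go slightly
# stale. trace_lighting() is a hypothetical stand-in for the RT work.

LIGHTMAP_W, LIGHTMAP_H = 256, 256
TEXELS_PER_FRAME = 1024                 # fixed per-frame ray budget

lightmap = [[0.0] * LIGHTMAP_W for _ in range(LIGHTMAP_H)]
texel_cursor = itertools.cycle(range(LIGHTMAP_W * LIGHTMAP_H))

def trace_lighting(x, y, frame):
    # Stand-in: would trace GI/shadow rays for this texel's surface point.
    return 1.0

def update_lightmap(frame):
    # Refresh only TEXELS_PER_FRAME texels, round-robin across the map.
    for _ in range(TEXELS_PER_FRAME):
        i = next(texel_cursor)
        x, y = i % LIGHTMAP_W, i // LIGHTMAP_W
        lightmap[y][x] = trace_lighting(x, y, frame)

for frame in range(4):
    update_lightmap(frame)

# 256*256 / 1024 = full refresh every 64 frames; shading just samples
# the lightmap, so ray cost is flat regardless of screen resolution.
```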
 