Impact of nVidia Turing RayTracing enhanced GPUs on next-gen consoles *spawn

Regarding the debate around general-purpose compute vs dedicated "fixed function" HW for RT, here's Microsoft's stance on it:

https://blogs.msdn.microsoft.com/directx/2018/03/19/announcing-microsoft-directx-raytracing/

"You may have noticed that DXR does not introduce a new GPU engine to go alongside DX12’s existing Graphics and Compute engines. This is intentional – DXR workloads can be run on either of DX12’s existing engines. The primary reason for this is that, fundamentally, DXR is a compute-like workload. It does not require complex state such as output merger blend modes or input assembler vertex layouts. A secondary reason, however, is that representing DXR as a compute-like workload is aligned to what we see as the future of graphics, namely that hardware will be increasingly general-purpose, and eventually most fixed-function units will be replaced by HLSL code.
"

So yeah, Turing's RT Cores go against Microsoft's DXR vision. But this just strengthens my belief that Turing is principally a pro-grade GPU aimed at conquering deep learning and, most importantly (compared to Volta), the CGI industry by totally replacing CPU-based render farms in the long run (which is IMO the right way to go, and I fully support Nvidia in this endeavour). PS5/Xbox Scarlet will support RT, but don't be surprised if they don't include dedicated fixed-function HW for it.
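To illustrate what "compute-like workload" means in practice, here's a rough sketch (not from any shipping engine, just my reading of the DXR docs) of how a DXR dispatch gets recorded on an ordinary D3D12 command list; the pipeline and shader-table setup is omitted and the function name is a placeholder:

```cpp
#include <d3d12.h>

// Sketch only: recording a DXR dispatch looks much like a compute dispatch.
// There is no separate "ray tracing engine"; the work goes through the same
// command lists and queues as existing graphics and compute work.
void RecordRayDispatch(ID3D12GraphicsCommandList4* cmdList,  // RS5-era interface
                       ID3D12StateObject* rtPipeline,        // ray tracing pipeline state object
                       const D3D12_DISPATCH_RAYS_DESC& rays) // shader tables + grid size
{
    cmdList->SetPipelineState1(rtPipeline); // bind the RT pipeline (instead of a PSO)
    cmdList->DispatchRays(&rays);           // launch width x height x depth rays
}
```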
 
@Ike Turner Many of the Nvidia demos for Turing used Microsoft's DXR API, including UE4's Star Wars demo and Battlefield 5. Neither of these are proprietary demos and would work on any GPU that has drivers that support DXR. Turing is just the first GPU to make real-time ray-tracing viable for games. It doesn't appear that Turing is counter to Microsoft's vision for ray tracing or the DXR API.
 
@Ike Turner Many of the Nvidia demos for Turing used Microsoft's DXR API, including UE4's Star Wars demo and Battlefield 5. Neither of these are proprietary demos and would work on any GPU that has drivers that support DXR. Turing is just the first GPU to make real-time ray-tracing viable for games. It doesn't appear that Turing is counter to Microsoft's vision for ray tracing or the DXR API.
Yes. As I've already said several times here, exactly the same thing: all the implementations we have been shown so far are simply DXR + RTX for denoising (so the RT part should work without much dev tinkering on non-Turing hardware, just slower). My post was about having "supposedly" dedicated HW blocks for RT (NV apparently doesn't want to discuss what the RT Cores exactly are at this time), which is in contrast to Microsoft's statement. DXR was built so that dedicated RT hardware wouldn't be necessary, but Nvidia had to do it for the market it was aiming for (render farms), and it just logically trickled down to consumer cards. This doesn't mean it will be the way forward for other vendors.

"Turing is just the first gpu to make real-time ray-tracing viable for games" This is Nvidia's claim but this has yet to be known given that nobody outside them has ever tested those implementations on non Turing GPUs. DXR can be implemented in many ways throughout an engine and we will finally be able to have a benchmark for it in a few weeks when 3DMark releases its latest version with DXR support alongside Windows 10 Redstone 5 (and most probably DXR enabled AMD compatible drivers). The early prototype in March during GDC was running in real-time on a "Single current GPU" (the only DXR feature in it is RT reflections). It will run a lot better on Turing GPUs... that's a given.
 
In the PS2 era there was far less flow from console to PC; in the modern era everything comes out more or less everywhere. During the PC 3D heyday, DX seemed to change every other week, and Nvidia and AMD could both define baselines by launching new must-have features. With the 360/PS3 era that changed: consoles defined the standard, PC started to get console games with bells on, and it's stayed that way since. This is why I say consoles define the baseline, and you won't see significant investment from software makers in PC-specific features, let alone GPU-specific features such as Turing RT. They'll get a minimal investment if NV stumps up the cash via the TWIMTBP program, but teams certainly won't be building RT into the base design.

Oh, and I think we're mostly on the same page: I prefer PC for most single-player titles, as I have an unhealthy obsession with decent levels of AF that consoles can rarely satisfy.
But consoles are PCs these days, and developers still target the feature set of the median PC, which more or less matches the feature set available on consoles. Devs will target new graphics features when the median PC user has GPUs that include them. It's not the consoles; it's what makes sense for the majority of users. Consoles or no consoles, devs are motivated to support mainstream features. It will be a long while until devs fully support the visual features available on the RTX 20-series GPUs, and that will only happen when those GPUs are mainstream.
 
PC gaming is bigger than ever, even bigger than console gaming in total, and Nvidia knows this. The 20 series is going to be popular, especially when cheaper variants arrive.

I saw a YouTube report stating that everything xx60 and below will be labeled GTX and will lack RT features.
 
Just had a thought: assuming the next generation of consoles don't contain the equivalent of Nvidia's RT hardware, could the presence of real-time RT still make it easier, quicker, and cheaper for developers to fake it with existing techniques?

For example, using real-time RT to provide an accurate reference point when using existing techniques like GI and light probes.
 
I don't really know where things are going with RT in the near term but I see Polyphony did a presentation at CEDEC the other day.

http://www.polyphony.co.jp/publications/

Not sure if this is software or hardware-based, but interesting that Sony are working on it for future games at least.
From a quick look at this paper, it seems to describe their light-baking solution (not real-time) for the track lighting and for the photo-mode/scapes feature.
 
Just had a thought: assuming the next generation of consoles don't contain the equivalent of Nvidia's RT hardware, could the presence of real-time RT still make it easier, quicker, and cheaper for developers to fake it with existing techniques?

For example, using real-time RT to provide an accurate reference point when using existing techniques like GI and light probes.
Ray casts are incredibly common for graphics, AI, etc., so acceleration should be useful. However, I presume the RT hardware in GPUs is highly parallelised, dealing with large batches of rays, so you'd need workloads that fit the hardware. Ray-traced graphics are ideal, as the hardware is designed for that, but other stuff like AI, maybe not so much?

Hmmm. I've just realised the best accelerator for next-gen is... Cell! Yes, once again! It's a great design for ray tracing, would be flexible enough for everything from sporadic rays to full-on batched scene tracing, and for the same transistor count as Turing (18 billion), you could fit ~80 original Cell BBEs. At 0.25 TFLOPS peak per Cell, that'd be 80 × 0.25 = 20 teraflops of purely programmable rendering power, excluding all the advantages of a new, improved Cell2 architecture that could add a Tensor-like core and other lovely things.
 
From a quick look at this paper, it seems to describe their light-baking solution (not real-time) for the track lighting and for the photo-mode/scapes feature.

That makes sense. Still interesting that Polyphony are already doing RT so early, though. Must be a good indicator that it will feature heavily next-gen one way or another.
 
Ray casts are incredibly common for graphics, AI, etc., so acceleration should be useful. However, I presume the RT hardware in GPUs is highly parallelised, dealing with large batches of rays, so you'd need workloads that fit the hardware. Ray-traced graphics are ideal, as the hardware is designed for that, but other stuff like AI, maybe not so much?
Those ray casts could get plenty expensive the more interactivity you have with the world, AI, etc. I'm thinking about Fallout-style world builders, where you're often projecting back onto the world. Since it's attached to the compute queue according to the other doc, it might be possible to leverage it for such a thing. Would be interesting to hear more about whether it could be taken advantage of.
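If it did get used that way, I'd imagine something like a batching layer sitting between gameplay code and the GPU. A purely speculative sketch (all names made up), just to show the shape of it:

```cpp
#include <cstddef>
#include <vector>

// Speculative sketch: gameplay/AI code files ray requests during the frame,
// and they all get traced in one wide GPU pass (the kind of batch the RT
// hardware is built for), instead of many tiny individual casts.
struct RayRequest { float origin[3]; float dir[3]; float maxDistance; };
struct RayResult  { bool hit; float distance; };

class RayBatch {
public:
    // Queue a ray; the result is only valid after Flush() has run.
    std::size_t Add(const RayRequest& r) {
        requests.push_back(r);
        return requests.size() - 1;
    }

    // Placeholder: in a real engine this would upload 'requests' to a GPU
    // buffer, run one ray-tracing/compute dispatch over all of them, and
    // read the results back for use next frame.
    void Flush() {
        results.assign(requests.size(), RayResult{false, 0.0f});
        requests.clear();
    }

    std::vector<RayRequest> requests;
    std::vector<RayResult>  results;
};
```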

Hmmm. I've just realised the best accelerator for next-gen is... Cell! Yes, once again! It's a great design for ray tracing, would be flexible enough.
 
So, further in the future, maybe not next gen or Nvidia 3080 hardware but longer out than that, will we see PS2-like hardware, in the sense that there are no specialised hardware features, just powerful, flexible processors?
 
So, further in the future, maybe not next gen or Nvidia 3080 hardware but longer out than that, will we see PS2-like hardware, in the sense that there are no specialised hardware features, just powerful, flexible processors?
It may already be flexible: from what we've read so far, it's some additions to the Turing SM.

The Tensor core is probably the only fixed-function hardware in there, which is not necessarily directly related to RT at all. They were leveraging it for an improved AA technique.
 
Well, you would expect that first-party devs would have had some input on next-gen consoles; it's just a matter of what hardware AMD has available for next gen. So if ray tracing is viable hardware-wise they will have it, methinks, but if they don't, it would be because they came to the conclusion the balance wasn't right just yet.

I feel they don't have enough silicon for the graphics part of the APU to support ray tracing.
 

Plus, most advanced PC gaming engines already support some form of RT. Adapting it to Nvidia's and AMD's wares through Vulkan/DX12 and their respective drivers shouldn't be overly complicated. IMHO, ray tracing will be as important to FUTURE gaming as good-quality anti-aliasing is to current gaming.
 