Impact of nVidia Turing RayTracing enhanced GPUs on next-gen consoles *spawn

Maybe it's an unfair assumption on my part, but it's already a released game so I feel like it won't receive the same care as a game that's adopting ray-tracing with significant time during development. I kind of view Battlefield and Tomb Raider as tech-demo implementations, though it seems like Battlefield is in it for the long haul and will continue making improvements.
Yeah, they are certainly bolt-ons, and as we saw with Battlefield, bolt-on RT could lead to a lot of unforeseen bugs.

That being said, it will still be interesting to see the results. I suspect we'll see a lot of this type of bolt-on behaviour from a lot of transitioning products over the next two years.
 

Looking at Metro and Control, it's quite clear their investment is more significant, as they're using ray tracing more pervasively in their renderers, so I think they'll be more interesting in terms of performance benchmarks, and they'll also have had time and experience to optimize, as will Nvidia and Microsoft.
 
Issues like what, exactly? I'm interested to know because this is still a pre-computed effect, right?



They can accommodate dynamic geometry, mitigate light leaking, reduce staircasing from the low-resolution SVT, and solve for near-field global illumination.

The example of the magazine receiving somewhat random GI data because the GI resolution is 25 cm is interesting. He talks about sampling the middle between the data above and below the table, so I don't know if that's effectively a lerp, but regardless, the magazine on top of the table is affected by the lighting underneath the table because of the resolution of the SVT. They can fix it by shooting rays from the geometry into the GI volume.
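To make that concrete, here's a tiny toy sketch of the problem and the fix (my own illustration, not Remedy's code - the numbers and names like kGIAbove are made up): with GI stored every 25 cm, a point on the table top lands between a sample above and a sample below the surface, so a plain lerp drags under-table lighting onto the magazine, while pushing the sample along the normal into the GI volume (the ray idea) reads only the correct side.

```cpp
// Tiny 1D illustration of the magazine example above (not Remedy's code).
// GI lives on a coarse 25 cm grid; the table top sits between two samples.
#include <cmath>
#include <cstdio>

constexpr float kCellSize = 0.25f; // 25 cm GI grid spacing (from the talk)
constexpr float kGIAbove  = 0.9f;  // hypothetical irradiance stored above the table
constexpr float kGIBelow  = 0.1f;  // hypothetical irradiance stored underneath it

// Plain interpolation between the nearest samples below and above the surface:
// the magazine on top of the table inherits lighting from under the table.
float sampleLeaky(float height) {
    float t = std::fmod(height, kCellSize) / kCellSize; // fractional position between samples
    return kGIBelow + (kGIAbove - kGIBelow) * t;
}

// The fix described above: push the sample from the geometry along its normal
// (or trace a short ray) into the GI volume so only the unoccluded side is read.
float sampleRayOffset(float height, float normalDir /* +1 = up, -1 = down */) {
    float offsetHeight = height + normalDir * 0.5f * kCellSize;
    return (offsetHeight > height) ? kGIAbove : kGIBelow;
}

int main() {
    float tableTop = 0.875f; // halfway between two GI sample heights
    std::printf("plain lerp:  %.2f (picks up under-table GI)\n", sampleLeaky(tableTop));
    std::printf("ray offset:  %.2f (above-table GI only)\n", sampleRayOffset(tableTop, +1.0f));
}
```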
 
This is a demo of a voxel lighting solution I linked before; it hasn't developed into something usable in games yet and has quite a few artefacts. It's the kind of thing that, shown on RTX with really good framerates and scaled up to full scenes, would I think prove the value of RT hardware.

 
Interesting interview with Phil Spencer that talks about xCloud and next-generation silicon: flexible silicon for multiple uses, including machine learning. Tensor cores could fit in here.

https://twinfinite.net/2018/12/phil-spencer-buy-ea-next-gen/

“The thing that’s interesting for us as we roll forward, is we’re actually designing our next-gen silicon in such a way that it works great for playing games in the cloud, and also works very well for machine learning and other non-entertainment workloads. As a company like Microsoft, we can dual-purpose the silicon that we’re putting in.

We have a consumer use for that silicon, and we have enterprise use for those blades as well. It's all in our space around driving down the cost to serve. Your cost to serve is made up of two things: how much was the hardware, and how much time does that hardware monetize.

So if we can monetize that hardware over more cycles in the 24 hours through game streaming and other things that need CPU and GPU in the cloud, we will drive down the cost to serve in our services. So the design as we move forward is done hand-in-hand with the Azure silicon team, and I think that creates a real competitive advantage.”
 
I don't think you need to go that far to prove the value of RT. There's a lot of room between direct sampling of pre-computed GI and fully real-time computation, either path-traced or with a lower-resolution voxel representation. The more dynamic the better, but for many games that are mostly static, pre-computation will always be faster, especially because you can bake infinite light bounces, and for all scene lights. GI really is kind of the ultimate goal though. I think ray-traced AO that looks "correct" will be a bigger win than people expect, and so will soft shadows. They'll remove a lot of the flat look that games can have and give scenes much better depth. Lighting will probably be an even bigger jump, and that Remedy solution looks pretty nice. We'll see what ends up being in Control, and whether it includes their "near-field" GI, which solves a lot of the flat lighting look a lot of games have.
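As an aside on why ray-traced AO reads as "correct": instead of guessing occlusion from the depth buffer, you shoot actual rays over the hemisphere and test them against real geometry, including off-screen stuff. Rough toy sketch of the idea below - my own code with a single hardcoded sphere as the stand-in scene, not any engine's API:

```cpp
// Toy hemisphere-sampled ray-traced AO. The "scene" is one hardcoded sphere so
// the sketch compiles and runs; a real renderer would let the RT hardware
// answer the occlusion query instead.
#include <cmath>
#include <cstdio>
#include <random>

struct Vec3 { float x, y, z; };

static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// Stand-in occlusion query: does a ray from o along unit direction d hit the
// hypothetical blocker sphere within maxDist?
static bool occluded(Vec3 o, Vec3 d, float maxDist) {
    const Vec3 center{0.0f, 1.0f, 0.0f};
    const float radius = 0.5f;
    Vec3 oc{o.x - center.x, o.y - center.y, o.z - center.z};
    float b = oc.x * d.x + oc.y * d.y + oc.z * d.z;
    float c = oc.x * oc.x + oc.y * oc.y + oc.z * oc.z - radius * radius;
    float disc = b * b - c;
    if (disc < 0.0f) return false;
    float t = -b - std::sqrt(disc);
    return t > 0.0f && t < maxDist;
}

// Crude (not properly cosine-weighted) hemisphere sampling around the normal:
// returns 1 for fully open, 0 for fully occluded.
static float rayTracedAO(Vec3 p, Vec3 n, int samples, float radius, std::mt19937& rng) {
    std::uniform_real_distribution<float> uni(-1.0f, 1.0f);
    int open = 0;
    for (int i = 0; i < samples; ++i) {
        Vec3 dir = normalize({n.x + uni(rng), n.y + uni(rng), n.z + uni(rng)});
        if (dir.x * n.x + dir.y * n.y + dir.z * n.z < 0.0f) // flip below-horizon rays
            dir = {-dir.x, -dir.y, -dir.z};
        if (!occluded(p, dir, radius)) ++open;
    }
    return float(open) / float(samples);
}

int main() {
    std::mt19937 rng{42};
    Vec3 point{0.0f, 0.0f, 0.0f};   // point on the ground, blocker sphere above it
    Vec3 normal{0.0f, 1.0f, 0.0f};
    std::printf("AO at point: %.2f\n", rayTracedAO(point, normal, 256, 5.0f, rng));
}
```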
 
With no idea what the hardware is, there's not a lot to discuss. It's very unlikely to be RTX-based, as performance is well below other RTX demos, though it could be doing something different like 100% real-time tracing instead of hybrid. It's impressive if running on a PS4 (my guess, given the high degree of noise, simple scenes, slow quality iterations, and PD's status as a PlayStation developer). It may point to RT features in PS5, or not. Not even that pretty, thanks to video compression!
 
It was stated during the presentation that it is not running on PlayStation hardware.
 
Is that from 9:40? Can't make it out clearly. "We want to remind you that this is an old move and not *something* PlayStation, path tracing in real time."?? Still doesn't give us anything to go on, though. If it's an RTX 2080 Ti, performance isn't enough for a next-gen console, so we'd need something significantly faster in a PS5, but that's highly speculative. Though of course hybrid rendering would change things considerably.
 
Apparently Gran Turismo developer Polyphony Digital revealed they’ve created their own in-house real-time ray tracing tech.
Awesome!
My guess is they do cone tracing, so no noise. On the windows we see some spectral banding. This hints they store infinite hit points in an FFT basis, so the cone goes through all geometry, and after that it is checked where it is terminated and colors behind are rejected.
With cones it makes no sense to check for accurate occlusion anyway, and the approach allows very good parallelization, although the 'any hit' way seems a waste. Of course one can reject terminated cones a few times to compact the work if they go front to back.
Downside: light leaking might be hard to prevent, but for reflections this may be acceptable. Outdoors, no problem anyway.
(pure speculation - as always - did not listen to audio :) )
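For reference, a generic voxel cone trace (the technique I'm guessing at) marches a pre-filtered voxel mip chain front to back, widening the footprint with distance - which is exactly why it's smooth with no stochastic noise, and also why it can leak through thin walls. Sketch below, with a stub volume so it runs; obviously nothing to do with PD's actual code:

```cpp
// Generic voxel cone tracing march (the technique speculated about above),
// sketched on the CPU with a stub volume so it runs end to end.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };
struct Vec4 { float r, g, b, a; }; // rgb = radiance, a = occlusion

// Stub for a lookup into a pre-filtered 3D mip chain: pick the mip whose voxel
// footprint matches 'diameter'. Here it just returns grey "fog" that gets more
// opaque as the cone widens, purely so the sketch produces numbers.
static Vec4 sampleVoxelMip(Vec3 /*pos*/, float diameter) {
    float a = std::min(0.2f * diameter, 1.0f);
    return {0.5f * a, 0.5f * a, 0.5f * a, a};
}

// March the cone front to back, compositing radiance under the remaining
// transmittance. Smooth output (no stochastic noise), but coarse mips average
// thin geometry away, which is where the light leaking mentioned above comes from.
static Vec4 traceCone(Vec3 origin, Vec3 dir, float coneAngle, float maxDist) {
    Vec4 acc{0, 0, 0, 0};
    float dist = 0.05f;                               // small offset to avoid self-sampling
    while (dist < maxDist && acc.a < 0.99f) {         // early out once nearly opaque
        float diameter = 2.0f * dist * std::tan(0.5f * coneAngle);
        Vec3 pos{origin.x + dir.x * dist, origin.y + dir.y * dist, origin.z + dir.z * dist};
        Vec4 s = sampleVoxelMip(pos, diameter);
        float w = 1.0f - acc.a;                       // front-to-back compositing weight
        acc.r += w * s.r; acc.g += w * s.g; acc.b += w * s.b;
        acc.a += w * s.a;
        dist += std::max(0.5f * diameter, 0.01f);     // step size grows with the cone
    }
    return acc;
}

int main() {
    Vec4 c = traceCone({0, 0, 0}, {0, 0, 1}, 0.3f, 20.0f);
    std::printf("cone result: r=%.2f g=%.2f b=%.2f occlusion=%.2f\n", c.r, c.g, c.b, c.a);
}
```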
 
There's loads of noise. You can see the blotchiness of the denoising that gets refined over time. The specular highlights show low sampling rates with how the bloom flickers (7:25). Audio doesn't go into details at all other than to say it's path tracing in realtime.
 
The noise I see is static relative to the recording camera, so it must come from the recording. Blotches may come from both temporal reconstruction and video compression - hard to say, but I doubt they can trace every pixel every frame.

Edit: Although 'path tracing' makes cones less likely.
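For what it's worth, the "refined over time" look usually comes from temporal accumulation: each frame's few noisy samples per pixel get blended into a running history, so a locked-off shot converges while anything that invalidates the history goes blotchy again. Minimal sketch of that idea (generic, not anything PD has described):

```cpp
// Minimal sketch of temporal accumulation over noisy per-pixel samples:
// an exponential moving average that converges on static shots and is reset
// when the pixel's history is invalidated (disocclusion, fast motion, cuts).
#include <algorithm>
#include <cstdio>
#include <random>

struct Pixel {
    float history = 0.0f;   // accumulated colour (one channel for simplicity)
    float weight  = 0.0f;   // how many effective samples the history represents
};

// Blend this frame's noisy sample into the history. 'maxWeight' caps how much
// the past can dominate, trading convergence speed against ghosting.
static void accumulate(Pixel& p, float noisySample, bool historyValid, float maxWeight = 32.0f) {
    if (!historyValid) p.weight = 0.0f;                 // throw the history away
    p.weight = std::min(p.weight + 1.0f, maxWeight);
    float alpha = 1.0f / p.weight;                      // newest sample's share
    p.history = p.history + (noisySample - p.history) * alpha;
}

int main() {
    std::mt19937 rng{7};
    std::normal_distribution<float> noise(0.5f, 0.2f);  // "true" value 0.5 plus noise
    Pixel p;
    for (int frame = 0; frame < 60; ++frame)
        accumulate(p, noise(rng), /*historyValid=*/true);
    std::printf("after 60 static frames: %.3f (converging toward 0.5)\n", p.history);
    accumulate(p, noise(rng), /*historyValid=*/false);   // camera cut / disocclusion
    std::printf("after history reset:    %.3f (one noisy sample again)\n", p.history);
}
```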
 
I've also just realised that the flickery bloom could come from downsampling the buffer to blur and upscale - bloom on the PS4 version of GT Sport flickers. Looking at the reflections in the red car of the bright rectangular buildings, they are very blotchy until they pop into high quality. This to me looks like a noisy sampling, and the sampling pattern could be static.
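To illustrate the bloom point: if the half-res pass that feeds the blur effectively point-samples the source, a highlight that's only a pixel or two wide gets caught on some frames and missed on others, so the bloom pulses. Toy 1D example below (my own, nothing GT-specific):

```cpp
// Toy 1D illustration of why bloom built from a downsampled buffer can flicker:
// if the downsample effectively point-samples the source, a 1-pixel highlight
// is caught on some frames and missed on others as it moves.
#include <cstdio>

// Decimating "downsample": keep every other pixel (a naive half-res pass).
static void downsamplePoint(const float* fullRes, int n, float* halfRes) {
    for (int i = 0; i < n / 2; ++i)
        halfRes[i] = fullRes[2 * i];
}

// Box-filter downsample: average each pair, so the highlight's energy is kept
// (just spread out) no matter which pixel it lands on.
static void downsampleBox(const float* fullRes, int n, float* halfRes) {
    for (int i = 0; i < n / 2; ++i)
        halfRes[i] = 0.5f * (fullRes[2 * i] + fullRes[2 * i + 1]);
}

static float peak(const float* buf, int n) {
    float m = 0.0f;
    for (int i = 0; i < n; ++i) if (buf[i] > m) m = buf[i];
    return m;
}

int main() {
    const int kFull = 8, kHalf = 4;
    float frameA[kFull] = {0, 0, 10, 0, 0, 0, 0, 0}; // bright highlight on an even pixel
    float frameB[kFull] = {0, 0, 0, 10, 0, 0, 0, 0}; // next frame: it moved one pixel
    float half[kHalf];

    downsamplePoint(frameA, kFull, half);
    std::printf("point-sampled, frame A: peak %.1f\n", peak(half, kHalf)); // 10.0 -> big bloom
    downsamplePoint(frameB, kFull, half);
    std::printf("point-sampled, frame B: peak %.1f\n", peak(half, kHalf)); // 0.0 -> bloom vanishes

    downsampleBox(frameA, kFull, half);
    std::printf("box-filtered,  frame A: peak %.1f\n", peak(half, kHalf)); // 5.0
    downsampleBox(frameB, kFull, half);
    std::printf("box-filtered,  frame B: peak %.1f\n", peak(half, kHalf)); // 5.0 -> stable
}
```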
 
Haha, yeah, you're right - the recording camera does not move at all, they just zoom the scene, and I was fooled :) Agree about the flickering from upscaling. I also got the above comment wrong and assumed it IS PlayStation hardware. Probably I'm wrong about everything here - we'll see...
 