Impact of nVidia Turing RayTracing enhanced GPUs on next-gen consoles *spawn

Posted in other RT thread...
9.2 ms, over half the rendering budget, on the fastest RT GPU. 2.3 ms for shadows and 2.5 ms for denoising, and apparently this is overall GPU time being eaten into, not denoising running in parallel, independently of the shaders. If denoising is preventing you from shading, that's roughly 30% of the rendering budget gone on 1080p60 shadows.
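For context, here's the arithmetic behind those percentages as a quick sketch (the millisecond figures are the ones quoted above; treating the costs as serial rather than overlapped is the post's assumption):

```python
# Back-of-the-envelope frame budget maths for the figures quoted above,
# assuming (as the post does) these costs are serial, not overlapped.
FRAME_BUDGET_MS = 1000 / 60   # ~16.7 ms per frame at 60 fps

rt_total_ms = 9.2             # reported total RT cost
shadows_ms = 2.3              # reported RT shadow cost
denoise_ms = 2.5              # reported denoising cost

print(f"RT total:          {rt_total_ms / FRAME_BUDGET_MS:.0%} of the frame")                # ~55%
print(f"Shadows + denoise: {(shadows_ms + denoise_ms) / FRAME_BUDGET_MS:.0%} of the frame")  # ~29%
```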

I think this speaks volumes about the cost/benefit ratio being completely off for next-gen consoles. If RT is going to happen in consoles, it needs to be something very different to RTX.
Control is still a rasterized game with first gen ray tracing as an afterthought. But more importantly, how do those numbers compare with their rasterized equivalents?
 
Control is still a rasterized game with first gen ray tracing as an afterthought. But more importantly, how do those numbers compare with their rasterized equivalents?
seems rhetorical, it wouldn't exist...
 
Check this out, Shifty. This has me really excited. 95% of the pixels in this example are not rendered, and then the empty space is filled in by a deep learning algorithm. It makes me think how a VR headset or AR glasses with eye tracking could be used as a screen for traditional consoles/gaming PCs, to greatly increase performance.
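To put a rough shape on the idea, here's a toy sketch of the kind of gaze-centred sparse sampling such a system might use (not NVIDIA's actual method; the falloff curve and constants are purely illustrative):

```python
import numpy as np

def foveated_mask(h, w, gaze_y, gaze_x, sigma=0.15):
    """Sampling mask: dense at the gaze point, sparse in the periphery."""
    ys, xs = np.mgrid[0:h, 0:w]
    # Normalised distance of each pixel from the gaze point
    d = np.hypot((ys - gaze_y) / h, (xs - gaze_x) / w)
    p = np.exp(-(d / sigma) ** 2)        # Gaussian falloff (illustrative choice)
    return np.random.rand(h, w) < p      # True = pixel actually rendered

mask = foveated_mask(1080, 1920, gaze_y=540, gaze_x=960)
print(f"Pixels rendered: {mask.mean():.1%}")  # small fraction of the frame
# The un-rendered pixels would then be filled in by the reconstruction network.
```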

The rest of the video has a lot of cool stuff. I'm gonna post it in the VR thread.
 
On top of profiling like they did with X1X, I can see support for RT: not at insane levels, but enough for it to become a baseline console feature.

The RT support has to be at a level good enough that it makes a visual impact and is a differentiator for next-gen. What that level is, I'm not sure.
 
seems rhetorical, it wouldn't exist...
Would be interesting to see the rendering times for the non-RT shadow/reflection/GI solution in Control.

We would then probably be able to figure out the actual difference between the two options, instead of passing judgement on a number (9ms) without knowing the context.

Wait. Am I defending RT??
That's what I meant.

The RT support has to be at a level good enough that it makes a visual impact and is a differentiator for next-gen. What that level is, I'm not sure.
Dynamic lighting/environments are where RT really shines. Otherwise, just stick to baking.
 
2 years is a lot of time for things to change...
The problem here is the next-gen consoles need their hardware locked down at some point well ahead of a two-years-from-now release. The current 'best in class' raytracing, on a huge and expensive chip, isn't really good enough for realtime. What HW are MS going to get to choose from if they're going to put RT into XBN? Has nVidia got a far more capable, cheaper solution on the drawing board already?
 
I think MS are in a great position to offer it with a mid-gen console.

They're going to keep on working on DirectX and its ray tracing support for the next few years, whilst the hardware's still found exclusively in ultra-expensive gaming rigs. Come 2021, the tech should have matured enough for a mid-gen console to be taped out, and Microsoft will be at the forefront, with years of experience.
 
The problem here is the next-gen consoles need their hardware locked down at some point well ahead of a two-years-from-now release. The current 'best in class' raytracing, on a huge and expensive chip, isn't really good enough for realtime. What HW are MS going to get to choose from if they're going to put RT into XBN? Has nVidia got a far more capable, cheaper solution on the drawing board already?
They'll have more than one design, and we've seen in the past that they will likely have their designs locked in a year in advance. So they have at least a year left to make the call on RT or not. I'm not saying they don't have a 4K variant in mind; I'm sure they do, one that tries to push the baseline of DX12 and SM6 to the maximum feature set. And they have the precedent of X1X running games at 4K to profile against.

For me, the problem I see isn't whether the hardware is ready for Xbox. It's whether they can get PC up to spec. They can do a lot of custom things on the console side to enable features, but they need to make it attractive enough for developers to code an RT path for both PC and Xbox (since MS is fully dedicated to supporting their titles on both now).

Given the way that RT works as a bolt-on, it's possible to introduce RT as a mid-gen refresh. But I'm not sure if people will be willing to pay just to get some extra performance plus RT.
 
Interesting times.

Would I want to play, say, the next God of War on next gen hardware with exactly the same level of detail, resolution and frame rate as today, but with RT goodies replacing the current shadows/reflections etc?

I think that would be quite nice and no need to go native 4K. But is it an upgrade that would make me and everyone else run to the shops to buy new hardware? Hard to tell right now.

At the same time... what would, say, the next God of War look and run like on non-RT next gen hardware if it didn’t have to waste cycles bouncing rays everywhere? Would that upgrade be visible enough?

Great, now I won’t be able to sleep.
 
Why don't developers just offer gamers two options?
- RT on: a pretty solid option, mostly similar to what we can see today. No significant increases in resolution, FPS or other graphical features, but RT lighting/shadowing and reflections.
- RT off: higher resolution, FPS, AA and other graphical improvements (full cloth/hair simulation instead of canned animations or a simpler simulation).

This way, gamers of all tastes could benefit from the more powerful hardware and see the improvements where they like.
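As a very rough sketch of what that menu toggle could boil down to in engine terms (every name and number here is hypothetical, purely to illustrate the trade-off):

```python
from dataclasses import dataclass

# Hypothetical settings object for the two proposed modes; none of these
# names or values come from a real engine.
@dataclass
class RenderMode:
    resolution: tuple[int, int]
    target_fps: int
    rt_lighting: bool       # RT lighting/shadowing/reflections on?
    full_cloth_sim: bool    # full cloth/hair simulation vs canned animation

RT_ON  = RenderMode(resolution=(1920, 1080), target_fps=30, rt_lighting=True,  full_cloth_sim=False)
RT_OFF = RenderMode(resolution=(3840, 2160), target_fps=60, rt_lighting=False, full_cloth_sim=True)

print(RT_ON, RT_OFF, sep="\n")
```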
 
Local consoles = RT off
Local consoles + xCloud Scarlett streaming = RT on
 
Interesting times.

Would I want to play, say, the next God of War on next gen hardware with exactly the same level of detail, resolution and frame rate as today, but with RT goodies replacing the current shadows/reflections etc?

I think that would be quite nice and no need to go native 4K. But is it an upgrade that would make me and everyone else run to the shops to buy new hardware? Hard to tell right now.

At the same time... what would, say, the next God of War look and run like on non-RT next gen hardware if it didn’t have to waste cycles bouncing rays everywhere? Would that upgrade be visible enough?

Great, now I won’t be able to sleep.
There are a lot of ways to interpret how things could happen. I mean, today we have performance mode and quality mode, which is <4K + FPS vs 4K + higher settings.
Next gen it could just be <4K + RT vs 4K + FPS (performance mode).

One thing that has been sticking with me is that most users in 2019 still won't have a 4K TV. That's something we still need to consider. And making all games 4K and DSR-ing them down to 1080p seems like a waste of power; users are going to have a very hard time differentiating next-gen graphics from this gen.
 
Why don't developers just offer gamers two options?
- RT on: a pretty solid option, mostly similar to what we can see today. No significant increases in resolution, FPS or other graphical features, but RT lighting/shadowing and reflections.
- RT off: higher resolution, FPS, AA and other graphical improvements (full cloth/hair simulation instead of canned animations or a simpler simulation).

This way, gamers of all tastes could benefit from the more powerful hardware and see the improvements where they like.
This is a debate about hardware choice. Putting in RT hardware presently means putting in less rasterising hardware. As a console engineer, you need to make the choice whether your owners are going to prefer 'RT on' or 'RT off' for the entire duration of the generation.
 
This is a debate about hardware choice. Putting in RT hardware presently means putting in less rasterising hardware. As a console engineer, you need to make the choice whether your owners are going to prefer 'RT on' or 'RT off' for the entire duration of the generation.
Even if 7nm comes into play and things change a little bit?

At any rate, if that choice is to be made, I think it's a very difficult one, because RTRT is something we can't just ignore or dismiss anymore, and it may be highly marketable as well, so... :???:
 
Even if 7nm comes into play and things change a little bit?

At any rate, if that choice is to be made, I think it's a very difficult one, because RTRT is something we can't just ignore or dismiss anymore, and it may be highly marketable as well, so... :???:

There's nothing to show for it yet. 2080 Ti-class hardware barely reaching 60 fps at 1080p won't cut it for consoles. The increase in die space alone means you have to cut corners where you know something works, for something that might work.
 
Why don't developers just offer gamers two options?
- RT on: a pretty solid option, mostly similar to what we can see today. No significant increases in resolution, FPS or other graphical features, but RT lighting/shadowing and reflections.
- RT off: higher resolution, FPS, AA and other graphical improvements (full cloth/hair simulation instead of canned animations or a simpler simulation).

This way, gamers of all tastes could benefit from the more powerful hardware and see the improvements where they like.
Developers would make all the artistic decisions based on rasterization. RT would be just a thin coat of paint on top, as is the case with the current line-up of games that support RTX.

Games designed with RT in mind are what you need to show the difference.
 
There are a lot of ways to interpret how things could happen. I mean, today we have performance mode and quality mode, which is <4K + FPS vs 4K + higher settings.
Next gen it could just be <4K + RT vs 4K + FPS (performance mode).

One thing that has been sticking with me is that most users in 2019 still won't have a 4K TV. That's something we still need to consider. And making all games 4K and DSR-ing them down to 1080p seems like a waste of power; users are going to have a very hard time differentiating next-gen graphics from this gen.

That’s basically my point.

If we want what we call a ‘true generational leap’ we’re going to have to get around the obvious diminishing returns (DR from now on). But to do that, we need to know where that DR is hitting.

Is RT actually going to be such a visible thing that it gets around DR?

Or are higher frame rates, and basically turning the volume up to 11 on rasterising, going to provide a more tangible boost in our eyes?

My point is that I honestly can't tell what I think right now. Not with the generally underwhelming showings from Nvidia so far, at least.

I go back to Spider-Man, which, for one, would look amazing with some RT reflections, and basically that's all it would need, really.
 
Which is at least 5 years away, if not more. RT needs to be fast enough compared to existing rasterization hacks for rebuilding a big part of your engine to even make sense.
More than that, it's economics. Why write a ray-tracing-optimised engine for PC when 99% of your potential buyers don't have raytracing capabilities? Until it's ubiquitous, RT will be relegated to optional shadowing etc. If it becomes fast enough, RT could replace lighting hacks as a swap-in feature that'll look a lot better. But the market for RT-optimised engines isn't there and won't be for years. If both next-gen consoles featured the same RT solution, perhaps it'd have a chance, but realistically we'll see hybrid rasterisers with RT adding some niceties for those who can afford it.
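That swap-in point is the key architectural idea: the engine keeps its rasterised fallback and only routes a pass through RT where the hardware allows. A minimal sketch, with hypothetical stand-in functions (not any real engine's API):

```python
# A hand-wavy sketch of RT as a swap-in feature; trace_shadow_rays and
# render_shadow_maps are hypothetical stand-ins for an engine's actual passes.
def trace_shadow_rays(scene: str) -> str:
    return f"RT shadows for {scene}"      # higher quality, needs RT hardware

def render_shadow_maps(scene: str) -> str:
    return f"shadow maps for {scene}"     # traditional rasterised fallback

def render_shadows(scene: str, supports_rt: bool) -> str:
    # Same feature, two interchangeable back-ends: RT slots in where the
    # hardware allows, and everyone else keeps the raster path.
    return trace_shadow_rays(scene) if supports_rt else render_shadow_maps(scene)

print(render_shadows("level_1", supports_rt=False))
```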
 