Impact of nVidia Turing RayTracing enhanced GPUs on next-gen consoles *spawn

Status
Not open for further replies.
Why do you guys believe RT requires special and restricted hardware when you've never tried it? I did, and I see no reason why it would not be fast enough.
A good feature is a feature that the majority of developers can extract significant performance from, not just the few. The goal here is to provide features and hardware that will bring the entire line of graphical performance for all games forward, and not just serve the handful of developers who have very specific setups, goals and algorithms that can extract that performance, perhaps for their titles and their talents only.
 
That confirms what a few people have been saying. RTX is just one way to get RTRT in our homes. Very exciting.
Which is what I've been constantly repeating for nearly a year, since DXR's announcement last March. Any DX12 GPU can do it, but it requires driver support, and Nvidia will never provide that for anything other than Volta (which doesn't have RT Cores) and Turing (although they support Maxwell, Pascal, Volta and Turing in OptiX 6), and AMD can't be arsed to do it given the minor benefits it would bring versus the cost $$ and the poor support in games right now (only 2 games and counting...). AMD will most likely not have dedicated HW for it in the future, which is something even 4A Games is hinting at: "Those calculations can be done in standard compute if the computer cores are numerous and fast enough (which we believe they will be on the next gen consoles)".
 
A good feature is a feature that the majority of developers can extract significant performance from, not just the few. The goal here is to provide features and hardware that will bring the entire line of graphical performance for all games forward, and not just serve the handful of developers who have very specific setups, goals and algorithms that can extract that performance, perhaps for their titles and their talents only.

Let's take AA as a parallel. On PC, you can force a bunch of different types via the GPU's drivers. On console, you're stuck with what the developers have chosen. For most console gamers it doesn't matter, because high quality AA is so widespread nowadays.

If RTRT is viable without making use of BVH hardware - as seems to be the case - software will move towards it anyway, and you'll get what you want: the bringing forward of the entire line of graphical performance for all games.

Maybe BVH hardware is objectively the best way to do RTRT. Maybe it was just the best way to get the ball rolling. Maybe it's something we'll all LOL at in a few years' time because it turned out to be woefully inefficient compared to the approach that won. I've no idea, I just know that it's exciting to read a developer state that RTRT is coming no matter what.

So maybe, for folks with RTX cards, forcing the game to trace rays via BVHs will improve performance/fidelity. Maybe most console gamers won't care because there will be various widespread RTRT solutions by then.
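To make "tracing rays via BVHs" a bit more concrete: a BVH is just a tree of axis-aligned bounding boxes, and the core operation (whether in fixed-function hardware or plain compute) is the ray-vs-box slab test. A toy Python sketch, with the ray and box values invented purely for illustration:

```python
def ray_aabb_hit(origin, inv_dir, box_min, box_max):
    """Slab test: does a ray (origin, 1/direction) hit an axis-aligned box?"""
    tmin, tmax = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1 = (lo - o) * inv          # entry/exit distances for this axis' slab
        t2 = (hi - o) * inv
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax              # intervals overlap => the ray hits the box

# A ray along +x from the origin vs a unit box centred at (5, 0, 0)
origin = (0.0, 0.0, 0.0)
inv_dir = (1.0, 1e30, 1e30)          # 1/direction; huge values stand in for 1/0
print(ray_aabb_hit(origin, inv_dir, (4.5, -0.5, -0.5), (5.5, 0.5, 0.5)))  # True
```

RT cores essentially run this test (plus triangle tests) in dedicated units while walking the tree; the same logic runs fine as shader or compute code, just with fewer tests per second.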
 
Assumptions. At worst it will be like this: consoles scaling up RT more aggressively than PC, with lower framerates. Nothing new, and acceptable given larger view distances and a gamepad.
Why do you guys believe RT requires special and restricted hardware when you've never tried it? I did, and I see no reason why it would not be fast enough.
NV could say 'you must have our awesome $1200 GPU because it makes you shiny and more intelligent due to AI!'... and you would just believe it.
I'll try damn reflections as soon as possible, if only to prove this. I'll let you know...

At worst we won't see any RT on the consoles at all, but I doubt it. Like that dev noted, it's the only way forward; consoles not having RT support in some form hampers them.
This special hardware allows RT to run faster; we don't want Minecraft with RT, we want next-gen 2020 games with RT. You don't need a $1200 GPU to be able to enjoy RT though. When I do settle for one it won't be a 2080 Ti or RTX Titan, but perhaps a 2070, or a 3060/3070 or whatever the new GPUs will be named. I'm leaning towards waiting and getting an RT GPU in about two years, when the successor to Navi (or a high-end Navi), or Turing's successor, arrives.

You will have to prove that for AAA games like BFV and Metro, Atomic Heart, Tomb Raider or any of those games: can things like that be done on, let's say, a Titan V as opposed to a 2080 Ti, or, in my case, will a 1070 perform equal to a 2060 in RT? Then I wouldn't have to upgrade my TV gaming PC for a while.
Something like a Titan V or an RTX Titan or even a 2080 Ti would be a monster at compute RT/DXR even without special hardware, probably faster than what will be in the PS5.

RT is very intensive depending on what resolution you're aiming for. For Q2 RT a Vega 64 is doing 70-120 fps, a 2080 Ti 180-280 fps; this tells me the RT cores are doing their thing in fully path-traced games. Scale up the resolution and move to more complex games like SWBF or Metro, or a next God of War game, and RT will take a toll on performance, especially if we want 4K. Like I said, I'm sure all GPUs can do RT, and next-gen consoles are going to have enough TFs to do RT in compute, but performance is still important when we want next-gen-looking games too. Or are people satisfied with Quake 2 on them?
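For a sense of what "RT in compute" means per ray: once a BVH has narrowed the search down, each candidate triangle gets an intersection test such as Möller–Trumbore, which is ordinary arithmetic that any compute-capable GPU can execute; RT cores just do it in fixed-function units. An illustrative Python sketch (the ray and triangle here are made up):

```python
def ray_triangle(orig, dirn, v0, v1, v2, eps=1e-8):
    """Möller–Trumbore ray/triangle test: hit distance t, or None on a miss."""
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def cross(a, b): return (a[1]*b[2] - a[2]*b[1],
                             a[2]*b[0] - a[0]*b[2],
                             a[0]*b[1] - a[1]*b[0])

    e1, e2 = sub(v1, v0), sub(v2, v0)   # triangle edge vectors
    p = cross(dirn, e2)
    det = dot(e1, p)
    if abs(det) < eps:                  # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    s = sub(orig, v0)
    u = dot(s, p) * inv                 # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(dirn, q) * inv              # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv                # distance along the ray
    return t if t > eps else None

# Ray from the origin along +z vs a triangle in the plane z = 5
print(ray_triangle((0, 0, 0), (0, 0, 1),
                   (-1, -1, 5), (1, -1, 5), (0, 1, 5)))  # 5.0
```

Per-frame ray counts in a game are in the millions, so the argument above is really about how many of these tests per second a GPU can afford next to its normal rendering work.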

A good feature is a feature that the majority of developers can extract significant performance from, not just the few. The goal here is to provide features and hardware that will bring the entire line of graphical performance for all games forward, and not just serve the handful of developers who have very specific setups, goals and algorithms that can extract that performance, perhaps for their titles and their talents only.

Exactly. Even when those few can extract enough performance for RT, it won't be fast enough in compute, for now, for the reasons I stated above.

Any DX12 GPU can do it, but it requires driver support, and Nvidia will never provide that for anything other than Volta (which doesn't have RT Cores) and Turing (although they support Maxwell, Pascal, Volta and Turing in OptiX 6), and AMD can't be arsed to do it given the minor benefits it would bring versus the cost $$ and the poor support in games right now (only 2 games and counting...).

Yeah, even the PS4 can do it; hell, my 970 can do RT. I've seen demos on a 750 Ti. Volta (Titan V) tanks in RT, even in Q2 RT.
Two games, and Quake 2, but more are coming; here is the list:
https://www.kitguru.net/components/...rt-nvidias-ray-tracing-and-dlss-rtx-features/

That's quite huge for such a new tech. 2060 adoption is just starting, so the install base is still growing.

"Those calculations can be done in standard compute if the computer cores are numerous and fast enough (which we believe they will be on the next gen consoles)".

Yes, any GPU can do it; we have to think about performance though. Next-gen games aren't going to be Minecraft or Fortnite.

Maybe it's something we'll all LOL at in a few years' time because it turned out to be woefully inefficient compared to the approach that won.

Probably. Like how the OG Xbox had hardware features the PS2 (and to some extent the GC) didn't: we saw bump mapping etc. for the most part only on the OG Xbox. Things moved towards more flexibility over time, but graphics at least weren't hampered for some years.
 
Let's take AA as a parallel. On PC, you can force a bunch of different types via the GPU's drivers. On console, you're stuck with what the developers have chosen. For most console gamers it doesn't matter, because high quality AA is so widespread nowadays.

If RTRT is viable without making use of BVH hardware - as seems to be the case - software will move towards it anyway, and you'll get what you want: the bringing forward of the entire line of graphical performance for all games.

Maybe BVH hardware is objectively the best way to do RTRT. Maybe it was just the best way to get the ball rolling. Maybe it's something we'll all LOL at in a few years' time because it turned out to be woefully inefficient compared to the approach that won. I've no idea, I just know that it's exciting to read a developer state that RTRT is coming no matter what.

So maybe, for folks with RTX cards, forcing the game to trace rays via BVHs will improve performance/fidelity. Maybe most console gamers won't care because there will be various widespread RTRT solutions by then.
If you look at the current landscape, there is a multitude of varying engines, setups and tools in the space. All these engines are optimal at different things, and most of the time it's much cheaper to upgrade an older engine than to port to a newer engine.

We know this because we see it at a number of companies, Ubisoft and Bethesda to name a couple. They have multiple engines with toolsets, and even though Snowdrop is their latest and greatest, all their other games are running on older engines. Only The Division and newer titles are starting on Snowdrop.

We’ve seen what happens when teams try to port their entire workflow and toolset to a new engine: BioWare, Duke Nukem Forever, and a lot of other games that underwent engine switches. The results are, more often than not, terrible.

The idea of DXR is to ease the transition of RT into our current landscape, which means it's a bolt-on for the foreseeable future, at least for a generation. It's a bolt-on to some older engines, and a bolt-on to newer ones.

At the end of the day, the idea that we could have real-time ray tracing performance on every engine setup today without hardware acceleration of some form is fairly naive. And that is me giving consideration to the possibility that we could see RTRT on compute only, though in my heart I doubt it.

When we hit our second or third generation of RT hardware I can see a flexible route coming into play. But not now.

Didn’t the geometry shader start off as totally flexible, and no one uses it because it performs slowly? Perhaps someone with more knowledge of the GS can share some history and insight on this one.
 
So maybe, for folks with RTX cards, forcing the game to trace rays via BVHs will improve performance/fidelity. Maybe most console gamers won't care because there will be various widespread RTRT solutions by then.

The Titan V is suffering in RT though; it's a monster of a compute GPU, so it should be possible to run Q2 at at least the same framerates as a 2080?
Most console gamers don't care because they don't even know what RT is, probably. Those that care so much about new tech aren't going to bother with a $399 machine that won't achieve the same fidelity anyway.
 
If I were AMD, I sure as hell would not be in a rush to develop hardware RT acceleration. Now that they have both next-gen console contracts in their pockets, they decide what features become standard for the years to come. Let Nvidia pour their money into that endeavor. While Nvidia spends time and resources pioneering consumer-level RT hardware, AMD can focus their (comparatively) scarcer R&D resources on getting the fundamentals of their new arch right. Unlike Nvidia, AMD has to pick their battles much more carefully.
All in the safety of knowing that, no matter how much Nvidia claims RT will change the world and "just works", it will remain a niche feature until it's supported broadly, especially including consoles.
Within a couple of years, after Nvidia has spent millions researching HW solutions, marketing, and implementing patches for games and demos, the feature will still not have gotten off the ground until AMD picks it up and it's brought to consoles. At that point a lot of know-how will have been developed, and it will be much easier for AMD to implement their take on the thing in hindsight; perhaps Nvidia will even have shared more of it by then, in trying to promote it and get devs to use it better. Maybe Nvidia will even share some knowledge with AMD (however begrudgingly), just so the thing does end up becoming the industry standard they want it to be.
AMD's position regarding consoles has put them in the comfortable spot of being the gatekeeper of which HW features become the industry standard for consumer games, and Nvidia can't complain, because they left that space open of their own choosing.
 
If I were AMD, I sure as hell would not be in a rush to develop hardware RT acceleration. Now that they have both next-gen console contracts in their pockets, they decide what features become standard for the years to come.
AMD is certainly working on their own RT hardware as they’ve said. MS is certainly working on RT for next Xbox. Just a question of Sony.

Alignment of API will be an interesting discussion in the console space.

It’s great that Nvidia is pushing now, for Xbox fans. As soon as XB2 is released you’ll be back-patched with 2 years’ worth of titles that use DXR for their ray tracing. That’s a hell of an RT library to launch with.

The timing for MS is very good on this.
 
Now that they have both next-gen console contracts in their pockets, they decide what features become standard for the years to come.

This wasn't the trend for any console generation except the current one, where diminishing returns and smaller jumps are in play. This does seem to have changed now though: while still in the 8th generation, we already see the consoles lagging behind in tech.
AMD isn't going to change the world on their own though; you're forgetting MS, Intel, Nvidia etc.

By a couple years, and after Nvidia has spent millions researching HW solutions, marketing and implementing patches to games and demos, the feature will still not get off the ground untill AMD picks it up and it's brought to consoles.

That's you guessing though; a big company like Nvidia surely has a team of analysts working out what is good for their economics. Also, your consoles aren't the magic bullet either; Nvidia left that space because they apparently don't need it to survive, and margins are small in the console space even though volumes can be somewhat high.
Adoption of RT tech is growing and growing, with the 2060 able to do RT at reasonable speeds, and MS releasing about every game on Windows alongside their Xbox and focusing more and more on AAA games/studios; MS has both the PC and Xbox/console markets. They are also designing Halo Infinite with PC in mind; the E3 tech demo showed Master Chief's helmet with RT-like reflections. Since PC gaming alone is bigger than PlayStation gaming, it's not at all true that it's a niche market, not for RT either, as install bases are growing. MS/DXR will only expand upon it. It's not at all impossible that, besides Halo, more games are being designed with PC in mind.

You're also forgetting MS; they support this whole DXR/RT push, and seeing that Halo Infinite, about the biggest Xbox title, is being developed and designed with PC as the lead platform, it could well support RT. Maybe their next console has RT and Sony's doesn't, leaving only the PS5 without the tech.

AMD is certainly working on their own RT hardware as they’ve said. MS is certainly working on RT for next Xbox. Just a question of Sony.

Alignment of API will be an interesting discussion in the console space.

It’s great that Nvidia is pushing now, for Xbox fans. As soon as XB2 is released you’ll be back-patched with 2 years’ worth of titles that use DXR for their ray tracing. That’s a hell of an RT library to launch with.

The timing for MS is very good on this.

It's hard to grasp why some don't understand this. It's never a good idea to only look straight ahead.
MS might have a smaller install base now, but they aren't going to give up; they know exactly what their strengths are. Seeing that MS has a hand in the PC gaming market, it's not fair to say consoles alone are world-changing. Right now it's RTX that's changing things; as DF noted, game-changing.
 
A good feature is a feature that the majority of developers can extract significant performance from, not just the few. The goal here is to provide features and hardware that will bring the entire line of graphical performance for all games forward, and not just serve the handful of developers who have very specific setups, goals and algorithms that can extract that performance, perhaps for their titles and their talents only.
Sounds like a good argument, but in the case of DXR+RTX it is executed very badly. Yes, it makes it easy to add some RT to your game, but it makes too many options impossible.
Further, I doubt AAA graphics devs need a helping hand to introduce them to RT. They all have their toy raytracer at home, or they have worked on RT baking tools.
And finally, pretty much every game / engine has its own specific setup, goals and requirements.

Something like a Titan V or an RTX Titan or even a 2080 Ti would be a monster at compute RT/DXR even without special hardware, probably faster than what will be in the PS5.
Last time it was the other way around: first-gen GCN was faster in compute than the Titan back then. Couldn't believe it myself at first.

RT is very intensive depending on what resolution you're aiming for. For Q2 RT a Vega 64 is doing 70-120 fps, a 2080 Ti 180-280 fps; this tells me the RT cores are doing their thing in fully path-traced games.
You draw conclusions based on the limited data you have. I have infinite-bounce GI on a Quake level which looks very similar to Q2 RTX (which is limited to just one bounce), and it runs on first-gen GCN at 60 fps; the GI is calculated independently of the rendering resolution.
Believe it or not, this is possible because I use an algorithm that is faster than path tracing. And this is what you constantly ignore: better software can gain much larger speed-ups than better hardware. RTX can produce more exact shadows in close-ups, and that's its only advantage in the comparison, but infinite bounces add much more.
Again: we would have approached RT in any case, even without NV's helping hand, simply because SSR does not work. The only reason we did not is the Xbox One, which is the slowest and so sets the current state of the art, which can't be exceeded by much anymore.
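As an aside on how "infinite bounces" can be cheap: if lighting is stored on surface patches or probes and updated iteratively, each update pass propagates one more bounce, and the values converge to the infinite-bounce solution, whereas a path tracer pays for every bounce per ray. This is not the poster's actual algorithm, just a classic radiosity-style toy in Python with entirely made-up numbers:

```python
# Two-patch toy scene: emission E, diffuse reflectivity rho, form factors F.
E   = [1.0, 0.0]                 # patch 0 emits light, patch 1 does not
rho = [0.5, 0.8]
F   = [[0.0, 0.4],
       [0.4, 0.0]]               # how much each patch sees of the other

B = E[:]                         # start from direct emission only
for _ in range(100):             # each Jacobi pass adds one more bounce
    B = [E[i] + rho[i] * sum(F[i][j] * B[j] for j in range(2))
         for i in range(2)]

print(B)  # converges to the fixed point of B = E + rho * F * B
```

After enough passes the result equals solving the linear system directly, i.e. light that has bounced arbitrarily many times, for a fixed per-pass cost.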

That said, hardware to make things faster is always welcome. But hardware that adds more restrictions than options is not necessary to bring RT to games. We are not in urgent need of that.
May next gen be the opportunity to show this.
 
It is not in the AAA non-F2P space.

Those single-player, play-once games are going to be a thing on XB/PC too.

Last time it was the other way around: first-gen GCN was faster in compute than the Titan back then. Couldn't believe it myself at first.

7970, R9 290x?

You draw conclusions based on the limited data you have.

I'm seeing a $3k Titan V being outperformed by a large margin. Those RTX GPUs can do compute RT rather fast too, but for complex games those resources are better put to use for normal rendering. For less complex games like TTC it suffices. Don't forget the bar is higher for 2020/2021 games; 60 fps will maybe be a thing, and 4K.
 
My general thought process is that the code paths are already written for DXR if you shipped a DXR title like Metro. Once you flip the driver to agree to run the DXR path, it should run, so all the titles shipped from BFV through to the launch of the next Xbox should be instantly supported.
 
Faster though. Nothing new, as we saw what a 970 could do with Minecraft. RTX enables RT in modern AAA games at still-playable framerates. Yes, we can see RT on current GPUs without RT hardware (like the Titan V), but performance will suffer, or the graphical complexity of the game will.
Wish my 1070 could run Metro RT at 2070 levels :)

I didn’t necessarily mean software solutions alone.

RTX is an NV thing. We still don’t know what AMD is working on - I’m not sure they know either - and it sure as hell won’t be called RTX.

So my point was, RTX is not the only way to get RTRT out there. Someone might come out with a fast software version (on very fast compute hardware) and I’m sure at some point AMD will have their method of DXR acceleration. Or we might just end up having a hybrid of both. What’s exciting is that there is potential out there to do things in different ways and we’re all here for it. Waiting. Patiently.
 
7970, R9 290x?
I had a 280X and it was more than twice as fast as a GTX Titan in my GI project, which was quite large back then and used many shaders, so I do not think my result was an outlier. The 7950 beats the GTX 670 by a factor of five, although both have similar gaming performance.
I do not think AMD can do this again (Kepler was just bad at compute, worse than Fermi), and it is not related to the topic at all. Also, I don't like repeating this for the third time, because some people get mad about it.

Those RTX GPUs can do compute RT rather fast too, but for complex games those resources are better put to use for normal rendering
So you think you know how to distribute various resources to various tasks?
But then, let me know what games do with compute at all, other than some fine-grained culling, or linking lights to screen-space tiles for deferred shading?
Personally I think lighting is the perfect application for underused compute, because it's not tied to screen space. So you can finally do it correctly, which likely involves some RT.

I do not forget what the current bar is: open-world demands and so forth. My stuff scales. But RTX does not! Which is another thing you never question, although you don't know, can't know, or just ignore.
So how can you present your assumptions, drawn only from observation and listening to marketing blah, as facts?

You want it all: 60 fps at 4K, ray tracing, and all of this for $400. I suggest you leave it to the developers to get there with whatever compromise they think is best (I would certainly put RT cores at the very bottom of the list, with only tensors below that).

I'm seeing a $3k Titan V being outperformed by a large margin.
Just because some rich kids are stupid enough to pay $3k for a GPU, that does not mean the GPU is any faster than the regular consumer top model, and you know this.
 
What other RT solutions could AMD entertain? Does the BVH or equivalent have to be on the GPU? It seems not to me; nVidia sticks it there because that's their only part in a PC. AMD, however, could put an acceleration unit in the CPU, or indeed elsewhere in a system. What would the ideal be?
 