NVIDIA's Morgan McGuire predicts that the first AAA game to REQUIRE a ray tracing GPU ships in 2023

RTG's Scott Herkelman agrees with Morgan McGuire:




If Nintendo were to release a new console in 2023, which kinda makes sense, being 6 years after the Switch, and they repeat with Nvidia, they'll likely choose whichever two-year-old SoC Nvidia has around at the time. Will that 2021 Tegra SoC have RT acceleration of any sort?

In 2023 I doubt Nintendo will have anything useful from nVidia's bargain bin, considering they stopped developing Tegra mobile SoCs a while ago. Xavier is a 30W part, and Orin seems aimed at displacing dGPUs entirely in drive modules, so it should be an even more power-hungry chip.


Regardless, the Switch successor doesn't really need to have an Nvidia SoC. By then they'll have competition from Samsung, whose SoCs will carry a GPU architecture similar to the one in both next-gen consoles, which could be in Nintendo's best interest if they want to at least try to get some multiplatform titles.
But even if they do go with Nvidia and order a semi-custom SoC for a mobile console, I doubt they will put RT into a 5-15W SoC by 2023. Right now they're treating the cut-down TU106 in the RTX 2060 as the minimum performance tier for practical ray-tracing functionality (the more recent but lower-end TU116 and TU117 lack RT hardware). That's a 6 TFLOPS GPU with a 160W TDP, which will probably not fit into a mobile console by 2023.
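
For a sense of scale, here's a quick back-of-the-envelope in Python (the 160W figure is the board power mentioned above; the 5-15W handheld budget is just an assumption for illustration):

```python
# Rough size of the gap: RTX 2060-class board power vs. an assumed handheld
# SoC budget. Purely illustrative round numbers, not a projection.
rtx_2060_board_power_w = 160
handheld_budget_w = (5, 15)  # assumed total power for a Switch-like device

for budget in handheld_budget_w:
    factor = rtx_2060_board_power_w / budget
    print(f"{budget:>2} W budget -> ~{factor:.0f}x less power than an RTX 2060")
```

Even being generous about what a couple of node shrinks and lower clocks buy, that's roughly a 10-30x power gap to close by 2023.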
 
What won't be the case at all? Nintendo using an Nvidia SoC? Or the hypothetical 2021 Tegra not having RT? And how can you be so sure?
From a battery life/power consumption perspective, for a thin form-factor mobile device it's a bit counterintuitive to do ray tracing even with modern methods of denoising.
 
Unless Nintendo switches to a currently non-existent but possibly under-development (Hi Rys!) ImgTech PowerVR part with accelerated ray tracing, there won't be any mobile-friendly tech ready for 2022 development and mass consumer deployment in the tens of millions for 2023.
 
"AAA", "require", "ray tracing", and "ray tracing GPU" all have enough definition wiggle room for his prediction to be spun as true or false in a few years. Prior to the RTX launch, a game being "ray traced" would have been understood by most people to mean the rendering pipeline is entirely ray tracing. Similarly I would not have considered a GPU with the features of Turing to be a "ray tracing GPU", although that's clearly the mantle it's carrying now. So it can mean just about anything (in both category and degree) because the value of its presence is largely as a marketing term to differentiate and sell products to customers whom don't know what it is, where it is, how it's being used, etc. If a very large number of consumers decide they really value the term "ray tracing" then there will be products (hardware and software) that are marketed as "ray tracing", and what it actually is will be determined by how much horsepower and TDP can be squeezed into cheap consoles and video cards.
 
TNT and GeForce 256 have more differences than just T&L. For example, real trilinear filtering was added and the register combiner functionality was overhauled.

They might see Doom3 as a turning point because it has a special rendering path for the NV1x GeForce cards. I think it's the first game to require DirectX 7 level functionality? It also doesn't really support any DirectX 7 cards from other companies.

This. GeForce and TNT are quite different beasts, it's not just about T&L. Didn't the GF256 also implement some sort of cube-mapping technique that could fake "pixel shadery" effects? I can't recall many games that ended up using that, but No One Lives Forever 2 jumps to mind, with some great reflections in the earlier Japanese levels with waterfalls and pools. Then again, the special NV1x path in Doom3 was more for GeForce4 MX owners, which probably meant that anyone using a fast GeForce2 was also good to go, but I think using a GF2 MX or GF256 would have been a little too much for the poor thing.

An odd chart, to be sure, although the prediction is probably not that far off.
 
Anyone who makes any sort of mid- to long-term tech prediction at this point without any reference to fab industry trends and how feasible their new features will be with respect to transistor budgets is a sleazy salesman, not an engineer. Those timeline charts may as well have just said, "I think you are stupid".
 
In 2023 I doubt Nintendo will have anything useful from nVidia's bargain bin, considering they stopped developing Tegra mobile SoCs a while ago. Xavier is a 30W part, and Orin seems aimed at displacing dGPUs entirely in drive modules, so it should be an even more power-hungry chip.

Maybe. But it's easy to underclock and undervolt.

Regardless, the Switch successor doesn't really need to have an Nvidia SoC. By then they'll have competition from Samsung, whose SoCs will carry a GPU architecture similar to the one in both next-gen consoles, which could be in Nintendo's best interest if they want to at least try to get some multiplatform titles.

And who says that Samsung/AMD IP won't have some sort of RT acceleration built in? So again, that's a maybe. Maybe. There's a difference between a maybe and "won't be the case at all". One requires little proof to be part of a discussion, the other requires much much more proof or undeniable reasoning than what's been provided.

But even if they do go with Nvidia and order a semi-custom SoC for a mobile console, I doubt they will put RT into a 5-15W SoC by 2023. Right now they're treating the cut-down TU106 in the RTX 2060 as the minimum performance tier for practical ray-tracing functionality (the more recent but lower-end TU116 and TU117 lack RT hardware). That's a 6 TFLOPS GPU with a 160W TDP, which will probably not fit into a mobile console by 2023.

Again, that assumes there's any (cheap) option of leaving RT out. TU116 and TU117 also lack Tensor cores, not only RT, and both combined account for less than 10% of chip die area on Turing, with Tensor cores being about 2/3 of that difference. In a SoC that translates to an even smaller area difference, since the GPU itself is just a portion of the chip, and they are definitely not going to take the Tensor cores out of a Tegra, because that's been the whole point of Tegra lately. So does it make sense ($$) to have a separate SM/TPC design for such a small (1-3%) area difference? Taking both out for a 10% change makes sense. Taking RT out for just ~2% doesn't make any sense to me.
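
To put rough numbers on that (the die-area figures are the ones from the paragraph above; the GPU's share of the SoC is just an assumption for illustration):

```python
# Back-of-the-envelope version of the die-area argument, in Python.
rt_plus_tensor_share = 0.10   # RT + Tensor cores: <10% of a Turing GPU die
tensor_fraction      = 2 / 3  # Tensor cores are ~2/3 of that combined slice

rt_share_of_gpu = rt_plus_tensor_share * (1 - tensor_fraction)
print(f"RT cores alone: ~{rt_share_of_gpu:.1%} of the GPU die")        # ~3.3%

# In a SoC the GPU is only part of the chip (CPU, media blocks, I/O, ...).
# Assume, purely for illustration, that half the die is GPU:
gpu_fraction_of_soc = 0.5
rt_share_of_soc = rt_share_of_gpu * gpu_fraction_of_soc
print(f"RT cores alone: ~{rt_share_of_soc:.1%} of the whole SoC")      # ~1.7%
```

So the area saved by stripping just the RT hardware out of a custom SM/TPC design is in the low single digits, which is where the 1-3% range above comes from.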

That's a 6 TFLOPS GPU with a 160W TDP, which will probably not fit into a mobile console by 2023.

The 50-75W Tesla P4 says hi. Two node jumps also say hi. But I'm not saying the hypothetical SoC would be 6 or even 4 TFLOPS, although I do think it would be possible. My main point, as said above, is that it probably won't be an SM/TPC design with Tensor cores but without RT, because that doesn't make sense to me.

Whether the RT would be usable or not is another thing completely, something I'm not discussing at all. The OP says that every platform will have RT acceleration, and that's what I'm discussing, not whether Nintendo will want to use it or make it a requirement. Which, BTW, the OP does not say: "a game requiring RT" and "every platform has RT" are two separate claims he's making there. And what's more, "every platform" could simply mean PC, consoles in general, and Stadia-like services, for example...
 
Morgan McGuire may be wrong with his prediction for 2023, but I wouldn't underestimate his knowledge of both the software and hardware side of the business. He's a very well respected person in the graphics industry. It's just a prediction. He didn't write it in his blood.

Edit: Not to mention, he's also predicting that ray/raster hybrid will be the norm until 2034, and path tracing won't be viable until 2035. The image from that tweet is from his slide deck where he lays all of this out. So there is context, and I don't think it's totally unrealistic to think a game could be released that requires ray tracing.
 
Morgan McGuire may be wrong with his prediction for 2023, but I wouldn't underestimate his knowledge of both the software and hardware side of the business. He's a very well respected person in the graphics industry. It's just a prediction. He didn't write it in his blood.

Edit: Not to mention, he's also predicting that ray/raster hybrid will be the norm until 2034, and path tracing won't be viable until 2035. The image from that tweet is from his slide deck where he lays all of this out. So there is context, and I don't think it's totally unrealistic to think a game could be released that requires ray tracing.

If the two next-generation consoles have hardware ray tracing and the next iteration of RDNA has hardware ray tracing, this is not something out of the question.
 
His tweet may not have been made in good faith.

There's a difference between predicting, and already knowing that an upcoming title will require RT in a specific target year while saying you're just predicting.

He may have been asked to take it down by the studio.
 
Morgan McGuire may be wrong with his prediction for 2023, but I wouldn't underestimate his knowledge of both the software and hardware side of the business. He's a very well respected person in the graphics industry. It's just a prediction. He didn't write it in his blood.
Just wanted to point out that I am NOT knocking Morgan McGuire or meaning any disrespect to their knowledge; I just disagree with this particular prediction. I ain't in any position to make fun of someone's lack of knowledge, that's my specialty! (Lacking knowledge, not knocking people.)
 
Even if Mr. McGuire's gut feeling, insider knowledge, prediction, whatever, should turn out to be spot on, what's the big deal here exactly? History has shown that every time something similar occurs, it's usually some overpumped nonsense which wastes endless resources on rather questionable additional effects, resources which could have been invested elsewhere far more wisely. A prime example was when pixel shaders appeared and every monkey's ass had to be shiny, or more recently the shitloads of over-tessellated nonsense that just wasted resources for nothing.

IMHO ray tracing in combination with rasterizing can add quite a bit to graphics; however, I doubt I'll be knocked out of my socks by the first "serious" RT appearances in future games.
 
Game releases in 2023 requiring gfx cards that 60% of users lack. Patch making game run without RT hardware launches next day. Win win.
 
If Sony/Microsoft support ray tracing, then by the end of 2023 it would not be surprising to see a game designed with hybrid raster + ray tracing in mind. It could be a console exclusive, though, or have a horrible fallback for traditional lighting due to artist effort going into the ray-traced version. One big win for ray tracing is artist productivity, as there is less need to tweak/bake things.
 
I guess that's one way for a game to require RT. :D If the game is a console exclusive that has RT, then by default it requires RT since it won't run on anything that doesn't have RT (anything other than that console).

Regards,
SB
 
The difference between hybrid ray tracing and pure ray tracing is a red herring. Turing is perfectly capable of rendering with pure ray tracing, but the question is why would you want to? For primary rays, rasterization is more efficient and does the exact same thing. The only thing ray tracing primary rays could offer is realistic depth of field and motion blur, which are pretty low priority. The entire point of hybrid rendering is to use rasterization where it produces a physically correct result (primary rays), and ray tracing where rasterization hackery is fundamentally limited (secondary rays).
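
(To make that primary/secondary split concrete, here's a toy sketch in Python. Everything in it, the scene, the helper names, the ASCII output, is made up for illustration; the per-pixel loop stands in for the rasterizer, and none of it is any real engine's API.)

```python
# Toy illustration of the hybrid split: resolve primary visibility once per
# pixel (standing in for the rasterizer), then spend ray budget only on
# secondary rays (here, one shadow ray per pixel). Pure Python, no
# dependencies; scene and numbers are made up for this example.
import math

SPHERES = [((0.0, -0.5, 3.0), 1.0), ((1.2, 0.8, 4.0), 0.6)]  # (center, radius)
LIGHT = (5.0, 5.0, -2.0)                                      # point light
WIDTH, HEIGHT = 48, 20

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def intersect(origin, direction):
    """Nearest sphere hit along a ray, or None. Used for both the primary
    visibility query and the secondary (shadow) rays in this toy."""
    best = None
    for center, radius in SPHERES:
        oc = tuple(o - c for o, c in zip(origin, center))
        b = sum(o * d for o, d in zip(oc, direction))
        c = sum(o * o for o in oc) - radius * radius
        disc = b * b - c
        if disc < 0:
            continue
        for t in (-b - math.sqrt(disc), -b + math.sqrt(disc)):
            if t > 1e-4 and (best is None or t < best):
                best = t
                break
    return best

image = []
for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # "Raster" stage: primary visibility, one deterministic query per pixel.
        d = normalize((x / WIDTH - 0.5, 0.5 - y / HEIGHT, 1.0))
        t = intersect((0.0, 0.0, 0.0), d)
        if t is None:
            row += "."                       # background
            continue
        hit_point = tuple(t * k for k in d)
        # "Ray tracing" stage: a secondary ray answers a visibility question
        # (is the light reachable?) that rasterization can only approximate.
        to_light = normalize(tuple(l - p for l, p in zip(LIGHT, hit_point)))
        row += "#" if intersect(hit_point, to_light) else "o"
    image.append(row)

print("\n".join(image))
```

The point is just the structure: primary visibility is a regular, coherent query that raster hardware is extremely good at, while the shadow ray is the kind of arbitrary visibility question where rasterization has to fall back on approximations like shadow maps.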

And let's be clear: rasterization is fundamentally limited in its capabilities. Hacks like cube maps and screen-space reflections and occlusion can approximate the effects of secondary rays, but they will always be inaccurate and produce artifacts. They are forced to deal with inaccurate or missing information by their very nature. Now, you can augment this information with things like depth peeling and rendering many cube maps, but more and more, in order to get accurate results, you end up implementing a ray tracer with a lossy, inefficient acceleration structure derived from image space.

Global illumination is even harder for rasterization. You can of course bake it and precompute a lot (using ray tracing...), but your precomputed lighting will suddenly be completely wrong if someone, say, blows a hole in a wall, letting in sunlight. This is one of the reasons that buildings in AAA games are usually indestructible: altering them breaks your level's lighting! Baking also breaks down when you have a large open world, since you just don't have the memory for static lighting across square kilometers of terrain at any reasonable level of detail. Not to mention lighting changes as the sun moves across the sky.

Moving toward photorealism, ray tracing not only ends up superseding rasterization, it ends up being the only known way to get there. Rasterization is a dead end, and while there are still hacks that can push it a bit further, there exists a point where it will progress no further. That point is a long way short of the human eye looking at a screenshot and confusing it for a photo.
 
Addressing the topic of the thread more directly, games will end up requiring ray tracing. The reason is that as long as it isn't a requirement, the developers will have to maintain two very different render paths, which is that much extra work for both the programmers and the artists. Pure rasterization will end up the legacy path, and will ultimately be dropped, in the exact same way the fixed function pipeline was more than a decade ago.
 
As long as any platform the developers are targeting doesn't offer ray tracing (Nintendo), developers will have to maintain two different render paths anyway.
 