NVIDIA's Morgan McGuire predicts that the first AAA game to REQUIRE a ray tracing GPU ships in 2023

As long as any platform the developers are targeting doesn't offer ray tracing (Nintendo), developers will have to maintain two different render paths anyway.

I know I'll probably get lynched for the following, but I would like to imagine a future where Nintendo won't have any influence worth mentioning in the gaming world.
 
Disagree on gaming world (even though I haven't felt like playing a Nintendo game in years), but agree on them not influencing processing hardware.

Let them do their thing on games and peripherals.
 
Doesn't the NSW have higher feature-level support than the PS4 and XBO? It lacks power, but it still has features. We could see RT on Nintendo provided it's part of the architectures post-Turing. And if RT in Minecraft is any indication of beauty with simple graphics, I could see Nintendo making great use of it.
 
Addressing the topic of the thread more directly, games will end up requiring ray tracing. The reason is that as long as it isn't a requirement, the developers will have to maintain two very different render paths, which is that much extra work for both the programmers and the artists. Pure rasterization will end up the legacy path, and will ultimately be dropped, in the exact same way the fixed function pipeline was more than a decade ago.
Err... as long as the staggering majority of gaming takes place on systems that have no hardware support for ray tracing, it can't be more than a checkbox for some high-end PC users to try.
The upcoming console generation will determine if there is a platform at all where doing some of the lighting using ray tracing makes sense. If it doesn't turn out to be the technique of choice on these platforms, prepare to twiddle your thumbs for another decade.
Mobile platforms account for more than half of total game revenue, on consoles the Switch is a major part, and on PCs the continuing trend is towards laptops.
If ray-tracing isn't efficient enough to run on mobile platforms in a decade, it's niche, because the trend towards entertainment being mobile is unlikely to stop.

When will 2080Ti bandwidth be available on platforms that aren't stuck to a wall with a cord?
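To make the "two render paths" point concrete, here is a minimal sketch (C++ against D3D12/DXR, assuming a device has already been created, error handling omitted; the enum and function names are just illustrative) of the capability check an engine would gate its lighting path on:

```cpp
#include <windows.h>
#include <d3d12.h>

// Illustrative only: which of the two lighting paths the engine will run.
enum class LightingPath { Rasterized, RayTraced };

LightingPath ChooseLightingPath(ID3D12Device* device)
{
    // Query the driver for DXR support via D3D12_FEATURE_DATA_D3D12_OPTIONS5::RaytracingTier.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    const bool hasDxr =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &options5, sizeof(options5))) &&
        options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;

    // No DXR (or a console/mobile target without RT hardware): fall back to the
    // legacy rasterized lighting path that still has to be built and maintained.
    return hasDxr ? LightingPath::RayTraced : LightingPath::Rasterized;
}
```

Everything downstream of that single branch (GI, shadows, reflections, plus the content authored and tested for each) is what gets duplicated for as long as both paths have to ship.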
 
I think people are underestimating the "cloud's" impact on gaming in the near future. Are we so sure end users really need hardware support in their devices for tray racing to take off? Let's rephrase the question slightly: can a AAA game be made in 2023 that only targets devices with hardware RT or the cloud (or maybe... just the cloud!?!)?
 
Rephrasing of the question results in an entirely different conversation and discussion thread.
 
I don't mean that we should make predictions about cloud gaming in this thread, but if you believe the answer to that question *could* be yes then that eliminates the biggest hurdle in a game requiring RT hardware. Even Nintendo platforms could be supported... ;)

I think tray racing will succeed or fail based on the merits of the technology. It's still not clear that RT (or even just RT hardware) is a better use of power/die space than adding more "rasterization hacks" or simply adding more compute units/bandwidth/etc. It's still very early and no one knows all the answers yet! :) I just don't personally see phones/Nintendo/any device being the reason RT "fails" in 2023 (or in general).
 
I feel a great disturbance in this thread, like a whole new thought has screamed its way in and a split is all that can save it! :O
 
What constitutes "failing" really? I responded to a post that actually (sort of) drew a rather solid line in the sand: when will RT be so widespread that developers no longer have to provide two different code paths for dealing with certain aspects of the lighting?
And the answer to that one could easily be: not in my lifetime.

Too much forum traffic is polarized bickering. Let's be honest here: anyone with a shred of tech interest enjoys it when something new comes along and there are new trade-offs to weigh and analyze. And ray tracing has had an aura of being cool since I first dabbled with it a quarter of a century ago. It still does.
But whether it will gain dominance as the way to deal with certain lighting problems in modern game graphics has little to do with tech geeks being starved for entertainment, and more to do with markets and money.
Where are gaming software dollars actually spent? How do publishers maximize addressable markets?
How fast do platforms turn over?

NDAs will hopefully lift some time before the new console generation is launched, which will give some new relevant data, but before that we’re really in the dark. My one point is that the technology has to scale down to truly have mass market impact, and that lithography isn’t moving briskly enough to bail us out as it used to.
 
I define tray racing's success as being a core/essential part of a few AAA (i.e. not some random indie) games'/engines' rendering pipelines. I mean if you want to be strictly technical, ray tracing has long been a success. A form of "tracing rays" can be found in many games today! Does that mean games in 2018+ "required" ray tracing? :mrgreen:

I don't understand this desire to make ray tracing a "black or white" technology. For the foreseeable future (forever?), it will always be partnered with rasterization. I do believe various "rasterization techniques" used today will eventually be replaced wholesale with ray tracing. I also believe some techniques will remain rasterization only. And I also believe that for some techniques an end user will be able to "toggle" between the two. I don't think any of that will impact whether or not ray tracing "succeeds". I honestly don't think "RTX" did anything to change the general trajectory of ray tracing. What changed imo is that now the adoption and research will be (greatly?) accelerated.

As for your broader point, I think the cloud can solve any scale concerns (that goes for advanced rendering techniques in general, not just RT). In addition, ray tracing has a real ability to greatly simplify content creation pipelines (GI, lightmaps, etc.). So if you think $ is the only determining factor, well... :p
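As a rough illustration of that "toggle" idea, a hybrid renderer can expose each effect independently instead of treating RT as all-or-nothing. A minimal sketch (plain C++, with every setting and pass name invented for illustration and the actual GPU work stubbed out):

```cpp
#include <cstdio>

// Hypothetical per-effect settings: each one falls back to its raster-era technique.
struct GraphicsSettings {
    bool rtShadows     = false;   // off -> shadow maps
    bool rtReflections = false;   // off -> screen-space reflections
    bool rtGI          = false;   // off -> baked lightmaps / light probes
};

// Stubbed passes; in a real engine these would record GPU work.
void RenderGBuffer()          { std::puts("raster G-buffer"); }
void RenderShadowMaps()       { std::puts("shadow maps"); }
void TraceShadowRays()        { std::puts("RT shadows"); }
void ScreenSpaceReflections() { std::puts("SSR"); }
void TraceReflectionRays()    { std::puts("RT reflections"); }
void SampleBakedProbes()      { std::puts("baked probes"); }
void TraceGIRays()            { std::puts("RT GI"); }

void RenderFrame(const GraphicsSettings& s)
{
    // Rasterization stays in the loop either way: hybrid, not a replacement.
    RenderGBuffer();
    if (s.rtShadows)     TraceShadowRays();     else RenderShadowMaps();
    if (s.rtReflections) TraceReflectionRays(); else ScreenSpaceReflections();
    if (s.rtGI)          TraceGIRays();         else SampleBakedProbes();
}

int main()
{
    GraphicsSettings settings;
    settings.rtReflections = true;  // e.g. only reflections switched to RT on capable hardware
    RenderFrame(settings);
}
```

The point of the sketch is that none of the raster techniques disappear; each one remains the fallback the other path still has to look acceptable with.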
 
Maybe the desire to make RT a black or white thing comes from some artists? My thinking is, designing a level, having a specific art style, etc., with a classic lighting system or with RT are two different things, since the lighting and other elements don't react the same way?
 

A bad oversimplification would be to claim that ray tracing just works from the artist's POV and the onus is on developers to make it fast and good quality. In the traditional approach the onus is on artists to tweak until things look good, and to include some "hacks" to avoid things like light bleeding where it shouldn't occur. This tweaking is time consuming (== expensive), and not every level/game gets the same level of polish.
 
The thing is though that for the foreseeable future you cannot build a game assuming that all targets support RT in a performant way. Meaning supporting RT is just extra work.

In a recent tweet Sebbbi expressed as much joy as I've ever seen his Finnish soul allow over the improvements Ice Lake (Intel's not-yet-released 10 nm laptop chip) brings to its integrated graphics. That is what multiplatform games need to run on. And it bears remembering that it will take at least half a decade before that level of performance has gained wide penetration in laptops.

What a fraction of the gaming community has access to on certain high-end discrete graphics cards just doesn't matter. Tech sites and tech forums are starved for excitement in today's computer market, and will gush over anything that comes along, egged on by any party that has an interest in the sales of new hardware (all retailers, everyone who lives off retailer advertising, manufacturers, and so on).
But software publishers want to sell their product to anyone who might enjoy the game.

Efficiency is critical in the overwhelmingly dominant part of the market. It saves die area, money, power and so on, and that pressure won't go away. On the contrary, with the migration towards mobile entertainment and the slowing of lithographic progress, it will be reinforced. Dedicating die area to a very specific approach to dealing with a part of the lighting, an approach that additionally is valid for only a part of the addressable software market, simply seems like a bad idea.

Practicality matters. The unwashed masses that actually buy games matter.
It may be that someone in the future comes up with a way to do ray and path tracing faster than the alternative methods, but seriously bright people have been working on that problem for decades already in the film production industry, so holding one's breath is not a good idea.
As far as I can see, unless the upcoming consoles make RT performant enough to be the default method of choice, RTRT is just something graphics card vendors can push as a feature checkbox to entice upgraders, with little impact on the gaming industry as a whole.

It can still be a success for Nvidia for those who want to use that vocabulary, if they gain support for their features in important rendering packages. Maybe that's the real reason it's there in the first place. And it's cool tech. But...

(Personally, I'd like to see the lowest-common-denominator bar raised. LPDDR5 support, and/or integrated graphics supported by fast on-package memory, and so on. Mobile moves faster than portable PCs and is replaced more often, so it is less of a problem. Increasing rendering efficiency as opposed to introducing very costly methods. Getting more people on board.)

Quoting Niels Bohr though, making predictions is difficult, particularly about the future.
 
I couldn't disagree more about the research. I actually feel very little research has been put into hardware acceleration of ray tracing (at the very least, nowhere near the effort rasterization gets). In fact I'll wager that RTX(++?) and next-gen consoles will still lag behind the technology ImgTec had 5-10 years ago (yeah yeah, citation needed...). The field is wide open to me.
 
Install base for new features pretty much always starts from 0. There is a huge chicken/egg problem here. Do we ever want progress or not?

My guess is that at some point hybrid raster + ray tracing is the way to go, and the traditional fallback is an unpolished, crappy-looking fallback for those who don't care enough to upgrade. I.e. artists won't spend much effort manually fixing light probes and whatnot.

Next-gen consoles will likely move the needle in some direction.
 
I couldn't disagree more about the research. I actually feel very little research has been put into hardware acceleration of ray tracing (at the very least, nowhere near the effort rasterization gets). In fact I'll wager that RTX(++?) and next-gen consoles will still lag behind the technology ImgTec had 5-10 years ago (yeah yeah, citation needed...). The field is wide open to me.
I don't agree, ray-tracing chips have been looked into, designed, discarded...
You do have a point, of course, but it is not as strong as a PC-centric viewpoint might suggest. We will see, obviously.

Install base for new features pretty much always starts from 0. There is a huge chicken/egg problem here. Do we ever want progress or not?
Sure. We have some precedent here, where new features were introduced and lithographic advances made them ubiquitous. Of course, we have also had advances that fell flat, or had a few years of glory in the limelight, and then faded.
The critical question to be clear about in such a discussion is: what do you want progress for?
Is it for the hardware companies to have a new shiny to sell? Is it for the tech publications to have something to write about and finance their business? Is it for tech entertainment? Is it because you are in the business, and find it interesting and possibly a professional opportunity? Is it because...
This is important because different people argue based on different premises, and even the same individual can wear different hats at different times. I'm one, in this case.
My guess is that at some point hybrid raster + ray tracing is the way to go, and the traditional fallback is an unpolished, crappy-looking fallback for those who don't care enough to upgrade. I.e. artists won't spend much effort manually fixing light probes and whatnot.
Next-gen consoles will likely move the needle in some direction.
Possible, but again, what market segments are you talking about, and what time frames? "At some point" is awfully vague.
Actually, I'd argue that ray-tracing has a rather weak footing among consumers. It sounds cool, you can do the age-old Toy Story comparison, but if the ray tracing examples have shown anything, it's that it is really hard for most non-experts (even tech enthusiasts) to say what "looks better", or even to tell a difference at all, even in static scenes! And there is good reason for that: the areas that are addressed by RTRT are also largely filtered out by the human brain. There is little reason really for consumers to pay anything extra for something they can barely notice. When will a tech-disinterested Fortnite player use RT for their lighting?

I feel there is an Ivory Tower attitude at Beyond3D when it comes to these kinds of things. Pushing the tech envelope is always good, whereas advances that make tech cheaper, more accessible, less power hungry or more ergonomic find little praise. For instance in a Navi thread now there is an interesting discourse on the feasibility of a chiplet approach to making GPUs. It's interesting, because most of us doubt that we can keep growing monolithic GPUs very much as we head further along the lithographic curves. But there is NO discussion of whether the basic idea of spending even more silicon area on GPUs is actually a good direction to move in for either industry or consumers. Do we want bigger GPUs for our Fortnite gaming even if we can figure out a way to construct them?

Again, the one thing I've said about RTRT is that it needs to be more efficient than what we already have. If it isn't, then it has to be justified to the normal people expected to fork over the money. How does RTRT fit with many of the big selling games and franchises trying to go mobile to reach a bigger audience, and developers gaining the best return on investment from surprise mechanics and selling digital swag? These kinds of real-world perspectives and trends have been pretty much completely absent from the RTRT discussions. A tech discussion neatly isolated from the realities of the markets the tech is supposed to serve.
 
My view on the timeline is that we will likely know where ray tracing falls during 2020. Either the AMD PC side and/or the console(s) embrace ray tracing, or they don't. Comments from AMD/console makers are pretty encouraging though.
 
I say Fall 2021 to know where RT happens to fall. If Nvidia is incapable of further accelerating RTRT then it might be considered a dead end.
 
Have previous dedicated RTRT hardware solutions not been viable due to the design or the software? A bit of both, but mostly just getting them supported in professional applications? I see something like RTX being successful compared to previous attempts mostly because Nvidia has a huge software stack in the field and the resources and money to throw at it to ensure it succeeds.
 