NVIDIA's Morgan McGuire predicts that the first AAA game to REQUIRE a ray tracing GPU ships in 2023

Also, when previous HW was released, the recent progress in denoising was not yet a thing. I think that's the tipping point.
Has denoising via Tensor cores been implemented in any professional RTX integrations yet?
 
Has denoising via Tensor cores been implemented in any professional RTX integrations yet?

Has there been any talk of game developers' thoughts on the tensor cores? How likely are we to see a worthwhile use of them in the nearish future?
 
I don't think JoeJ was referring to Nvidia's tensor-based denoising specifically, but rather to the very recent, impressive developments in RT denoising research and in temporal reprojection generally, whether AI- or compute-based, real-time or otherwise...
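For what it's worth, the core of the temporal techniques being discussed is easy to sketch: blend the current noisy frame into a reprojected history with an exponential moving average. A minimal sketch (ignoring motion vectors and disocclusion, which real denoisers must handle; all names and the toy per-pixel lists are illustrative, not any shipping denoiser's API):

```python
def temporal_accumulate(history, current, alpha=0.1):
    """Blend noisy current-frame samples into an accumulated history.

    history, current: lists of per-pixel values (previous / current frame).
    alpha: blend weight; smaller = more history = less noise, more ghosting.
    """
    return [(1.0 - alpha) * h + alpha * c for h, c in zip(history, current)]

# Repeatedly blending a constant signal converges the history toward it.
frame = [0.0] * 4
for _ in range(50):
    frame = temporal_accumulate(frame, [1.0] * 4)
print(frame[0])  # close to 1.0
```

Real implementations vary `alpha` per pixel and reject stale history on disocclusion; the EMA above is just the accumulation core shared by most approaches.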
 
I couldn't disagree more about the research. I actually feel very little research has been put into hardware acceleration of ray tracing (at the very least, nowhere near the effort rasterization gets). In fact, I'll wager that RTX(++?) and next-gen consoles will still lag behind the technology ImgTec had 5-10 years ago (yeah yeah, citation needed...). The field is wide open to me.

This is why I'm hoping, perhaps futilely, that IMGtec will re-enter the PC space at some point in the future. There's a lot of buzz around RT, Microsoft is interested in supporting, promoting and integrating it on PC, and more importantly IMGtec already have established IP in RT.

I'd try to see if Rys could comment on any of this, but I'm sure that if they are, and he has any knowledge of it, he wouldn't be able to talk about it. :)

But I can't help thinking that something like RT and IMGtec re-entering the PC space would be pretty exciting and worth going back to that company for. :D

Regards,
SB
 
This is why I'm hoping, perhaps futilely, that IMGtec will re-enter the PC space at some point in the future. There's a lot of buzz around RT, Microsoft is interested in supporting, promoting and integrating it on PC, and more importantly IMGtec already have established IP in RT.

I'd try to see if Rys could comment on any of this, but I'm sure that if they are, and he has any knowledge of it, he wouldn't be able to talk about it. :)

But I can't help thinking that something like RT and IMGtec re-entering the PC space would be pretty exciting and worth going back to that company for. :D

Regards,
SB
An ImgTec discrete RT-capable card and Intel's card both coming within a couple of years would be very exciting indeed.
 
I don't agree; ray-tracing chips have been looked into, designed, discarded...
You do have a point, of course, but it is not as strong as a PC-centric viewpoint might suggest. We will see, obviously.

I'm not saying it's never been tried! But let me rephrase it another way. I think one of the key aspects about imgtec's ray tracing hardware was they essentially had (incoming oversimplification) "hardware accelerated" BVH tree construction and searching (http://cdn.imgtec.com/sdk-presentations/gdc2014_introductionToPowerVRRayTracing.pdf). Nvidia's hardware doesn't have anything like that (drive.google.com/file/d/1B5fNRFwv2LsGlCBJ8oKYRiiDUtLMR4TY/view). There are still countless avenues worth exploring to accelerate ray tracing that don't directly involve firing rays.

Basically my problem with your post was "so holding one's breath is not a good idea". I bet there will be many advancements in the future that will have a non-trivial impact on performance (per watt). I mean, just look at rasterization! That's been researched into the ground and they still come up with new ideas like mesh shaders. Closing the book on ray tracing research seems insane to me!
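To make the two jobs concrete, here is a toy software sketch of both halves: building a BVH by median-splitting primitive bounding boxes along the widest axis, and searching it with a ray via the slab test. This is a textbook illustration under simplifying assumptions (AABB-only "primitives", rays that don't lie exactly on a slab plane for zero-direction axes), not a description of any vendor's hardware:

```python
def build_bvh(boxes, indices=None):
    """Recursively build a BVH; boxes are ((min x,y,z), (max x,y,z)) tuples."""
    if indices is None:
        indices = list(range(len(boxes)))
    lo = [min(boxes[i][0][a] for i in indices) for a in range(3)]
    hi = [max(boxes[i][1][a] for i in indices) for a in range(3)]
    if len(indices) <= 2:                                 # small leaf
        return {"bounds": (lo, hi), "prims": indices}
    axis = max(range(3), key=lambda a: hi[a] - lo[a])     # split widest axis
    indices.sort(key=lambda i: boxes[i][0][axis] + boxes[i][1][axis])
    mid = len(indices) // 2                               # median split
    return {"bounds": (lo, hi),
            "left": build_bvh(boxes, indices[:mid]),
            "right": build_bvh(boxes, indices[mid:])}

def hit_box(origin, inv_dir, box):
    """Slab test: does the ray intersect this AABB (in front of the origin)?"""
    tmin, tmax = 0.0, float("inf")
    for a in range(3):
        t0 = (box[0][a] - origin[a]) * inv_dir[a]
        t1 = (box[1][a] - origin[a]) * inv_dir[a]
        tmin = max(tmin, min(t0, t1))
        tmax = min(tmax, max(t0, t1))
    return tmin <= tmax

def traverse(node, boxes, origin, direction):
    """Collect indices of primitives whose boxes the ray intersects."""
    inv = [1.0 / d if d != 0 else float("inf") for d in direction]
    stack, hits = [node], []
    while stack:
        n = stack.pop()
        if not hit_box(origin, inv, n["bounds"]):
            continue
        if "prims" in n:
            hits.extend(i for i in n["prims"] if hit_box(origin, inv, boxes[i]))
        else:
            stack.append(n["left"])
            stack.append(n["right"])
    return hits

# Four unit boxes at the corners of a 10x10 layout; a +x ray at y=z=0.5
# should hit only the two boxes in the bottom row.
boxes = [((0, 0, 0), (1, 1, 1)), ((10, 0, 0), (11, 1, 1)),
         ((0, 10, 0), (1, 11, 1)), ((10, 10, 0), (11, 11, 1))]
root = build_bvh(boxes)
print(sorted(traverse(root, boxes, (-1.0, 0.5, 0.5), (1.0, 0.0, 0.0))))  # [0, 1]
```

The point of the post stands either way: both the build and the search are open design spaces (split heuristics, node layout, traversal order), whether done in software or fixed-function hardware.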
 
I'm not saying it's never been tried! But let me rephrase it another way. I think one of the key aspects about imgtec's ray tracing hardware was they essentially had (incoming oversimplification) "hardware accelerated" BVH tree construction and searching (http://cdn.imgtec.com/sdk-presentations/gdc2014_introductionToPowerVRRayTracing.pdf). Nvidia's hardware doesn't have anything like that (drive.google.com/file/d/1B5fNRFwv2LsGlCBJ8oKYRiiDUtLMR4TY/view). There are still countless avenues worth exploring to accelerate ray tracing that don't directly involve firing rays.

Basically my problem with your post was "so holding one's breath is not a good idea". I bet there will be many advancements in the future that will have a non-trivial impact on performance (per watt). I mean, just look at rasterization! That's been researched into the ground and they still come up with new ideas like mesh shaders. Closing the book on ray tracing research seems insane to me!

If someone uses PowerVR technology, I want to know why ray reordering is optional. That seems very strange...
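For readers unfamiliar with the term: ray reordering (coherency gathering) sorts batches of incoherent secondary rays so that rays with similar origins and directions are traced together, improving cache and SIMD utilization. Below is a generic sketch of the common octant-plus-Morton-code sort; it illustrates the idea only and is not PowerVR's actual (undisclosed) coherency engine:

```python
def morton3(x, y, z, bits=10):
    """Interleave the low `bits` bits of three ints into one Morton code."""
    code = 0
    for b in range(bits):
        code |= ((x >> b) & 1) << (3 * b)
        code |= ((y >> b) & 1) << (3 * b + 1)
        code |= ((z >> b) & 1) << (3 * b + 2)
    return code

def reorder_rays(rays, scene_min, scene_size, grid=1024):
    """Sort (origin, direction) rays by direction octant, then by a Morton
    code of the quantized origin, so coherent rays end up adjacent."""
    def key(ray):
        (ox, oy, oz), (dx, dy, dz) = ray
        octant = (dx < 0) | ((dy < 0) << 1) | ((dz < 0) << 2)
        q = [min(grid - 1, max(0, int((o - lo) / scene_size * grid)))
             for o, lo in zip((ox, oy, oz), scene_min)]
        return (octant, morton3(*q))
    return sorted(rays, key=key)
```

Reordering costs sorting bandwidth before any ray is traced, which is one plausible reason a design might expose it as optional: for already-coherent rays (primary, shadow rays to one light) the sort can cost more than it saves.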
 
I'm not saying it's never been tried! But let me rephrase it another way. I think one of the key aspects about imgtec's ray tracing hardware was they essentially had (incoming oversimplification) "hardware accelerated" BVH tree construction and searching (http://cdn.imgtec.com/sdk-presentations/gdc2014_introductionToPowerVRRayTracing.pdf). Nvidia's hardware doesn't have anything like that (drive.google.com/file/d/1B5fNRFwv2LsGlCBJ8oKYRiiDUtLMR4TY/view). There are still countless avenues worth exploring to accelerate ray tracing that don't directly involve firing rays.

Basically my problem with your post was "so holding one's breath is not a good idea". I bet there will be many advancements in the future that will have a non-trivial impact on performance (per watt). I mean, just look at rasterization! That's been researched into the ground and they still come up with new ideas like mesh shaders. Closing the book on ray tracing research seems insane to me!
Err, where has anyone said anything remotely interpretable as "closing the book on ray tracing research"?
Ray tracing and associated techniques have been researched for ages. Look at Siggraph articles from the Ice Age and forward. And hardware approaches have been both studied and produced. For recent examples look at SaarCor, or the ray tracing company that was bought by ImgTec, Caustic Graphics. There have been approaches earlier, during, and after that; Google "Ray Tracing FPGA" for a ton of links. Hell, even Intel has dabbled in this!
There is industry interest in this outside gaming that actually involves a fair bit of money. Not only in animation and video-effects production, but also in architecture and similar modelling. Ray tracing has been, is, and will continue to be of business interest regardless of any partial implementation in gaming.
Arguably, this is the real target of the RTX initiative. Gamers are just footing the bill.
 
What I'm claiming is that all of that research is a drop in a bucket compared to the time/effort/resources that have been put into rasterization. I think it's reasonable to still expect big advancements in the future for ray tracing (and rasterization!). Let's look at the popular two-level BVH approach used by most (all?) ray-tracing-based renderers (Embree, Radeon Rays, etc.). That structure wasn't really "decided on" until the last few years. When I was a researcher at university, kd-trees and octrees were still very much part of the mix (in fact, my first GPU ray tracer used kd-trees). I've seen first-hand huge advancements already through software alone!
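For context, the two-level idea (DXR's TLAS/BLAS, instancing in Embree and Radeon Rays) keeps geometry in per-object bottom-level structures and builds a small top level over object *instances* (transform plus a reference to a bottom level), so animating a scene only requires rebuilding the cheap top level. A toy sketch with AABB "primitives" and translation-only instances; all names are illustrative:

```python
def aabb_hit(origin, direction, box):
    """Slab test against an axis-aligned box ((min x,y,z), (max x,y,z))."""
    tmin, tmax = 0.0, float("inf")
    for a in range(3):
        if direction[a] == 0:
            if not (box[0][a] <= origin[a] <= box[1][a]):
                return False
            continue
        t0 = (box[0][a] - origin[a]) / direction[a]
        t1 = (box[1][a] - origin[a]) / direction[a]
        tmin, tmax = max(tmin, min(t0, t1)), min(tmax, max(t0, t1))
    return tmin <= tmax

def trace_two_level(origin, direction, instances):
    """instances: list of (translation, world_aabb, local_prim_aabbs).
    Hitting an instance's world-space box (top level) transforms the ray
    into local space (here just an offset) and tests that instance's
    bottom-level boxes. Returns (instance, primitive) index pairs."""
    hits = []
    for idx, (offset, world_box, prims) in enumerate(instances):
        if not aabb_hit(origin, direction, world_box):    # top level
            continue
        local_o = tuple(o - t for o, t in zip(origin, offset))
        for p, box in enumerate(prims):                   # bottom level
            if aabb_hit(local_o, direction, box):
                hits.append((idx, p))
    return hits
```

Note the design trade-off the post hints at: the bottom levels can be shared between instances, so geometry is stored once, and only the tiny top level changes per frame.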

So while I get ray tracing is certainly not a new topic, you completely lose me with this suggestion that we've basically gone as far as we can go and we just need to wait for the transistors to catch up. I'm not convinced we even have the "right" (best) acceleration structure for ray tracing, let alone hardware that can potentially accelerate that acceleration structure (something something inception...). :mrgreen:

And listen, no one, including Rys, loves Caustic more than me. But to suggest that they had all the ray tracing answers... I'm sure even they might disagree with that one. ;-) :D
 
I feel there is an Ivory Tower attitude at Beyond3D when it comes to these kinds of things. Pushing the tech envelope is always good, whereas advances that make tech cheaper, more accessible, less power hungry or more ergonomic find little praise.

That’s not fair. Pushing the tech envelope certainly requires and results in advances in cost and power efficiency in every segment of the market.

You won’t see people get that excited for integrated graphics because improvements there are seen mostly as a side effect of advances at the high end. Also, the general perception is that consoles are the lowest common denominator each generation that determines baseline software features.

Poor integrated performance doesn’t appear to be holding back the industry. Many games are still unplayable on the latest Intel IGPs at the lowest settings so they’re certainly not a baseline target for a lot of devs.
 
That’s not fair. Pushing the tech envelope certainly requires and results in advances in cost and power efficiency in every segment of the market.
Say what again? How does RTRT result in "advances in cost and power efficiency in every segment of the market"?
That's just bizarre.
You won’t see people get that excited for integrated graphics because improvements there are seen mostly as a side effect of advances at the high end.
That is actually not true generally. Trickle-down technology has worked in 3D graphics because we have seen great lithographic advances. But that well has been drying up, so when is the RTRT performance on offer from the 750 mm² 2080 Ti going to be available at attractive costs and 4K?
RTRT is not driven by consumers. They want cheaper, cooler and more performant tech. Control is probably the best example of RTRT at present. But enabling RTRT almost halves performance and drops RTX2080Ti performance levels to below GTX1660Ti. For a visual effect that most summarise as "ridiculously polished floors".
Nor by game publishers - they want to reach the largest possible addressable market. Adding RTRT to a title is cost without corresponding gain in sales - how many more copies of the latest Metro has been sold due to RTRT? Three?
Hardware manufacturers (at least Nvidia) are the only ones who have a real interest in this.
Poor integrated performance doesn’t appear to be holding back the industry. Many games are still unplayable on the latest Intel IGPs at the lowest settings so they’re certainly not a baseline target for a lot of devs.
But from a developer perspective, it is so much better to have a larger addressable market than to support a fancy feature available to some PC gamers. It would really benefit them if IGPs were actually reasonably performant. That would be something to strive for that would benefit everyone - except of course Nvidia since they don't offer anything in that market segment, but ultimately it would be to their advantage as well if the PC games market grew.
 
...
Control is probably the best example of RTRT at present. But enabling RTRT almost halves performance and drops RTX2080Ti performance levels to below GTX1660Ti. For a visual effect that most summarise as "ridiculously polished floors".
...

I would say this characterization is incredibly unfair, and maybe intentionally misleading. The ray tracing implementation in Control is far more than just glossy reflections.
 
I would say this characterization is incredibly unfair, and maybe intentionally misleading. The ray tracing implementation in Control is far more than just glossy reflections.
Yes, Control probably represents the most sophisticated implementation of RTRT in a commercial game to date. But I'd also say that the "polished floors" characterization is a fair description of the actual visual outcome. Even when I look at good comparison material like this at Techpowerup, where I can move sliders and go back and forth over still images, that characterization remains apt.
And the very fact that I have to do this, squint at still images hunting for differences aided by knowledge of the technical background, says that it would have effectively zero influence on my enjoyment from actually playing the game.

The visual effects that RTRT helps us with are quite subtle and the kind of image information that our brains are largely wired to discard when looking at a scene. Ergo, areas that are perfect for rendering shortcuts in order to save resources for where they make a greater difference. Or simply to save.

RTRT is driven by hardware manufacturers (well, Nvidia to be honest) and it may be enough, at some time in the future, to make it the default method for some (which?) aspects of lighting in real-time games.
Or not.
It may also be that the method is simply unsuitable for real time applications and that other approximations make more sense in the gaming market.

But from a consumer perspective, we have little reason to encourage methods that are wasteful with our resources, be they money, energy or whatever.
 
Yes, Control probably represents the most sophisticated implementation of RTRT in a commercial game to date. But I'd also say that the "polished floors" characterization is a fair description of the actual visual outcome. Even when I look at good comparison material like this at Techpowerup, where I can move sliders and go back and forth over still images, that characterization remains apt.
And the very fact that I have to do this, squint at still images hunting for differences aided by knowledge of the technical background, says that it would have effectively zero influence on my enjoyment from actually playing the game.

The visual effects that RTRT helps us with are quite subtle and the kind of image information that our brains are largely wired to discard when looking at a scene. Ergo, areas that are perfect for rendering shortcuts in order to save resources for where they make a greater difference. Or simply to save.

RTRT is driven by hardware manufacturers (well, Nvidia to be honest) and it may be enough, at some time in the future, to make it the default method for some (which?) aspects of lighting in real-time games.
Or not.
It may also be that the method is simply unsuitable for real time applications and that other approximations make more sense in the gaming market.

But from a consumer perspective, we have little reason to encourage methods that are wasteful with our resources, be they money, energy or whatever.

I am a console player, so I will only have access to RT next generation. Before Control I was not much of a fan of any RTX effort in AAA games, but in this game GI, even without RTX, is the basis of the artistic direction. The RTX version adds some better effects without looking like a bad patch on top of an artistic direction built on a rasterized game...

https://s0.gifyu.com/images/Control-paper1.gif

https://s3.gifyu.com/images/Control-particles2.gif

These are high-quality GIFs from the RTX version, and the game looks very good. I think the choices Remedy made with global illumination, the physics engine, and particles are good for the artistic direction of the game... The compromise was to sacrifice resolution, but I find it coherent.

EDIT: I don't know the performance of AMD's RT implementation, or Sony's; from what Square Enix said they are different, but it is good to have RT as a choice for studios to create games...
 
I'm playing through the game on a GTX card. It does not look nearly as good as the RTX version. The differences can be massive, especially in the places where the screen-space and global reflections break down. Oh, and light-leaking issues all over the place, especially around doors in dark areas.
 
Say what again? How does RTRT result in "advances in cost and power efficiency in every segment of the market"?
That's just bizarre.

Maybe try taking a breath and reading what I said properly. I said nothing about RT.

That is actually not true generally. Trickle-down technology has worked in 3D graphics because we have seen great lithographic advances. But that well has been drying up, so when is the RTRT performance on offer from the 750 mm² 2080 Ti going to be available at attractive costs and 4K?

Why does it matter? The cost of anything new will always be easier to bear at the top of the market. Margins are just too tight at the bottom.

RTRT is not driven by consumers. They want cheaper, cooler and more performant tech. Control is probably the best example of RTRT at present. But enabling RTRT almost halves performance and drops RTX2080Ti performance levels to below GTX1660Ti. For a visual effect that most summarise as "ridiculously polished floors".
Nor by game publishers - they want to reach the largest possible addressable market. Adding RTRT to a title is cost without corresponding gain in sales - how many more copies of the latest Metro has been sold due to RTRT? Three?
Hardware manufacturers (at least Nvidia) are the only ones who have a real interest in this.

Neither were tessellation or multi-texturing or any other incremental advancement in IQ. Everyone is entitled to their own opinion, but I'm certainly not hankering for the same tired old graphics at 250 fps. If there's a better use of transistor budget than RT, it'll be great to see that too if and when it appears.

But from a developer perspective, it is so much better to have a larger addressable market than to support a fancy feature available to some PC gamers. It would really benefit them if IGPs were actually reasonably performant. That would be something to strive for that would benefit everyone - except of course Nvidia since they don't offer anything in that market segment, but ultimately it would be to their advantage as well if the PC games market grew.

That’s a pretty silly argument. New features take a long time to saturate the market whether IGPs get them or not. And as I said above it’s easier to absorb the cost of new tech when you have margins and a power budget to play with.
 
And the very fact that I have to do this, squint at still images hunting for differences aided by knowledge of the technical background, says that it would have effectively zero influence on my enjoyment from actually playing the game.
Guess what? You have to do this for most games these days. If you compare low to ultra settings you won't find major differences in contemporary games, despite the 50% or more drop in performance between the two presets! This isn't limited to RT, but applies to EVERY major graphics setting (except textures). In fact, if you set textures and AA to ultra and set everything else to low, you'd be hard pressed to notice any difference unless you squint HARD.

Have a look at this YouTube channel, it features comparisons for low vs ultra in recent games with performance figures.

Control: almost no difference, despite 70%-90% drop in fps


WRC 8: minor shadowing differences despite 60%-80% drop in fps


Greedfall: minor lighting differences despite 80% drop in fps


Rage 2: minor differences


The visual effects that RTRT helps us with are quite subtle and the kind of image information that our brains are largely wired to discard
As shown above, this isn't limited to RT effects, but applies to all other effects, so that argument isn't really an excuse to suddenly consider all graphical advancements useless and irrelevant.

If we take the example of Control, then low vs ultra present very little differences in image quality, but add RTX effects to the pile, and you suddenly have a noticeable difference in reflection and shadow quality.
 
Guess what? You have to do this for most games these days. If you compare low to ultra settings you won't find major differences in contemporary games, despite the 50% or more drop in performance between the two presets! This isn't limited to RT, but applies to EVERY major graphics setting (except textures). In fact, if you set textures and AA to ultra and set everything else to low, you'd be hard pressed to notice any difference unless you squint HARD.

Have a look at this YouTube channel, it features comparisons for low vs ultra in recent games with performance figures.

Control: almost no difference, despite 70%-90% drop in fps


WRC 8: minor shadowing differences despite 60%-80% drop in fps


Greedfall: minor lighting differences despite 80% drop in fps


Rage 2: minor differences



As shown above, this isn't limited to RT effects, but applies to all other effects, so that argument isn't really an excuse to suddenly consider all graphical advancements useless and irrelevant.

If we take the example of Control, then low vs ultra present very little differences in image quality, but add RTX effects to the pile, and you suddenly have a noticeable difference in reflection and shadow quality.

Couldn't agree more; gone are the days when there was a big difference between low and ultra in the graphics department. Actually, even though RTX reduces performance a lot, it mostly brings a bigger visual improvement than going from, say, mid to ultra settings.

But from a developer perspective, it is so much better to have a larger addressable market than to support a fancy feature available to some PC gamers. It would really benefit them if IGPs were actually reasonably performant. That would be something to strive for that would benefit everyone - except of course Nvidia since they don't offer anything in that market segment, but ultimately it would be to their advantage as well if the PC games market grew.

The PC games market, and the whole gaming market, is actually growing. RT has a future, and Nvidia/AMD are investing in the technology because they know it will benefit them in the long run.
RTX can already be found in the 2060, and prices will probably come down once AMD's variants arrive in the near future.
 
This is why I'm hoping, perhaps futilely, that IMGtec will re-enter the PC space at some point in the future. There's a lot of buzz around RT, Microsoft is interested in supporting, promoting and integrating it on PC, and more importantly IMGtec already have established IP in RT.

I'd try to see if Rys could comment on any of this, but I'm sure that if they are, and he has any knowledge of it, he wouldn't be able to talk about it. :)

But I can't help thinking that something like RT and IMGtec re-entering the PC space would be pretty exciting and worth going back to that company for. :D

Regards,
SB

Considering this vacancy: https://www.imgtec.com/careers/vacancies/?job=496887 things don't look that good at this stage, and there's an overall endless silence surrounding the company. Under the Chinese government umbrella, the idea of them designing their own GPU hardware (and not just IP) doesn't sound that absurd anymore, but for that they obviously need to find some serious additional engineering talent, and even then I wouldn't expect anything all that ambitious considering what China would actually be aiming for with it. If I could imagine a possible consumer product at all, it would be something like a cheap GPU for everyone, which isn't bad per se - au contraire - but I fail to see how it would need anything like RT in the foreseeable future.
 