Should full scene RT be deprioritised until RT solutions are faster? *spawn

RT is just DXR. Since the introduction of DXR we have seen adoption in more than 215 titles (with dozens more planned), not to mention numerous mods, remixes and apps. We have moved from ray tracing a single element, to multiple elements, to path tracing. Objectively speaking, that's more adoption and progress than any recent version of DirectX (DX10/DX11/DX12). DX12 especially was a laughing stock in adoption rate, but DXR was and still is the opposite.
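For readers unsure what "DXR" refers to here: it's the ray tracing extension of the DirectX 12 API, and on the host side a frame's ray dispatch ultimately boils down to filling a D3D12_DISPATCH_RAYS_DESC and calling DispatchRays. A minimal C++ sketch, assuming the pipeline state object and shader tables have already been created elsewhere (the function and parameter names here are hypothetical):

C++:
#include <d3d12.h>

// Hypothetical helper: records one DXR dispatch into an already-open command list.
// Acceleration structures, root signatures and shader tables are assumed to exist.
void DispatchSceneRays(ID3D12GraphicsCommandList4* cmdList,
                       ID3D12StateObject* rtPipeline,
                       D3D12_GPU_VIRTUAL_ADDRESS raygenRecord, UINT64 raygenSize,
                       D3D12_GPU_VIRTUAL_ADDRESS missTable, UINT64 missSize, UINT64 missStride,
                       D3D12_GPU_VIRTUAL_ADDRESS hitTable, UINT64 hitSize, UINT64 hitStride,
                       UINT width, UINT height)
{
    D3D12_DISPATCH_RAYS_DESC desc = {};
    desc.RayGenerationShaderRecord.StartAddress = raygenRecord;
    desc.RayGenerationShaderRecord.SizeInBytes  = raygenSize;
    desc.MissShaderTable.StartAddress  = missTable;
    desc.MissShaderTable.SizeInBytes   = missSize;
    desc.MissShaderTable.StrideInBytes = missStride;
    desc.HitGroupTable.StartAddress    = hitTable;
    desc.HitGroupTable.SizeInBytes     = hitSize;
    desc.HitGroupTable.StrideInBytes   = hitStride;
    desc.Width  = width;   // typically the render resolution, one ray-gen invocation per pixel
    desc.Height = height;
    desc.Depth  = 1;

    cmdList->SetPipelineState1(rtPipeline);  // bind the ray tracing state object
    cmdList->DispatchRays(&desc);            // launch the ray generation shader grid
}

The call itself is simple; the per-engine integration work behind those adoption numbers sits in everything that feeds it (acceleration structure builds, shader tables, denoising).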

215 in 7 years is terrible, no matter how you put it.

With Sony and AMD now fully on board, the pace of adoption should be massively accelerated.

Adoption rate is always dictated by the performance of the low-end segment, which has seen terrible RT performance uplifts.

One of the most popular GPUs on Steam (the RTX 4060) still has lower RT performance than the 2018 RTX 2080 Ti (according to TechPowerUp's RTX 4060 review).

There's a big possibility that even the RTX 5060 won't offer the ~15% average RT performance increase required to match what the seven-year-old RTX 2080 Ti has.
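For context on where a figure like that comes from, here's the arithmetic as a quick sketch. The 87% relative RT performance number below is an illustrative assumption consistent with the claim above (the kind of figure a TechPowerUp relative-performance chart would give), not a measurement:

C++:
#include <cstdio>

int main() {
    // Assumed, for illustration only: a 4060-class GPU at ~87% of a 2080 Ti's average RT performance.
    const double relative_rt_perf = 0.87;
    const double uplift_needed = 1.0 / relative_rt_perf - 1.0;  // uplift required to close the gap
    std::printf("Uplift needed to match a 2080 Ti: ~%.0f%%\n", uplift_needed * 100.0);  // prints ~15%
    return 0;
}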

That is what is going to ultimately murder the RT adoption rate.

If you want RT for the masses, then you need to give the masses the performance to use RT in the first place. And that is something Nvidia and AMD are massively failing at doing.
 
Adoption rate is always dictated by the performance of the low-end segment,
No, adoption rate is only the number of games/apps/engines supported. RT is now adopted by every engine on the planet; there are games that won't launch without an RT GPU.

One of the most popular GPUs on Steam (the RTX 4060) still has lower RT performance than the 2018 RTX 2080 Ti (according to TechPowerUp's RTX 4060 review)
RT-capable hardware now comprises most of the hardware in the Steam survey (~75%). So adoption of both hardware and software is going super well. The 4060 plays games with RT well at 1080p given its limited 8GB VRAM buffer; you can't ask it for more than that.

And I remember it performing like ass until it was patched.
Games run like ass all the time; even raster games devoid of any RT still run like ass to this day. This is not how you measure adoption.
 
Better, and it was largely played on displays that didn't have a native resolution, so obtaining good performance while maintaining a sharp level of IQ was possible.
"Better"? It run in 480p with 30 FPS on a xbox 1. Here is a video with a Geforce 3 TI 200:

Can't find any benchmarks from 20 years ago. Battlefield V with ray tracing on Ultra runs at 60 FPS on a 2080 Ti at 1080p.
 
No, adoption rate is only the number of games/apps/engines supported. RT is now adopted by every engine on the planet.

And yet there's still no sign of game adoption rate increasing substantially.

there are games that won't launch without an RT GPU.

Literally a few in 7 years, so nothing to brag about.

RT-capable hardware now comprises most of the hardware in the Steam survey (~75%).

And the vast majority of it doesn't have the performance to effectively play games with RT enabled.

So adoption of both hardware and software is going super well.

215 games in 7 years begs to differ.

The 4060 plays games with RT well at 1080p given its limited 8GB VRAM buffer; you can't ask it for more than that.

The 4060 averages 31.5 fps at 1080p in AC: Shadows with RT enabled.

That's playing games 'well' to you?

Games run like ass all the time; even raster games devoid of any RT still run like ass to this day. This is not how you measure adoption.

I never said it was.
 
Yes, but it's a completely different thing to the HDR I referenced in this thread.

The HDR change you referenced was a much simpler change than RT or SVOGI, and not comparable at all in terms of effort or complexity. If you're ignoring that aspect then you can easily find lots of stuff to complain about.

Either way SVOGI has no compelling long term benefits that would entice an engine developer to choose it over RT. It’s dead end tech.
 
And yet there's still no sign of game adoption rate increasing substantially.
There are, you're just not seeing it. 215 games (not counting mods/apps) in 6.5 years is stellar, actually.

And the vast majority of it doesn't have the performance to effectively play games with RT enabled
They do, just not at max settings. Mid range GPUs are not necessarily meant to run games at max settings.

The 4060 averages 31.5 fps at 1080p in AC: Shadows with RT enabled
It averages 31 fps without RT as well. Besides, performance is not the same as adoption rate; you keep mixing the two. 4060 owners don't need to run the game at max settings; in fact, they rarely do.
 
Developer interest in SVOGI died when they realized the 8th-gen consoles were too weak to support it and it stayed dead when it was revealed that the 9th-gen consoles had HWRT. It's really that simple. Metro Exodus EE and Indiana Jones proved that even a machine with weak HWRT capability like the Series S could still handle diffuse RTGI, so choosing SVOGI over HWRT is pointless for any platform with any HWRT support at all, no matter how weak. Support for Pascal, little Turing (GTX 1600), RDNA1, and Vega is mostly irrelevant for developers making $70 current-gen only games with real-time GI, and if for some reason you want to support those obsolete architectures anyways Ubisoft's last few games have demonstrated that software RT is a viable fallback for HWRT.

There's a limited number of GPUs still in use with enough compute performance for SVOGI but no HWRT capability, and an extremely limited number of games that would benefit from real-time GI but need to target those older GPUs. For every other game there's no reason to consider anything other than baked GI or RTGI.
 
I’m referring to the HDR point he raised in this thread. Referring to HDR frame buffer formats. E.g. HL2.
Yes, but you replied to his reply about HDR screens. If you aren't continuing that specific line via arandomguy, the argument against HDR adoption should come from responding to a different point davis.anthony raised. ;)
 
Waiting for hardware adoption before adding software support is an absolutely backwards way of thinking.

Why would anyone buy better ray tracing hardware if no one was supporting it?


In fact this entire subject sounds suspiciously like an AMD fan discussion. "My AMD card doesn't support it, so shouldn't we wait until it does before games start supporting it?"

Just imagine if we had this mindset 20 years ago. 2005 and still making games in a 4:3 square screen 1024x768 format because not everybody had HDTV or widescreen monitors yet. Remember how the Xbox 360 launched with no HDMI connector because "not enough people had HDMI HDTVs" to bother supporting it? And are we all now saying that was a brilliant move and we should do the same with games now?
 
In fact this entire subject sounds suspiciously like an AMD fan discussion. "My AMD card doesn't support it, so shouldn't we wait until it does before games start supporting it?"
Even an RX 6600 can handle diffuse RTGI; AMD has already supported it for three generations now. Personally, I've read more comments from Pascal users complaining about Alan Wake 2 and Indiana Jones than from RDNA1/Vega users.
 