GPU Ray Tracing Performance Comparisons [2021-2022]

Better lighting cannot compensate for texture detail and vice versa.

It can, actually. I'd say lighting is the number one aspect for selling quality visuals. I even remember back in Half-Life 2's day, when Valve marketed Lost Coast specifically on its improved lighting and HDR. I was watching a trailer where Valve took the same stance: lighting is number one.
 
I honestly can't say that we have any issues with texture resolution in games these days, and I can't really see why it would need to be higher.
More unique textures in one frame might be an interesting change, but that would increase the number of assets to produce and inflate games' budgets.
I'd say that RT and things like Nanite are far more likely candidates for increasing VRAM usage going forward, not textures.
 
Computer graphics is essentially the simulation of light and its interaction with matter in a 3D world, as viewed from a discretized 2D viewport. All aspects of the simulation are important to invoke suspension of disbelief in the viewer. Lower-fidelity simulation of one aspect will be apparent when juxtaposed against higher-fidelity simulation of the others.

Textures are a crude way to approximate the interaction of light with matter. The past decade has seen major improvements in the fidelity of this simulation via physically-based material pipelines. Personally I think there's a stunning difference between the end of the PS360 era and the end of the PS4 era, and I would attribute the majority of that improvement to better material simulation. The *resolution* of material properties (textures) reaches diminishing returns at some point and becomes far less important to overall fidelity. Essentially what I'm saying is that we've made major strides in "textures" over the past decade; they are not the fidelity bottleneck at the moment.
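
For the curious, the heart of those physically-based material pipelines is a microfacet BRDF. Here's a rough sketch in Python of the common Lambert-diffuse-plus-GGX-specular combination (the function names and parameterization are mine for illustration, not lifted from any particular engine):

```python
import math

def ggx_ndf(n_dot_h, roughness):
    # GGX/Trowbridge-Reitz normal distribution function.
    # Disney-style remap: alpha = roughness^2, so alpha^2 = roughness^4.
    a2 = roughness ** 4
    d = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * d * d)

def fresnel_schlick(v_dot_h, f0):
    # Schlick's approximation to the Fresnel reflectance term
    return f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5

def smith_g1(n_dot_x, roughness):
    # Schlick-GGX geometry (masking/shadowing) term for one direction
    k = (roughness + 1.0) ** 2 / 8.0
    return n_dot_x / (n_dot_x * (1.0 - k) + k)

def brdf(n_dot_l, n_dot_v, n_dot_h, v_dot_h, albedo, roughness, f0=0.04):
    # Lambert diffuse lobe + Cook-Torrance GGX specular lobe
    diffuse = albedo / math.pi
    d = ggx_ndf(n_dot_h, roughness)
    f = fresnel_schlick(v_dot_h, f0)
    g = smith_g1(n_dot_l, roughness) * smith_g1(n_dot_v, roughness)
    specular = d * f * g / (4.0 * n_dot_l * n_dot_v)
    return diffuse + specular
```

The key point is that the *textures* (albedo, roughness, etc.) are just inputs to this function; once the shading model is physically plausible, cranking their resolution further buys you less and less.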

In contrast, light transport is still a bunch of horrific hacks, and it stands out. We still see floating objects without proper contact shadows, vanishing screen-space reflections, and nauseating cube maps. Artists burn effort faking GI with hand-placed lights. Ray tracing is the most straightforward way to correctly simulate light transport. And no, we don't need full-on path tracing to realize its benefits. We've seen a range of effective hybrid solutions from Insomniac, Remedy, and 4A, all of which can reach 60fps on existing commodity hardware, though it still takes careful dev effort to balance effectiveness with performance. Think about that -- it's already viable and showing massive visual benefits, and there's also a huge runway in front of it for further improvement. That makes RT *the* prime candidate for graphics evolution in the years ahead.
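
The contact-shadow case is a good illustration of why ray tracing is the "straightforward" fix: instead of filtering a shadow map, you just ask whether anything sits between the shading point and the light. A toy sketch in Python with a single sphere occluder (all the names here are mine, purely illustrative):

```python
import math

def ray_hits_sphere(origin, direction, center, radius, max_t):
    # Solve |origin + t*direction - center|^2 = radius^2 for t in (0, max_t).
    # direction is assumed normalized, so the quadratic's 'a' term is 1.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return 1e-4 < t < max_t  # small epsilon avoids self-shadowing

def in_shadow(point, light_pos, occluder_center, occluder_radius):
    # Cast a shadow ray from the shading point toward the light
    to_light = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(x * x for x in to_light))
    direction = [x / dist for x in to_light]
    return ray_hits_sphere(point, direction, occluder_center,
                           occluder_radius, dist)
```

Real hybrid renderers do exactly this query, just against a BVH of the whole scene via the hardware RT units; there's no resolution, bias, or screen-space visibility problem to tune around.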

I don't think texture *resolution* of all things is in need of comparable improvement.
 
It can, actually. I'd say lighting is the number one aspect for selling quality visuals. I even remember back in Half-Life 2's day, when Valve marketed Lost Coast specifically on its improved lighting and HDR. I was watching a trailer where Valve took the same stance: lighting is number one.

It can very well be the most important aspect, but that does not give it the ability to compensate for texture detail.
 
It doesn't replace the need for textures, but it's more important. It's a bigger factor in an image looking good. If you can only choose one of the two, good lighting is the one you want, not textures.

I actually managed to locate that ancient video I was talking about, with Valve and lighting :D

 
When it released, Half Life 2 (the main game) had stunning textures for its time. On the flip side, the lighting engine was beginning to show signs of age. Static lightmaps actually worked quite well for the look that the game was going for, but you could see resolution issues with the lightmaps. And yeah LDR lighting had run its course. So it makes sense that lighting improvements would be the next major step for the Source engine.

I recall that the first few instances of FP32-based HDR lighting weren't all that great, since people were still figuring out the right tone mapping equations. Games from that era have a typical surreal bloom look -- I'm thinking of AoE3, Gears of War, and the wave of UE3-based games (there was also a ton of horrid fake-HDR bloom, but that's another story). I think Uncharted 2 is around the time when people began to figure out how to do tone mapping right.
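
Those tone mapping equations are simple enough to write out. The early surreal look often came from naive global operators like Reinhard; the Uncharted 2 era popularized filmic curves with a proper toe and shoulder. A sketch in Python (the Hable curve constants below are the ones John Hable published for Uncharted 2; the function names are mine):

```python
def reinhard(x):
    # Simple global Reinhard operator: compresses [0, inf) into [0, 1)
    return x / (1.0 + x)

def hable_partial(x):
    # Hable's "Uncharted 2" filmic curve (shoulder/toe fit)
    A, B, C, D, E, F = 0.15, 0.50, 0.10, 0.20, 0.02, 0.30
    return ((x * (A * x + C * B) + D * E) /
            (x * (A * x + B) + D * F)) - E / F

def hable(x, white_point=11.2):
    # Normalize so the chosen white point maps exactly to 1.0
    return hable_partial(x) / hable_partial(white_point)
```

Reinhard desaturates and flattens highlights (everything asymptotes to 1 at the same rate), which is a big part of why mid-2000s HDR looked washed out; the filmic curve keeps contrast in the midtones and rolls off highlights more gracefully.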

With the HDR pipelines figured out some of the focus moved back to materials/shading, and that led to the physically based materials we have today.

And we're now back to lighting. Tick-tock. Of course I'm grossly simplifying the cadence of these things, but that's kinda how I've perceived things from 50,000 ft. over the past decade and a half.
 
I honestly can't say that we have any issues with texture resolution in games these days, and I can't really see why it would need to be higher.
More unique textures in one frame might be an interesting change, but that would increase the number of assets to produce and inflate games' budgets.
I'd say that RT and things like Nanite are far more likely candidates for increasing VRAM usage going forward, not textures.
I think texture resolution is still far too low to produce photo realistic graphics. Lighting is definitely the area that needs more work though.
 
I think that Far Cry 6 in particular, with the HD texture pack, is a sign that some hardware/drivers need to start handling memory overallocation better, because ultimately it's going to be more convenient for developers to rely on having this mechanism in place ...
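
For a sense of scale on why an HD texture pack blows past VRAM budgets so quickly, here's a back-of-envelope calculation in Python (the helper is mine; the 4/3 factor accounts for a full mip chain, and BC7 stores 16 bytes per 4x4 block, i.e. 1 byte per texel):

```python
def texture_bytes(size, bytes_per_texel, with_mips=True):
    # Memory for a square texture; a full mip chain adds ~1/3 on top.
    base = size * size * bytes_per_texel
    return int(base * 4 / 3) if with_mips else base

# A single 4K (4096x4096) texture:
rgba8 = texture_bytes(4096, 4)  # uncompressed RGBA8: ~89 MB with mips
bc7 = texture_bytes(4096, 1)    # BC7 block-compressed: ~22 MB with mips
```

Even block-compressed, a few hundred 4K material sets (albedo + normal + roughness/metalness each) add up to multiple gigabytes, so residency and overallocation handling matter far more than peak texture resolution.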
 
I think that Far Cry 6 in particular, with the HD texture pack, is a sign that some hardware/drivers need to start handling memory overallocation better, because ultimately it's going to be more convenient for developers to rely on having this mechanism in place ...
Hasn’t Nvidia historically been superior in this area under DX11 and prior? Is there much they can do to circumvent poor application behavior from a driver side under DX12?
 
I think that Far Cry 6 in particular, with the HD texture pack, is a sign that some hardware/drivers need to start handling memory overallocation better, because ultimately it's going to be more convenient for developers to rely on having this mechanism in place ...
From the Far Cry modding Discord, apparently:

[attached image: pEOwyid.png]

[attached image: Co10VSa.png]
 
Ubisoft is used to being embarrassed one way or another, without any consequences, so why not release the game like this? Not a lot of people care (sadly).
 
It's going to be pretty embarrassing for Ubisoft if that turns out to be true. They would have effectively released higher hardware recommendations to get around an easily rectified software bug. Recommendations that just happen to heavily favour the game's sponsor.

HD textures are the new tessellated concrete barriers?

I don’t understand what that discord post is saying. Does installing the HD pack result in lower quality textures being used in-game?
 
It's going to be pretty embarrassing for Ubisoft if that turns out to be true. They would have effectively released higher hardware recommendations to get around an easily rectified software bug. Recommendations that just happen to heavily favour the game's sponsor.
Exactly... and they're even doubling down on the issue being VRAM capacity, not any kind of bug:

[attached image: NNb8PDl.png]


It would make them look even worse, indeed.

Hopefully this issue can be 100% replicated and verified and perhaps @Dictator could make an update video bringing this specific discovery to Ubisoft's attention front and center in hopes of a proper fix.
 
What we need is eye tracking. That is really the magic bullet. If we can spend resources where the user is looking we could have amazing visuals with the hardware we have now and it would fix VR as well.
 
What we need is eye tracking. That is really the magic bullet. If we can spend resources where the user is looking we could have amazing visuals with the hardware we have now and it would fix VR as well.

Proper eye tracking would be magical but it will only solve for shading frequency. It won’t solve environment detail.
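
The shading-frequency point can be sketched with a toy model: acuity falls off quickly with distance from the gaze point, so you can shade coarser in the periphery, but the geometry and textures there still have to exist at full detail. A Python illustration (the radii and rate tiers are my own illustrative guesses, not from any headset spec):

```python
import math

def shading_rate(pixel, gaze, fovea_radius=100.0, mid_radius=300.0):
    # Pick a coarser shading rate the farther a pixel is from the gaze point
    dist = math.hypot(pixel[0] - gaze[0], pixel[1] - gaze[1])
    if dist <= fovea_radius:
        return 1  # full rate: one shade per pixel
    if dist <= mid_radius:
        return 2  # 2x2 coarse shading
    return 4      # 4x4 coarse shading in the far periphery

def relative_shading_cost(width, height, gaze):
    # Shading work as a fraction of full-rate shading: a pixel shaded
    # at NxN coarse rate costs 1/N^2 of a fully shaded pixel.
    shaded = sum(1.0 / shading_rate((x, y), gaze) ** 2
                 for y in range(height) for x in range(width))
    return shaded / (width * height)
```

With the gaze centered on a 640x360 frame, most pixels land in the coarse tiers, so the shading bill drops to a small fraction of full rate -- but nothing in this model reduces the triangles or texels you still have to stream and draw, which is the environment-detail caveat above.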
 