Value of Hardware Unboxed benchmarking

I am. The old renderer gives me a sense of many leaves casting shadows, while the RT shadows seem to come from hollow branches.
Far things cast blurry shadows. It's called life.


next to a noise-free rasterized effect be proof that it is a problem
Here is COD Black Ops 6 at max settings. 100% rasterization, 0% ray tracing, 100% noise galore.

 
RT doesn't have "a noise problem", that's just complete and utter BS on the part of HUB.
Graphics in general has a noise problem, and RT is in fact making that problem smaller.
Okay, maybe we are progressing here: RTRT being a subset of "graphics in general" means that RTRT can in fact have a noise problem. Now I would just remind you that Tim's video series isn't focused on "graphics in general", but on ray tracing specifically.

In my experience, the noise artifacts in RTRT effects are visually quite novel, and so are the denoising solutions. Maybe I'm just tripping, but to me the type of noise I see in Q2RTX is very different from the noise I see in vanilla Q2.

It's a topic worthy of exploring in video format and might be fun to look at in a couple of years when the tech has developed further. The video came off as neutral and informative to me, not sure where the need to hyperfocus on the title emerges from.
 
Tim’s video is fine in isolation. There are real issues with RT rendering that are worth talking about and resolving.

The issue is that HUB has never, to my knowledge, made a video about “any” raster artifact, of which there are many. So either they don’t actually care about artifacts, or they have somehow developed an interest in only RT artifacts.

This leads to the obvious conclusion that they’re only doing it for clicks, which is understandable since that’s how they make their money. Either way, let’s not pretend HUB cares about IQ in general when this is clearly just an RT hit piece.
 
Usually if you’re highlighting a problem you should frame it in the context of the alternatives. If you don’t provide the context no one can judge the scale or severity of the problem.

It would actually be interesting to see a response to that type of “fair and balanced” analysis from HUB. The comments under the current video are hilarious. Their content is really a lightning rod for the anti-RT folks. If you believed the stuff on there you would think SSR and shadow maps are pristine, games looked better 10 years ago, nobody in their right mind uses RT and Nvidia has 10% market share.
 
I can only recommend watching this part of the Cyberpunk comparison without sound:

Ask yourself what looks better and what the problems of each side are. Then watch it with sound.

I haven't seen any Black Myth in this video, so I think we shouldn't let this example pass by:
Reflections:

Shadows:
 
Sounds like you need to go outside during the day, look at some trees, and get a better idea of what they're trying to recreate. Shadows cast by leaves several metres off the ground should be blurry.
It sounds like you should have read a few posts before the one you replied to; I acknowledge that. But this is computer graphics for video games; reality does not matter.
This is exactly what I mean when I say that some people are so used to old artifacts that they consider them ground truth now.

Here's how forest canopy sun shadows look in real life:

The same goes for you. But also note the volume of shadow the tree crowns cast. The RT renderer doesn't do that.
The non-RT shadows in Indiana Jones are really noisy, with a lot of pop-in and erratic changes. Plus they're outright wrong with that crazy LOD!
I know and agree and regarding the noise I have made no point to the contrary. I am comparing two wrongs, both are painfully compromised to me.
 
But this is computer graphics for video games; reality does not matter.
Reality does matter, as what you see around you is just that - reality. Your eyes expect "good graphics" to look like the real world even if it's stylized to look a particular way (most movies are, and that's fine).

But even if we disregard that and imagine a game which wants to show graphics based on a set of rules for light propagation completely different from reality, you will still get better results with RT, because it is basically the only option which allows you to implement such a set of rules universally and without artifacts. Everything else is a set of various hacks which don't really have much in common with actual lighting.
 
I feel like this in part goes back to a now age-old issue: some subset of gamers really prefers sharpness and very defined, clean objects/geometry over all else.
 
I feel like this in part goes back to a now age-old issue: some subset of gamers really prefers sharpness and very defined, clean objects/geometry over all else.
Nah, it's a simple cost issue. You need to shell out a sum to get a GPU on which you can actually play with RT, and until this "issue" resolves itself naturally over time, we will keep getting these weird takes that RT is making graphics worse or just isn't worth it. It's a way some people are coping with a situation in which they simply can't run it on their PCs.
 
Ray tracing is generally thought to be this trade-off between visuals and performance. Turning on ray tracing improves the visual quality of the game, providing more realistic lighting effects, at the cost of reduced fps performance. But there's actually a second trade-off that is talked about less. Noise. Having now examined a wide variety of ray-traced games, this is an issue that keeps cropping up. In some ray-traced titles enabling the feature does improve lighting quality and realism, but the presentation also becomes noticeably more noisy, whether that's grain, sizzling, boiling or visual effect lag. In this video I'm going to explore these noise issues across a variety of games, including recent releases, because I think this problem is often glossed over when talking about ray tracing. Game developers need to do a better job of minimizing noise in games, and no, Nvidia's ray reconstruction technology isn't a perfect solution, as we'll show later, but also hardware needs to get much faster and more capable so that higher resolution and higher ray count effects can be implemented.

I wish youtube had transcripts (maybe it does for some videos?)

This was the opening statement from the video, basically the thesis. I hadn't gotten around to watching the whole thing, but I do have a number of problems with this off the bat. First is that he's saying noise is a trade-off for ray-tracing, so that's plainly suggesting that he thinks ray-tracing is more noisy than non ray-tracing. But that's selective or subjective about what kind of noise you're talking about. Blurring from TAA is noise, for example. All under-sampled rendering is noisy. Everything is under-sampled. If he wants to say he thinks this type of noise is worse, that's fine, but turning off ray-tracing just trades off one type of noise for others.

He seems to be talking about DXR exclusively, but screen-space ray-tracing can exhibit all of the same problems. GI and reflections are sort of the main culprits, as many people have already pointed out. There's also stuff with volumetrics and ray-marching, also done in screen space, that can have noise. Essentially, ray-tracing at low sample counts can have the same noise whether it's DXR or screen-space.

I also don't agree that the problem is glossed over. It's been one of the main discussion points since DXR was released and the first batch of games came out. Even Nvidia hasn't totally glossed over it. They've spent a lot of time talking about de-noising and ray-reconstruction. Here's a presentation on sampling and noise-patterns that's really exploring where things are going in terms of being able to generate good quality images from low-sample counts and being able to de-noise them well. I think gamers are fully aware, and the industry is investing in solving the problem. Ultimately it's a subjective issue whether someone turns it on or off.


Overall I'm sort of 50/50 on this. I think he generally means he doesn't like the type of noise that DXR games typically exhibit, and making a video about that and providing examples is fine. I do think it's a little disingenuous not to acknowledge that you can find the same type of noise in games that have screen-space effects. I don't think there's any obligation to go over every type of noise (TAA blur) in the context of this video, but it would have been good to highlight more pros and cons, like where DXR can remove other types of noise such as light leaking.
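The "effect lag" trade-off the video mentions falls out of how temporal accumulation works: blend in more history and the noise goes away, but the image reacts slowly to change. Here's a toy sketch of that idea; `noisy_sample`, the noise magnitude, and the blend weight are all made-up stand-ins, not any game's actual denoiser.

```python
import random

def noisy_sample(true_value, noise=0.5):
    """One under-sampled shading result: right on average, noisy per frame."""
    return true_value + random.uniform(-noise, noise)

def accumulate(frames, alpha):
    """Exponential moving average, the core idea behind temporal denoisers.
    alpha = weight of the new frame (low alpha = lean harder on history)."""
    history = frames[0]
    out = [history]
    for f in frames[1:]:
        history = alpha * f + (1 - alpha) * history
        out.append(history)
    return out

random.seed(0)
# Toy scene: a light is off (0.0) for 60 frames, then switches on (1.0).
signal = [0.0] * 60 + [1.0] * 60
frames = [noisy_sample(s) for s in signal]

smoothed = accumulate(frames, alpha=0.1)
# Heavy history reuse: the steady stretch looks clean, but after the switch
# the value takes many frames to catch up -- the lag/ghosting trade-off.
print(f"frame 61 (just after switch): {smoothed[60]:.2f}")  # still near 0
print(f"frame 90: {smoothed[89]:.2f}")                      # much closer to 1
```

Raise `alpha` and the lag shrinks while the residual noise grows; that dial is essentially what every temporal denoiser is tuning.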
 
Reality does matter, as what you see around you is just that - reality. Your eyes expect "good graphics" to look like the real world even if it's stylized to look a particular way (most movies are, and that's fine).

But even if we disregard that and imagine a game which wants to show graphics based on a set of rules for light propagation completely different from reality, you will still get better results with RT, because it is basically the only option which allows you to implement such a set of rules universally and without artifacts. Everything else is a set of various hacks which don't really have much in common with actual lighting.
Maybe my eyes don't have that expectation and certainly not my reasoning. Realistic graphics are often joyless and suppress good gameplay. We can do so much better than realism.

I can't wait for that RT-promised land to come. So far I got more bland graphics.
 
Maybe my eyes don't have that expectation and certainly not my reasoning. Realistic graphics are often joyless and suppress good gameplay. We can do so much better than realism.

I can't wait for that RT-promised land to come. So far I got more bland graphics.
Tell that to CyberPunk 2077.
Or Indiana Jones.
Hell, even try and tell that to Quake 1 or Half-Life 1 ray traced 🤷‍♂️
 

Solid investigation.

The basic summary is this:
  • Out of the 6 games they tested, 5 of them showed a performance drop of 2-3% when the overlay was enabled with the defaults.
  • The exception is Hogwarts Legacy, which shows a ~7% drop (!) when the overlay is enabled with default settings.
  • However, in the majority of those cases, this performance loss can be completely mitigated by turning off the Photo Mode/Game Filters option.

The exception, again, is Hogwarts Legacy, which still shows a very small performance hit even with photo mode/filters off, but as Tim describes it, it's basically 'margin of error' stuff.

For the most part, keeping the overlay on but just disabling the photo mode/filters results in no performance loss.

Here's the kicker though - they were smart enough to also test the overlay with the old GeForce Experience. Regardless of settings, GFE showed no performance loss whatsoever.


So Tim's conclusion is that this is likely a bug. When it was introduced, though, is the question: considering how long the Nvidia App has been in beta, if this was the behavior from the outset it speaks to some kind of gap in Nvidia's QA. Simply having the filter/photo mode enabled but not actually active should not induce a performance hit, no matter how minor.

The good news is the GFE results demonstrate this is not an inherent problem with the drivers/Windows so hopefully it can/will be addressed in future updates.
 
I wish youtube had transcripts (maybe it does for some videos?)

This was the opening statement from the video, basically the thesis. I hadn't gotten around to watching the whole thing, but I do have a number of problems with this off the bat. First is that he's saying noise is a trade-off for ray-tracing, so that's plainly suggesting that he thinks ray-tracing is more noisy than non ray-tracing. But that's selective or subjective about what kind of noise you're talking about. Blurring from TAA is noise, for example. All under-sampled rendering is noisy. Everything is under-sampled. If he wants to say he thinks this type of noise is worse, that's fine, but turning off ray-tracing just trades off one type of noise for others.

He seems to be talking about DXR exclusively, but screen-space ray-tracing can exhibit all of the same problems. GI and reflections are sort of the main culprits, as many people have already pointed out. There's also stuff with volumetrics and ray-marching, also done in screen space, that can have noise. Essentially, ray-tracing at low sample counts can have the same noise whether it's DXR or screen-space.

I also don't agree that the problem is glossed over. It's been one of the main discussion points since DXR was released and the first batch of games came out. Even Nvidia hasn't totally glossed over it. They've spent a lot of time talking about de-noising and ray-reconstruction. Here's a presentation on sampling and noise-patterns that's really exploring where things are going in terms of being able to generate good quality images from low-sample counts and being able to de-noise them well. I think gamers are fully aware, and the industry is investing in solving the problem. Ultimately it's a subjective issue whether someone turns it on or off.


Overall I'm sort of 50/50 on this. I think he generally means he doesn't like the type of noise that DXR games typically exhibit, and making a video about that and providing examples is fine. I do think it's a little disingenuous not to acknowledge that you can find the same type of noise in games that have screen-space effects. I don't think there's any obligation to go over every type of noise (TAA blur) in the context of this video, but it would have been good to highlight more pros and cons, like where DXR can remove other types of noise such as light leaking.

There is a type of noise that is unique to RT techniques because they are stochastic techniques, whereas raster is analytical (though "raster" images can of course include stochastic effects, as you mention). The mathematical basis for image formation is completely different. Analytical techniques give an exact answer to an equation (to some precision), whereas stochastic techniques only give an estimate of the answer to an equation (one which is usually very difficult or impossible to solve). There will always be noise associated with estimating a solution with random variables, and that type of noise is unique to stochastic techniques. I do think aspects of stochastic methods should probably be more broadly discussed if the enthusiast community wants to understand ray-tracing as well as it understands raster. I'm biased because I'm a stats nerd, but hasn't it gotten a bit boring simply discussing ray-tracing in the most remedial manner, i.e., what effects it's used for? Maybe it's time to discuss things like bias and variance and their trade-off in different rendering/denoising solutions (not a direct reply, meant generally)?
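The analytical-vs-stochastic distinction above is easy to demonstrate with a toy integral. This is a hypothetical stand-in for the rendering equation, not renderer code: the integrand and sample counts are arbitrary choices for illustration.

```python
import math
import random

# Analytical: the integral of sin(x) over [0, pi] has the exact answer 2.
analytic = 2.0

def monte_carlo(n, rng):
    """Stochastic: estimate the same integral from n random samples.
    The result is only an estimate; its scatter around 2 is the 'noise'."""
    total = sum(math.sin(rng.uniform(0.0, math.pi)) for _ in range(n))
    return math.pi * total / n  # (interval length) * (mean of samples)

rng = random.Random(1)
# Repeat the low-sample estimate many times to see its distribution.
estimates = [monte_carlo(16, rng) for _ in range(1000)]
mean = sum(estimates) / len(estimates)
var = sum((e - mean) ** 2 for e in estimates) / len(estimates)

print(f"analytic answer:   {analytic}")
print(f"mean of estimates: {mean:.3f}")      # unbiased: converges to 2
print(f"std of estimates:  {var ** 0.5:.3f}")  # nonzero: inherent noise
```

The estimator is unbiased (its average converges to the exact answer), but any single 16-sample estimate is noisy, which is exactly the bias/variance framing above: denoisers trade some bias for a large reduction in that variance.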

There is also the fact that image quality for a stochastic technique should depend on sample size, yet there is no discussion of image quality differences with respect to framerate. Is that because there is no difference, there is no difference at playable framerates, or there is a difference but it hasn't been investigated? I.e., if you have a 4070(X)(X) doing 30 fps in some fully ray-traced scene and a 4090 doing 60 fps with the exact same settings, is the image quality in each frame from the 4070 exactly the same as from the 4090, even though it only has half the samples per unit time? You'd expect that to be true of an image formed mostly from raster, but it's not a given that it would be true of an image formed mostly from ray tracing.
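The sample-budget question can be made concrete with a toy model. Everything here is invented for illustration (a fixed samples-per-second budget split across frames, Gaussian per-sample noise); it just shows the standard Monte Carlo result that per-frame noise scales as 1/sqrt(samples per frame).

```python
import random

def frame_estimate(n_samples, rng):
    """One pixel's value for one frame: average of n noisy radiance
    samples whose true value is 1.0 (0.3 = per-sample noise, arbitrary)."""
    return sum(rng.gauss(1.0, 0.3) for _ in range(n_samples)) / n_samples

def per_frame_noise(n_samples, trials, rng):
    """Empirical standard deviation of the per-frame estimate."""
    vals = [frame_estimate(n_samples, rng) for _ in range(trials)]
    mean = sum(vals) / trials
    return (sum((v - mean) ** 2 for v in vals) / trials) ** 0.5

rng = random.Random(7)
budget = 3840  # hypothetical fixed samples-per-second a GPU can trace
# Same per-second budget: 30 fps gets 128 samples/frame, 60 fps gets 64.
noise_30 = per_frame_noise(budget // 30, 2000, rng)
noise_60 = per_frame_noise(budget // 60, 2000, rng)

print(f"per-frame noise at 30 fps: {noise_30:.4f}")
print(f"per-frame noise at 60 fps: {noise_60:.4f}")
print(f"ratio: {noise_60 / noise_30:.2f}")  # about sqrt(2) = 1.41
```

Under this time-based budget, halving the frame time raises per-frame noise by sqrt(2). In practice games fix samples per pixel per frame instead, which is why the faster card mostly buys framerate rather than a cleaner frame, and why the per-frame IQ question the post raises is worth actually measuring.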
 