No DX12 Software is Suitable for Benchmarking *spawn*

I'd argue this isn't an opinion, it's just plain wrong.

Not wrong for me. I wouldn't use the 2060S for RT other than to see it and then immediately disable it. Even with DLSS, the performance hit just isn't worth it in the vast majority of current "RT" enabled games. Metro: Exodus and maybe Control are the only two where I might consider it, but if I can't run the game at a locked 60 FPS at 3200x1800 (my current standard gaming resolution on a 55" display) with, at worst, the DLSS Quality setting, then it's just not something I would enable.

I prize fluid gameplay in a window at IQ settings acceptable to me over pushing the graphics settings so high that it either impacts my gameplay experience or forces me to play in a smaller window. Non-standard and pretty specific, yes. But I want to play games how I want to play them and not how someone else tells me I should play them.

Of course, it's a moot point at the moment since I'm not willing to pay out the wazoo for a 2060S. :p But maybe in the fall eBay will get flooded with them and I might take a chance on a used GPU for the first time ever.

I'm excited for the future of RT, but it's currently not quite there yet for me. It's a similar situation to how I immediately disabled shadows in all games up until about 2 or 3 years ago, when I started allowing shadows to be on in some games where they weren't so distractingly "wrong." With no shadows it was also distractingly wrong, but turning them off freed up a lot of extra performance I could use for other IQ settings that weren't so wonky to me.

Regards,
SB
 
He acknowledges all of what you said in his summary at the end of the video.
Verbal acknowledgment is just worthless when you have all of these charts showing how the 5700XT is faster than the 2060 Super, without even bothering to test the 2060 Super with DLSS, as it should be, or without showing how much better IQ the 2060 Super can access through RT. He uses the video to just reinforce his old view, nothing more.
 
But I want to play games how I want to play them and not how someone else tells me I should play them.
Which makes a 2060S a better fit for you since you'd have way more options in how you want to play your games on it.
As I've said, blanket statements about RT make no sense. Take the upcoming FH5 for example - if it keeps its RT usage to Forzavista on PC, then it won't even be capable of affecting gameplay performance in this title and will only improve the "photo mode". Would you prefer for RT to be absent from the game in such a case too?
 
Which makes a 2060S a better fit for you since you'd have way more options in how you want to play your games on it.
As I've said, blanket statements about RT make no sense. Take the upcoming FH5 for example - if it keeps its RT usage to Forzavista on PC, then it won't even be capable of affecting gameplay performance in this title and will only improve the "photo mode". Would you prefer for RT to be absent from the game in such a case too?

It's not gameplay, so I don't really care about "photo mode." I'm not a screenshot junkie. :p

I'm going to pick the video card that allows me to play the game at the settings I want, at the resolution I've chosen, at the framerate I want (currently 60 FPS, but as soon as I get an HDMI 2.1 card, that will be 120 FPS minimum). DLSS Quality (Balanced and Performance have so many artifacts visible to me in most games that they aren't usable for me) might bring it up to that point in some games where another card can't.
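For reference, here's a rough sketch of what those presets mean in terms of internal render resolution at 3200x1800, assuming the commonly cited DLSS 2 per-axis scale factors (Quality ~2/3, Balanced ~0.58, Performance 0.5; the exact factors can vary per title, so treat this as illustration only):

// Approximate DLSS 2 internal render resolutions for a 3200x1800 output.
// Scale factors are the commonly cited defaults, not confirmed per game.
#include <cmath>
#include <cstdio>

int main() {
    const int outW = 3200, outH = 1800;
    struct Preset { const char* name; double scale; };
    const Preset presets[] = {
        {"Quality",     2.0 / 3.0},
        {"Balanced",    0.58},
        {"Performance", 0.5},
    };
    for (const Preset& p : presets) {
        const int w = static_cast<int>(std::lround(outW * p.scale));
        const int h = static_cast<int>(std::lround(outH * p.scale));
        std::printf("%-11s renders at ~%dx%d, upscaled to %dx%d\n",
                    p.name, w, h, outW, outH);
    }
    // Quality: ~2133x1200, Balanced: ~1856x1044, Performance: 1600x900.
    return 0;
}

The lower the internal resolution, the more the reconstruction has to invent, which tracks with Balanced and Performance showing more visible artifacts than Quality.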

So for the 2060S, RT is a non-factor, but DLSS Quality might be something that could give it an advantage. Or it may not be enough. I'll make that decision when it comes time to choose a card that fits in my budget. Basically, whenever cards get back to reasonable prices, whatever fits in my budget at that time will be the card I get.

The requirement will be the ability to run games at 120 Hz in a 3200x1800 window at settings that are acceptable to me (which will certainly be better than my 1070 :p). Whatever fits into my budget with the best IQ at that resolution and framerate will likely be the card I get. For the current NV and AMD generation cards, RT isn't really a factor. So it basically comes down to DLSS, but DLSS isn't available in the vast majority of games that I play (mostly Indie and AA, I've played 1 AAA game in the past year), so it doesn't have much of a bearing on my choice. At the level of games I play, FSR is far more likely to be available or get implemented, but the quality hit for FSR means that it also doesn't have much bearing on my purchasing decision for the current generation of GPUs.
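To put that requirement in rough numbers, a back-of-the-envelope comparison of raw pixel throughput (this is only my arithmetic and ignores per-pixel cost differences between games and settings; it's purely about scale):

// Raw pixel throughput of the stated target vs. a couple of common baselines.
#include <cstdio>

int main() {
    struct Mode { const char* name; long long w, h, hz; };
    const Mode modes[] = {
        {"3200x1800 @ 120 Hz", 3200, 1800, 120},
        {"3840x2160 @  60 Hz", 3840, 2160, 60},
        {"2560x1440 @ 120 Hz", 2560, 1440, 120},
    };
    const double target = 3200.0 * 1800.0 * 120.0;  // ~691 Mpix/s
    for (const Mode& m : modes) {
        const double pps = static_cast<double>(m.w * m.h * m.hz);
        std::printf("%s: %.0f Mpix/s (%.2fx the target)\n",
                    m.name, pps / 1e6, pps / target);
    }
    // 4K60 is ~498 Mpix/s, so the target needs roughly 1.4x the pixel rate of 4K60.
    return 0;
}

In other words, 3200x1800 at 120 Hz asks for roughly 40% more raw pixel rate than 4K60, which is why plain rendering speed is the first filter in the choice.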

So, it basically comes down to basic non-RT, non-DLSS, non-FSR rendering speed. But if the performance is so close as to be basically the same between two cards, then RT might tip the balance as something I can look at and then disable just to see what I can maybe look forward to in the next generation of GPUs.

Regards,
SB
 
It's not gameplay, so I don't really care about "photo mode." I'm not a screenshot junkie. :p
Just drop it, there are some people here for whom only their subjective view matters and everyone disagreeing with that specific view is just wrong no matter what.
 
For the current NV and AMD generation cards, RT isn't really a factor
It is when you're getting releases which won't even launch without RT support. The number of such releases will increase as support for the previous generation of consoles is phased out. Both the 5700XT and the 2060S are fast enough to deal with games from the new console generation (if only at reduced settings/resolutions), but one of them may not have the features required to run them. So it is a factor already.
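For illustration, a minimal sketch of the kind of capability check an RT-required game can perform at startup: querying the D3D12 raytracing tier (device creation and error handling are deliberately stripped down; how a given engine actually reacts to the result is up to its developers):

// Query the D3D12 raytracing tier of the default adapter.
// A GPU reporting D3D12_RAYTRACING_TIER_NOT_SUPPORTED (e.g. the 5700XT) cannot
// run a DXR-required title regardless of its rasterization speed.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available.");
        return 1;
    }
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &options5, sizeof(options5)))) {
        if (options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
            std::puts("DXR supported: ray tracing can be enabled or required.");
        else
            std::puts("DXR not supported: an RT-required game would refuse to run.");
    }
    return 0;
}

This is what turns hardware RT support into a hard requirement rather than a performance preference: if the tier comes back as not supported, no amount of raw speed gets the game running.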

Just drop it, there are some people here for whom only their subjective view matters and everyone disagreeing with that specific view is just wrong no matter what.
There is nothing "subjective" in anything I've said here today.
 
I hope this gives more of an idea of what B3D PC discussions are reduced to now with anything Nvidia related. There are no more "discussions" here.
 
I hope this gives more of an idea of what B3D PC discussions are reduced to now with anything Nvidia related. There are no more "discussions" here.
With HUB as the source of discussion, what did you expect? I suppose they are using RBar to increase performance but not DLSS in games that support it.
 
Not wrong for me. I wouldn't use the 2060S for RT other than to see it and then immediately disable it. Even with DLSS, the performance hit just isn't worth it in the vast majority of current "RT" enabled games. Metro: Exodus and maybe Control are the only two where I might consider it, but if I can't run the game at a locked 60 FPS at 3200x1800 (my current standard gaming resolution on a 55" display) with, at worst, the DLSS Quality setting, then it's just not something I would enable.

I prize fluid gameplay in a window at IQ settings acceptable to me over pushing the graphics settings so high that it either impacts my gameplay experience or forces me to play in a smaller window. Non-standard and pretty specific, yes. But I want to play games how I want to play them and not how someone else tells me I should play them.

And there's obviously nothing wrong with having this preference. But there's a world of difference between saying "I wouldn't choose to enable RT because I favour framerate and/or image quality over core graphics" and saying "RT isn't viable at all on a specific GPU".

The 2060S has demonstrated RT performance on par with or even ahead of the PS5 and XSX even without DLSS. If we were to accept HUB's conclusion on this then we would also have to accept that RT isn't viable on those consoles, to say nothing of the XSS. But clearly that would be a statement that flies in the face of reason given how many RT supporting games are being released and enjoyed on them.

The definition of unviable is "not capable of working successfully" or "can't be done". How can anyone look at benchmarks like this and think those statements can reasonably apply?

[Chart: Doom Eternal GeForce RTX benchmark, 2560x1440, ray tracing on, NVIDIA DLSS Performance]


I'm excited for the future of RT, but it's currently not quite there yet for me. It's a similar situation to how I immediately disabled shadows in all games up until about 2 or 3 years ago, when I started allowing shadows to be on in some games where they weren't so distractingly "wrong." With no shadows it was also distractingly wrong, but turning them off freed up a lot of extra performance I could use for other IQ settings that weren't so wonky to me.

Regards,
SB

I understand what you're trying to say although I'm not sure this is the best analogy for RT given that the whole point of RT is to do away with elements of the lighting and shadowing model that look distractingly wrong. RT isn't equivalent to the shadow problem you describe above, it's the solution to it. ;)
 
How damn hard is it for people to understand that not everyone shares the exact same views on things?
You think RT is the gift from the gods and whatnot? Good for you, there's plenty of reviews to choose from which cater to your views.
You think RT isn't relevant at this time? Good for you, there's at least one site which agrees and provides relevant benchmarks for you too.
Same goes for DLSS, FSR and whatever else someone comes up with.

Grow the F up. If you don't consider HUB or [insert any site] to be catering to your views, then don't. But don't try to force your view on everyone and dismiss reviews which don't happen to fit your needs on behalf of everyone else and their needs.
 
Reviews and benchmarks are not "personal views".
What kind of games and settings you want to see tested are. And not everyone has to cater to what you want and it doesn't make them any less worthy or relevant - except for you.

Hell, even people trying to recruit new followers to a religion aren't that stubborn.
 
Hardwareunboxed did run a survey which showed that the majority of their viewers didn't care about raytracing. Until a new survey shows a major shift in favor of raytracing, or they decide themselves before that point that raytracing can no longer be ignored, I don't see why people need to question their methods.

I can understand the minority of their viewers making noise about the lack of raytracing coverage, but for the rest of the pro-raytracing crowd, there is indeed no shortage of sites providing those results.

If your viewers specifically tell you what to focus on and you go the other direction it's not a wise choice.


Reviews and benchmarks are not "personal views".

The numbers from the benchmarks aren't "personal views", but whether setting X or Y is important, and even which games you do end up testing, definitely are.
 
What kind of games and settings you want to see tested are.
What games and settings are getting tested influence the results which you're getting. Which is why it is a rule of thumb to provide as much data as you can instead of resorting to "personal opinions" on what you think about features like RT.

And not everyone has to cater to what you want and it doesn't make them any less worthy or relevant - except for you.
When a lot of people consider your reviews worthless and irrelevant due to how you handle testing and the presentation of data then they kinda become such, and not only to me.

The numbers from the benchmarks aren't "personal views", but whether setting X or Y is important definitely is.
The numbers are a direct result of what settings are being used for benchmarking.

Hardwareunboxed did run a survey which showed that the majority of their viewers didn't care about raytracing.
Also this - well this is just hilarious. So the majority of viewers who they've been telling for several years now that RT sucks and kills performance without showing much data to back that claim up think that RT is bad and don't care about it? Wow what a result!
 
It’s ridiculous to call their benchmarks synthetic. They are perfectly viable. And DLSS does not offer better quality than native when used at 1080p, the most likely resolution for this performance tier.
 
The numbers are a direct result of what settings are being used for benchmarking.

Yes, what is this even an answer to? As said, the numbers aren't personal views, but the decisions to skip setting X and/or Y for whatever reason, or which games to test, those are personal views. In this case, forgoing raytracing because the vast majority of their viewers don't consider it important.

Also this - well this is just hilarious. So the majority of viewers who they've been telling for several years now that RT sucks and kills performance without showing much data to back that claim up think that RT is bad and don't care about it? Wow what a result!

They did have special videos focusing only on raytracing too, but do consider that their viewers aren't robots either; if they were convinced by what they saw of raytracing, whether from HU or other sites, they are free to voice their own opinion that raytracing should have a bigger part in future tests. Worst case for HU, they could even lose viewers.

It would be interesting, however, if e.g. Sony or Microsoft could show us numbers on whether most of their userbase plays with raytracing on or turns it off for better performance.
 