Nvidia giving free GPU samples to reviewers that follow procedure

This is only a big deal because it is nVidia tech. If RT was AMD only tech, no one would give a shit and everyone would be arguing the exact opposite of what is being argued now. This is how it has gone historically, and sadly, nothing has changed.

Not true.

The internet went gaga for async compute when AMD was well ahead. Nobody argues about it now because Nvidia has caught up. Same way nobody argues about tessellation now that AMD has caught up. And one day nobody will be arguing about raytracing either. People don't change, only the things they argue about do.
 
I'm fairly sure I remember async compute being dismissed and regarded as a hack to work around AMD's poor hardware design at Nvidia's expense. I also remember people viewing low-level APIs quite negatively and considering them to be "AMD constructs."
 
Is that any different to people dismissing RT because it’s not in the majority of games released in 2020 even though it’s present in many of the biggest titles? The reality is that async was an important advancement and RT is also.

Putas mentions bad faith and I completely agree. Dismissing one of the most fundamental and exciting advances in rendering tech on a graphics forum because it hasn’t dominated the market yet after just 2 years is as bad faith as it gets.
 
Very few people are dismissing it. Being of the opinion that its contribution to visual quality hasn't yet been worth the performance hit is a completely reasonable view. Backing up Nvidia because an independent reviewer doesn't meet their criteria for RT content is absolutely unreasonable. I'll tell you something I don't remember, though: people shitting on any review that didn't have enough games utilizing async compute via "AMD constructs." The overly aggressive Nvidia fans consistently display more bad faith than any group I've ever come across in tech.
 
Most people here are not fanboys and are not "overly aggressive". Someone said I was an Nvidia fanboy because I believe in showing what the card can do in an X vs Y review. And I didn't say "RT is great", I just said "talk about it, good or bad, show the results when testing a compatible game". I'm not a fan of a dedicated video outside of the review. I don't believe it's an overly aggressive stance... (And for the record my last 3 GPUs are R290 => Fury X => Vega FE, so...)

But some youtubers will shout "FANBOYS" when people disagree or criticize their work...
 
You can't cover everything in every video, that's a hopeless cause. Even if they did include RT games there would be tons of other things one could bring up which that particular video doesn't touch at all. What makes RT optimized for a single manufacturer so special that it needs to be included everywhere?

No, but in a video of 40 benchmarks, several of which are of RT-enabled games, you could at least include a few RT results for those games. Both GPUs in question can enable RT. Both GPUs in question are high end. PC gamers tend to want to run at the highest settings possible on a high-end GPU. So showing results for games at lower than their maximum settings, just because the lowered settings significantly favour one vendor over the other, is blatantly misleading IMO.

Which is better: today's reviews for today's software, or today's reviews trying to predict the future? Especially with software* that has only been optimized for one manufacturer's RT hardware, how well do you think they could ever represent the games of tomorrow, with RT designed to fit consoles and both manufacturers?

*We're only now starting to get AMD-optimized titles, 2 out there now, but they've been dismissed by the RT crowd because "they barely use RT at all".

You're making assumptions about the direction of RT based on no evidence. Using your own logic, is it better to benchmark based on today's available RT games, or to try to predict the future by not reviewing RT at all because "today's RT games probably aren't representative of future RT games"?

Some of the biggest releases right now are RT enabled so it makes sense to at least include those in the review, and yes of course that should include the ones that perform relatively better on AMD hardware. People can make their own judgement about whether they represent the future or not.
 
Compared to the mass of games released without it in the same time frame, it is.
Which is the fate of any new feature; ignoring that new feature in testing because it's still new is both dumb and misleading.

When DX12 was new, or any DX version for that matter, reviewers tended to make sure their reviews contained a game with the new version. Heck, Vulkan games are fewer than DXR games, yet reviews often contain at least one Vulkan game.
 
Which is the fate of any new feature; ignoring that new feature in testing because it's still new is both dumb and misleading.
It's not, when you have valid reasons for ignoring it*, like in this case the fact that, for many, the current uplift in graphics that RT provides is not worth the performance hit it causes. Their general audience agrees with them, too.
*They're not really ignoring it; they ignored it in that particular review but have plenty of RT content in other videos.
When DX12 was new, or any DX version for that matter, reviewers tended to make sure their reviews contained a game with the new version. Heck, Vulkan games are fewer than DXR games, yet reviews often contain at least one Vulkan game.
Well this is simply not true for Vulkan: PCGamingWiki lists nearly 100 Vulkan titles, which is in fact close to the DX12 count, as they list 103 DX12 titles. And 2180 DirectX 11 titles.
As for DX12, I'm pretty sure it took a while for every review to have at least one DX12 title, especially when even high-profile sites chose to use DX11 in many titles instead of DX12 because it performed (usually slightly) better.
 
Very few people are dismissing it. Being of the opinion that its contribution to visual quality hasn't yet been worth the performance hit is a completely reasonable view.

It is a completely reasonable view.

Backing up Nvidia because an independent reviewer doesn't meet their criteria for RT content is absolutely unreasonable.

Not sure where you see this happening. The vast majority of people in this thread seem to agree that Nvidia was wrong for strong-arming HWUB. That doesn’t mean there can’t be debate on what constitutes a reasonable graphics card review in 2020.

I'll tell you something I don't remember, though: people shitting on any review that didn't have enough games utilizing async compute via "AMD constructs." The overly aggressive Nvidia fans consistently display more bad faith than any group I've ever come across in tech.

Well for one there were far fewer games using async compute than RT in a similar period. I think it was Ashes and Hitman for a long time, and the former barely counts as a game, so it never rose to the same level of importance. For example, async was never a marketing feature like RT is, and it was completely ignored by the console crowd whereas RT isn’t.

Btw Nvidia’s haters are just as “overly aggressive” as its fans. That’s not a very helpful observation.
 
No, but in a video of 40 benchmarks, several of which are of RT-enabled games, you could at least include a few RT results for those games. Both GPUs in question can enable RT. Both GPUs in question are high end. PC gamers tend to want to run at the highest settings possible on a high-end GPU. So showing results for games at lower than their maximum settings, just because the lowered settings significantly favour one vendor over the other, is blatantly misleading IMO.
There is nothing misleading about their benchmarks. The lower setting is to have playable framerates. Otherwise it's simply useless anyway. In a way, I am pro max settings. Max everything out for Cyberpunk 2077 at 4K for the RTX 3070 and see how useful those numbers are. Maybe then people will stop nitpicking and using baseless accusations as a way to promote their biased agenda.

Why should a video that focuses on comparing as many as 40 games have to throw in RT results? WHY? There are many channels out there that tested specifically for RT. Let me ask you something... Who else out there tests over 40 games? If there is any other channel or website that does this regularly, I don't know them. Do you? That is quite valuable information and is a LOT of work. No one else does this, yet, if you want RT content, they have a few dedicated videos for that, and as mentioned earlier, there are a bunch of other online resources that do.

Whining about not including RT in these videos of 40 games is EXACTLY the problem. I don't see anyone here saying that they should have included SAM results in the same video. I didn't see the same thing for async compute either, nor DX10.1, nor rapid packed math, etc. The media/reviewers are expected to cater to nVidia at every turn, just because it is nVidia tech. This very loud and extremely annoying group of people wants RT to be mentioned at every turn, not because it is useful information, but because they want to hammer nVidia's tech into everyone else's mind, pretty much as propaganda to force it down others' throats, and I have a huge problem with that. Especially because it is still not a useful feature in practice.
 
There is nothing misleading about their benchmarks.

Of course there is. Anyone reading that review in isolation to help them decide between the 2 GPUs would conclude that the 6800 is universally faster and get it. But in fact they would have purchased the slower GPU for any game that uses Ray Tracing. Which is likely to be a lot during the useful life of the card.

The lower setting is to have playable framerates. Otherwise it's simply useless anyway.

This simply isn't true. RT is plenty playable on a 6800 at resolutions lower than 4K. 4K isn't obligatory, especially on a card in this performance category. I'm sure a lot of PC gamers would rather run at 1440p with Ray Tracing than 4K without. Especially if the game can upscale to native res as many can these days.

In a way, I am pro max settings. Max everything out for Cyberpunk 2077 at 4K for the RTX 3070 and see how useful those numbers are.

It will net you a locked 30fps. Nothing wrong with that. Or you could drop down to 1080p for a near locked 60fps.

Maybe then people will stop nitpicking and using baseless accusations as a way to promote their biased agenda.

You need to stop taking this so personally. I'm in no way biased towards AMD. I own an AMD CPU and have owned AMD GPUs in the past. Prior to their launch I was completely open to the possibility of my next GPU being RDNA2 based. But if I'm buying a GPU in this price/performance category, I'm not willing to compromise on what will likely be one of the flagship graphical features for the next few years.

Why should a video that focuses on comparing as many as 40 games have to throw in RT results? WHY?

Because not doing so is ignoring a major aspect of both GPUs' performance potential to the obvious advantage of one of them. I'm pretty sure that's already been mentioned above.

There are many channels out there that tested specifically for RT. Let me ask you something... Who else out there tests over 40 games? If there is any other channel or website that does this regularly, I don't know them. Do you?

That's completely irrelevant. They didn't even need to add any other games to the review. Just test those games that do support RT with it turned on. It's highly likely that 6800 owners playing those games are going to want to play them with all settings turned up as high as they can be.

Whining about not including RT in these videos of 40 games is EXACTLY the problem. I don't see anyone here saying that they should have included SAM results in the same video.

It's entirely different. SAM is only available to a small subset of people that would be using that GPU. As a Zen2 owner, for example, I can't take any advantage of it, so it's irrelevant. RT is available to 100% of users of those 2 GPUs, so there's simply no excuse for not showing how they perform with it.
 
Firstly, anyone can quote mine and misrepresent a position. If you're going to do that, I will not discuss anything with you.
Ok. Let's tackle some things.

Many gamers and developers disagree with you.
The number of people agreeing on something has no bearing on whether it is true or not. That is a fact.

Developers like RT because it eases their development. That does not mean that it is actually practical for the end user at this point in time with the current hardware.

Gamers that think RT is a practical feature at this point in time are either content with 30fps, simply blind nVidia followers, or unknowledgeable people who fall for marketing. Just to copy one person with an actual RTX card:

"I've got a 3080, I tried the whole DLSS and RT, play with it enabled in Cyberpunk BUT at 1400p I still can't always maintain 60fps with DLSS set to quality, I would not use any other setting because you can absolutely tell the difference so for me RT is a bust atm. I paid £875 for my GPU and with the Quality setting in DLSS which basically renders the game at 1100p? I don't always get 60fps!! So if anyway here is buying a GPU that is not a 3080 don't bother with the RT because even DLSS wont save you"
https://www.techspot.com/community/...-41-game-benchmark.266962/page-3#post-1862014

They ended up maxing everything else out and disabling RT. And that is the most practical way to still use these current cards, even if they support RT. And if you think that performance will get better over time with these cards, you're delusional. At best we're going to get better visuals with the same large performance drop. More likely is a higher performance cost for better visuals. And considering the current performance is barely playable (if even that), it's not worth the cost.

Of course there is. Anyone reading that review in isolation to help them decide between the 2 GPUs would conclude that the 6800 is universally faster and get it. But in fact they would have purchased the slower GPU for any game that uses Ray Tracing. Which is likely to be a lot during the useful life of the card.
See what I wrote above.

This simply isn't true. RT is plenty playable on a 6800 at resolutions lower than 4K. 4K isn't obligatory, especially on a card in this performance category. I'm sure a lot of PC gamers would rather run at 1440p with Ray Tracing than 4K without. Especially if the game can upscale to native res as many can these days.
Why would RT be obligatory over 4K?

It will net you a locked 30fps. Nothing wrong with that. Or you could drop down to 1080p for a near locked 60fps.
DLSS Performance has a large impact on image quality. It's not exactly viable to call it 4K anymore.
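
To put rough numbers on that (going by the commonly cited per-axis DLSS scale factors, about 2/3 for Quality and 1/2 for Performance): at a 4K output, Quality renders internally at roughly 2560×1440, around 44% of the native pixel count, while Performance renders at 1920×1080, only 25% of the native pixels.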

You need to stop taking this so personally. I'm in no way biased towards AMD. I own an AMD CPU and have owned AMD GPUs in the past. Prior to their launch I was completely open to the possibility of my next GPU being RDNA2 based. But if I'm buying a GPU in this price/performance category, I'm not willing to compromise on what will likely be one of the flagship graphical features for the next few years.
Instead, you'll buy a GPU in that price category to run games at 1080p just to turn on RT. That's much better, right...?

Because not doing so is ignoring a major aspect of both GPUs' performance potential to the obvious advantage of one of them. I'm pretty sure that's already been mentioned above.
I guess then they should have also included things like power consumption numbers, maybe with Radeon Chill enabled... Overview of the software UI, the overclocking options... Input lag...
At what point does it become ridiculous to include things? Because if you're going to single out one specific feature from one specific vendor, the only thing that screams out is bias. And if you don't see that, I can't help you.

That's completely irrelevant. They didn't even need to add any other games to the review. Just test those games that do support RT with it turned on. It's highly likely that 6800 owners playing those games are going to want to play them with all settings turned up as high as they can be.
Testing 40 games is irrelevant now... Ok... I guess we're done here.

It's entirely different. SAM is only available to a small subset of people that would be using that GPU. As a Zen2 owner, for example, I can't take any advantage of it, so it's irrelevant. RT is available to 100% of users of those 2 GPUs, so there's simply no excuse for not showing how they perform with it.
You really can't see the irony here...? You pretend that somehow there are a bunch of people that have RT-enabled cards... Most people won't spend more than $300 on a graphics card... So that already makes all RT cards a niche. And that would be assuming that these cards are actually practical for using RT at all, for all those people that do spend so much money on them... Considering Intel CPUs can also use SAM, and support has pretty much already been enabled, I wouldn't be surprised if that subset is larger than the number of people that can use RT.

It's quite obvious that if SAM was nVidia's tech, you would be begging for Hardware Unboxed to include it in their benchmarks too. Because neither of them is more or less niche than the other.


I'm done here. Things are obvious. One can't expect politicians to speak out against their own party. And this thread is a green party. Goodbye.
 
Didn't know Steve was posting here.

Anyway, happy new year everyone, take a deep breath, gpus are not a very important topic all things considered ;)
 
Anyone downplaying RT in 2020 or 2021 is not keeping up and is thus irrelevant.

Microsoft provided the API for DirectX Raytracing (DXR).
Vulkan practically copied Microsoft's API and implemented it.

The software side is there.
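
Just to ground that claim (a hedged sketch of mine, not from the thread): the Khronos ray tracing extensions mirror DXR's split between acceleration structures and ray tracing pipelines, and an engine can probe for them with stock Vulkan calls. SupportsKhrRayTracing is a made-up helper name, and the VkPhysicalDevice is assumed to have been selected already.

Code:
// Sketch: check whether a Vulkan device exposes the Khronos ray tracing
// extensions that mirror DXR's acceleration-structure / RT-pipeline split.
// `gpu` is assumed to be a VkPhysicalDevice picked by the caller.
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

bool SupportsKhrRayTracing(VkPhysicalDevice gpu)
{
    // Standard two-call pattern: get the count, then fill the list.
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());

    bool accel = false, pipeline = false;
    for (const auto& e : exts) {
        if (std::strcmp(e.extensionName, VK_KHR_ACCELERATION_STRUCTURE_EXTENSION_NAME) == 0)
            accel = true;
        if (std::strcmp(e.extensionName, VK_KHR_RAY_TRACING_PIPELINE_EXTENSION_NAME) == 0)
            pipeline = true;
    }
    return accel && pipeline; // both are required for hardware RT pipelines
}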

NVIDIA does Tier 3 Raytracing in hardware (Ampere, Turing) and shader-based Raytracing (Pascal).
AMD does Tier 2 Raytracing in hardware (RDNA2) on PC and consoles.
Intel will support DXR in their upcoming GPUs.

The hardware side is there.
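
And on the D3D12 side (again a sketch of mine, same caveats), the reported DXR tier can be queried directly. Note that the API numbers its tiers 1.0/1.1, which is a different scale from the "Tier 2/3" hardware classification above; this assumes a Windows 10 SDK recent enough to define D3D12_FEATURE_DATA_D3D12_OPTIONS5.

Code:
// Sketch: ask D3D12 what raytracing tier the default adapter reports.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib") // MSVC-style linking for brevity

int main()
{
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return 1; // no D3D12-capable adapter

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts, sizeof(opts))))
        return 1; // runtime too old to know about OPTIONS5

    if (opts.RaytracingTier == D3D12_RAYTRACING_TIER_NOT_SUPPORTED)
        std::puts("DXR: not supported in hardware");
    else if (opts.RaytracingTier == D3D12_RAYTRACING_TIER_1_0)
        std::puts("DXR: tier 1.0");
    else
        std::puts("DXR: tier 1.1 or newer");
    return 0;
}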

Raytracing is coming whether you like it or not.

But it is always the SAME whine with new technology.
How long were DirectX 10 GPUs out before any games were out with support?

We have been through this SOOOOOOOOOOO many times.

Hardware Transform&Lighting
Anisotropic Filtering
Anti Aliasing
Shader Models (SM1.0, SM2.0, SM3.0 etc.)
DirectX (DX 9.0(a/b/c), DX10(.1), DX11(.1) & DX12(.1))

Everything we have done in graphics has been to EMULATE raytracing to give better visual fidelity.

Now it is here, but since performance is not even... some people are sour grapes, as always.

TL;DR
Do benches without the latest features in the latest games = you have declared yourself obsolete and irrelevant.
 
Developers like RT because it eases their development. That does not mean that it is actually practical for the end user at this point in time with the current hardware.

That would be true if they weren’t also providing non-RT implementations of shadows, GI and reflections. Adding RT to your game is more work, not less.

Gamers that think RT is a practical feature at this point in time are either content with 30fps, simply blind nVidia followers, or unknowledgeable people who fall for marketing.

Do you honestly believe that RT in games today is only playable at 30fps? See, that’s what happens when reviewers don’t cover stuff properly. People end up grossly misinformed.
 
Very few people are dismissing it. Being of the opinion that its contribution to visual quality hasn't yet been worth the performance hit is a completely reasonable view. Backing up Nvidia because an independent reviewer doesn't meet their criteria for RT content is absolutely unreasonable. I'll tell you something I don't remember, though: people shitting on any review that didn't have enough games utilizing async compute via "AMD constructs." The overly aggressive Nvidia fans consistently display more bad faith than any group I've ever come across in tech.

I take it you haven't interacted with Intel fans for the past 3-4 years?
 