Nvidia giving free GPU samples to reviewers that follow procedure

So reviews should be dictated by polls now? Reviews should be objective, not subjective or under the influence of the reviewer's personal opinion.
I wasn't talking about what should or should not be included in reviews, but that their attitude towards it is not as insane as the post I quoted suggested.
 
They are consistently beating the drum that RT is not relevant or important, even when consoles are sporting it and PC games left and right are supporting it.

That is fine in my book. 20 games per year isn't left and right, supporting it does not mean it is an attractive choice, and consoles never had a big influence on PC gaming.
 
That is fine in my book. 20 games per year isn't left and right, supporting it does not mean it is an attractive choice
But it's an important factor in the visual experience of high end GPUs, one that shouldn't be skipped or undermined under any circumstances.

consoles never had a big influence on PC gaming
PC gaming leads the way, and this is a PC gaming feature. And it's a big part of Ultra graphics right now; it can't be ignored on a whim.
 
That is fine in my book. 20 games per year isn't left and right, supporting it does not mean it is an attractive choice, and consoles never had a big influence on PC gaming.
The problem is that top AAA blockbuster games are using RT and DLSS. You can't ignore it.
Another issue: Hardware Unboxed never talked about Minecraft, which is the second most popular game ever, and its stunning path-traced RT patch (out of beta this week). Of course, it kills AMD, so again they simply ignore it...
 
Cyberpunk? One game, #1 spot on Steam. Have you found out where Godfall and Dirt 5 are?

I mean, people are buying the PS5 and Series X and they don't even have five "true" games...

Is it really a reply to me?

But it's an important factor in the visual experience of high end GPUs, one that shouldn't be skipped or undermined under any circumstances.

Good thing they did not skip it. I still don't see why it is that important.

PC gaming leads the way, and this is a PC gaming feature. And it's a big part of Ultra graphics right now; it can't be ignored on a whim.

Good, so no need to drag consoles into this.
 
Another issue: Hardware Unboxed never talked about Minecraft, which is the second most popular game ever, and its stunning path-traced RT patch (out of beta this week). Of course, it kills AMD, so again they simply ignore it...

Released December 8th and you already demand it in reviews?
 
I find the ray tracing issue a bit interesting, and I wonder how much of it just comes down to how the effects are classified, basically as "max settings+", or are hardware locked. Would the sentiment be slightly different if they were rolled under "max settings"? Would it be different if all cards were supported even if there was no hardware acceleration? And of course there's the vendor bias angle.

Let's stick with lighting, for instance. When ambient occlusion was first rolling out, and later SSAO, and later still HBAO/HDAO/etc., you could argue those were limited in rollout, had a tremendous performance hit, and arguably made little visual difference. But was there mass sentiment in wanting to not use them in tests until wide adoption? Or were they tested regardless just because they were bundled under "max settings"?

Or how about more recently with the shift to DX11+ APIs, whether that be DX12 or vendor-specific ones like Mantle? The DX12 adoption rate certainly wasn't high early on, but there was a lot of pressure even then to test with early synthetics. Mantle support never ended up higher than ray tracing, and it never ended up anywhere near DLSS either. Yet, at least to my recollection, there was a heavy push that one vendor had better low-level API performance than the other at the time.

Also, what if, say, 1 or 2 games over the next 2 years require >12GB for the max texture setting? Are those games relevant at that setting?
 
HardwareUnboxed tested Metro Exodus and Shadow of the Tomb Raider without RT at all; what is the justification for that move? It's standard practice in reviews to run games at max settings, and RT is the ultimate max setting in any PC game right now; excluding it from testing is plain stupid and ridiculous.
The integrated benchmark of Metro Exodus especially is a bad example though, because it dials back some options from "Extreme" and even "Ultra", if you will: Overall Quality is back to Ultra (from Extreme) and Shading Rate % back to 100 (from 200); additionally, it enables DLSS as a default in the RTX preset.
 
Released December 8th and you already demand it in reviews?
The RTX beta version has been out for months and it works on both AMD and Nvidia. Other reviewers have no trouble including it in their benchmark suites (Linus, Gamers Nexus, etc.).
 

And it starts.


Are you saying that Cyberpunk has a bunch of unoptimized brute-force RT effects requested by Nvidia to harm AMD? Since we all agree conspiracy theories are stupid, I'm sure you have information to support that position.

So nvidia acted nefariously (i.e. castrating performance on AMD and previous-gen nvidia cards) on past occasions when they had a specific advantage in geometry processing.
Now they have a specific advantage in RT performance, and Cyberpunk 2077 is already showing pretty terrible RT performance across all RT-enabled GPUs, with the advantage that it hurts Ampere GPUs less.
Just like HairWorks in The Witcher 3 killing performance on all GPUs except the freshly released Maxwell at the time.

And when faced with these facts, you still want proof of... what exactly?


You're expecting someone to leak a secret tape recording of Jen-Hsun Huang telling his plans out loud, like this was a comic book or Marvel movie? A video recording of an nvidia engineer pointing a gun at a CDPR dev to force him to write some code?
We're observing:
- an Nvidia-funded game like Cyberpunk getting an astronomical use of raytracing effects, to the point of making it barely playable on $1500 graphics cards.
- a hardware reviewer getting blacklisted from GeForce FE review material for not focusing their reviews on raytracing performance.

What other realistic fact do you need to observe to declare this isn't just a conspiracy theory? More examples of games getting performance-killing raytracing implementations? Exactly how many more?
I suggest we set those goalposts right now, so they can't be changed on either side down the road.

BTW, this is nothing like Crysis and Cyberpunk doesn't compare to it.

The "can it play Crysis" meme comes from the first game that had spectacular visuals at the cost of terrible performance among all cards. It didn't run any code from IHVs and it ran pretty bad on all architectures. Before the RV770 came out it ran better on G80/G92 cards but those were clearly superior to R600/RV670 cards across the board. Cyberpunk isn't punishing all graphics cards by default on regular rasterization, like Crysis was.



Cyberpunk is punishing all graphics cards on the (lack of sufficient) acceleration of one very specific feature where nvidia excels on their latest architecture. It's exactly like what they did with geometry performance on Maxwell (HairWorks in The Witcher 3, super-detailed concrete slabs in Crysis 2, geometry-based god rays in Fallout 4, etc.).
 
I find the ray tracing issue a bit interesting, and I wonder how much of it just comes down to how the effects are classified, basically as "max settings+", or are hardware locked. Would the sentiment be slightly different if they were rolled under "max settings"? Would it be different if all cards were supported even if there was no hardware acceleration? And of course there's the vendor bias angle.

The lack of support on AMD hardware certainly is/was a major factor. But I think the biggest reason is that RT had such an outsized performance hit that it really couldn't be used as a default option in reviews. The best reviewers could do in that situation was to show two sets of numbers, with RT on and off.

The subjective opinions about the benefits of RT are most certainly driven to a large extent by vendor bias. Many of the people sharing those opinions (myself included) have never even experienced the tech firsthand. Hopefully, now that AMD is in the game, we'll see less of that nonsense, but it won't go away completely.
 
The RTX beta version has been out for months and it works on both AMD and Nvidia. Other reviewers have no trouble including it in their benchmark suites (Linus, Gamers Nexus, etc.).
Yes, a beta version. Since when are people expecting beta builds to be reviewed anyway? If it weren't for RTX, it wouldn't have been reviewed at all. What next, demanding games be reviewed the second devs have gotten them to say Hello World! on screen?

Source? On what basis are you evaluating the efficiency of their implementation?
Every review? He said performance, not efficiency.
 
The real reason for the blacklisting is that HWU showed the RT performance (impact) on the 3000 and 2000 series is almost exactly the same for the vast majority of titles.
 
Yes, a beta version. Since when are people expecting beta builds to be reviewed anyway? If it weren't for RTX, it wouldn't have been reviewed at all. What next, demanding games be reviewed the second devs have gotten them to say Hello World! on screen?
It's not a standard game. It's the second best-selling game ever, a game played by 73 million players that got up to half a million concurrent online players. If it's not relevant, then nothing is.
 
Why should YouTubers get free hardware? Or anyone, for that matter? AMD, Nvidia, Intel, I don't care, it's all publicity. No point in giving hardware to people not looking at it the way you want them to, by "not caring" about the main point of your product... Don't review graphics cards if you don't care about RT in 2020 :/

Now, saying "I don't think it's important" while showing info about it is perfectly fine, but not even benchmarking it like they did in the 6800 XT review is a joke...
 
He said the game is barely playable. That is hysterical nonsense. You don't have to play it at 4K RT Ultra.
You should have included that part in the quote; I thought you were only replying to the part in the quote. You can't play it at 4K RT Ultra on anything out there; even NVIDIA's own benches put it at a very cinematic 22 FPS on a 3090 + i9-10900K.

It's not a standard game. It's the second best-selling game ever, a game played by 73 million players that got up to half a million concurrent online players. If it's not relevant, then nothing is.
Standard or not, beta game reviews are an exception to the norm, and RTX is the only reason the Minecraft beta got into reviews. The game has gone through many other big updates and changes in beta builds before too, none of which have been reviewed.
 