Value of Hardware Unboxed benchmarking

I have never watched any LTT stuff so I can't comment on their testing, but both HUB and GN have always delivered trustworthy data and are quick to correct any mistakes.

Steve did it for you. There's like 30 minutes of just going over their incompetence, and it's largely just from the past couple of months.
 
And man, are some of the fuckups here outrageous - and this is largely recent stuff! Not to mention some eye-popping ethical concerns - such as the pretty disgusting Billet Labs debacle. Jesus.
He said he couldn't afford to pay someone to redo the testing correctly and then auctioned off the prototype. What an absolute power move :LOL:
 
Seems a bit dickish to me.

Verging on, if not outright, criminal. Company property was loaned to them, it was requested to be returned - twice - and Linus gave it away. There is also the concern that a competitor may have obtained it. I doubt they have the funds for any legal action against LTT, but it's not outrageous to think they have a case.
 
Guess this thread is as good a place as any:


Just a brutal takedown. I've never regarded LTT as anything other than entertainment (...of sorts?) at best, and find Linus generally annoying so don't watch it much regardless. Even so, I kind of naively assumed they somewhat viewed themselves in at least the 'edutainment' category as well, which makes it so utterly bizarre that they would choose to actually touch all this off by 'calling out' Gamers Nexus's testing methodology.

And man, are some of the fuckups here outrageous - and this is largely recent stuff! Not to mention some eye-popping ethical concerns - such as the pretty disgusting Billet Labs debacle. Jesus.

Dammit, didn't think to check this thread for the video!
 
Perhaps the part about them testing with the wrong GPU and saying it's crap was just incompetence, but the part where they sold the prototype seems like comic-book-level villainy. :ROFLMAO:
 
Hardware Unboxed put up a response viddy to Steve's a few hours ago:


Breaks down how you'd expect: Tim actually critical, Steve jumping in to water the critique down. His dumb "hey everyone makes mistakes, we do too" defense is a common one I've seen from those desperate enough to try and paint LTT in the best light; thankfully Tim corrected him that it's not simply about fuckups, which yes, everyone has - it's how egregious they are, how frequent, and the laughably weaksauce way they're 'corrected', if they are at all. There's just no indication Linus even recognizes the scope of the problem and actually wants to change. Even GN's plea that this hopefully pushes Linus to adopt better practices is incredibly genteel, and their framing of this as a result of Linus being a slave to the content mines is actually very generous!

Linus's response to this cements my opinion that he's basically the same guy he started out as - a marketing dweeb for a shady online retailer who makes silly tech videos that are largely promotional fluff, like the majority of the tech 'press', and who is wholly unsuited in temperament and ethics to move toward actual serious testing and product evaluation. This is wholly a Linus problem.

Steve ending the video by basically thanking LTT for popularizing PC tech and calling Linus a 'net benefit' is pretty pathetic. Overall a weak response that didn't really address GN's critique in any substantial way, and, at least from Steve, seemed mainly geared toward getting everything back to the way it's always been in the hope that Linus won't be too mad at him.

Not to mention they didn't touch on the egregious conflicts of interest brought up in the video either. It's not simply calling out 'tech mistakes' ffs. Steve's basic summary of the event - "Make sure your house is in order before calling someone out!" - paints this whole thing as stemming from an inappropriate comment by an LTT staffer, plus a bit of 'hey, reviewing is hard, get your ducks in a row before casting stones' - but that's not the issue. That comment may have been the final straw, but it's clear from GN's video that this isn't just interpersonal beef; there's a longstanding history of negligence, and that should bother you if this is the most popular channel in the same space you work in and want respect for. Your response shouldn't be "Well, all in all, Linus has made things better"; you should care about what kind of light this paints the YouTube tech industry in.

Steve responds to Linus's 'response':

 

So apparently Nvidia doesn't have "a driver overhead problem due to s/w scheduling" (or w/e b.s. Steve said back when he benched five games and made wide reaching claims).

His conclusion was that "on an individual level, the results were really interesting" and that "Nvidia's overhead issue really did rear its ugly head in that example" - "that example" being Hogwarts Legacy, the one game that showed a difference.

So it still does have an 'overhead issue', because they noticed a difference in one game out of the 19 they tested. 🙄
 
His conclusion was that "on an individual level, the results were really interesting" and that "Nvidia's overhead issue really did rear its ugly head in that example" - "that example" being Hogwarts Legacy, the one game that showed a difference.

So it still does have an 'overhead issue', because they noticed a difference in one game out of the 19 they tested. 🙄
"On an individual level" there were games in his own benchmark where the results were the opposite suggesting that AMD also has "an overhead issue" but since he sees only what his self confirmation bias allows him to these were just ignored, as usual.
 

So apparently Nvidia doesn't have "a driver overhead problem due to s/w scheduling" (or w/e b.s. Steve said back when he benched five games and made wide reaching claims).

I don't think Steve's claim that Nvidia has a driver overhead problem because of "software scheduling" has any technical basis that can be substantiated elsewhere. I don't remember where this claim came from. I think it came from a discussion he had during an interview, or something like that.
 
There are differences in the way Nvidia and AMD drivers load the CPU - or at least there are/were for DX11.

Here's an in-depth article from Intel on the subject. I read it years ago and can't remember exactly what it says, but yeah, there can be performance implications.
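If anyone wants to poke at this themselves, the usual approach is to force a CPU-bound scenario (low resolution, fast GPU) and compare frame-time captures between cards on the same CPU. Here's a rough sketch of the number-crunching side only, assuming a PresentMon-style CSV with the classic MsBetweenPresents column - newer capture tools may name the frame-time column differently, so adjust accordingly:

[CODE]
// Rough sketch: average FPS and 1% lows from a frame-time CSV capture.
// Assumes a column literally named "MsBetweenPresents"; that name (and the
// overall format) depends on your capture tool.
#include <algorithm>
#include <cstdio>
#include <fstream>
#include <numeric>
#include <sstream>
#include <string>
#include <vector>

int main(int argc, char** argv) {
    if (argc < 2) { std::fprintf(stderr, "usage: %s capture.csv\n", argv[0]); return 1; }
    std::ifstream in(argv[1]);
    std::string line;
    if (!std::getline(in, line)) return 1;                 // header row

    int col = -1, i = 0;                                   // locate the frame-time column
    std::stringstream header(line);
    for (std::string cell; std::getline(header, cell, ','); ++i)
        if (cell == "MsBetweenPresents") col = i;
    if (col < 0) { std::fprintf(stderr, "frame-time column not found\n"); return 1; }

    std::vector<double> ms;
    while (std::getline(in, line)) {
        std::stringstream row(line);
        std::string cell;
        for (int c = 0; std::getline(row, cell, ','); ++c)
            if (c == col) ms.push_back(std::stod(cell));
    }
    if (ms.empty()) return 1;

    double avgMs = std::accumulate(ms.begin(), ms.end(), 0.0) / ms.size();
    std::sort(ms.begin(), ms.end());                       // slowest frames end up last
    size_t n = std::max<size_t>(1, ms.size() / 100);
    double worstAvg = std::accumulate(ms.end() - n, ms.end(), 0.0) / n;

    std::printf("average FPS: %.1f, 1%% low FPS: %.1f\n", 1000.0 / avgMs, 1000.0 / worstAvg);
}
[/CODE]

Run it on captures from two GPUs in the same CPU-bound scene and the gap between the averages is roughly the driver/CPU overhead difference being argued about.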

 
There are differences in the way Nvidia and AMD drivers load the CPU - or at least there are/were for DX11.

Here's an in-depth article from Intel on the subject. I read it years ago and can't remember exactly what it says, but yeah, there can be performance implications.


I'm sure there are large differences in the architecture of the drivers. I just think that what HUB specifically kept referring to as Nvidia's overhead from having a "software scheduler" didn't really have any basis in reality - at least not one I could find substantiated anywhere else.
 
I don't think Steve's claim that Nvidia has a driver overhead problem because of "software scheduling" has any technical basis that can be substantiated elsewhere. I don't remember where this claim came from. I think it came from a discussion he had during an interview, or something like that.

It likely came from here; he boosted this video when he discovered it:


I also posted about this when I encountered it years ago and was quickly debunked here, but I've noticed it got quite a bit of traction in some circles and people basically repeat it verbatim.

(Also the author is a complete nutter going by his Twitter, antivax/climate change 'skeptic' among a host of general wingnut beliefs)
 
I also posted about this when I encountered it years ago and was quickly debunked here, but I've noticed it got quite a bit of traction in some circles and people basically repeat it verbatim.
The myth originated many, many years ago. NVIDIA did their scoreboarding in hardware with their scalar architectures such as Tesla (GTX 200) and Fermi (GTX 400), while AMD relied on the software compiler to do scoreboarding with their VLIW5 and VLIW4 architectures. That approach wasn't as efficient at utilizing the hardware as NVIDIA's, which explained why AMD GPUs fell behind in compute and several other workloads at the time. Tech-savvy people got the idea that hardware is far better than software and took note. In the end it wasn't working well for AMD, so they went back to hardware with the GCN iterations.

NVIDIA, on the other hand, made a rather bizarre switch in an effort to increase power efficiency: they made Kepler a mix of hardware and software scoreboarding, and also made a subset of the cores (1/3 of them) superscalar instead of purely scalar, relying on the software compiler to feed that subset (which is among the reasons Kepler aged so horribly). NVIDIA quickly went back to full hardware in Maxwell, Pascal, Turing and so on. But the press didn't seem to pay much attention to that change, and some of the people who took note of the hardware > software point didn't seem to notice it either, so they kept repeating that NVIDIA still uses software scoreboarding to explain some of the early differences in DX12 performance between AMD and NVIDIA, and it has stuck ever since.

In reality, the problem is related to a limitation of the API itself: the DX12 resource binding model is a poor fit for NVIDIA hardware. It was designed primarily for Mantle-style (AMD) hardware, so on other hardware it generates tons of extra calls and descriptor instructions, adding more CPU overhead on NVIDIA hardware. Developers need to take care of that when coding with DX12 to circumvent these pitfalls, but few actually pay full attention.

You can read about it in detail down below.
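To make the "developers need to take care" part concrete, the example usually given is per-draw binding: pointing a descriptor table at every draw's resources versus binding frequently changing constants directly as a root CBV. Just a sketch - the root parameter indices, the struct layout and all the surrounding setup are assumptions on my part, not a complete program:

[CODE]
// Sketch of two ways to feed per-draw data in D3D12. Root signature layout,
// command list recording state and GPU addresses are assumed to exist already;
// the parameter indices below are hypothetical.
#include <d3d12.h>
#include <vector>

struct DrawItem {
    D3D12_GPU_VIRTUAL_ADDRESS perDrawConstants;   // GPU address of this draw's constants
    D3D12_GPU_DESCRIPTOR_HANDLE descriptorTable;  // start of this draw's descriptor table
    UINT indexCount;
};

// Path A: point at a fresh descriptor table for every draw. Each draw adds
// CPU-side descriptor management work, which is the kind of per-draw overhead
// being discussed.
void RecordWithDescriptorTables(ID3D12GraphicsCommandList* cl,
                                const std::vector<DrawItem>& draws) {
    for (const DrawItem& d : draws) {
        cl->SetGraphicsRootDescriptorTable(/*RootParameterIndex=*/1, d.descriptorTable);
        cl->DrawIndexedInstanced(d.indexCount, 1, 0, 0, 0);
    }
}

// Path B: bind the frequently changing constants as a root CBV instead, so
// there's no per-draw descriptor table churn.
void RecordWithRootCBVs(ID3D12GraphicsCommandList* cl,
                        const std::vector<DrawItem>& draws) {
    for (const DrawItem& d : draws) {
        cl->SetGraphicsRootConstantBufferView(/*RootParameterIndex=*/0, d.perDrawConstants);
        cl->DrawIndexedInstanced(d.indexCount, 1, 0, 0, 0);
    }
}
[/CODE]

The second path trades a fatter root signature for less descriptor churn per draw, and how much that churn actually costs varies by vendor - which is the whole point of the "few actually pay full attention" remark.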
 
NVIDIA, on the other hand, made a rather bizarre switch in an effort to increase power efficiency: they made Kepler a mix of hardware and software scoreboarding, and also made a subset of the cores (1/3 of them) superscalar instead of purely scalar, relying on the software compiler to feed that subset (which is among the reasons Kepler aged so horribly). NVIDIA quickly went back to full hardware in Maxwell, Pascal, Turing and so on.
Tesla and Fermi had h/w for instruction stream reordering; this was dropped in Kepler+ because it didn't provide much benefit with simple MADD pipelines and was just eating power. There are no downsides to this move, only advantages, nor does it have any relation to "scheduling" at all. I don't think AMD ever had anything like this in h/w, so they aren't affected by that.

There are some differences in how IHVs handle command queues, which may be attributable to some differences in scheduling, but this alone doesn't say much about "driver overhead". Also, a driver doing more work may in fact be beneficial in the end, as that work may result in shaders better suited to the GPU h/w, for example. So "driver overhead" in itself isn't always a bad thing either.
 