Value of Hardware Unboxed benchmarking

NVIDIA replaced the 5800 Ultra with the 5900 Ultra in May 2003, so people believed that most of the problems had been fixed. But with the HL2 presentation it was clear that the whole lineup wasn't ready for Pixel Shader 2.0-only workloads.
 
By their logic, Hardware Unboxed would have recommended NVIDIA's FX series over ATI, since there were so few DX9 games present, and it wasn't known whether DX9 would be worth it.

They actually did even worse: they completely ignored DXR/DX12U and recommended GPUs that didn't even support it (RDNA1) over GPUs that did.
 
The 2000s were really the heyday of graphics reviews. Those guys made an attempt to actually understand the hardware and explain the technical details to their audience. Aside from DF, I can't think of a single outfit that even attempts that today. It's just bar graphs and theatrics, with emphasis on the theatrics.
 
I feel the current situation is partly due to IHVs preferring to withhold technical information and instead supply "easy to understand" but often misleading marketing materials.
The h/w got complex enough that reviewers need some sort of proper technical education in silicon engineering to understand how it works.
Even the h/w designers themselves sometimes have "eureka!" moments when working with the h/w they made, figuring out how to use it in different, more efficient ways which weren't thought of at the design stage.
So I think it's not only the reviewer landscape that changed, but also the hardware, and the market expanded too.
 
By their logic, Hardware Unboxed would have recommended NVIDIA's FX series over ATI, since there were so few DX9 games present, and it wasn't known whether DX9 would be worth it.

They actually did even worse: they completely ignored DXR/DX12U and recommended GPUs that didn't even support it (RDNA1) over GPUs that did.
For good reasons.
 
You clear your shader cache and compare against the same sections you played previously to determine whether stuttering has improved. This is really basic stuff.
HUB has subpar technical knowledge, and they demonstrate this repeatedly. The only thing they are really good at is testing an insane number of GPUs/CPUs; that's how they made their reputation. Don't expect anything deeper or more insightful than that.
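The comparison described above, re-testing the same section and checking whether the stutter spikes shrank, can be sketched in a few lines. This is a minimal illustration, not anyone's actual tooling: the frametime lists are made-up data standing in for captures from a tool like PresentMon, and `stutter_report` is a hypothetical helper.

```python
def percentile(samples, p):
    """Nearest-rank percentile of a list of frametimes in milliseconds."""
    ordered = sorted(samples)
    k = max(0, min(len(ordered) - 1, round(p / 100 * (len(ordered) - 1))))
    return ordered[k]

def stutter_report(frametimes_ms):
    """Summarize one run: average frametime and the worst 1% (p99).

    A high p99 relative to the average indicates hitching/stutter."""
    avg = sum(frametimes_ms) / len(frametimes_ms)
    return {"avg_ms": avg, "p99_ms": percentile(frametimes_ms, 99)}

# Made-up captures of the same gameplay section: first run with occasional
# 60 ms hitches (e.g. shader compilation), second run after the patch.
first_run = [16.7] * 95 + [60.0] * 5    # visible hitching
second_run = [16.7] * 99 + [25.0] * 1   # far fewer spikes

before = stutter_report(first_run)
after = stutter_report(second_run)
print(f"p99 before: {before['p99_ms']:.1f} ms, after: {after['p99_ms']:.1f} ms")
# → p99 before: 60.0 ms, after: 16.7 ms
```

The point of the p99 metric here is that average FPS barely moves when a handful of frames hitch, which is exactly why stutter comparisons need percentile or frametime-graph data rather than an average bar.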
 
HUB has subpar technical knowledge, and they demonstrate this repeatedly. The only thing they are really good at is testing an insane number of GPUs/CPUs; that's how they made their reputation. Don't expect anything deeper or more insightful than that.

They also block people on Twitter who prove them wrong and provide sources as to why they're wrong.

I've lost all respect for them over the last 6 months, as I can see more and more holes in their testing and opinions.
 
The latest TLOU patch makes some strides in CPU/GPU usage, and also once again lowers the VRAM required.

Again, further cementing Alex's point, the one HUB got their panties in a twist over: don't use a single, obviously broken game as the bellwether for impending VRAM disaster.

Don't buy new cards with 8GB, sure; pretty much everyone is in agreement here. But there is obviously reason to expect better memory management when you know what hardware the majority of your userbase actually has, and apparently that's not an unreasonable ask.

 
HUB never used TLOU as their *sole* reasoning for why 8GB isn't enough. They even made a whole video going over this. But y'all aren't concerned with that, because it goes against the narrative.
 
Now they are using "bandwidth" as an argument. Look at the TLoU result in the 4060 Ti review: the card is slower than a 3060 Ti, but only in their benchmark. At Computerbase, Daniel Owen, and co., the 4060 Ti is at least as fast or even faster.

"Never change a running system".
 
Now they are using "bandwidth" as an argument. Look at the TLoU result in the 4060 Ti review: the card is slower than a 3060 Ti, but only in their benchmark. At Computerbase, Daniel Owen, and co., the 4060 Ti is at least as fast or even faster.
I noticed that too. Guru3d's results are similar to Computerbase's, so I'm not really sure how to account for the review differences.
 
I noticed that too. Guru3d's results are similar to Computerbase's, so I'm not really sure how to account for the review differences.
I don't know if TLoU has a built-in benchmark, but if it does, I doubt HUB is using it. They try to take their benchmarks from gameplay.
 
What an embarrassment. But let's all remember to blame the developers, right guys?


....w...what? NVIDIA shit out a poor product, so devs get a pass on broken games? The poor VRAM management of games, which for many of those improved significantly with subsequent patches, is but a small subset of the problems of recent AAA games. Cards with 24GB are still getting PSO and traversal stutter. Despite HUB's dumb take that VRAM was really the only barrier to TLOU being regarded as a good port (so it's a solid port now, I guess?), that's far from the case.

Even the supposed "VRAM defenders", Digital Foundry, just spent 20 minutes in their latest DF Direct shitting all over the 4060 series as a value proposition even before their review. It's not either/or: 8GB is ridiculous for a $400 card, and PC AAA port quality is subpar. Both opinions can be, and are, held simultaneously by many.

Can you tone down the NeoGAF component in your posts a bit, maybe cut it by 30% or so? It would be appreciated.
 