Value of Hardware Unboxed benchmarking

I never made such an argument for/against RT, so can’t comment on that.

I'm simply citing reviews of older cards to show that enabling 4x AA did not lead to an instant -50% performance penalty. You only got crazy drops like that when the GPU was VRAM- or bandwidth-limited, or with R600's obvious architectural problems.

I don't get this. If you ignore the bottlenecks that can cause performance issues, like memory bandwidth, then performance isn't actually that bad?
 
Yet look what happens when we get a card with more memory + bandwidth:

[attached benchmark charts: the same AA comparison on a card with more memory and bandwidth]
Ta-da! The price of AA drops.
I'm not sure what I'm supposed to be seeing. 144 down to 85. 92 down to 70. These are still big drops.
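Run the numbers and those are roughly a 41% and a 24% hit. A trivial sanity check (plain arithmetic on the figures quoted above, nothing more):

```python
# Percentage performance drops for the figures quoted above.
for before, after in [(144, 85), (92, 70)]:
    drop = 100 * (1 - after / before)
    print(f"{before} -> {after} fps: about {drop:.0f}% slower")
```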

I never made such an argument for/against RT, so can’t comment on that.
It's important to tie the current AA talk back to the wider topic that spawned it: how graphics scaling across GPU tiers has changed (or not) over time.
I'm simply citing reviews of older cards to show that enabling 4x AA did not lead to an instant -50% performance penalty.
I think that 50% figure is being leaned on too hard. It's more a discussion point that a high-end feature had a big impact. Obviously those specific features became less costly over time, but then other features became the differentiators between "Good" and "Extreme". So for each GPU (maybe console) generation, we ought to look at what the elite-rig features of the time were, the ones filling the role RT serves now, and how those features impacted performance on the GPUs of that era.

You only got crazy drops like that when the GPU was VRAM- or bandwidth-limited, or with R600's obvious architectural problems.
Actually, the reasons don't particularly matter. All that matters is how performance was impacted by the game features of the time. It doesn't matter whether RTRT tanks the framerate because GPUs aren't fast enough or don't have enough RAM; the end-user experience is the same either way, a tanking framerate as a result of enabling the highest-end rendering features.
 
Bringing things back to Hardware Unboxed, Steve is an interesting, polarizing figure. I'm pretty sure he games on Intel and Nvidia, or at least Nvidia. They also cover HDR in their Monitors Unboxed stuff; they don't go super in-depth with the actual games and how it's implemented, but they do look at the capabilities of the monitors to some extent. I do think their pursuit of "value" has led them to some bad recommendations, like early Ryzen for gaming. I can understand, with GPUs, recommending something like the 5700 (XT?) over the 2060. It took a really long time for there to be a game that was drastically better on the 2060, and I think that's Alan Wake 2. When the cards came out, it would have been really hard to predict how good DLSS would become, because the first version was pretty bad. I think now you're probably getting better performance out of the 5700 but better image quality with the 2060 if you're using upscaling, because FSR isn't great. All in all, not a terrible recommendation. I do kind of agree with recommending things based on their capabilities at the time, not years into the future.

Lately their GPU recommendations have been a little harder to follow, but that's because use case changes things significantly. If you only game at native resolution, VRAM is going to be a bigger problem, so if you're a native 1440p or native 4K gamer, that VRAM is going to make a big difference. Their concern about 8GB can be reasonable or unwarranted depending on how the card is being used. 16GB is definitely the safe choice, but they've been warning about VRAM limitations for a while now, and it also feels like predicting the future somewhat.

I have a general issue with PC component reviews, and I think it's just a symptom of the tools available and the time it takes. It's really hard to understand true performance without profiling, so it's all an estimate. You benchmark the software (not the hardware) and then try to come to some conclusion about the hardware based on general patterns, but the only real data they have is frame times. And then subjectivity creeps in based on game selection, scene selection, preference for frame rates or resolution, etc. It's not shocking that reviewers can come to different conclusions or recommendations without being "shills" for a particular product.
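To illustrate how little raw data there is to work with, here's a toy sketch of the usual metrics derived from a frame-time log. The sample numbers are made up, and outlets define "1% lows" in slightly different ways, so treat this as an illustration rather than any reviewer's actual pipeline:

```python
# Toy sketch: reviewer-style metrics derived from a frame-time log (ms).
# Sample values are invented; "1% low" definitions vary between outlets.
frame_times_ms = [16.7, 16.9, 17.1, 33.5, 16.6, 16.8, 40.2, 16.7, 16.9, 17.0]

# Average FPS = frames rendered / total wall-clock time.
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

# One common "1% low": average FPS over the slowest 1% of frames.
worst = sorted(frame_times_ms, reverse=True)
n = max(1, len(worst) // 100)
low_1pct = 1000.0 / (sum(worst[:n]) / n)

print(f"avg: {avg_fps:.1f} fps, 1% low: {low_1pct:.1f} fps")
```

Two runs with the same average can have very different lows, which is exactly where scene selection and subjectivity creep in.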
 
I read this topic a while ago and thought "people are exaggerating; it seems to me that some are just big fans of Nvidia". In short, I thought the topic was biased. Maybe I still think so, but today I saw something terrible from Hardware Unboxed.


They made a video saying that the Intel Arc B580 GPU has terrible performance on old hardware. They tested it with a Ryzen 2600 and found that it performs worse than an RTX 4060, whereas with a powerful CPU like the Ryzen 9800X3D this doesn't happen and the B580 has superior performance.

But this is super dishonest. Intel's support page states the minimum CPUs that should be used with these GPUs, and this requirement dates back to the previous series, Alchemist.

Most AMD Ryzen 3000 series Processors (excludes AMD 3000G-series Processors) / AMD 500 Series motherboard with Smart Access Memory enabled
10th Gen Intel Core Processors / Intel 400 Series motherboard with Resizable BAR support enabled

 
I read this topic a while ago and thought "people are exaggerating; it seems to me that some are just big fans of Nvidia". In short, I thought the topic was biased. Maybe I still think so, but today I saw something terrible from Hardware Unboxed.


They made a video saying that the Intel Arc B580 GPU has terrible performance on old hardware. They tested it with a Ryzen 2600 and found that it performs worse than an RTX 4060, whereas with a powerful CPU like the Ryzen 9800X3D this doesn't happen and the B580 has superior performance.

But this is super dishonest. Intel's support page states the minimum CPUs that should be used with these GPUs, and this requirement dates back to the previous series, Alchemist.



Is Battlemage different from Alchemist in this regard? They've always said you need Resizable BAR to use their GPUs effectively. IDK if a Ryzen 2600-era system would have support for that, and IDK if it depends solely on the motherboard/BIOS or if the CPU model also matters.
 
Lol, he recommended the Ryzen 2600 over a 9600K because it's "a more powerful CPU". This is one of those areas where they just lose me. Maybe, just maybe, if you're using motherboard defaults, but I doubt it. Maybe if you're only looking at average fps, or only at well-multi-threaded games, but I doubt it.

Also, I find it weird that he concludes this is likely an architectural issue that can't be solved with an update. No idea what he's basing that conclusion on; this could easily be driver-related.
 
It may seem like a petty observation, but Steve has always come across as arrogant and narrow-minded to me, which I'm seeing again in his response to one of the top comments on this latest Intel GPU video.

Someone pointed out that the commenters calling for more testing with older CPU configurations have been vindicated by this recent issue, and his response was "A broken clock is right once a day" instead of "You're right, we ignored a flaw in our testing methodology and will endeavour to be more thorough next time." That strikes me as smug and dismissive of his audience and "the consumer", especially considering he's a tech reviewer, so accuracy and rigor should be his top priority, and the fans praising him as a consumer advocate are misguided. He'd rather protect his ego than improve his methodology.
 
Intel even puts this information on the box!

[attached photo of the Arc retail box listing the CPU requirements]
 
Probably has something to do with ReBAR support, which Intel GPUs very much require for some reason.
I think so. It's unclear if ReBAR works on pre-Zen 2 Ryzen. I think it's possible to turn it on with any CPU if your mobo supports it, but I wouldn't know how to confirm whether it's actually functioning.

Still don't know why he tested on a CPU that is so clearly below the minimum requirement. The 3000 series is the minimum, and 2000 < 3000. There may be some kind of overhead problem, but that should be determined on a system that is officially supported.
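FWIW, one rough way to sanity-check ReBAR on Linux is to look at the GPU's PCI BAR sizes: with ReBAR active, one of the memory BARs usually spans the full VRAM instead of the classic 256 MB window. A minimal sketch, assuming the standard sysfs layout (which BAR is the VRAM aperture varies by vendor, so read the output loosely):

```python
# Sketch: list memory BAR sizes for VGA-class PCI devices via sysfs (Linux).
# With Resizable BAR active, one BAR typically covers the full VRAM
# (e.g. ~12 GiB on a B580) rather than a 256 MiB window.
import glob
import os

for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
    with open(os.path.join(dev, "class")) as f:
        if not f.read().startswith("0x0300"):  # 0x0300xx = VGA-compatible
            continue
    print(os.path.basename(dev))
    with open(os.path.join(dev, "resource")) as f:
        bars = f.read().splitlines()[:6]       # entries 0..5 map to BAR0..BAR5
    for i, line in enumerate(bars):
        start, end, _flags = (int(x, 16) for x in line.split())
        if end > start:
            print(f"  BAR{i}: {(end - start + 1) // (1 << 20)} MiB")
```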
 
I think so. It's unclear if ReBAR works on pre-Zen 2 Ryzen. I think it's possible to turn it on with any CPU if your mobo supports it, but I wouldn't know how to confirm whether it's actually functioning.

Still don't know why he tested on a CPU that is so clearly below the minimum requirement. The 3000 series is the minimum, and 2000 < 3000. There may be some kind of overhead problem, but that should be determined on a system that is officially supported.

It's kind of odd that neither channel mentioned the recommended spec on the box. There's an argument to be made that it should work with older CPUs, but if it's clearly labelled on the box and everything works well with 10th-gen Intel and 3000-series Ryzen, then it's a much different story. Would be nice to see that tested.
 
I think it's important to note that beyond PSU and space limitations, it's not standard for any GPU to have explicit "system requirements". It's not fair to fault consumers for not reading the minimum requirements on the B580 box when these "professionals" who do this for a living seem to have been blindsided by the incompatibility themselves. Steve very clearly doesn't even know for sure *why* the performance issue is there, based on his statements in the video and comment section. Blaming the consumer for not doing enough research isn't fair when even the professionals who are supposed to be the source of that research don't have the answers.
 
I think it's important to note that beyond PSU and space limitations, it's not standard for any GPU to have explicit "system requirements". It's not fair to fault consumers for not reading the minimum requirements on the B580 box when these "professionals" who do this for a living seem to have been blindsided by the incompatibility themselves. Steve very clearly doesn't even know for sure *why* the performance issue is there, based on his statements in the video and comment section. Blaming the consumer for not doing enough research isn't fair when even the professionals who are supposed to be the source of that research don't have the answers.
Half a year ago, GeForce drivers started requiring a certain CPU instruction set extension (SSE 4.2), which means GPUs that need newer drivers have de facto system requirements anyway.
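If anyone wants to check that on an older rig, here's a quick sketch (Linux only, reading /proc/cpuinfo; on Windows you'd need a CPUID utility instead):

```python
# Quick sketch: does this CPU advertise SSE 4.2? (Linux, /proc/cpuinfo)
# "sse4_2" is the kernel's spelling of the flag.
with open("/proc/cpuinfo") as f:
    flags_line = next(line for line in f if line.startswith("flags"))
flags = set(flags_line.split(":", 1)[1].split())
print("SSE 4.2:", "supported" if "sse4_2" in flags else "NOT supported")
```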


 
I think it's important to note that beyond PSU and space limitations, it's not standard for any GPU to have explicit "system requirements"
All GPUs have drivers, which have OS requirements, which in turn have rather specific system requirements. It's more that in the case of Arc the requirements are a bit unusual due to the reliance on ReBAR support from the h/w and the OS.
 
I think it's important to note that beyond PSU and space limitations, it's not standard for any GPU to have explicit "system requirements". It's not fair to fault consumers for not reading the minimum requirements on the B580 box when these "professionals" who do this for a living seem to have been blindsided by the incompatibility themselves. Steve very clearly doesn't even know for sure *why* the performance issue is there, based on his statements in the video and comment section. Blaming the consumer for not doing enough research isn't fair when even the professionals who are supposed to be the source of that research don't have the answers.

Found a Reddit post from two years ago that talks about the requirements. Resizable BAR is only officially supported on Zen 2 and newer, not Zen/Zen+. They tested a Zen CPU and it seemed to work, but the official requirements make sense. These requirements shouldn't be a mystery to reviewers. The big question is whether Zen 2 also leads to utilization problems.

 