I'm late to the party, but what the hell.
This pretty much sums up [H]'s gaming benchmark segment:
Let's just cut to the chase. You will see a lot of gaming benchmarks today that just simply lie to you
This is the angle [H] is going for, and obviously has gone for in the past. You see, dear [H] clergy members, other sites aren't actually reviewing the processor, they're spreading
hype - while the vanguards of decency and truth here at [H]ardocp will show you the light. Sensationalism designed to point out the sensationalism of other sites (which actually test the product in question). Rolleyes doesn't cut it here, I need a barf smiley.
Of course there are benefits to [H]'s approach; manual run-throughs instead of canned demos certainly provide more data than the endless Quake 4 timedemos a lot of sites fall back on whenever a new GPU or CPU hits the streets. So I do appreciate that they're doing something different. The problem here is that it's completely unsuited to testing a CPU.
Actually, even outside of CPU testing, the problem with [H]'s method is that it places so much control in the hands of the reviewer. A 40fps average for Oblivion is the target because... uh, why? There are eye-candy whores and there are also framerate whores. With other sites, I can work out what settings I need to run at to get the framerates I want, and then make a judgment based on my own personal preferences about whether it's worth the cash outlay. What if 1152 x 864 gives me 50fps? How many FPS will I lose if I go for 4X AA instead of 2X? Is the game so CPU-bottlenecked that it can
never reach the refresh rate of my LCD? I'll never know from a Hardocp review.
It's perfectly fine to illustrate that you don't need top-end components to build a good gaming rig, and that diminishing returns kick in as you move up the $$ ladder - that's definitely something I think the general public should be educated on. I fear people think you absolutely need the $1000 processor and $500 GPU to get anywhere with modern PC action games, and evidence pointing to the contrary is good for the industry as a whole, methinks.
But this wasn't a "What difference does a CPU make to a high-res modern game?" article - it was supposed to be a
review of a frickin' CPU. By Hardocp's logic, the second ATI/Nvidia release their new GPUs and the bottleneck is somewhat alleviated, the Core 2 becomes
a better processor thanks to another product entirely out of Intel's hands? That's simply idiotic. Heck, Hardocp could just start benching the new GPUs at higher resolutions - after all, 1280*1024 is not
really the intended target of these new GPUs, that's "last year's" res. So nothing changes, because the bottleneck will once again be the GPU.
Games this year and next will need more CPU. They will also need more GPU power, more HD space, more networking bandwidth. The fact that some components of the PC haven't
caught up with Intel's CPU doesn't stop that CPU from being an excellent product; when reviewing a specific piece of hardware, you should always look at price/performance against the other products in its class. Right now, from the benchmarks most other sites give, it's apparent that a $300 CPU from Intel beats a $900 CPU from AMD. That's better performance at a third of the price, which is stunning, and it brings benefits to everyone - even those not on the high end - because AMD has to drop prices in response.
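Just to make the price/performance point concrete, here's a rough back-of-the-envelope sketch (Python, purely illustrative - the relative performance numbers are made-up placeholders, only the prices come from the paragraph above):

    # Rough perf-per-dollar sketch. Prices come from the post above; the
    # "perf" numbers are hypothetical placeholders, NOT real benchmark results.
    cpus = {
        "Intel Core 2 (~$300)": {"price": 300, "perf": 1.10},  # placeholder: assume ~10% faster
        "AMD FX-62 (~$900)":    {"price": 900, "perf": 1.00},  # placeholder baseline
    }

    for name, c in cpus.items():
        value = c["perf"] / c["price"] * 100  # performance points per $100 spent
        print(f"{name}: {value:.3f} perf per $100")

Even if the cheaper chip were merely equal in raw performance, it would still come out roughly 3x ahead on value at those prices - which is the whole point a CPU review should be making.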
Somehow I don't think the market would accept [H]'s argument if it came from AMD: "Look, there's no point in us dropping prices, because when you play games on a 30" LCD there's no advantage to Intel!" They're drastically cutting prices because they have to - there's a much better alternative on the market right now. AMD's price cuts are the best review Intel could get.
Edit: Hell, I was being kind about AMD's current FX-62 price - it's more along the lines of $1100, not $900.