jvd said:
The fact that they have 3 different cards running at different AA and aniso settings, and then compare which card you should buy based on that, is just stupid.
If the 5700 sucks at a certain resolution and AA/aniso setting, then it should be marked that way and you should be shown that performance.
It's not as strange as you would think. Actually, when you think about it, they have the BEST way of comparing... Let me explain:
The current generation of video cards cannot easily be compared. Suppose you run both the Radeon and the GeForce FX at 2xAA. Would that be fair? Not at all! It's well known that the Radeon's AA is much better than the FX's. Considering the AA results, it's probably fairer to run the Radeon at 2xAA and the FX at 4xAA...
Equal card settings do not mean that you're comparing apples to apples!!
And that holds not only for AA, but also for AF, etc. Even without AA and AF you can see IQ differences between cards.
A GOOD reviewer should remember that a video card can exchange speed for image quality and vice versa. Actually, that's what most of these 'driver optimizations' do! Lose some image quality to gain speed.
You do it yourself too, when choosing the optimal settings for a game.
Because there is a relationship between image quality and speed, a benchmark should either:
1) set a certain IQ level, and compare the speed
2) set a certain speed, and compare the image quality.
Most reviewers think that they're doing (1), because they set the same resolution and the same AA and AF settings. As I explained above, they are wrong. That means they run at different IQ and then compare speed. That's meaningless.
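Here's a rough sketch in Python of the difference between the two methods. The cards, modes, fps numbers and IQ scores are all made up purely for illustration (and judging IQ with a single number is of course the hard part in reality):

```python
# Toy model: each card maps an AA mode to (fps, image-quality score).
# All numbers are invented, just to illustrate the two protocols.
CARDS = {
    "Radeon":    {"2xAA": (55, 7.5), "4xAA": (42, 9.0)},
    "GeForceFX": {"2xAA": (60, 6.0), "4xAA": (44, 7.5)},
}

def fixed_iq_benchmark(target_iq):
    """Method (1): per card, find the setting that reaches the target IQ,
    then compare frame rates at that (now truly equal) image quality."""
    for card, modes in CARDS.items():
        for mode, (fps, iq) in modes.items():
            if iq >= target_iq:
                print(f"{card}: {mode} -> {fps} fps at IQ {iq}")
                break

def fixed_speed_benchmark(min_fps):
    """Method (2): per card, find the highest-IQ setting that still
    reaches the target frame rate, then compare image quality."""
    for card, modes in CARDS.items():
        iq, mode, fps = max((iq, mode, fps)
                            for mode, (fps, iq) in modes.items()
                            if fps >= min_fps)
        print(f"{card}: {mode} -> IQ {iq} at {fps} fps")

fixed_iq_benchmark(target_iq=7.5)   # Radeon only needs 2xAA, the FX needs 4xAA
fixed_speed_benchmark(min_fps=40)
```

Note that in method (1) the two cards end up at DIFFERENT nominal settings (2xAA vs 4xAA), which is exactly the point: equal settings are not equal IQ.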
Furthermore, you have the silly thing that in those reviews you see framerates of >100, where it really doesn't mean a thing that one card is faster than the other (Quake?). And who cares if card X is 50% faster than card Y when both are doing <10 fps? You can't play the game anyway. Even though those numbers may be fair, they are meaningless!
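To put some (invented) numbers on that: the same 50% lead looks completely different in frame-time terms at the two ends of the scale:

```python
def frame_time_ms(fps):
    """Milliseconds spent rendering one frame."""
    return 1000.0 / fps

# The same 50% lead, at both ends of the scale (invented numbers):
print(frame_time_ms(100) - frame_time_ms(150))  # 3.3 ms saved -- nobody notices
print(frame_time_ms(5) - frame_time_ms(7.5))    # 66.7 ms saved -- both unplayable anyway
```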
What HardOCP is starting to do is benchmark according to (2): set a certain speed, and see what image quality you get. As I explained above, there's nothing wrong with this approach. (There's a sketch of the idea after the list below.)
Actually, it has quite a number of advantages:
* It's usually quite possible to set equal speeds, because of the large number of graphics settings. So they DO compare apples to apples.
* You're not dependent on differences in AA and AF algorithms. It doesn't matter HOW they do it; only the results count.
* Even 'driver optimizations' are treated more fairly. If they exchange IQ for speed, they don't win anything in this kind of benchmark. If they really improve speed without damaging IQ, they do win.
* It's closest to the actual gameplay situation. It's what someone at home would do: achieve a good balance between speed and IQ.
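Here's the promised sketch of what such a review boils down to in practice, assuming a settings ladder ordered from best IQ to worst (all names and numbers are invented):

```python
# Walk the settings ladder from highest IQ down, and keep the first
# configuration that stays above the playability floor.
SETTINGS_LADDER = [  # ordered from best image quality to worst
    {"res": "1600x1200", "aa": "4x", "af": "8x"},
    {"res": "1280x1024", "aa": "4x", "af": "8x"},
    {"res": "1280x1024", "aa": "2x", "af": "4x"},
    {"res": "1024x768",  "aa": "0x", "af": "0x"},
]

def highest_playable(measure_fps, floor=40):
    """Return the best-IQ settings whose measured fps stays above the floor."""
    for settings in SETTINGS_LADDER:
        if measure_fps(settings) >= floor:
            return settings
    return None  # the game isn't playable at any setting

# Toy stand-in for an actual benchmark run (invented numbers):
def fake_fps(settings):
    base = {"1600x1200": 30, "1280x1024": 45, "1024x768": 70}[settings["res"]]
    aa_cost = {"4x": 10, "2x": 5, "0x": 0}[settings["aa"]]
    return base - aa_cost

print(highest_playable(fake_fps))  # -> {'res': '1280x1024', 'aa': '2x', 'af': '4x'}
```

Run that per card, and the review compares the settings (and screenshots) each card ends up with, instead of bare frame rates.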
Of course, you DO have to be aware of this different kind of reviewing. You have to READ the review, and not just quickly browse through it.
But that's just a matter of getting used to it.
You'll probably think it's also more dependent on the reviewer, because he has to assess image quality.
But think about this: a PROPER review based on the first method should ALSO assess image quality, select the settings that produce equal image quality, and only then compare the framerates the cards achieve. (Unfortunately, they're not doing that...)
So, there's really not much difference there. And actually, I think setting equal speed is more reliable than setting equal IQ... speed is just a number. Leave the judging of IQ to the reader, by posting screenshots...