Personally, I've always thought it would be nice to conduct a review (or at least part of a review) as follows:
(1) Choose a representative set of games to benchmark on (um... like always)
(2) Determine the "desired framerate" for smooth gaming, say 60fps or whatever is deemed appropriate
(3) Change resolutions and IQ settings to find perhaps two or three combinations for each card in each game that yield approximately the desired performance level (see the rough sketch after this list)
(4) Present screenshots for comparison of what each card "buys" you in terms of IQ at the desired gaming performance
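Just to make step (3) concrete, here's a rough Python sketch of the kind of search I mean. Everything in it is made up for illustration: `run_timedemo` stands in for whatever harness actually launches the game with a given settings combo and measures the average framerate, and the "IQ ranking" is just a crude ordering, not a real quality metric.

```python
from itertools import product

RESOLUTIONS = ["1024x768", "1280x1024", "1600x1200"]  # lowest -> highest IQ
FSAA_LEVELS = [0, 2, 4]                               # FSAA samples
AF_LEVELS   = [1, 4, 8]                               # AF taps

def playable_combos(run_timedemo, target_fps=60.0, keep=3):
    """Return the highest-IQ settings that still average >= target_fps.

    `run_timedemo(settings)` is the hypothetical hook into the benchmark
    harness; it should return the average fps for that settings dict.
    """
    passing = []
    for res, fsaa, af in product(RESOLUTIONS, FSAA_LEVELS, AF_LEVELS):
        settings = {"resolution": res, "fsaa": fsaa, "af": af}
        if run_timedemo(settings) >= target_fps:
            passing.append(settings)
    # Crude IQ ranking: resolution first, then FSAA, then AF.
    passing.sort(
        key=lambda s: (RESOLUTIONS.index(s["resolution"]), s["fsaa"], s["af"]),
        reverse=True,
    )
    return passing[:keep]
```

The two or three combos at the top of that list for each card are the ones you'd screenshot in step (4).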
A review done in this manner would thus focus less on the absolute speed of each card, and instead show the gamer just what image quality differences there are between the cards at comparable performance levels.
For example, suppose that both cards can run QIII @ 1600 x 1200 with IQ maxxed, FSAA maxxed, and AF maxxed at the desired performance level (I'm not saying they can...). Screenshots then reveal that the AA and AF IQ differences are fairly small (they may not be, but this is hypothetical). In that case, the two cards would more or less "tie" in this benchmark, since they both deliver maximum image quality at the desired performance level. If you are a gamer concerned only about QIII, then you would know that for you, the extra money isn't buying you anything (which might not be so obvious from graphs showing one card scoring double what the other does... what difference does 200 fps vs. 100 fps make?).
OK, now most games (and probably QIII) are not like that. At some point the framerate will drop below the "threshold." Screenshots will then show what each card offers to the gamer at the "smooth gaming" performance level. If you as a gamer don't see much difference (as in, if you aren't particularly bothered by aliased textures or geometry) then you might decide that the impressive looking graphs aren't all that impressive after all. I mean, if the reviewer is telling you that a $125 video card will run your favorite game just as fast as that $400 video card, and you don't see that much difference in IQ, then why should you spend the extra cash?
And, if you really do care about image quality, then the review will show you just what you are most interested in: how much better will things look for the extra hundred or two bucks? You might take one look at the screenshots and think "Holy crap, that's beautiful," in which case you have your answer... for $400 that picture can be yours at smooth performance levels.
Well... that's just my opinion anyway. I've never really seen a review approached in that manner. They always simply show screenshots at what is supposed to be equivalent image quality (though it hardly ever is), and they hardly ever show screenshots at the settings they actually test for performance comparisons. Approaching it from an "equal performance" instead of "equal quality" angle would more closely mimic how the actual gamer uses the card: fiddle with resolutions and settings until they find the combination that gives maximum image quality without dropping below their preferred "threshold" of performance.
Why shouldn't a review do the same?