Why can't anyone be bothered to normalize their fps*? It's not really rocket science, you can google your way through it in Excel in five minutes max and end up with much more valuable data.
*Yes, I realize they have tables normalized to a geo mean of fps, but that's like so...
edit: I feel I need to spell it out once. With geo means, you potentially (numbers very much exaggerated) do this:
Game A
Card n0 120 fps
Card n1 240 fps
Card n2 390 fps
Game B
Card n0 60 fps
Card n1 30 fps
Card n2 10 fps (for example too little VRAM)
Your geo mean results:
Card n0 avg. 90 fps
Card n1 avg. 135 fps
Card n2 avg. 200 fps
Your (flawed) conclusion: Card n0 is slowest, card n1 is mediocre and card n2 is uber!!! Not only do you completely miss the fact that you can barely play Game B on card n1 and almost not at all on card n2, your conclusion points in the opposite direction!
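To make "normalize" concrete: one way to do it (a rough sketch in Python rather than Excel, using the exaggerated numbers above, with the baseline card chosen arbitrarily) is to index each game to a reference card and average the ratios instead of the raw fps:

```python
# Sketch: normalize each game's fps to a baseline card, then average the ratios.
from statistics import geometric_mean

fps = {
    "Game A": {"n0": 120, "n1": 240, "n2": 390},
    "Game B": {"n0": 60, "n1": 30, "n2": 10},
}

baseline = "n0"  # arbitrary reference card

for card in ("n0", "n1", "n2"):
    # per-game performance relative to the baseline card
    ratios = [game[card] / game[baseline] for game in fps.values()]
    print(card, round(geometric_mean(ratios), 2))

# n0 1.0, n1 1.0, n2 0.74 -- card n2's collapse in Game B now drags its
# index down instead of being buried under its big Game A number.
```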
I think you mean arithmetic mean (which is flawed)?
Using the geometric mean, your example would look like this:
Card n0 mean: 84.85 (sqrt(120*60))
Card n1 mean: 84.85 (sqrt(240*30))
Card n2 mean: 62.45 (sqrt(390*10))
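You can reproduce both sets of numbers in a couple of lines, e.g. in Python:

```python
# Arithmetic vs. geometric mean for the example fps values.
from statistics import mean, geometric_mean

fps = {"n0": [120, 60], "n1": [240, 30], "n2": [390, 10]}

for card, values in fps.items():
    print(card,
          "arithmetic:", round(mean(values), 2),            # 90, 135, 200
          "geometric:", round(geometric_mean(values), 2))   # 84.85, 84.85, 62.45
```

The arithmetic means are exactly the "avg" numbers in your example, while the geometric mean already punishes card n2 for its unplayable Game B result.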