Question

Don't forget to take one for yourself too while you're at it.
 
I'm not sure what Reverend is getting on about, but I personally just scan Dave's game benchmarks - I am rarely interested solely in how Company A's new product fares against Company A's old product. When it comes to game benchmarks, B3D's reviews are at the same level as any other site's - just a bunch of numbers on a graph, and Dave doesn't seem to add much color to those numbers in his commentary anyway.

Where B3D sets itself apart from all the other sites is in the architectural discussion/explanation. You will often see this aspect of Dave's work linked from other forums. For game benchmarks people usually go to FS, Xbit, Hexus, Anand, HardOCP et al.
 
trinibwoy said:
I'm not sure what Reverend is getting on about but I personally just scan Dave's game benchmarks
Oh yes you're sure what I'm talking about, because you just explained it in the same sentence (with the correct reasons in the unquoted part of your comment)!

I just read Dave's R520 benchmark article, and the overriding feeling (concern?) I have is that reviewers know far too little about games compared to what they know about synthetic benchmarks. The reasons are obvious: authors of synthetic tests explain why and what their app tests, so reviewers know exactly which parts of a piece of hardware they're concentrating on (Dave understands the relevance of such synthetic apps more than most "reviewers", who usually just post "explanations" we can surmise ourselves). In Dave's R520 benchmark article, I am frustrated that Dave can't really explain why HDR carries such contrasting performance penalties in Far Cry and in SCCT, other than to offer speculations that, more importantly, he didn't spend more time investigating.

For a site that does such an excellent job of knowing, and explaining to the public, what the results of synthetic tests mean, the games benchmarks sections are a jolt, because they are so bereft of the kinds of explanations you come to expect after reading the extrapolations of the synthetic test results. I am sure -- nay, convinced -- this is due to two reasons: a lack of desire to ask game developers what is actually happening in a game demo used for benchmarking, and game developers being unhelpful in this regard (each being a reason for the other, chicken-and-egg stuff).

It's hard to relate to Dave's comment that games are (the only) apps that thoroughly test a piece of hardware on all fronts when we don't really know which aspects are being tested more in any one particular game demo used for benchmarking.

Yes, games are the most stressful apps for 3D hardware. That's like saying life is difficult. But beyond saying that adding a wife and a kid (AA and filtering) makes life more difficult (i.e. makes the hardware work harder) -- which is basically "Duh!" stuff -- we don't know specifically why this is so.

That is why it is very important to know exactly what happened when the latest-and-greatest drivers result in performance improvements in games -- whether throughout a game or only in the demos used in benchmarks (if it's the latter, the FBI should be called in... or in this case, Dave should be). Why? Where? Nothing is offered.

Not enough is being done to extrapolate from the results of games benchmarks, beyond the understandable lower performance at higher AA and filtering levels. If the problem is that game developers just don't bother answering queries, then perhaps it is not worth wasting time providing game benchmarks at this site when, compared to using synthetic test apps, we don't really know what's happening.
 
Reverend said:
I am sure -- nay, convinced -- this is due to two reasons: a lack of desire to ask game developers what is actually happening in a game demo used for benchmarking, and game developers being unhelpful in this regard (each being a reason for the other, chicken-and-egg stuff).
Completely untrue on both counts.

Not enough is being done to extrapolate from the results of games benchmarks, beyond the understandable lower performance at higher AA and filtering levels.
Possibly a fair comment, and one I've been conscious of for a while now; I've been making a concerted effort to improve in this area, and once I've finished the benchmarking article it will be far less of an issue.

*****

As a small aside, but certainly meant as a retort to your comments above, Rev: take a look at your own last review on this site. Where, in your article, was any of the information whose absence you've been complaining about? Now compare that review to the one before it, which contains considerably more detail. The point I'm making is that there are usually good reasons why the depth of games-benchmark analysis varies from review to review; making blanket statements that it's because (a) we can't be bothered to find the info out or (b) devs won't tell us anything is simply unfair and highly misleading.
 
Aren't games benchmarks used to see how well the card performs in real-world scenarios?!
What exactly is being stressed is then of little relevance; they're a way to know what you'll get when actually using the card.
 
trinibwoy said:
I'm not sure what Reverend is getting on about, but I personally just scan Dave's game benchmarks - I am rarely interested solely in how Company A's new product fares against Company A's old product. When it comes to game benchmarks, B3D's reviews are at the same level as any other site's - just a bunch of numbers on a graph, and Dave doesn't seem to add much color to those numbers in his commentary anyway.

I have to say that I agree. The benchmarks are not why I come to this site, far from it.
 