My perspective on 3DMark

incurable said:
I agree with the part about benchmarking one application and how it can only tell you the behaviour of that very application, but that applies to 3DMark too, doesn't it?
It doesn't really apply to 3DMark. The difference is that games (even those using the same engine) often use optimized code for different cards, while 3DMark doesn't. 3DMark shows the overall performance (and of course features) of the hw. And that's only one difference. ;)

incurable said:
btw: The whole notion that using 3DMark is somehow superior to benchmarking real games irritates me.
Nobody has ever claimed that 3DMark is "superior" or that games should not be used. The difference is what I just wrote. Benchmarking games really shows you how well that particular hw performs in that particular game. Not the engine, not overall, only that 1 game. This is where the "strength" of 3DMark comes in. As it isn't optimised for any specific hardware, it shows the overall performance of the hw. How well that hardware really performs and compares to other hardware. Apples to apples. No manufacturer-specific extensions, no hw-specific engine optimizations, etc.

incurable said:
To be honest, I don't think it has any merit.
Really? Well, please do check the 3DMark usage report we posted a while back. I think it gives you some insight into this subject:

http://www.futuremark.com/companyinfo/benchmark_online_usage_report_2002.pdf

At the end of the day, I personally think it is more or less up to the reviewers to start explaining in their reviews how the benchmarks they use really work, and why they were chosen (I don't mean "Hey, we use this game as our benchmark because I love it, it's fun and it kicks ass!!"). I'd love to know the technical reasons for someone choosing a game as a benchmark. What makes it usable and unbiased, what makes it better than the others, what specific features the engine supports, etc. You know, the technical side of it. I have wondered many times why some reviewers choose certain games, or certain benchmarks. They simply don't explain it to the readers that well (unless you call "...because I love it, it's fun and it kicks ass!!" an in-depth explanation... Well, I don't!).

We are always here to answer questions concerning 3DMark, and to help reviewers understand how it works, why it works, etc.! ;)
 
worm[Futuremark] said:
The difference is that games (even those using the same engine) often use optimized code for different cards, while 3DMark doesn't.

So 3DMark uses the same code for all cards?
That's news to me! :rolleyes:
 
Hyp-X said:
So 3DMark uses the same code for all cards?
That's news to me!

No, 3DMark uses code that is designed to meet different levels of DirectX compliance: DX7, DX8.0, DX8.1, DX9.0. It doesn't care which cards from which manufacturers support which standards.

It doesn't say "is this an nVidia card? Use PS 1.1 shader path."

It says "Does this card support DX 8.1 shaders? No> Then use DX 8.0 shaders".

It's a fine point, but a valid one.
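
To make that concrete, here's a minimal Direct3D 9-style sketch of the pattern (the RenderPath names and the exact version ladder are my illustration, not 3DMark's actual code):

```cpp
#include <d3d9.h>

// Capability-based path selection: ask the caps structure what the
// hardware supports, never who made it. Illustrative sketch only.
enum RenderPath { PATH_DX7, PATH_DX8, PATH_DX81, PATH_DX9 };

RenderPath SelectPath(const D3DCAPS9& caps)
{
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return PATH_DX9;    // PS2.0 supported -> DX9 path
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        return PATH_DX81;   // DX8.1 shaders (PS1.4)
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return PATH_DX8;    // DX8.0 shaders (PS1.1)
    return PATH_DX7;        // no pixel shaders -> fixed-function path
}
```

The point: two cards from different vendors that report the same PixelShaderVersion land on the same path.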

Look at the difference with Doom3: Doom3 doesn't say "Does each card support arb_fragment_program? Then use it!"

Doom3 says "Is this an NV30? If so, use NV30 path, not ARB2."
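
The vendor-specific pattern, by contrast, looks something like this (a sketch of the pattern only; the string check and path names are assumptions, not id's actual code):

```cpp
#include <GL/gl.h>
#include <cstring>

// Vendor/product-specific path selection: branch on who made the card,
// not on what it advertises. Assumes a current GL context exists.
const char* SelectGLPath()
{
    const char* renderer = (const char*)glGetString(GL_RENDERER);
    if (renderer && strstr(renderer, "GeForce FX"))
        return "NV30";   // product-specific codepath
    return "ARB2";       // generic ARB_fragment_program path
}
```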
 
Hyp-X said:
So 3DMark uses the same code for all cards?
That's news to me! :rolleyes:
I think you know what I am talking about... 3DMark doesn't check what (or which vendor's) hw is being used. It only checks what DX capabilities the hw supports, and proceeds according to that. No manufacturer-specific codepaths, only pure Direct3D.
 
Well put, ben6 :D . I think that if 3DMark03 is used for DX9-class cards only, there will not be such an uproar as to its usefulness. Is it perfect? No. Is it useful in context? I would think so. What other options are available now to test these newer cards? From what I have read on these forums, no one has suggested anything :( . BTW, where can we read your reviews?
 
worm, would you call 3DMark03's code optimized or unoptimized?

If it's optimized, what is it optimized for?

What were the targets of such optimization, and how did you check whether you reached them?

If there were alternative solutions for implementing a feature, how was the selected one chosen? (e.g. simplest code, fastest on CardA, etc.)
 
nelg said:
From what I have read on these forums, no one has suggested anything :( .
Same here! I have seen some negative comments, but no suggestions or ideas HOW it should be done, or what is/was missing. Any constructive feedback is very important, and good for us. We want to know what you guys think we should have included, what tests should have been done, etc..
 
Joe DeFuria said:
Look at the difference with Doom3: Doom3 doesn't say "Does each card support arb_fragment_program? Then use it!"

Doom3 says "Is this an NV30? If so, use NV30 path, not ARB2."

Possibly, but not necessarily... It's possible that the NV30 path is simply higher in the path hierarchy than ARB2. The former is, after all, somewhat more advanced.
 
Hyp-X,

A small Q&A? ;) To get the best and most accurate answers, either email them to patric@futuremark.com, or I can forward them to him. I'd love to answer them, but I think it's better to let Patric (or one of the coders of 3DMark03) answer those. Just so there isn't any confusion.

So, will you do it, or shall I? :)
 
Joe DeFuria said:
It says "Does this card support DX 8.1 shaders? No> Then use DX 8.0 shaders".

I think you wanted to say "Does this card support PS1.4? No> Then use PS1.1 shaders".

Otherwise it doesn't make sense...
 
Ostsol said:
Possibly, but not necessarily... It's possible that the NV30 path is simply higher in the path hierarchy than ARB2. The former is, after all, somewhat more advanced.

According to JC, the NV30 path looks WORSE than the ARB2 path (but only very, very slightly).
 
Sharkfood said:
MadOnion's stated policy on cards is to use features that 2 or more manufacturers support.
At the time of 3DMark2000, only NVIDIA cards had HW T&L, yet this was chosen and other cards "fell back" to software T&L or CPU T&L (SSE/3DNow!).

I don't see how this differs from PS1.4 and fallbacks to PS1.1->1.3 in 3DMark03. In fact, this is a much more comparable "fallback", as we are talking HW vs. HW, just different implementations.

It's hard to hit a running target.

Imagine being a game developer, wanting to release a game with "cutting edge" technology 3 years from now. Would you target DX9 or DX10? What would your minimum spec be?

For a forward-looking benchmark I could imagine it's even more difficult. In the 3DMark2000 timeframe I remember both S3 and 3dfx promising to come out with HW T&L cards... But only NVIDIA managed to achieve it back then.

What was the original schedule for the GeForceFX? Q3/2002? If NVIDIA's execution had been perfect, they'd have a DX9 part out there, and a follow-up coming soon. Maybe they said so when the benchmark specs were being made, dunno. 8)

(and NVIDIA's execution has been near-perfect in the past)
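
For reference, the caps-driven T&L fallback Sharkfood describes maps onto something like this in Direct3D 9 terms (a sketch under the usual device-creation flow; the function name is illustrative):

```cpp
#include <d3d9.h>

// Caps-driven T&L fallback: use hardware vertex processing if the card
// has HW T&L, otherwise the runtime's CPU (SSE/3DNow!-optimized) pipeline.
// The returned flag would be passed to IDirect3D9::CreateDevice().
DWORD ChooseVertexProcessing(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    return (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
               ? D3DCREATE_HARDWARE_VERTEXPROCESSING
               : D3DCREATE_SOFTWARE_VERTEXPROCESSING;
}
```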
 
worm[Futuremark] said:
Hyp-X,

A small Q&A? ;)

Yep. Although I'm not a reviewer, only a game developer (not exactly the target audience for your benchmark), I'd still like to know. ;)

To get the best and most accurate answers, either email them to patric@futuremark.com, or I can forward them to him.

Would you be so kind as to forward them?
You can post the answers here or send them to me (in case you think they are too technical).

I'd love to answer them, but I think it's better to let Patric (or one of the coders of 3DMark03) answer those. Just so there isn't any confusion.

Ok, that's fine.
Actually it's cool if one can get real answers from a company - not just PR.
 
To be fair, S3 did release the Savage2000 chip with HW T+L. However, also to be fair, it never worked properly...
 
Hyp-X said:
Yep. Although I'm not a reviewer, only a game developer (not exactly the target audience for your benchmark), I'd still like to know. ;)
:oops: Never knew you were into developing games... May I ask what project? ;)

Hyp-X said:
Would you be so kind as to forward them?
You can post the answers here or send them to me (in case you think they are too technical).
Done. I emailed him the Q's. I have a hunch that I won't get them back before tomorrow, but I'll post them here (or send them to you) as soon as possible. Cool?

Hyp-X said:
Ok, that's fine.
Actually it's cool if one can get real answers from a company - not just PR.
;)
 
ben6 said:
To be fair, S3 did release the Savage2000 chip with HW T+L. However, also to be fair, it never worked properly...
If I'm not mistaken, the first official drivers had HW T&L disabled on the S2000? :? At least I recall something like that...
 
Yes, I'm not sure if T+L actually ever worked as S3 intended it to. I may be wrong in my recollection however.
 
ben6 said:
Yes, I'm not sure if T+L actually ever worked as S3 intended it to. I may be wrong in my recollection however.

There was never an official non-beta driver with T&L working that I recall from my days with that card. I believe there was a beta driver that had it enabled, but it was iffy.
 
No, T&L never worked on the Savage2000. Actually, it was not T&L, it was T&L&C. The broken portion was the hardware clipping, and they could never get a driver work-around to properly deal with all the issues it caused.

Hyp-X,

If you are a game developer like you say you are, then you should recognize that PS 1.2 and 1.3 offer no compelling reason to use them over PS 1.1.

Being that PS 1.1 is the lowest common denominator for DX8 cards, it is the obvious choice for a fallback. PS 1.4 is obviously the PS of choice for DX8.1; I don't see how anyone can honestly argue against that.

Being that all PS 2.0 hardware coming out will also gain the benefits offered by PS 1.4, why would you handicap future DX8 games that use heavy PS routines by making them run PS versions that offer no real benefits to any cards, instead of benefiting all the PS 2.0 hardware as well as all the current DX8.1-class hardware?
 
Hellbinder[CE] said:
Being that PS 1.1 is the lowest common denominator for DX8 cards, it is the obvious choice for a fallback. PS 1.4 is obviously the PS of choice for DX8.1; I don't see how anyone can honestly argue against that.

Being that all PS 2.0 hardware coming out will also gain the benefits offered by PS 1.4, why would you handicap future DX8 games that use heavy PS routines by making them run PS versions that offer no real benefits to any cards, instead of benefiting all the PS 2.0 hardware as well as all the current DX8.1-class hardware?
Well, I think it'll depend on the extra work needed to implement PS1.4 on top of the PS 1.1 fallback, which is the common denominator. Do the game devs want to put in that extra bit of work? That's the question.
 