Late, noisy, HUGE!!! - makes 17k Mark...

Status
Not open for further replies.
T2k said:
Anyway, 3DMark03 is coming and is introducing a new line of tests again.

Wow? Shall we forget the word: "FSB"? :p

Here's some news for you: FSB does affect gaming performance. If you jack up your FSB in Quake3 or ALMOST ANY OTHER GAME you'll get higher FPS. For being a 3D graphics forum, people here sure seem ignorant of how game performance works. :rolleyes:
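(As a back-of-the-envelope illustration of why FSB matters: peak front-side-bus bandwidth scales linearly with its clock. The helper and all numbers below are purely hypothetical - nothing that 3DMark itself computes.)

```python
# Hypothetical peak FSB bandwidth: clock x bus width x transfers per clock.
# All figures are illustrative assumptions, not measured values.

def fsb_bandwidth_mb_s(fsb_mhz, bus_bytes=8, pumping=1):
    """Theoretical peak bandwidth in MB/s for a front-side bus."""
    return fsb_mhz * bus_bytes * pumping

# A 133 MHz double-pumped Athlon bus ("266 MHz FSB"):
stock = fsb_bandwidth_mb_s(133, bus_bytes=8, pumping=2)

# Raising the base clock to 150 MHz lifts the ceiling proportionally,
# which is why CPU-limited games pick up FPS from an FSB overclock:
overclocked = fsb_bandwidth_mb_s(150, bus_bytes=8, pumping=2)
```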

I think the bottom line is that many of the people here are just being babies. You want some esoteric BS-bench that tests JUST certain aspects of a graphics card, ignoring the big picture, which is gaming performance. Unfortunately, like it or not, gaming performance depends on your whole system, not just your expensive, overclocked R9700 Pro.

And it all comes down to this: if someone made a highly complex benchmark, it 1) wouldn't give any more accurate information and 2) no one would use it.

Actually, I think some people are just crying because their favored company isn't on top. I remember back when the original Radeon came out, all the fanATIcs were crying about how 3DMark was so biased. Well, I guess the tables have turned...maybe it wasn't so biased after all? :rolleyes:

Doomtrooper said:
No it's not that simple. So in Futuremark's management it's ok to release V1 of 3DMark to the public and all DX8 class cards (at the time) get huge score increases from Nature, while all DX7 class cards were tough luck; yet when DX8.1 is released it's not ok to change the scoring, but it was ok for all DX8 class cards??

Who said anything about rewriting another game test? Rewrite Nature for Pixel Shader 1.4 support... how would that have 'shifted the scoring'?

The difference is that they were both 3DMark2001, whereas the other case was a change from 2000 to 2001. 3DMarks from two different years are not meant to be directly comparable; scores from the same 3DMark should be. Once again, this is just a bunch of crying. If any of you were in the same situation you'd make the 3DMark scores from the same program comparable...

Also, I take it back, people are still complaining about bias even though their favored company is on top... :oops:
 
Cats

You should try a laser pointer, I regularly got my cat (5 years) climbing up the wall. :D

PS: Nice cat.
PPS: Can we have an OpenGL mode in 3DMark03?
 
nggalai said:
DaveBaumann said:
hehe - He's 14 weeks old, and we only got him a couple of weeks ago. Now that he's coming out of the nervous phase with us he's started to become quite a pest - his favourite place appears to be my test bed room and I can't use a bloody computer when he's around as he'll either be on the keyboard or chasing the damned pointer round the screen!
My cat died five years ago. :( As I had lived with him for 18 years, I got too attached to him to get another one.

Well, perhaps I should give it a try. Thanks for the photo, Wavey. :)

ta,
-Sascha.rb

Had a cat for 24 years..... smartest cat I've ever seen..... and mean too! He was so much a part of my life I still miss him, BUT..... The day after he died I got up and couldn't stand not having a cat around, so I went and got 2 of them - brothers - and now they are 14! One of them is the dumbest cat I've ever seen, but he's so sweet!
BTW, Wavey - they go from kittens to curses......
 
Nagorak said:
Here's some news for you: FSB does affect gaming performance. If you jack up your FSB in Quake3 or ALMOST ANY OTHER GAME you'll get higher FPS. For being a 3D graphics forum, people here sure seem ignorant of how game performance works. :rolleyes:

Nonetheless there is a point to be made about 3DMark's overdependence on FSB speed. Hint: take a look at game #2: Car Chase High Detail. ;)

Edit = overdependence
 
Just have the next 3DMark incorporate a program called SetFSB and clock it back to defaults if you really don't want FSB to come into play... I, however, think that FSB is fair game.
 
IMO the benchmark should test everything that can be done with DX9.
If that means that some tests don't work with some hardware, it should be done anyway.

If I run a benchmark like that I think it's interesting to find out what my hardware can do and what it can't do.
That is very valuable information IMO.

I think that even a test that won't run on any of the cards on the market at this time would be a good idea to include.
If the test in itself is DX9 compliant, of course.
It would be very interesting to find out what company would be the first to release a card that can run that test.

Regards!
 
DaveBaumann,

Your cat has good taste! ;) And he is cute.. I'd love to have a small cat too, but I'm too often at work. It would just be alone too often.. :(

olivier said:
humm quake3 is older and we still use it for benchmark :)
Yes, Q3A is still being used, but not as much as it used to be. And actually I was referring to benchmark utilities.

T2k said:
Moreover it's obviously NOT ONLY a D3D bench - just take the CPU FSB... I think we need a pure D3D bench.
Especially now, around showtime of GFFX...
Say what? It seems that you have misunderstood the basic idea of 3DMark. 3DMark sure is a D3D benchmark, but certainly not a GPU/VPU/whatnot benchmark. The game tests simulate real games, and as you know, games also make use of the CPU. We use the CPU for many things, like physics, simple AI etc. - just like any other "normal game" would do. We do have the theoretical + feature tests for more "in-depth" testing, if that's what you are looking for.
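(The point about the CPU's role in game tests can be sketched with a toy frame-time model. The numbers are invented for illustration; this is not how 3DMark is actually implemented:)

```python
# Toy model: each frame pays a CPU cost (physics, simple AI) and a GPU
# cost; with the two stages overlapped, the slower one bounds the frame
# rate. Illustrative assumption, not Futuremark's actual code.

def frame_rate(cpu_ms, gpu_ms):
    """FPS when the slower of the CPU and GPU stages limits the frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_bound = frame_rate(cpu_ms=5.0, gpu_ms=20.0)   # a faster CPU wouldn't help here
cpu_bound = frame_rate(cpu_ms=25.0, gpu_ms=20.0)  # physics-heavy: the CPU/FSB sets the score
```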

T2k said:
Wow? Shall we forget the word: "FSB"?
Huh? As I just said, the game tests are game tests and not theoretical tests. There is a difference you know.

T2k said:
How will you measure it? I'm really curious about the IQ of the new NV line.
We have some built-in image comparison stuff (bil, tril, aniso, etc) + we have new options for taking comparison screenshots. Can't say much more. I think it is much better than what we had in 2000 and 2001. 8)

Doomtrooper said:
No it's not that simple. So in Futuremark's management it's ok to release V1 of 3DMark to the public and all DX8 class cards (at the time) get huge score increases from Nature, while all DX7 class cards were tough luck; yet when DX8.1 is released it's not ok to change the scoring, but it was ok for all DX8 class cards??

Who said anything about rewriting another game test? Rewrite Nature for Pixel Shader 1.4 support... how would that have 'shifted the scoring'?
First of all, you don't get a "huge score increase" if your hardware is capable of running GT4. 3DMark2001 is a DX8 benchmark and therefore at least 1 test should require that kind of hardware.
When DX8.1 came out, we didn't want to change the scoring system. We wanted 3DMark2001 and 3DMark2001 SE to be comparable. So what to do? Not touch the game tests. Simple as that. So, to introduce PS1.4 we made the Advanced PS test.
We have a policy here that we don't touch the game tests in patches unless there is a serious bug or something. If we go in and start re-writing there is a chance that it changes the test too much and is no longer comparable with the older build.

Doom,

It looks like you are desperately looking for something (I don't know what) that just isn't there. We are not the bad guys. You would get off much easier if you would just accept the things as they are. They are not as bad and twisted as you might think.

demalion said:
look forward to seeing what you have done.
Hehe.. Talking about putting some pressure on me. Nah, I have only forwarded ideas, and suggestions that I have seen/read that people want/need. Of course I can't guarantee that everything will be in 3DMark03, but at least now IMHO we have better IQ tests than before.

LeStoffer said:
Nonetheless there is a point to be made about 3DMark's overdependence on FSB speed. Hint: Take a look at game #2: Car Chase High Detail.
That scene uses a lot of physics. I think that's why it depends pretty heavily on the CPU. Still, it is a game test, and not a theoretical test. :) Besides, it's not entirely up to the FSB..

RM. Andersson said:
IMO the benchmark should test everything that can be done with DX9.
If that means that some tests don't work with some hardware, it should be done anyway.
We do try to make use of DX9's new features, but not all are as easy to "show" as others. I think the major improvement is the shaders, and of course we will try to show what they can deliver. ;) What other DX9 stuff we have is still a secret.

Ok, now off to do some real work..
 
Worm, maybe the best way to nullify the critics would have been to make the advanced shader part of the score and call the SE version 3dMark2002 or something, and say that scores are not comparable between 3dMark2001 and 3dMark2002.

I think that it's kinda dumb to release a "totally new" benchmark with a new scoring system that is, in fact, identical to the old one with just one new test, but I guess that's how you'd stop this nonsense.

It's equally dumb to release a small upgrade to an existing benchmark that has a different scoring system and cannot be compared to any prior scores at all. It would cause mass confusion. "Hey, your 3dMark2001 score. Is it 3dMark2001, or 3dMark2001 SE? I dunno.....AHHHHH (Monty Python joke)"

Perhaps the best course of action would be to not do the advanced pixel shader at all. Of course, then the conspiracy theorists would say "why doesn't Futuremark release any support for PS1.4? CONSPIRACY AGAINST ATI!!!"


So you see, you lose no matter what.
 
DemoCoder said:
Worm, maybe the best way to nullify the critics would have been to make the advanced shader part of the score and call the SE version 3dMark2002 or something, and say that scores are not comparable between 3dMark2001 and 3dMark2002.

I think that it's kinda dumb to release a "totally new" benchmark with a new scoring system that is, in fact, identical to the old one with just one new test, but I guess that's how you'd stop this nonsense.
We didn't want to call 3DMark2001 SE "3DMark2002" as it didn't have enough new stuff to be a totally new version. SE was the same benchmark as 2001, only with some fixes, one new test, etc.

So you see, you lose no matter what.
Do I? ;) I still believe that some day people "get it". It takes time for some, but slowly we are getting there.

But this is _nothing_ compared to the fights I had over at 3dfxgamers.com's forums. Geez.. :D I got many "friends" there. Hehe..
 
worm[Futuremark] said:
Hrmm.. Actually no. Game Test 4 (Nature) is a DX8 game test, while the demo is DX7.

Thank you for pointing out your double standard. The "Advanced Shader" test is DX8.1, and the GF4 "demo" mode (but still allowed to run and score) is DX8.0. Funny how the reasoning flips a complete 180 degrees depending upon which IHV is the one without the capability.

Oh, where have you been the last year? This topic has been all over the place, but here goes.. We didn't put PS1.4 into the Game Tests simply because we wanted the 2001 and 2001 SE scores to be comparable.

Yet "splash screen" enabled versus disabled isn't taken into account for "comparable" scoring either. Somehow I still see a favorable bias without any regard for score accuracy or comparison. NVIDIA themselves already commented on this with the very first 3DMark.

I kinda missed your point here.. You mean that the reference images in 3DMark2001 / SE are .. faulty/misleading or something?

It has to do with "scoring" or "failing" IQ tests. It's already been beaten to death, but the fact still remains that this feature in the Pro version only leads to one conclusion.

What evidence, if I may ask? I haven't seen a single piece of evidence of any conspiracy. Show me one. I mean _real_ evidence and not some "yadda yadda" rumors..

No rumors.. All documented fact.
3dmark2000 - artificially higher scores for lesser-performing hardware. NVIDIA came to the rescue here, likely feeling a bit bad about why their similar fillrate/bandwidth cards were showing anywhere from a 200-400% advantage, yet scoring anywhere from 20-30% lower in every other Direct3D or OpenGL benchmark, game benchmark or other utility.

3dmark2001- A test included in the score that can only be run on one IHV's product AND added to the final score to ensure other IHVs are artificially penalized by score. A fallback mode is available and visible on all other IHVs and the evidence is the lack of ability to allow the "fallback" mode to be scored.

3dmark2001SE- an "advanced shader" test is added. It is not allowed to be tabulated into the final score. It is described to have support for PS1.4, but yet it is carefully coded in such a way to not have any real benefit for PS1.4 cards, and in fact has a "fallback" mode, which is tested and performance added for non DX8.1 compliant hardware.

I am assuming that if Futuremark is going to remain consistent, then there should be absolutely no DX9.0 tests in 3DMark03 that will only run on the GeForce FX? Or if there are "advanced" tests, they will be written to the standards of the "least common denominator" so that they can be coded for the ultimate performance on "fallback" or lesser hardware? AND such tests will not be used in the final score?

This is really the only way Futuremark can hope to pass off the kind of 'unbiased' or "objective" standpoint this kind of propaganda hopes to achieve.

But if I am right, I expect 3DMark03 to have-
- At least one or more tests that will *only* perform properly on the GeforceFX and will alter the score dramatically in favor of this IHV.
and/or
- Tests strategically written to highlight and exceed real-world performance delivery of Geforce FX featuresets.
and/or
- Written to the highest level capable by the Geforce FX while ignoring the highest level capable for other IHVs.

This would be a consistent trend for 3DMark/Futuremark, and mark my words that one or more of the above will be the end goal of the new Futuremark benchmark. There is nothing to argue here except to possibly pull up this thread after its release if any of those predictions turn out to be false.
 
I think that it's kinda dumb to release a "totally new" benchmark with new scoring system that is infact, identical to the old one with just one new test, but I guess that's how you'd stop this nonsense.

No, the way to stop the "nonsense" (as you put it) is for the next version of Futuremark's benchmark to hold some form of consistency.

Inconsistency is what creates theories and circumstantial evidence. The next version of the benchmark needs a consistent methodology, ethics and design standards; these have flipped and changed regularly throughout its history (coincidentally, to those who consider this "nonsense", in favor of one particular IHV). Instead it should stick to its most "current" adopted ideology (i.e. the one from SE).

That's all thinking people ask- consistency in approach and thought process behind the benchmark.
 
Sharkfood...
No rumors.. All documented fact.
Facts? Do you have the links and/or the documentation for any of this? Your evidence this is where?
3dmark2000 - artificially higher scores for lesser-performing hardware. NVIDIA came to the rescue here, likely feeling a bit bad about why their similar fillrate/bandwidth cards were showing anywhere from a 200-400% advantage, yet scoring anywhere from 20-30% lower in every other Direct3D or OpenGL benchmark, game benchmark or other utility.
Nice "percentage figures" there....what cards though? What drivers? What OS? What CPU/motherboard/system? Which other benchmarks? With what settings? Which other IHVs are you talking about? What products?
3dmark2001- A test included in the score that can only be run on one IHV's product AND added to the final score to ensure other IHVs are artificially penalized by score. A fallback mode is available and visible on all other IHVs and the evidence is the lack of ability to allow the "fallback" mode to be scored.
So you are blaming Futuremark for the fact that ATI released a "DX8" card several months before DX8 actually came out? Are you blaming Futuremark for the fact that Microsoft shifted the goal posts about what would be a DX8 pixel shader specification? Are you blaming Futuremark for the fact that 3dfx weren't the first to release a pixel shader capable product onto the market, even though they helped develop the PS1.0 spec? The Nature test uses PS1.0 routines - there are no fallbacks available in DX8 if your card doesn't offer hardware support. Note that all the other tests (apart from the low detail Car Chase) use vertex shaders. DX8 allows these to be software processed, as long as the graphics card driver accepts pre-transformed vertices. This is all part of DX8 - it offers a fallback for vertex shading but not for pixel shading. Are you going to blame Futuremark for this too?
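(The fallback asymmetry described above boils down to a simple rule. This is a hypothetical sketch of the behaviour, not actual DirectX code:)

```python
# Hypothetical sketch of the DX8 fallback rules described above: vertex
# shading can fall back to CPU software processing, pixel shading cannot.

def can_run_test(needs_pixel_shader, hw_pixel_shader, hw_vertex_shader):
    # Needs a pixel shader but no hardware support: no fallback exists.
    if needs_pixel_shader and not hw_pixel_shader:
        return False
    # A missing hardware vertex shader is fine: DX8 can process vertex
    # shaders in software, as long as the driver accepts the results.
    return True

# Nature (GT4) requires PS1.0, so a DX7-class card cannot run it at all:
nature_on_dx7 = can_run_test(True, hw_pixel_shader=False, hw_vertex_shader=False)

# The other game tests only need vertex shaders, which can be emulated:
other_tests_on_dx7 = can_run_test(False, hw_pixel_shader=False, hw_vertex_shader=False)
```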
3dmark2001SE- an "advanced shader" test is added. It is not allowed to be tabulated into the final score. It is described to have support for PS1.4, but yet it is carefully coded in such a way to not have any real benefit for PS1.4 cards, and in fact has a "fallback" mode, which is tested and performance added for non DX8.1 compliant hardware.
Note the fact that it is still called 3DMark2001 - it is still the same benchmark. The scoring tests are still the same because it is still the same benchmark. If you introduced a new version of the Nature test to use PS1.4, how could you compare Radeon 8500s within the ORB? Do you discard every single Radeon 8500 entry? You would have to in order to make any score comparison sensible. Given this option, do you think that ATI really cares whether the Nature test (running PS1.0 routines) isn't using PS1.4? My favourite bit is "yet it is carefully coded in such a way to not have any real benefit for PS1.4 cards, and in fact has a "fallback" mode, which is tested and performance added for non DX8.1 compliant hardware"! Yeah right - the water material gets rendered in two passes instead of one. Not much difference between how a PS1.4 card renders the scene compared to a PS1.1 card? You're assuming that one less pass should mean considerably more performance (or vice versa: one extra pass means considerably less performance). Where is your evidence to prove such a belief?
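(A toy cost model makes the pass-count point concrete - all numbers here are made up for illustration:)

```python
# Toy cost model for the one-pass-vs-two-pass argument: the per-pass fill
# cost of one material is only part of the frame, so halving its pass
# count need not come close to halving frame time. Numbers are invented.

def frame_ms(passes, fill_ms_per_pass, other_ms):
    """Frame time: pass-dependent fill cost plus everything else."""
    return passes * fill_ms_per_pass + other_ms

two_pass = frame_ms(2, fill_ms_per_pass=2.0, other_ms=16.0)  # PS1.1-style water
one_pass = frame_ms(1, fill_ms_per_pass=2.0, other_ms=16.0)  # PS1.4-style water

# When fill is a small slice of the frame, the single-pass win is
# closer to 10% than to the 2x a naive pass count would suggest.
speedup = two_pass / one_pass
```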
This would be a consistent trend for 3dmark/Futuremark and mark my words that one or more of the above will be the end goal of the new Futuremark benchmark. There is nothing to argue here except to possibly pull up this thread after it's release if any of those predictions turn out to be false.
Sure - I'll be happy to do that! ;)
 
Sharkfood,

You crack me up! Really, you do. :LOL:

I always thought that you are a sensible guy, but your last post just made me feel sorry for you. You seem to live in some odd virtual reality full of conspiracy, and can't grasp the real life around you. You ought to snap out of it. Really.

*sigh*
 
Here's some news for you: FSB does affect gaming performance. If you jack up your FSB in Quake3 or ALMOST ANY OTHER GAME you'll get higher FPS. For being a 3D graphics forum, people here sure seem ignorant of how game performance works.

You are right to some extent.

Listen to this:
I yanked up my FSB by 50MHz and I had 0 increase in framerate in SOF2.
Listen to this:
I was running at 2048x1600 32bpp with no TC, 32bit textures and everything up to maximum.

If you want to know what I am really talking about just send me a PM. :)
 
Neeyik-
Facts? Do you have the links and/or the documentation for any of this? Your evidence this is where?

I don't subscribe to the "links plz" teenage moronic diatribe fanclub, thanks.

I prefer "facts" as in "can be reproduced by anyone that takes the time to actually put the proof to the test" rather than what "website fanboi #1 with FrontPage" can specially create.

If you want to look at the facts, benchmark a Voodoo5 and a Geforce256. Use any choice of benchmarks and keep settings the same between both products. You can pick OpenGL or Direct3D. Doesn't matter. Quake2, Quake3, FinalReality, Winbench3D, GLExcess, Direct3D games with benchmarks (which are few, like DethKarz, Motorhead, FRAPS + any title using recorded STATIC gameplay). The Kyro also comes to mind. Go ahead and run some benchmarks. The proof is readily visible.

So you are blaming Futuremark for the fact that ATI released a "DX8" card several months before DX8 actually came out?

No, I am blaming Futuremark for being *inconsistent* in this regard. SE proves this point with its Advanced Shader test. Whether or not to add this test to the final overall "score" is a totally different matter. But the fact remains the Nature test *does* have a non-DX8 mode of operation, but this mode was not chosen to be allowed, tested or scored. SE takes a 180 degree change in ethic and adds a DX8.1 test, doesn't code it even to the level of DX8.1 standards, provides a fallback mode for non-DX8.1-compliant hardware AND scores it.

Note the fact that it is still called 3DMark2001 - it is still the same benchmark. The scoring tests are still the same because it is still the same benchmark.

Fair argument, and like I said, it's just a single, arguable viewpoint.. but then why is the DX8.1 shader test provided with a DX8.0 fallback AND a performance score?

The previous ethic was- if a fallback/lesser compliancy level isn't available (ala Nature), then other cards only get this feature in "Demo" mode with no method to benchmark or performance measure.

Yeah right - the water material gets rendered in two passes instead of one. Not much difference between how a PS1.4 card renders the scene compared to a PS1.1 card? You're assuming that one less pass should mean considerably more performance (or vice versa...one extra pass means considerably less performance). Where is your evidence to prove such a belief?

Spend about 6 months in the SDK and repeat the above statement and see if you can do so with a straight face. It's also funny you are assuming that anything coded in PS1.4/single-pass will obviously perform *worse* than or equivalent to multi-pass/PS1.1 (as the Advanced Shader test suggests). There are enough developer quotes around to prove this is NOT the case in reality, but not as illustrated by 3DMark2001SE.

Even the ATI "ocean" screensaver has already been stated by multiple sources to be substantially worse performing if a multi-pass/PS1.1 version were to be provided. An elementary knowledge of PS1.1 vs PS1.4 is enough to prove this as well.

worm-
You crack me up! Really, you do.

Will I be "cracking up" when 3Dmark03 is released? What inconsistencies will be released? Only time will tell.

If any one of my predictions are the driving force behind the benchmark then all your grandstanding will simply become a moot point.

Hey, I'll be quite happy with Futuremark if the next version actually takes the last benchmark's ideology and sticks to it. That ideology defines that "advanced" functionality is coded to the LEAST common denominator and, if "fallbacks" are required, the tests are not added to the final overall score but provide an informational-only score. If this is the new ethic for the benchmark from SE, then by all means, stick to it.
 
Shark,

isn't this whole 3dmark2000 debate done and dusted? Every sane commentator knows that the V5 performed better in most titles in 2000 than the GeForce256, yet the GeForce256 had a higher score on 3DMark. We all know why.

You could argue the same between a Gf2Ultra and a Gf3 when originally released in relation to 3dMark2001.

Again we all know why.

The problem was, as always, teenage webmasters and ignorant magazine hacks (who to my mind seem to be worse than most 3D/game sites) who don't know why, so they just post the numbers up.
 
gokickrocks said:
just have the next 3dmarks incorporate a program called SetFSB and clock it back to defaults if you really dont want FSB to come in to play...i however think that FSB is fair game

Erm, you want to standardize an FSB? What particular clock rate do you have in mind? For instance, there are 400 P4s and 5xx P4s, soon to be 8xx P4s, 200 Durons, 266 Athlons and 333 Athlons, soon to be 400 Bartons, and infinite FSB Athlon 64/Opterons.
 
Every sane commentator knows that the V5 performed better in most titles in 2000 than the GeForce256, yet the GeForce256 had a higher score on 3DMark. We all know why.

Heh, yeah, because 3dmark is a d3d benchmark, not a glide benchmark.
 