Futuremark policy? (@ worm)

Ante P

Veteran
Does Futuremark consider scores with extreme levels of LOD valid?
Ie if I run the test with LOD +15 on an nVidia card, is that fine and dandy with Futuremark?

Is forcing a 16 bit Z-buffer ok?
 
Ante P said:
Does Futuremark consider scores with extreme levels of LOD valid?
Ie if I run the test with LOD +15 on an nVidia card, is that fine and dandy with Futuremark?

Is forcing a 16 bit Z-buffer ok?
You won't get stencil shadows with 16-bit Z...
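For context, stencil shadow volumes need stencil bits, and in Direct3D those come bundled with the depth buffer format, so forcing a pure 16-bit Z format drops the stencil entirely. A quick reference sketch (the bit counts follow the D3D8/9 `D3DFORMAT` definitions; the helper function is just illustrative):

```python
# Common Direct3D depth/stencil formats and their (depth_bits, stencil_bits).
# This is a reference table, not an API call.
DEPTH_FORMATS = {
    "D3DFMT_D16":   (16, 0),
    "D3DFMT_D15S1": (15, 1),
    "D3DFMT_D24X8": (24, 0),
    "D3DFMT_D24S8": (24, 8),
    "D3DFMT_D32":   (32, 0),
}

def supports_stencil_shadows(fmt):
    """Stencil shadow volumes count front/back faces in the stencil buffer,
    so they need stencil bits alongside the depth buffer."""
    return DEPTH_FORMATS[fmt][1] > 0

print(supports_stencil_shadows("D3DFMT_D24S8"))  # True
print(supports_stencil_shadows("D3DFMT_D16"))    # False: forcing 16-bit Z loses the stencil
```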
 
OpenGL guy said:
You won't get stencil shadows with 16-bit Z...

I'm referring to 3DMark2001, but in any case: what policy does FM have on "tweaking"?
I just spoke to a few guys who hang around the people who sorta have 3DMark as a hobby, and they claimed that almost all people at high positions (as in "3DMark teams" and such) use a high positive LOD, usually 4-5, and that it's considered "ok" by the "elite" 3DMarkers.

Just seems like plain cheating to me so I was interested in FMs policy on such tweaks.
I mean it's not like FM have a set of rules on their page (which I think they should have).

Oh another question, is there any way for an application to force a LOD setting or does the driver always have the last say in this?
 
Ante P said:
Just seems like plain cheating to me so I was interested in FMs policy on such tweaks.
I mean it's not like FM have a set of rules on their page (which I think they should have).
Seems like cheating to me as well... There have been threads on this topic before.
Oh another question, is there any way for an application to force a LOD setting or does the driver always have the last say in this?
LOD bias is application selectable, but anything the application can do, the driver can override. That's why there are settings on the control panel to force AA, AF, etc.
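To make the performance angle concrete, here's a rough sketch of why a big positive LOD bias speeds things up: the bias is added to the computed mip level, so the hardware ends up sampling from much smaller, cheaper-to-fetch mip maps (at the cost of blurry textures). The formula below is a simplification of the real mip selection rule, and the function name is my own:

```python
import math

def mip_level(texels_per_pixel, lod_bias, max_level):
    """Simplified mip selection: level = log2(minification footprint) + bias,
    clamped to the available mip chain. Real hardware is fancier, but the
    bias term works the same way."""
    level = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    return min(max(round(level), 0), max_level)

# A texture minified 8x on a 10-level mip chain (levels 0..9):
print(mip_level(8.0, 0.0, 9))   # 3 -> the "correct quality" level
print(mip_level(8.0, 5.0, 9))   # 8 -> a tiny, blurry, fast-to-sample level
```

This is why the driver control panel matters: the application can request bias 0, but a driver-side override shifts every lookup down the chain.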
 
OpenGL guy said:
Oh another question, is there any way for an application to force a LOD setting or does the driver always have the last say in this?
LOD bias is application selectable, but anything the application can do, the driver can override. That's why there are settings on the control panel to force AA, AF, etc.

Yeah, I just thought perhaps MS built some funky stuff into DX to give the application more control (which can't be overridden unless you, for example, disable some functionality - sorta like how you (ATi) tackle the toggling of N-patches in DX).
 
I wonder why the term "elite 3dmarker" sounds so utterly pathetic?

Jesus Bleedin Christ. Elite 3dmarker... Now I've heard 'em all!


*G*
 
Grall said:
I wonder why the term "elite 3dmarker" sounds so utterly pathetic?

Jesus Bleedin Christ. Elite 3dmarker... Now I've heard 'em all!


*G*

hehe I know, I know, I just didn't know what to call those guys ;)
 
You can view the overall feeling about LOD tweaks in the help file for 3DMark03:

"Non-comparable results and cheating
During the lifetime of the 3DMark benchmark series, eager users and even professionals have skewed the 3DMark results in order to increase the performance measurement of their systems or the hardware products they represent. These kinds of dishonest results may not be published or quoted without pointing out what kind of measures were used for increasing the results. There are a large number of known ways to "cheat" in 3DMark, but we will only list a few obvious ones to give some examples of what kind of "tweaks" produce results that are not comparable with default 3DMark results.

Requirements for a 3DMark03 default score:

  • The graphics driver settings should be set for maximum quality, since mipmap bias can otherwise be downgraded and other similar tweaks can produce less than the desired image quality.
  • If a separate slider is available for texture quality or mipmap settings, these should be set to maximum quality or 'bias=0', or the value that produces the DirectX default mipmapping.
  • No graphics card tweak software may be used when running a default score. The system should be freshly booted, and as many background programs as possible should be switched off or disabled.
  • Forced AA or higher quality texture filtering should be turned off, since this might produce a lower score than the system is capable of. All such settings should be set to 'application specific'.
  • The graphics driver may not display the content differently than how it was originally meant to be displayed / rendered. For example, textures that are not originally compressed may not be compressed by the driver.
  • Vertex shaders may not be forced to run on the CPU in a default run. There is an option in 3DMark03 to run all vertex shaders on the CPU, but the graphics driver may not alter this setting in any way.
  • In general, the graphics driver may not have any 3DMark-specific settings; it should run in as default a mode as possible. 3DMark is meant to measure general 3D gaming speed, and this is not gained in 3DMark-specific driver modes.
 
Hmm, does this mean the default 'non-AF' quality setting on FX drivers is automatically cheating, as it is not set to true trilinear?
 
Randell said:
Hmm, does this mean the default 'non-AF' quality setting on FX drivers is automatically cheating, as it is not set to true trilinear?

I guess so; in any case, a new driver with proper trilinear for the default mode (Balanced/Maximum Performance) is on its way, according to nvidia.
 
Ante P said:
Thanks Neeyik. :)
Man, I looked all over the place at www.futuremark.com and the whole time I had the answer on my HD, go figure. ;)
Duh.. Neeyik beat me to it.

Anyway, as a basic rule: open your display properties and, for all the D3D settings, click on "Default". That way you ensure that the benchmark run is as clean as possible.

Actually we have something on this on our website too. I know, it isn't complete but still something. You can read about it here. Here's a snip from it:
Recommended Testing Procedure

- Restart the computer before running the benchmark and after making any driver/hardware changes
- Set all your display properties settings to "Default" under Direct3D
- Disable any networking connections and file sharing. Any network activity is likely to affect the tested system and its performance
- Close any open applications and background programs
- Run each test more than three times. This will help to ensure that the effect that any anomalous result has on the final result is kept to a minimum
- Do not attempt to initiate any other system activity while 3DMark03 is benchmarking

During all tests, 3DMark03 will instruct the graphics adapter drivers to disable Vsync. However, some driver revisions and older pieces of hardware will ignore such instructions. Futuremark therefore recommends that users make additional checks to guarantee that Vsync is disabled.
If DX had some way to force certain/all IQ settings (AF, trilinear, LOD etc.) we would be very happy! You can of course force certain settings from the app, but in the end it is up to the driver. If the driver says "No, I won't do this", there's not much we can do about it..

Besides, AFAIK the LOD setting doesn't affect your score that much. I haven't tried it in a long time, but last time I checked I gained maybe a few 3DMarks.. Things might have changed since that, so I might be wrong about this.

*edit: It has been a long time since I ran 3DMark2001 SE, but if I recall correctly you don't get a default score when using 16-bit Z? Correct me if I'm wrong.. I will check it out myself in a moment. Just to make sure.. ;)
 
worm[Futuremark] said:
*edit: It has been a long time since I ran 3DMark2001 SE, but if I recall correctly you don't get a default score when using 16-bit Z? Correct me if I'm wrong.. I will check it out myself in a moment. Just to make sure.. ;)

If your card has a tweak to force 16-bit Z, you can always just open 3DMark and then change this setting; 3DMark will run and report that it's using 24 (or 32) bits, but in fact it'll be 16.
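The reason forced 16-bit Z trades quality for speed is precision: with fewer bits, nearby depth values collapse into the same bucket and start Z-fighting. A toy sketch, assuming simple linear quantization (real Z buffers store post-projection depth, so actual precision varies with distance):

```python
def quantize_depth(z, bits):
    """Quantize a normalized depth value in [0, 1] to an integer depth
    buffer with the given bit width, then map it back to [0, 1]."""
    steps = (1 << bits) - 1
    return round(z * steps) / steps

# Two surfaces only 0.00001 apart in normalized depth:
near, far = 0.50000, 0.50001
d16 = (quantize_depth(near, 16), quantize_depth(far, 16))
d24 = (quantize_depth(near, 24), quantize_depth(far, 24))
print(d16[0] == d16[1])  # True: 16-bit Z can no longer tell them apart
print(d24[0] == d24[1])  # False: 24-bit Z still resolves them
```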
 
worm[Futuremark] said:
If DX had some sort of way to force certain IQ settings (AF, trilinear, LOD etc.) we would be very very happy! :D You can of course force certain settings from the app, but in the end it is up to the driver. If the driver says "No, I won't do this", there's not much we can do about it.

Besides, AFAIK the LOD setting doesn't affect your score that much. I haven't tried it in a long time, but last time I checked I gained maybe a few 3DMarks.. Things might have changed since that, so I might be wrong about this.

Hmm, okay so this obviously isn't possible for AF/Tri, but anyway...
Couldn't you draw a quad whose mip levels have completely different colors, then automatically adjust LOD based on the color that results?
So it would always emulate a LOD of 0.
Probably wouldn't be worth the programming time, but it would still be a nice feature - I don't see any way to cheat this one, at least! That is, unless the drivers actually had a special path for 3DMark which would give lower LODs when passing this test...
But the company who'd dare to do that would be really lame!


Uttar
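Uttar's probe idea can be sketched in a few lines: tag each mip level with a unique color, render a quad at a known minification, and whatever level comes back reveals the driver's effective bias. Everything here is a simulation - the sampling function just stands in for the render-and-readback step, and the names are made up for illustration:

```python
import math

def sample_mip(texels_per_pixel, hidden_driver_bias, max_level=9):
    """Stand-in for rendering the probe quad and reading back one pixel.
    Returns the mip level that actually got sampled (here, each level is
    assumed to be painted a unique identifying color)."""
    level = math.log2(texels_per_pixel) + hidden_driver_bias
    return min(max(round(level), 0), max_level)

def detect_bias(texels_per_pixel, sampled_level):
    """Compare the level we read back against the level a bias-0 driver
    should have picked."""
    expected = round(math.log2(texels_per_pixel))
    return sampled_level - expected

# An honest driver samples the expected level; a "tweaked" one does not.
honest = sample_mip(4.0, 0.0)
tweaked = sample_mip(4.0, 5.0)
print(detect_bias(4.0, honest))   # 0 -> driver left LOD alone
print(detect_bias(4.0, tweaked))  # 5 -> driver sneaked in a +5 bias
```

As worm notes below in the thread, the catch is that the driver still has the final say over what gets drawn, so a driver could special-case the probe itself.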
 
Ante P said:
If your card has a tweak to force 16-bit Z, you can always just open 3DMark and then change this setting; 3DMark will run and report that it's using 24 (or 32) bits, but in fact it'll be 16.
Well, if someone is using some extra tweak to fool the software, there isn't anything we can do about it, I think. I could ask the chaps from the ORB team to see how the data differs between a h4x0r3d Z buffer and a true Z buffer..

Uttar said:
Couldn't you draw a quad whose mip levels have completely different colors, then automatically adjust LOD based on the color that results?
So it would always emulate a LOD of 0.
I don't know about this.. Sounds doable, but I'm not sure if it is a watertight technique. I'll give the 3DMark team a heads-up on this idea! Let's see what they have to say.. ;)

*edit: I just discussed this with Patric O. (producer of 3DMark) and he said that it would be possible to do some "tricks" along these lines, but they wouldn't be watertight, and they might cause some other problems. The biggest problem (again) is that the driver level has the final call on what gets drawn onscreen. But we will look into this for the next 3DMark. :)
 
Although LOD tweaks and special drivers, lovingly handcrafted just for 3DMark03 by nvidia ;), are used by the "elite benchmarkers", they are mainly the people who have stumped up the $$ to be Pro Users, as compared to the person who runs it occasionally.

Therefore it may be financially unwise to throw away too many of the options which make it "fun" for those guys to squeeze that last single point out of the benchmark.
 