Futuremark: 3DMark06

Anyone willing to place bets that all of a sudden we'll see 3DMark06 re-adopted at various sites back into those sites' benchmark toolsets? :)
 
Apple740 said:
Nick, is FP blending/filtering being used for HDR, or only FP blending? In the case of blending/filtering, what is the impact of filtering on the R520 (since that card doesn't support FP filtering in hardware)?
Duuuh! Sorry, I was going to post that info too.

We require FP16 texture & FP16 blending support (and SM3.0) for the HDR/SM3.0 tests. FP16 filtering is supported, but not required. For hardware without FP16 filtering the tests fall back to a very efficient shader emulation.
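For anyone wondering what such a fallback involves: bilinear filtering can be emulated by taking four unfiltered point samples and lerping between them in the shader. Here is a minimal sketch of the idea in Python (illustrative only; the function and layout are ours, not Futuremark's actual shader):

```python
def emulated_bilinear(tex, u, v):
    """Filter a texture 'by hand': four point samples plus lerps --
    the kind of work a shader fallback does when the GPU cannot
    filter FP16 textures in hardware.  Illustrative sketch only,
    not Futuremark's shader code."""
    h, w = len(tex), len(tex[0])
    # Map normalized UV to texel space (texel centers at +0.5)
    x, y = u * w - 0.5, v * h - 0.5
    x0, y0 = int(x // 1), int(y // 1)
    fx, fy = x - x0, y - y0

    def fetch(ix, iy):  # point sample, clamped to the texture edge
        return tex[max(0, min(h - 1, iy))][max(0, min(w - 1, ix))]

    t00, t10 = fetch(x0, y0), fetch(x0 + 1, y0)
    t01, t11 = fetch(x0, y0 + 1), fetch(x0 + 1, y0 + 1)
    # Two horizontal lerps, then one vertical lerp
    top = t00 * (1 - fx) + t10 * fx
    bot = t01 * (1 - fx) + t11 * fx
    return top * (1 - fy) + bot * fy

tex = [[0.0, 1.0],
       [2.0, 3.0]]
print(emulated_bilinear(tex, 0.5, 0.5))  # 1.5, the average of all four texels
```

Four fetches plus three lerps per tap is why "very efficient" still costs more than a single hardware-filtered fetch.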

I'll check in later again! :)

Cheers,

Nick
 
Nick[FM] said:
The N/A score for non-supported AA (or AA quality levels) is, from our point of view, better than either letting HW without AA support run all tests without AA and get a score based on that (non-comparable), or run 2 of the 4 tests with AA (non-comparable) and get a score based on that.

Then why is it "better" for SM 3.0 cards that don't support floating point blending (like the GeForce 6200) to get an SM 2.0-based score, rather than the same N/A that you describe above?

I can appreciate that you just can't get "apples to apples" scores...but I do not understand why the situations are treated inconsistently.
 
jb said:
Anyone willing to place bets that all of a sudden we'll see 3DMark06 re-adopted at various sites back into those sites' benchmark toolsets? :)
Well, in fairness to "those sites", now 3DMark can't be said to not represent REAL WORLD GAMEPLAY, since it has a game in it and all. ;)
 
Joe DeFuria said:
Then why is it "better" for SM 3.0 cards that don't support floating point blending (like the GeForce 6200) to get an SM 2.0-based score, rather than the same N/A that you describe above?

I can appreciate that you just can't get "apples to apples" scores...but I do not understand why the situations are treated inconsistently.
Since the 6200 is not capable of running the HDR/SM3.0 tests even with default settings, it still gets a 3DMark score. If any of the benchmark settings (or values) disables any tests, we won't output a 3DMark score. It is not an IHV-specific thing; it applies to all IHVs and all hardware.
 
Finally got a nice download speed from that http://www.webnallen.se place.
My first thoughts based on reviews.
well... kinda disappointed in the choices they've made.
Oh well.
If the R580 cleans up in 3dmark06 futuremark might be forgiven ;)
 
Nick[FM] said:
Since the 6200 is not capable of running the HDR/SM3.0 tests even with default settings, it still gets a 3DMark score.

I understand. The question is, what's the rationale? Given that, by the rationale used in the AA case, the 3DMark score is "not comparable" to other cards running AA (2 tests vs. 4 tests, for example), why are these scores considered "comparable"?

If any of the benchmark settings (or values) disables any tests, we won't output a 3DMark score.

The default values / settings disable 2 tests in this GeForce 6200 case.

Again, I see the rationale for either case...I don't see why the rationale is being changed from one situation to another. I would prefer either

1) No 3D mark score in all cases where all tests can't be run. (Yes, this means no 3D mark score for SM 2.0 cards...but that's what 3D Mark 05 is for, right?)

or

2) No matter what the settings, run the tests that are possible (2 or 4), and base the 3D Mark score off of those.

I don't like seeing both 1 and 2 being applied in different situations.
 
"Games" without player actions and with no (or predetermined) NPC/mob AI aren't a true measure of system performance; real play has been removed from the test. In the case of demo benches, your CPU will do skinning, shadows and tangent-space transformations, but not NPC AI or physics modeling of the world. This is why in actual play you often see slowdowns in games when many active creatures are around you.

I am glad they moved up to 1280x. I'll continue to base my purchases on reading as many reviews as I can of a given card and looking at benches done in gameplay, or in games similar to the ones I play. That is, if I play at 1600x with 4-6xAA/8-16xAF, that is what I pay most attention to. I also look closely at the filtering options available.

For me, winning is having all the IQ options I wish and playable frame rates in the games I enjoy. So if a fellow is getting 75 frames in a game and I am getting 68 with ATI's HQ AF, it's all fine with me.
 
Bought it.

I gave up on trying to find a decent server, reflected on how much I would probably use the program to debug/diagnose my Windows box, and bought it.
It will probably run like molasses on my X800XL, and I'll suffer the indignity of not being able to run two of the tests. But what the hell.

Here's to you guys, Nicklas and company! Whatever else may be said, you do push the envelope.
 
Joe DeFuria said:
In other words, a currently ATI-only feature (FETCH4) is only used if the card also supports a currently non-ATI feature (DF24)?

24 bit depth stencil textures and Fetch4 are supported on X1300 and X1600 parts.
 
radeonic2 said:
My first thoughts based on reviews.
well... kinda disappointed in the choices they've made.
And which choices are the ones you refer to? It'd be good to know.

Joe DeFuria said:
I understand. The question is, what's the rationale? Given that, by the rationale used in the AA case, the 3DMark score is "not comparable" to other cards running AA (2 tests vs. 4 tests, for example), why are these scores considered "comparable"?
In order to avoid any wrong use of the scores (AA scores in this case), this was the best solution. As said, this applies to all cards where one setting (or value) disables any of the tests. We require that whatever tests are available must also be available with any settings; otherwise no 3DMark score is output.

Joe DeFuria said:
The default values / settings disable 2 tests in this GeForce 6200 case.
Depends on how you look at things, really. In this case the HDR/SM3.0 tests require something the 6200 is not capable of even with default settings. AA is an optional thing, not on by default.

Joe DeFuria said:
1) No 3D mark score in all cases where all tests can't be run. (Yes, this means no 3D mark score for SM 2.0 cards...but that's what 3D Mark 05 is for, right?)
We wanted SM2.0 hardware to get a score in 3DMark06 as well. 3DMark05 is a great SM2.0 benchmark, but 3DMark06 is even better.

Joe DeFuria said:
2) No matter what the settings, run the tests that are possible (2 or 4), and base the 3D Mark score off of those.
Not sure about this one... Which would that refer to: SM2.0, SM3.0, or SM3.0 with FP16 blending support? :???:

Cheers,

Nick
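Piecing Nick's answers together, the scoring rule seems to be: tests the hardware cannot run at default settings are excluded from the score, but if a chosen setting disables a test the card could otherwise run, no overall score is produced. A hypothetical sketch of that gate in Python (our names, and a simple average standing in for the real weighted score formula):

```python
ALL_TESTS = ["SM2_1", "SM2_2", "HDR_SM3_1", "HDR_SM3_2"]

def overall_score(supported, enabled_by_settings, results):
    """Sketch of the scoring rule as described in this thread (our
    reading, not Futuremark's code): tests the hardware cannot run at
    default settings are excluded up front, but if the chosen settings
    disable any test the card *could* run, no overall score comes out."""
    runnable = [t for t in ALL_TESTS if t in supported]
    still_runnable = [t for t in runnable if t in enabled_by_settings]
    if still_runnable != runnable:
        return None  # e.g. AA on hardware that can't do AA in all tests
    # Simple average as a stand-in for the actual score formula
    return sum(results[t] for t in runnable) / len(runnable)

# GeForce 6200-style card: no FP16 blending, so the HDR/SM3.0 tests are
# excluded, but at default settings it still gets an SM2.0-based score.
results = {"SM2_1": 100, "SM2_2": 120}
print(overall_score(["SM2_1", "SM2_2"], ALL_TESTS, results))  # 110.0

# Same card with a setting that disables a runnable test -> no score (N/A).
print(overall_score(["SM2_1", "SM2_2"], ["SM2_1"], results))  # None
```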
 
Entropy said:
Here's to you guys, Nicklas and company! Whatever else may be said, you do push the envelope.
Thanks! :D I take that as a compliment.

Have you tried all our official mirrors yet? I know most of them are getting hammered, but some seem to work pretty ok (Guru3D and MajorGeeks).

Cheers,

Nick
 
So: no Fetch4 support, shader emulation for FP filtering, no real use of dynamic branching, and forced 24-bit DST on the ATI side; yet on the NVIDIA side there is support for PCF, and instead of using a workaround for HDR+AA (like the one forced on ATI), NVIDIA simply gets no score.

Seems quite one-sided, IMO.
 
Nick[FM] said:
Depends on how you look at things, really. In this case the HDR/SM3.0 tests require something the 6200 is not capable of even with default settings. AA is an optional thing, not on by default.

In other words, you (FM) draw the line at default vs. optional settings. We'll just have to agree to disagree then.

AA is an "optional thing", but what you've chosen as "default" features are nothing more than your own subjective judgements anyway. When you turn AA on as an option, some hardware runs all tests, and some doesn't.

We wanted SM2.0 hardware to get a score in 3DMark06 as well. 3DMark05 is a great SM2.0 benchmark, but 3DMark06 is even better.

I agree that SM 2.0 hardware should be given a score.

Not sure about this one... Which would that refer to: SM2.0, SM3.0, or SM3.0 with FP16 blending support? :???:

If a card can run SM 2.0 tests with AA, but not the SM3 tests with AA, then it should get a score based on SM 2.0 only. The same way a SM 2.0 card will get a score. It's no more or less "comparable" than any other SM 2.0 to SM 3.0 score without AA.

Thanks for fielding my questions, btw! ;)
 
ANova said:
So: no Fetch4 support, shader emulation for FP filtering, no real use of dynamic branching, and forced 24-bit DST on the ATI side; yet on the NVIDIA side there is support for PCF, and instead of using a workaround for HDR+AA (like the one forced on ATI), NVIDIA simply gets no score.

Seems quite one-sided, IMO.
3DMark06 does support FETCH4, uses Dynamic Flow Control, and uses 24-bit "DST" (I prefer the wording "Hardware Shadow Mapping") for any hardware that has hardware "DST" support.

I don't see it as one sided.
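As background on what those shadow-mapping paths accelerate: hardware PCF and FETCH4 both target the same 2x2 depth-compare-and-average at the core of shadow-map filtering. PCF hardware returns the four compares pre-averaged, while FETCH4 returns the four raw depths in one fetch for the shader to compare. A hypothetical sketch of that filter in Python (our names and layout, not 3DMark06 code):

```python
def pcf_2x2(shadow_map, x, y, depth):
    """2x2 percentage-closer filtering: compare the pixel's depth
    against four neighbouring shadow-map texels and average the
    pass/fail results.  Hardware PCF returns this average directly;
    FETCH4 returns the four raw depths so the shader can do the
    compares itself.  Hypothetical sketch, not 3DMark06's shader."""
    taps = [(0, 0), (1, 0), (0, 1), (1, 1)]
    lit = 0.0
    for dx, dy in taps:
        stored = shadow_map[y + dy][x + dx]      # raw depth (FETCH4 path)
        lit += 1.0 if depth <= stored else 0.0   # the compare PCF hardware does
    return lit / len(taps)  # soft shadow factor in [0, 1]

# Pixel at depth 0.5 against a map where two of the four taps occlude it:
shadow_map = [[0.4, 0.6],
              [0.6, 0.4]]
print(pcf_2x2(shadow_map, 0, 0, 0.5))  # 0.5 -> half-shadowed edge
```

Either way the result is a fractional shadow term instead of a hard in/out bit, which is what softens shadow edges.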
 
ANova said:
So: no Fetch4 support, shader emulation for FP filtering, no real use of dynamic branching, and forced 24-bit DST on the ATI side; yet on the NVIDIA side there is support for PCF, and instead of using a workaround for HDR+AA (like the one forced on ATI), NVIDIA simply gets no score.

Seems quite one-sided, IMO.

To be fair, fetch4 is supported.

However, I'm starting to agree...seems like quite a few inconsistent decisions have been made.

I'm still scratching my head as to why there's a feature test for vertex texture fetch, but not for dynamic flow control in pixel shaders. If there's one break-out feature for PS 3.0 vs. PS 2.0 it's DFC...and both vendors support it to boot.
 
Nick[FM] said:
3DMark06 does support FETCH4, uses Dynamic Flow Control, and uses 24-bit "DST" (I prefer the wording "Hardware Shadow Mapping") for any hardware that has hardware "DST" support.

I don't see it as one sided.

I'm probably a bit slow today...
but does "3DMark06 does support FETCH4" mean that it is enabled and working on the X1300 and X1600?

And also:
is "uses Dynamic Flow Control" referring to vertex shaders or pixel shaders? If it is used in a PS, how much does having efficient Dynamic Flow Control logic help there?
 
NocturnDragon said:
I'm probably a bit slow today...
but does "3DMark06 does support FETCH4" mean that it is enabled and working on the X1300 and X1600?
Well the FETCH4 support is there, so any card with support for it will use it. You do the math. :smile:

NocturnDragon said:
And also:
is "uses Dynamic Flow Control" referring to vertex shaders or pixel shaders? If it is used in a PS, how much does having efficient Dynamic Flow Control logic help there?
In the Pixel Shaders. I don't have any numbers here, but it also depends on the graphics card & drivers, really... Just wanted to clear up that we indeed do have DFC in 3DMark06.
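To illustrate why PS3.0 dynamic flow control matters, and why the gain varies with the card and drivers: a dynamic branch lets, say, fully shadowed pixels skip the expensive lighting path entirely, whereas SM2.0-style hardware effectively evaluates both sides and selects a result. A hypothetical Python sketch of the idea (not 3DMark06's shader; the names and cost model are ours):

```python
def expensive_lighting(base):
    # Stand-in for the costly per-light specular math a real shader runs
    return min(1.0, base * 1.8)

def shade_pixel(in_shadow, base):
    # Dynamic branch: fully shadowed pixels return early and never pay
    # for expensive_lighting().  Without PS3.0 dynamic flow control,
    # both paths would run and the result would be selected afterwards.
    # The real win depends on how coherently neighbouring pixels take
    # the same side of the branch, hence "depends on the card & drivers".
    if in_shadow:
        return base * 0.2  # cheap ambient-only path
    return expensive_lighting(base)

print(shade_pixel(True, 0.5))   # 0.1 -- lighting math skipped
print(shade_pixel(False, 0.5))  # 0.9
```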
 