Futuremark: 3DMark06

trinibwoy said:
Whoa there pardner - I think you're quite wrong. 3dmark comparisons in the vast, vast majority of cases are at default settings.

Whoa there....read what I wrote.

I said that people use the CARDS with AA enabled (you know, when they use their cards to play games). Not that they run the benchmark with AA on the majority of the time.

Of course, whatever default FM decides on, that's going to be the "most run" for that benchmark.
 
Chalnoth said:
Much better, if you ask me, to just compare the SM2 AA scores and be done with it.

Look, I don't really have a problem with it one way or the other....there are pros and cons to each approach. Either just manually compare the SM2 AA scores (and don't produce an overall score), or produce an overall score using their formulas to get a "pseudo-comparison." Just be consistent about it! (Have I not used the word "consistent" enough? ;) )
 
Joe DeFuria said:
Whoa there....read what I wrote.

Ah, now I get your meaning. But your wording above isn't exactly clear in that respect. And with regard to the comparison to games, 3DMark06 does exactly what the game does with Nvidia cards and HDR+AA - it doesn't run at all!! ;)
 
DemoCoder said:
Well, if they changed that (report no score for SM3.0 parts without blending) would that satisfy you? If they made it consistent, by still not reporting a complete score AA on an NV4x, would you be happy?

Yes, I would. Am I not clear on that?
 
Bouncing Zabaglione Bros. said:
It's not that at all. There are just so many weird discrepancies and choices, and they seem to favour Nvidia. Come on, a "forward looking test" with no SM3.0 branching, no parallax mapping, no AA/AF? Even places where Nvidia cards get no score rather than a bad score, where the exact opposite happens for ATI cards? Nvidia cards get an advantage from their specific non-DX features, but ATI cards don't?
Some valid points, but how does the absence of parallax mapping favour NVidia? Sometimes I get the impression that some people look at the branching advantage R520 has over G70 and from there extrapolate an advantage in arithmetic- and texture-heavy shaders that just isn't there. Then they expect R520 to perform better than G70 in "PS3.0", and if that expectation isn't met the benchmark must be crap. While of course that is a possibility (and I won't comment on 3DMark06 because I haven't seen it yet), there's also the point that G70 does indeed do several things faster than R520.

Doesn't ATI get a better score from their specific non-DX feature (fetch4)?

NocturnDragon said:
Yes, but it has not been confirmed HOW it's used, or how much impact it has,
and the benchmarks show it has no impact whatsoever.
If it were used the way future games will use it (a year from now?), nvidia cards would crawl!
You're jumping to a conclusion and then expecting the tests to support it. Not exactly the scientific method.

Bear in mind that Fetch4 is available for every single-channel texture format, while in the test it is only used in the PS2.0 tests... why not in the other ones? (maybe because nvidia PCF wouldn't work?)
Because it doesn't do any good for sparsely sampled filter kernels, especially as you cannot efficiently index vector components.
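
To make that point concrete, here's a quick CPU-side sketch of my own (toy Python, nothing from 3DMark06; the texture, offsets and function names are all made up). With a dense PCF kernel a handful of 2x2 gathers cover every tap, but with a sparsely sampled kernel each tap lands in its own 2x2 block, and since you can't dynamically index the gathered components in the shader you're back to roughly one fetch per tap anyway:

import numpy as np

shadow_map = np.random.rand(64, 64).astype(np.float32)  # single-channel depth texture

def fetch4(tex, x, y):
    # one "gather": the 2x2 block of single-channel texels starting at (x, y)
    return tex[y:y+2, x:x+2]

def pcf_dense_4x4(tex, x, y, ref):
    # dense 4x4 comparison kernel: 4 gathers cover all 16 taps
    fetches, passed = 0, 0
    for oy in (0, 2):
        for ox in (0, 2):
            quad = fetch4(tex, x + ox, y + oy)
            fetches += 1
            passed += int((quad > ref).sum())
    return passed / 16.0, fetches

def pcf_sparse_8tap(tex, x, y, ref, offsets):
    # sparse (e.g. rotated-disc) kernel: every tap sits in a different 2x2
    # block, so without per-component indexing it's still one fetch per tap
    fetches, passed = 0, 0
    for ox, oy in offsets:
        passed += int(tex[y + oy, x + ox] > ref)
        fetches += 1
    return passed / len(offsets), fetches

offsets = [(-3, -2), (2, -3), (-1, -4), (4, 1), (-4, 3), (1, 4), (3, -1), (-2, 2)]
print(pcf_dense_4x4(shadow_map, 10, 10, 0.5)[1])            # 4 gathers for 16 taps
print(pcf_sparse_8tap(shadow_map, 10, 10, 0.5, offsets)[1]) # 8 fetches for 8 taps

So Fetch4 buys you a lot for tightly packed kernels and next to nothing for scattered ones.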


For those who are interested in how "CPU-limited" 3DMark06 is: use a null renderer (e.g. DXTweaker, 3D-Analyze).
 
dizietsma said:
I'd agree with this Pete, except that gamers do have the option of raising the screen resolution instead (I am assuming they have a good enough monitor). Indeed, Futuremark themselves have put the default screen resolution up and yet again left out AA in the standard test.
And I'm assuming that all current cards can just as easily, if not more so, use AA rather than bump the res. Seems to me FM settled on 12x10 and not 12x9 or even 720p b/c the first matches the most common LCD res. LCD users would likely prefer AA to higher res, just like CRT users might prefer 2xAA rather than a notch higher res with no AA.

The issue here is that people think the scoring is not fair for non-standard tests.
I don't know, I think HDR is becoming more standard, so why not the option to use AA with it? And if one can't, why not let that be reflected in the score--a score, rather than N/A? I'm going to completely read FM's whitepaper and reviewer's guide before I mouth off anymore, tho, to be fair to Nick & Co. (better late than never).

To me this is a bit strange, because for the last few months/years this forum has tended to pour scorn on Futuremark's bench, its scoring, and the way the IHVs use the scores to sell cards, and to suggest that anybody who buys a card based on this is tending towards being a bit daft. But now it seems this is of the utmost importance.

| I put on my green tinged pro futuremark hat |

And why is this? Because ATI is not favoured.
Well, that's one way to look at it, and quite a few ppl consider B3D ATI's last bastion of hope/hype. But you could argue that 3DM03 and 05 were partially tilted toward ATI. NV cheated on the first with the FX vs. ATI's 9-series, but they didn't need to do so (detectably) with the 6-series, whose shader power outgunned the X-series. And tho 05 had the DST and PCF brouhaha, its vertex setup limitation seemed to help ATI in relation to its weaker pixel shaders (see X1600 up with the 256-bit big boys, but falling behind on most games).

Remember that poor bloke from Anandtech who came over here and went grey-haired before he had to leave, saying he had better things to do?
Well, to be fair, I haven't seen Derek in Ars' or even AT's forums in quite some time. :) Unfortunately, ppl get vocal everywhere, and ultimately his time is better spent reviewing than arguing (see Brent). That's not to say he can't just read the forums, disregard most posts, and maybe pick up a tip or two.

N00b said:
Is there any site that has 3DMark06 feature test scores for the 1800XT and the 7800 GTX (512)? Or a comparison with hardware shadow mapping disabled?
Damien took care of the former for you. Check out Hanners' EB article for the latter. (I referenced his #s a page or three back: a 25% and 17% hit on a GF6 and GF7, respectively).

Bouncing Zabaglione Bros. said:
no AA/AF?
Wait, 06 goes back to no AF by default? Didn't 05 and maybe even 03 use 4xAF? Has FM given a reason for this (e.g., too many texture accesses otherwise, most games start with 1xAF, etc.)?
 
Xmas said:
Doesn't ATI get a better score from their specific non-DX feature (fetch4)?
From what I'm gathering from this thread, Fetch4 is not currently used on most ATI hardware (notably the R520) because most ATI hardware doesn't support Fetch4 at the precision that Futuremark is asking (24 bit).

Also bear in mind that none of the SM3 benchmarks use either PCF or Fetch4. And in the SM2 benchmarks where some ATI hardware can use Fetch4, nVidia hardware is always using PCF (since nVidia supports PCF at the precision 3DMark is asking).
 
DST24 was implemented at the same time as the boards that implemented Fetch4 - i.e. if the ATI hardware supports Fetch4, that hardware will also support 24 bit depth texture formats.
 
Chalnoth said:
From what I'm gathering from this thread, Fetch4 is not currently used on most ATI hardware (notably the R520) because most ATI hardware doesn't support Fetch4 at the precision that Futuremark is asking (24 bit).

Also bear in mind that none of the SM3 benchmarks use either PCF or Fetch4. And in the SM2 benchmarks where some ATI hardware can use Fetch4, nVidia hardware is always using PCF (since nVidia supports PCF at the precision 3DMark is asking).

Indeed, and this is where the problems lie. It's simply not an accurate benchmark because the cards are running differently: Nvidia is using the PCF optimization in 24 bit while ATI is running without any optimizations in 32 bit. Now, you have to give Futuremark some credit: the X1800 simply does not support D24X8 nor Fetch4 for some reason (I guess timing), so there's little Futuremark could have done, but I would still be interested in the results of both cards running 16 bit DST without any optimizations. Either that, or both cards running with R32F and all possible optimizations.
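
For anyone not following the DST/PCF vs R32F distinction, here's a little reference sketch I knocked up (plain Python, not 3DMark06's shader code) of what the two paths compute for a single bilinear footprint. The results are the same; the difference is that the DST24+PCF path gets the four compares and the blend from the sampler hardware essentially for free (at 24-bit depth precision), while the R32F path spends shader instructions on every fetch, compare and lerp (at 32-bit precision):

import numpy as np

depth_tex = np.random.rand(32, 32).astype(np.float32)  # stand-in shadow map

def hw_pcf(tex, x, y, fx, fy, ref):
    # what DST+PCF hardware does in the sampler: compare each of the 4 texels
    # against the reference depth, then bilinearly blend the 0/1 results
    q = tex[y:y+2, x:x+2]
    tests = (q > ref).astype(np.float32)
    w = np.array([[(1-fx)*(1-fy), fx*(1-fy)],
                  [(1-fx)*fy,     fx*fy]])
    return float((tests * w).sum())

def manual_r32f(tex, x, y, fx, fy, ref):
    # the R32F path: four explicit fetches, four compares and the bilinear
    # blend all burn shader instructions
    q = tex[y:y+2, x:x+2]
    tests = (q > ref).astype(np.float32)
    top = tests[0, 0] * (1 - fx) + tests[0, 1] * fx
    bot = tests[1, 0] * (1 - fx) + tests[1, 1] * fx
    return float(top * (1 - fy) + bot * fy)

print(hw_pcf(depth_tex, 5, 7, 0.25, 0.6, 0.5))
print(manual_r32f(depth_tex, 5, 7, 0.25, 0.6, 0.5))

Which is exactly why comparing a card on the hardware path against a card on the manual path muddies the numbers.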
 
Xmas said:
For those who are interested in how "CPU-limited" 3DMark06 is: use a null renderer (e.g. DXTweaker, 3D-Analyze).
I've only tried a P4 3GHz system with a 6600 GT so far, but the SM2.0 tests seem remarkably CPU-limited. The HDR tests are a little better, with the first one being much less CPU bound than the second. Odd...
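
For what it's worth, here's roughly how I'd put a number on "CPU-limited" from a null-renderer run (my own back-of-envelope, not anything official from FM; the figures below are made up purely for illustration):

def cpu_bound_fraction(fps_normal, fps_null_renderer):
    # with the draw calls discarded, the frame time that's left is basically
    # CPU work, so its share of the full frame time says how CPU-bound you are
    t_full = 1.0 / fps_normal
    t_cpu = 1.0 / fps_null_renderer
    return t_cpu / t_full

print(cpu_bound_fraction(20.0, 22.0))  # ~0.91 -> heavily CPU-limited
print(cpu_bound_fraction(20.0, 80.0))  # 0.25 -> mostly GPU-limited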
 
mrcorbo said:
Yeah, I find it pretty funny how similar the tone here is now to that when NV30 was showing so poorly in '03. Right down to the unshakable belief that with future games their favored architecture is going to show its "real" performance.

I'm not comparing the technology of NV30 to R520, because R520 is a much stronger design than NV30 was. But it sure feels like I've gone into the way-back machine, back to the 3DMark 2003 days.

Now WRT the complaints about Nvidia's parts getting no score when AA is enabled: what's wrong with just comparing the SM2.0 scores and including the SM3.0 results with either a 0 for the Nvidia cards, or just not including them in the results because they don't support it? I mean, the overall 3DMark score in '06 is a pretty poor way to compare different cards anyway IMO. With the CPU score included it has become more of a platform result, and any difference between 2 cards using the same CPU is actually going to be lessened in the overall 3DMark score because you are getting the exact same score for the CPU.

Basically you are complaining about the inability to do something that is inadvisable to do in the first place.

Only back then NV30 performed in games the same as it did in 3DMark. Which is not the case in this situation.
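
Just to put some (entirely made-up) numbers on the point in the quote above about the CPU score: fold a shared CPU result into the total with any kind of weighted mean and the gap between two cards shrinks. The weights and the harmonic blend below are my own toy example, not Futuremark's published formula:

def overall(graphics, cpu, w_gfx=0.85, w_cpu=0.15):
    # weighted harmonic mean as a stand-in for however the sub-scores get combined
    return 1.0 / (w_gfx / graphics + w_cpu / cpu)

card_a, card_b, shared_cpu = 4000.0, 5000.0, 1000.0
print(card_b / card_a)                                            # 1.25 -> 25% apart on graphics alone
print(overall(card_b, shared_cpu) / overall(card_a, shared_cpu))  # ~1.13 -> the gap shrinks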
 
For anybody who is interested, I've collated shader dumps from all of the tests (bar the batch tests) from 3DMark06:

http://www.neeyik.info/fmark/06shaders.rar

Each folder in the rar file contains the vertex and pixel shaders from the respective tests, as caught by 3DAnalyze. The SM2.0 and HDR folders contain quite a lot of shaders because in the case of the SM2.0 tests, I ran them twice: default and then without HW DST. The same applies for the HDR tests but this time default and then with software FP filtering.

A quick glance at some of the longer pixel shaders in the HDR tests shows that a couple of them are using an if-not-equal...else branch with a bucketload of instructions that can be skipped; the shader particle test is also using flow control in the vertex shader that performs the vertex texturing. Oh, and the Perlin noise test is one hell of a PS!
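
For those who haven't poked at the dumps, the pattern described above looks something like this (a rough Python analogue of my own, not the actual HLSL): a per-pixel branch where the expensive side can be skipped outright on SM3.0 hardware, whereas a 2.0-style compile would effectively pay for both sides and select a result.

def expensive_lighting(base_color, vis):
    # stand-in for the "bucket load" of ALU/texture instructions
    c = base_color
    for _ in range(50):
        c = c * 0.99 + 0.01 * vis
    return c

def shade_pixel(light_visibility, base_color):
    if light_visibility != 0.0:           # the if-not-equal...else in the dump
        return expensive_lighting(base_color, light_visibility)
    return base_color * 0.1               # cheap path: all the work above is skipped

print(shade_pixel(0.0, 1.0))
print(shade_pixel(0.8, 1.0))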
 
Is it just me or is the Perlin noise one just a (very) long < 512 instruction shader that'd compile as a pixelshader 2.0 test?
 
DemoCoder said:
Well, if they changed that (report no score for SM3.0 parts without blending) would that satisfy you? If they made it consistent, by still not reporting a complete score AA on an NV4x, would you be happy?

Actually, Joe has a point.

Think about it. Every review site on the planet, when reviewing graphics cards, will benchmark the cards with no AA+AF as well as with AA+AF. Check any site for the last 4 years and you'll see it's the standard.

FutureMark have been in this industry for quite a while, and for them to still not include AA+AF results as a final score is worrisome. Of course, you can do so with the advanced and professional editions, but the basic edition can't.

Of course, since the program is used to analyse a range of cards, this is most probably not convenient atm.

Back to all that's been happening at hand: well, Futuremark could maybe have done more; they don't think so, and it's their prerogative as the developer.
 
rwolf said:
Only back then NV30 performed in games the same as it did in 3DMark. Which is not the case in this situation.

Actually, IIRC, at the time 3DMark 2003 was released, the 5800 Ultra was even with or performed better than the 9700 Pro in most games w/o AA/AF. I think NVidia had convinced developers as well as consumers to wait to get serious about DX9 until NV30 was released, because it was going to be such a great product. So, at the time it came out, 3DMark '03 was really the only indication of how bad a DX9 implementation NV30 really was. It wasn't until later on that the predictions made by '03 were validated by actual games.
 
Unknown Soldier said:
Actually, Joe has a point.

Think about it. Every review site on the planet, when reviewing graphics cards, will benchmark the cards with no AA+AF as well as with AA+AF. Check any site for the last 4 years and you'll see it's the standard.

FutureMark have been in this industry for quite a while, and for them to still not include AA+AF results as a final score is worrisome. Of course, you can do so with the advanced and professional editions, but the basic edition can't.
Well, again, it'd be ridiculous to do comparisons in this way with the full score. You'd want to break down the score and only compare the SM2 FSAA results between the two IHVs. But that's what you can do now.
 
mrcorbo said:
Actually, IIRC, at the time 3DMark 2003 was released, the 5800 Ultra was even with or performed better than the 9700 Pro in most games w/o AA/AF. I think NVidia had convinced developers as well as consumers to wait to get serious about DX9 until NV30 was released, because it was going to be such a great product. So, at the time it came out, 3DMark '03 was really the only indication of how bad a DX9 implementation NV30 really was. It wasn't until later on that the predictions made by '03 were validated by actual games.

I cannot be the only one who remembers the large-scale cheating done in titles that didn't even touch the DX9 weak point of the NV30.
 
Rys said:
Is it just me or is the Perlin noise one just a (very) long < 512 instruction shader that'd compile as a pixelshader 2.0 test?

How do you come to this conclusion? The Perlin noise shader is a 3.0 shader.
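
For anyone wondering why that shader is such a monster, here's a bare-bones CPU reference of Perlin-style gradient noise (my own code, not the 3DMark06 shader). Even a single octave costs a pile of hashes, dot products and lerps per pixel, and the test presumably sums several octaves, which is how the instruction count climbs towards the figure Rys mentions:

import math

def _hash(x, y):
    # cheap integer hash standing in for the shader's permutation lookups
    h = (x * 374761393 + y * 668265263) & 0xFFFFFFFF
    h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    return h ^ (h >> 16)

def _grad(ix, iy, dx, dy):
    # pick one of 8 gradient directions from the hash, dot it with (dx, dy)
    gx, gy = [(1,1),(-1,1),(1,-1),(-1,-1),(1,0),(-1,0),(0,1),(0,-1)][_hash(ix, iy) & 7]
    return gx * dx + gy * dy

def _fade(t):
    return t * t * t * (t * (t * 6 - 15) + 10)  # Perlin's quintic smoothstep

def perlin(x, y):
    x0, y0 = math.floor(x), math.floor(y)
    dx, dy = x - x0, y - y0
    u, v = _fade(dx), _fade(dy)
    n00 = _grad(x0,     y0,     dx,     dy)
    n10 = _grad(x0 + 1, y0,     dx - 1, dy)
    n01 = _grad(x0,     y0 + 1, dx,     dy - 1)
    n11 = _grad(x0 + 1, y0 + 1, dx - 1, dy - 1)
    nx0 = n00 + u * (n10 - n00)
    nx1 = n01 + u * (n11 - n01)
    return nx0 + v * (nx1 - nx0)

def fbm(x, y, octaves=6):
    # several octaves multiply the already long per-sample work
    total, amp, freq = 0.0, 0.5, 1.0
    for _ in range(octaves):
        total += amp * perlin(x * freq, y * freq)
        amp *= 0.5
        freq *= 2.0
    return total

print(fbm(3.7, 1.2))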
 