Futuremark: 3DMark06

madmartyau said:
Just finished downloading from Techconnect

3DMark Score: 4225
SM 2.0 Score: 1706
SM 3.0 Score: 1704
CPU Score : 1609

Why don't you post a compare URL...?

Futuremark killed the ones I posted already :(
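
For anyone wondering how those sub-scores roll up into the 4225 total: if I'm reading Futuremark's whitepaper right, the overall number is a weighted harmonic mean of a graphics score (the mean of the SM2.0 and HDR/SM3.0 scores) and the CPU score. A quick sketch, treating the 1.7/0.3 weights and the 2.5 scale factor as assumptions from that whitepaper:

```python
def overall_score(sm2, sm3_hdr, cpu):
    """Overall 3DMark06 score from the SM2.0, HDR/SM3.0 and CPU sub-scores."""
    # Graphics score: simple mean of the two graphics test scores.
    gs = (sm2 + sm3_hdr) / 2.0
    # Weighted harmonic mean of graphics and CPU, scaled by 2.5.
    return 2.5 / ((1.7 / gs + 0.3 / cpu) / 2.0)

# Plugging in the sub-scores posted above:
print(round(overall_score(1706, 1704, 1609)))  # 4225, matching the posted total
```

It also explains why skipping the CPU tests means no overall score: the formula divides by the CPU result, so there's nothing sensible to report without it.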
 
Reverend said:
And the dumbass journalist would reply with "Well, what's next, 5 different scores, like for GPU, CPU, memory, physics and GPU+CPU+etc=system in an all-encompassing 3DMark package?"

And Futuremark would respond "No, you dumbass journalists can't handle 1 number properly...we're taking a big risk with two...wouldn't dare confuse you poor sods with more..."
 
Unknown Soldier said:
Ok ... if the X1800XT is supposed to do "SM3.0 Right" why does it lose to the 7800GTX 256MB ?

http://www.pcper.com/images/reviews/199/3dmark06-sm3.gif

The single 7800GTX 512MB walks it.

US


They are using an overclocked GTX in that review. And it's interesting to note that the very last sentence of that review tells you to check their engine for prices on the GTX...

While I am sure this doesn't change the results that much, it does make you wonder...
 
That is odd.

I've also seen scores where an X1600XT walks all over an X850XT.
Maybe Futuremark released this prematurely?

Surprised AA is still not on by default - especially in 2006.

Anyone seen any sites with that data? It's of more interest to me.
 
tEd said:
I guess there's a lot of potential not being used on the X1300/X1600 and X1900, as I assume 3DMark06 won't use the 24-bit DST and Fetch4 functionality for the shadow-map rendering.

Also, if they use FP16 filtering for HDR rendering, NVIDIA will have an advantage, as ATI doesn't support FP16 texture filtering :devilish: and it has to be done in the shader.
3DMark06 has multivendor "DST" and "PCF" support. ;)
 
Bunch o' numbers here http://www.pcper.com/article.php?aid=199&type=overview

ATI pretty thoroughly trounced, particularly the X1800 XL.

Couple of questions: 1) Have we figured out the performance constraints yet with this one? Can we finally retire "vertex fetch limited" to the ash heap of history? :LOL: 2) How much optimizing do we figure the IHVs have in their current drivers for it? Are we expecting a big jump next round, or the usual little incremental increases over time?
 
From the screenshots of the "old" game tests, it's striking how the "old-fashioned" horrible "HDR lighting effect" has been transformed into something that looks a lot more realistic.

This is a major major plus in my view. Hopefully it'll lead people to understand what HDR really should look like. And the ghastly junk we've seen in quite a few games will gradually fall out of fashion.

Ah well, pity my X800XT can't do much - downloading anyhow to see what I can see.

Jawed
 
NVIDIA and a dual-core AMD is the way to go, I guess. :???:

My "old" FX-55@2.8 GHz is worthless in this test. So is my X1800.

An X2 3800+ and a 7800 GT murders my score. BS. ;)
 
IbaneZ said:
NVIDIA and a dual-core AMD is the way to go, I guess. :???:

My "old" FX-55@2.8 GHz is worthless in this test. So is my X1800.

An X2 3800+ and a 7800 GT murders my score. BS. ;)

Yeah, my CPU is also killing my score. Is it possible with the Pro version to skip the CPU tests? They take so long :(
 
m4trix said:
Yeah, my CPU is also killing my score. Is it possible with the Pro version to skip the CPU tests? They take so long :(

Yes, it is, although of course then you won't get a final, overall 3DMark score, only SM 2.0 and HDR/SM 3.0 scores.
 
Hanners said:
Yes, it is, although of course then you won't get a final, overall 3DMark score, only SM 2.0 and HDR/SM 3.0 scores.

Uhmmm :( So it's different from 03 and 05 on that point?
 