Futuremark: 3DMark06

TR is moving up my charts just because he's got that irreverent yet good-natured style to go with his technical chops. Some people can only be funny when they're sticking the needle into someone else, and that gets old.
 
Chalnoth said:
Of course, it'll take a hell of a lot more convincing to get me to believe that 3DMark is anywhere close to as good a benchmark as real games are, but we'll have to see.
It's not as if current game benchmarks aren't susceptible to odd or questionable design decisions though - take some of the ones we use: Chaos Theory (no SM2.0 path at launch); FarCry (HDR+AA patch still not publicly available); Doom3 (a shader tweaked by a third party runs better for one particular IHV).
 
Neeyik said:
It's not as if current game benchmarks aren't susceptible to odd or questionable design decisions though - take some of the ones we use: Chaos Theory (no SM2.0 path at launch); FarCry (HDR+AA patch still not publicly available); Doom3 (a shader tweaked by a third party runs better for one particular IHV).

Wow, all things in favor of the big green. And within a couple hours of launch 3dmark06 has already been branded as an Nvidia sellout (notwithstanding the fact that it was probably developed on NV40/G70). Looks like 2006 is gonna be more of the same :rolleyes:
 
Neeyik said:
It's not as if current game benchmarks aren't susceptible to odd or questionable design decisions though - take some of the ones we use: Chaos Theory (no SM2.0 path at launch); FarCry (HDR+AA patch still not publicly available); Doom3 (a shader tweaked by a third party runs better for one particular IHV).
Definitely. This is why you don't rely on a single game for a decent analysis. Any good analysis will use a wide variety.

So 3DMark is a little better than a single game benchmark in this regard: it specifically tests a few different scenarios. But, at the same time, it is limited in that the same basic design decisions still went into each individual test. So 3DMark is almost as good as testing a few different games based on the same engine, with the primary drawback that no 3DMark test is based on an actual game.
 
Kombatant said:
Did you really have a doubt about that? :devilish:

Yes, I was hoping for 3D nirvana :LOL: I need to re-read the thread to get a better understanding of exactly how ATi got the shaft - right now it's just a jumble of acronyms and accusations.
 
FX55 with X1900 Series card
FX60 with X1900 Series Card
AMD64 3200+ with GTX512 (NOC)
AMD64 3200+ with GTX512 (OC)

[score screenshots not shown]

Something tells me that this picture is very wrong.

US
 
Rys said:
So because your art direction was already set, you couldn't implement the technique because no scene required it? While Deep Freeze is graphically very impressive, that might have been your candidate for a different look and a set piece that included some of the techniques people are asking for. You can't undo what's done, though.
Our artists have a great deal of freedom to design the scenes as they see fit, but of course our programmers work with them to make sure that we get all the needed tech in there. None of the scenes would really benefit from POM (at least I can't think of any places where it would benefit and make the scene look better). In any case, we don't want to add effects just because they exist. I doubt that any game developer would add 1001 effects to a game just because they exist.
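(For readers following the POM debate, here is a minimal CPU-side sketch of what parallax occlusion mapping boils down to, to show why it is a per-scene art and performance decision rather than a free add-on. All names are invented for illustration; dirU/dirV stand for the tangent-space view direction's horizontal components divided by its vertical one, and real implementations run this loop in a pixel shader with an angle-dependent step count.)

#include <cmath>

// Stand-in height field; a real implementation samples a height-map texture.
static float SampleHeight(float u, float v)
{
    return 0.5f + 0.5f * std::sin(u * 20.0f) * std::cos(v * 20.0f);
}

// Minimal parallax occlusion mapping sketch: march the view ray down
// through the height field in fixed steps and stop at the first step
// where the ray falls below the stored height. The displaced texture
// coordinate at that point is what gives the illusion of relief.
static void ParallaxOcclusionMap(float u, float v,
                                 float dirU, float dirV, // texcoord drift per unit of descent
                                 float heightScale, int numSteps,
                                 float* outU, float* outV)
{
    const float stepSize = 1.0f / numSteps;
    float rayHeight = 1.0f;   // enter at the top of the height field
    float curU = u, curV = v;

    for (int i = 0; i < numSteps; ++i) {
        if (SampleHeight(curU, curV) >= rayHeight)
            break;                               // ray is now inside the surface
        rayHeight -= stepSize;                   // descend one step...
        curU += dirU * heightScale * stepSize;   // ...drifting along the view direction
        curV += dirV * heightScale * stepSize;
    }
    *outU = curU;
    *outV = curV;
}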

Rys said:
I get the very distinct impression that this release of 06 was very much an in-house thing, with little to no consultation and discussion with your major industry (non-IHV) and media BDP partners.
Let me quote our press release:
Developed in conjunction with BDP members AMD, ATI, Dell, Imagination Technologies, Intel, Microsoft, NVIDIA, S3, SIS, Velocity Micro and XGI, this latest offering from Futuremark is a collaborative effort that reflects the projected workloads of game content two years from today.
That covers the BDP partners. We always collect information from the media (via discussions, posts, reviews, comments, etc.) and work from that. In some cases we work directly with them, but it is tricky due to NDAs between us and other companies. Discussions like these also give us a pretty good picture of what users and the media feel about our products, and what we could do better. We have also discussed our approaches and ideas with other developers.

Rys said:
Seems like a waste of resources to push out a (knowingly, I bet) controversial update to 3DMark this close to Vista and D3D10, and it smells IHV-led, on the face of it.
3DMark06 is our last DX9 benchmark before moving to new grounds. 3DMark05 lacked SM3.0 tests, and we also wanted to push the envelope for DX9 one more time. There were quite a few tricks and treats we wanted to have in our benchmarks before leaving good old DX9. :smile:
 
fallguy said:
Can anyone answer me as to why my SLI'd GTXs are getting just about 52xx? I just swapped out a DFi SLI board for an Asus one, and SLI is working fine in CoD2. Which is all that matters I guess, but I'm a little confused as to why it's not working in 3DMark06.

The drivers have a built-in profile for 06, but the score seems to be well below what it should be.

Worm?
Sounds strange. Which drivers do you have installed? 3DMark doesn't actually "enable" SLI or "disable" it either. It is up to the drivers to determine if SLI should be on or off.
 
trinibwoy said:
Wow, all things in favor of the big green. And within a couple hours of launch 3dmark06 has already been branded as an Nvidia sellout (notwithstanding the fact that it was probably developed on NV40/G70). Looks like 2006 is gonna be more of the same :rolleyes:
That's not what he was saying - the point is that if you stick with something common to everyone, then any accusations that are levelled are easily deflected; supporting one thing for one vendor and another thing for the other muddies things considerably.
 
Unknown Soldier said:
FX55 with X1900 Series card
FX60 with X1900 Series Card
AMD64 3200+ with GTX512 (NOC)
AMD64 3200+ with GTX512 (OC)

[score screenshots not shown]

Something tells me that this picture is very wrong.

US
Hmm... if you think that
a) I get 4047 with my 1800xt, stock speeds
b) I get 1040 from my 3200+ o/c at 2.7GHz
c) see the FX60 score there,

that'd mean that, if I had an FX60, I'd get a score of around 4940 with my current card, right? So isn't 5348 kinda low for a 1900? Unless it's not a high-end model, of course.
 
Nick[FM] said:
And which choices are the ones you refer to? It'd be good to know.


In order to avoid any wrong use of the scores (AA scores in this case), this was the best solution. As said, this applies to all cards where one setting (or value) disables any of the tests. We require that whatever tests are available must also be available with any settings; otherwise no 3DMark score is output.


Depends on how you look at things, really. In this case the HDR/SM3.0 tests require something the 6200 is not capable of even with default settings. AA is an optional setting, not enabled by default.


We wanted SM2.0 hardware to get a score in 3DMark06 as well. 3DMark05 is a great SM2.0 benchmark, but 3DMark06 is even better.


Not sure about this one... Do you refer to SM2.0, SM3.0, or SM3.0 with FP16 blending support? :???:

Cheers,

Nick
Having NVIDIA and ATI SM3.0 hardware run different code because you chose to require an optional DX9 feature is a problem, when the workaround for cards which don't support it results in a clear disadvantage for those cards.
Maybe you should have used 3Dc, because NVIDIA can support it through a hack ;)
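(An aside on the 3Dc suggestion: the format stores only the X and Y components of a tangent-space normal, and the "hack" alluded to for NVIDIA hardware is commonly packing those two channels into a DXT5 texture instead. Either way, Z is rebuilt in the shader from the unit-length constraint; a minimal sketch, with an invented function name:)

#include <cmath>
#include <algorithm>

// 3Dc (and the DXT5-based approximation) stores only x and y of a unit
// normal; z is reconstructed as sqrt(1 - x^2 - y^2). The clamp guards
// against compression error pushing x^2 + y^2 slightly above 1.
static float ReconstructNormalZ(float x, float y)
{
    return std::sqrt(std::max(1.0f - x * x - y * y, 0.0f));
}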
 
First two will be the dual-core drivers helping out the X1900 ;) But the boost seems unnatural though, a bit too much.

Can anyone test a GTX 512 with a 4800+ and overclock the CPU?
 
Dave Baumann said:
Nick, can you find out a little more about the conditions that Fetch4 is used in, please? I'm genuinely trying to find a useful Fetch4 test, and was asking about 3DM's use, and I got a rather confusing message back that Fetch4 is only used in the Shader Model 2.0 tests - if that's the case, what's the reason for not using it in the SM3.0 tests (especially as the only ATI hardware that has Fetch4 is SM3.0, and ATI's actual SM2.0 parts need to use a different algorithm because they don't support Fetch4!)?
FETCH4 is being used for all shadows in the SM2.0 graphics tests (for hardware that supports it, of course). Due to the sampling method in the HDR/SM3.0 graphics tests, we weren't able to use either FETCH4 or PCF in those tests. It simply wouldn't have worked due to the rotated grid we use.
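(To unpack that answer: hardware PCF and FETCH4 both operate on the axis-aligned 2x2 texel quad around a sample point - PCF returns the bilinearly weighted comparison result, FETCH4 returns the four raw texels for the shader to compare. A kernel whose taps sit on a rotated grid doesn't land on any single 2x2 quad, so each tap becomes an independent point sample. A rough CPU-side sketch with invented names:)

#include <cmath>

// Stand-in shadow map; a real implementation reads a depth texture.
static float DepthTexel(int x, int y)
{
    return ((x ^ y) & 15) / 15.0f;
}

// Axis-aligned 2x2 comparison: the footprint that hardware PCF and
// FETCH4 accelerate in a single fetch.
static float ShadowPCF2x2(float x, float y, float receiverDepth)
{
    int   x0 = (int)std::floor(x), y0 = (int)std::floor(y);
    float fx = x - x0, fy = y - y0; // bilinear weights inside the quad
    float s00 = DepthTexel(x0,     y0    ) < receiverDepth ? 0.0f : 1.0f;
    float s10 = DepthTexel(x0 + 1, y0    ) < receiverDepth ? 0.0f : 1.0f;
    float s01 = DepthTexel(x0,     y0 + 1) < receiverDepth ? 0.0f : 1.0f;
    float s11 = DepthTexel(x0 + 1, y0 + 1) < receiverDepth ? 0.0f : 1.0f;
    return (s00 * (1 - fx) + s10 * fx) * (1 - fy)
         + (s01 * (1 - fx) + s11 * fx) * fy;
}

// Rotated-grid kernel of the kind described above: the taps land at
// arbitrary offsets rather than on one 2x2 quad, so each tap is an
// independent point sample and neither PCF nor FETCH4 applies.
static float ShadowRotatedGrid(float x, float y, float receiverDepth,
                               const float (*taps)[2], int numTaps)
{
    float lit = 0.0f;
    for (int i = 0; i < numTaps; ++i) {
        int tx = (int)std::floor(x + taps[i][0]);
        int ty = (int)std::floor(y + taps[i][1]);
        lit += DepthTexel(tx, ty) < receiverDepth ? 0.0f : 1.0f;
    }
    return lit / numTaps;
}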
 
trinibwoy said:
Yes, I was hoping for 3D nirvana :LOL: I need to re-read the thread to get a better understanding of exactly how ATi got the shaft - right now it's just a jumble of acronyms and accusations.

I think it's pretty simple.

1) DST16 could've been used in the benchmark but wasn't (the developer's, or the IHV's, preference I suppose).
2) DST24 is used, but cards that don't support the feature are forced to use F32 (which might cost more, but is an unknown factor atm) - see the sketch after this list.
3) Fetch4 is used in the SM2.0 tests, yet ATI's SM2.0 hardware doesn't support Fetch4 and has to fall back to a different algorithm.
4) ATI's SM3.0 hardware supports Fetch4, yet it isn't used in the SM3.0 tests.
5) AA doesn't get scored with a certain IHV's cards.

Anything else to add?

US
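(Since points 1 and 2 turn on which depth formats a card exposes, here is roughly what the runtime probe looks like in Direct3D 9. This is a sketch of the general pattern, not confirmed 3DMark06 code; which formats the benchmark actually probes, and in what order, is an assumption.)

#include <d3d9.h>

// Sketch of a D3D9 shadow-map format probe: prefer a 24-bit depth-stencil
// texture (DST24, with hardware PCF on cards that expose it), note that a
// 16-bit one (DST16) could be tried the same way, and otherwise fall back
// to a single-channel float render target (F32) with the depth compare
// done in the pixel shader.
static D3DFORMAT ChooseShadowMapFormat(IDirect3D9* d3d, D3DFORMAT adapterFmt)
{
    if (SUCCEEDED(d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                         adapterFmt, D3DUSAGE_DEPTHSTENCIL,
                                         D3DRTYPE_TEXTURE, D3DFMT_D24X8)))
        return D3DFMT_D24X8; // DST24 path

    if (SUCCEEDED(d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                         adapterFmt, D3DUSAGE_DEPTHSTENCIL,
                                         D3DRTYPE_TEXTURE, D3DFMT_D16)))
        return D3DFMT_D16;   // DST16 path (point 1: available but unused)

    return D3DFMT_R32F;      // F32 fallback: render depth to a float target
}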
 
Rys said:
Jason, did you push for parallax mapping's inclusion during the 06 development process, as part of ET's participation in the BDP? Wavey, did you do anything for 06? Hanners?

Neither myself nor the ET editor Loyd Case were ever asked. We didn't see or hear anything about 3DMark06 until January (as in, this month, not last January).

Had someone asked me last summer, I would have said parallax occlusion mapping with a variable sample rate depending on distance from camera and angle, and sub-surface scattering (on flesh, liquids) would be two primary target areas. But hindsight is always 20/20. And I can certainly understand Futuremark's desire to prevent 3DMark from becoming "design by committee" or letting just their partners "buy" their way into having influence over the app.

I still can't reconcile why a GeForce 7800 gets no score with AA enabled. Enabling AA puts it in the same boat as an X800 or GeForce 6200 - able to complete only the CPU and SM2.0 tests. I think it should use the formula for those cards in that situation.

There should be less of a hassle over optional features (knock on wood!) with the DX10 version of 3DMark. DX10 is much more rigorous in its requirements. There's a straight-up list of features, and either you have them and you're a DX10 card, or you don't have them and you're not. None of this hooey about various different shader models, optional texture formats, etc.
 
radeonic2 said:
Having NVIDIA and ATI SM3.0 hardware run different code because you chose to require an optional DX9 feature is a problem, when the workaround for cards which don't support it results in a clear disadvantage for those cards.
Maybe you should have used 3Dc, because NVIDIA can support it through a hack ;)
They both run the same code really! The other simply doesn't support FP16 filtering, and thus needs to use a fallback, which I already said is very efficient! We do not require FP16 filtering. We require FP16 blending, which we have no fallback for.

3Dc would have increased the package size by 2x (or something like that)... Same thing as when we discussed this topic with 3DMark05. Nothing has changed since then.
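(Nick's distinction between required blending and optional filtering maps onto two separate Direct3D 9 capability queries against the FP16 surface format; a card can pass one without the other, which is exactly the split being described. A minimal sketch:)

#include <d3d9.h>

// The two independent FP16 capability queries at issue: post-pixel-shader
// blending on an FP16 render target (required by the HDR/SM3.0 tests, no
// fallback) versus filtering of FP16 textures (optional; a shader-side
// filtering fallback is used when it is absent).
static void QueryFP16Caps(IDirect3D9* d3d, D3DFORMAT adapterFmt,
                          bool* canBlend, bool* canFilter)
{
    *canBlend = SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, adapterFmt,
        D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F));

    *canFilter = SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, adapterFmt,
        D3DUSAGE_QUERY_FILTER, D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F));
}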
 