Will 3DMark07 be a popular benchmark tool?

Well, a quad-core system with a single 8800GTX scores more than my dual-core system with 8800GTX SLI. Maybe quad core will make that much of a difference in the future, but I doubt it.
 
3DMark is an OK way to push graphics cards to their theoretical limits -- this many triangles, this many shader passes, that many monkeytrons per pegaflaster -- but there's little basis in reality. As such, my main problem with it hasn't been that it isn't a good enough "guess" at the future, but that it's always been hella boring. Sit around and watch a slideshow all day on my middle-of-the-road gaming PC? No thanks. If there were some real options to tweak, it could be interesting. It would be much better for me if it were a tool to play with different shadowing or smoke algorithms, for example, but it's not, and that's that.

The problem with 3DMark threads is that they always seem to boil down to two camps: people like me who think it's a cute, small facet of the demoscene -- not really the best of demos, but I guess it's OK for an hour if you have a cutting-edge SLI rig -- and people who think it's worth talking about all day, ruminating on whether it's a genie we can make predictions from. It's a slideshow, people! Anyway, rehash the same arguments you've been having for years if you want; I think I'm just past the "ooo, smoke and graphics" stage. I can't even look at 3D/Future/wtfever-Mark anymore.

[Edit:] Sorry, to answer the OP's question: yes, I imagine 3DMark07 will be a popular benchmark, because people will complain if hardware sites don't use it! ;p
 

But who cares if people complain? If it sucks, it sucks :):( ... "I hope not."
 
Nick[FM] said:
It seems that games are, unfortunately, lagging a bit behind in efficient multi-core support. In any case, we are confident that games will start to take full advantage of multi-core CPUs soon. Creating efficient support for multi-core CPUs is not a walk in the park, and it's definitely not easy if your engine design/architecture was done without any multi-core CPU support. It's not "plug'n'play". ;)

If we hadn't supported multi-core CPUs in 3DMark06's CPU tests, it would have been an outdated product already at launch. 3DMark06 scales both with more CPU cores/speed and with more powerful GPUs (single/dual/quad).

If you simply want to benchmark the GPUs in 3DMark, use the SM2.0 and HDR/SM3.0 Scores. That's why we put them in there. For good CPU benchmarking, use the CPU Score. For a great overall performance number, use the 3DMark Score. It's all in there.

Oh, and just as a side note, 3DMark06 comes with graphics tests and CPU tests (and the feature tests), not game tests.

Cheers,

Nick
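
As an aside on Nick's point about retrofitting multi-core support: here is a minimal sketch of the kind of task splitting he is describing. This is not Futuremark's (or any real engine's) code; Entity and update() are invented names. The point is that per-frame work has to be expressed as independent chunks like this from the start, and bolting that onto an engine built around one serial update loop is the hard part.

Code:
// Minimal sketch: splitting per-frame work across all available cores.
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

struct Entity { float x = 0.f, v = 1.f; };

// Toy per-entity work standing in for animation, AI, physics, etc.
static void update(Entity& e, float dt) { e.x += e.v * dt; }

void update_all(std::vector<Entity>& entities, float dt) {
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = (entities.size() + cores - 1) / cores;
    std::vector<std::thread> workers;
    for (unsigned c = 0; c < cores; ++c) {
        const std::size_t begin = c * chunk;
        const std::size_t end = std::min(entities.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back([&entities, dt, begin, end] {
            for (std::size_t i = begin; i < end; ++i) update(entities[i], dt);
        });
    }
    for (auto& w : workers) w.join();  // wait for this frame's work to finish
}

int main() {
    std::vector<Entity> entities(100000);
    update_all(entities, 1.0f / 60.0f);  // one simulated frame at 60 Hz
}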

I understand all of this, and that if you don't support dual/quad cores, it's already outdated. But giving such a huge increase in points seems silly to me. The vast majority of people do not buy the product (just my guess, feel free to correct me if I'm wrong), so the choices are limited on what you can run.

I'll give you an example of two machines I ran and compared to each other. Both were exactly the same, apart from the CPUs and their speeds.

First: A8N32-SLI
3700+ @ 2.75GHz
2x 1GB Crucial, CAS 3-3-3-7, 1T
2x BFG 7800GTX @ 485/1.33
X-Fi
Score: 6328

Second: A8N32-SLI
Opteron 165 @ 2.6GHz
2x 1GB Crucial, CAS 3-3-3-7, 1T
2x BFG 7800GTX @ 485/1.33
X-Fi
Score: 8011

I can't link both to show you, so you'll just have to take my word for it. Here is the Opteron link for the 8011 score.

As I said, it's the exact same system; all I did was change the CPU. About 1700 points more for the system that is 150MHz slower. There is simply no way that will correlate to real-world games. If anything, the top system would be faster, although they would probably run neck and neck in every game tested.

That's the issue I have with it. So will quad core give you even more of a boost that won't show in real games? I don't know of anyone who has such a system and plays at a resolution where dual/quad cores will give you more frames, such as 800x600. I realize that you have to look to the future, hence your name. But the simple fact is, 3DMark scores between systems do not come close to matching real game benchmarks. It's pretty much as simple as that.
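
Purely to illustrate the mechanics being argued about here: the weights and sub-scores below are invented. This is not Futuremark's published formula, nor the real sub-scores of the two systems above. It just shows how, with identical graphics sub-scores, a CPU sub-score that roughly doubles going from single to dual core can still move a weighted total by a noticeable margin.

Code:
// Hypothetical weighted combination of sub-scores (weights are made up).
#include <cstdio>

double total(double sm2, double sm3, double cpu) {
    // Assumed weighting: 40% SM2.0, 40% HDR/SM3.0, 20% CPU.
    return 0.4 * sm2 + 0.4 * sm3 + 0.2 * cpu;
}

int main() {
    const double singleCore = total(3000.0, 3200.0, 1000.0);  // 2680
    const double dualCore   = total(3000.0, 3200.0, 2000.0);  // 2880
    std::printf("single-core total: %.0f\n", singleCore);
    std::printf("dual-core total:   %.0f\n", dualCore);
    std::printf("difference:        %.0f\n", dualCore - singleCore);  // 200
}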
 
I tried to search for some game benchmarks/comparisons between those systems, without luck. I'd guess you would be somewhat CPU-dependent in many games with a 3700+ and 7800 GTX SLI. There were some results in the ORB around 6300-6800; compared with your 8011, that would give a 17-27% speedup.
Are you comparing single- and dual-core performance per MHz? I doubt you'd see no gain at all, even running a single-threaded game, because of the threaded drivers, DirectX and other runtimes. That would basically let the game have its own core.

Would be nice to see a few benchmarks from those systems.
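
A toy, Windows-only sketch of the idea above -- not what 3DMark or any particular game actually does, just an illustration of why even a single-threaded game can pick up something from a second core: if the main thread is confined to one core, driver and runtime worker threads in the same process are free to run on the other.

Code:
// Illustration only: pin the main ("game") thread to core 0.
#include <windows.h>
#include <cstdio>

int main() {
    // Restrict the current thread to logical CPU 0 (affinity mask bit 0).
    const DWORD_PTR previous = SetThreadAffinityMask(GetCurrentThread(), 1);
    if (previous == 0) {
        std::printf("SetThreadAffinityMask failed: %lu\n", GetLastError());
        return 1;
    }
    std::printf("Main thread pinned to core 0 (previous mask: 0x%llx)\n",
                static_cast<unsigned long long>(previous));
    // ... the single-threaded game loop would run here, while any worker
    // threads created by the driver or runtime can use the remaining cores.
    return 0;
}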
 

I was comparing the total score in 3DMark06. As I said, the systems were exactly the same apart from the CPU. After changing the 3700+ (single core) to the Opteron 165 (dual core), the score increased by about 1700 points, with both CPUs overclocked. Frames in games were virtually the same. I game at 1920x1200, so it's not CPU-bottlenecked. Even running 3DMark06 at 1920x1200 showed a huge increase after changing the CPU, which simply does not match real game frame rates.

I can't run benches with the two systems. I sold the 3700+ long ago, and moved to an A8R32-MVP and X1900XT Crossfire six months or more ago. I kept the scores in the ORB because I think it's very silly to give such a huge increase for going dual core when games do not show the same increase.
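
A rough model (an assumption, not a measurement) of why the games look identical while the 3DMark total jumps: if the CPU and GPU mostly work in parallel, the delivered frame time is about the larger of the two. At 1920x1200 the GPU side dominates, so halving the CPU-side time barely changes the frame rate. The millisecond figures below are invented for illustration.

Code:
// Frame rate as limited by the slower of the CPU and GPU per-frame times.
#include <algorithm>
#include <cstdio>

double fps(double cpu_ms, double gpu_ms) {
    return 1000.0 / std::max(cpu_ms, gpu_ms);
}

int main() {
    std::printf("single core: %.1f fps\n", fps(12.0, 20.0));  // 50.0 fps
    std::printf("dual core:   %.1f fps\n", fps(6.0, 20.0));   // still 50.0 fps
}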
 
Yeah, I guess we'll just have to take your word for it.
 
Why would I lie? I gave one link; I can take a picture of it in the browser and post it. I can't link two 3DMark06 scores; it only lets you have one Active Project.
 
OK so if you acknowledge that GPU-specific scores and CPU-specific scores are loosely and/or ambiguously linked to each other, why insist on concocting a single overall score which is (by extension) loosely or ambiguously linked to average game performance (current or future)?


How about losing the single summary score from future 3DMark0x's and sticking with the (more informative) CPU- and GPU-specific scores as an ensemble?
Or better yet: if Futuremark are not capable of efficiently integrating the impact of better CPUs into the game tests (hence the separate CPU test) in their non-interactive ten-minute scenes, what makes them think actual game developers will be able to do just that across 15+ hours of gameplay with gigantic levels full of interactive environments? LMAO.

If Futuremark truly believes CPUs will matter the way they do in the current '06, they should show it off the 'real' way: by effectively integrating the impact of the CPU into the game tests. Why not make quad-core CPUs shine in the SM2.0 and SM3.0 tests if you believe quad-core CPUs will shine in upcoming games? (Oh wait. You don't know how!) Or at least let me run the tests separately without paying for it.

And now you think I should pay for it if I want to test it that way? Well, then I'd say you shouldn't allow 'free' downloads at all. But you wouldn't do that, because that would make your company disappear in mere days. It is also funny that through its evolution (01->03->05->06), the tests give users fewer and fewer controls. You guys are leveraging your existing popularity to force users (especially dumb ones) in your direction, which is obviously financial (be it from paid versions or from beta programs).

Edit: Sorry, this post isn't aimed at the author of the quoted text. It was just the inspiration. ;)
 
Nick[FM] said:
Again, we are not able to control how people present the 3DMark scores. As for leaked numbers... well, I really dislike any kind of leaked information, and such information shouldn't be considered valid. I know that speculating about new hardware can be fun, but people should know that leaked information is still only that -- leaked information with no real value/validity.

Nick

Totally false. See my post above.
 
Wow, FM is able to control how people present their numbers? That's a news flash there. Can you point me at your documentation for this startling revelation?
 
It is also funny that through its evolution (01->03->05->06), the tests give users fewer and fewer controls.
I think you'll find that it's been pretty much the opposite - as a user of the free version of 2001, one could alter no. of tests, resolution, back buffer usage, colour and z depth, antialiasing and rendering mode. For 06, for the princely sum of $20, you get no. of tests, resolution, antialiasing, texture filtering, HLSL profile; the ability to control precision, use of post-processing effects, hardware shadow maps, coloured mip maps, software vertex shaders, FP filtering. On top of that, neither the free nor paid-for versions of 2001 offered benchmark graph, image quality, antialiasing and texture filtering examination tools. That kind of seems...err...more rather than less to me.

You guys are leveraging your existing popularity to force users (especially dumb ones) in your direction, which is obviously financial.
As already mentioned in this thread, your views do not reflect those of the majority of 3DMark users - if you had followed their forum regularly over the past three years you would have seen a distinct increase in the number of people requesting that the CPU results form part of the final 3DMark score. People also requested that the CPU tests reflect a genuine workload rather than the false nature of software vertex shading, and that changed for 06 too. It's also worth bearing in mind that buying 3DMark or PCMark opens up the ORB functionality quite a bit and, again, an awful lot of users requested it to be this way.

I think the problem with a lot of people who openly criticise 3DMark (and Futuremark by proxy) is that they don't appreciate what the vast majority of its regular users and fan base are actually like - being able to alter the use of hardware shadow maps sounds great, but I would estimate that something like 95% of the people who've purchased the Advanced version of 06 have never used the function, nor would they ever want to. I, on the other hand, would like to change absolutely everything, but I'm very much in a minority. If you're a business, what makes better financial sense - targeting the bulk of the demands or the fringe requests?
 
I thought I was clear that I was talking about the 'free' versions.

If you're a business, what makes better financial sense - targeting the bulk of the demands or the fringe requests?
I think we're agreeing with each other.

Edit: Oh, and the 'shameless' thingy color was changed from orange to grey? I like that, because grey is indeed one of my favorite colors. :)
 
OK so if you acknowledge that GPU-specific scores and CPU-specific scores are loosely and/or ambiguously linked to each other, why insist on concocting a single overall score which is (by extension) loosely or ambiguously linked to average game performance (current or future)?

How about losing the single summary score from future 3DMark0x's and sticking with the (more informative) CPU- and GPU-specific scores as an ensemble?
Currently we have no plans to change the way 3DMark works (in terms of scores), but of course we are always interested in getting feedback from everyone.

Shtal said:
Given the information that has leaked saying 3DMark07 will support multi-core CPUs in the game tests:

A. Will it be similar to the old Comanche 4 helicopter benchmark, i.e. a CPU-dependent game?
B. And will FM use a real game engine, as they have done in the past since 3DMark2001?

1. How about including SM3.0/DX9, or is it going to be SM4.0/DX10 only?
A. A new 3DMark is hardly ever comparable to an old game, and I am not really sure what you mean by this question.
B. If you are referring to using/licensing an existing engine, no. Everything we do is in-house, and as you may have noticed in 3DMark06, the engine is a game engine (yes, there is a mini-game available).
1. You mean for the next 3DMark? It will be DX10 only. 3DMark06 is a great DX9 benchmark, so there is no need/use to create yet another one.

SugarCoat said:
But many of them are stupid, so you need to make things as foolproof as possible!
We are constantly improving the documentation on how the benchmarks should be used, and we intend to improve the information in the future as well.

pjbliverpool said:
BTW, that scene posted from 06 - is it only available in the pro version? I have never seen it with my SM3 GPU.
The section of the demo in the shot is available both in the Advanced and Business Edition of 3DMark06.

geo said:
Hmm? More detail please... maybe a thread in site feedback. (Edit: Rys checked it on his Mac laptop, and it looked fine to him, so now we're really curious what you're seeing...)
I'll post about this in the site feedback to keep the thread on topic.

Shtal said:
Still no answer from Mr. Nick[FM]... (Will 3DMark07 be an important tool?)
But of course. ;)

Sorry for short answers this round (low on energy, need some sleep).

Cheers,

Nick
 
Nick[FM] said:
1. You mean for the next 3DMark? It will be DX10 only. 3DMark06 is a great DX9 benchmark, so there is no need/use to create yet another one.

That's quite a turnaround from previous releases, where the latest DX version generally got a single test (maybe a couple) and the rest of the benchmark used the previous release.

DX10-only is quite a bold move. As a DX10 owner I'm pleased to see that it will be taking full advantage of my GPU, but if non-DX10 owners can't use it at all then I wouldn't be surprised to hear some complaints.

Still, DX10 from the ground up... that's gotta look pretty stunning :oops:
 
3DMark03 has one DX9 test, 3DMark05 has three and 3DMark06 has four. There is no need to create yet another DX9 benchmark (or tests). 3DMark06 is also Vista-enabled.

Unfortunately I can't spill any beans on the next 3DMark yet, but perhaps soon we'll reveal some neat info about it. It will sport some cool new features you perhaps wouldn't expect to see in 3DMark. ;)

Cheers,

Nick
 
Nick, I think the problem some people have (at least, that's mine!) is that there is no combined SM2.0+SM3.0 score shown on screen. As such, most people aren't going to quote a GPU-only score, even for GPU reviews or rumours.

So, if you guys only have DX10 benchmarks in 3DMark07, just make sure there is a final number for the GPU-only part. That should be an even more obvious design choice than it was with 3DMark06.


Uttar
P.S.: Neat features I wouldn't expect in 3DMark: GPU-based soundwave raytracing and occlusion? :)
 
Nick[FM] said:
3DMark03 has one DX9 test, 3DMark05 has three and 3DMark06 has four. There is no need to create yet another DX9 benchmark (or tests). 3DMark06 is also Vista-enabled.

Unfortunately I can't spill any beans on the next 3DMark yet, but perhaps soon we'll reveal some neat info about it. It will sport some cool new features you perhaps wouldn't expect to see in 3DMark. ;)

Cheers,

Nick

Yeah, I agree that DX9 has pretty much been done to death now, and it's not like we're seeing anything better-looking than 06 out there yet anyway.

Why do I get the feeling 07 is going to crawl on my GTS, though :cry:

Well, maybe that's a good thing; it's supposed to be pushing the limits, after all, and I doubt a GTS will be near the limit by the time 07 releases.

Would be nice to see a CPU vs. GPU physics test there, kind of like the GPU vs. CPU graphics tests of previous versions.
 
P.S.: Neat features I wouldn't expect in 3DMark: GPU-based soundwave raytracing and occlusion? :)

A GPU/CPU/PPU physics section should be expected by everyone. I'd also expect them to expand the sound section. I'm not sure if they'd include a section for multimedia decoding and encoding.

What I wouldn't expect is support for an AI processor. I also wouldn't expect a way of benchmarking NICs, though it would be neat if there were one.
 