[Edit:] Sorry, to answer the OP's question: yes, I imagine Futuremark 07 will be a popular benchmark, because people will complain if hardware sites don't use it! ;p
Nick[FM];906103 said:
It seems that games are, unfortunately, lagging a bit behind in efficient multi-core support. In any case, we are confident that games will start to take full advantage of multi-core CPUs soon. Creating efficient support for multi-core CPUs is not a walk in the park, and definitely not easy if your engine design/architecture is already done without any multi-core support. It's not "plug'n'play".
If we hadn't supported multi-core CPUs in 3DMark06's CPU tests, it would have been an outdated product already at launch. 3DMark06 scales both with more CPU cores/speed and with more powerful GPUs (single/dual/quad).
If you simply want to benchmark GPUs in 3DMark, use the SM2.0 and HDR/SM3.0 scores. That's why we put them in there. For good CPU benchmarking, use the CPU score. For a great overall performance number, use the 3DMark score. It's all in there.
Oh, and just as a side note, 3DMark06 comes with graphics tests and CPU tests (and the feature tests), not game tests.
Cheers,
Nick
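(Side note: Nick's "not plug'n'play" remark is easy to unpack with a toy example. Below is a minimal fork/join sketch, in modern C++ for brevity, of the kind of work-splitting an engine has to bolt on for multi-core. It is illustrative only, not Futuremark's or any real engine's code.)

```cpp
// Minimal fork/join sketch: split N independent work items (e.g. particle
// updates) across hardware threads. Illustrative only -- in a real engine
// the hard part is untangling shared state, not the splitting itself.
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

// Hypothetical per-frame work: advance each particle position.
void parallel_update(std::vector<float>& pos, float velocity, float dt) {
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> workers;
    for (unsigned c = 0; c < cores; ++c) {
        workers.emplace_back([&, c] {
            // Each worker owns a contiguous, non-overlapping slice,
            // so no locking is needed for this toy workload.
            std::size_t begin = pos.size() * c / cores;
            std::size_t end   = pos.size() * (c + 1) / cores;
            for (std::size_t i = begin; i < end; ++i)
                pos[i] += velocity * dt;
        });
    }
    for (auto& w : workers) w.join();  // rejoin before rendering the frame
}
```

The pain Nick alludes to is that real subsystems (scene graph, allocators, script state) are shared between those slices, which is why retrofitting this onto a finished engine architecture is so costly.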
Are you comparing single- and dual-core performance in MHz? I doubt you'd see no gain at all, even running a single-threaded game, because of the threaded drivers, DX and other runtimes. That would basically let the game have its own core.

As I said, exact same system; all I did was change the CPU. About 1700 points more, for the system that is 150MHz slower. There is simply no way that correlates to real-world games. If anything, the top system would be faster, although they would probably run neck and neck in every game tested.
That's the issue I have with it. So will quad give you even more of a boost that won't show in real games? I don't know of anyone who has such a system and plays at a resolution where dual/quad cores will give you more frames, such as 800x600. I realize that you have to look to the future, hence your name. But the simple fact is, the 3DMark scores between systems do not come close to matching real game benchmarks. It's pretty much as simple as that.
I tried to search for some game benchmarks/comparisons between those systems without luck. I'd guess you would be somewhat CPU-dependent in many games with a 3700+ and 7800 GTX SLI. There were some results in the ORB around 6800 - 6300; that would give us a 17 - 27% speedup.
Are you comparing single- and dual-core performance in MHz? I doubt you'd see no gain at all, even running a single-threaded game, because of the threaded drivers, DX and other runtimes. That would basically let the game have its own core.

Would be nice to see a few benches with those systems..
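(The "its own core" point can be made concrete. Here is a rough Win32-era sketch, hypothetical rather than anything a real driver actually does: pin the game's single thread to one core, and the OS is then free to schedule the driver/runtime worker threads on the other.)

```cpp
// Sketch: pin the game's main thread to core 0 so OS-scheduled driver /
// runtime worker threads are free to run on core 1 of a dual-core CPU.
// Illustrative only; real drivers manage their own threads internally.
#include <windows.h>

int main() {
    // Restrict this (single-threaded) game loop to the first core.
    SetThreadAffinityMask(GetCurrentThread(), 1);  // bitmask: core 0 only

    // ... game loop here; on a dual-core system the display driver's
    // worker threads can now execute concurrently on the other core,
    // which is where even a single-threaded game gains from dual core.
    return 0;
}
```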
Yeah, I guess I've just got to take your word for it.

I was comparing the total score in 3DMark06. As I said, the systems were exactly the same other than the CPU. After changing the 3700+ (single core) to the Opteron 165 (dual core), the score increased by about 1700 points, both CPUs being overclocked. Frames in games were virtually the same. I game at 1920x1200, so it's not CPU-bottlenecked. Even running 3DMark06 at 1920x1200 showed a huge increase after changing the CPU, which simply does not match real game frames.
I can't run benches with the two systems. I sold the 3700+ long ago, and moved to an A8R32-MVP and X1900XT Crossfire six months or more ago. I kept the scores in the ORB because I think it's very silly to give such a huge increase for going dual core when games do not show the same increase.
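(The mismatch described here is what Amdahl's law predicts: at 1920x1200 only a small fraction of frame time scales with the CPU. A back-of-the-envelope sketch with assumed, not measured, numbers:)

```cpp
// Amdahl-style estimate: if only a fraction `cpu_bound` of frame time
// scales with CPU cores, the overall speedup from a second core is
// limited. The numbers below are assumptions for illustration, not data.
#include <cstdio>

double speedup(double cpu_bound, double cpu_gain) {
    // New frame time = GPU-bound part + CPU-bound part divided by gain.
    return 1.0 / ((1.0 - cpu_bound) + cpu_bound / cpu_gain);
}

int main() {
    // At 1920x1200, assume ~10% of frame time is CPU-bound:
    std::printf("%.2fx\n", speedup(0.10, 2.0));  // ~1.05x: frames barely move
    // A CPU-weighted composite score effectively assumes a far larger share:
    std::printf("%.2fx\n", speedup(0.50, 2.0));  // ~1.33x: big score jump
    return 0;
}
```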
Or better yet: if Futuremark are not capable of efficiently integrating the impact of better CPUs into their game tests (hence the separate 'CPU' test) in their non-interactive 10-minute sceneries, what makes them think actual game developers should be able to do just that for 15+ hours of game with gigantic levels full of interactive environment? LMAO.

OK, so if you acknowledge that the GPU-specific scores and CPU-specific scores are loosely and/or ambiguously linked to each other, why insist on concocting a single overall score which is (by extension) loosely or ambiguously linked to average game performance (current or future)?
How about losing the single summary score from future 3DMark0x's and sticking with the (more informative) CPU- and GPU-specific scores as an ensemble?
Nick[FM];906273 said:
Again, we are not able to control how people present the 3DMark scores. As for leaked numbers.. well.. I really dislike any kind of leaked information, and any such information shouldn't be considered valid. I know that speculating about new hardware can be fun, but people should know that leaked information is still only that - leaked information with no real value/validity.
Nick
It is also funny that through its evolution (01->03->05->06), the tests give users less and less control.

I think you'll find that it's been pretty much the opposite - as a user of the free version of 2001, one could alter the number of tests, resolution, back buffer usage, colour and Z depth, antialiasing and rendering mode. For 06, for the princely sum of $20, you get the number of tests, resolution, antialiasing, texture filtering and HLSL profile, plus the ability to control precision, use of post-processing effects, hardware shadow maps, coloured mipmaps, software vertex shaders and FP filtering. On top of that, neither the free nor paid-for versions of 2001 offered the benchmark graph, image quality, antialiasing and texture filtering examination tools. That kind of seems...err...more rather than less to me.
You guys are leveraging your existing popularity to force users (especially dumb ones) in your direction, which is obviously financial.

As already mentioned in this thread, your views do not reflect those of the majority of 3DMark users - if you had followed their forum regularly over the past three years, you would have seen a distinct increase in the number of people requesting that the CPU results form part of the final 3DMark score. People also requested that the CPU tests reflect genuine workload and not the false nature of software vertex shading, and that changed for 06 too. It's also worth bearing in mind that the cost of buying 3DMark or PCMark opens up the ORB functionality quite a bit and, again, an awful lot of users requested it to be this way.
I think you'll find that it's been pretty much the opposite - as a user of the free version of 2001, one could alter the number of tests, resolution, back buffer usage, colour and Z depth, antialiasing and rendering mode. For 06, for the princely sum of $20, you get the number of tests, resolution, antialiasing, texture filtering and HLSL profile, plus the ability to control precision, use of post-processing effects, hardware shadow maps, coloured mipmaps, software vertex shaders and FP filtering. On top of that, neither the free nor paid-for versions of 2001 offered the benchmark graph, image quality, antialiasing and texture filtering examination tools. That kind of seems...err...more rather than less to me.

I thought I was clear that I was talking about the 'free' versions.
If you're a business, what makes better financial sense - targeting the bulk of the demands or the fringe requests?

I think we're agreeing with each other.
OK, so if you acknowledge that the GPU-specific scores and CPU-specific scores are loosely and/or ambiguously linked to each other, why insist on concocting a single overall score which is (by extension) loosely or ambiguously linked to average game performance (current or future)? How about losing the single summary score from future 3DMark0x's and sticking with the (more informative) CPU- and GPU-specific scores as an ensemble?

Currently we have no plans to change the way 3DMark works (in terms of scores), but we are of course always interested to get feedback from everyone.
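(For context on what a single overall score typically is: a weighted, often harmonic-style mean of the sub-scores. The sketch below uses invented weights purely for illustration - it is not Futuremark's published 3DMark06 formula - to show how a doubled CPU sub-score lifts the total even when the graphics score is unchanged.)

```cpp
// Hypothetical composite score: weighted harmonic-style combination of
// graphics and CPU sub-scores. The weights are invented for illustration
// and are NOT Futuremark's actual 3DMark06 formula.
#include <cstdio>

double composite(double gfx, double cpu) {
    const double w_gfx = 0.85, w_cpu = 0.15;   // assumed weights
    return 1.0 / (w_gfx / gfx + w_cpu / cpu);  // weighted harmonic mean
}

int main() {
    // Same GPU sub-score, CPU sub-score doubled (single -> dual core):
    std::printf("%.0f\n", composite(5000.0, 2000.0)); // ~4082 baseline
    std::printf("%.0f\n", composite(5000.0, 4000.0)); // ~4819, total up ~18%
    return 0;
}
```

This is exactly the pattern the thread is arguing over: the total moves with the CPU sub-score even for systems whose games are GPU-bound.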
Shtal said:
Information has leaked that 3DMark07 will support multi-core CPUs in the game tests.
A. Will it be a CPU-dependent game test, like the old Comanche 4 helicopter benchmark?
B. And will FM use a real game engine, as they have in the past since 3DMark2001?
1. How about including SM3.0 DX9, or is it going to be SM4.0 DX10 only?

A. A new 3DMark is hardly ever comparable to an old game, and I am not really sure what you mean with this question?
SugarCoat said:
But many of them are stupid; you need to make things as foolproof as possible!

We are constantly improving the documentation on how the benchmarks should be used, and we intend to improve the information in the future as well.
pjbliverpool said:
BTW, that scene posted from 06 - is that only available in the pro version? I have never seen it with my SM3 GPU.

The section of the demo in the shot is available in both the Advanced and Business Editions of 3DMark06.
geo said:
Hmm? More detail please... maybe a thread in site feedback. (Edit: Rys checked it on his Mac laptop, and it looked fine to him, so now we're really curious what you're seeing...)

I'll post about this in the site feedback to keep the thread on topic.
Shtal said:
Still no answer from Mr. Nick[FM].... (Will 3DMark07 be an important tool?)

But of course.
Nick[FM];909551 said:
1. You mean for the next 3DMark? It will be DX10 only. 3DMark06 is a great DX9 benchmark, so there is no need/use to create yet another one.
Nick[FM];909603 said:
3DMark03 has one DX9 test, 3DMark05 has three and 3DMark06 has four. There is no need to create yet another DX9 benchmark (or tests). 3DMark06 is also Vista-enabled.

That's quite a turnaround from previous releases, where the latest DX version generally got a single test (maybe a couple) and the rest of the benchmark used the previous release.
DX10 only is quite a bold move. As a DX10 owner I'm pleased to see that it will be taking full advantage of my GPU, but if non-DX10 owners can't use it at all then I wouldn't be surprised to hear some complaints.
Still, DX10 from the ground up... that's gotta look pretty stunning.
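(The practical meaning of "DX10 only": the benchmark's first act is creating a Direct3D 10 device, and that call simply fails on pre-DX10 hardware or drivers. A minimal sketch of that gate:)

```cpp
// Minimal Direct3D 10 device creation: the natural "DX10-only" gate.
// On a non-DX10 GPU/driver this call fails and the app can only bail out.
// Link against d3d10.lib.
#include <d3d10.h>
#include <cstdio>

int main() {
    ID3D10Device* device = nullptr;
    HRESULT hr = D3D10CreateDevice(
        nullptr,                     // default adapter
        D3D10_DRIVER_TYPE_HARDWARE,  // require real DX10 hardware
        nullptr, 0,
        D3D10_SDK_VERSION,
        &device);
    if (FAILED(hr)) {
        std::fprintf(stderr, "No DX10-capable device; benchmark cannot run.\n");
        return 1;
    }
    device->Release();
    std::puts("DX10 device created; benchmark could proceed.");
    return 0;
}
```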
Nick[FM];909603 said:
3DMark03 has one DX9 test, 3DMark05 has three and 3DMark06 has four. There is no need to create yet another DX9 benchmark (or tests). 3DMark06 is also Vista-enabled.
Unfortunately I can't spill any beans on the next 3DMark yet, but perhaps soon we'll reveal some neat info about it. It will sport some cool new features you perhaps wouldn't expect to see in 3DMark.
Cheers,
Nick
P.S.: Neat features I wouldn't expect in 3DMark: GPU-based soundwave raytracing and occlusion?