Will 3DMark07 be a popular benchmark tool?

I don't think that 3DM01 was a good benchmarking tool. It was popular, but not very objective: GeForce = good score / Kyro II, Radeon, Voodoo 5 = low score (e.g. the GF256 gets much better results than the other boards mentioned, yet the majority of games run much smoother on a Radeon or Kyro II...).
I have no idea why you think the majority ran better on a Radeon, Kyro II or Voodoo4 (V5s were so rare that there's no point discussing them).
Fact is, of all of these the GF256 had the longest lifespan.
Hell, what is the GF2MX/4MX other than a slightly modified GF256? One can still play some games with such cards. Where is the support for the R100, Kyro, Voodoo? Dead and gone.
 
Actually, I don't understand why it's so important that the 3DMark engine be used in some games in order to qualify as a "game benchmark."

Computer games are highly diverse. The engine of one game can have very different performance characteristics from another game's engine. Therefore, there is no such thing as a "representative" game benchmark, and there never will be.

I think what 3DMark wants to be is a kind of prediction of how future game engines may perform. Of course, no one knows what direction future game engines will follow; the folks at Futuremark are just guessing, like everyone else. As long as the engine in 3DMark performs tasks similar to those of a game engine, I think it will be a good estimate of how game engines may perform in the future.

Of course, there are some things in general game engine design that 3DMark can't ignore. For example, in 3DMark the camera normally follows a predefined path, but this is very rare in an actual game. A benchmark engine could be optimized around a fixed camera path, while a game engine can't be, so the engine in 3DMark shouldn't be optimized this way, or it won't be a good estimate. I think the "game mode" in 3DMark is a good demonstration that the engine is not designed just for a fixed camera path, among other things.
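As an aside, here is a minimal sketch of what "the camera follows a predefined path" amounts to (purely illustrative; the struct and function names below are made up): the camera depends only on playback time, which is what makes every run render exactly the same frames, and which is also the property a benchmark engine could in principle be tuned around while a game engine cannot.

[code]
#include <cstddef>
#include <vector>

// A hypothetical keyframe on a predefined benchmark camera path.
struct CameraKey {
    float time;    // seconds since the start of the scene
    float pos[3];  // camera position at that time
};

// Linearly interpolate the camera position for a given playback time.
// Because the result depends only on 't', every benchmark run renders
// exactly the same frames, which is what makes the test repeatable.
void SampleCameraPath(const std::vector<CameraKey>& keys, float t, float out[3]) {
    if (keys.empty()) return;
    if (t <= keys.front().time) {                       // clamp before the first key
        for (int j = 0; j < 3; ++j) out[j] = keys.front().pos[j];
        return;
    }
    for (std::size_t i = 1; i < keys.size(); ++i) {
        if (t <= keys[i].time) {
            float f = (t - keys[i - 1].time) / (keys[i].time - keys[i - 1].time);
            for (int j = 0; j < 3; ++j)
                out[j] = keys[i - 1].pos[j] + f * (keys[i].pos[j] - keys[i - 1].pos[j]);
            return;
        }
    }
    for (int j = 0; j < 3; ++j) out[j] = keys.back().pos[j];  // clamp after the last key
}
[/code]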

Maybe someone could do a nice retrospective review of the last few 3DMarks? For example, see how "vintage" graphics cards perform in games released years after a given 3DMark version, and compare how accurate the 3DMark estimates turned out to be.
 
I have no idea why you think the majority ran better on a Radeon, Kyro II or Voodoo4 (V5s were so rare that there's no point discussing them).
Fact is, of all of these the GF256 had the longest lifespan.
Hell, what is the GF2MX/4MX other than a slightly modified GF256? One can still play some games with such cards. Where is the support for the R100, Kyro, Voodoo? Dead and gone.
The GF256 is a 4-pipeline chip with 1 trilinear TMU per pipe. The GF2MX is a 2-pipeline chip with 2 bilinear TMUs per pipe. And what is the R7500 other than an overclocked R100? Try HL2 or Aquanox on a GF256/GF2MX and on an R100. You will be surprised :)
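For a rough sense of what that pipeline difference means on paper, here is a back-of-the-envelope fill-rate comparison (a sketch only: the 120 MHz and 175 MHz core clocks are the commonly quoted reference clocks and are assumptions here, and the GF256's trilinear TMU is counted as one bilinear texel per clock, which undersells it when trilinear filtering is used):

[code]
#include <cstdio>

// Theoretical fill rates for the two chips discussed above.
// Clock speeds are assumed reference clocks, not measured values.
int main() {
    struct Chip { const char* name; int pipes; int tmusPerPipe; int clockMHz; };
    const Chip chips[] = {
        { "GeForce 256", 4, 1, 120 },
        { "GeForce2 MX", 2, 2, 175 },
    };
    for (const Chip& c : chips) {
        int pixelFill = c.pipes * c.clockMHz;                  // Mpixels/s
        int texelFill = c.pipes * c.tmusPerPipe * c.clockMHz;  // Mtexels/s (bilinear)
        std::printf("%-12s %4d Mpixels/s  %4d Mtexels/s\n", c.name, pixelFill, texelFill);
    }
    return 0;
}
[/code]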
V5s were so rare that there's no point discussing them
As a collector I must say that it's much easier to get a working Voodoo 5 than a Kyro II or Radeon 256 VIVO. The GeForce FX is rare too, but we discuss it in almost every thread :)

Back to topic: according to my tests, the Radeon DDR (and even the 64MB SDR) is faster than the GeForce 256/2MX in every D3D game; the only exception is 3DM01. The Kyro II is faster in all pre-T&L games, even in some T&L games (e.g. Dungeon Siege), and in many OpenGL engines (Q3, Serious Sam)... And 3DM01 says that the GF256 should be >50% faster... So I stand by what I said: 3DM01 was popular, but unfair.
 
Back to topic: according to my tests, the Radeon DDR (and even the 64MB SDR) is faster than the GeForce 256/2MX in every D3D game; the only exception is 3DM01.
I simply don't believe you :)
Two years ago I compared a GF2MX and a Radeon SDR in a few games; they either tied, or the GF was faster (NWN, I remember).

Back on topic: I doubt 3DMark 2007 will be as popular as earlier versions were.
FR, 2000, 2001, even 2003 could run on not-so-high-end machines when they were released. Since 2003, even starting the benchmark has become more and more demanding - obviously because the gap between high end and low end has increased dramatically; compare today's top 8800 with a low-end 6200TC/X300SE.

And it's becoming more "useless" IMHO. Maybe with unified shader execution, 3DMark results will become more consistent with games.
 
I think what 3DMark wants to be is a kind of prediction of how future game engines may perform. Of course, no one knows what direction future game engines will follow.

Exactly! Like 3DMark05, which was heavily vertex-dependent (maybe not exactly). But that prediction turned out to be wrong, because games now are mostly pixel-dependent.
 
I believe 3DMark07 will be different in that it will show a difference between single, dual and quad cores, with faster CPUs gaining higher FPS in the game tests of the new 07 benchmark. Since the days of 2001SE, the 3DMark versions 03/05/06 have been, as was said earlier, graphics demos.
 
These kinds of benchmarks exist for only two purposes: showing what you might expect, and bragging rights. And the latter is definitely the main reason why it's so popular. How many people try to improve their rig just to go up in the online ranking?

It's like a non-interactive multiplayer game. Or rather, the interaction happens through the online ranking system.
 
It's amazing how so many of you guys who are really knowledgeable about graphics technology switch into 'kindergarten mode' as soon as the FM benchmarks are mentioned. At once the thread gets filled with stuff like 'it only measures your ...nis', 'it doesn't measure graphics performance at all', etc.

If I recall correctly (and I should, as I used to be in the middle of all that), all was well until 3DMark03, and then all of a sudden 3DMark became something like the antichrist. Just how did that happen, and who started that discrediting campaign?

The answer is simple: Nvidia
As their then-latest architecture did not measure up to their rival's, they simply claimed 3DMark measured wrong. Now that almost 4 years have passed since then, I must say I understand their reaction, even though I may still not agree with it.

But the question remains: why do you guys still go on about Nvidia's old PR campaign to discredit 3DMark? Nvidia has commended later versions of 3DMark, so not even they are discrediting 3DMark anymore, but you still do.

3DMark remains the best performance benchmark for the current and next generation of 3D rendering technology. It is not bottlenecked by bulky game engines, which mainly scale with CPU performance. If you want to compare the performance of a certain game, you should use a benchmark of that game. But if you want to compare 3D performance in general, 3DMark is the way to go.
 
I think 3DMark has its place (it has never been of interest to me personally) for overclocking burn-in as well as e-penis stuff, with pretty graphics to show you what your card could potentially do, or not do, as the case may be :devilish:

For all the complaints, though, I think the solution is obvious. 3DMark is going after a specific audience and has the right honey. Many of us here realize that game tests, unless there are 4 or 5 very distinct ones, are kind of pointless, since we have seen games go in a number of directions with different rendering techniques. And yet simple, specific tests are boring. So why not design some artistically/graphically pleasing benchmarks that target specific qualities of a GPU? E.g. pretty tests that push fillrate, texturing, pixel shading, vertex shading, etc. You could do all sorts of subsets of these as well (e.g. different shaders, shaders with dynamic branches, etc.). With the DX10 GPUs there could be tests with variable loads and thread switching, stream-out, geometry shaders, and so on. And then, for kicks, there could be some "synthetic" game loads, e.g. some average "ratios" of demand one might see in certain games. Something like this would probably be more useful, but would also draw in the pretty-picture crowd. And some sort of scoring system would also be helpful, especially with so many tests; a rough sketch of what that could look like follows below.
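Just to make the scoring idea concrete, a minimal sketch (the test names, numbers and weights are entirely made up for illustration): combining per-test results with a weighted geometric mean means a chip has to do reasonably well in every targeted test to get a high overall number, so one huge outlier can't carry the score the way it can in a plain weighted sum.

[code]
#include <cmath>
#include <cstdio>

// A hypothetical per-test result, e.g. in FPS or Mops/s, with a weight.
struct TestResult { const char* name; double score; double weight; };

// Weighted geometric mean of the individual test scores.
double OverallScore(const TestResult* tests, int n) {
    double logSum = 0.0, weightSum = 0.0;
    for (int i = 0; i < n; ++i) {
        logSum += tests[i].weight * std::log(tests[i].score);
        weightSum += tests[i].weight;
    }
    return std::exp(logSum / weightSum);
}

int main() {
    const TestResult tests[] = {
        { "fillrate",          820.0, 1.0 },
        { "texturing",         640.0, 1.0 },
        { "pixel shading",     210.0, 2.0 },  // weight the shader tests higher
        { "vertex shading",    330.0, 1.0 },
        { "dynamic branching", 150.0, 1.0 },
    };
    std::printf("overall score: %.1f\n", OverallScore(tests, 5));
    return 0;
}
[/code]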

Anyhow, it's probably a pointless suggestion. There is software out there that already does the basic tests; all I am suggesting is some marketing PR and some pretty pics :LOL:
 
10 minutes to run a benchmark is boring; that's why I can't stand 06. :smile: Hopefully they'll make 07 a 5-minute benchie.

01 is perfect. *Poof* Done!
 
If they make 3DPr0nMark 2007 and it can sustain more than 25 FPS on anything less than an 8800GTX, then surely it will have some chance of being popular :mrgreen:

Otherwise, it will only be a case of "WTF?!? 2 FPS on 8800GTX SLI?!? I need faster cards ARGH!!!".
 
If they make 3DPr0nMark 2007 and it can sustain more than 25 FPS on anything less than an 8800GTX, then surely it will have some chance of being popular :mrgreen:

Otherwise, it will only be a case of "WTF?!? 2 FPS on 8800GTX SLI?!? I need faster cards ARGH!!!".

My guess is that 3DMark07 is going to rely on the CPU as well....

You may average anything between 29-35 FPS on 8800GTX SLI - but ONLY IF (big if) you are using a quad-core CPU. On a dual core you might get only ~22 FPS.
 
I predict 2FPS on 8800GTX SLI.

With a resolution of 1920x1200, 32xAA, TSAA, 16x anisotropic, vsync on and graphics that are 4 years ahead of their time, then yes, perhaps 2 fps. But then what other machine would match it? Current consoles?

No, they would barely be able to render a fraction of a frame per second even if they had a version optimized for their hardware. So in conclusion, 2 fps is nothing bad unless it looks bad! :cool:
 
3DMark is crap to me now. Showing a huge benefit for dual-core CPUs is dumb enough; the score increasing again with quad cores is even dumber. The simple fact is, 3DMark shows a much larger difference between some systems than games do. And that negates the whole purpose.
 
Completely agree. I don't know where the hell Futuremark got the impression that multiple cores have a significant impact on game performance. It's not true now and probably still won't be true in the near future. The CPU score is weighted far too heavily in current versions.

If they wanted to demonstrate the usefulness of multiple cores, why not just build it into the game tests? The SM2.0 and SM3.0 tests show minimal gain with more cores, yet they decided to just add in the score from a ridiculously CPU-limited test? It seems like they're contradicting themselves.
 
3DMark is crap to me now. Showing a huge benefit for dual-core CPUs is dumb enough; the score increasing again with quad cores is even dumber. The simple fact is, 3DMark shows a much larger difference between some systems than games do. And that negates the whole purpose.
It seems that games are, unfortunately, lagging a bit behind in efficient multi-core support. In any case, we are confident that games will start to take full advantage of multi-core CPUs before long. Creating efficient support for multi-core CPUs is not a walk in the park, and it's definitely not easy if your engine design/architecture was done without any multi-core CPU support in mind. It's not "plug'n'play". ;)

If we hadn't supported multi-core CPUs in 3DMark06's CPU tests, it would have been an outdated product already at launch. 3DMark06 scales both with more CPU cores/speed and with more powerful GPUs (single/dual/quad).
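As a very rough illustration of what that kind of CPU-side scaling requires of the workload (this is just a sketch with made-up names, not 3DMark's actual CPU test), the work first has to be split into independent chunks; only then do extra cores buy anything:

[code]
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

// Toy CPU-bound workload: the kind of independent per-entity work
// (AI paths, physics, ...) a CPU test has to be partitioned into.
double Simulate(int begin, int end) {
    double acc = 0.0;
    for (int i = begin; i < end; ++i)
        acc += (i % 7) * 0.001;  // stand-in for real per-entity work
    return acc;
}

int main() {
    const int kEntities = 1 << 22;
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());

    std::vector<std::thread> workers;
    std::vector<double> partial(cores, 0.0);
    const int chunk = kEntities / static_cast<int>(cores);

    // Static partition of the entities across the available cores.
    for (unsigned t = 0; t < cores; ++t) {
        const int begin = static_cast<int>(t) * chunk;
        const int end = (t + 1 == cores) ? kEntities : begin + chunk;
        workers.emplace_back([&partial, t, begin, end] { partial[t] = Simulate(begin, end); });
    }
    for (auto& w : workers) w.join();

    double total = 0.0;
    for (double p : partial) total += p;
    std::printf("cores used: %u, result: %f\n", cores, total);
    return 0;
}
[/code]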

If you simply want to benchmark the GPUs in 3DMark, use the SM2.0 and HDR/SM3.0 Scores. That's why we put them in there. For good CPU benchmarking, use the CPU Score. For a great overall performance number, use the 3DMark Score. It's all in there.

Oh, and just as a side note, 3DMark06 comes with graphics tests and CPU tests (and the feature tests), not game tests.

Cheers,

Nick
 
Nick[FM] said:
It seems that games are, unfortunately, lagging a bit behind in efficient multi-core support. In any case, we are confident that games will start to take full advantage of multi-core CPUs before long.
Agreed... I've been a bit unimpressed by how reluctant developers are to really start attacking parallelism, but I suppose there are code bases and tools out there that are hard to work around. I guess we also have to wait until people are sufficiently convinced that there isn't going to be some "silver bullet": you will have to redesign data structures and algorithms.

In any case, now that the consoles really necessitate multi-core usage to get reasonable performance, I suspect we'll see more of it on the PC side. From that point of view I guess it's good that, for example, the PPU in the Cell is slow as hell ;)
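A tiny sketch of what that redesign can look like (an assumption-laden illustration, not any particular engine's code): double-buffer the per-entity state so that each update reads only the previous frame and writes only its own slot in the next frame. With no cross-entity writes there are no data races, and any range of entities can be handed to any worker thread without locks.

[code]
#include <cstddef>
#include <vector>

// Two copies of the state exist: 'prev' (read-only this frame) and 'next'
// (written this frame). The buffers are swapped at the end of each frame.
struct EntityState { float x, y, vx, vy; };

void UpdateRange(const std::vector<EntityState>& prev,
                 std::vector<EntityState>& next,
                 std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        next[i].vx = prev[i].vx;               // stand-in for real game logic
        next[i].vy = prev[i].vy - 9.81f * dt;  // e.g. apply gravity
        next[i].x  = prev[i].x + next[i].vx * dt;
        next[i].y  = prev[i].y + next[i].vy * dt;
    }
}
// Any [begin, end) range can now be dispatched to a separate thread.
[/code]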
 