Late, noisy, HUGE!!! - makes 17k Mark...

I just use it to test my system overclocking, to see if there is any benefit in any of the tests, etc. Prime95 can run for hours on my system at 218MHz FSB, but 3DMark will lock up beyond 210MHz, so it's a handy test for AGP/video stability. The only reason I publish any results is because the stupid site won't let me compare my own results unless I do.

Dunno why everyone is so down on the thing, it's just a bit of code to run. There are lots of 3D pros in these forums; come up with something better. :)

3DMark2003 should be out soon, and I seriously doubt that the folks ragging on 3DMark won't try it out. :)
 
Doomtrooper said:
Ostsol you take for granted your above average knowledge of 3D cards and code.

*shrugs* I guess MadOnion/FutureMark is at fault for designing their 3dMarks system so poorly. *sighs*

EDIT: Regardless, fook the 3dMark score! I wanna see some synthetic bench results! :D
 
Why care about 3DMark2001? It was never representative of real performance...
(one proof: a Kyro II card barely beating a GeForce2 MX in 3DMark while kicking its ass in practically every game and every other bench...)

And as we know, it's heavily CPU-limited with the latest cards, so it's not good for comparing them.

What other benches could be used instead?
 
Well Tom :LOL: I would start with some modern engines we all use, i.e. BF 1942, Return to Castle Wolfenstein or MOHAA... can you see a trend here? Two are based off the tweaked Quake 3 engine... UT 2003 could be used for CPU limitations, but I'm wary of that engine due to IHV bias... oh, and Morrowind for pretty screenshots :)

Battlefield 1942's engine is called Refractor II. The renderer uses a level-of-detail technique that, among other things, provides maximum scalability: a soldier very close to you will be drawn with over 4,000 polygons, for example, while the same soldier standing far away from you will have fewer than 100.
The game plays VERY well with 64+ players, with planes and tanks rolling around.
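That kind of distance-based polygon scaling boils down to a threshold lookup. Here is a minimal sketch; the distance cutoffs and polygon budgets are invented for illustration, not Refractor II's actual values:

```python
def lod_polygons(distance, budgets=((10.0, 4000), (50.0, 1000), (200.0, 250))):
    """Pick a polygon budget for a model based on its distance.

    `budgets` maps a maximum distance (illustrative units) to the polygon
    count used at or below that distance; anything farther falls through
    to the lowest-detail model.
    """
    for max_dist, polys in budgets:
        if distance <= max_dist:
            return polys
    return 100  # very far away: fewer than 100 polygons

print(lod_polygons(5.0))    # close-up soldier: 4000
print(lod_polygons(500.0))  # distant soldier: 100
```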
 
borzwazie said:
It takes a separate power supply!
It's fricken huge!

How the tables have turned :) Still, if I could afford it I would probably get one. I got a 5500 after all, since it was the best thing you could buy at the time. I also had 3 slots taken up when I had Voodoo 2 SLI.

If anything, the release of this monster ought to bring the price of a 9700 down to realistic levels. 300 bux is crazy amounts of cash for a video card. 400 is insane. I didn't spend that for SLI cause I got a card for free :)

I actually think the GFX Cooler design is kind of cool. Just buy an Nforce, you shouldn't need more than 1 or 2 PCI slots then anyway (RAID card and...ahem, modem).

BTW if you want a R9700 Pro, you can get a refurb for $250 at Newegg.com!
 
Interesting to see the new ABIT boards reach almost 16,000 3D marks with only CPU o/c too...

[attached screenshot: 3dmark-oc.jpg]
 
Ostsol is right about one thing, though, and that's just how obsessed some people on these boards are with 3DMark (in the negative sense, of course).

Okay - so ignore the "gaming" business. Treat it as a benchmark that, unlike a game, can be used by people under similar circumstances. By that I mean that when testing with games, the large number of display options often found in them gives you more variables to control to ensure consistency. To my mind, very few benchmark/review results are meaningful, as there is no control or reference point within them - it's just a collection of data points for one system.

We argue that the "average" user is misguided by the evil that is 3DMark and should only ever use game timedemos. I believe we should instead be arguing that the average user should only look at reviews that compare several products, particularly with older ones as the usual comparison points. The new NVATrox FFTGH7 card produces 18,000 3DMarks or runs a MOHAA timedemo at 65 fps - well, so what? Even if you just start increasing test settings, such as resolution or adding FSAA, all you are doing is exposing the limitations of the product. What your average user is going to be looking for is a clear sign that "if I spend my money on XXXX, it's going to be better than spending it on YYYY, because the results in these reviews show it".

It's easy to sit here and criticise anything. Far harder to go out and try to prove that one has a better solution.
 
Everyone optimizes for the Onion, and those optimizations don't necessarily carry over to games using similar tech. I'd much rather see optimizations for games than for benchmarks and demos.
 
Neeyik said:
It's easy to sit here and criticise anything. Far harder to go out and try to prove that one has a better solution.

True, but I do think that Futuremark could have made a better effort to tell and show the world what each of the 7 benchmarks really tests. Today it tests the whole system (CPU/system memory bandwidth/GPU) rather than just the GPU. That's okay and all, but why the heck do we have to guesstimate by trial and error what each of the 7 benchmarks really measures? That's my only problem with Futuremark, really.
 
Hi everybody,

As several people already pointed out, a single 3DMark score is pretty meaningless for an objective evaluation of a card's performance.


As for 3DMark2001 being CPU/System limited:

AFAIR, two years ago when 3DMark2001 was launched, the most powerful desktop platform was a 1.4 GHz Athlon with SDR SDRAM and a GeForce3.
Today the most powerful desktop platform would consist of a 3GHz P4 with RAMBUS and a GeForceFX/Radeon 9700 Pro.
So IMO video card performance has increased far more than CPU/system performance over the same period.
This trend should continue, and it indicates to me that over time 3DMark2001 will become more and more CPU/system limited at standard resolutions.


I see several solutions for this "problem" if you want to benchmark your graphics card with 3DMark:

1) Benchmark at high resolutions with AA and/or AF to make the video card the performance bottleneck

2) Take only the parts of 3DMark that aren't CPU/system bound in the first place (theoretical tests)

3) Wait for 3DMark 200X to arrive ;)


IMO 3DMark (whatever iteration) can be a useful tool to compare 3DMark performance of various configurations (which can be fun for many people) or to test single aspects of a card (fillrate, vertex shader performance, etc..).
IMO it should not be used as a "game benchmark", or even worse, as a "buying guide" for any hardware.

Just my 2 Cents.
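Option 1 above works because a frame rate that barely moves when the pixel load rises points at a CPU/system limit, while a frame rate that drops in step with the extra pixel work points at the GPU. A rough sketch of that diagnosis (the 10% tolerance is an arbitrary assumption, not anything Futuremark specifies):

```python
def bottleneck(fps_low_res, fps_high_res, tolerance=0.10):
    """Classify what limits performance by comparing frame rates at a
    low and a high resolution with all other settings identical.

    If the frame rate drops by less than `tolerance` despite the much
    larger pixel load, the CPU/system is the limit; otherwise the GPU is.
    """
    drop = 1.0 - fps_high_res / fps_low_res
    return "CPU/system-bound" if drop < tolerance else "GPU-bound"

# A card holding ~100 fps at 1024x768 but only ~55 fps at 1600x1200
# is clearly GPU-limited at the higher setting:
print(bottleneck(100.0, 55.0))  # GPU-bound
print(bottleneck(100.0, 97.0))  # CPU/system-bound
```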
 
Ostsol said:
The problem with 3dMark2001 is that everyone always just looks at the overall score rather than the synthetic benchmarks to get an indication of performance. It's the users, not the product.

And the problem is that among the "users" are most of the internet hardware sites and the print media... and they seem to think that 3DMark is "the" benchmark... still! And the real injustice here is that the manufacturers know this, so they design their cards and/or drivers to exploit it, slanting the information even more.
 
mr said:
As for 3DMark2001 being CPU/System limited:

AFAIR, two years ago when 3DMark2001 was launched, the most powerful desktop platform was a 1.4 GHz Athlon with SDR SDRAM and a GeForce3.
Today the most powerful desktop platform would consist of a 3GHz P4 with RAMBUS and a GeForceFX/Radeon 9700 Pro.
So IMO video card performance has increased far more than CPU/system performance over the same period.
This trend should continue, and it indicates to me that over time 3DMark2001 will become more and more CPU/system limited at standard resolutions.

Agreed.

But the same happened with games too...
 
Heh, with a 9700, just about every benchmark is a CPU benchmark.

What is good about 3DMark is that there are no options to set differently; aside from any generic driver tweaks the user might make, the code runs one set of features consistently and doesn't change code paths based on the video card. It's a consistent measure of "something" DirectX-related. Sure, there are idiots that cheat, but I would say they are a vast minority, and looking at the individual results tends to show they are cheating anyway. I defy anybody to come up with a bit of code that someone won't be able to hack eventually.

And no, you don't want to compare video cards with it, but if you have the same video card and CPU/chipset as someone else, it can be a handy comparison for ballpark performance. If you are running the same driver and want to know what a 9700 can do relative to your 9500, you can get a consensus of results from others with similar systems. The 3DMark database is great; if there were one for games, I would use that too.

Game benchmarks are good, but hardly anybody considers that they have dozens of settings that have to be the same when comparing results, and that they are likely to have specific code optimizations for each card. They are really little different from 3DMark: it's always going to be a score somebody else got under different conditions than yours.
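The "consensus of results from others with similar systems" idea is just filtering a results database down to matching hardware and taking a robust middle value. A minimal sketch; the field layout and scores here are made up for illustration, not the real 3DMark database format:

```python
from statistics import median

# Hypothetical submitted results: (cpu, gpu, score). Invented data.
results = [
    ("P4 3.0GHz", "Radeon 9700", 15800),
    ("P4 3.0GHz", "Radeon 9700", 16100),
    ("P4 3.0GHz", "Radeon 9500", 12400),
    ("Athlon XP 2100+", "Radeon 9700", 13900),
]

def ballpark(results, cpu, gpu):
    """Median score among submissions with the same CPU/GPU combo;
    the median resists the odd cheated or misreported outlier."""
    scores = [s for c, g, s in results if c == cpu and g == gpu]
    return median(scores) if scores else None

print(ballpark(results, "P4 3.0GHz", "Radeon 9700"))  # 15950.0
```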
 
Well, here's my (perhaps surprising), $0.02.

For all the bitching we (including myself) do about 3D Mark in one form or another, I'm not sure many of us have taken the time to look at the actual results in a certain way.

And I don't mean comparing the actual numerical scores themselves. I mean, simply rank them.

If you had to generically rank the top 10 cards, would it look much different than this?

1. ATI RADEON 9700/9500 Series

2. NVIDIA GeForce4 Ti 4600

3. NVIDIA GeForce4 Ti 4400

4. NVIDIA GeForce4 Ti 4200

5. ATI RADEON 8500/LE

6. NVIDIA GeForce3 Ti 500

7. NVIDIA GeForce3

8. ATI RADEON 9000

9. NVIDIA GeForce3 Ti 200

10. NVIDIA GeForce4 MX 460

That's the current FutureMark "Hall Of Fame" ranking.

Now granted, there is an issue about separating the 9500 and 9700 series cards. I would put the 9700 first (duh!), followed by the 9500 Pro, then I'd probably have the 9500 Non-Pro in the mix with the GeForce4 ti cards.

Of course, each card will have its pros and cons that would change the rankings given specific circumstances, but in general I have zero qualms about the rankings of those cards, which are based on the 3D Mark score.

So is it really that bad? ;)
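For what it's worth, the ranking exercise above is nothing more than a sort on the overall score. A sketch with invented round numbers, not actual Hall of Fame figures:

```python
# Hypothetical overall 3DMark scores per card (illustrative only).
scores = {
    "Radeon 9700 Pro": 15000,
    "GeForce4 Ti 4600": 12000,
    "GeForce4 Ti 4200": 10500,
    "Radeon 8500": 8500,
    "GeForce3": 7000,
}

# Sort card names by score, highest first, and print a numbered ranking.
ranking = sorted(scores, key=scores.get, reverse=True)
for rank, card in enumerate(ranking, start=1):
    print(f"{rank}. {card}")
```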
 
You ignore several problems, I think.

The margins of difference don't necessarily correspond very well to performance differences. If that 9700/9500 was broken up, things might be a bit more contested.

The infamous Kyro issue... the problem there was how little priority seemed (to me) to be given to correcting it. That makes it look like accurate benchmarking is not the priority, which leaves the question: what is? The concern (now that the Kyro series has been pushed out of the top 10, though I wonder how accurately it is placed below that) is that whatever those priorities were will be repeated in the future, instead of the goal of accurate benchmarking. Perhaps you don't have a problem with the list because you only care about the two vendors with enough clout to make Futuremark sit up and pay attention?
By the by, is the Parhelia really slower than the GF3 Ti 200? Is that ranking result an accurate benchmark reflecting real world performance?

This last problem is not necessarily Futuremark's fault, and that's the problem of how it facilitates optimizing for one specific benchmark. Given past history in this regard, I'd hope their next effort (if providing a quality benchmark is actually their goal) will display an extreme focus on image quality comparison or verification as part of the benchmarking.

The overall problem, as has been stated, is the use of something with the above problems as an absolute indicator of graphics card performance (which most do use it for).
 
The whole "optimizing for one benchmark" thing is a marketing BS trick that can never really be gotten rid of. Even if there were a huge number of image quality comparisons to prevent things like Quake/Quack or the Xabre's Turbo Mode, it wouldn't solve the problem that people will optimize their drivers for a specific program. The nVidia 40.xx drivers didn't affect image quality (as long as you didn't select the stupid 0x texture filtering option), but they increased 3DMark scores by a huge, unnatural amount.

Video card vendors can and will always push hard to optimize their drivers for whatever the leading benchmark is out there, ignoring other applications. There's nothing Futuremark, or anyone else, can do to stop this.

I also think it's utterly unrealistic to expect any benchmark to accurately correspond to the graphics performance of every game. Even two games (Quake 3 and RTCW) based on the same engine can perform dramatically differently - Q3 favoring Pentiums while RTCW favors Athlons. How the feck is a single benchmark program supposed to agree with the performance of both, let alone all the games that people play?

There is no perfect benchmark. People here expect waaaay too much of 3DMark. I believe the problem isn't benchmarking itself: the benchmark is a good tool, it's just limited. The harm comes when people place too much importance on a single benchmark, as they did with Quake3 in the past and with 3DMark2001 now; no single program can measure all aspects of performance.
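One common way around depending on a single measure is to normalize several different benchmarks against a baseline card and combine them with a geometric mean, so no one test dominates the summary. A sketch with made-up benchmark names and numbers:

```python
from math import prod

def geomean_index(scores, baseline):
    """Normalize each benchmark against a baseline card, then combine
    the ratios with a geometric mean into one summary index."""
    ratios = [scores[b] / baseline[b] for b in baseline]
    return prod(ratios) ** (1 / len(ratios))

# Invented numbers: a new card vs. a baseline card on three benchmarks.
new_card = {"quake3_fps": 200, "rtcw_fps": 100, "3dmark": 15000}
baseline = {"quake3_fps": 100, "rtcw_fps": 100, "3dmark": 10000}
print(round(geomean_index(new_card, baseline), 2))  # 1.44
```

The geometric mean of the ratios 2.0, 1.0 and 1.5 is the cube root of 3.0, about 1.44: the new card looks roughly 44% faster overall, even though one test showed no gain at all.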
 
Joe DeFuria said:
For all the bitching we (including myself) do about 3D Mark in one form or another, I'm not sure many of us have taken the time to look at the actual results in a certain way.

..................

Of course, each card will have its pros and cons that would change the rankings given specific circumstances, but in general I have zero qualms about the rankings of those cards, which are based on the 3D Mark score.

So is it really that bad? ;)

Yes and no.

No, because, as you state, the 3DMark scores at least reflect somewhat on the performance of the GPU. Yes, because of the massive number of times I have seen people get a big boost by raising their FSB a lot (while keeping the CPU at almost the same level). 3DMark is a measure of overall system performance. And while that is what counts overall, it gives the consumer no way to gauge whether they really need a new GPU or a new motherboard/RAM/CPU combo.
 