New 3DMark05 screenie

Just one addition before I knock off. I want all games to look like 3DMark05.

Problem is 3DMark05 isn't about games. It's about 3D.

<quickly goes to bed before Worm says something> :)
 
Mintmaster said:
Mordenkainen said:
Sadly, even more shadow aliasing artefacts. But I bet they aren't as noticeable in motion. :\
It looks like multiple-sample jittered shadow maps to me. Anyone else concur? Sort of like what Carmack was talking about recently for his next (and last?) 3D engine.

Shadow maps they are, for sure, and there are hints of multiplicity too, but whether they're jittered is harder to tell...
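For anyone wondering what the jittering actually buys you, here's a rough sketch (my own illustrative C++ with made-up names and tap counts, not anything from Futuremark or Carmack): instead of one depth comparison per pixel, you take several depth-map taps offset by small jitters and average the pass/fail results, trading hard aliased shadow edges for dithered soft ones.

Code:
// Illustrative only -- hypothetical function and parameters.
#include <cmath>

// Returns the fraction of jittered shadow-map taps that are lit (0..1).
// shadowMap: occluder depths seen from the light, mapSize x mapSize.
// sx, sy:    fragment position projected into shadow-map space (0..1).
// fragDepth: fragment depth in light space, same scale as the map.
float jitteredShadowFactor(const float* shadowMap, int mapSize,
                           float sx, float sy, float fragDepth)
{
    const int   kTaps   = 8;     // taps per pixel; more = smoother
    const float kRadius = 1.5f;  // jitter radius in texels
    float lit = 0.0f;
    for (int i = 0; i < kTaps; ++i) {
        // Cheap rotated-disc offsets; a real shader would index a
        // precomputed jitter table or noise texture instead.
        float a  = 6.2831853f * (i + 0.5f) / kTaps;
        int   tx = (int)(sx * mapSize + kRadius * std::cos(a));
        int   ty = (int)(sy * mapSize + kRadius * std::sin(a));
        if (tx < 0) tx = 0; else if (tx >= mapSize) tx = mapSize - 1;
        if (ty < 0) ty = 0; else if (ty >= mapSize) ty = mapSize - 1;
        // Classic shadow-map test: stored occluder depth vs. fragment.
        if (shadowMap[ty * mapSize + tx] >= fragDepth)
            lit += 1.0f;
    }
    return lit / kTaps; // averaged taps -> softened, dithered edge
}

The averaging is also why this kind of shadow tends to look noisier in stills than in motion, which would fit Mordenkainen's observation above.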
 
Scali said:
phenix said:
Will there be another nature test?

Wait... Damn NDA.

The first screenshot that FM released had an enchanted forest theme.
http://www.beyond3d.com/interviews/fm04/image.php?img=sneak.jpg&comment=3DMark04%20-%20Preview%20Image%20(Work%20In%20Progress)
I suppose this is quite similar to the earlier nature tests.


Looking good. Thanks for the link. :) What about the fourth test? Is there a fourth test this time, or only three?
 
I haven't heard one person excited about using 3DMark05 for benchmarking. It's all about the demo. Could it be because 3DMark03 wasn't anything like the games that came after it?

For example, Thief 3 and Deus Ex 2 ran better than Battle of Proxycon, Mother Nature, or the evil Troll's Lair. Doom3 was a bigger shock, since it can run fine on most DX9-generation video cards.

If they're going to make a gaming benchmark then at least include sound in every test. Sound can have a major impact on game FPS, and no gamer plays games with sound disabled.
 
Reverend said:
Yeah... but do most reviewers enable sound for 3D hardware review benchmarking?

So if everyone was jumping off a bridge, then so should you? That's why the selection of sound cards sucks today.

To call benchmarks "3D hardware benchmarking" isn't correct. The reason you benchmark is to get the best possible FPS score there is. That's because higher FPS = better gaming.

We now know that 3D hardware performance is affected by things like the motherboard chipset. It's also been proven many times that sound can have a huge effect on game FPS, anywhere from 5 to 40 FPS.

People do things like overclock their GPU or install drivers like Omega's just to get a few extra FPS. Little do they know how much their sound card can affect their FPS.

Technically it's not "3D hardware benchmarking" but "gaming hardware benchmarking". Last I checked, 3DMark was called the "Gamers' Benchmark", not "3D hardware benchmarking", even though that's really what it does.

At this point we might as well remove the mouse and CD drives to run benchmarks. Without sound, all you're doing is producing an artificial result.

The point of benchmarks is to create a real-world comparison. The difference between using a C-Media sound card and an Audigy2 is like the difference between buying a 9800 Pro and a 9800 XT.

I think it's time to start benchmarking with sound. ;)
 
DukenukemX said:
I haven't heard one person excited about using 3DMark05 for benchmarking. It's all about the demo. Could it be because 3DMark03 wasn't anything like the games that came after it?

For example, Thief 3 and Deus Ex 2 ran better than Battle of Proxycon, Mother Nature, or the evil Troll's Lair. Doom3 was a bigger shock, since it can run fine on most DX9-generation video cards.

If they're going to make a gaming benchmark then at least include sound in every test. Sound can have a major impact on game FPS, and no gamer plays games with sound disabled.
I don't know where you have been lurking, but I have gotten tons of emails & PMs from excited users.

I am somewhat surprised that there are still some who compare game fps to 3DMark fps. It should be clear that you cannot do that. Do you compare DOOM 3 fps with Quake III fps, or UT2004 fps with DOOM 3 fps? I sure hope not. Even if two games (or applications) use the same shadowing idea, that doesn't mean you should compare the fps. The amount of polygons, textures, normal maps, light sources, the engine, the API, etc. all have an impact on the fps.
 
If they're going to make a gaming benchmark then at least include sound in every test. Sound can have a major impact on game FPS, and no gamer plays games with sound disabled.

Funny you should mention that actually... Most people talking about Doom3's performance have been using the timedemo's results...
And that one doesn't include sound either, or physics, or AI, or anything...
 
DukenukemX said:
I haven't heard one person excited about using 3DMark05 for benchmarking. It's all about the demo. Could it be because 3DMark03 wasn't anything like the games that came after it?

For example, Thief 3 and Deus Ex 2 ran better than Battle of Proxycon, Mother Nature, or the evil Troll's Lair. Doom3 was a bigger shock, since it can run fine on most DX9-generation video cards.

If they're going to make a gaming benchmark then at least include sound in every test. Sound can have a major impact on game FPS, and no gamer plays games with sound disabled.

Bolding mine.

Doom 3 was designed for DX7-class hardware. Don't expect a DX9-class benchmark to give you an idea of how Doom 3 is going to run.

Besides, 3DMark is designed to stress your graphics subsystem to the max. Futuremark doesn't have to be concerned with taking shortcuts to keep frame rates from dropping below 30 FPS.
 
DukenukemX said:
To call benchmarks "3D hardware benchmarking" isn't correct. The reason you benchmark is to get the best possible FPS score there is. That's because higher FPS = better gaming.
That's simply by proxy though. Any timedemo-based or repeatable-demo benchmark is artificial and not representative of real-time gameplay; the only real exception is when performance is so totally GPU-bound that path-finding, NPC AI, I/O, netcode, sound, etc. make no discernible difference.

The point of benchmarks is to create a real-world comparison. The difference between using a C-Media sound card and an Audigy2 is like the difference between buying a 9800 Pro and a 9800 XT.
That may be the point of them, but that's not how it pans out for 3D graphics cards.

I think it's time to start benchmarking with sound. ;)
Which will simply give you lower FPS in CPU/system-bound tests, causing fill-rate curves to be a little less flat. The actual comparison between graphics cards would be unchanged.
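To put some toy numbers behind that bottleneck argument (my own simplified model with made-up frame costs, not anything measured): when CPU and GPU work overlap, the slower side sets the frame time, so CPU-side sound mixing only shows up when the CPU is already the bottleneck.

Code:
// Toy bottleneck model -- all per-frame costs are hypothetical.
#include <algorithm>
#include <cstdio>

// CPU and GPU run in parallel, so the slower side sets the frame time.
static double fps(double cpu_ms, double gpu_ms) {
    return 1000.0 / std::max(cpu_ms, gpu_ms);
}

int main() {
    const double sound_ms = 3.0; // assumed CPU cost of sound mixing

    // GPU-bound test (fill-rate heavy synthetic scene):
    std::printf("GPU-bound: %.1f -> %.1f fps with sound\n",
                fps(8.0, 25.0), fps(8.0 + sound_ms, 25.0)); // 40.0 -> 40.0

    // CPU-bound test (low resolution, light shaders):
    std::printf("CPU-bound: %.1f -> %.1f fps with sound\n",
                fps(12.0, 5.0), fps(12.0 + sound_ms, 5.0)); // 83.3 -> 66.7
}

In the GPU-bound case the number doesn't move at all, which is exactly why the graphics-card comparison itself stays the same.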
 
If you're comparing products where sound is part of the picture (i.e. sound cards, motherboards with onboard audio, or complete systems), or if you're doing a "how will hardware X run title Y" sort of article, then sound is an important factor (and for the latter, so are all the other factors not covered by timedemos). Otherwise, it's irrelevant.
 
[quote="worm[Futuremark]
I am somewhat surprised that there are still some who compare game fps to 3DMark fps. It should be clear that you can not do that. Do you compare DOOM 3 fps with QuakeIII fps, or UT2004 fps with DOOM III fps? I sure hope not. Even if two games (or applications) use the same shadow idea, doesn't mean that you should compare the fps. The amount of polygons, textures, normal maps, light sources, engine, API, etc. all have an impact on the fps.

So what you're saying is that running 3DMark is a waste of time? ;)

When you run 3DMark, it's supposed to give the PC user an idea of how their PC will play games in the future. Did I get the wrong impression?

So for example, when you run Battle of Proxycon and get 20 FPS and then run Thief 3 and get 60 FPS, is that a very accurate prediction of PC game performance?

Personally I'd rather see timedemos over any synthetic benchmark. As Scali pointed out, Doom3's timedemo doesn't have sound either, or physics, or AI, or anything... but it's actually a game. What's 3DMark's excuse?

If you're going to make a "gamers benchmark" then having sound is common sense. It may just lower FPS, but that really depends on the sound card you use. Just like which video card, chipset, and CPU.
 
DukenukemX said:
So for example, when you run Battle of Proxycon and get 20 FPS and then run Thief 3 and get 60 FPS, is that a very accurate prediction of PC game performance?

I don't think that Thief 3 uses the same technology as Battle of Proxycon, so it's a bit silly to compare.
Besides, I don't think the goal of 3DMark is to give you an accurate estimate of the actual framerate you'll be getting in games. That's nearly impossible to estimate anyway, since each game works and behaves differently.

I think the goal of 3DMark is to give an idea of how various GPUs respond to new techniques that will likely be used in new games. It's all about relative performance. 3DMark can show for example that a Radeon 9700 generally runs ps2.0 code a lot faster than a GeForce FX5800 can. The logical conclusion then is: for games using ps2.0, the Radeon is the better choice. That was a very accurate prediction.

Or for example, a Radeon 9500Pro is about 30% slower in general than a Radeon 9700. Well, that's another very accurate prediction, and it can give you an idea of whether you want to pay the extra for the faster model.

But if you compare Battle of Proxycon to Doom3, well... in some cases it's quite accurate... If you take an Athlon 64 3800+ with a 6800 Ultra, you get around 105 fps. Battle of Proxycon gets about the same.
In other cases, e.g. an 1800+ with a Radeon 9600 Pro, Doom3 will get about 24 fps, while Battle of Proxycon gets 37.5 fps.
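As a throwaway illustration of the "relative performance" point (made-up scores and fps, not real ORB numbers): the useful output of a synthetic is the ratio between cards, which you then apply to a known game result, rather than reading the synthetic's raw fps as a game prediction.

Code:
// Hypothetical scores/fps, purely to illustrate relative extrapolation.
#include <cstdio>

int main() {
    // Synthetic benchmark scores (invented):
    const double score_card_a = 4500.0; // say, a Radeon 9700
    const double score_card_b = 3150.0; // say, a 9500 Pro, ~30% slower
    const double ratio        = score_card_b / score_card_a; // 0.70

    // Known fps of card A in some game (also invented):
    const double game_fps_a = 60.0;

    // The prediction is relative: scale the known result by the ratio.
    std::printf("estimated card B fps: %.1f\n", game_fps_a * ratio); // ~42
}

The absolute numbers will still vary game to game; the claim is only that the ordering and rough ratio carry over.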


Personally I'd rather see timedemos over any synthetic benchmark. As Scali pointed out, Doom3's timedemo doesn't have sound either, or physics, or AI, or anything... but it's actually a game. What's 3DMark's excuse?

Why would 3DMark need an excuse? Doom3 is a game, but if the timedemo doesn't do any sound or physics or AI or anything, what separates it from 3DMark then? "It's a game" doesn't tell me anything. The timedemo isn't a game, it's a demo, like the 3DMark03 tests. Where exactly is the difference, other than in the 'packaging' (game or benchmark)?
Personally I don't care one bit whether I run a timedemo of a game or a 3DMark test to determine whether a new GPU has decent stencilshadow or ps2.0 performance. It's the exact same stuff that's being tested.

I do care, however, that 3DMark03 was released long before any stencil-shadow or ps2.0 games with timedemos arrived.
Let's face it: 3DMark03 was the messenger that told us the GeForce FX was bad at ps2.0, and it paid the price. But how accurate was it? Every single ps2.0 title that has come out since has performed badly on the GeForce FX.
Without 3DMark03, I think a lot more people would have bought GeForce FX cards and been very disappointed and unhappy. So I think 3DMark03 indeed served a good purpose for gamers.

If you're going to make a "gamers benchmark" then having sound is common sense. It may just lower FPS, but that really depends on the sound card you use. Just like which video card, chipset, and CPU.

3DMark03 had sound tests anyway, and those will be valid for a long time to come, since sound hardware doesn't evolve as quickly as graphics hardware.
 
DukenukemX said:
So what you're saying is that running 3DMark is a waste of time? ;)

When you run 3DMark, it's supposed to give the PC user an idea of how their PC will play games in the future. Did I get the wrong impression?

So for example, when you run Battle of Proxycon and get 20 FPS and then run Thief 3 and get 60 FPS, is that a very accurate prediction of PC game performance?

Personally I'd rather see timedemos over any synthetic benchmark. As Scali pointed out, Doom3's timedemo doesn't have sound either, or physics, or AI, or anything... but it's actually a game. What's 3DMark's excuse?

If you're going to make a "gamers benchmark" then having sound is common sense. It may just lower FPS, but that really depends on the sound card you use. Just like which video card, chipset, and CPU.
Yes, all benchmarks are a waste of time. Running the same benchmarks on multiple systems and cards is foolish, because it won't predict the performance of any other benchmark on the same systems/cards. Good logic!

You may prefer real games over synthetics just because real-game benchmarks require less analysis on your part: you just read the graphs and buy the card with the longest bar. But you can't extrapolate from a game's benchmarks if those benchmarks don't exist, so you're forced to use similar benchmarks to guesstimate future performance. 3DM03's "excuse" is that it was made available over a year before Doom 3, and well before any other "DX9" game, and it reasonably predicted general DX9 performance. But it doesn't need an excuse for being a synthetic benchmark, because that's exactly what Doom 3 becomes when you remove everything but the graphics engine. I don't hear people complaining that D3 benchmarks don't reflect what happens in Thief 3. Oh, is that because D3 was made with a separate purpose in mind? Well, so was 3DM, and it wasn't to predict the exact framerates you'll get from Thief 3.

There's a reason why Futuremark created the ORB. 3DM, like any other benchmark, is mainly a yardstick with which to measure your system against everyone else's. It's not a magic eight ball, and though it does aim to be ahead of the curve, that doesn't stop you from using older 3DMs to compare relative performance with older feature sets.

Really, was this explanation even necessary? If you're going to debate synthetics, at least put a little more thought into the arguments against them. Pardon my impatience, but this has been debated ad infinitum, and synthetics will always have a place in forward-looking reviews, simply because there's no other recourse save to code your own examples of what you think will be the future direction of consumer PC graphics.
 
DukenukemX said:
I haven't heard one person excited about using 3DMark05 for benchmarking.


HEAR MEEEE!!!

I'm excited about using 3DMark05 for benchmarking! Very excited indeed.

So, from now on, if you make a statement like that, please phrase it like "With the exception of one Mendel on the Beyond3D forums, I haven't heard one person..."
 
From Scali
3DMark can show for example that a Radeon 9700 generally runs ps2.0 code a lot faster than a GeForce FX5800 can.

Why not the GeForce FX 5900? Tomb Raider has given a better prediction of PS2.0 performance than 3DMark. Of course the GeForce FX 5800 sucks at it, but for some reason the 5900 is almost equal in 3DMark.

Now that the HL2 Source engine is being tested, GeForce FX owners are crying blasphemy. For some reason, running ps2.0 code is piss slow on GeForce FX hardware.

I wonder where they got the idea that the GeForce FX's ps2.0 performance was as good as the Radeons'? :rolleyes:

But if you compare Battle of Proxycon to Doom3, well... in some cases it's quite accurate...

Battle of Proxycon gets around 25.8 FPS on my 9500 Pro system, while Doom3 gets 47.6 FPS in timedemo1 in High Quality mode at 1024x768.
http://service.futuremark.com/compare?2k3=2798221

Doom3 is 30-60 FPS playable on my Radeon 8500 system. Sometimes it dips into the 20s. You don't wanna know the FPS for Proxycon. :( It's like watching a slide show.

From Pete
Yes, all benchmarks are a waste of time. Running the same benchmarks on multiple systems and cards is foolish,

You took the words right out of my mouth... and then stomped on them and rewrote what I said.

You can say 3DMark is a waste of time, but not all benchmarks.

Let's take 3DMark03's "Pixel Shader 2.0" benchmark for example. Look at the PS2.0 test from TechReport.
http://techreport.com/reviews/2003q4/geforcefx-5950ultra/index.x?pg=9

According to that test, the 9800 XT is only 2 FPS faster than the 5950 Ultra. Now let's look at the ShaderMark 2.0 results.
http://techreport.com/reviews/2003q4/geforcefx-5950ultra/index.x?pg=11

So 3DMark03 is saying that the 9800 XT is only 2 FPS faster in PS2.0 performance, while ShaderMark 2.0 says the 9800 XT is way faster than the 5950 Ultra.

From the same website we can see that FarCry seems to have similar results to ShaderMark.
http://techreport.com/etc/2004q3/farcry/index.x?pg=1

So does running 3DMark03 give you an accurate idea of GPU performance? I've also just shown that it's not even representative of real gaming performance.


So is benchmarking WITH 3Dmark a waste of time? Hell yes it's a huge waste of time. It's only good for looking at all the pretty graphics and colors. 8)
 
DukenukemX said:
Why not the GeForce FX 5900? Tomb Raider has given a better prediction of PS2.0 performance than 3DMark. Of course the GeForce FX 5800 sucks at it, but for some reason the 5900 is almost equal in 3DMark.

I suppose that's the same reason FutureMark does not approve the drivers with which you reach that score, and doesn't allow scores obtained with those drivers to be added to the ORB.

I wonder where they got the idea that the GeForce FX's ps2.0 performance was as good as the Radeons'? :rolleyes:

Blame NVIDIA, not FutureMark. With FM-approved drivers, you get the right view of the FX's lousy ps2.0 performance.

Doom3 is 30-60 FPS playable on my Radeon 8500 system. Sometimes it dips into the 20s. You don't wanna know the FPS for Proxycon. :(

Perhaps you missed the part where it said that the Radeon 8500 is a DX8.1 device, while 3DMark03 is a DX9 benchmark.
3DMark03 uses rendering methods that are optimal for DX9 hardware, not DX8.1 or lower hardware. Doom3, on the other hand, uses a path optimal for DX7 hardware, so obviously a Radeon 8500 is going to perform better with Doom3.
Then again, there is a 3DMark that aims at DX8.1 hardware, namely 3DMark2001SE.

Let's take 3DMark03's "Pixel Shader 2.0" benchmark for example. Look at the PS2.0 test from TechReport.
http://techreport.com/reviews/2003q4/geforcefx-5950ultra/index.x?pg=9

According to that test, the 9800 XT is only 2 FPS faster than the 5950 Ultra. Now let's look at the ShaderMark 2.0 results.
http://techreport.com/reviews/2003q4/geforcefx-5950ultra/index.x?pg=11

So 3DMark03 is saying that the 9800 XT is only 2 FPS faster in PS2.0 performance, while ShaderMark 2.0 says the 9800 XT is way faster than the 5950 Ultra.

From the same website we can see that FarCry seems to have similar results to ShaderMark.
http://techreport.com/etc/2004q3/farcry/index.x?pg=1

So does running 3DMark03 give you an accurate idea of GPU performance? I've also just shown that it's not even representative of real gaming performance.

If you're not using approved drivers, obviously you will get inaccurate results. That's the whole point of approving drivers. Again, this is NVIDIA's fault, not FutureMark's.
The website even mentions this:
We know NVIDIA's drivers are packed with application-specific optimizations for 3DMark03, but we've thrown in these results anyway, because they are an interesting test case of sorts. After a number of attempts, NVIDIA seems to have gotten its replacement shaders to duplicate the output of FutureMark's original DirectX 9 shaders with a pretty decent degree of fidelity. The 5950 Ultra can't quite keep up with the Radeon 9800 XT either overall or in any of 3DMark03's component tests, but it consistently comes close.

You're right, for you, benchmarking with 3DMark is a waste of time, because apparently you are not aware of how to benchmark properly with it, or how to interpret the results.
This does not go for everyone however, and there are plenty of people who value this product.
 
Ok so you want FM-approved drivers?
http://techreport.com/reviews/2004q2/geforce-6800ultra/index.x?pg=11

We used ATI's CATALYST 4.4 drivers on the Radeon card and ForceWare 60.72 beta 2 on the GeForce cards. One exception: at the request of FutureMark, we used NVIDIA's 52.16 drivers for all 3DMark benchmarking and image quality tests on the GeForce FX 5950 Ultra.

At 640x480, the Pixel Shader test result for the Radeon 9800 XT is 20 FPS better with these drivers. That seems more like it, but at 1024x768 or higher it's starting to look a lot like Xmas for Nvidia.

Again, these drivers are old, but they are "FM-approved".

At 1600x1200 the 9800 XT has a 1 FPS lead. Nothing like the FarCry benchmark. http://techreport.com/etc/2004q3/farcry/index.x?pg=1

Certainly nothing like the CS Source beta benchmarks I've seen.

This does not go for everyone however, and there are plenty of people who value this product.

Huh? Product? Oh yeah, you can actually pay to use this program. :LOL:

You're right, for you, benchmarking with 3DMark is a waste of time, because apparently you are not aware of how to benchmark properly with it, or how to interpret the results.

Here's my interpretation: it's only as accurate as the amount of money supplied by ATI or Nvidia allows. :LOL:

When the benchmarks weren't looking good for the FX 5800, NVIDIA stopped its funding to Futuremark. :eek:

Despite the valiant effort to find every Nvidia cheat, Futuremark only proves how flawed a benchmark that does the same thing over and over again can be.

Why do you think some web sites started making their own timedemos? ;)
 