New 3DMark05 screenie

DukenukemX said:
At 640X480 the Pixel Shader test for the Radeon 9800XT is 20 FPS better with these drivers. That seems more like it but at 1024X768 or higher it's starting to look a lot like Xmas for Nvidia.

Again, these drivers are old, but they are "FM-approved".

At 1600X1200 the 9800XT has a 1 FPS lead. Nothing like the FarCry benchmark. http://techreport.com/etc/2004q3/farcry/index.x?pg=1

Certainly nothing like the CS Source beta benchmarks I've seen.

I think your problem is that you seem to be under the impression that all games work the same, so all games should give the same kind of results for different hardware.
As you may or may not know, the Radeons perform so well in PS2.0 because of their arithmetic processing power; that is the area where the GeForce FX is considerably weaker. If you look at texturing, on the other hand, the GeForce FX is better than the Radeon, and in fillrate as well, in most cases.
So depending on the type of shader a game predominantly uses, either the Radeon or the GeForce FX could perform better. And the scaling pattern at higher resolutions can also be different.
Simple, no?
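To make that concrete, here are two made-up ps_2_0 shaders (hypothetical code, not from any game or benchmark). A game full of shaders like the first will rank the cards differently than a game full of shaders like the second:

[code]
// Hypothetical HLSL, just to illustrate the two workload types.
sampler baseMap : register(s0);
sampler detailMap : register(s1);

// Arithmetic-heavy: one texture fetch, lots of ALU work per pixel.
// This is where R3xx's strong PS2.0 arithmetic units shine.
float4 psArithmetic(float2 uv : TEXCOORD0,
                    float3 normal : TEXCOORD1,
                    float3 lightDir : TEXCOORD2) : COLOR
{
    float4 base = tex2D(baseMap, uv);        // 1 texture op
    float3 n = normalize(normal);            // ALU
    float3 l = normalize(lightDir);          // ALU
    float diff = saturate(dot(n, l));        // ALU
    float spec = pow(diff, 16.0);            // ALU
    return base * diff + spec;               // ALU
}

// Texture-heavy: several fetches, hardly any math. Here raw
// texturing/fillrate matters more, where the GeForce FX does
// comparatively better.
float4 psTexturing(float2 uv : TEXCOORD0) : COLOR
{
    float4 c0 = tex2D(baseMap, uv);
    float4 c1 = tex2D(detailMap, uv * 8.0);
    float4 c2 = tex2D(baseMap, uv * 0.5);
    return lerp(c0 * c1, c2, 0.5);
}
[/code]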

Why do you think some web sites started making their own timedemos? ;)

Because NVIDIA cheats. As long as you use approved drivers, I don't see why you can't use 3DMark as well.
 
What's your purpose here, DNX? To let everyone know that you're too lazy or ignorant to properly use 3DM without being manipulated by nVidia? How about letting those who know how to use it do so, without berating them with dead-end points?

I'll say it again, one single benchmark will not tell you how everything else will perform.

I'm also glad you think it's unrealistic for a 5950U and its far superior bandwidth to gain ground on a 9800XT as the resolution increases, hacks or not.

Look, not everyone has followed 3DM and nV's objections to it and working around it from the start, so I can understand that you may not want to take the time to read the whole history (especially the part where FM says performance shouldn't improve significantly from the drivers available at 3DMx's launch, and where the 5800 is initially shown via 3DM03 to be as slow compared to the 9700P in DX9 as it is subsequently shown to be in many/most DX9 titles, few though they may be).

But what's your point, that 3DM is useless? If you want to bash 3DM b/c it doesn't offer you Doom 3 or HL2 two years before they ship, feel free to do so elsewhere. If you want to use the best tools available to examine GPU performance, then I'm afraid 3DM is one of the tools you're "stuck" with, like it or not.
 
Pete you got issues.

I've given you straight answers and lots of examples. All I've received is that 3DMark is great and that nothing I say matters. Also somehow I can't run 3Dmark correctly or interpret the results. :rolleyes:

Please give me examples. Don't mouth off.

A single benchmark isn't going to tell you anything. That is correct, but isn't 3DMark multiple benchmarks in one? What can 3DMark tell you?

I know that 3DMark doesn't really need a fast CPU. Benchmark results for 3DMark depend mostly on the GPU. That means the results aren't representative of PC gaming, since the CPU does make a lot of difference.

Let me give some good and bad points about 3DMark.

The Good

1. It's a great way to find out if your new video card or CPU is running at its best.

2. Also a good way to stress your system if you're overclocking. Doom3 now does a better job IMO.

3. You can compare results with other PCs.

4. If you have new PC hardware and need to show off, what better way to do it than with the latest 3DMark? The colors. :oops:

The Bad :eek:

1. It's prone to driver cheats. Some that I'm sure not even Futuremark has detected yet. Not that I know of any.

2. Not representative of PC gaming. Doesn't push the CPU as much as the GPU. No sound when benchmarking to test how sound cards affect FPS.

3. Many of the FPS results contradict real PC game results.

4. Many companies, now including Nvidia, fund Futuremark.

I'm not surprised by the replies I've seen, considering that Beyond3D is part of the "Benchmark Development Program" over at FutureMark.
http://www.futuremark.com/bdp/
 
So... if I say the following :

"3DMarkXX shows the true 3D performance of video cards."

"Games do not show the true 3D performance of video cards."

... would I be wrong?

What is Beyond3D about?
 
Actually 3DMark03 was designed with DX9.0 and PS/VS2.0 (SM2.0) in mind, not with SM2.0x or higher. That is why you'll see the R3xx series beat the NV3x series. Sure, SM2.0x was released with DX9.0b, but it had to be initialised (or something) with the DX9.0b developer kits. HL2 was designed with SM2.0 and not SM2.0x; SM2.0 is faster on R3xx cards than on NV3x.

Someone correct me if I'm wrong.

SM2.0 (PS/VS2.0) - 24bit colour (NV3x runs a lot slower)
SM2.0a - supports 16bit colour (NV3x runs faster in this mode - comparable to R3xx)
SM2.0b - 24bit colour (NV3x runs a lot slower)

So you see .. in 3DMark03 the NV3x was a lot slower than the R3xx initially, until Nvidia released new drivers which forced the SM2.0a path / forced the benchmark to use PS1.1, which benefits the NV3x.

Now in 3DMark05 you should have the option to run in SM2.0, 2.0x(a,b) and SM3.0 .. so anyone with DX9 generation hardware should be able to benchmark it fine. The problem that I foresee though is [speculation]what if the NV3x series cards actually don't do SM2.0a properly either[/speculation]. Guess we'll see once 3DMark05 comes out.

If it does do SM2.0a well, then NV3x owners might still get a bit more out of their cards in the future.

US
 
DukenukemX said:
The Bad :eek:

1. It's prone to driver cheats. Some that I'm sure not even Futuremark has detected yet. Not that I know of any.

2. Not representative of PC gaming. Doesn't push the CPU as much as the GPU. No sound when benchmarking to test how sound cards affect FPS.

3. Many of the FPS results contradict real PC game results.

4. Many companies, now including Nvidia, fund Futuremark.

I'm not surprised by the replies I've seen, considering that Beyond3D is part of the "Benchmark Development Program" over at FutureMark.
http://www.futuremark.com/bdp/
1. And other benchmarks aren't prone to application-specific optimizations? :rolleyes: At least we do something about it.

2. AFAIK many reviewers benchmark with "sound off". 3DMark03 has a sound test, FYI. 3DMark03 is more GPU/VPU dependent than CPU dependent due to the way the engine works.

3. This should be clear by now... Do you compare Quake3 fps to DOOM3 fps? If not, please explain why.

4. Our BDP is no secret society. If you don't know what the BDP is all about, I suggest that you read the BDP pages you linked to.
 
Nappe1 said:
The Real Secret Societies are famous for one thing: practically no one knows they exist. ;)

I know what you did last summer :p (just to lighten up the mood a bit).
 
The Real Secret Societies are famous for one thing: practically no one knows they exist.

Nope .. everyone knows they exist .. just no-one knows where you can join up.

You should've said..

The Real Secret Societies aren't famous, because practically no one knows they exist.
 
DukenukemX said:
The Bad :eek:

1. It's prone to driver cheats. Some that I'm sure not even Futuremark has detected yet. Not that I know of any.

But less so than any other benchmark, because Futuremark actually does something to stop cheating.

2. Not representative of PC gaming. Doesn't push the CPU as much as the GPU. No sound when benchmarking to test how sound cards affect FPS.

This is a good thing when you want to test videocard performance.

3. Many of the FPS results contradict real PC game results.

Not really; DX9 games have shown that 3DMark03 was spot on. Of course the performance of individual games differs, but looking at the overall picture the 3DMark03 scores were correct.

4. Many companies, now including Nvidia, fund Futuremark.

Futuremark needs funding, and they need industry input to make a good benchmark.
 
Unknown Soldier said:
Someone correct me if I'm wrong.

I'm going to have to correct you here then ;)

SM2.0 (PS/VS2.0) - 24bit colour (NV3x runs a lot slower)
SM2.0a - supports 16bit colour (NV3x runs faster in this mode - comparable to R3xx)
SM2.0b - 24bit colour (NV3x runs a lot slower)

The precision in colour is dictated purely by the hardware, and by a 'partial precision' hint that the programmer can set on each instruction in the code.
ATi cards will run everything in 24 bit, because that's the only precision they support. PP hints have no effect.
NV cards will run in 32 bit when there's no PP hint, and 16 bit with the PP hint. This is independent of the shader version used. All shaders, 2.0 and up, support PP.

The main differences between 2.0, 2.0a, 2.0b, and 2.x are things like the maximum number of instructions, and some additions to the instruction set. The compiler will also use slightly different optimization strategies for each version.
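A minimal sketch of what that looks like at the HLSL level (the shader itself is made up; the float/half distinction is the actual mechanism):

[code]
// 'float' types compile to plain instructions; 'half' types make the
// compiler emit the _pp (partial precision) modifier on the ops that
// use them. R3xx ignores _pp and runs everything at FP24; NV3x runs
// FP32 without the hint and FP16 with it. Same for any shader version
// from 2.0 up.
sampler diffuseMap : register(s0);

float4 main(float2 uv : TEXCOORD0, float3 light : TEXCOORD1) : COLOR
{
    // No hint: NV3x executes this at FP32.
    float3 n = normalize(light);

    // With hint: the compiler emits _pp, so NV3x can drop to FP16.
    // Usually fine for colour math, risky for texcoord math.
    half4 base = tex2D(diffuseMap, uv);
    half nz = saturate(dot((half3)n, half3(0.0, 0.0, 1.0)));

    return base * nz;
}
[/code]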

So you see .. in 3DMark03 the NV3x was a lot slower than the R3xx initially, until Nvidia released new drivers which forced the SM2.0a path / forced the benchmark to use PS1.1, which benefits the NV3x.

Well, more specifically, NVIDIA replaced the shaders completely. I believe 3DMark03 already used PP hints. NVIDIA went further and replaced floating point shaders with integer shaders. While faster on NV3x, they produced lower quality output.

Now in 3DMark05 you should have the option to run in SM2.0, 2.0x(a,b) and SM3.0 .. so anyone with DX9 generation hardware should be able to benchmark it fine. The problem that I foresee though is [speculation]what if the NV3x series cards actually don't do SM2.0a properly either[/speculation]. Guess we'll see once 3DMark05 comes out.

This is true. Because they now use HLSL shaders and compile them at runtime, they can choose the best target to compile for, which may improve performance a bit. I just hope NVIDIA won't lower itself to cheating in a new benchmark for an old and mostly forgotten product. I don't think there's much use anyway; I don't expect an NV3x or R3x0 to get really decent framerates, no matter how much you cheat. Maybe you can go from 5 fps to 8 fps, who cares :)
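Roughly like this (the shader body is a made-up placeholder; the profile names are the real D3D9 compile targets):

[code]
// One HLSL source, compiled at runtime for whatever the card reports.
// The application can ask the device for its best profile (e.g. via
// D3DXGetPixelShaderProfile) and pass that string to D3DXCompileShader:
//
//   ps_2_0 - base SM2.0 (R300 class): 64 arithmetic + 32 texture ops
//   ps_2_a - 2.0a extensions (NV3x class): longer programs,
//            predication, arbitrary swizzles
//   ps_2_b - 2.0b extensions (R420 class): longer programs,
//            more temporary registers
//   ps_3_0 - SM3.0 (NV40 class)
//
// Same source, but the compiler can pick a different optimization
// strategy per target.
float4 main(float2 uv : TEXCOORD0) : COLOR
{
    // Trivial placeholder body; the point is the compile target,
    // not the shader itself.
    return float4(uv, 0.0, 1.0);
}
[/code]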
 
DukenukemX said:
4. Many companies, now including Nvidia, fund Futuremark.

This is good, I would say. If it were only one company, it could be suspicious. Such as ID and NVIDIA, or Valve and ATi ;)
But since FM has a LOT of partners, including most videocard manufacturers and many 'independent' partners, such as Beyond3D, I think everything is pretty much balanced out.
 
Scali said:
Well, more specifically, NVIDIA replaced the shaders completely. I believe 3DMark03 already used PP hints. NVIDIA went further and replaced floating point shaders with integer shaders. While faster on NV3x, they produced lower quality output.

No, 3DMark03 does not use PP hints, so the NV3x was handicapped a bit. But I agree that the performance problem with the NV3x has nothing to do with SM2.0 vs SM2.0a. The main problem was that the NV3x is slow as hell with all shaders (even when FP16 was used).
 
Ailuros said:
Nappe1 said:
The Real Secret Societies are famous for one thing: practically no one knows they exist. ;)

I know what you did last summer :p (just to lighten up the mood a bit).

you do? :oops: well buster, you could tell me too then. ;)
 
From Reverend
So... if I say the following :

"3DMarkXX shows the true 3D performance of video cards."

"Games do not show the true 3D performance of video cards."

... would I be wrong?

I'd ask you to please back that up. ;)

From Unknown Soldier
Actually 3DMark03 was designed with DX9.0 and PS/VS2.0 (SM2.0) in mind.

As far as I know only Mother Nature and the PS2.0 test actually use PS/VS2.0. All the other tests use PS1.4 or PS1.3. If I remember correctly, the Mother Nature test actually uses mostly PS1.4 and only some PS2.0.

From worm[Futuremark]
1. And other benchmarks aren't prone to application-specific optimizations? At least we do something about it.

You mean all other synthetic benchmarks. It's true that 3DMark took steps beyond any other synthetic benchmark, but who's to say whether some cheats are present in AquaMark or ShaderMark?

In my opinion the best prevention against cheats is to use real games with custom timedemos, so long as review sites make a new timedemo every so often.

2. AFAIK many reviewers benchmark with "sound off". 3DMark03 has a sound test, FYI.

I've explained before that some changes need to be made. The 3DMark03 sound test hardly affects the score. Unless it's a benchmark for PS/VS or fillrate, sound should always be on, at least for the first few game tests in 3DMark03.

3. This should be clear by now... Do you compare Quake3 fps to DOOM3 fps? If not, please explain why.

What I mean is that some hardware may not perform well under certain conditions. So if the GeForce FX hardware underperforms in PS/VS2.0, then that should show up in the benchmarks.

For example, a lot of GeForce FX users were surprised to find out that FX cards perform badly in the CS Source beta tests. I guess they didn't figure it out when the FarCry benchmarks were released, or from the ShaderMark 2.0 benchmarks. Or even Tomb Raider.
 
DukenukemX said:
As far as I know only Mother Nature and the PS2.0 test actually use PS/VS2.0. All the other tests use PS1.4 or PS1.3. If I remember correctly, the Mother Nature test actually uses mostly PS1.4 and only some PS2.0.

There's a difference between using a certain shader version and aiming at hardware that doesn't support anything more than that shader version.
In the case of Battle Of Proxycon and Troll's Lair, you basically need the vertex processing power of a DX9 card. Not because they use VS2.0, but because they assume the amount of processing power that only VS2.0-capable cards can deliver. In that respect, 3DMark03 is very much aimed at SM2.0 hardware. Any other hardware may run some of the tests, but it won't be very fast at them. 3DMark2001SE is actually aimed at DX8.1 hardware.
The fact that they only use a handful of PS2.0 shaders is actually very realistic for most DX9 games, especially when running on a GeForce FX.

In my opinion the best prevention against cheats is to use real games with custom timedemos, so long as review sites make a new timedemo every so often.

This doesn't prevent cheats like shader replacement or reducing texture quality. These are very common cheats. At least Futuremark bothers to test drivers and approve the ones that actually do what they're supposed to do.
As an example... if you edit the Doom3 shaders, even just switching the order of a few instructions around or such, without altering the function of the shader in any way, the performance of GeForce FX cards suddenly drops significantly. Anyone with a GeForce FX can verify that for himself. What's worse, NVIDIA abuses such fake benchmark results to market their products.
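To illustrate the idea with a made-up HLSL sketch (Doom3's actual shaders are hand-written assembly, where such an edit changes the shipped code stream directly):

[code]
sampler baseTex : register(s0);

// Version A: as the game might ship it.
float4 psA(float2 uv : TEXCOORD0, float4 tint : COLOR0) : COLOR
{
    float4 t = tex2D(baseTex, uv);  // fetch first
    float4 c = tint * 2.0;          // then scale
    return t * c;
}

// Version B: the same two independent statements, swapped. The result
// is identical for every pixel, but the emitted code can differ, and a
// driver that recognizes version A by matching its code byte-for-byte
// would no longer substitute its hand-tuned replacement here.
float4 psB(float2 uv : TEXCOORD0, float4 tint : COLOR0) : COLOR
{
    float4 c = tint * 2.0;
    float4 t = tex2D(baseTex, uv);
    return t * c;
}
[/code]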
 
A person, who shall remain nameless, wrote the following to me around the time of the GeForce FX release. Remember, this is circa mid-2003 or thereabouts:

ATI recently wrote an application to
capture the code being passed to the board to see what was occurring -
this app had an inadvertent side effect of disabling the optimisations,
and evidently they were pretty shocked at exactly what was occurring
with NVIDIA's. (<snip>... get ATI to release the source of that
code, but they say it would be easily identified that they were
the source and they couldn't sustain the PR assault from NVIDIA that
they would inevitably get).

Get this: a software dev from NVIDIA told me something that he probably
shouldn't have. NVIDIA have a tool internally that will analyse a game's
shader code, and if that code isn't optimal for NV3x they will extract
it, recode it and drop the new shader code into the drivers - there's every
chance that when you run a game on NV3x hardware you are not actually
running the shader code that is supplied by the game, but what NVIDIA
have put into their drivers. This also easily applies to shaders that
are used in benchmarks (and game benchmarks); they can be tuning their
driver-supplied shader code to drop down to lesser formats (FP16 or
even FX12) in order to increase the speed of the shaders in that
benchmark - it's long since been suspected that this is the first step
they took with 3DMark03, but this is a lot more difficult to prove.

Again, another thing we see is that people look at the min FPS numbers
that are drawn from some benchmark outputs - do you want to bet that,
where the really heavy frames are in that scene, they haven't just stored
the vertex formats and geometry positions for those frames and dropped
them entirely? In a demo or benchmark run, are you really going to notice
individual missing frames?

It's, of course, up to you to believe whether all of the above is/was true, but I wanted to reveal it (if it isn't already obvious) to give a perspective on how difficult it is for Futuremark to check chea-, er, optimizations, and how much resource it involves. No matter how you view 3DMark and its relevance to gaming, the one thing you cannot say is that FM doesn't care about their software. You also cannot say that game developers care about the output fidelity of their games to the same degree as FM does.
 
DukenukemX said:
Pete you got issues.
Too true, but not with the reality of 3DM.
Please give me examples. Don't mouth off.

A single benchmark isn't going to tell you anything. That is correct but isn't 3DMark multiple benchmarks in one? What can 3Dmark tell you?
Please read my examples. Again, 3DM03, with early drivers (before either company had a chance to cheat), predicted the rough superiority of the R300 over the NV30 with the mix of shaders 3DM03 presented. This prediction was backed up by most "DX9" games since then. What more do you want?
The Good
1. It's a great way to find out if your new video card or CPU is running at its best.
Sure, but this applies to any benchmark for which lots of public numbers are available. I guess the Orb makes 3DM less dependent on outside sites to corroborate one's score, though.
2. Also a good way to stress your system if you're overclocking. Doom3 now does a better job IMO.
Erm, sure, but, again, that's true of any GPU-punishing benchmark.
3. You can compare results with other PCs.
This is pretty much a subset of your first point, but yes.
4. If you have new PC hardware and need to show off, what better way to do it than with the latest 3DMark? The colors. :oops:
OK, but was the sarcasm nec'y? What else do you expect from a forward-looking benchmark written by guys from the demo scene? Bland scenery? :p

The Bad :eek:
1. It's prone to driver cheats. Some that I'm sure not even Futuremark has detected yet. Not that I know of any.
Again, how is this specific to 3DM? Did you read what I said that Derek said that nV themselves said, that they have hundreds/thousands of hand-coded shader replacements? Do you think they're all for 3DM?
2. Not representative of PC gaming. Doesn't push the CPU as much as the GPU. No sound when benchmarking to test how sound cards affect FPS.
Last I checked, it was 3DMark, not SystemMark (er, PCMark) or AudioMark. Besides, thanks to Creative buying out Aureal and then basically locking their tech away in a dusty chest, there's really not much to talk about on the audio side.
3. Many of the FPS results contradict real PC game results.
Could you clarify this? Are you talking in absolute terms again, or relative?
4. Many companies, now including Nvidia, fund Futuremark.
Yes, and nV has their logo on many companies' games, which basically means they "fund" them. Your point?

I'm not surprised by the replies I've seen, considering that Beyond3D is part of the "Benchmark Development Program" over at FutureMark.
http://www.futuremark.com/bdp/
Yep, I'm a part of BDP, too. In fact, I come with the standard beta package that B3D was granted. :p
 
I am looking forward to the new benchmark with delight, and I am surprised at the criticism some folk show Futuremark given the market they have to contend with. I think they are trying their hardest and honestly benefiting us, and they deserve more appreciation for their work along the 3D pipeline.
 
DukenukemX said:
From Unknown Soldier
Actually 3DMark03 was designed with DX9.0 and PS/VS2.0 (SM2.0) in mind.

As far as I know only Mother Nature and the PS2.0 test actually use PS/VS2.0. All the other tests use PS1.4 or PS1.3. If I remember correctly, the Mother Nature test actually uses mostly PS1.4 and only some PS2.0.

Yes, true .. but where does most of the score for 3DMark03 come from, percentage-wise?

If I remember correctly, GT4 contributes a lot to the score .. and GT4 is primarily SM2.0.
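If I remember right, the whitepaper weights the game tests something like this (don't quote me on the exact numbers):

3DMark03 score = (GT1 fps x 7.3) + (GT2 fps x 37) + (GT3 fps x 47.1) + (GT4 fps x 38.7)

So GT4 carries one of the heaviest weights, and a card that can't run it at all loses that whole chunk of the score.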

If I remember correctly my GF3 gives me 1052 without nV's cheats .. and the 5200 gives something in the region of 1700 (I could be wrong, but I think this is what I remember seeing). That's roughly 650 points that SM2.0 gives to 3DMark03.

And I still think my GF3 is a better game card than the 5200 when playing DX8 games (smoother gameplay).

US
 