The First DX9 "Game" Benchmark... Well, Nearly

Considering that Parhelia supports VS2.0 in hardware (although I'm not sure their drivers have exposed it yet) but only PS1.3, and we all term it a DX8 part, shouldn't that same logic apply to this benchmark? If Parhelia is a DX8 chip then this is a DX8 benchmark, or if this is a DX9 benchmark then Parhelia is a DX9 chip!

Doomtrooper said:
There appears to be a very large grey area when talking about DX9 compliance, including precision modes, and I always thought the PS version is what aligns a benchmark/game to the version of DX it supports.
To me this benchmark is a DX8 benchmark, as it certainly isn't using the strengths of modern cards.

Not that I don't agree. But can't the same thing be said about 3DMark 2003 as well, since only one out of the four game tests requires DX9?
 
3DMark2001 only required DX8 for GT4 too, so there is no difference here.
Well, not entirely. DX8 features are used in more than just Nature in 3DMark2001: vertex shaders are used to transform the dynamic shadows in 3 of the 7 tests, and shaders are also used for skinning in 2 tests. Of course, all of these are done in software if hardware support for VS isn't available, but that is something DX8 specifically permits anyway.
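(For anyone wondering what that fallback looks like in practice, here's a rough sketch in D3D9 style - DX8 has the same behaviour flag. The variable names and surrounding setup are mine, not 3DMark's:

// Check whether the card exposes VS 1.1 in hardware; if not, ask the
// runtime to emulate vertex shaders on the CPU instead.
D3DCAPS9 caps;
d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

DWORD vp = (caps.VertexShaderVersion >= D3DVS_VERSION(1, 1))
         ? D3DCREATE_HARDWARE_VERTEXPROCESSING   // VS runs on the GPU
         : D3DCREATE_SOFTWARE_VERTEXPROCESSING;  // runtime runs VS on the CPU

d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd, vp, &pp, &device);

Either way the app renders the same frames; only where the vertex work happens changes.)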
 
If you can't run something without DX9, doesn't that make it a DX9 application?

Well, there are quite a few games out there which force you to install DX9 (the Viking expansion pack for Medieval: Total War, for example) but don't actually require DX9. Are they actually DX9 apps? Hell no.
 
They would only be considered a DX9 app if they used DX9 features; otherwise, cards without those features wouldn't be able to run them. The example he gave can be run on the original GeForce, so it's more like a DX7 app.
 
Evildeus said:
If you can't run them without DX9 then they are...
Take a DX9 D3D SDK application that uses multitexturing and fixed-function T&L. This program won't run on DX8 because it was compiled with the DX9 SDK; does that make it a DX9 application? Maybe. Does it use DX9 features? No.
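To illustrate what OpenGL guy describes, here's a sketch (my code, not an SDK sample; device creation and error checking elided). Everything below is DX7-level functionality, yet the binary links against d3d9 and passes the DX9 D3D_SDK_VERSION, so it won't even start on a machine with only DX8 installed:

#include <d3d9.h>

// Fails at startup if d3d9.dll (the DX9 runtime) isn't present:
IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
// ... CreateDevice as usual ...

// Fixed-function T&L: no vertex shader bound at all.
device->SetRenderState(D3DRS_LIGHTING, TRUE);
device->SetFVF(D3DFVF_XYZ | D3DFVF_NORMAL | D3DFVF_TEX2);

// Dual texturing through the fixed-function blend stages.
device->SetTexture(0, baseMap);
device->SetTexture(1, lightMap);
device->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_MODULATE);
device->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
device->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);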
 
If I make an app that uses DX9 interfaces to draw a blank screen and include a framerate counter, have I made a DX9 benchmark :?:
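For the record, the whole thing would be about this long (a sketch; the device setup and the running flag are assumed):

// Clear to black, present, count frames: a complete "DX9 benchmark".
DWORD frames = 0, start = GetTickCount();
while (running) {
    device->Clear(0, NULL, D3DCLEAR_TARGET, D3DCOLOR_XRGB(0, 0, 0), 1.0f, 0);
    device->Present(NULL, NULL, NULL, NULL);
    ++frames;
}
float fps = frames * 1000.0f / (GetTickCount() - start);

It requires the DX9 runtime and uses precisely nothing of DX9's feature set.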
 
Well, you are considering two things:
- an application with DX9 features
- an application which requires DX9 to run
If you require just one part of DX9 to run, then you are a DX9 application, but it doesn't mean that you are full-featured or anything else.
OpenGL guy said:
Evildeus said:
If you can't run them without DX9 then they are...
Take a DX9 D3D SDK application that uses multitexturing and fixed-function T&L. This program won't run on DX8 because it was compiled with the DX9 SDK; does that make it a DX9 application? Maybe. Does it use DX9 features? No.
 
Not that I don't agree. But can't the same thing be said about 3DMark 2003 as well, since only one out of the four game tests requires DX9?

Actually, thanks for bringing that up. IIRC Futuremark even characterise GT1 as a DX7 test despite the fact that it uses DX8 vertex shaders - there are no delusions of grandeur here.

WRT 3DMark: when they talk about the individual tests, they always talk about them in terms of their requirements (DX7, DX8, DX8, DX9). If you were to roll all the tests into a single test, it would be DX9, since without DX9 it would fail to operate at all.
 
Hmm, is DX9 just graphics? ;)

How 'bout someone explain DX9 & DDI9 & the differences between the 2?

Maybe some of the DX9 functions these games require DX9 to be installed for have nothing to do with DDI9? Hmmm?

.01,
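That's pretty much the crux of it: the installed runtime and what the driver exposes are reported separately. Roughly, as a sketch (not from any particular game):

// Loading the DX9 runtime succeeds on any card, DX9-class or not:
IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);

// What the driver/hardware actually support shows up in the caps:
D3DCAPS9 caps;
d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

bool dx9Class = caps.PixelShaderVersion  >= D3DPS_VERSION(2, 0) &&
                caps.VertexShaderVersion >= D3DVS_VERSION(2, 0);
// A DX8-class card reports lower shader versions here even with DX9
// installed: that's the gap between "requires DX9" and "uses DDI9".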
 
nVidia and Yeti know what semantics are

Well, you can require the DX 9 API, and you can expose DX 9 functionality. Separately, you can either be doing something that can only be done this way, or not.

The Gunmetal benchmark seems to require the DX 9 API. It is unclear that it is doing anything that can only be done this way, however, or even something that is made noticeably faster by doing things this way...the requirement seems artificial for some purpose. It might be using "floating point mode", as in the original demo (the NV30-NV34 performance figures should tell that story), but I don't think it is even doing that. In any case, I've always been curious how the range offered by the 8500 in PS 1.4 would have served for implementing the improvements offered in the "floating point mode".

The purpose seems closely linked to nVidia's effort to say "game" benchmarks are valid independent of technical merit...ps 1.1 and vs 2.0 seem rather suited to the GF FX lineup, and Yeti seems pretty open to selective developer relations influence. :-?

nVidia-reasoning: Game benchmarks are better than dedicated benchmarks, this benchmark is based on a "real game", and this benchmark requires DX 9...therefore, it is a better "DX 9 benchmark" than 3dmark 03. Technical details like the actual utilization of DX 9 featureset to deliver improved content (among other details) are nowhere to be found in this logic sequence, and the impact of that on benchmarking (mis)analysis is just illustrated more clearly with this example.


As for the Parhelia: it seems to expose some DX 9 functionality, but not the most significant part of its functionality (i.e., both PS 2.0 and VS 2.0 specification shader processing). I'd term it a card defined by DX 8, and enhanced more significantly by DX 9's featureset exposure than most cards defined by that level of functionality. "DX 9 card" or "DX 8 card" are simplifications of that, with "DX 8 card" being the most useful simplification with the current crop of available cards.
 
Actually, thanks for bringing that up. IIRC Futuremark even characterise GT1 as a DX7 test despite the fact that it uses DX8 vertex shaders - there are no delusions of grandeur here.
Actually, no they don't - GT1 isn't labelled as being a "DX7 test" anywhere. However, the style of the game represented is supposed to be of a "DX7 level"; i.e. low poly count, relatively simple texturing demands and so on. Let me quote the help file and the whitepaper:

Help file
"The style of this game test is a combat flight simulator, with additional cinematic shots. It also represents 3D games mainly written for slightly older hardware (DirectX 7 and upwards); the polygon count per frame is relatively low; a large part of the pixels on screen are on the single textured background objects; but there are plenty of particles."

Whitepaper
"Early in the design of 3DMark03, we knew from information in our benchmark results database that a significant portion of PCs had DirectX 7 graphics hardware. We needed one game test that could run on these mid-range PCs. For these, a lighter game test was required. We concluded that a flight simulator type scenario would be a reasonable choice, as simpler background objects would occupy the majority of the screen...

...This test is not meant as a definitive evaluation of DirectX 7. It is not designed to give the average performance of DirectX 7 3D graphics usage. For example, typical DirectX 7 games use fixed vertex processing, whereas this game test uses 1.1 vertex shaders. We believe this is the future of vertex processing on both graphics cards and CPUs. The overall goal of game test 1 is to complete the collection of the four game tests as a test that can run on DirectX 7 hardware and one that requires a lower fill-rate. To fully evaluate DirectX 7 performance, the previous version of the benchmark, 3DMark2001 SE, is more appropriate."
 
Re: nVidia and Yeti know what semantics are

demalion said:
nVidia-reasoning: Game benchmarks are better than dedicated benchmarks, this benchmark is based on a "real game", and this benchmark requires DX 9...therefore, it is a better "DX 9 benchmark" than 3dmark 03. Technical details like the actual utilization of DX 9 featureset to deliver improved content (among other details) are nowhere to be found in this logic sequence...
And this is the unfortunate truth...
 
If you ATi users want to get rid of the artifacts, set your DX Anisotropic Filtering mode to 'performance' instead of 'quality'.

No more artifacts!! My 9700 Pro runs it well, considering it's optimized for nVidia cards.
 
RingWraith said:
If you ATi users want to get rid of the artifacts, set your DX Anisotropic Filtering mode to 'performance' instead of 'quality'.

No more artifacts!! My 9700 Pro runs it well, considering it's optimized for nVidia cards.

My concern is why this game would have problems with the Quality setting. IIRC, Quality for AF turns on trilinear + AF while Performance turns on bilinear + AF. Is there a bug dealing with trilinear filtering + AF in the demo? Is it just for ATi cards?
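For what it's worth, the difference between those two control panel settings maps to a single sampler state at the API level. A sketch of my understanding (not ATi's actual driver code):

// 'Quality' AF: anisotropic minification plus linear mip filtering (trilinear).
device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
device->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, 8);

// 'Performance' AF: the same, but point sampling between mip levels (bilinear).
device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_POINT);

So if switching to Performance cures the artifacts, whatever is going wrong presumably lives in the trilinear (linear mip) path.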
 