Reverend at The Pulpit #12

Reverend
3DMark05
I thought I'd comment on this in one post instead of posting replies in a few threads started by others.

After viewing two pre-release versions and the gold version prior to its public launch, I emailed 3DMark05's producer, Patric Ojala, about how pleased I was with it. From the presentation to the implementation, the graphics, the music, and even right down to the end credits, it is all very well produced. Like its predecessor, it is not "synthetic enough" for Beyond3D, but this is not our application and we fully respect the priorities Futuremark places on 3DMark05. We're not sure how useful the three game tests will be in B3D's video card reviews; we'll need to spend more time investigating which 3D features are stressed in each game test. The extras (the tools) are an improvement on 3DMark03, however, and that is appreciated by B3D. To recap what I posted a long time ago, during the formulation stages of 3DMark05 when Futuremark asked for input from its BDP members:

Reverend said:
Things in DX9 that matter:

- floating point pixel shader performance
- multiple render target / multiple element texture performance
- performance when rendering with tons of simultaneous textures
- performance when rendering using high precision (64/128-bit textures).
- performance with long, complex pixel shaders.
- performance in scenes with tons of complex state changes.
- performance in scenes that use lots of render-to-texture

Things (whether in DX9 or not) that don't matter:

- complex vertex shaders (they'll tend to be quite simple, just setting up some vectors for use by really complex pixel shaders).
- higher order surfaces... lots of talk, but nobody actually uses them
- polygon throughput (that's not a limiting factor in games)
Those were my suggestions to Futuremark when it asked for input in the infancy stages of 3DMark05. How many of those suggestions were actually implemented is something I am currently investigating.

There is, however, an issue that Dave and I view as serious, and that is the enabling of Depth Stencil Textures (DST, probably a new buzzword) as a DEFAULT setting in 3DMark05. While we won't argue that using this when depth shadow maps are used is A Good Thing, we do have concerns about it being enabled by default. Although Futuremark's intention behind every 3DMark is to more or less predict the routes that will be taken by game developers, we need to recognize the big difference between making games and making a benchmark application. Which category does 3DMarkXX belong to? More important than how the public views this is how Futuremark themselves view it. At the moment, I'm not sure Futuremark knows what 3DMark is. Here's something I wrote to Futuremark yesterday:

Reverend said:
Finally have some time to think about this in a "serious" manner.

It's too late for any comment of mine to have any effect -- it's gold. So I'll instead stick to talking about how bad I think the BDP program is.

You guys have been gathering feedback on the basics from way back. You collate it, discuss it internally, ask for some more feedback, and then go ahead and decide how to make 3DMark05. For example, "Smoothie" was tossed around early on, but no feedback was forthcoming from FM about what kind of shadows FM had decided on. Essentially, as far as B3D is concerned, Dave and I had absolutely no idea what had been decided by FM. A better way -- albeit much more time-consuming, and one that would probably delay 3DMark -- is for FM to tell BDP members, "OK, after getting the feedback from you guys, we think this is what we're doing with the game tests... blah, blah, explaining what you want to do... and then WHAT DO YOU BDP MEMBERS THINK OF WHAT WE'RE DOING WITH EACH GAME TEST?" Sure, you asked the BDPs for opinions on certain "default settings", but you never asked the BDPs (well, you didn't ask B3D; I'm not sure if you asked other BDP members) about this DST thing.

It's just not satisfactory. I realize there must come a time when FM has to make a decision (otherwise you'll never get 3DMark05 done if you keep asking for feedback and opinions), but I feel more can be done to involve the BDP members without delaying the project. Progressively providing information to BDP members about certain crucial areas is what I'm suggesting.

I only learned the intricacies of DST yesterday. I have never played with it on NVIDIA hardware. My knowledge is based on what DX tells me, and that was simply the ability to query, set the texture format, and away we go. I had no idea how NVIDIA hardware implements this. In a big way, I blame MS for this obscurity (and there are other obscurities and undefined things in DX).

But on to the main topic -- 3DMark is not a game. It is a benchmark meant to show how different hardware performs on standard, platform-wide API features that the benchmark author can control.

While FM can implement features that FM thinks will be "standard practice" in games, FM needs to ensure that everything implemented in every 3DMark favours no IHV or specific hardware, and that obscurities in the API do not influence those decisions.

Taking another parallel -- depth bounds checking is good, no arguments. But should it be implemented as a DEFAULT, thereby penalizing hardware that does not support it?

I think FM has made the wrong decision with regard to this particular DEFAULT setting (DST). Sure, we can disable DST (and this is in fact what you recommend for better hardware comparisons), but the basics are wrong. The ORB itself will be tainted, IMO.

I reiterate: 3DMark is not a game. If FM wants to make a game, make a game. If FM wants to make a benchmark, the considerations are a whole lot different from those associated with making games.

Futuremark replied that the part about the BDP is some way from the truth and that they will comment on it when they have time, after the hectic period of 3DMark05's launch. They did say I have some good points. I'll see what they have to say in a more detailed reply to my email above.

Anyway, at first I thought DST (setting a depth stencil surface as a texture) was pretty straightforward; there are 12 depth stencil formats afforded by DX9:

Code:
D3DFMT_D16_LOCKABLE 
D3DFMT_D32 
D3DFMT_D15S1 
D3DFMT_D24S8 
D3DFMT_D24X8 
D3DFMT_D24X4S4 
D3DFMT_D32F_LOCKABLE 
D3DFMT_D24FS8 
D3DFMT_D16 
D3DFMT_VERTEXDATA 
D3DFMT_INDEX16 
D3DFMT_INDEX32
3DMark05 uses D3DFMT_D24X8, FYI. With those available formats, I thought it was a simple enough task to see if an enumerated render target surface (one that we know is compatible with a display adapter format) can be combined with enumerated depth stencil formats when we create the D3DDEVICE; to check whether a surface format can be used as a texture, whether it can be used as both a texture and a render target, or whether it can be used as a depth-stencil buffer; to verify depth buffer and depth-stencil buffer format support; then set the texture format and away we go. But I didn't know how NVIDIA cards actually implement this until two days ago, courtesy of a developer Dave started talking to about DST. Dave will relay what that developer has to say in his 3DMark05 article (which I was supposed to write, since Dave just came back from his honeymoon, but I backed out two days before the launch of 3DMark05 due to time constraints on a personal level -- which is a good thing, since I know Dave will write a better article than me anyway). The point is that if this feature is basically an IHV-specific extension (I believe there's an NVIDIA-specific extension in OpenGL), then it should NOT be enabled by default. Dave and I have no argument with it actually being used; we just have issues with the "core values" (Dave's words) of Futuremark and what they constitute, simply by virtue of Futuremark deeming that games would enable this by default too -- 3DMark CANNOT use IHV-specific extensions/features/whatnots. It Is Not A Game.
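For the curious, here is roughly what those checks look like in D3D9/C++ -- a minimal sketch only, assuming d3d9.h, an IDirect3D9* called d3d, an already-created device, and a standard X8R8G8B8 desktop format. The names are illustrative; this is not 3DMark05's actual code.

Code:
// Probe: can D3DFMT_D24X8 be created as a *texture* with depth-stencil usage?
// This is the query that succeeds on hardware exposing depth-stencil textures
// (DST) and fails everywhere else (including the refrast).
HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                    D3DFMT_X8R8G8B8,        // display format
                                    D3DUSAGE_DEPTHSTENCIL,  // depth-stencil resource...
                                    D3DRTYPE_TEXTURE,       // ...that is also a texture
                                    D3DFMT_D24X8);
bool dstUsable = SUCCEEDED(hr);

// Also verify the depth format actually pairs with our render target format.
hr = d3d->CheckDepthStencilMatch(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                 D3DFMT_X8R8G8B8,   // display format
                                 D3DFMT_A8R8G8B8,   // render target format
                                 D3DFMT_D24X8);     // depth-stencil format
dstUsable = dstUsable && SUCCEEDED(hr);

// If both checks pass, the shadow map can be a depth texture: render depth
// into its surface (GetSurfaceLevel(0) + SetDepthStencilSurface), then
// SetTexture() it for the shadow-lookup pass -- set texture format and away we go.
IDirect3DTexture9* shadowMap = NULL;
if (dstUsable)
    device->CreateTexture(1024, 1024, 1, D3DUSAGE_DEPTHSTENCIL,
                          D3DFMT_D24X8, D3DPOOL_DEFAULT, &shadowMap, NULL);
What the API never tells you is what happens when you then sample that texture -- and that is the part NVIDIA hardware handles in its own way.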

There are, of course, several parallels to be drawn here: ATI's 3Dc has been discussed already. The point is that we want "apples-to-apples" (yes, the dreaded phrase) as the default settings for a benchmark app. Then again, there is no such thing if we enable the "Force Full Precision" option in 3DMark05 -- FP32 is what this option means, but on ATI hardware you don't get 32-bit anyway.

Due to this "DST as Default" issue, Dave has questions about Futuremark's BDP (Beta Development Program). I don't think B3D has anything to lose, so we shall discuss this internally.

Early on during the development of 3DMark05, Patric Ojala said that (after my suggestions) he was actively campaigning to change the slogan from "The Gamers benchmark" to something that better reflects what 3DMarkXX really is. At the last minute, when reviewing the pre-release versions, I noticed the slogan was still the same. I suggested to Tero, FM's Executive VP of Sales & Marketing, that it be changed (subtly but honestly) to "The Gamers' 3D Benchmark". Too late.

That said, I repeat : 3DMark05 is impressive looking. There is a problem with Perspective Shadow Mapping (PSM), especially in Game Test 3 but I brought this up already. If you look at the canyon walls in Game Test 3, you'll see flickering shadows.

Reviews
I am in the midst of reviewing a 6800GT by a new board vendor located in the UK. I am also expecting a X800 XT from Visiontek. These two reviews of mine will deviate significantly from the usual B3D review format, which usually goes along the lines of theoretical performance, games performance, AA performance, etc. Hopefully, my new format will be attractive. It's been a long time since my last review, so hopefully my rustiness won't show through. The problem is that the 6800GT review started with the 61.77 official NVIDIA drivers, and I am expecting (guessing, actually) new official NVIDIA drivers before the review is done -- there is a considerable difference between the 61.77s and the 66.51s, the drivers offered (and approved, but only for 3DMark05) by Futuremark for reviewing 3DMark05. I'll see how it is in a few days' time.

This post has probably been too long and repetitive.
 
This post has probably been too long and repetitive.
No, not at all...it was actually quite good from start to end. My only change/critique to it would be that mebbe you should have used the "BDP (Beta Development Program)" line in the beginning rather than near the end for thickies like me. (I suck with acronyms)

But a question, or rather a clarification/elaboration please... I've got to preface it with a chopped-up quote from what you wrote. (I didn't mean to take ya out of context here, I'm just trying to set it up to frame my question.)

Anyway, at first I thought DST (setting a depth stencil surface as a texture) was pretty straightforward; there are 12 depth stencil formats afforded by DX9....3DMark05 uses D3DFMT_D24X8

.....

But I didn't know how NVIDIA cards actually implement this until two days ago

.....

The point is that if this feature is basically an IHV-specific extension (I believe there's an NVIDIA-specific extension in OpenGL), then it should NOT be enabled by default.
If DST is a DX9 extension, then why would it matter if the card renders it in hardware or software? Or is it not part of the DX9 spec, but there is an extension for it?

I'm all confused on that point now; if you could shed some light on it for me I would be much obliged.
 
You can't "render in software" if you use hw acceleration to start with. The Refrast can't even render it. You can specify in an app to use it (the part in my post above regarding the codes I use). FM don't have any "workarounds" (maybe via a pixel shader) if a hw don't support this, so if a hw don't support this, DST is never used even if it is enabled by default. The DX9 spec allows you specify the format for this (DST) but you cannot control how it works even on hw (=NV cards) that support DST.
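To put that another way, here is a rough sketch (the setting name and the two rendering functions are mine, not Futuremark's, and it reuses the d3d pointer from the earlier sketch): the "default" only expresses a preference, and on hardware that fails the format probe the DST path simply never runs.

Code:
// Hypothetical illustration of "enabled by default" behaviour -- not FM's code.
bool useDST = dstEnabledByDefault &&
              SUCCEEDED(d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                            D3DFMT_X8R8G8B8, D3DUSAGE_DEPTHSTENCIL,
                            D3DRTYPE_TEXTURE, D3DFMT_D24X8));
if (useDST)
    RenderShadowMapsAsDepthTextures();   // the hardware DST path (NV cards today)
else
    RenderShadowMapsTheGenericWay();     // what every other card runs, regardless of the setting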

The developer I mentioned explained this quite clearly but I won't post that here; Dave will do that in his 3DMark05 article.

The issue is most folks will use 3DMark05's "Default" setting. That is the problem.
 
Reverend said:
The DX9 spec allows you to specify the format for this (DST), but you cannot control how it works even on hardware (=NV cards) that supports DST.
Ah, so even though DST is part of the DX9 spec, the way it is done on nV's hardware is not the proper way the spec calls for? (Sorry, I'm not trying to be annoying... just trying to understand it.)
 
No, "DST" itself is not part of the DX9 spec. There is/are no CAPS for it. You can specify depth stencil surface formats (and see if the surface can be a texture) but that is as far as it goes -- NVIDIA drivers will change the filter mode (this is related to PCF, which is important) and in DX9 you can't do anything about this.

Also, you can't lock the depth buffer on anything but a NV card.
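To make the "no CAPS, no control" point concrete, here is a hedged sketch reusing the shadowMap depth texture from the earlier example; the comments reflect my understanding of the behaviour described above, not anything the D3D9 documentation guarantees.

Code:
// There is no D3DCAPS9 bit for depth-stencil textures or PCF; the earlier
// CheckDeviceFormat() probe is the only indication that any of this will work.
// Binding the depth texture and picking a filter is all the API lets you do:
device->SetTexture(0, shadowMap);
device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
// On NVIDIA hardware the driver decides what this really means: the sample
// comes back as a depth-comparison (PCF-style) result rather than raw depth,
// the driver may change the filter mode behind your back, and DX9 offers no
// render state, cap bit, or query to control or even inspect any of it.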
 
One question: why have you hidden the thread in General???

Well, I think the smart thing for both ATI and NV to do would be to add application detection code for any game out there that uses this, and if the game isn't supported, default to point sampling. I don't think that would break any of the Futuremark rules, by the sounds of it.
 
bloodbob said:
One question why have you hidden the thread in general???
You can see it; it is not hidden.

This is my soapbox. I have posted 11 other "Reverend at The Pulpit" threads here. These are my thoughts on a variety of things, including personal stuff that can't belong in any other forum. If folks miss my RATP posts, perhaps it's for the better (for me).
 
Reverend said:
No, "DST" itself is not part of the DX9 spec. There is/are no CAPS for it. You can specify depth stencil surface formats (and see if the surface can be a texture) but that is as far as it goes -- NVIDIA drivers will change the filter mode (this is related to PCF, which is important) and in DX9 you can't do anything about this.

Also, you can't lock the depth buffer on anything but a NV card.

Does it do the same thing as Xbox with DS surfaces? i.e. can you specify any of the filter modes, but only bilinear and point have well defined behavior, or do they just force bilinear in the driver?

Actually, I always kind of liked the way the bicubic filter looked, despite the fact that it wasn't working as you'd expect.
 
Is the DST/PCF (or bilinear filtering, whatever) even working properly right now? Looking at some screenshots, it doesn't look that way. IIRC, the filtering on NVIDIA hardware is supposed to blur or anti-alias the shadow edges, isn't it? But it doesn't look like it does anything of the sort at all.
 
Reverend said:
No, "DST" itself is not part of the DX9 spec. There is/are no CAPS for it. You can specify depth stencil surface formats (and see if the surface can be a texture) but that is as far as it goes -- NVIDIA drivers will change the filter mode (this is related to PCF, which is important) and in DX9 you can't do anything about this.

Also, you can't lock the depth buffer on anything but a NV card.

What is DST? Why does it matter? What is PCF? Why does THAT matter?

And what the hell is the significance of whether you can lock the depth buffer or not?! Is it too much to ask for a brief explanation of the terms when they are introduced? :devilish:
 
Reverend said:
This is my soapbox. I have posted 11 other "Reverend at The Pulpit" threads here. These are my thoughts on a variety of things, including personal stuff that can't belong in any other forum. If folks miss my RATP posts, perhaps it's for the better (for me).
You should start a BLOG. ;)
 
Reverend said:
I am in the midst of reviewing a 6800GT by a new board vendor located in the UK. I am also expecting a X800 XT from Visiontek. These two reviews of mine will deviate significantly from the usual B3D review format, which usually goes along the lines of theoretical performance, games performance, AA performance, etc. Hopefully, my new format will be attractive.

Intriguing!

Anybody care to guess what the 6800GT is? Evesham maybe.
 
I can certainly see your point, especially in light of B3D's position in the "3D review spectrum".

I haven't noticed the flickering problem with PSM that you mention, but I am disappointed with the resolution artefacts (as I pointed out before). If this is how "future games" will do shadows, then give me harsh/CPU-intensive stencils any day, especially since in 3DMark05 they are not "soft" at all, just fuzzy/pixellated.
 
MuFu said:
Reverend said:
I am in the midst of reviewing a 6800GT by a new board vendor located in the UK. I am also expecting a X800 XT from Visiontek. These two reviews of mine will deviate significantly from the usual B3D review format, which usually goes along the lines of theoretical performance, games performance, AA performance, etc. Hopefully, my new format will be attractive.

Intriguing!

Anybody care to guess what the 6800GT is? Evesham maybe.

I was under the impression that all cards sold under the Evesham name are from other manufacturers [possibly rebranded] - a bit like the 'Dabsvalue' brand used by Dabs.com. Reason being that the Evesham part nos. exactly match the part numbers of other manufacturers:

Radeon X800XT PE = Part 6063 [Evesham and C3D]
Radeon X800XT = Part 6064 [Evesham and C3D]
Radeon X800 Pro = Part 6061 [Evesham and C3D]
Radeon 9250 = Part 6058 [Evesham and C3D]
Radeon 9550 = Part 11032-01-10 [Evesham and Sapphire]
GeForce FX 6800Ultra = Part A400-TD128SI [Evesham and Leadtek]
GeForce FX 6800 = Part A400-TD128SI [Evesham and Leadtek]
etc.

P69
 
Phantom69 said:
MuFu said:
Anybody care to guess what the 6800GT is? Evesham maybe.

I was under the impression that all cards sold under the Evesham name are from other manufacturers [possibly rebranded] - a bit like the 'Dabsvalue' brand used by Dabs.com. Reason being that the Evesham part nos. exactly match the part numbers of other manufacturers.

P69

Yep. Doesn't make them not a "board vendor" though.
 
Mordenkainen said:
If this is how "future games" will do shadows then give me harsh/cpu intensive stencils any day, especially since in 3dmark05 they are not "soft" at all, just fuzzy/pixellated.
Everything is pixellated when you zoom in and analyze.

The shadows were well done in 3DMark05, IMO. I prefer them to Doom3's :)
 
Perhaps the problem is that the people at Beyond3D here overrate themselves. B3D is just *one* of the beta members. So if you suggest one thing, and all the other beta members suggest something else, your suggestion is discarded. Simple, no?

Another thing is that I think you seem to have forgotten that you are members of the press, not game developers. To point out the problem: how many games or other high-end realtime graphics applications have you written lately? You may know the theory, but you don't have the hands-on experience that game developers do.
This became painfully obvious to me when I discussed 3Dc with DaveBaumann, and he didn't understand the implications of 3Dc not being able to return unnormalized vectors.

So I think you should look at yourself first.
 
Man, you really are on a crusade aren't you Scali? :rolleyes:

How many posts have you wasted on this over the past few days?
 