Reverend at the Pulpit #2

Reverend

Well, it's been a big week for Beyond3D, and that is solely down to being announced as a Futuremark Beta Press Member. The guys here have known this for some time prior to the official press release. The decision, as it should be, came down to Dave simply accepting Futuremark's offer, which he decided to do weeks back. Dave had to consider this carefully for the sake of the site, and it would've been a no-brainer but for:

a) NVIDIA having made their position on 3DMark clear... but only doing so, in a very uncharacteristic yet understandable public manner, after the rather lackluster game-based and 3DMark03-based reviews of their NV30-based products
b) Beyond3D having been labelled an anti-NVIDIA site, for as long as I can remember, for various reasons

Like Dave said in his 24/4 News Update post, I personally find it more exciting to be involved in helping to shape the next 3DMark than to be in the privileged position of beta testing patches and updates prior to gold versions of FM's suite of benchmarking apps. I was extremely excited by this prospect when Dave let the B3D crew know about being approached by FM -- my brain was already working up an appetite! Prior to Dave making his decision, I had written to FM's Patric Ojala, the 3DMark project lead, about what I'd like to see in the next 3DMark (technical stuff, about vertex and pixel shading benchmarking technicalities). Once Dave made his decision, I wrote another email to FM about what I thought the next 3DMark should focus on. I've always viewed 3DMark as a "forward looking" benchmarking app, and I suggested that the next 3DMark should dispense entirely with trying to approximate how DX7 and DX8 games should and could be presented and hence benchmarked. IOW, there already exist DX7 and DX8 games that can be benchmarked... why should FM bother with DX7 and DX8 other than to simply be "complete"? I said that the next 3DMark should be focused entirely on DX9 and the things about DX9 that excite me (and, presumably, programmers in the business of making games). This, of course, will depend heavily on future DX9 hardware -- with vs/ps_3_0 primarily in mind -- as well as the schedule of Longhorn/DX10. As such, the gist of what I wrote FM regarding what I feel FM should concentrate on in the next 3DMark comes down to:

Things in DX9 that matter:

- floating point pixel shader performance
- multiple render target / multiple element texture performance
- performance when rendering with tons of simultaneous textures
- performance when rendering using high precision (64/128-bit textures)
- performance with long, complex pixel shaders
- performance in scenes with tons of complex state changes
- performance in scenes that use lots of render-to-texture

Things (whether in DX9 or not) that don't matter:

- complex vertex shaders (they'll tend to be quite simple, just setting up some vectors for use by really complex pixel shaders)
- higher order surfaces... lots of talk, but nobody actually uses them
- polygon throughput (it's not a limiting factor in games)
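As a purely illustrative aside (mine, not anything from FM): the polygon throughput point can be sketched with a toy cost model. If the vertex and pixel stages overlap, frame time is bounded by the slower stage, so once a long pixel shader dominates, extra polygon throughput buys almost nothing. All the costs below are invented round numbers.

```python
# Toy cost model (illustrative only; all numbers are invented).
# Assumes vertex and pixel work overlap, so the slower stage sets frame time.

def frame_time_ms(num_triangles, pixels_shaded,
                  vs_cost_per_tri_ns, ps_cost_per_pixel_ns):
    """Frame time as the max of the vertex-stage and pixel-stage times."""
    vertex_ms = num_triangles * vs_cost_per_tri_ns / 1e6
    pixel_ms = pixels_shaded * ps_cost_per_pixel_ns / 1e6
    return max(vertex_ms, pixel_ms)

tris = 500_000                      # a hefty scene
pixels = 1024 * 768 * 3             # 1024x768 with roughly 3x overdraw

# Short pixel shader: the frame is (barely) vertex-bound.
short_ps = frame_time_ms(tris, pixels, vs_cost_per_tri_ns=10, ps_cost_per_pixel_ns=2)

# Long, complex floating point pixel shader: per-pixel work dominates.
long_ps = frame_time_ms(tris, pixels, vs_cost_per_tri_ns=10, ps_cost_per_pixel_ns=20)

# Halving the polygon count changes nothing once the frame is pixel-bound.
half_tris = frame_time_ms(tris // 2, pixels, 10, 20)

print(short_ps, long_ps, half_tris)
```

In this toy model the long-shader frame takes exactly as long with half the triangles, which is the sense in which polygon throughput "doesn't matter" once per-pixel work dominates.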

Insofar as my position/role at B3D is concerned, I have personally made involvement in helping to shape the next 3DMark a high priority and will be spending considerable time on this. Dave can have the pleasure of reviewing all the deliciously new video cards, although I wouldn't reject them :) Being a FM beta member is an(other) important milestone in B3D's history... I'd place it slightly below Dave/Kristof's 3dfx-commissioned AA whitepaper ATM.

As for other mundane matters...

My sexy newish work (=real estate) colleague seems to be ignoring me in favour of my other male colleagues. Maybe it's coz I sit in a room (due to my "senior" position in the company) and haven't been able to interact with her as much as I'd like. Wot a bummer. Oh well, I suppose it's all part of the "hitting middle age" feeling when you've been married for 10 years... I think. Don't you agree that middle aged guys always want to have that "rush" of still being attractive to women, even if they're married? ;) :)

The SARS scare continues unabated, and with no foreseeable cure for at least a few years, I have to be careful, especially being in South East Asia. I have banned my family and myself from shopping complexes and the cinemas. Which is actually a good thing, since it means the wife and I have more time to spend with our son. SARS really is scary -- Fairway (ATi board supplier and recently-turned retailer) told me I'd have to wait a little while for their Radeon 9500 review unit to arrive because the guys in their Hong Kong office are afraid to even venture out the door to send the board over here.

ManU vs Real... wot a match, eh? I hope Arsenal wins the Premier League -- not so much because I want them to win it, but because I hate ManU and hope ManU ends up with nothing again.
 
Reverend said:
My sexy newish work (=real estate) colleague seems to be ignoring me in favour of my other male colleagues. Maybe it's coz I sit in a room (due to my "senior" position in the company) and haven't been able to interact with her as much as I'd like. Wot a bummer. Oh well, I suppose it's all part of the "hitting middle age" feeling when you've been married for 10 years... I think. Don't you agree that middle aged guys always want to have that "rush" of still being attractive to women, even if they're married?

I think our egos just need the feedback at our ages more than in the younger years. Been with my wife for 14 years now and it's very nice when a woman at the office gives me the, "Hey, muscle man," or "Hey skinny" (I've lost a LOT of weight in the past two years). The male ego is such a fragile thing. 8)
 
Reverend said:
...

- floating point pixel shader performance
- multiple render target / multiple element texture performance
- performance when rendering with tons of simultaneous textures
- performance when rendering using high precision (64/128-bit textures).
- performance with long, complex pixel shaders.
- performance in scenes with tons of complex state changes.
- performance in scenes that use lots of render-to-texture

Things (whether in DX9 or not) that don't matter :

- complex vertex shaders (they'll tend to be quite simple, just setting up some vectors for use by really complex pixel shaders).
- higher order surfaces... lots of talk, nobody actually use it
- polygon throughput (that's not a limiting factor in games)
...

Assuming discussing something besides the fragility of male egos is OK (would yet another guy confirming it add anything? :p) -- which it might not be, since this was started in General Discussion and not the technical section...:

I'm presuming you are discussing the scoring components of the "next 3DMark", since I don't see a reason to limit the scope of what the benchmark tests outside of the scoring at all.

First, how do you know higher order surfaces will not be a concern for DX 10? Ideas of introducing more functions to the VS, whether PS instruction possibilities or geometry creation, seem to require VS control being more localized for efficiency, and HOS seems well suited for localized model detail management. I thought the reason it wasn't a focus at this moment was insufficient flexibility and power of implementation -- do you have some expectation or knowledge contradicting this?

Hmm...well, I guess I have a complaint for all of your vertex/polygon comments for the same reason, since they all seem predicated on vertex usage not evolving. I'm assuming your talks with game developers are related to your comments, but how many game developers are coding with DX 10 in mind? Aren't they concentrating on lighting solutions for the time being? What happens when they shift focus away from that again, or what about the idea of a solution for some lighting problems being provided by more flexible and powerful vertex processing?

What would be interesting is more discussion of your thoughts on why you discount this for DX10 (and for DX 9 in the interim), though perhaps this thread/forum isn't the place for it (perhaps some answers could be provided in the shadow thread, which to me seems to indicate that an increase in vertex processing performance and flexibility could serve as a solution for some of the problems discussed).

Also, I think it is a mistake to strongly de-emphasize stressing particular components, even if they aren't the most important...as a general principle, it is still possible for new solutions to depart from expectations. I think testing should stress everything effectively to some degree, and then scoring weighting should be used to determine representation of these factors for anticipated gaming workloads. Even if for no other reason than the weighting can be changed at the last minute.

For instance, doing something "DX 7" isn't inherently bad, because all that means is that the particular test stresses base driver efficiency and simple bandwidth/fill rate utilization. It would only be bad if it failed to successfully represent those factors. Each test doesn't have to be completely representative (and can't be, effectively), as long as what it does represent is not unduly weighted in the "score". Such a test would just have to be dedicated to providing sufficient demands on these factors to be a useful component. That said, if every game has some shader utilization by the time of this "next" version, going completely shaderless when testing these factors would be a bad idea, since shader usage interaction would be an important part of them (this last point doesn't disagree with what you said about DX 7, just the emphasis I perceived in it).
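The "stress everything, decide the weighting at scoring time" idea could be sketched like this (a hypothetical Python sketch; the test names, raw results, and weights are all invented and are not Futuremark's):

```python
# Hypothetical scoring sketch: every sub-test is always run, and each factor's
# influence on the single final number is decided only by its weight, which
# can be changed at the last minute without touching the tests themselves.
# All names and numbers are invented for illustration.

SUB_TEST_RESULTS = {            # raw fps-style results from individual tests
    "dx7_fill_rate": 180.0,
    "dx8_shaders": 95.0,
    "dx9_pixel_shaders": 30.0,
    "vertex_shaders": 60.0,
}

def weighted_score(results, weights):
    """Combine raw sub-test results into one score via per-test weights.

    Weights are normalized, so only their ratios matter; a test with
    weight 0 still runs but contributes nothing to the final number.
    """
    total = sum(weights.values())
    return sum(results[name] * w for name, w in weights.items()) / total

# De-emphasize DX7/DX8 without dropping their tests entirely:
dx9_heavy = {"dx7_fill_rate": 0.5, "dx8_shaders": 1.0,
             "dx9_pixel_shaders": 4.0, "vertex_shaders": 1.0}
print(round(weighted_score(SUB_TEST_RESULTS, dx9_heavy), 1))
```

Reweighting like this is the cheap, last-minute knob; which factors get stressed at all is the expensive, baked-in decision -- which is the point being made above.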

Of course, it is my opinion that a test solely based on testing such simple factors is not helpful, but that's related to why I don't think your de-emphasis of vertex processing is a good thing. Completeness in representation is important. The factor I think should be the focus moving forward is representing the interaction between the different components of card performance, as well as providing opportunities for bottlenecks in all factors to be expressed -- both to be distinct from simple synthetic testing (that's what the rest of the benchmark suite is for) and, hopefully, to make "cheating" a bit less commonplace and/or more likely to be related to something that can be offered in games (assuming some basic common ruleset won't be violated by the "cheating" :-?).

For instance, the performance characteristics of a deferred renderer would have an opportunity to be highlighted (characteristics that could easily matter in games, but not in shader-stressing tests), and there are all sorts of possibilities for other design-decision variations if the 3D industry becomes more than a "two party" system again.
 
demalion said:
I'm presuming you are discussing the scoring components of the "next 3DMark", since I don't see a reason to limit the scope of what the benchmark tests outside of the scoring at all.
Actually, my musings above are not about the scoring system of the next 3DMark but about what I felt the concentration should be on. I did suggest to FM a possible scoring system if FM still incorporates DX7/DX8 "game tests" into the next 3DMark scoring system, but I will not discuss that here (short of it: reduce the influence of DX7/DX8 -- if they still exist in the next 3DMark -- on the overall 3DMark score).

First, how do you know higher order surfaces will not be a concern for DX 10? <snipped other HOS stuff>
I'm sorry if it came out that way, but I wasn't indicating that HOS will not have a say, or a priority, in DX10 -- I'm saying that with DX9 and DX9 hardware ATM, HOS is not exactly attractive. You are correct in your assumption about its current lack of flexibility, but not so much on lack of power.

Hmm...well, I guess I have a complaint for all of your vertex/polygon comments for the same reason, since they all seem predicated on vertex usage not evolving.
Again, it is based on DX9 VS, not about it not evolving.

I'm assuming your talks with game developers are related to your comments
Not quite... we all know DX9, its capabilities, its pros and its cons -- all of which should be transparent to anyone who is interested in it.

, but how many game developers are coding with DX 10 in mind?
I think it is safe to say "None" :)

Aren't they concentrating on lighting solutions for the time being? What happens when they shift focus away from that again, or what about the idea of a solution for some lighting problems being provided by more flexible and powerful vertex processing?
I think lighting solutions, or alternatives to them, are one of the areas of focus for game developers, but more than anything else, with DX9 publicly available, as well as DX9 hardware, I think they're more focused (=I'm more focused!) on taking advantage of pixel programming. When, and if, the next iteration of DX and its compliant hardware becomes available and public, it is logical to assume that developers will start considering new stuff. Until then, we can only work with what we have.

Also, I think it is a mistake to strongly de-emphasize stressing particular components, even if they aren't the most important...as a general principle, it is still possible for new solutions to depart from expectations. I think testing should stress everything effectively to some degree, and then scoring weighting should be used to determine representation of these factors for anticipated gaming workloads. Even if for no other reason than the weighting can be changed at the last minute.
My comments are based on what DX9 offers, for now and for the foreseeable future prior to the next 3DMark. Note that I said that I wrote FM about specific VS and PS tests that I would like to see in the next 3DMark... I didn't mean to de-emphasize specific technology tests... on the contrary, I want such tests to be in, but the decision is ultimately FM's to make: make such tests "Feature Tests" or incorporate them into "Game Tests" in the next 3DMark. And then it is also down to FM to determine which "tests" (Feature, Game or otherwise) should influence the weighting, and to what degree.

The rest of your comments :

I do agree that "completeness" is probably a strong component of 3DMark, but I disagree that it is essential to its success. It depends a great deal on the public's perception of 3DMark, much more so than on the reviewers that use 3DMark. The majority public perception of 3DMark is that it is not representative of current gaming performance but of future gaming performance -- the former can easily be proven by existing games, the latter by either 3DMark's "forward looking" tests or other "forward looking" synthetic tests/demos. "Completeness", however, doesn't come down to just public perception of 3DMark -- there are other political considerations to think of for FM... as FM told me, integrated solutions, always being behind, have to be considered.

Anyway, there are many other considerations ("cheating" drivers, for instance, and how best to overcome them) -- my thoughts in my original post are based on what we have ATM and thus on what we should concentrate on.
 
I guess I mistook your mention of DX 10, since I perceived your DX 9 commentary as "things from DX 9 to consider when addressing DX 10".

However...

Reverend said:
Actually, my musings above are not about the scoring system of the next 3DMark but about what I felt the concentration should be on.
...

I'm confused by this statement, because the scoring tests are defined by what they concentrate on. As I said, I see even less need to specify what is tested by synthetic tests outside of the scoring tests than by what I was trying to address. There's a question later that should afford an opportunity to clarify this for me.

...
Hmm...well, I guess I have a complaint for all of your vertex/polygon comments for the same reason, since they all seem predicated on vertex usage not evolving.
Again, it is based on DX9 VS, not about it not evolving.

But vertex usage will evolve within DX 9: there are PS/VS 3.0, which aren't offered in hardware yet. Would PS/VS 3.0 displacement mapping be considered HOS, or do you consider vertex creation necessary? I do have some curiosity as to how adaptive tessellation will be expressed, if it will (again, I feel I may be stepping outside the intended thread direction).

I'm assuming your talks with game developers are related to your comments
Not quite... we all know DX9, its capabilities, its pros and its cons -- all of which should be transparent to anyone who is interested in it.
I understand what the pros and cons are... just as long as you aren't stating that the solutions within those limitations are already defined.

, but how many game developers are coding with DX 10 in mind?
I think it is safe to say "None" :)

Heh, feel free to substitute something defined like "3.0" shaders. Basically, something beyond what has already been "done".

Aren't they concentrating on lighting solutions for the time being? What happens when they shift focus away from that again, or what about the idea of a solution for some lighting problems being provided by more flexible and powerful vertex processing?
I think lighting solutions, or alternatives to them, are one of the areas of focus for game developers, but more than anything else, with DX9 publicly available, as well as DX9 hardware, I think they're more focused (=I'm more focused!) on taking advantage of pixel programming. When, and if, the next iteration of DX and its compliant hardware becomes available and public, it is logical to assume that developers will start considering new stuff. Until then, we can only work with what we have.

Hmm... well, I wasn't under the impression that the complete extent of even what PS 2.0/VS 2.0 is capable of has been uncovered in discussion or execution as of yet, so when I said "shift focus" I wasn't excluding still working within DX 9. In short, I thought there was still quite a bit more left in "what we have" already, which seems to me to argue against your list of things that "don't matter".

Also, I think it is a mistake to strongly de-emphasize stressing particular components, even if they aren't the most important...as a general principle, it is still possible for new solutions to depart from expectations. I think testing should stress everything effectively to some degree, and then scoring weighting should be used to determine representation of these factors for anticipated gaming workloads. Even if for no other reason than the weighting can be changed at the last minute.
My comments are based on what DX9 offers, for now and for the foreseeable future prior to the next 3DMark. Note that I said that I wrote FM about specific VS and PS tests that I would like to see in the next 3DMark... I didn't mean to de-emphasize specific technology tests...

Hmm? My premise was based on the scoring tests, not technology tests...when I see "technology tests", I read it as the benchmark tests outside of those. I guess it is an issue of my taking "don't matter" a certain way, and you meaning something more like "don't focus on"? Answering this will address a lot of my other comments if the answer is "yes".

on the contrary, I want such tests to be in, but the decision is ultimately FM's to make: make such tests "Feature Tests" or incorporate them into "Game Tests" in the next 3DMark.

Well, I understood a particular emphasis in your list, and was expressing my disagreement in general. Of course Futuremark will decide, I was just disagreeing with your input here since I don't have a direct line to Futuremark myself. :(

The rest of your comments :

I do agree that "completeness" is probably a strong component of 3DMark, but I disagree that it is essential to its success.
...

You're talking about final success, which is a matter of "packaging", which I'm not trying to discuss here. I'm talking about success at being a good benchmark, leaving the final success up to how that achievement is presented (my own solution would be to make the focus of benchmarking in the free version more interactively related to "image quality", as I proposed a while ago, to get users their "jollies" from competing on more than just "big numbers"). Then again, perhaps that's the reason you chose this forum, and my focus on that is out of place... but then I'd have to blame you for providing a list of technical concerns in the first place. :p

...
"Completeness" however doesn't come down to just public preception of 3DMark -- there are other political considerations to think of for FM... as FM told me, integrated solutions, being always behind, has to be considered.

Well, my discussion of completeness had nothing (at least, directly) to do with political or (general) user perception; I'd leave that to the presentation and artistic choices for scenes (i.e., GT 2's resemblance to Doom 3, "wow" factor in visuals, etc.).

Anyway, integrated solutions could be suitably covered by older tests... actually, I think de-emphasizing the year in the name, to reduce the negative connotation of using prior benchmarks, is a step towards this. There should be plenty of viable ways to handle that without the "completeness" I was referring to taking a back seat. Of course, DX 9 integrated chipsets could certainly be out by the time of the next 3DMark in any case.

Anyway, there are many other considerations ("cheating" drivers, for instance, and how best to overcome this) -- my thoughts in my original posts are based on what we have ATM and thus based on what we should concentrate on.

Hmm... I guess my picture of what we have at the moment isn't as fixed as what you propose. For instance, my view of the shadow thread is that hardware improvements, while still being "only" DX 9 hardware, would facilitate different choices for the given solutions, and that there are still problems that might be solved by such a hardware change -- or even things yet to be implemented or discussed in terms of shader programs while still being offered within the existing functionality. My point was primarily that your list of things that "don't matter" seems (to me) to propose that such possibilities be actively precluded.
 
demalion said:
I guess I mistook your mention of DX 10, since I perceived your DX 9 commentary as "things from DX 9 to consider when addressing DX 10".
No, what I typed above means things to consider in the next 3DMark, whose WIP may have to consider DX10... or not :). It simply comes down to FM being kept closely informed by MS.

...
Hmm...well, I guess I have a complaint for all of your vertex/polygon comments for the same reason, since they all seem predicated on vertex usage not evolving.
Again, it is based on DX9 VS, not about it not evolving.

But vertex usage will evolve within DX 9: there are PS/VS 3.0, which aren't offered in hardware yet. Would PS/VS 3.0 displacement mapping be considered HOS, or do you consider vertex creation necessary? I do have some curiosity as to how adaptive tessellation will be expressed, if it will (again, I feel I may be stepping outside the intended thread direction).
Vertex usage will evolve, but I was talking about what we have/know now (even including _3_0), and based on this, I do not think the next 3DMark should focus on vertex shading per se. Again, this relates to what I perceive as interesting within 2_0 and 3_0, and that is using simple vertex programs for more complex pixel shading.

Also, I think it is a mistake to strongly de-emphasize stressing particular components, even if they aren't the most important...as a general principle, it is still possible for new solutions to depart from expectations. I think testing should stress everything effectively to some degree, and then scoring weighting should be used to determine representation of these factors for anticipated gaming workloads. Even if for no other reason than the weighting can be changed at the last minute.
My comments are based on what DX9 offers, for now and for the foreseeable future prior to the next 3DMark. Note that I said that I wrote FM about specific VS and PS tests that I would like to see in the next 3DMark... I didn't mean to de-emphasize specific technology tests...

Hmm? My premise was based on the scoring tests, not technology tests...when I see "technology tests", I read it as the benchmark tests outside of those. I guess it is an issue of my taking "don't matter" a certain way, and you meaning something more like "don't focus on"? Answering this will address a lot of my other comments if the answer is "yes".
And the answer is "Yes" -- don't matter = don't focus on is what I meant. We can, of course, discuss the possibility that you disagree with what I consider to be "don't focus on" stuff :)

The rest of your comments :

I do agree that "completeness" is probably a strong component of 3DMark, but I disagree that it is essential to its success.
...

You're talking about final success, which is a matter of "packaging", which I'm not trying to discuss here. I'm talking about success at being a good benchmark, <snip>
And that is what I was trying to suggest in my original post -- my thoughts on what constitutes a good 3DMark benchmark based on:

1) DX9 shaders_2_0
2) DX9 shaders_2_0 hardware
3) DX9 shaders_3_0
4) possible DX9 shaders_3_0 hardware
 
Reverend said:
...
And the answer is "Yes" -- don't matter = don't focus on is what I meant. We can, of course, discuss the possibility that you disagree with what I consider to be "don't focus on" stuff :)

Well, that still leaves some questions about HOS, adaptive tessellation, displacement mapping (if effective and robust in PS/VS 3.0), and my general indication that I think such things should be a focus as well -- so indeed we have been discussing it, and could continue to. :p Maybe with more responses in the shadow thread the discussion will cover some of this. For instance, I'm thinking dynamic branching in the vertex shader is one thing that hasn't had a chance to be displayed to advantage yet (if it can be).

What would happen if a future chip had effectively "8" or more vertex "pipelines" (even if just "sometimes")? It certainly seems possible within the lifetime of DX 9 to me (NV40? R4x0?), and I'm thinking that ties into all of the above questions.

The rest of your comments :

I do agree that "completeness" is probably a strong component of 3DMark, but I disagree that it is essential to its success.
...

You're talking about final success, which is a matter of "packaging", which I'm not trying to discuss here. I'm talking about success at being a good benchmark, <snip>
And that is what I was trying to suggest in my original post -- my thoughts on what constitutes a good 3DMark benchmark based on:

But that last post (not the original one) was talking about "final" success in particular (not just benchmarking); this part of my post was dealing with that discussion specifically.

1) DX9 shaders_2_0
2) DX9 shaders_2_0 hardware
3) DX9 shaders_3_0
4) possible DX9 shaders_3_0 hardware

Well, there is an opportunity for more than one productive discussion, I think: the technical one, which can be somewhat indirectly pursued in the shadow thread, and the discussion of "packaging" (some of which is associated with the thoughts in this post, and with discussion both before and after it in the thread -- maybe "before" is clearer for a start, since some actually tried to get the thread back "on topic" in the "after"... :rolleyes: :p).
 