3D Mark 2001 SE Today?

On 2002-02-13 04:20, Doomtrooper wrote:
Huh ..WTF...The Radeon has a Pixel shader..jump on Rage3D already a thousand posts all over..ATI Sucks can't even get the Pixel shader right..yada..yada
Now we come to find out the Radeon Supports Pixel Shader 1 not 1.1, at the last minute MS changed it..nice.

Doom, the original Radeon never supported and was never designed to support pixel shaders in any shape or form. It's a traditional multitexture card.
 
On 2002-02-13 20:08, Humus wrote:
On 2002-02-13 04:20, Doomtrooper wrote:
Huh ..WTF...The Radeon has a Pixel shader..jump on Rage3D already a thousand posts all over..ATI Sucks can't even get the Pixel shader right..yada..yada
Now we come to find out the Radeon Supports Pixel Shader 1 not 1.1, at the last minute MS changed it..nice.

Doom, the original Radeon never supported and was never designed to support pixel shaders in any shape or form. It's a traditional multitexture card.

http://www.ati.com/na/pages/technology/hardware/radeon/visual_details.html#23

Programmable Pixel Shaders

As you can see from the list of topics described thus far, ATI's Pixel Tapestry architecture is capable of applying a wide range of effects to 3D surfaces in order to enhance their level of realism and detail. The RADEON graphics processor can accelerate all of these features in real time for maximum performance. But game developers are a creative bunch, and the best ones frequently come up with ideas for new graphical techniques to achieve a desired visual effect. In the past, these new techniques could not be hardware accelerated without creating a new graphics chip. Game developers therefore had to write special non-accelerated software routines if they wanted to use these new techniques, and the performance cost associated with this type of solution could often end up being prohibitive. The developer might be forced to leave the feature out of the game altogether, in the hopes that it could be used in their next game when faster computer hardware is available to consumers.

Programmable pixel shaders are an exciting innovation that gives developers a new degree of freedom in creating advanced visual effects. A pixel shader is a simple routine that determines the color of a pixel based on a variety of inputs (base material color, light color, surface reflectivity, bumpiness, transparency, etc.). Pixel Tapestry architecture allows a developer to actually program custom shader routines into the graphics processor itself, allowing them to be accelerated in hardware.

Pixel Tapestry architecture's advanced pixel shader support allows up to three stages of mathematical operations with up to three inputs each to be performed on every pixel in a scene. Each stage of the shader routine can use the results of a previous stage as one of its inputs, and can perform a number of operations including addition, subtraction, multiplication, alpha blending and dot product. The following diagram illustrates the flexibility of the pixel shader hardware found in the RADEON graphics processor:

[Diagram: pixel shader hardware in the RADEON graphics processor]


Huh...now I'm really confused .../looks for some form of alcoholic beverage.
 
It's just marketing bs. When DX8 introduced Pixel Shaders, both NVidia and ATI started reinterpreting their combiner stuff into Pixel Shader lingo. But neither the GF2 nor the Radeon was ever PS1.0 compliant.

There was a PS0.5 or PS0.9 (can't recall the version number) that was mapped to the existing DX7 texture pipeline, and you could interpret this as saying DX7 hardware could do pixel shading, but that's only by an enormous stretch.


Basically, marketroid bs.
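For reference, the "combiner stuff" in question is configured through DX7-style texture stage states rather than uploaded programs. A minimal Direct3D 8 sketch of that kind of setup (device, textures and the function name here are just illustrative):

[code]
// DX7-era fixed-function multitexture: a short cascade of hardwired ops
// (DOT3, modulate, add...) -- the kind of setup later re-described in
// "pixel shader" terms, even though no shader program is ever loaded.
#include <d3d8.h>

void SetupDot3Modulate(IDirect3DDevice8* dev,
                       IDirect3DTexture8* normalMap,
                       IDirect3DTexture8* baseMap)
{
    // Stage 0: N.L via DOT3 between the normal map and a light vector
    // packed into the per-vertex diffuse color.
    dev->SetTexture(0, normalMap);
    dev->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_DOTPRODUCT3);
    dev->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    dev->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_DIFFUSE);

    // Stage 1: modulate the lighting result by the base map.
    dev->SetTexture(1, baseMap);
    dev->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_MODULATE);
    dev->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    dev->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);

    // Stage 2: end of the cascade.
    dev->SetTextureStageState(2, D3DTSS_COLOROP, D3DTOP_DISABLE);
}
[/code]

Call those stages "shader stages" if you like, but the hardware is only picking from a fixed menu of blend ops.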
 
Ok Humus, put up or shut up then. I'm sure if you code a demo in PS1.4 that is both dramatically impressive and better than any PS1.1 demo *AND* CANNOT BE DONE on any PS1.1 hardware, ATI will not only distribute your demo, but might even give you some cash.


In the meantime it would serve you to defend your position, because if you're going to BOAST, you're going to have to prove it. And no, the trivial little texture mapped room demonstrating [0,8] range precision wasn't dramatically better than anything done on other hardware. (And please don't assume that because you've done the equivalent of a HelloWorld in OpenGL that you are now some kind of Carmack.)





[ This Message was edited by: DemoCoder on 2002-02-13 20:39 ]
 
"Finally the true colors show themself, if you can't understand the concept of being objective and FAIR then fine. Your EMBM example was totally unrelated, were talking tests that affect the overall SCORE here since everyone looks at 3DMARK, EMBM does nothing for the score. Nature DOES affect the score, is that too hard to understand ?"

Is it too hard to understand how people here filter everything through anti-Nvidia glasses? Is it too hard to understand the fallacy of claiming that MadOnion didn't put EMBM into 3dmark until NVidia released the GF3 which supported it?


Is it too hard to understand the concept that much of the time you can code an identical algorithm in both PS1.4 and with PS1.1 + multipass and test them both?

Let's look at the theory:

1) You try to do a benchmark demonstrating how PS1.4 leads to better performance
2) You write a PS1.4 demo that can be done in one pass with PS1.4
3) You recode the same demo, but have to resort to multipass for PS1.1
4) The multipass demo turns out to run just as well as PS1.4, showing that collapsing passes isn't always a performance winner


You cry that MadOnion didn't make the test "PS1.4" exclusive. But if the PS1.1 test has identical output, what's the difference? It's not the code that matters, it's the final output.
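To make that concrete, here's roughly how the two paths for one identical effect would be structured in Direct3D 8 (a sketch only; the shader handles and the DrawScene helper are hypothetical and would be created elsewhere):

[code]
#include <d3d8.h>

// Hypothetical handles: DWORDs returned by IDirect3DDevice8::CreatePixelShader
// for previously assembled ps.1.4 / ps.1.1 code.
extern DWORD g_ps14_full;   // whole effect collapsed into one ps.1.4 pass
extern DWORD g_ps11_partA;  // first half of the same math, ps.1.1
extern DWORD g_ps11_partB;  // second half of the same math, ps.1.1

void DrawScene(IDirect3DDevice8* dev);  // issues the actual draw calls

void RenderEffect(IDirect3DDevice8* dev, bool hasPS14)
{
    if (hasPS14)
    {
        // Single pass: one longer shader, geometry rasterized once.
        dev->SetPixelShader(g_ps14_full);
        DrawScene(dev);
        return;
    }

    // Fallback: identical result built up over two ps.1.1 passes.
    dev->SetPixelShader(g_ps11_partA);
    DrawScene(dev);

    // Second pass adds its contribution into the frame buffer, so the
    // geometry is transformed and rasterized a second time.
    dev->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
    dev->SetRenderState(D3DRS_SRCBLEND,  D3DBLEND_ONE);
    dev->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_ONE);
    dev->SetPixelShader(g_ps11_partB);
    DrawScene(dev);
    dev->SetRenderState(D3DRS_ALPHABLENDENABLE, FALSE);
}
[/code]

Whether collapsing to one pass wins then depends on whether the extra vertex work and frame-buffer traffic of the second pass cost more than the longer shader does, which is exactly what a side-by-side test would show.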
 
Is it too hard to understand the concept that much of the time you can code an identical algorithm in both PS1.4 and with PS1.1 + multipass and test them both?

Damnit, DemoCoder, then why didn't Mad-Onion do this! That's the entire point. There is NO GAME PERFORMANCE TEST IN 3D MARK where MadOnion coded the same scene with both 1.4 and 1.1 + multipass methods, so we could begin to meaningfully analyze results!

Let's look at the theory:

1) You try to do a benchmark demonstrating how PS1.4 leads to better performance
2) You write a PS1.4 demo that can be done in one pass with PS1.4
3) You recode the same demo, but have to resort to multipass for PS1.1
4) The multipass demo turns out to run just as well as PS1.4, showing that collapsing passes isn't always a performance winner

Interesting THEORY. The PROBLEM is, Mad-Onion has STATED that they never had the intention of doing step number 1. Again, THAT'S the POINT.

If MadOnion created an "Advanced Pixel Shader" game scene with the purpose of looking at PERFORMANCE, then we would all be able to see for ourselves how it turns out. The point is, 3D Mark 2001 does virtually NOTHING to help us examine the impact of coding for PS 1.4 vs. 1.1. So you have a theory...wouldn't it be nice to have some way of testing that theory? Mad-Onion missed out on that opportunity.

Again, I repeat from the Mad-Onion FAQ, my own emphasis added:

A: The Advanced Pixel Shader test is what we call a Feature Test, which means that we, above all, want to present some new technology. It was decided that a fall-back was to be included in addition to the 1.4 pixel shader, since the same looking effect can be achieved using pixel shader 1.0 hardware. These two different modes of that same test work a bit DIFFERENTLY and should, therefore, NOT BE DIRECTLY COMPARED. Both modes COULD be optimized to show more performance either way, but now the TEST is just OPTIMIZED for maximum COMPATIBILITY. Vertex shader performance also affects the score, somewhat, due to this compatibility optimization.

[ This Message was edited by: Joe DeFuria on 2002-02-13 21:15 ]
 
I still don't get it.
Why include a PS1.4 test in the score when there are a lot of other tests that are not included? What's the big deal with PS1.4? I understand that it will give Radeon owners a higher score, but I really didn't think you guys were so obsessed with high 3DMark scores :smile: (that's "our" job as nvidiots :D )

How about including a test with lots of overdraw? That would really help Kyro cards. I don't see them bitching.
Version 1.1 is an upgrade, NOT a new test. Get that into your heads.

It's not a big conspiracy every time :smile: Not everyone is bought by NVIDIA. Each time a review writes some bad stuff (even if it is very positive on most points), they are bought by NVIDIA. If they do their testing "wrong" by not testing only games where the Radeon shines, it's a conspiracy. Anandtech, Tom's Hardware, MadOnion blah blah blah ..... the list goes on and on and on.

I just laugh my ass off every time a review writes only positive stuff about the Radeon: YES, GREAT REVIEW. Finally a site not bought by NVIDIA hehehehe.

[ This Message was edited by: Galilee on 2002-02-13 22:13 ]
 
Why include a PS1.4 test in the score when there are a lot of other tests that are not included? What's the big deal with PS1.4?

The "big difference" is that PS1.4 is the major architectural difference between official Direct3D revisions. PS 1.4 is the primary difference between DX 8.0, and 8.1.

How about including a test with lots of overdraw? That would really help Kyro cards. I don't see them bitching.

The point of the "game" tests are to generate scenes that hopefully mimic the types of scenes in today's, and near future games. Note that the "high detail" tests do usually feature more object and hence more overdraw, which should benefit deferred renderers.

I just laugh my ass off every time a review writes only positive stuff about the Radeon: YES, GREAT REVIEW. Finally a site not bought by NVIDIA hehehehe.

I agree with you there. ;)
 
Overdraw is not a DX 8.1 feature, is it? How hard is it for you to understand that?

The front page of Madonion:

Finally! The long awaited update to 3DMark2001 is released! 3DMark2001 Second Edition brings you the latest in benchmarking! With support for DirectX8.1, a new test, all reported issues fixed and full support in WindowsXP, this package is unbeatable!

How hard is it for you to grasp this: PS 1.4 is a part of DX 8.1 and should be PART of the scoring..it's really not that hard, is it? :rolleyes:
 
I think the 64,000 dollar question is: What's the big stink about 3DMark2001? Unless something has changed, it's not a very fun game. :smile:

Are you using it as an extension of your anatomy? (hehe, my 3DMark score is bigger!!) ;) Sure, it may influence your buying decisions, but it really shouldn't make or break the deal. If it is, then *ATI*, not a forum of PC gamers, should worry about it....unless, of course, you're worried that someone else's is bigger than yours (3DMarks, that is)

Or maybe you're just justifying your purchase. Any way I look at it, it seems rather silly to me.

[ This Message was edited by: BenM on 2002-02-13 22:42 ]
 
Once again Doomtrooper shows his "cut and pasting" habit without having a clue about the content of what he even just pasted ;)

Claiming 640x480 stressed the video card, and now he's claiming the original Radeon has DX pixel shader support. Doom, what are the requirements for DX pixel shader 1.0 compliance? (Oh, and answer in your own words and not another cut and paste this time ;) )

How hard is it for you to grasp this: PS 1.4 is a part of DX 8.1 and should be PART of the scoring..it's really not that hard, is it?

How hard is it for you to grasp that PS 1.4 is not required for DX8.1 compliance? The difference between the Nature test and the new PS 1.4 test is that the nature test used PS 1.0, which is a requirement for DX8.0 compliance. Why would Madonion make a PS 1.4 test score matter when it is not even a requirement? As a matter of fact, you should have no problem if they followed your suggestion, which would allow the GF4's PS 1.3 support to be taken into final mark consideration. Do you see the irony of your stance?
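And the runtime backs that up: PS 1.4 is just a capability the device may or may not report, so a DX8.1 app queries the caps and picks a path. A minimal sketch (error handling omitted; the function name is mine):

[code]
#include <d3d8.h>

// Pick the best pixel shader path the device reports.
// D3DPS_VERSION packs a major.minor version for comparison
// against the PixelShaderVersion member of D3DCAPS8.
int SelectPixelShaderPath(IDirect3DDevice8* dev)
{
    D3DCAPS8 caps;
    dev->GetDeviceCaps(&caps);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        return 14;   // Radeon 8500 class
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return 11;   // GeForce3/4 class
    return 0;        // fixed-function fallback
}
[/code]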
 
I think the 64,000 dollar question is: What's the big stink about 3DMark2001? Unless something has changed, it's not a very fun game.

Well, the "big stink" is that 3DMark scores is one tool that I (and I bet many others) personally use when deciding to buy a new card. It's not a fun game at all ;), and no single benchmark should be used in isolation. But it has the "potential" to be one of the best single indicators of "combination of current and future performance", precisely because it is not a game, but a synthetic benchmark that supposedly uses cutting edge API techniques in "game like" situations.

3D Mark is also a huge marketing tool for the IHVs. So I simply want to see the "results" of the tool reflect, as accurately as possible, how "good" each card really is.

I simply don't feel that enough consideration was given to PS 1.4, and to how it might impact game performance and/or quality. In fact, ZERO consideration is given to PS 1.4 architectures, because having 1.4 shaders does not impact the 3D Mark score at all. Therefore, I don't feel the scores that 3D Mark generates are "fair" for products supporting PS 1.4.

Someone else said it here, and I'll repeat it...I don't really care whether the "changes" I suggested would put ATI further on top or not. The point is, I feel it could be a more useful and accurate tool.

If adding a PS1.4 performance test with a 1.1 fallback would in fact increase the Radeon score, so be it. If a PS 1.1 card can render it faster with several passes and comparable image quality, great.

Point is, we don't know. :cry: And that makes buying decisions and recommendations like "do I get a Radeon 8500, or a GeForce3 Ti" more difficult.



[ This Message was edited by: Joe DeFuria on 2002-02-13 22:55 ]
 
On 2002-02-13 22:40, BenM wrote:
I think the 64,000 dollar question is: What's the big stink about 3DMark2001? Unless something has changed, it's not a very fun game. :smile:

Are you using it as an extension of your anatomy? (hehe, my 3DMark score is bigger!!) ;) Sure, it may influence your buying decisions, but it really shouldn't make or break the deal. If it is, then *ATI*, not a forum of PC gamers, should worry about it....unless, of course, you're worried that someone else's is bigger than yours (3DMarks, that is)

Or maybe you're just justifying your purchase. Any way I look at it, it seems rather silly to me.

[ This Message was edited by: BenM on 2002-02-13 22:42 ]

Actually it would be nice to see how much improvement PS 1.4 gives over 1.1. MadOnion had the opportunity to do so, and chose not to. So the Radeon 8500, which is a true DX 8.1 card, is not gaining ANYTHING in the nature test (which affects the score) but instead is relegated to running some lame technology demo...So in retrospect, the original NATURE TEST should also be removed from the scoring, as it's really just a TECHNOLOGY demo to them.
 
MO should take a scene, send the code to NV and ATI, and tell them to optimize the hell out of it, but not change any of what is actually happening. So we've got one scene, with one version doing it the PS 1.4 way and the other PS 1.1. Both are heavily optimized and they get the same result. Then run a performance comparison. That would be fair.
 
On 2002-02-13 22:48, Exposed wrote:
Once again Doomtrooper shows his "cut and pasting" habit without having a clue about the content of what he even just pasted ;)

Claiming 640x480 stressed the video card, and now he's claiming the original Radeon has DX pixel shader support. Doom, what are the requirements for DX pixel shader 1.0 compliance? (Oh, and answer in your own words and not another cut and paste this time ;) )

How hard is it for you to grasp this: PS 1.4 is a part of DX 8.1 and should be PART of the scoring..it's really not that hard, is it?

How hard is it for you to grasp that PS 1.4 is not required for DX8.1 compliance? The difference between the Nature test and the new PS 1.4 test is that the nature test used PS 1.0, which is a requirement for DX8.0 compliance. Why would Madonion make a PS 1.4 test score matter when it is not even a requirement? As a matter of fact, you should have no problem if they followed your suggestion, which would allow the GF4's PS 1.3 support to be taken into final mark consideration. Do you see the irony of your stance?

The other post was a direct link to ATI referring to the programmable Charisma Engine, which had EVERYONE confused..including me. Going by that link, there is a form of pixel shader functionality on the Radeon.
 
The other post was a direct link to ATI referring to the programmable Charisma Engine, which had EVERYONE confused..including me. Going by that link, there is a form of pixel shader functionality on the Radeon.

Yes, but they are not DX PS 1.0 compliant. The original Radeon cannot render the same images that DX PS 1.0 can render, even with some kind of "fallback." (Some people on various boards a while back claimed the same pixel shading effects on the GF3 could be achieved with EMBM on the Radeon, which would ONLY be true if PS were used for EMBM-type effects.)
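For reference, EMBM is a single hardwired texture-stage operation, not a program, which is why it can only fake a narrow slice of what PS 1.0 allows. Roughly like this (a sketch; the function name and textures are assumed):

[code]
#include <d3d8.h>

// DX6-era environment-mapped bump mapping: the (du, dv) bump texture in
// stage 0 perturbs the coordinates used to sample the environment map in
// stage 1. One fixed operation -- there is no shader program involved.
void SetupEMBM(IDirect3DDevice8* dev,
               IDirect3DTexture8* bumpMap,  // signed du/dv texture
               IDirect3DTexture8* envMap)
{
    dev->SetTexture(0, bumpMap);
    dev->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_BUMPENVMAP);
    dev->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);

    // 2x2 matrix scaling the du/dv perturbation (floats passed as DWORD bits).
    float scale = 0.05f;
    dev->SetTextureStageState(0, D3DTSS_BUMPENVMAT00, *(DWORD*)&scale);
    dev->SetTextureStageState(0, D3DTSS_BUMPENVMAT11, *(DWORD*)&scale);

    // Stage 1: output the perturbed environment map lookup.
    dev->SetTexture(1, envMap);
    dev->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_SELECTARG1);
    dev->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    dev->SetTextureStageState(2, D3DTSS_COLOROP,   D3DTOP_DISABLE);
}
[/code]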
 
On 2002-02-13 22:48, Exposed wrote:
How hard is it for you to grasp that PS 1.4 is not required for DX8.1 compliance? The difference between the Nature test and the new PS 1.4 test is that the nature test used PS 1.0, which is a requirement for DX8.0 compliance. Why would Madonion make a PS 1.4 test score matter when it is not even a requirement? As a matter of fact, you should have no problem if they followed your suggestion, which would allow the GF4's PS 1.3 support to be taken into final mark consideration. Do you see the irony of your stance?

Please tell me where Microsoft has stated that PS 1.4 (or 1.2 or 1.3, for that matter) is not a required feature for DX8.1 compliance. I know Nvidia has stated that, but of course they would. ATI has rebutted it, but of course they would. For that matter, where does MS state that Pixel Shaders (regardless of version) are a required feature of DX8.0?

The reality is that MS released DX8.1 with PS 1.2, 1.3, and 1.4 support included. GF3 supports none of those. GF4 supports some, so I would concede that the GF4 is compliant.

[ This Message was edited by: nooneyouknow on 2002-02-13 23:39 ]
 
(Some people on various boards a while back claimed the same pixel shading effects on the GF3 could be achieved with EMBM on the Radeon, which would ONLY be true if PS were used for EMBM-type effects.)

Right...so why didn't Mad Onion skip the nature "game" performance test, and instead make a "Pixel Shader 1.1" feature test with an EMBM fallback? This would be a Pixel Shader 1.1 test "designed for compatibility." Then we could see what "improvement" PS 1.1 is over EMBM...if any...right?

[ This Message was edited by: Joe DeFuria on 2002-02-13 23:35 ]
 
On 2002-02-13 20:38, DemoCoder wrote:
Ok Humus, put up or shut up then. I'm sure if you code a demo in PS1.4 that is both dramatically impressive and better than any PS1.1 demo *AND* CANNOT BE DONE on any PS1.1 hardware, ATI will not only distribute your demo, but might even give you some cash.


In the meantime it would serve you to defend your position, because if you're going to BOAST, you're going to have to prove it. And no, the trivial little texture mapped room demonstrating [0,8] range precision wasn't dramatically better than anything done on other hardware. (And please don't assume that because you've done the equivalent of a HelloWorld in OpenGL that you are now some kind of Carmack.)

LOL, democoder, I knew you would react like this :smile:, but you don't need to get so offended by everything I say just because you hate me ;)
Really, I only said that it may be "hard for some", I didn't say that it wasn't hard for me, nor did I say that I think you are incapable of doing it, but it's kinda fun to test your temper :p

Anyway, speaking of my "trivial little texture mapped room" (which is, btw, a little more than a texture mapped room, and the screenshots were pulled from a work in progress), I released a demo based on it not very long ago. There's a topic about it here:

http://216.12.218.25/domain/www.beyond3d.com/forum/viewtopic.php?topic=128&forum=2&0

But really, I don't claim to be anywhere near Carmack's brilliance, but PS 1.4 isn't "hard". Coming up with an idea for an effect that requires PS 1.4 shouldn't take long; a specular exponent stored in alpha, for instance, can really enhance the output of specular lighting. The treasure chest demo from ATi shows many effects that require PS 1.4, and it shouldn't be too hard to modify or take those ideas further.
 
Humus --

That effect doesn't require PS1.4. It can be achieved just fine with PS1.1 and multipass/render-to-texture.

You use the first pass to render per-pixel specular exponent and H dot N into the R and A channels of the frame buffer.

Then, apply that frame buffer as a projective texture in stage 0, and use an AR dependent read into a 2D r^s texture map in stage 1. Exactly the same effect performed in 2 passes.
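If I've understood the recipe, the second pass maps onto ps.1.1's texreg2ar dependent read, roughly like this (untested sketch, from memory of the DX8 D3DX interface; it assumes the pass-1 render target is bound at stage 0 with the exponent in red and H.N in alpha as described, and a precomputed pow lookup texture at stage 1):

[code]
#include <d3d8.h>
#include <d3dx8.h>

// ps.1.1 shader for the second pass: stage 0 = pass-1 render target
// (red = per-pixel specular exponent, alpha = H.N), stage 1 = a 2D lookup
// texture whose texel at (u, v) stores u raised to the exponent encoded by v.
// texreg2ar feeds (u, v) from the alpha and red of t0, so the lookup
// returns pow(H.N, exponent) for each pixel.
static const char g_ps11_pass2[] =
    "ps.1.1\n"
    "tex t0\n"             // sample the pass-1 buffer (projective coords set up outside the shader)
    "texreg2ar t1, t0\n"   // dependent read into the pow() lookup at stage 1
    "mov r0, t1\n";        // output the specular term

DWORD CreatePass2Shader(IDirect3DDevice8* dev)
{
    LPD3DXBUFFER code = NULL;
    DWORD handle = 0;
    // Assemble and create the shader; error handling omitted for brevity.
    D3DXAssembleShader(g_ps11_pass2, sizeof(g_ps11_pass2) - 1,
                       0, NULL, &code, NULL);
    dev->CreatePixelShader((DWORD*)code->GetBufferPointer(), &handle);
    code->Release();
    return handle;
}
[/code]

The ps.1.4 version folds the same lookup into a single pass with a dependent texld after the phase marker, so the only real question is how much the extra pass costs.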
 