R420 may beat NV40 in Doom 3 with anti-aliasing

It would seem that provided the application is correctly coded, it is legal to use partial precision exclusively under DX9 (since the July 2003 HLSL update).

Here is the relevant quote from nVidia's GPU Programming Guide, page 17: http://developer.nvidia.com/object/gpu_programming_guide.html

You can use the /Gpp flag (available in the July 2003 HLSL update) to force everything in your shaders to half precision. After you get your shaders working and follow the tips in this section, enable this flag to see its effect on performance and quality. If no errors appear, leave this flag enabled. Otherwise, manually demote to half precision when it is beneficial (/Gpp provides an upper performance bound that you can work toward).
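For reference, you just pass the flag to the HLSL compiler when building your shaders; a minimal example (the filenames here are made up, obviously):

Code:
fxc /T ps_2_0 /E main /Gpp /Fo shader.pso shader.hlsl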

So, that should silence those who claim nVidia illegally forces FP16.
 
radar1200gs said:
So, that should silence those who claim nVidia illegally forces FP16.

What good is it that FP16 is legal, if you get artifacts from poor precision? (disregarding the Doom III engine)
 
I'm not saying it's good or bad. I'm just pointing out it is legal, where people on this forum have said that forcing FP16 is illegal.
 
Radar, it might be legal to use partial precision in DX9, but the problem with Doom 3 and its engine lies in the fact that it is actually written in OpenGL.
 
radar1200gs said:
I'm just pointing out it is legal, where people on this forum have said that forcing FP16 is illegal.

Point being: if a game developer decides that he needs more than FP16 precision in some specific shaders, then you would be breaking the rules of that engine by forcing FP16 über alles.

In this case it really doesn't matter what MS calls illegal or legal.
 
It's all about choice.

If your shader requires FP32 to work properly, don't use the /Gpp flag for that shader and use _PP inside it where appropriate.

For other shaders, just use the flag.
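Roughly what I mean, as a quick sketch (the sampler names and the maths are just made up for illustration):

Code:
sampler2D baseMap;
float4 lightColor;

float4 main(float2 uv : TEXCOORD0) : COLOR
{
    // colour math survives FP16 fine -- declaring these 'half' is what
    // emits the _pp modifier in the compiled ps_2_0 assembly
    half4 base = tex2D(baseMap, uv);
    half4 lit  = base * (half4)lightColor;

    // anything that feeds a texture fetch should stay full precision,
    // or the lost mantissa bits show up as banding/swimming
    float2 detailUV = uv * 8.0f;
    half4 detail = tex2D(baseMap, detailUV);

    return lit * detail;
}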

Of course I personally am of the opinion that D3D9 would be vastly improved by having the /Gpp flag default to on and the _PP hint replaced by a _FP hint.
 
radar1200gs said:
Of course I personally am of the opinion that D3D9 would be vastly improved by having the /Gpp flag default to on and the _PP hint replaced by a _FP hint.

Of course, this is a bad idea: by default you would then be asking developers to constantly chase down which assets are causing precision issues, rather than having a default that causes no or few issues. The default should cause the fewest issues, and those developers who want to improve performance on some hardware then have the choice of spending more time with the app deciding which assets can utilise lower precision.
 
radar1200gs said:
I'm not saying it's good or bad. I'm just pointing out it is legal, where people on this forum have said that forcing FP16 is illegal.

It's only legal because of recent changes made to DX9 in order to accommodate Nvidia's slow hardware. Even then, it's still illegal where Nvidia forces lower precision silently in their drivers via app detection. This is in direct opposition to developers who have programmed their game without PP hints, expecting to get the higher (as per the spec) precision rather than lower precision silently swapped in over their code.
 
radar1200gs said:
If you think they are lying, call them out on it on EB or something. I'm sure nVidia would take great delight in suing you out of existence.
It wouldn't be the first time they tried, it probably won't be the last, and it won't be the first nor last time they fail. (The Dig has a most excellent free legal department. 8) )

I just found it rather hilarious that you used nVidia as a reference to claim what nVidia is doing is "legal". :LOL: It's really kind of sad, in a pathetically desperate fanboy-kind-o-way. :(
 
Bouncing Zabaglione Bros. said:
radar1200gs said:
I'm not saying it's good or bad. I'm just pointing out it is legal, where people on this forum have said that forcing FP16 is illegal.

It's only legal because of recent changes made to DX9 in order to accommodate Nvidia's slow hardware. Even then, it's still illegal where Nvidia forces lower precision silently in their drivers via app detection. This is in direct opposition to developers who have programmed their game without PP hints, expecting to get the higher (as per the spec) precision rather than lower precision silently swapped in over their code.

Try reading the first line of my first post on the subject: http://www.beyond3d.com/forum/viewtopic.php?p=313705#313705

And, yes, I do realise that it was a relatively recent addition and doesn't apply before then, nor does it excuse deliberate shader rewrites/replacements that don't use the described method.
 
radar1200gs said:
And, yes, I do realise that it was a relatively recent addition and doesn't apply before then, nor does it excuse deliberate shader rewrites/replacements that don't use the described method.
Stupid question: then why are you defending it so much? :|
 
digitalwanderer said:
Stupid question: then why are you defending it so much? :|

He's obviously trying to convince us that Nvidia's way is the "right" way, and everyone else got it "wrong". Nvidia's inability to run at high precision with speed, or at high speed with precision, has always been a point of criticism.

Radar is trying to convince us that it's not an issue now that Microsoft has done some political shuffling and allowed low precision under certain circumstances to pander to Nvidia's hardware problems.
 
I'm not defending, and never have defended, blatant cheats. nVidia will have to live with their past as best they can. Mind you, I do differentiate between reversible software cheats and cheats/shortcuts built into hardware.

All I have done is point out the situation as it actually stands at present, rather than the skewed situation some would like to portray as existing.
 
Bouncing Zabaglione Bros. said:
He's obviously trying to convince us that Nvidia's way is the "right" way, and everyone else got it "wrong". Nvidia's inability to run at high precision with speed, or at high speed with precision, has always been a point of criticism.

Radar is trying to convince us that it's not an issue now that Microsoft has done some political shuffling and allowed low precision under certain circumstances to pander to Nvidia's hardware problems.
That was my thinking too, which makes no sense in light of the reality of the situation. :?

radar1200gs said:
All I have done is point out the situation as it actually stands at present, rather than the skewed situation some would like to portray as existing.
Uhm, it IS a rather skewed situation right now! :LOL:
 
digitalwanderer said:
Bouncing Zabaglione Bros. said:
He's obviously trying to convince us that Nvidia's way is the "right" way, and everyone else got it "wrong". Nvidia's inability to run at high precision with speed, or at high speed with precision, has always been a point of criticism.

Radar is trying to convince us that it's not an issue now that Microsoft has done some political shuffling and allowed low precision under certain circumstances to pander to Nvidia's hardware problems.
That was my thinking too, which makes no sense in light of the reality of the situation. :?

radar1200gs said:
All I have done is point out the situation as it actually stands at present, rather than the skewed situation some would like to portray as existing.
Uhm, it IS a rather skewed situation right now! :LOL:

No, I'm not saying nVidia's way is the only way, or the right way.

I'll leave that sort of thing to the ATi supporters who seem to think that ATi's way is the only way.

Having said that though, I fully expect nVidia's way to prevail over time, simply because more ordinary consumers own nVidia cards than ATi cards, and it's the specs of low-end, ordinary consumers, not high-end enthusiasts, that control what tech we see appearing in future games.
 