FP16 and market support

Chalnoth said:
What you are speaking of, mistakenly, is of course a result of nVidia's choice to retain integer units in the original NV3x lineup (NV30-34), in conjunction with Microsoft's refusal to support integer types.

Ah, yes. It's all MS's fault, of course. The fact that some IHV managed to implement (and be first to market with) a full FP part running at great speed should indeed tell you something (namely, that Nvidia got badly burned by their old strategy of designing their next-gen parts as a faster previous-gen part, plus some next-gen features thrown in for OEM checkboxes).

Since the NV30-34 suffer a huge performance hit when going all FP (they lose about 2/3rds of their available power), nVidia has been forced to use integer precision anyway under DX9.

Of course, Nvidia would never decide on their own to hack image quality in order to gain some speed, would they? In Chalnoth's wonderful NV-colored world, they are forced to by bad, evil MS and their forward-looking standards, and bad, nasty ATI that releases parts that follow this spec and offer great speed and image quality. Just like BB is forced to lie, bully and spew BS all day long to feed his family.

It would have been vastly better for everybody involved if Microsoft had just supported integer types.

"Everybody", in Chalnoth's NV-colored vision, means "Nvidia and the mindless Nvidia fanb0ys who bought what they knew was an inferior product and went into denial afterward".

5900 Ultra.

I'm not sure if your Nvidia driver is replacing every mention of it, but he said "0085 Ultra" (reversing the order of digits so your driver won't catch it, hopefully you won't mind the dip in 2D rendering performance).


Integrated chipsets.

Nvidia's excellent sales in the high end market indeed prove that Intel is Nvidia's main competitor... Looks like NV pretty much dropped out of high end and doesn't consider itself in competition with ATI. :)

False. ATI had DX9 hardware first, so developers started developing DX9 titles on ATI hardware. ATI got a head start.

Denial at its finest...

ATI doesn't do any trilinear on any texture stage but the first (when aniso is selected from the control panel).

And it does full trilinear when you specify application-controlled filtering and the app has a setting for trilinear. How can you get full trilinear filtering with the Cheatonators? You can't.

Specifically: what has S3 sacrificed to get that performance?

Well, they have many options, ranging from butchering filtering to replacing shaders with hand-tuned versions (and calling it a generic compiler) to static clip planes. I trust your answer was ironic, or was it a case of the pot calling the kettle black at its finest?

Regardless, the 5600 line is being replaced by the much better 5700 line.

Which is a damn fine consoling thought for all the people who bought Nvidia's li(n)e and a 5600...
 
chalnoth,

Funny that I said 5200, and the 5200 does NOT have FP32 units. FP16, yes, but NOT FP32.

I guess that is why they only run as DX8 cards when detected by DX9 games.

Funnily enough, I can still find the 9700 Pro on ATi's website.

Why do you insist that ATi does not do trilinear, when in fact they do when the application requests it?

I know that while most of us want the best performance and the best IQ for the $, you would much rather have nVidia try and split the industry.

Tell me, is it in gamers' best interest to have nVidia doing what they do? It would be totally different if Nvidia just admitted that this time they did not do the best that they could, and at least adhered to the spec and produced quality drivers that made the most of their hardware instead of cheating their rumps off.

S3 was exterminated for the Savage 2000's flawed hardware, but we are supposed to give Nvidia a pass? (SONICblue, the company directly responsible for the Savage 2000, has filed for Chapter 11.) Now S3 Graphics is tainted by the Savage 2000 (rightfully so).

You know what was funny? When 3DMark03 came out, my 9500 Pro performed better than the 5800 Ultra, and DX9 titles bear this out before nVidia gets a chance to hack them...

Diamond MM may or may not suffer from this mess; I guess we'll find out in a few months when they ship their cards...
 
Chalnoth said:
Microsoft's refusal to support integer types
You told me you never laid out blame. Oops. Why didn't you say "nVidia's refusal to have decent floating-point performance, instead of including worthless legacy precisions"?

nVidia has been forced to use integer precision anyway under DX9
Awww, poor nVidia, "forced" by their terrible performance to cheat. Doesn't sound like "forced" to me; sounds like a choice: "nVidia chose to use integer precision anyway because anything else would result in crappy performance due to the inclusion of worthless legacy precisions in the chip design."

It would have been vastly better for everybody involved if Microsoft had just supported integer types.
It would have been vastly better for everyone involved if nVidia had designed a chip that didn't suck so very badly at floating-point operations.

Chalnoth, do you ever get the feeling that everyone is against you (except for radar1200gs, lol)?
Do you think it could be because everyone is finally sick of your FUD, misinformation, and hugely blinding bias?
 
Demirug said:
Why? Because it is a DX9 chip.

An NV34 can execute 2 vector4 FP32 operations per clock.

I wouldn't be so quick to label a chip DX9 that is considered DX8 by developers. Let's not kid anyone here: a 5200 is unusable with DX9 shaders at full precision.
Intelligent people look beyond marketing PR labels, and the 5200 is a classic example of misleading PR... it may support 'some' of the DX9 feature set, but it is far too slow to be usable, so the end user will have to disable that option in their game, which is no different than not having it at all.
 
Chalnoth said:
FP32 is used for texture ops.

Very slowly, along with other usage of FP32 in NV3x?


Chalnoth said:
It has the same basic architecture as the 5600 and 5800. With no FP32 units, proper texturing could not be achieved.

What you are speaking of, mistakenly, is of course a result of nVidia's choice to retain integer units in the original NV3x lineup (NV30-34), in conjunction with Microsoft's refusal to support integer types. Since the NV30-34 suffer a huge performance hit when going all FP (they lose about 2/3rds of their available power), nVidia has been forced to use integer precision anyway under DX9. It would have been vastly better for everybody involved if Microsoft had just supported integer types.

Ahh, so because Nvidia doesn't support the spec, the solution is to downgrade the spec back down to DX7/8 so that Nvidia can still claim to be "DX9 compliant"?
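
For reference, a minimal sketch of what the spec actually offers (my own illustration, not something from the thread or from any game): under DX9/ps_2_0 the only precision control a developer has is the partial-precision hint, exposed in HLSL as 'half'; there is no integer pixel type at all. That hint is the legitimate route to NV3x's faster FP16 path, which is what the whole "forced to use integer precision" argument revolves around.

#include <cstdio>

// Hypothetical ps_2_0 fragment, shown only to illustrate the precision choices
// DX9 exposes; in a real app this string would be fed to D3DXCompileShader or fxc.
const char* kPs20Source =
    "sampler2D tex : register(s0);                                        \n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR                           \n"
    "{                                                                    \n"
    "    half4  c16 = tex2D(tex, uv); // 'half' compiles to the _pp hint  \n"
    "    float4 c32 = tex2D(tex, uv); // full precision (FP24 minimum)    \n"
    "    return lerp(c16, c32, 0.5);  // no integer types exist here      \n"
    "}                                                                    \n";

int main() { std::puts(kPs20Source); }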

Chalnoth said:
6. Whatever happened to the 5800 Ultra? Can you find it on nVidia's webpage?
5900 Ultra.

Which is not the same product, is it?

Chalnoth said:
6. Why does their FSAA suck in comparison, IQ- and FPS-wise, to ATi's? Why is it that ATi's 6x FSAA looks better and runs faster than nVidia's 8x?
Unfortunate. Hopefully fixed in the NV40.

Meh. Or maybe in the next drivers. Or maybe in NV50. Or maybe when monkeys fly out of Jen-Hsun's butt.

Chalnoth said:
7. Why has nVidia been quoted as saying INTEL is their main competitor?
Integrated chipsets.

Or what we like to call "living in denial". It just shows how worried Nvidia is about ATI when they won't even publicly acknowledge the competition we know exists between the two companies.

Chalnoth said:
8. Why can ATi pretty much run DX9 titles right out of the box, while nVidia has to hand-code for each game and, even worse, pressure developers to remove benchmark modes from their games since it makes them look bad (TRAOD)?
False. ATI had DX9 hardware first, so developers started developing DX9 titles on ATI hardware. ATI got a head start.

Well, NV3x was far, far behind schedule because Nvidia couldn't get it to work properly. ATI's early-adopter advantage is just another point in ATI's favour.


Chalnoth said:
9. Why don't they do full trilinear in UT2K3, while ATi does?
ATI doesn't do any trilinear on any texture stage but the first (when aniso is selected from the control panel).

That's because that mode is designed for forcing trilinear on old apps that don't otherwise support it. When the control panel is set to "application preference" and the application asks for trilinear, it gets it on all the stages it requests it for. This is in stark contrast to Nvidia products, where the application never gets anything except brilinear at best.
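
For anyone wondering what "the application asks for trilinear" means in DX9 terms, here's a minimal sketch (device creation omitted; 'dev' and 'RequestTrilinear' are my own names, not from any game):

#include <d3d9.h>

// Request trilinear filtering on one sampler/texture stage. With the driver's
// control panel set to "application preference", this is supposed to be
// honoured on every stage it is set for.
void RequestTrilinear(IDirect3DDevice9* dev, DWORD stage)
{
    dev->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    dev->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
    // Linear filtering *between* mip levels is what turns bilinear into
    // trilinear; the "brilinear" shortcut is the driver only partially
    // honouring this last state.
    dev->SetSamplerState(stage, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
}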

Chalnoth said:
10. Why is S3's new card faster on some of the DX9 benchmarks than their 5600 and 5600 Ultra cards, and will be about the price of a 5200?
Heh. S3's card is going to suck in so many different ways, its performance will be a secondary consideration. Specifically: what has S3 sacrificed to get that performance?

Maybe they'll lie and cheat on the benchmarks too?

Chalnoth said:
Regardless, the 5600 line is being replaced by the much better 5700 line.

Whoop-de-do. A year after the R300 shipped, Nvidia finally goes from "sucks badly" to "adequate in the midrange".
 
Doomtrooper said:
Demirug said:
Why? Because it is a DX9 chip.

An NV34 can execute 2 vector4 FP32 operations per clock.

I wouldn't be so quick to label a chip DX9 that is considered DX8 by developers. Let's not kid anyone here: a 5200 is unusable with DX9 shaders at full precision.
Intelligent people look beyond marketing PR labels, and the 5200 is a classic example of misleading PR... it may support 'some' of the DX9 feature set, but it is far too slow to be usable, so the end user will have to disable that option in their game, which is no different than not having it at all.

Yes, the use of DX9 features is slow on an FX5200, but it is not unusably slow. If you calculate most pixels with the DX8 feature set, there is some room for the use of DX9 features on small portions of the image.
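
A rough sketch of that idea (the function and flag names are hypothetical, not from any shipping game): keep the bulk of the frame on the DX8 path and only pay for a ps_2_0 shader on the one effect that needs it, gated by a caps check.

#include <d3d9.h>

// Decide per effect whether to run the ps_2_0 (DX9) version or fall back to
// the ps_1_x (DX8) version. On something like an FX5200, most effects would
// keep 'effectNeedsDx9' false, so only a small portion of the image pays the
// floating-point shading cost.
bool UsePs20ForEffect(IDirect3DDevice9* dev, bool effectNeedsDx9)
{
    D3DCAPS9 caps;
    if (FAILED(dev->GetDeviceCaps(&caps)))
        return false;
    return effectNeedsDx9 && caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
}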
 
The only thing I've seen the FX5200 be useful for is as a cheap feature-compatibility testing platform. Everyone I know who bought one for gaming has sold it and gotten a Radeon 9600 -- and they're much happier.
 
Ostsol said:
OpenGL guy said:
What's your point? Is anyone interested in staring at four texels stretched across the whole screen?
The point is to see if the same image can be produced, which would show that the same precision (FP32) is being used in texture lookups. Given that the NV25 supports textures up to 4096x4096, I think it quite likely that it does look up textures with FP32 texcoords.
Texture coords, maybe, but not texture lookups. What need does NV2x have for FP32-per-component texture lookups? None, because the shader runs at far lower precision.
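
A quick back-of-the-envelope check of the texcoord side of that argument (my own arithmetic, not anything from NV25 documentation): with a 10-bit mantissa, FP16 can't even step through every texel of a 4096-wide texture, never mind the sub-texel fractions bilinear filtering needs, which is why large-texture addressing points at FP32 (or wide fixed-point) coordinates.

#include <cmath>
#include <cstdio>

int main()
{
    // Spacing (ulp) of an FP16 number near x, given its 10-bit mantissa:
    // ulp(x) = 2^(floor(log2(x)) - 10).
    for (double x : {256.0, 1024.0, 2048.0, 4095.0}) {
        double ulp = std::ldexp(1.0, (int)std::floor(std::log2(x)) - 10);
        std::printf("texel coordinate ~%6.0f: FP16 spacing = %.3f texels\n", x, ulp);
    }
    // Near 4095 the spacing is 2 texels, so adjacent texels of a 4096-wide
    // texture collapse to the same FP16 value.
    return 0;
}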
 
Ostsol said:
The only thing I've seen the FX5200 be useful for is as a cheap feature-compatibility testing platform. Everyone I know who bought one for gaming has sold it and gotten a Radeon 9600 -- and they're much happier.

Sure, a 9600 is a better gaming solution than a 5200. Hell, you have to pay nearly twice the price for a 9600 as for a 5200.

IMHO a cheap (and slow) DX9 card is better than something like a GF4 MX again. I do not need such a "technology brake" again.
 
OpenGL guy said:
Ostsol said:
OpenGL guy said:
What's your point? Is anyone interested in staring at four texels stretched across the whole screen?
The point is to see if the same image can be produced, which would show that the same precision (FP32) is being used in texture lookups. Given that the NV25 supports textures up to 4096x4096, I think it quite likely that it does look up textures with FP32 texcoords.
Texture coords, maybe, but not texture lookups. What need does NV2x have for FP32-per-component texture lookups? None, because the shader runs at far lower precision.

No, it isn't. The NV2x texture shader works with FP32.
 
radar1200gs said:
I would not put too much weight on what OpenGL guy has to say. Remember, he is/was responsible for OpenGL on the S3 Savage series and for ATi cards. We all know how the OpenGL quality of those products stacks up vs nVidia's OpenGL.
That's it, resort to personal attacks when you can't refute someone else's argument. Anyway, back in the days when I was at S3, the OpenGL driver was quite decent at the time. The few OpenGL games there were played quite well given the limitations of the HW.

But go ahead and insult me some more if it makes you feel good.
 
I've seen the 5200 Ultra for the same price as a 9600 Pro in one of our local stores........ so you never know.

I do think the FX5200 is a step forward from the MX series, but I don't know that I'd call it a true DX9 card as far as performance goes. It has the checkbox, but I don't know if it can use the features.
 
OpenGL guy said:
radar1200gs said:
I would not put too much weight on what OpenGL guy has to say. Remember, he is/was responsible for OpenGL on the S3 Savage series and for ATi cards. We all know how the OpenGL quality of those products stacks up vs nVidia's OpenGL.
That's it, resort to personal attacks when you can't refute someone else's argument. Anyway, back in the days when I was at S3, the OpenGL driver was quite decent at the time. The few OpenGL games there were played quite well given the limitations of the HW.

But go ahead and insult me some more if it makes you feel good.

Mate, I owned a Savage3D. The card should have run the 3dfx Banshee out of the market and provided the TNT-1 with stiff competition. It did neither, mainly thanks to the most abysmal drivers ever to "grace" a graphics card (and because of the drivers, the OEMs and system integrators who would have ensured the card's success would not touch it with a fifty-foot barge pole).

I'm well aware of how the Savage3D did in OpenGL - I experienced it firsthand, which is what led me to buy my first nVidia card...
 
The Savage3D and Savage4 were not bad, not at all... but there was no way the Savage3D was going to bury the TNT, as it lacked single-pass multitexturing support.

The one thing that crippled the Savage3D/4 was its 64-bit memory interface...
 
YeuEmMaiMai said:
The Savage3D and Savage4 were not bad, not at all... but there was no way the Savage3D was going to bury the TNT, as it lacked single-pass multitexturing support.

The one thing that crippled the Savage3D/4 was its 64-bit memory interface...

I know that - I said "provide stiff competition".

At the time, the TNT-1 was very expensive, and a Savage3D implemented right would have succeeded on price. The drivers never allowed it that chance.
 
DaveBaumann said:
Exactly what experience do you have to make the call that drivers or hardware was the issue?
That hasn't stopped him from making calls in the rest of this thread, so why do you expect it to matter now?
 
radar1200gs said:
Mate, I owned a Savage3D. The card should have run the 3dfx Banshee out of the market and provided the TNT-1 with stiff competition. It did neither, mainly thanks to the most abysmal drivers ever to "grace" a graphics card (and because of the drivers, the OEMs and system integrators who would have ensured the card's success would not touch it with a fifty-foot barge pole).
Which goes to show how much you know.
 
radar1200gs said:
I would not put too much weight on what OpenGL guy has to say. Remember, he is/was responsible for OpenGL on the S3 Savage series and for ATi cards. We all know how the OpenGL quality of those products stacks up vs nVidia's OpenGL.
:rolleyes:
By that same token, I will assume that you will offer full credit and congratulations to OpenGL Guy for the excellent DX drivers that he must solely be responsible for.
 