Is 24bit FP a recognized standard?

Brent said:
I wasn't looking for anything to paste onto reviews, just for my own knowledge. I was just curious if 24bit FP in video cards was a recognized standard after hearing what NVIDIA said about it.
Basically it's at least as much of a 'standard' as any of the other widths that have been used for pixel processing - hardware has used precisions like 9-bit and 12-bit in the past (even now we have 'FX12' on the GF:FX cards - where is the standard for that?)

The whole standard talk was just a smoke-screen. IEEE doesn't set the standards for the DirectX API - Microsoft does, and that standard is 24-bit floating point.
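For a rough sense of how those widths compare, here's a small C sketch. The FP24 layout (s16e7) and the FX12 description (12-bit fixed point covering roughly [-2, 2)) are assumptions based on how the hardware is commonly described, not published formats, and the max values assume an IEEE-style reserved top exponent.

Code:
#include <math.h>
#include <stdio.h>

/* Approximate properties of the shader number formats mentioned above.
 * FP16 = s10e5, FP32 = IEEE 754 single (s23e8); FP24 = s16e7 as commonly
 * reported for R300-class parts (an assumption, not a published spec). */
static void describe_float(const char *name, int exp_bits, int man_bits)
{
    double bias    = pow(2, exp_bits - 1) - 1;        /* usual excess bias        */
    double epsilon = pow(2, -man_bits);               /* relative precision       */
    double max     = pow(2, bias) * (2.0 - epsilon);  /* largest finite value,
                                                         assuming the top exponent
                                                         is reserved IEEE-style   */
    double digits  = man_bits * log10(2.0);           /* ~decimal digits          */
    printf("%-5s s1e%dm%-2d  eps=2^-%-2d (~%.1f digits)  max~%.3g\n",
           name, exp_bits, man_bits, man_bits, digits, max);
}

int main(void)
{
    describe_float("FP16", 5, 10);
    describe_float("FP24", 7, 16);
    describe_float("FP32", 8, 23);
    /* FX12 is usually described as 12-bit fixed point over roughly [-2, 2):
     * a constant absolute step of about 2^-10 and no exponent at all. */
    printf("FX12  fixed     step=2^-10           range~[-2, 2)\n");
    return 0;
}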
 
andypski said:
Brent said:
I wasn't looking for anything to paste onto reviews, just for my own knowledge. I was just curious if 24bit FP in video cards was a recognized standard after hearing what NVIDIA said about it.
Basically it's at least as much of a 'standard' as any of the other widths that have been used for pixel processing - hardware has used precisions like 9-bit and 12-bit in the past (even now we have 'FX12' on the GF:FX cards - where is the standard for that?)

The whole standard talk was just a smoke-screen. IEEE doesn't set the standards for the DirectX API - Microsoft does, and that standard is 24-bit floating point.

cool, thanks for the info
 
geo said:
Back in the day when we all thought that 1MB on your video card was pretty rocking and (if I recall correctly) Matrox Millenniums ruled the "high performance" world, 24-bit color was called "True Color" for 2D (16-bit was called "High Color"). Then, and I never quite caught why, it got promoted to 32-bit color, though I think there was a time (maybe still?) where 24-bit was still the output and 32-bit was the internal palette.

Or something like that.

For a long time, wasn't 32-bit "24-bit colour + 8-bit alpha channel"?
 
Bouncing Zabaglione Bros. said:
geo said:
Back in the day when we all thought that 1MB on your video card was pretty rocking and (if I recall correctly) Matrox Millenniums ruled the "high performance" world, 24-bit color was called "True Color" for 2D (16-bit was called "High Color"). Then, and I never quite caught why, it got promoted to 32-bit color, though I think there was a time (maybe still?) where 24-bit was still the output and 32-bit was the internal palette.

Or something like that.

For a long time, wasn't 32-bit "24-bit colour + 8-bit alpha channel"?
And still is. Or, mostly, 24-bit color + 8 bits wasted for nothing but memory alignment.

In the age of "2D accelerators", several cards started to offer 32-bit instead of 24-bit modes for the sake of memory alignment. Back then, many operations were still done on the host CPU with direct access to the frame buffer, and 32-bit-aligned pixels proved to be faster in many circumstances, despite the bandwidth and memory waste.
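To make the alignment point concrete, here's a minimal C sketch. The X8R8G8B8 byte order is just one common arrangement; actual channel order varied by card and mode.

Code:
#include <stdint.h>

/* 32-bit mode: one aligned 32-bit store per pixel, top byte unused
 * (or carrying alpha in 3D). */
static void put_pixel32(uint32_t *fb, int pitch_px, int x, int y,
                        uint8_t r, uint8_t g, uint8_t b)
{
    fb[y * pitch_px + x] = ((uint32_t)r << 16) | ((uint32_t)g << 8) | b;
}

/* Packed 24-bit mode: three separate byte stores, and pixels regularly
 * straddle 32-bit word boundaries - the main reason this was slower for
 * CPU access despite using less memory. */
static void put_pixel24(uint8_t *fb, int pitch_bytes, int x, int y,
                        uint8_t r, uint8_t g, uint8_t b)
{
    uint8_t *p = fb + y * pitch_bytes + x * 3;
    p[0] = b;
    p[1] = g;
    p[2] = r;
}

int main(void)
{
    uint32_t fb32[4 * 4]     = {0};
    uint8_t  fb24[4 * 4 * 3] = {0};
    put_pixel32(fb32, 4, 1, 2, 255, 128, 0);
    put_pixel24(fb24, 4 * 3, 1, 2, 255, 128, 0);
    return 0;
}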
 
Another fine example of Nvidia trying to push FUD on the rest of us... FP24 is the minimum precision called for in DX and ATI adheres to that spec... funny how Nvidia does their best NOT to adhere to that spec... I just wish MS had the balls to forget about FP16... and force Nvidia to follow the industry-standard API...
 
YeuEmMaiMai said:
Another fine example of Nvidia trying to push FUD on the rest of us... FP24 is the minimum precision called for in DX and ATI adheres to that spec... funny how Nvidia does their best NOT to adhere to that spec... I just wish MS had the balls to forget about FP16... and force Nvidia to follow the industry-standard API...
I wonder how many know what really happened during the time when the NV3x was being designed, while NVIDIA (and other IHVs) were talking to MS to define the DX9 specifications....
 
The whole "standard" thing is not just FUD. You can see many people are considering using GPU to do things other than just rendering pixels now. If you really want to do this, you need a good standard for floating point numbers, and IEEE 754 is pretty good. Did Microsoft publish such standard for Direct3D? If so, is it good enough? I think not, considering there's no direct way to get maximum value, minimum value, and eplison.

Of course, for now it's a non-issue. For people who use the GPU just for pixel rendering, 24-bit FP should be enough for many tasks.
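On the CPU side a published standard means the limits are simply handed to you (FLT_EPSILON and friends), and where nothing is published you end up probing for them - the same sort of test people wrote as shader code to measure what precision the hardware actually delivers. A minimal C sketch of the classic epsilon probe:

Code:
#include <float.h>
#include <stdio.h>

int main(void)
{
    /* Halve eps until adding half of it to 1.0 no longer changes the result.
     * The surviving value is the machine epsilon of the float type. */
    float eps = 1.0f;
    while ((float)(1.0f + eps / 2.0f) != 1.0f)
        eps /= 2.0f;

    printf("measured epsilon  : %g\n", eps);         /* ~1.19e-07 for IEEE single     */
    printf("FLT_EPSILON       : %g\n", FLT_EPSILON); /* what the C standard hands you */
    printf("FLT_MIN / FLT_MAX : %g / %g\n", FLT_MIN, FLT_MAX);
    return 0;
}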
 
Reverend said:
I wonder how many know what really happened during the time when the NV3x was being designed, while NVIDIA (and other IHVs) were talking to MS to define the DX9 specifications....
Probably NVIDIA initially aimed at FP32. Note: "partial precision" modifiers were added only in the second beta of the original DX9.0 SDK.
 
pcchen said:
The whole "standard" thing is not just FUD. You can see many people are considering using GPU to do things other than just rendering pixels now. If you really want to do this, you need a good standard for floating point numbers, and IEEE 754 is pretty good. Did Microsoft publish such standard for Direct3D? If so, is it good enough? I think not, considering there's no direct way to get maximum value, minimum value, and eplison.

That's not what DirectX was designed for. It's designed as a gaming API, and that's why they chose FP24. If it had been designed for realtime realistic rendering, they might have chosen FP32. The fact that Nvidia can't reach FP32 at a reasonable speed shows that MS was correct to choose FP24 as an intermediate minimum-precision format that offers high image quality with good speed.

Nvidia *may* have taken the route you suggest of designing NV3x with "things other than just rendering" in mind, but in doing so they have developed hardware unsuited to the requirements of DirectX as a gaming API. NV3x only offers FP32 at too low a speed, or FP16 at too low a precision.
 
Too low a precision in any case?
It's quite funny to hear people complain about FP16 being unacceptable while most graphics cards in use still have far lower precision.
 
X2 said:
Too low a precision in any case?

For newer applications it isn't enough, especially where there are lots of texture passes. Sure, there are some instances where FP16 will be sufficient, but that's not what the spec calls for. Even if you look at Doom3, where Carmack has coded to the Nvidia spec and has a game that doesn't use any of the higher precision, he still says in some cases there is a lowering of IQ. With more advanced games that are programmed for DX9, FP16 often isn't enough unless you want to drop down to DX8.

X2 said:
It's quite funny to hear people complain about FP16 being unacceptable while most graphics cards in use still have far lower precision.

That's why people are trying to *advance* graphics, and why we have DX9 with a higher spec, rather than still using DX6 with a lower spec.

It's quite funny to hear people justify FP16 as acceptable by pointing to all the old tech still in use that has far lower specs. More so when that older hardware obeyed the spec it was built to service, whereas NV3x does not obey the spec it was supposedly built to service.
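To illustrate why the number of passes matters, here's a rough C simulation that rounds the running result to 10 mantissa bits (FP16) or 16 bits (the layout commonly assumed for FP24) after every accumulation step. It models precision loss only, not exponent range, and the per-pass values are made up.

Code:
#include <math.h>
#include <stdio.h>

/* Round x to 'mant_bits' stored mantissa bits (plus the implicit leading bit).
 * Overflow and denormals are ignored - this is a precision sketch only. */
static double round_to_bits(double x, int mant_bits)
{
    if (x == 0.0)
        return 0.0;
    int e;
    frexp(x, &e);                                /* x = m * 2^e, 0.5 <= |m| < 1 */
    double step = ldexp(1.0, e - (mant_bits + 1));
    return round(x / step) * step;
}

int main(void)
{
    const int passes = 64;
    double exact = 0.0, fp16 = 0.0, fp24 = 0.0;

    for (int i = 0; i < passes; ++i) {
        double c = 0.013 * (i % 7 + 1);          /* arbitrary per-pass contribution */
        exact += c;
        fp16 = round_to_bits(fp16 + round_to_bits(c, 10), 10);
        fp24 = round_to_bits(fp24 + round_to_bits(c, 16), 16);
    }

    printf("exact %.6f  fp16 %.6f (err %.2e)  fp24 %.6f (err %.2e)\n",
           exact, fp16, fabs(fp16 - exact), fp24, fabs(fp24 - exact));
    return 0;
}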
 
Bouncing Zabaglione Bros. said:
X2 said:
Too low a precision in any case?

For newer applications it isn't enough, especially where there are lots of texture passes. Sure, there are some instances where FP16 will be sufficient, but that's not what the spec calls for. Even if you look at Doom3, where Carmack has coded to the Nvidia spec and has a game that doesn't use any of the higher precision, he still says in some cases there is a lowering of IQ. With more advanced games that are programmed for DX9, FP16 often isn't enough unless you want to drop down to DX8.

X2 said:
It's quite funny to hear people complain about FP16 being unacceptable while most graphics cards in use still have far lower precision.

That's why people are trying to *advance* graphics, and why we have DX9 with a higher spec, rather than still using DX6 with a lower spec.

It's quite funny to hear people justify FP16 as acceptable by pointing to all the old tech still in use that has far lower specs. More so when that older hardware obeyed the spec it was built to service, whereas NV3x does not obey the spec it was supposedly built to service.

nVidia hardware is invariably built with an eye more towards OpenGL than Direct3D.

If you look back over the history of nVidia cards and drivers, you will find the vast majority of issues occur with Direct3D, not OpenGL.

It's no accident that STALKER has an OpenGL engine, and I can confidently predict that Doom3 will enjoy similar success in engine licensing to Quake3 over the next few years.

The few DirectX9 titles we have seen so far have hardly set the world on fire - rarely has the PC seen so many disappointing games come out. DirectX9 appears to be a great technology for slowing your games down to the point of unplayability so far, with DirectX8.1 not far behind (Morrowind).

Over the years it has been OpenGL titles that have driven 3D gaming forward, and I don't see this changing anytime soon (despite Microsoft's fervent wishes).
 
To radar1200gs: STALKER is a DirectX9 game, so stop ranting here.
Now we are completing the transition to DirectX 9.0. When completed, the latest generation of accelerator boards, such as the GeForce FX, will allow up to 3,000,000 polygons per frame at good fps in S.T.A.L.K.E.R. The game world will be very detailed then.
The minimum guaranteed is 200,000 polygons. The maximum is 3,000,000.
 
DirectX9 appears to be a great technology for slowing your games down to the point of unplayability so far

That is only if, as a good fanb0y, you insist on having Nvidia hardware, of course...

Over the years it has been OpenGL titles that have driven 3D gaming forward, and I don't see this changing anytime soon (despite Microsoft's fervent wishes).

Do you really badly lack a clue, or is your Nvidia bias so strong that it has eaten the perception part of your brain? OpenGL driving 3D gaming forward? That's probably why 99% of PC games released use DirectX... If I look back at games that have driven 3D gaming forward, I don't see any OpenGL game, except perhaps Quake1, where OpenGL support was added as an afterthought. Unreal 1, Deus Ex, Homeworld (supported both D3D and OpenGL), Half-Life (same), Giants, Battlezone... are games that drove 3D gaming forward on the PC.
 
OK, it looks like Chalnoth called in some of his friends...

Regardless of what your personal opinions of MS are, you have to admit they did what no one else could:

They unified the PC standard with Windows and also the gaming standard with DirectX.

In Longhorn it is stated that EVERY window will HAVE to be a DirectX surface. So what does that mean? OGL will be wrapped to DX.

Those of us with a Radeon 9500 Pro or better are going to be able to enjoy an awesome desktop without worrying about whether our cards will be fast enough. The same cannot be said about NV30/31/34, as Nvidia will have to hack their drivers like they are currently doing for just about every popular game, or you can just stick to a DX7 desktop while Radeon R3x0 or greater will get to use DX9.

radar1200gs, it is quite the opposite: Nvidia did their best to slow down DX9 and the advancement of realistic rendering... the way they acted throughout the last 1.5 years is proof of that...
 
radar1200gs said:
Over the years it has been OpenGL titles that have driven 3D gaming forward, and I don't see this changing anytime soon (despite Microsoft's fervent wishes).

Name a few, other than Carmack's creations.
Personally, I find that the vast majority of good modern games are Direct3D, not OpenGL. The exception is the Q3-based first-person shooters, of course. But I would hardly call the myriad of run-of-the-mill shooters "driving 3D gaming forward". Sure, there have been good ones, very good ones, but they are still pretty standard when it comes to graphics and gameplay.

As a matter of fact, the only recent OpenGL-based game I've been playing, other than the Q3-based shooters, is KOTOR. It is of course a good game, but it seems poorly programmed, or rather programmed with only nVidia in mind (just like BioWare's other titles). Even that seems pretty strange, since it was originally a DX8 title for the Xbox.

BTW, very off topic, but has anyone heard BioWare explain why they chose to "port" their game to OpenGL when doing the PC version?

Most games are D3D, the trend is still in favor of D3D, and it will continue to be that way, especially now that the Xbox is here and all. At least IMHO.
 
Slides said:
KOTOR is poorly programmed? By what definition?

Perhaps poorly worded. I rather meant to say "carelessly". BioWare seems to screw up their ATI support time after time. It's almost as if they didn't test the games on ATI hardware.

Sure, it might very well be bad drivers when it comes to the ATI part of the problem. It just seems to me as if BioWare doesn't really bother enough.
 