high-precision 10-bit per channel frame buffer format in DX9

rwolf

Does anyone know anything about this feature of the 9700Pro and DirectX 9.0? It is supposed to let you see billions of colors instead of millions. Has anyone noticed a difference? Does this work on any monitor? Is this feature even enabled currently? I know it won't buy you anything on an LCD, but how about on a high-quality monitor?

Please comment if you have any info. :?:
 
Re: high-precision 10-bit per channel frame buffer format in DX9

rwolf said:
Does anyone know anything about this feature of the 9700Pro and DirectX 9.0? It is supposed to let you see billions of colors instead of millions. Has anyone noticed a difference? Does this work on any monitor? Is this feature even enabled currently? I know it won't buy you anything on an LCD, but how about on a high-quality monitor?

Please comment if you have any info. :?:
The eye can't see billions of colours. Human vision, however, has a non-linear response, and so display systems are usually set up to give a non-linear (i.e. gamma) mapping from bit values to intensity.

What a 10bpc back/framebuffer would (almost) let you do is work entirely in linear colour space and then, if you wanted to, convert back to an 8bpc non-linear space to save framebuffer memory.
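To make this concrete, here is a minimal sketch (assuming a simple 2.2 gamma rather than any particular display's curve; the helper names are made up) of quantizing a linear-light value either to 8bpc with a gamma encode or to 10bpc kept linear:

#include <math.h>

// Hypothetical helpers: quantize a linear-light value in [0,1] either
// to 8 bits with a 2.2 gamma encode, or to 10 bits kept linear.
unsigned char To8bpcGamma(float linear)
{
    return (unsigned char)(powf(linear, 1.0f / 2.2f) * 255.0f + 0.5f);
}

unsigned short To10bpcLinear(float linear)
{
    return (unsigned short)(linear * 1023.0f + 0.5f);
}

The "(almost)" matters: near black, 8 gamma-encoded bits still resolve finer steps than 10 linear bits, so 10bpc linear is close to, but not quite, a free lunch.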
 
Re: high-precision 10-bit per channel frame buffer format in DX9

Simon F said:
rwolf said:
Does anyone know anything about this feature of the 9700Pro and DirectX 9.0? It is supposed to let you see billions of colors instead of millions. Has anyone noticed a difference? Does this work on any monitor? Is this feature even enabled currently? I know it won't buy you anything on an LCD, but how about on a high-quality monitor?

Please comment if you have any info. :?:
The eye can't see billions of colours. Human vision, however, has a non-linear response, and so display systems are usually set up to give a non-linear (i.e. gamma) mapping from bit values to intensity.

What a 10bpc back/framebuffer would (almost) let you do is work entirely in linear colour space and then, if you wanted to, convert back to an 8bpc non-linear space to save framebuffer memory.

Good enough for me, so when do we see this implemented?
 
I thought the whole idea was to give you greater control of brightness.

So you can have bright colors and dark colors in the same scene. For example, blue runs from 0-255 at 8 bits per channel; with 10 bits per channel you get 0-1023.

You may not be able to see billions of colors, but you should be able to see more than 256 shades of blue.
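For reference, a 10/10/10/2 pixel packs into the same 32 bits as a normal 8/8/8/8 one. A hypothetical C++ helper showing the D3DFMT_A2R10G10B10 layout:

// Pack 2-bit alpha and 10-bit R, G, B values into one 32-bit pixel
// (D3DFMT_A2R10G10B10 layout: alpha in the top two bits).
unsigned int PackA2R10G10B10(unsigned a2, unsigned r10,
                             unsigned g10, unsigned b10)
{
    return (a2 << 30) | (r10 << 20) | (g10 << 10) | b10;
}

Same memory footprint, four times the levels per colour channel; the price is an alpha channel with only four levels.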
 
Hello?? Has everybody forgotten the Parhelia spec sheet this fast?

Matrox has had the option of GigaColor rendering for over half a year already, via a DX8.1 fallback. Still, it's nice to see it supported under DX9 too.

So it has already been implemented.
 
Nappe1 said:
Hello?? Has everybody forgotten the Parhelia spec sheet this fast?

Matrox has had the option of GigaColor rendering for over half a year already, via a DX8.1 fallback. Still, it's nice to see it supported under DX9 too.

So it has already been implemented.

Are you saying the R300 already uses it in every D3D game?
 
Nappe1 said:
Hello?? Has everybody forgotten the Parhelia spec sheet this fast?

Matrox has had the option of GigaColor rendering for over half a year already, via a DX8.1 fallback. Still, it's nice to see it supported under DX9 too.

So it has already been implemented.
High-precision framebuffers have been around for a lot longer, e.g. on SGI workstations.
 
I understand this is a hardware feature that is supported under DirectX 9.0 only.

I don't think Matrox invented this feature just because they were first; I think it was part of the DirectX 9.0 spec.
 
Simon F said:
Nappe1 said:
Hello?? Has everybody forgotten the Parhelia spec sheet this fast?

Matrox has had the option of GigaColor rendering for over half a year already, via a DX8.1 fallback. Still, it's nice to see it supported under DX9 too.

So it has already been implemented.
High-precision framebuffers have been around for a lot longer, e.g. on SGI workstations.

Yes, they have, but I am kind of surprised that it has been less than 7 months since the Parhelia launch and people are already forgetting even its rare good sides. Parhelia was a disappointment, but that is no excuse to downplay or forget the new features it introduced. (So it's no wonder people still think hardware T&L setup was invented by nVidia with the NV10, or that higher-precision frame buffers haven't existed until now...) ;)

And KILLER, I don't know about ATI, but Parhelia has supported 10-bit-per-channel rendering (with only 2 bits left for final framebuffer alpha) since its first drivers. All drivers have had the option to force D3D into GigaColor mode. (With some games it works well, with others not so well, due to the fact that framebuffer alpha has only 2 bits.)
 
From an article at Gamespy (I think... me forgets the link):

Gamespy's article said:
Those who are not proponents of 64-bit color are also often quick to point out that current monitors only have approximately 32 bits of resolution, and roughly a 10-bit color component. While true, Carmack believes that this is beyond the point. The benefits of 64-bit color, he feels, manifest themselves in other ways.

He went on to explain that "While a monitor may only display this, say, 10-bit resolution on there, the human eye is capable of perceiving well over a range of 64,000 [colors]... like the difference between these lights that are shining right in my eyes here, and the floor... sitting down there, between the aisles. That's a difference of hundreds of thousands of levels."

"The way you should be calculating all graphics... the way it ought to be done is: you're basically counting out photons that are, you know, imprinted on a surface. Lights spray out a whole lot of photons, that are collected on surfaces. If you're doing things really right, you have an inverse square falloff, and you have a radiosity map, and all this... So what we want to do is do all of this calculation the right way, and then... we know at the end that it's going to be going on to some... not exactly optimal solution that we want on the monitors there. But there's still a lot of benefit to gain by doing all of the intermediate calculations the way you really should do it. "

He explained that the benefit to more realistic lighting is that it reduces the dimming effect that the various "crappy approximations" that engine coders have had to use until now often produce. This dimming effect is often a source of frustration for artists and content designers, as the piece of artwork they create in photoshop can (and frequently does) look vastly different from what is rendered out in realtime by the game. While Doom does not fully calculate light the way Carmack would like, it is capable of specifying lights that are much brighter than the normal range. The downside of this capability is that, without radiosity bounces, it can make the edges much more harsh.

"But... with the next generation stuff, all you really have to do is basically say 'this light is arbitrarily bright,' and everything just kind of magically works."
 
The human visual system can indeed "see" something like 100,000 colors or so. The main point is that the distribution of these colors isn't uniform amongst the R, G and B components. If I remember correctly, humans are most sensitive to green and can see several thousand shades of it. Of course the dynamic range is also very important, both its magnitude and the non-linear (somewhat exponential?!) distribution of color values across the total range.
 
People are also much more sensitive to brightness gradations at low brightness. That is, you're most likely to see banding in a dark portion of the scene (particularly if you increase the gamma).
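You can see how big those dark steps get with a quick back-of-the-envelope loop (assuming, for illustration, a plain 2.2 gamma lift applied at scan-out):

#include <math.h>
#include <stdio.h>

int main()
{
    // Output step between adjacent dark 8-bit codes after a 2.2 gamma lift.
    for (int code = 1; code <= 4; ++code) {
        float lo = powf((code - 1) / 255.0f, 1.0f / 2.2f);
        float hi = powf(code / 255.0f, 1.0f / 2.2f);
        printf("code %d -> step %.4f of full scale\n", code, hi - lo);
    }
    return 0;
}

The jump from code 0 to code 1 alone comes out at roughly 8% of full scale, which is exactly the kind of visible band you get when you crank the gamma on an 8-bit buffer.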
 
The question remains: do current games on the R300 line of cards have it enabled? The Parhelia has it enabled, or at least you can force it on.
 
I believe the 9700Pro only supports the A2R10G10B10 back-buffer format in windowed mode, which is rather useless.
 
I checked, and found that currently the R300 only supports 10/10/10/2 in windowed mode. Furthermore, it does not support a 10/10/10/2 display format, and I don't see any dithering. However, it could still help if gamma correction is used.
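For anyone who wants to check their own card, a minimal sketch (assuming the DX9 SDK headers; error handling omitted) that asks the driver whether it accepts a 10-bit back buffer windowed and fullscreen:

#include <d3d9.h>
#include <stdio.h>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Fullscreen: display mode and back buffer both 10/10/10/2.
    HRESULT fs = d3d->CheckDeviceType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                      D3DFMT_A2R10G10B10, D3DFMT_A2R10G10B10,
                                      FALSE);
    // Windowed: desktop stays 8-bit X8R8G8B8, only the back buffer is 10-bit.
    HRESULT win = d3d->CheckDeviceType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                       D3DFMT_X8R8G8B8, D3DFMT_A2R10G10B10,
                                       TRUE);
    printf("fullscreen 10-bit back buffer: %s\n", SUCCEEDED(fs) ? "yes" : "no");
    printf("windowed 10-bit back buffer:   %s\n", SUCCEEDED(win) ? "yes" : "no");

    d3d->Release();
    return 0;
}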
 
eSa said:
The human visual system can indeed "see" something like 100,000 colors or so. The main point is that the distribution of these colors isn't uniform amongst the R, G and B components. If I remember correctly, humans are most sensitive to green and can see several thousand shades of it. Of course the dynamic range is also very important, both its magnitude and the non-linear (somewhat exponential?!) distribution of color values across the total range.
IIRC, human vision is actually quite sensitive to blue levels, but not as spatially sensitive to blue as to the other colours. (The theory is that blue receptors were a later development in (primate) vision.)

The level sensitivity is logarithmic, so we can detect finer changes when the brightness is lower, with the noticeable deltas growing as intensity increases.
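In rough Weber-Fechner terms (the constant k here is illustrative, not a precise human threshold):

\frac{\Delta I}{I} \approx k \quad \Rightarrow \quad I_n = I_0 (1 + k)^n

That is, just-noticeable steps form a geometric series, so a fixed-step linear encoding wastes codes in the highlights and starves the shadows, which is what a gamma (or logarithmic) encoding corrects.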
 
Interestingly enough, the fact that the human eye relies mainly on green and red for detailed images is being exploited in new displays by the following company:

http://www.clairvoyante.com

Quite an interesting concept; it will be fascinating to see how the displays look when they finally come out.
 
Matrox has a Photoshop plugin that allows you to display 10-bit-per-channel images. I would guess that most images won't show a difference, since, as others have said, the extra bits only help in certain situations. I am not aware of a plugin like this for the 9700.
 
Mariner said:
Interestingly enough, the fact that the human eye relies mainly on green and red for detailed images is being exploited in new displays by the following company:

http://www.clairvoyante.com

Quite an interesting concept; it will be fascinating to see how the displays look when they finally come out.
Not surprising, as digital camera manufacturers have been doing similar things in their sensors for a while.
 