10 bits per component on R9700 again

kika

Newcomer
Hello everyone,

The application I'm writing badly needs 10 bits per component. We're working with high dynamic range images (textures) and a high dynamic range display device, and our images actually contain a lot of precious information beyond 8 bits.
Although ATi widely announced a 10-bit RAMDAC and framebuffer, they're not supported in the DirectX and OpenGL drivers. ATi's developer relations say that MS prohibited this mode due to the lack of alpha blending (which we don't need in our app). Is there a way to circumvent this restriction? Any other card with similar performance (read: GeForce FX) that really supports 10 bits per component? A registry tweak? A driver hack? Whatever else?

Thanks in advance,
Cyril
 
So you want a display format using the A2R10G10B10 format? As far as I can see in the DX docs, there is nothing stopping an IHV from exposing this. The following back buffer formats are supported:

BackBuffer or Display Formats
These formats are the only valid formats for a back buffer or a display.

Code:
Format         Back buffer   Display
A2R10G10B10    x             x (full-screen mode only)
A8R8G8B8       x
X8R8G8B8       x             x
A1R5G5B5       x
X1R5G5B5       x             x
R5G6B5         x             x

As can be seen in this list, the format is supported for display, but in full-screen mode only.

So in 3D mode this is definitely possible; are you talking about a 2D mode, maybe?

K-
 
Kristof said:
So you want a display format using the A2R10G10B10 format?

Exactly

Kristof said:
So in 3D mode this is definitely possible; are you talking about a 2D mode, maybe?

Nope, I'm talking about fullscreen 3D. The problem is that in theory there's no difference between theory and practice, but in practice there is. :D
If you have an R9700, a C compiler, the DX9 SDK and 10 minutes, you can take any MS sample and call
pD3D->GetAdapterModeCount(0, D3DFMT_A2R10G10B10) and see what it returns. It returns exactly zero.
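
For reference, a minimal self-contained version of that check looks something like this (a sketch, not our actual code; adapter 0 is assumed to be the R9700):

Code:
// Enumerate A2R10G10B10 display modes on the default adapter (DX9).
#include <d3d9.h>
#include <stdio.h>

int main()
{
    IDirect3D9* pD3D = Direct3DCreate9(D3D_SDK_VERSION);
    if (!pD3D)
        return 1;

    UINT count = pD3D->GetAdapterModeCount(D3DADAPTER_DEFAULT,
                                           D3DFMT_A2R10G10B10);
    printf("A2R10G10B10 display modes: %u\n", count); // prints 0 here

    for (UINT i = 0; i < count; ++i)
    {
        D3DDISPLAYMODE mode;
        if (SUCCEEDED(pD3D->EnumAdapterModes(D3DADAPTER_DEFAULT,
                                             D3DFMT_A2R10G10B10,
                                             i, &mode)))
            printf("%ux%u @ %uHz\n", mode.Width, mode.Height,
                   mode.RefreshRate);
    }

    pD3D->Release();
    return 0;
}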
 
Current Radeon drivers don't support A2R10G10B10 as a back buffer. They do, however, support it as a render target, so you could 'hack' around the lack of back-buffer support using render-to-texture.
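
Roughly like this (an untested sketch; error handling trimmed, and it assumes you already have an IDirect3DDevice9* from a normal 8-bit swap chain):

Code:
// Create an A2R10G10B10 texture and make it the render target,
// so the scene is rendered at 10 bits instead of into the back buffer.
#include <d3d9.h>

IDirect3DTexture9* CreateRT10(IDirect3DDevice9* pDevice, UINT w, UINT h)
{
    IDirect3DTexture9* pTex = NULL;
    if (FAILED(pDevice->CreateTexture(w, h, 1, D3DUSAGE_RENDERTARGET,
                                      D3DFMT_A2R10G10B10,
                                      D3DPOOL_DEFAULT, &pTex, NULL)))
        return NULL;

    IDirect3DSurface9* pSurf = NULL;
    pTex->GetSurfaceLevel(0, &pSurf);
    pDevice->SetRenderTarget(0, pSurf); // draw the scene at 10 bpc
    pSurf->Release();
    return pTex;
}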
 
Hmm, but that means it's not MS putting up this limitation (unless the docs are wrong, which would not be impossible ;) ). I wonder if Matrox exposes this on Parhelia; then again, have they even released DX9 drivers?

K-
 
Kristof said:
... then again, have they [Matrox] even released DX9 drivers?

K-

No, they haven't :(
There isn't even one on their registered developer site.
Parhelia badly needs new drivers; the current ones are buggy as hell.
 
Colourless said:
Current Radeon drivers don't support A2R10G10B10 as a back buffer. They do, however, support it as a render target, so you could 'hack' around the lack of back-buffer support using render-to-texture.

But how do I display it? I need 10 bits end to end, from texture to display device.
 
More fun

Another nice feature. Even my 2-year-old daughter knows that the R9800 will support the coolest technology invented since the wheel: the Mighty F-Buffer. I've read the description of what it is and what it's for, and discovered that it might be of great help for our upcoming application (we have complex shaders and very complex shaders). And guess what? Right, it isn't supported, and nobody (even ATi) knows when it will be implemented in drivers. I was told 'somewhere around the OGL2 release' (I'll be a grandfather by then).
Nice business, eh? Announce the feature, take the credit, listen to the applause, and then say "sorry guys, the feature is there, but... alas... not supported".
Everybody says the R300 has a 10-bit RAMDAC. How do you know that? It might be 12 or 14 bits. Or 8, as usual. The only way to check is to reverse-engineer the chip.
 
kika said:
But how do I display it? I need 10 bits end to end, from texture to display device.

I tried it before, and the R300 does not support 10-bit display modes, so you won't see 10 bits on the display device (although you can have a 10-bit render target).
 
Re: More fun

kika said:
Everybody says the R300 has a 10-bit RAMDAC. How do you know that? It might be 12 or 14 bits. Or 8, as usual. The only way to check is to reverse-engineer the chip.
A 10-bit RAMDAC doesn't mean it can display a 10-bit-per-component framebuffer, because it's 10 bits after the gamma LUT, not necessarily before.
 
Re: More fun

Xmas said:
A 10-bit RAMDAC doesn't mean it can display a 10-bit-per-component framebuffer, because it's 10 bits after the gamma LUT, not necessarily before.

That's not good, but it's acceptable for our purposes; the LUTs are 8 bits anyway as seen through the API.
Right now we're doing a 10-bit-wide LUT on the CPU, using SSE and other neat features of modern CPUs. Given the bandwidth we require, it puts a 2.8 GHz Xeon on its knees, instead of being done in the RAMDAC absolutely for free, performance-wise.
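
The inner loop boils down to something like this (a scalar C++ sketch just to show the indexing; the real code uses SSE, the function name is made up, and the 10:10:10:2 layout is the D3D A2R10G10B10 one):

Code:
#include <cstdint>
#include <cstddef>

// Apply a 1024-entry per-channel LUT to packed A2R10G10B10 pixels.
// LUT entries are assumed to hold 10-bit values (0..1023).
void ApplyLut10(uint32_t* pixels, size_t count, const uint16_t lut[1024])
{
    for (size_t i = 0; i < count; ++i)
    {
        uint32_t p = pixels[i];
        uint32_t r = lut[(p >> 20) & 0x3FF]; // bits 29..20: red
        uint32_t g = lut[(p >> 10) & 0x3FF]; // bits 19..10: green
        uint32_t b = lut[ p        & 0x3FF]; // bits  9..0 : blue
        pixels[i] = (p & 0xC0000000u) | (r << 20) | (g << 10) | b;
    }
}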
 
Saem said:
I'm guessing you're going to be looking forward to Canterwood and the new 800MHz FSB P4s.

How do you know that?!? :devilish: I'm looking for an 875-based mobo for my development workstation, but they're not on the shelves yet (865-based boards have been there for about a week already, though; I wonder how, since they haven't even been announced yet).

In fact, even new CPUs won't help us. We have more useful things to do on the CPU than dumb table lookups. I think we'll go with 'RAMDAC emulation' via texture lookups on the card, but that isn't free either (3 extra lookups, plus the ps 2.0 limitations on dependent reads).
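
The idea, sketched (all names illustrative; the LUT output is 8 bits since the back buffer stays 8-bit, and a final ps 2.0 pass then does one dependent read per channel into this texture, which is where the 3 extra lookups come from):

Code:
#include <d3d9.h>

// Bake a 10-bit-in / 8-bit-out LUT into a 1024x1 texture.
IDirect3DTexture9* CreateLutTexture(IDirect3DDevice9* pDevice,
                                    const unsigned char lut[1024])
{
    IDirect3DTexture9* pLut = NULL;
    if (FAILED(pDevice->CreateTexture(1024, 1, 1, 0, D3DFMT_A8R8G8B8,
                                      D3DPOOL_MANAGED, &pLut, NULL)))
        return NULL;

    D3DLOCKED_RECT lr;
    pLut->LockRect(0, &lr, NULL, 0);
    DWORD* texel = (DWORD*)lr.pBits;
    for (int i = 0; i < 1024; ++i)
    {
        DWORD v = lut[i]; // same curve on R, G and B for simplicity
        texel[i] = 0xFF000000 | (v << 16) | (v << 8) | v;
    }
    pLut->UnlockRect(0);
    return pLut;
}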
 
Hey people!!! I need 16 bits per component as a render target in OpenGL, on Linux!!! I can't get that with my current cards. I saw Parhelia advertising 10 bits, but I read here about buggy drivers...

Now, will the new ATIs or NVs support FP framebuffers?
If I can get the resolution I need in floating point, that will work for me!!

I don't care about display (RAMDACs), since I always do an 8-bit LUT.

Right now the best I can get is 12 bits per component on an Onyx2 IR3.
 
am2020 said:
Hey people!!! I need 16 bits per component as a render target in OpenGL, on Linux!!! I can't get that with my current cards. I saw Parhelia advertising 10 bits, but I read here about buggy drivers...

Now, will the new ATIs or NVs support FP framebuffers?
If I can get the resolution I need in floating point, that will work for me!!

I don't care about display (RAMDACs), since I always do an 8-bit LUT.

Right now the best I can get is 12 bits per component on an Onyx2 IR3.
I don't know about current OpenGL support (and even less about Linux drivers), but the Radeon 9500s and above all support FP textures/render targets in D3D.
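
On the D3D side you can query it like this (a sketch; these are the DX9 half- and full-precision float RGBA formats):

Code:
#include <d3d9.h>
#include <stdio.h>

// Ask whether FP formats are usable as render-target textures
// against an ordinary X8R8G8B8 desktop mode.
void CheckFpRenderTargets(IDirect3D9* pD3D)
{
    const D3DFORMAT fmts[] = { D3DFMT_A16B16G16R16F,
                               D3DFMT_A32B32G32R32F };
    for (int i = 0; i < 2; ++i)
    {
        HRESULT hr = pD3D->CheckDeviceFormat(D3DADAPTER_DEFAULT,
                                             D3DDEVTYPE_HAL,
                                             D3DFMT_X8R8G8B8,
                                             D3DUSAGE_RENDERTARGET,
                                             D3DRTYPE_TEXTURE,
                                             fmts[i]);
        printf("format %d: %s\n", i,
               SUCCEEDED(hr) ? "supported" : "not supported");
    }
}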
 
A slightly related question: if I render to a floating-point (or other higher-precision) render target and copy it into an 8-bit back buffer, will it be dithered?

If not, maybe I should start working on a shader to do this... :)
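
In case it isn't, the core of such a shader would be ordered dithering before the quantization; here's the idea as a CPU-side C++ sketch (the standard 4x4 Bayer matrix; the function name is made up):

Code:
#include <cstdint>

// Quantize a 0..1 float channel to 8 bits with a 4x4 ordered dither.
static const float kBayer4[4][4] = {
    { 0/16.f,  8/16.f,  2/16.f, 10/16.f},
    {12/16.f,  4/16.f, 14/16.f,  6/16.f},
    { 3/16.f, 11/16.f,  1/16.f,  9/16.f},
    {15/16.f,  7/16.f, 13/16.f,  5/16.f},
};

uint8_t QuantizeDithered(float v, int x, int y)
{
    // Add a sub-LSB offset from the Bayer matrix before truncating,
    // so the rounding error is spread spatially instead of banding.
    float scaled = v * 255.0f + kBayer4[y & 3][x & 3];
    if (scaled < 0.0f)   scaled = 0.0f;
    if (scaled > 255.0f) scaled = 255.0f;
    return (uint8_t)scaled;
}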
 
Currently, the 9700/9800 do not support blendable 10/10/10/2 formats, and so the format would fail WHQL if it were made available as a displayable render target.

It is available as a texture render target or as a windowed-mode render target.
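
You can actually query exactly that capability (a sketch; D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING is the DX9 way to ask whether blending into a given format works):

Code:
#include <d3d9.h>

// Ask whether alpha blending into A2R10G10B10 render targets works.
bool CanBlend10(IDirect3D9* pD3D)
{
    return SUCCEEDED(pD3D->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_TEXTURE, D3DFMT_A2R10G10B10));
}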
 
Colourless said:
Currently

Does that mean there is hope that sometime in the future blendable 10/10/10/2 will be supported?

ATi only releases WHQL-certified drivers, so even if they made a driver set that could do it, they would never release it.
 
Correct me if I'm wrong, but didn't Nvidia claim the NV30 does 10-bit color through the entire pipeline 24/7?
 