128 bit color

sancheuz

Newcomer
I think one of the best features I have seen in the GFFX and 9700 is the ability to do 128-bit color. Screenshots at 128-bit are amazing. I have a question: can you apply 128-bit color to any previous and current game, or do we have to wait for forthcoming games that support it? And if we have to wait, how long will it be until we see games with 128-bit color?
 
Until value cards that can support 128-bit color dominate the mainstream, it will be left out of most titles. Wild guess: three years, maybe...maybe....
 
From one layman to another...

I think it is more accurately referred to as: 128-bit...floating point...color ...processing. The actual images you've seen are still just 32-bit color.

There is more than one reason they can look better:

Game utilizes extra capabilities: if a game can layer many effects in as few passes as possible, and has enough effect layers that their image quality suffers on older cards, the newer cards would look better. The problem with this is that I don't know of any old games that operate like this, so only newer games would see this benefit.

For multipass: the "floating point" part means that if the cards operated like old cards (render one pass to a buffer, then layer on more effects in another pass, and maybe another pass after that) and they used floating-point buffers (which the game would have to specify to use), then they would look better. In this case, either the game is old and isn't fully utilizing the card's capabilities when multipassing (an old game quickly hacked to support new capabilities would possibly do this), or the game really needs to do this even with newer capabilities, and I don't think that will be common, if it happens at all, due to slow performance.
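The multipass point can be sketched in plain Python (a toy illustration, not any real graphics API): with an 8-bit intermediate buffer, each pass rounds the running result to 1/255 steps, so faint effect layers can vanish entirely, while a floating-point buffer accumulates exactly and rounds only once for final output.

```python
# Toy model of multipass blending: 8-bit intermediate buffer vs
# floating-point buffer. The pass values and counts are made up.

def quantize8(x):
    """Round a 0..1 value to the nearest 8-bit (1/255) step."""
    return round(x * 255) / 255

passes = [0.001] * 10  # ten faint effect layers, blended additively

acc8 = 0.0   # 8-bit intermediate buffer: rounds after every pass
for p in passes:
    acc8 = quantize8(acc8 + p)

accf = 0.0   # floating-point buffer: rounds once at the end
for p in passes:
    accf += p
final = quantize8(accf)

print(acc8)   # 0.0 -- every faint contribution was rounded away
print(final)  # > 0 -- the layers survive in the float buffer
```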

New ways of doing things: the ability for higher precision, and using floating point, allows things like High Dynamic Range to be rendered more easily, as well as I don't know how many Other Cool Things That Might Become Common. All of these will also require new(er) game coding approaches, and hacking them into old games seems like a low-return investment in most cases (maybe Valve, for example, would have motivation to do something like that, but I don't think their Half-Life engine is capable enough to benefit).
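The High Dynamic Range point can be illustrated with a toy sketch (assuming nothing about any real engine; the intensity values are made up): a fixed-point buffer clamps everything to [0, 1], so a bright light and a white wall become indistinguishable, while a floating-point buffer keeps the real intensities for a tone-mapping step to compress afterward.

```python
# Toy HDR illustration: clamped fixed-point storage vs floating-point
# storage followed by tone mapping.

def fixed_point_store(x):
    """8-bit-style storage: everything is clamped to [0, 1]."""
    return min(max(x, 0.0), 1.0)

def tone_map(x):
    """Simple Reinhard-style operator: x / (1 + x)."""
    return x / (1.0 + x)

sun, wall = 5.0, 1.0  # hypothetical scene intensities

print(fixed_point_store(sun), fixed_point_store(wall))  # 1.0 1.0 -- distinction lost
print(tone_map(sun), tone_map(wall))                    # still tells them apart
```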

The good thing about the "game has to be coded to support this" situations above is that a game coded to support shaders at a DX 8 level should be able to be enhanced for the advantages offered by DX 9 level shaders easily in many cases, and many upcoming (as in near future) games seem to be doing exactly that.

Something final to note: for color processing the 9700 family has 96-bit color, not 128-bit. For real time rendering (i.e., games), the difference between this and 128-bit color would end up being invisible in final output (32-bit color). Hmm...in fact, I'm not sure how much processing would be required for there to be any difference whatsoever between the two in the 32-bit color, or whether that amount of processing is reasonable for games.
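That last point, that the 96-bit vs 128-bit difference washes out in the final output, can be sketched by crudely simulating the mantissa widths involved (FP24 carries 16 mantissa bits, FP32 carries 23; this is a rough model, not the exact hardware formats):

```python
import math

# Rough simulation: quantize the same shader value through an
# FP24-like and an FP32-like pipeline, then down to an 8-bit channel.

def round_mantissa(x, bits):
    """Crudely model a float with the given number of mantissa bits."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)            # x = m * 2**e, with 0.5 <= |m| < 1
    scale = 2 ** bits
    return math.ldexp(round(m * scale) / scale, e)

def to_8bit(x):
    """Clamp to [0, 1] and quantize to an 8-bit channel value."""
    return round(min(max(x, 0.0), 1.0) * 255)

x = 0.123456789  # some arbitrary value produced by shader arithmetic
print(to_8bit(round_mantissa(x, 16)))  # FP24-ish ("96-bit") pipeline
print(to_8bit(round_mantissa(x, 23)))  # FP32 ("128-bit") pipeline -- same value
```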

---and for my friends across the Atlantic:
American English Error Accumulator: uuuuuuuuuuuuuu
 
demalion said:
From one layman to another...

I think it is more accurately referred to as: 128-bit...floating point...color ...processing. The actual images you've seen are still just 32-bit color.

If you want to be really picky, then you should say that you only see 24 bits (8:8:8 RGB) of colour information. The 8 bits of the alpha channel don't contribute to the final image.

The R300 and Parhelia also support 30-bit outputs (10:10:10 R:G:B). The GeForceFX DOESN'T, as far as I know.
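The 24-bit vs 30-bit output difference is just quantization step size per channel. A toy count of distinct steps across a dark gradient (where banding is most visible; the gradient range is arbitrary):

```python
# Toy illustration of 8 vs 10 bits per output channel: the 10-bit
# channel resolves roughly four times as many distinct steps across
# the same dark gradient.

def levels(hi, bits):
    """Count distinct quantized values over a gradient from 0 to hi."""
    scale = 2 ** bits - 1
    return len({round(x / 999 * hi * scale) for x in range(1000)})

print(levels(0.1, 8))    # 8-bit channel (24-bit RGB output)
print(levels(0.1, 10))   # 10-bit channel (30-bit RGB output), ~4x more steps
```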
 
fresh said:
demalion said:
From one layman to another...

I think it is more accurately referred to as: 128-bit...floating point...color ...processing. The actual images you've seen are still just 32-bit color.

If you want to be really picky, then you should say that you only see 24 bits (8:8:8 RGB) of colour information. The 8 bits of the alpha channel don't contribute to the final image.

The R300 and Parhelia also support 30-bit outputs (10:10:10 R:G:B). The GeForceFX DOESN'T, as far as I know.

I actually typed 24-bit originally and considered discussing 30-bit color in contrast.

But the topic was 128-bit and redefining that to 96-bit would have been very confusing with my discussion of the 9700 96-bit mode.

And I think 30-bit color is just a completely different topic altogether. I've even understood one comment about the GF FX as indicating it doesn't support outputting 30-bit color, for example, and that would be an entire thread topic right there.
EDIT: Heh, and you sort of referred to that, didn't you? ;) I think it was a rather strong indication, at least in my understanding of the wording, and it seemed to be from the horse's mouth. Perhaps we can discuss it when we have a demonstration of it for the R300 (is it a DX 9 feature..?).
 
demalion said:
Something final to note: for color processing the 9700 family has 96-bit color, not 128-bit. For real time rendering (i.e., games), the difference between this and 128-bit color would end up being invisible in final output (32-bit color). Hmm...in fact, I'm not sure how much processing would be required for there to be any difference whatsoever between the two in the 32-bit color, or whether that amount of processing is reasonable for games.
It shouldn't ever be visible for color processing, but differences may become apparent for non-color rendering (normals and whatnot). It may be a while before any differences become apparent (if ever).

I think the primary benefit of going full 32-bit is that you can do 16-bit at twice the speed without many more transistors.
 
"Not so many more transistors"... I seriously doubt that. Floating-point processing is nowhere near as straightforward to split into smaller units as integer processing is.
 
Chalnoth said:
I think the primary benefit of going full 32-bit is that you can do 16-bit at twice the speed without many more transistors.
I suppose when your 32-bit computations are slow, that becomes important.
 