Does the X360 GPU emulate 30-bit color with a 24-bit output + dithering?

Quaz51


When I display this test pattern (http://upsilandre.free.fr/images/MireDarkColor.jpg; the X360 only reads JPEG) on my X360 over HDMI at 1080p with Full Range RGB, I find a strange dithering pattern on every tint; no tint is clean and uniform. When I display the same test pattern on my PS3 over HDMI at 1080p with Full Range RGB, with the same HDMI cable and the same TV input, I see perfectly clean and uniform tints!

And in this lossless HDMI screenshot of the Dashboard (thanks to Richard Leadbetter of Eurogamer): http://img242.imageshack.us/my.php?image=videooutputat0.png
I see the same dithering:

[Attached image: DashZoom.jpg]


I checked in a game using The Simpsons, because it uses uniform colors on every surface. On my TV screen, and particularly in these lossless HDMI screenshots of the game (thanks again, Richard), I find the same dithering on every tint!
http://upsilandre.free.fr/images/Simpsons1.bmp
http://upsilandre.free.fr/images/Simpsons2.bmp
http://upsilandre.free.fr/images/Simpsons3.bmp

[Attached images: SimpsZoom1.jpg, SimpsZoom3.jpg, SimpsZoom4.jpg, SimpsZoom6.jpg, SimpsZoom7.jpg]

http://upsilandre.free.fr/images/SimpsZoom2.jpg
http://upsilandre.free.fr/images/SimpsZoom5.jpg


I have read that recent ATI GPUs internally support a 30-bit color mode and emulate it on a 24-bit output/screen with dithering. I think Xenos supports this, but I don't know how games can take advantage of it (I think FP10 can't use this extended color range?).


Edit: one last piece of information: the dithering uses a 4x4 pattern with 5 different 24-bit colors per pattern.
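The real Xenos dither pattern isn't public, so here is only a minimal sketch of the general technique: emulating 10 bits per component on an 8-bit output with a 4x4 ordered-dither (Bayer) matrix. The matrix values and function name are my own illustration, not what Xenos actually does. A per-component dither like this mixes two adjacent 8-bit levels per component, which across R, G and B together can yield several distinct 24-bit colors inside one 4x4 block.

```c
#include <stdint.h>

/* Illustrative 4x4 Bayer threshold matrix (values 0..15). */
static const uint16_t bayer4x4[4][4] = {
    { 0,  8,  2, 10},
    {12,  4, 14,  6},
    { 3, 11,  1,  9},
    {15,  7, 13,  5},
};

/* value10: a 10-bit component (0..1023). Returns the dithered 8-bit
   value for the pixel at (x, y). The two bits lost by truncation are
   spread over the 4x4 block, so the block's average matches the
   original 10-bit value. */
uint8_t dither_10_to_8(uint16_t value10, int x, int y)
{
    uint8_t  base = value10 >> 2;   /* top 8 bits            */
    uint16_t frac = value10 & 0x3;  /* the 2 bits lost, 0..3 */
    /* Scale frac to 0..12 and compare with the threshold 0..15:
       frac = 1 nudges 4 of the 16 pixels up one level,
       frac = 2 nudges 8, frac = 3 nudges 12. */
    if (frac * 4 > bayer4x4[y & 3][x & 3] && base < 255)
        base++;
    return base;
}
```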
 
Yes, and the PS3 can do true 48-bit color output :cool: but it's not used; games are in 24-bit ;)

The question is why the X360 doesn't just output clean 24-bit color without dithering. Can games actually use this 30-bit color range?
 
The question is why the X360 doesn't just output clean 24-bit color without dithering. Can games actually use this 30-bit color range?

The X360 does seem to have 30-bit support, since the front buffer accepts D3DFMT_LE_X2R10G10B10. I don't know if that gets output through the HDMI, though.
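For illustration, here is a sketch of how a 10-10-10-2 front-buffer word could be packed. I'm assuming the common X2R10G10B10 layout (2 unused bits on top, then R, G, B at 10 bits each); I haven't verified the exact bit order of the little-endian X360 variant.

```c
#include <stdint.h>

/* Pack three 10-bit components into a 32-bit 10-10-10-2 word,
   assuming the usual X2R10G10B10 layout:
   bits 31-30 unused, 29-20 R, 19-10 G, 9-0 B. */
uint32_t pack_x2r10g10b10(uint16_t r, uint16_t g, uint16_t b)
{
    return ((uint32_t)(r & 0x3FF) << 20) |
           ((uint32_t)(g & 0x3FF) << 10) |
            (uint32_t)(b & 0x3FF);
}
```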
 
One last piece of information... this dithering uses a 4x4 pattern with 5 different 24-bit colors per pattern.
 
The display pipeline (where the scaling is done) is 10-bit per component as well. After that, though, it's passed off to ANA/HANA, AFAIK.
 
Quaz is referring to HDMI 1.3a ;)

(which would actually be 16 bits per component, but I don't know about the actual output from RSX to the front buffer)
 
One last piece of information... this dithering uses a 4x4 pattern with 5 different 24-bit colors per pattern.

4x4 (with multiple patterns) with 2 colors would at most give you 4 additional fake bits.
How do you get 6 bits from a single pattern, even with 5 colors (which can only help with locality for fewer than 4 pixels, not color depth)?
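To make the arithmetic concrete (my own working, not from the posts above): a 4x4 block mixing two adjacent 8-bit levels can light anywhere from 0 to 16 of its pixels, giving 17 average levels, roughly 4 extra fake bits per component. But 30-bit versus 24-bit is only 2 extra bits on each of R, G and B (4 sub-levels per component), so the 6 extra bits are spread across the three components rather than coming from a single component's pattern.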
 
The display pipeline (where the scaling is done) is 10-bit per component as well. After that, though, it's passed off to ANA/HANA, AFAIK.

Then this 30-bit color range can't be used for rendering?
In that case it's less interesting... but I think it's used for gamma correction, or other display processing? It's good regardless.
 
I saw an interesting thing in the 1080p screenshots: the dithering is added after the upscale, so it has 1080p precision when you configure your X360 for 1080p output.
That's good for Full HD TV owners who play 720p games at 1080p. Dithering at 1080p precision is better than dithering at 720p precision; dithering quality depends a lot on its granularity. 720p games get better dithering in 1080p mode. 1080p TV owners win! :)

Are gamma correction and other display processing also applied after the upscale, at 1080p precision?
 
Two questions:
  1. Does the dithering show up when it's outputting 720p (rendered 720p) or 1080p (rendered 1080p)?
  2. Does the dithering show up for a game that isn't rendered in 720p (output either 720p or 1080p)?
 
The dithering is present in all cases (every output resolution and every native resolution; it's built into the display pipeline).

And the dithering quality depends only on the output resolution; the best output resolution is 1080p, for the finest dithering (whether the game is 720p, sub-720p, 1080p, or anything else).
 
I thought that HDMI Full Range RGB was 24-bit, and that the difference was that it uses the full 0-255 8-bit range for each component, while some TVs commonly use less (16-235?).
 
That's just the difference between video-levels RGB and PC-levels RGB, not the number of bits per pixel; in both cases each component should be 8 bits.
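To spell that out, a minimal sketch of expanding limited-range video levels to full range, assuming the standard 16-235 span (the function name and rounding choice are mine):

```c
#include <stdint.h>

/* Expand limited-range "video levels" RGB (16..235) to full-range
   PC levels (0..255). Both are 8 bits per component; only the
   usable span differs. */
uint8_t limited_to_full(uint8_t v)
{
    int x = ((int)v - 16) * 255;
    if (x < 0)
        x = 0;                        /* clamp values below black  */
    int full = (x + 219 / 2) / 219;   /* rescale 219 steps to 255  */
    return full > 255 ? 255 : (uint8_t)full;
}
```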
 