6th generation consoles and dithering?

eddman

Newcomer
I don't see a newbie/beginner sub-section for the console section, so excuse me if this is not in the right place.

Although I've been following PC/console hardware for many years, I'm still an amateur when it comes to graphics processing.

It's well known that the large majority (all?) of PS1 games use dithering heavily. There are a few informative YouTube videos on this subject.

It also seems to be used on 6th gen consoles to some extent. The explanation I've read around the internet is that although their GPUs are capable of processing at a higher color depth, the frame buffer is not large enough to hold frames at those depths, so the output is reduced to 16-bit + dithering (a rough back-of-the-envelope sketch of that memory argument follows the questions below).

1. Is that explanation correct?
2. Does that apply to all these consoles? If not, what's the reason for each one?
3. Does the original Xbox also use dithering? There are posts claiming its output is 24-bit.
4. Is it widespread or used in just a few titles?
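
For a rough sense of the memory argument, here is a back-of-the-envelope sketch (Python; the 640x480 resolution, double buffering, Z-buffer sizes, and the PS2 eDRAM figure are ballpark assumptions for illustration, not exact per-game values):

```python
# Back-of-the-envelope: memory for a double-buffered 640x480 frame buffer
# plus a Z-buffer at different color depths. All figures are illustrative
# assumptions, not measurements from any specific game.
WIDTH, HEIGHT = 640, 480
PIXELS = WIDTH * HEIGHT

def framebuffer_mib(color_bytes_per_pixel, z_bytes_per_pixel, color_buffers=2):
    """Total MiB for front + back color buffers plus a Z-buffer."""
    total_bytes = PIXELS * (color_buffers * color_bytes_per_pixel + z_bytes_per_pixel)
    return total_bytes / 2**20

print("16-bit color + 16-bit Z:", round(framebuffer_mib(2, 2), 2), "MiB")  # ~1.76
print("32-bit color + 32-bit Z:", round(framebuffer_mib(4, 4), 2), "MiB")  # ~3.52

# If the fast on-chip video memory is only a few MiB and also has to hold
# textures (the PS2's GS has roughly 4 MiB of embedded DRAM, for example),
# the difference between ~1.8 MiB and ~3.5 MiB can decide whether everything
# fits, which is one reason a 16-bit target + dithering was attractive.
```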

I became interested in this subject mainly because of the HDMI mods of recent years. The dithering pattern on old consoles is not supposed to be visible over a composite connection, especially on a CRT display, but it becomes very apparent on modern displays with HDMI.
 
I'm not sure about the final video output, but remember that TVs can also do dithering of their own, using their internal processing, to achieve a better perceived color range.
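
As a minimal sketch of what that TV-side processing amounts to (temporal dithering / frame rate control; purely illustrative numbers, not any specific TV's algorithm):

```python
# Temporal dithering / frame rate control (FRC): a panel that can only show
# integer levels approximates a fractional target level by alternating
# between the two nearest displayable levels over successive frames.
def frc_frames(target: float, num_frames: int) -> list[int]:
    lo = int(target)                # nearest displayable level below
    frac = target - lo              # how often the level above is needed
    frames, error = [], 0.0
    for _ in range(num_frames):
        error += frac
        if error >= 0.5:            # emit the higher level often enough
            frames.append(lo + 1)   # that the average converges on target
            error -= 1.0
        else:
            frames.append(lo)
    return frames

frames = frc_frames(target=100.25, num_frames=8)
print(frames, "average:", sum(frames) / len(frames))   # averages to 100.25
```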

It's also worth noting that when people talk about seeing dithering in video games on modern machines, they often mean dithering done in the shaders themselves (on the materials of objects in the virtual scene), which is the game's "fault" rather than the hardware's. That kind of dithering is still widely used (commonly for hair) and will keep being used in future games; it's still the only way to get a pleasant kind of transparency for some materials in Unreal Engine 5, and transparency remains a very challenging subject in rendering. It has nothing to do with the hardware in the console and serves a different purpose (faking the transparency of objects rather than extending the color depth of the final picture), but some people may get confused because the visual artifact looks the same.
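
To make the distinction concrete, here is a minimal sketch of that shader-side trick, often called screen-door or dithered transparency (plain Python standing in for a pixel shader; the 4x4 Bayer matrix is the standard one, the rest is illustrative):

```python
# Screen-door (dithered) transparency: instead of blending, each pixel of a
# partially transparent surface is either fully kept or fully discarded,
# based on comparing its alpha against a repeating threshold pattern.
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def keep_fragment(x: int, y: int, alpha: float) -> bool:
    """Return True if the fragment at screen position (x, y) should be drawn."""
    threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16.0
    return alpha > threshold

# At 50% alpha, roughly half the pixels in any 4x4 block survive; viewed from
# a distance (or after TAA/blur) the surface reads as half transparent.
alpha = 0.5
kept = sum(keep_fragment(x, y, alpha) for y in range(4) for x in range(4))
print(f"{kept}/16 pixels kept at alpha={alpha}")
```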
 
Perhaps I used the wrong terms; I suppose I mean render output and not video output. I'm not talking about dithering for game assets, but for the entire screen.

For example, PS1 games have a dithering pattern that is applied to the entire image. The explanation I've gotten is that the PS1 hardware renders games at 16-bit color and then applies a dithering pattern to essentially emulate 24-bit (32-bit?) color rendering and get rid of color banding. The pattern was designed to be effective with a composite output: on a CRT over composite, the pattern is resolved, the color banding goes away, and it becomes (mostly) unnoticeable.
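
For what it's worth, here is a minimal sketch of that whole-screen kind of dithering: quantizing a higher-precision color channel down to 5 bits (as in a 15/16-bit frame buffer) with an ordered-dither offset added before truncation, so banding breaks up into a fine pattern. The matrix and values are illustrative, not the PS1's exact ones.

```python
# Ordered dithering from 8 bits per channel down to 5 bits per channel
# (RGB555-style frame buffer). A small position-dependent offset is added
# before truncation, so smooth gradients break up into a fine pattern
# instead of visible bands.
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_channel_8_to_5(value: int, x: int, y: int) -> int:
    """Quantize one 8-bit channel value to 5 bits with an ordered-dither offset."""
    step = 8                                                  # 256 / 32 levels
    offset = (BAYER_4X4[y % 4][x % 4] / 16.0 - 0.5) * step    # roughly -4..+3.5
    dithered = min(255, max(0, int(value + offset)))
    return dithered >> 3                                      # keep the top 5 bits

# A smooth 8-bit gradient (100..107): without dithering, everything in 96..103
# maps to the same 5-bit level and there is a hard band edge at 104; with the
# offset, neighbouring pixels alternate between the two nearest levels, which
# averages out when the image is blurred (e.g. by composite video on a CRT).
for y in (0, 1):
    print([dither_channel_8_to_5(100 + x // 4, x, y) for x in range(32)])
```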

When running PS1 games through an emulator, this pattern becomes fully visible because it's no longer being resolved. Most emulators have the option to render at 32-bit and disable the pattern. By doing so it matches a 16-bit+dithering output.

I suppose my question is, do 6th gen consoles also render certain games at 16-bit color, and if so, which ones?
The original Xbox seems to be rendering at 24-bit by default. Are there any OG Xbox games rendered at 16-bit+dithering?
 
I don't know why there is no edit option.

"By doing so it matches a 16-bit+dithering output seen on a composite+CRT setup; no color banding and no pattern."
 
I don't know about Xbox, but I do believe many GameCube (and maybe Wii) games have effectively 16-bit color.
 

The 6th gen machines were flexible enough in their pipelines that such things depend far more on each game engine's rendering architecture than on the hardware. The machines could render at full RGB depth; they could even render above that internally for HDR (the output was still standard color range), and they could also render below it for performance. It came down to the developers' choices and priorities at that point.
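
As a rough illustration of that trade-off (generic render-target formats and figures, not any particular console's exact capabilities):

```python
# Rough sketch of the cost behind choosing a render target depth.
# Per-pixel cost of some common color buffer formats (illustrative only).
FORMATS = {
    "RGB565 (16-bit, usually needs dithering)": 2,
    "RGBA8888 (32-bit, full RGB depth)":        4,
    "RGBA16F (64-bit, HDR intermediate)":       8,
}

WIDTH, HEIGHT, FPS = 640, 480, 60
pixels_per_second = WIDTH * HEIGHT * FPS

for name, bytes_per_pixel in FORMATS.items():
    buffer_mib = WIDTH * HEIGHT * bytes_per_pixel / 2**20
    # Very rough write bandwidth for the color buffer alone, ignoring
    # overdraw, Z traffic, and texture reads.
    bandwidth_mib_s = pixels_per_second * bytes_per_pixel / 2**20
    print(f"{name}: {buffer_mib:.2f} MiB buffer, "
          f"~{bandwidth_mib_s:.0f} MiB/s of color writes")
```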
 