Why does the GameCube have color dithering?

How in the world can "lower quality cables" and interlaced output increase a "dithering" effect??? If anything, it should work in the opposite direction whereby better cables are more revealing of dithering in the program material.
 
randycat99 said:
How in the world can "lower quality cables" and interlaced output increase a "dithering" effect??? If anything, it should work in the opposite direction whereby better cables are more revealing of dithering in the program material.

Less color bandwidth? I don't know, but it happens. On composite cables, underwater sections in Metroid Prime 2 and the Bespin clouds in Rogue Leader are a banded mess, but pscan component cables fix that.
 
The fact is, of the things that poor video cables can do, creating a dithering effect isn't one of them. Less color bandwidth would help in that situation, since it would blend color transitions together rather than emphasize them. Color bandwidth is unlikely to be at issue unless your luminance bandwidth is really hurting, anyway. If that were the case, dithering would be the least of your problems.

I don't doubt that you are seeing what you are seeing, but the explanation you give simply doesn't make sense. What's more likely is that your display simply has a very poor implementation for composite inputs (really no big surprise, since the makers are betting that no one would use that hookup method in the first place).
 
randycat99 said:
The fact is, of the things that poor video cables can do, creating a dithering effect isn't one of them. Less color bandwidth would help in that situation, since it would blend color transitions together rather than emphasize them. Color bandwidth is unlikely to be at issue unless your luminance bandwidth is really hurting, anyway. If that were the case, dithering would be the least of your problems.

I don't doubt that you are seeing what you are seeing, but the explanation you give simply doesn't make sense. What's more likely is that your display simply has a very poor implementation for composite inputs (really no big surprise, since the makers are betting that no one would use that hookup method in the first place).

But even with component cables, the effect is still noticeable when interlaced, and it doesn't fully disappear until progressive scan is enabled.
 
Whatever it is, a simple analog cable isn't going to generate digital-based artifacts. The answer simply lies with a different explanation than the one you have chosen.
 
There is definite banding in a few sections of Metroid Prime. "Murky" areas like the sunken frigate or the hologram room in the Phendrana labs show it quite clearly, presumably due to the effect used. You can also easily see the color depth drop when the scan visor effect is invoked.
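For anyone curious what that banding actually is, here's a rough sketch of my own (nothing from the real hardware) of why dropping to 6 bits per channel bands a smooth gradient, and how an ordered dither breaks the bands up into the checkerboard-like noise people are describing:

[code]
/* My own illustration, not Flipper's actual dither hardware:
 * quantizing an 8-bit channel down to 6 bits creates visible bands
 * in smooth gradients; adding a small bias from a 4x4 Bayer matrix
 * before quantizing trades the bands for fine ordered noise. */
#include <stdint.h>

static const int bayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

/* Quantize an 8-bit channel value to 6 bits, with optional dither. */
uint8_t quant6(uint8_t v, int x, int y, int dither)
{
    int bias = dither ? (bayer4[y & 3][x & 3] - 8) / 4 : 0; /* ~ +/-2 */
    int d = v + bias;
    if (d < 0)   d = 0;
    if (d > 255) d = 255;
    return (uint8_t)(d >> 2); /* keep the top 6 bits */
}
[/code]

Without the dither, a 0-255 gradient collapses into 64 flat steps, which is exactly the stair-stepping you see in those murky low-contrast areas.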
 
Just got my GameCube VGA cable, and compared to my component cables the image is virtually identical... except that the dithering that previously occurred only on transparencies (such as smoke) now affects the entire image at times. (Perhaps it always does, but is only visible with brighter colors?) I'd guess it's because the component cables aren't shielded to carry a VGA signal and there's just some integrity lost during the transmission. I believe a similar effect used to occur when I tried to transmit high resolutions (like 1600x1200 at 100 Hz) over VGA.
 
Various old quotes regarding this very issue; sorry, I am unable to attribute the statements to their original posters, as I simply cut-and-pasted these to my GC tech specifications folder:

The Flipper takes a performance hit when implementing trilinear filtering on 24-bit texels. Using uncompressed textures is something you try to avoid on a console, where you're working with limited memory. Outside a few special cases, I can't think of many uses that really need 32-bit textures, and out of those special cases, not all would even need trilinear filtering to begin with. So the Flipper would be able to handle 32-bit color or more internally; however, it is penalized when using anything more than 24 bits because of the available framebuffer bandwidth and the S3TC. I am guessing that the RAMDACs (or whatever hardware outputs the RGBA signals to the monitor) also only output 6 or 8 bits per component, which additionally limits the ability for 32 bits.
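A quick back-of-the-envelope on why compression matters so much there (my own numbers, assuming DXT1-style S3TC at 4 bits/texel, i.e. 8 bytes per 4x4 block):

[code]
/* Rough mip-chain footprint comparison: S3TC vs. uncompressed.
 * Simplified - block compression really rounds tiny mips up to a
 * whole block, so the S3TC figure slightly undercounts. */
#include <stdio.h>

static unsigned long mip_chain_bytes(unsigned w, unsigned h,
                                     unsigned bits_per_texel)
{
    unsigned long total = 0;
    for (;;) {
        total += (unsigned long)w * h * bits_per_texel / 8;
        if (w == 1 && h == 1) break;
        if (w > 1) w /= 2;
        if (h > 1) h /= 2;
    }
    return total;
}

int main(void)
{
    printf("256x256 S3TC (4 bpp): %lu bytes\n", mip_chain_bytes(256, 256, 4));
    printf("256x256 24-bit:       %lu bytes\n", mip_chain_bytes(256, 256, 24));
    printf("256x256 32-bit:       %lu bytes\n", mip_chain_bytes(256, 256, 32));
    return 0;
}
[/code]

That's roughly a 6:1 to 8:1 difference per texture, which is huge when your texture budget is a few megabytes.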

GameCube has no support for 32-bit color. The maximum bit depth of the framebuffer is 24-bit, which can be used as 8:8:8, or 6:6:6:6 if you need destination alpha.
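(My aside, not from the quotes: those two layouts pack into 24 bits roughly like this; the field order is my guess for illustration, not documented hardware behavior.)

[code]
#include <stdint.h>

/* 8:8:8 - full 8 bits per color channel, no destination alpha. */
uint32_t pack_rgb8(uint8_t r, uint8_t g, uint8_t b)
{
    return ((uint32_t)r << 16) | ((uint32_t)g << 8) | b;
}

/* 6:6:6:6 - trade 2 bits of each channel for a 6-bit alpha,
 * which is where the banding/dithering pressure comes from. */
uint32_t pack_rgba6(uint8_t r, uint8_t g, uint8_t b, uint8_t a)
{
    return ((uint32_t)(r >> 2) << 18) | ((uint32_t)(g >> 2) << 12) |
           ((uint32_t)(b >> 2) << 6)  |  (uint32_t)(a >> 2);
}
[/code]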
A flicker filter is a weighted average, normally applied in the DAC.

In its simplest form it's a 50/50 filter applied to the even and odd fields (although 3-line filters are more common). What this does is remove high-contrast color transitions between the fields, reducing interlace flicker.

Dreamcast used a 3-line filter, as does GC; Xbox allows the developer to specify the extent of the filter: 1, 3, or 5 lines. You can effectively enable one on a PS2 as well, but for some bizarre reason it isn't the default, so a lot of PS2 titles shipped without it.

I would imagine most Xbox games run in 24/32-bit color; 16-bit is pretty ugly and the additional cost is relatively small.
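To make the flicker-filter description above concrete, here's a minimal sketch of a 3-line version using 1/4-1/2-1/4 weights (a common choice; I don't know the exact weights any of these consoles actually use):

[code]
#include <stdint.h>

/* Each output line is a weighted average of itself and its two
 * vertical neighbours. This softens high-contrast transitions
 * between the even and odd fields, reducing interlace flicker,
 * at the cost of vertical sharpness. Edge lines are clamped. */
void flicker_filter_3line(const uint8_t *src, uint8_t *dst,
                          int width, int height)
{
    for (int y = 0; y < height; y++) {
        const uint8_t *above = src + (y > 0 ? y - 1 : 0) * width;
        const uint8_t *cur   = src + y * width;
        const uint8_t *below = src + (y < height - 1 ? y + 1 : y) * width;
        for (int x = 0; x < width; x++)
            dst[y * width + x] =
                (uint8_t)((above[x] + 2 * cur[x] + below[x] + 2) / 4);
    }
}
[/code]

That vertical blur is also why games with the filter left on look soft: fine single-line detail gets averaged away.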

Li Mu Bai said:
With S-Video and component connections, the comb filter of the TV is completely bypassed. It's only needed to separate chrominance and luminance in inferior connections like composite or straight RF cable. Both connections separate chroma & luma (which is what the comb filter essentially does, so it's bypassed), but component goes a step further and splits the color elements as well.

Hope this helps somewhat.
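For reference, the line-by-line trick a simple 1H (two-line) comb filter uses looks something like this (heavily simplified sketch of mine; real comb filters are adaptive):

[code]
/* In NTSC composite video, the color subcarrier flips phase from
 * one scanline to the next. So summing adjacent lines cancels the
 * chroma (leaving luma), and differencing cancels the luma
 * (leaving the modulated chroma). S-Video and component skip this
 * entirely because Y and C arrive already separated. */
void comb_1h(const float *prev_line, const float *cur_line,
             float *luma, float *chroma, int width)
{
    for (int x = 0; x < width; x++) {
        luma[x]   = 0.5f * (cur_line[x] + prev_line[x]);
        chroma[x] = 0.5f * (cur_line[x] - prev_line[x]);
    }
}
[/code]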
 
[q]Dreamcast used a 3-line filter, as does GC; Xbox allows the developer to specify the extent of the filter: 1, 3, or 5 lines. You can effectively enable one on a PS2 as well, but for some bizarre reason it isn't the default, so a lot of PS2 titles shipped without it.

I would imagine most Xbox games run in 24/32-bit color; 16-bit is pretty ugly and the additional cost is relatively small.[/q]

I have a feeling some games leave the flicker filter on even when progressive scan is on, which would explain why some games are still insanely blurry even with pscan. (Resident Evil 4...)


And Xbox has even lower framebuffer bandwidth than GameCube, so wouldn't 24/32-bit color be even worse for performance on it? Or does its advanced memory controller (LMA?) eliminate the performance hit?
 