Don't understand Wii/GC image quality. Dithering, texture colors and all that jazz.

I started a thread a few weeks ago asking about dithering in Wii games, and it got locked for some reason. I hope there isn't some rule against that...

Anyway, I saw the Wii emulator 720p thread, and before even seeing the screenshots, I laughed my ass off at the thought. Even good-looking Wii games get screenshots that make them look piss bad, so these 24-bit textures with all their dithering and crap can't be pretty at 720p, right? Wrong.

Dude, WTF? The games look amazing. I mean, for the first time, I'm looking at non-bullshot (ok, some people like having their AA at 16x) Wii games and I don't see any ugly color issues, no dithering, no banding, no nothing. Clean, beautiful textures with minimal blurring. I figured that even though we can see the details better, the flaws would show up just as much.

I understand that textures designed for an ED/SD screen can still look amazing, but what about all the color dithering and the generally low-color mess we sometimes see in Wii screenshots? Does it have something to do with the way the screenshots are taken?

Here's an example screenshot from Pikmin (no idea which version, not like it matters).
http://image.com.com/gamespot/images/2001/gamecube/pikmin/pikmin_1113_screen002.jpg

There's something incredibly muddy about the image that's also in a lot of Wii games.

Edit - Ok, I found a Pikmin screenshot to compare. This is the only one I can get my hands on so far. It's the title screen, so it's probably not the best comparison.

http://i40.tinypic.com/2cojwqe.png
 
o_O Donkey Kong looks really good in that bottom pic there...


I've no idea why this may be the case; perhaps some of the smart fellows here can answer.
 
Oh, so even though the system wasn't built with high-precision rendering, it doesn't mean the textures have to be at a lower bit depth? That explains a lot.
 
Oh, so even though the system wasn't built with high-precision rendering, it doesn't mean the textures have to be at a lower bit depth? That explains a lot.


It doesn't matter; even if the texture is very high quality, it gets reduced in the frame buffer, and the frame buffer will only produce a 24-bit image.
 
It doesn't matter; even if the texture is very high quality, it gets reduced in the frame buffer, and the frame buffer will only produce a 24-bit image.

I was always under the impression that textures had to be created based on how much frame buffer the system has available. Obviously, I'm dead wrong.

BTW, how much frame buffer is needed for 32-bit textures anyway?

That's how a game looks when you're playing your GameCube/Wii with the included composite cables. Play some of the games with VGA cables and the difference is as clear as night and day.

You know what's weird? I was at the Nintendo World Store a few weeks ago and saw Pikmin running on their HD screens. Even at a glance, I saw the ugly grain moving across the screen as the camera moved, but when I was looking at Wii Fit and Mario Kart in the store, they didn't have that extra ugly grain moving around the screen.
 
BTW, how much frame buffer is needed for 32-bit textures anyway?

A really basic framebuffer is the number of pixels times the colour bit depth. So you have x colour bits per pixel. Once you get into Z buffers and FSAA, I have no idea what is going on.
 
A really basic framebuffer is the number of pixels times the colour bit depth. So you have x colour bits per pixel. Once you get into Z buffers and FSAA, I have no idea what is going on.

Back-Buffer = Pixels * FSAA Depth * (Pixel Colour Depth + Z Buffer Depth)
Front-Buffer = Pixels * (Pixel Colour Depth + Z Buffer Depth)
Total = Back-Buffer + Front-Buffer
from here.
http://www.beyond3d.com/content/articles/4/5

:smile:
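To make that concrete, here's the same formula as a quick Python sketch. The function and parameter names are mine, everything is in bytes, and it just transcribes the article's three lines:

```python
# The Beyond3D formula above, as a quick sanity-check script.
# All sizes in bytes; the names here are mine, not the article's.

def total_framebuffer(width, height, colour_bytes, z_bytes, fsaa_samples=1):
    pixels = width * height
    back_buffer = pixels * fsaa_samples * (colour_bytes + z_bytes)
    front_buffer = pixels * (colour_bytes + z_bytes)  # as the article writes it
    return back_buffer + front_buffer

# 640x480 with 16-bit colour and 16-bit Z, no FSAA:
print(total_framebuffer(640, 480, 2, 2) // 1024, "KB")  # 2400 KB
```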
 
What everyone else said. The problem isn't 24-bit textures (24-bit textures look fine). It's the low-precision render target. Games running in 6:6:6:6 mode (as opposed to 8:8:8, which is prettier but more restrictive) have all the banding and dithering Cube games are famous for. Also, the deflicker filter eliminated some of the dithering...but only worked in interlaced mode.
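If anyone wants to see the banding argument in numbers, here's a toy sketch (mine, not anything GC-specific): truncating an 8-bit channel to 6 bits leaves only 64 levels, so a smooth ramp collapses into visible steps, which is exactly what the dithering tries to hide.

```python
# Toy illustration (not GC code): quantise an 8-bit channel to 6 bits
# and watch a smooth ramp collapse into flat steps -- the banding that
# dithering is meant to hide.

def quantise_to_6bit(v):
    q = v >> 2                  # keep the top 6 of 8 bits (256 -> 64 levels)
    return (q << 2) | (q >> 4)  # expand back to the 0-255 range

ramp = range(32)                # a smooth 32-step slice of an 8-bit gradient
print(sorted(set(quantise_to_6bit(v) for v in ramp)))
# -> [0, 4, 8, 12, 16, 20, 24, 28]: eight flat bands instead of 32 shades
```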
 
And the reason for that 6:6:6:6 format is the 3MB eDRAM in the GPU, I think. That's what holds the framebuffer (like the 2MB framebuffer on a Voodoo1, which gives you 16-bit, double-buffered, Z-buffered 640x480).

Of course an emulator rendering the graphics in 32-bit will look much better.
If you're amazed by that emulator, you should try an N64 emulator: ten times the resolution on a low-end PC, 8x AA, 16x AF, and something like an X360 controller. That's "HD Mario 64" for sure! (I had it running at 900p.)
The awful blur is gone, and such low-res textures are helped tremendously by the AF.
You can even force filtering of the pixelated scores and fonts (in Project64 at least).
Zelda looks brilliant too.
 
Back-Buffer = Pixels * FSAA Depth * (Pixel Colour Depth + Z Buffer Depth)
Front-Buffer = Pixels * (Pixel Colour Depth + Z Buffer Depth)
Total = Back-Buffer + Front-Buffer
from here.
http://www.beyond3d.com/content/articles/4/5

:smile:


Something doesn't compute. Why keep the Z-buffer with the front buffer? Aren't we done once the frame is in the front buffer?
I'm adding up to 2400KB for the Voodoo1 framebuffer with it, and 1800KB without.

I haven't read everything at the link, though, and there's talk of a Z-only pass and other Z-schmoo. So I don't know if I was smart enough to spot a forgotten copy-paste mistake, or if the formula is valid but only in the context of the article.

Anyway, doing it my way, Wii/GC can afford a 720x480 24-bit framebuffer with 16-bit Z just under the 3MB mark.
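Here's that arithmetic spelled out for anyone checking along (a quick Python sketch; the names and byte counts are mine):

```python
# Re-running the numbers above (sizes in bytes; names are mine).
KB = 1024

def framebuffer(width, height, colour_bytes, z_bytes, z_in_front):
    pixels = width * height
    back = pixels * (colour_bytes + z_bytes)
    front = pixels * (colour_bytes + (z_bytes if z_in_front else 0))
    return back + front

# Voodoo1: 640x480, 16-bit colour, 16-bit Z
print(framebuffer(640, 480, 2, 2, True) // KB)   # 2400 KB, with Z kept in the front buffer
print(framebuffer(640, 480, 2, 2, False) // KB)  # 1800 KB, without

# Wii/GC, done "my way": 720x480, 24-bit colour, 16-bit Z, no Z in the front buffer
print(framebuffer(720, 480, 3, 2, False) // KB)  # 2700 KB, just under 3MB (3072 KB)
```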
 
Something doesn't compute. Why keep the Z-buffer with the front buffer? Aren't we done once the frame is in the front buffer?
I'm adding up to 2400KB for the Voodoo1 framebuffer with it, and 1800KB without.

I'm not 100% on the line of thought since I didn't write the article, but I believe it was a specific case for Xenos, because the contents of the eDRAM are resolved to main memory, although even then that last line wouldn't have made sense. The front buffer is simply the colour data sent to the display. :oops:
 
I'm not 100% on the line of thought since I didn't write the article, but I believe it was a specific case for Xenos, because the contents of the eDRAM are resolved to main memory, although even then that last line wouldn't have made sense. The front buffer is simply the colour data sent to the display. :oops:

Oh crap. Seems I made a mistake and you're right; just ignore that, as it's for Xenos only :oops:
 
Anyway, doing it my way, Wii/GC can afford a 720x480 24-bit framebuffer with 16-bit Z just under the 3MB mark.

The eDRAM of Wii/GC isn't structured that way. 1MB is texture cache, and you have 2MB for frame/Z. It's not unified, so you can't use all 3 MB for a frame buffer. Also, you can't texture directly from main RAM, so even if you could use the whole thing for the frame, you'd be stuck without textures.
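As a rough check on that 2MB frame/Z budget (my arithmetic, assuming the 48-bits-per-pixel layout the 24-bit colour modes use, with colour and Z stored together):

```python
# Rough check of the 2MB frame/Z budget (my arithmetic, not official docs).
# In the 24-bit colour modes the embedded framebuffer stores colour and Z
# together at 48 bits (6 bytes) per pixel.
pixels = 640 * 528              # roughly the practical resolution ceiling
print(pixels * 6)               # 2,027,520 bytes for colour + Z
print(2 * 1024 * 1024)          # 2,097,152 bytes of frame/Z eDRAM: it just fits
```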

Something doesn't compute. Why keep the Z-buffer with the front buffer?

If I recall correctly from ERP's discussions of GameCube technology, the front buffer gets spit out to main RAM, assuming you're double-buffering (Cube games with screen tearing are very, very rare, so I assume most games do this)... so the answer is, "You don't."
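For scale, here's roughly what that copy-out costs in main RAM (my arithmetic; as I recall, the copied-out external framebuffer is stored at 2 bytes per pixel):

```python
# Rough cost of double-buffering in main RAM (my arithmetic; as I recall
# the copied-out external framebuffer is stored at 2 bytes per pixel).
frame = 640 * 480 * 2           # 614,400 bytes per copied-out frame
print(2 * frame // 1024, "KB")  # 1200 KB of main RAM for two of them
```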
 