Don't understand Wii/GC image quality. Dithering, texture colors and all that jazz.

Discussion in 'Console Technology' started by DeadlyNinja, Mar 31, 2009.

  1. DeadlyNinja

    Veteran

    Joined:
    Jan 3, 2007
    Messages:
    1,221
    Likes Received:
    4
    I started a thread a few weeks ago asking about dithering in Wii games, and it got locked for some reason. I hope there isn't some rule against that...

    Anyway, I saw the Wii emulator 720p thread, and before even seeing the screenshots, I laughed my ass off at the thought. Even good-looking Wii games get screens that make them look piss bad, so surely these 24-bit textures with all their dithering and crap can't be pretty at 720p, right? Wrong.

    Dude, WTF? The games look amazing. I mean, for the first time, I'm looking at non-bullshot (ok, some people like having their AA at 16x) Wii games and I don't see any ugly color issues, no dithering, no banding, no nothing. Clean, beautiful textures with minimal blurring. I'd have figured that even though we can see the details better, the flaws would show up just as much.

    I understand that textures designed for an ED/SD screen can still look amazing, but what about all the color dithering and the generally low-colored mess we see in Wii screens sometimes? Does it have something to do with the way the screenshots are taken?

    Here's an example screenshot from Pikmin (no idea which version, not that it matters).
    http://image.com.com/gamespot/images/2001/gamecube/pikmin/pikmin_1113_screen002.jpg

    There's something incredibly muddy about the image that's also in a lot of Wii games.

    Edit - Ok, I found a Pikmin screenshot to compare. This is the only one I can get my hands on so far. It's the title screen, so it's probably not the best comparison.

    http://i40.tinypic.com/2cojwqe.png
     
    #1 DeadlyNinja, Mar 31, 2009
    Last edited by a moderator: Mar 31, 2009
  2. Aeoniss

    Regular

    Joined:
    Mar 23, 2007
    Messages:
    557
    Likes Received:
    0
    Location:
    Nebraska
    o_O Donkey Kong looks really good in that bottom pic there...


    I've no idea why this might be the case; perhaps some of the smart fellows here can answer.
     
  3. DeadlyNinja

    Veteran

    Joined:
    Jan 3, 2007
    Messages:
    1,221
    Likes Received:
    4
    Hehe, you think Donkey Kong looks good in that screen? Wait till you see him up close.

    http://img8.imageshack.us/img8/3969/brawl6hd.png

    Strangely enough, I'm going crazy trying to find some ugly Brawl screens to compare. The ones I can find have been shrunk down to 400p or some crap.
     
  4. AlBran

    AlBran Ferro-Fibrous
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,583
    Likes Received:
    5,681
    Location:
    ಠ_ಠ
  5. DeadlyNinja

    Veteran

    Joined:
    Jan 3, 2007
    Messages:
    1,221
    Likes Received:
    4
    Oh, so even though the system wasn't built for high-precision rendering, that doesn't mean the textures have to be at a lower bit depth? That explains a lot.
     
  6. pc999

    Veteran

    Joined:
    Mar 13, 2004
    Messages:
    3,628
    Likes Received:
    31
    Location:
    Portugal

    It doesn't matter; even if the texture has very high quality, it gets reduced in the frame buffer, which will only produce a 24-bit image.
     
  7. sfried

    Regular

    Joined:
    Apr 9, 2006
    Messages:
    542
    Likes Received:
    2
  8. DeadlyNinja

    Veteran

    Joined:
    Jan 3, 2007
    Messages:
    1,221
    Likes Received:
    4
    I was always under the impression that textures had to be created based on how much frame buffer the system has available. Obviously, I'm dead wrong.

    BTW, how much frame buffer is needed for 32-bit textures anyway?

    You know what's weird? I was at the Nintendo World Store a few weeks ago and saw Pikmin running on their HD screens. Even at a glance, I could see the ugly grain moving across the screen as the camera moved, but when I looked at Wii Fit and Mario Kart in the store, they didn't have that extra ugly grain moving around the screen.
     
  9. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    13,161
    Likes Received:
    3,546
    A really basic framebuffer is the number of pixels times the colour bit depth, so you have x colour bits per pixel. Once you get into Z buffers and FSAA, I have no idea what's going on.
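    A minimal sketch of that "pixels times bit depth" arithmetic in Python (a hypothetical helper for illustration, not from any SDK):

```python
def framebuffer_bytes(width, height, color_bits, z_bits=0):
    """Basic framebuffer size: pixels times bits per pixel, in bytes."""
    return width * height * (color_bits + z_bits) // 8

# A single 640x480 colour buffer at 24 bits per pixel:
print(framebuffer_bytes(640, 480, 24) / 1024)  # 900.0 (KB)

# Add a 16-bit Z buffer and it grows accordingly:
print(framebuffer_bytes(640, 480, 24, 16) / 1024)  # 1500.0 (KB)
```

    Double (or triple) buffering multiplies the colour portion by the number of buffers; as noted, FSAA complicates things beyond this.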
     
  10. Betanumerical

    Veteran

    Joined:
    Aug 20, 2007
    Messages:
    1,544
    Likes Received:
    10
    Location:
    In the land of the drop bears
    from here.
    http://www.beyond3d.com/content/articles/4/5

    :smile:
     
  11. fearsomepirate

    fearsomepirate Dinosaur Hunter
    Veteran

    Joined:
    Sep 1, 2005
    Messages:
    2,743
    Likes Received:
    65
    Location:
    Kentucky
    What everyone else said. The problem isn't 24-bit textures (24-bit textures look fine). It's the low-precision render target. Games running in 6:6:6:6 mode (as opposed to 8:8:8, which is prettier but more restrictive) have all the banding and dithering Cube games are famous for. Also, the deflicker filter eliminated some of the dithering...but only worked in interlaced mode.
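    The banding from a 6-bit-per-channel render target can be sketched numerically. This toy quantizer is my own illustration (not how Flipper actually rounds); it maps a smooth 8-bit gradient onto the fewer levels a 6:6:6:6 buffer can represent:

```python
def quantize(value, bits):
    """Quantize an 8-bit channel value to `bits` of precision,
    then expand it back to the 8-bit range, as a display would."""
    levels = (1 << bits) - 1
    q = round(value / 255 * levels)
    return round(q / levels * 255)

# Eight adjacent shades in a smooth 8-bit gradient...
gradient = list(range(64, 72))
# ...collapse onto just three distinct values at 6 bits per channel:
print([quantize(v, 6) for v in gradient])  # [65, 65, 65, 69, 69, 69, 69, 73]
```

    Those visible steps are the banding; dithering trades them for the moving grain described earlier in the thread.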
     
  12. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    and the reason for that 6:6:6:6 format is the 3MB eDRAM in the GPU, I think. It's the thing holding the framebuffer (like the 2MB framebuffer on a Voodoo1, giving 16-bit double-buffered, Z-buffered 640x480).

    of course an emulator running the graphics in 32-bit will look much better.
    If you're amazed by that emulator, you should try an N64 emulator: 10 times the resolution on a low-end PC, 8x AA, 16x AF, and something like an X360 controller. That's "HD Mario 64" for sure! (I had it in 900p.)
    The awful blur is gone, and such low-res textures are tremendously helped by the AF.
    You can even force filtering of the pixelated scores and fonts (in Project64 at least).
    Zelda looks brilliant too.
     
  13. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256

    something doesn't compute. Why keep the Z-buffer with the front-buffer? Aren't we done when the frame is in the front-buffer?
    I'm adding up to 2400KB for the voodoo1 framebuffer with it, and 1800KB without.

    I've not read everything from the link though and there's talk of Z-only pass and other Z-schmoo. So I don't know if I was smart enough to spot a forgotten copy-paste mistake, or if the formula is valid but only in the context of the article.

    anyway doing it my way, Wii/GC can afford a 720x480 24bit framebuffer with 16bit Z right under the 3MB mark.
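    One reading of that arithmetic, sketched in Python (my own back-of-the-envelope numbers, not taken from the article):

```python
def buffer_kb(width, height, bits):
    """Size of one width x height buffer at `bits` per pixel, in KB."""
    return width * height * bits / 8 / 1024

# Voodoo1-style 640x480 at 16 bits: each buffer is 600 KB.
v1 = buffer_kb(640, 480, 16)
print(3 * v1)  # 1800.0 -- front + back + one Z buffer
print(4 * v1)  # 2400.0 -- if Z is counted for both colour buffers

# Wii/GC-style 720x480: double-buffered 24-bit colour plus one 16-bit Z.
gc = 2 * buffer_kb(720, 480, 24) + buffer_kb(720, 480, 16)
print(gc)  # 2700.0 -- under the 3072 KB (3 MB) mark
```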
     
    #13 Blazkowicz, Apr 3, 2009
    Last edited by a moderator: Apr 3, 2009
  14. AlBran

    AlBran Ferro-Fibrous
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,583
    Likes Received:
    5,681
    Location:
    ಠ_ಠ
    I'm not 100% on the line of thought since I didn't write the article, but I believe it was a specific case for Xenos, because the contents of the eDRAM are resolved to main memory; otherwise the last line wouldn't have made sense. The front buffer is simply the colour data sent to the display. :oops:
     
  15. Betanumerical

    Veteran

    Joined:
    Aug 20, 2007
    Messages:
    1,544
    Likes Received:
    10
    Location:
    In the land of the drop bears
    Oh crap. Seems I made a mistake and you're right; just ignore that, as it's for Xenos only :oops:
     
  16. fearsomepirate

    fearsomepirate Dinosaur Hunter
    Veteran

    Joined:
    Sep 1, 2005
    Messages:
    2,743
    Likes Received:
    65
    Location:
    Kentucky
    The eDRAM of Wii/GC isn't structured that way. 1MB is texture cache, and you have 2MB for frame/Z. It's not unified, so you can't use all 3 MB for a frame buffer. Also, you can't texture directly from main RAM, so even if you could use the whole thing for the frame, you'd be stuck without textures.

    If I recall correctly from ERP's discussions on Gamecube technology, the front buffer gets spit out to main RAM, assuming you're double buffering (Cube games with screen tearing are very, very rare, so I assume most games do this)...so the answer is, "You don't."
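    A back-of-the-envelope check on that 2MB colour/Z budget (the 640x528 maximum and the pixel formats here are commonly cited GameCube figures, not something established in this thread):

```python
EFB_BYTES = 2 * 1024 * 1024  # the 2 MB colour/Z portion of the 3 MB eDRAM

def fits_in_efb(width, height, color_bits, z_bits):
    """Bytes needed for one colour + Z target, and whether 2 MB holds it."""
    needed = width * height * (color_bits + z_bits) // 8
    return needed, needed <= EFB_BYTES

# 24-bit colour with 24-bit Z at 640x528 just squeezes in:
print(fits_in_efb(640, 528, 24, 24))  # (2027520, True)

# A hypothetical full 8:8:8:8 target with 24-bit Z would not:
print(fits_in_efb(640, 528, 32, 24))  # (2365440, False)
```

    Which is consistent with the hardware offering 24-bit (8:8:8 or 6:6:6:6) targets rather than a full 32-bit one.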
     