GameCube Framebuffer

Apoc

I did a search and found nothing, so if this has already been posted, just delete it.

What's the GameCube framebuffer format? I remember reading that it wasn't 640x480 at 32 bits, and a friend told me that RE4 renders at 640x480x16. If this is true I'm very surprised, since it has its own 1T-SRAM for the framebuffer, and it would gain nothing from using 16 bits for color.

Is this true, or am I totally wrong? Just go easy on me ;) .

Thanks in advance.
 
IIRC (and it's been a while), it supports 24-bit or 16-bit framebuffers.

You would generally use a 16-bit framebuffer if you were using the AA; it lets the full frame fit into the EDRAM so you don't have to resort to drawing 2 tiles.
 
ERP said:
IIRC (and it's been a while), it supports 24-bit or 16-bit framebuffers.

You would generally use a 16-bit framebuffer if you were using the AA; it lets the full frame fit into the EDRAM so you don't have to resort to drawing 2 tiles.

Thanks.
 
From what I know (and I don't think that's a lot), the GameCube can only render in 24 bits total, including alpha. So if you are using alpha, you are limited to 6 bits per channel (which works out to 24 bits: 18 for color and 6 for alpha). If you aren't using alpha, you get a maximum of 8 bits per channel (also 24 bits).

Other than the small size of the on-die framebuffer (~2MB), the only thing that would limit color accuracy is Flipper (the graphics processor), and I think that's the case here. ArtX has had a pretty good track record of designing very efficient hardware by cutting corners where they are mostly unnoticed, and Flipper is a good example of this.
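A minimal sketch of why an 18-bit color buffer bands: quantizing an 8-bit channel down to 6 bits (as a 6/6/6/6 framebuffer format would) collapses every four adjacent input values into one output step. The function below is just illustrative arithmetic, not anything from the actual hardware:

```python
# Sketch: quantizing an 8-bit channel down to 6 bits, as a 6/6/6/6
# framebuffer format would. Every 4 adjacent input values collapse
# into one output step, which shows up on screen as banding/dithering.
def quantize_to_6bit(value8):
    value6 = value8 >> 2                  # drop the two low bits (8 -> 6 bits)
    return (value6 << 2) | (value6 >> 4)  # expand back to 8 bits for display

# A smooth 8-bit gradient becomes a staircase of 64 distinct levels:
levels = sorted({quantize_to_6bit(v) for v in range(256)})
print(len(levels))  # 64 distinct output values instead of 256
```

Hardware typically dithers to hide exactly this kind of staircase, which is why the artifact reads as dithering rather than hard banding.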
 
see colon said:
From what I know (and I don't think that's a lot), the GameCube can only render in 24 bits total, including alpha. So if you are using alpha, you are limited to 6 bits per channel (which works out to 24 bits: 18 for color and 6 for alpha). If you aren't using alpha, you get a maximum of 8 bits per channel (also 24 bits).

Other than the small size of the on-die framebuffer (~2MB), the only thing that would limit color accuracy is Flipper (the graphics processor), and I think that's the case here. ArtX has had a pretty good track record of designing very efficient hardware by cutting corners where they are mostly unnoticed, and Flipper is a good example of this.

How does that explain the dithering in RE4? Is it the bits per channel? But that wouldn't explain the texture quality of Metroid Prime 2, which has some of the best textures I've seen in a GC game (with no dithering whatsoever).
 
RE4 uses alpha, so it's only 6 bits per color channel at best (18 bits for color). It's slightly better than 16-bit, but well below true 24-bit color (or 32-bit, which is actually 8 bits per color channel plus 8 bits for alpha).

As for the texture quality in MP2, I confess I haven't spent much time with it (I played MP all the way through, but have only played in-store demos of 2; it's on my list of games to pick up when I get a chance, though. MP was fantastic). Texture quality is generally pretty easy to maintain even while limiting color depth, though. You can limit the texture's colors (if that's what you need) and compress the texture before using it in-game (the GC has hardware S3TC), so you have a pretty good idea how it's going to look before you even use it in a realtime situation.
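To give a sense of why hardware texture compression helps here: S3TC in its DXT1-style form (which the GameCube's compressed texture format resembles) packs each 4x4 block of texels into 8 bytes, i.e. 4 bits per texel. A rough size calculation, with the block layout as the only assumption:

```python
# Sketch: S3TC (DXT1-style) compresses each 4x4 block of texels into
# 8 bytes -- 4 bits per texel -- versus 32 bits for uncompressed RGBA8.
def s3tc_size_bytes(width, height):
    blocks_x = (width + 3) // 4     # round up to whole 4x4 blocks
    blocks_y = (height + 3) // 4
    return blocks_x * blocks_y * 8  # 8 bytes per block

print(s3tc_size_bytes(256, 256))  # 32768 bytes (32 KB)
print(256 * 256 * 4)              # 262144 bytes as uncompressed RGBA8
```

An 8:1 ratio like that is why compressed textures can stay sharp even on a console with little texture memory: you can afford a higher-resolution source image for the same footprint.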
 
see colon said:
RE4 uses alpha, so it's only 6 bits per color channel at best (18 bits for color). It's slightly better than 16-bit, but well below true 24-bit color (or 32-bit, which is actually 8 bits per color channel plus 8 bits for alpha).


Sorry for bringing this up, but I'm curious. Is the PS2 version using the same scheme?
 
ERP said:
IIRC (and it's been a while), it supports 24-bit or 16-bit framebuffers.

You would generally use a 16-bit framebuffer if you were using the AA; it lets the full frame fit into the EDRAM so you don't have to resort to drawing 2 tiles.

Could a bigger framebuffer support anti-aliasing in 24-bit color, or is it a limitation of the GPU?
 
Alstrong said:
Sorry for bringing this up, but I'm curious. Is the PS2 version using the same scheme?
I don't think so. To be honest, the only reason I know anything about what's going on with banding/dithering on the GC is that I noticed it in a handful of games, especially multiplatform games, and did a bunch of research to figure out what was going on.
 
Urian said:
Could a bigger framebuffer support anti-aliasing in 24-bit color, or is it a limitation of the GPU?

You can AA at 24 bits, but then a 640x480 framebuffer will not fit in the 2MB of EDRAM, so you have to render in tiles (much like Xenos, but without the hardware to assist you).
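A rough back-of-the-envelope sketch of that tiling constraint: once color, Z, and AA samples per pixel are fixed, the number of passes is just the frame size divided by the embedded buffer size, rounded up. The bytes-per-pixel figures below are illustrative, not exact hardware layouts:

```python
# Sketch: how many tiles (render passes) a frame needs when it exceeds
# the ~2 MB embedded framebuffer. Per-pixel byte counts are illustrative.
EFB_BYTES = 2 * 1024 * 1024

def tiles_needed(width, height, bytes_per_pixel):
    frame_bytes = width * height * bytes_per_pixel
    return -(-frame_bytes // EFB_BYTES)  # ceiling division

# 640x480 with 24-bit color + 24-bit Z, no AA: 6 bytes/pixel -> fits.
print(tiles_needed(640, 480, 6))   # 1 pass
# Same frame with 3-sample AA: 3 * (3 + 3) = 18 bytes/pixel -> tiled.
print(tiles_needed(640, 480, 18))  # 3 passes under these assumptions
```

Each extra pass means resubmitting the scene's geometry for that strip of the screen, which is why developers preferred the 16-bit format that let AA fit in a single pass.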
 
Alstrong said:
Sorry for bringing this up, but I'm curious. Is the PS2 version using the same scheme?

The PS2 has no 6/6/6/6 format, so it would use 8/8/8/8.
The GameCube has no 8/8/8/8 format, so if it needs destination alpha then it has to use 6/6/6/6.

Neither one has a stencil buffer, but both support stencil-like ops on the destination alpha channel, which is what I'd guess RE would use it for.
 
I was under the impression that in order to fit the AA'd framebuffer in the EDRAM, you compressed the Z-buffer. I don't see much reason for that functionality to exist otherwise.
 
Steve Dave Part Deux said:
I was under the impression that in order to fit the AA'd framebuffer in the EDRAM, you compressed the Z-buffer. I don't see much reason for that functionality to exist otherwise.

Z-buffer compression exists only to reduce bandwidth.
It's lossless, so worst case it will be exactly the same size as an uncompressed Z-buffer.

The NGC also doesn't have Z-buffer compression.
 
Yeah, that's true. I guess what I meant is that I thought there were multiple ways to configure the Z-buffer to save space by using less precision. Which isn't compression, obviously, but sometimes my brain doesn't work right.
 
Steve Dave Part Deux said:
Yeah, that's true. I guess what I meant is that I thought there were multiple ways to configure the z-buffer to save space by using less precision.
Just a 16-bit framebuffer with the GCN's 3-sample AA @ 640x480 takes up 1.8MB. If you wanted a Z-buffer to fit into the available 2MB, it would need to use less than 3 bits/pixel of precision...

The amount of VRAM is exactly enough to fit either
a 640x528 buffer with 24-bit Z & color and no AA,
or
a 640x264 buffer with 16-bit Z & color and 3-sample AA.
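The arithmetic behind those two configurations can be checked directly: each pixel stores color plus Z, multiplied by the sample count when AA is on, and both configurations land on the same byte total:

```python
# Checking the buffer-size arithmetic: pixels * samples * (color + Z bytes).
def efb_bytes(width, height, color_bytes, z_bytes, samples=1):
    return width * height * samples * (color_bytes + z_bytes)

# 640x528, 24-bit color + 24-bit Z, no AA:
print(efb_bytes(640, 528, 3, 3))             # 2027520 bytes (~1.93 MB)
# 640x264, 16-bit color + 16-bit Z, 3-sample AA:
print(efb_bytes(640, 264, 2, 2, samples=3))  # 2027520 bytes -- same total
```

Halving the height while tripling the samples and halving both color and Z depth trades resolution for AA at an identical memory footprint, which is presumably exactly how the buffer size was chosen.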
 
Is there ANY game on GC that even uses the 3-sample AA? Other than perhaps on the title screen or such, I mean.

I've never been able to spot it in action, but then, I've only seen a handful of GC titles to begin with...
 