Could Wii developers use the 24 MB of "internal" T1-SRAM........

Mobius1aic

Quo vadis?
Veteran
.......instead of the embedded DRAM memory? Sure, it might be slower, but is it possible? I'm not exactly an expert on these things, but especially considering that console GPUs don't have embedded DRAM, I think it makes for an interesting case study. I do understand that consoles are a much different environment from PCs, where developers have only one configuration to program for instead of a myriad of CPUs, GPUs, RAM, etc.......

Enlighten me please
 
I take it you mean use it for a larger framebuffer? As I understand it the "fault" is with the video scanout hardware. It simply can't cope with anything above EDTV resolutions.
You can still choose to render the scene in fields or tiles at a higher resolution and then downsample for AA, of course.
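The "render higher, then downsample" approach mentioned above can be sketched in a few lines. This is an illustrative sketch only, assuming a simple row-major grayscale buffer; it is not actual GX API usage or the hardware's copy-filter path.

```python
# Hedged sketch: 2x2 box-filter downsample of a supersampled buffer,
# i.e. "render at 2x resolution, then average down" antialiasing.
# Buffer layout (flat row-major list of intensity values) is an
# assumption for illustration.

def downsample_2x2(src, w, h):
    """Average each 2x2 block of a w*h buffer into one output pixel."""
    dst = []
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            s = (src[y * w + x] + src[y * w + x + 1] +
                 src[(y + 1) * w + x] + src[(y + 1) * w + x + 1])
            dst.append(s // 4)  # box filter: mean of the four samples
    return dst
```

The same idea applies whether the high-resolution source comes from one big render or from stitching together separately rendered tiles.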
 
I take it you mean use it for a larger framebuffer? As I understand it the "fault" is with the video scanout hardware. It simply can't cope with anything above EDTV resolutions.
You can still choose to render the scene in fields or tiles at a higher resolution and then downsample for AA, of course.

Or just render at 480i @ 23.9fps instead of 480p to save RAM?
 
Rendering each frame as two interlaced fields doesn't save you any RAM, nor does rendering at a lower framerate.
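A quick back-of-the-envelope check makes the point: two interlaced fields contain exactly as many pixels as one progressive frame, so field rendering saves no framebuffer memory. The 6 bytes per pixel (24-bit color + 24-bit Z) is an assumed figure roughly matching the GC/Wii eFB format; the exact layout is not taken from this thread.

```python
# Sanity check: two 640x240 fields vs. one 640x480 frame.
# bytes_per_pixel=6 assumes 24-bit color + 24-bit Z (an approximation
# of the GameCube/Wii eFB format, used here for illustration).

def fb_bytes(w, h, bytes_per_pixel=6):
    """Framebuffer size in bytes for a w*h buffer."""
    return w * h * bytes_per_pixel

full_frame = fb_bytes(640, 480)      # one progressive frame
two_fields = 2 * fb_bytes(640, 240)  # odd field + even field
assert full_frame == two_fields      # identical memory footprint
```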
 
So it really never was an issue with the embedded frame buffer on the GPU? The Wii just sucks at z-buffering and rasterization?

Well that brings me to another question: on the PS2 and GC/Wii, how was z-buffering done? As far as I know they don't have ROPs like modern graphics hardware does, or do they?
 
So it really never was an issue with the embedded frame buffer on the GPU? The Wii just sucks at z-buffering and rasterization?

Well that brings me to another question: on the PS2 and GC/Wii, how was z-buffering done? As far as I know they don't have ROPs like modern graphics hardware does, or do they?

The embedded framebuffer is an issue: for anything apart from menu screens and very simple games it just wouldn't be worth it.
Look at how little tiling is being used on the 360, a system that is supposedly designed for it...
Splitting up the scene into tiles or having to render it multiple times would kill performance.

And the Wii and PS2 don't suck at z-buffering. What makes you say that?
On the contrary, having a z-buffer that's so easily manipulable makes lots of effects very cheap (z-fog, stencil volumes, selective blanking out of certain ranges, etc.).
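As an example of the kind of cheap depth-based effect mentioned above, a z-fog factor can be computed directly from the z-buffer value. The linear ramp and the start/end parameters here are illustrative assumptions, not the actual GX fog equation.

```python
# Hedged sketch: linear depth fog driven straight off the z value.
# fog_start/fog_end are hypothetical range parameters for illustration;
# real hardware offers several falloff curves, not just linear.

def linear_fog(z, fog_start, fog_end):
    """Return a fog blend factor: 0.0 = no fog, 1.0 = fully fogged."""
    if z <= fog_start:
        return 0.0
    if z >= fog_end:
        return 1.0
    return (z - fog_start) / (fog_end - fog_start)  # clamped linear ramp
```

Because the factor depends only on z, the same per-pixel comparison logic also covers tricks like blanking out a selected depth range.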
 
Look at how little tiling is being used on the 360, a system that is supposedly designed for it...

How little? There is a significant number of titles with AA; just read the Eurogamer comparison articles or our own never-ending thread.
 