Strengths and weaknesses of GameCube relative to its peers

If a game's memory footprint excluding textures was large enough, a multiplatform game might be able to have better textures on PS2 even with the GC's superior texture compression.

Dreamcast used a mixture of 2 bpp and 4 bpp VQ-compressed textures, and also 4 bpp and 8 bpp CLUT on occasion.

PS2 had to use a mixture of 4 and 8 bpp CLUT to match Dreamcast's texture compression; something like 2-3x as much memory was needed for similar quality.

GameCube and Xbox used 4 bpp S3TC-style texture compression, generally giving quality above 8 bpp CLUT on PS2.
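As a back-of-envelope comparison, here's the storage for one 256x256 texture at the bit depths mentioned above (a Python sketch; palette and VQ codebook overhead, a couple of KB at most, is ignored):

```python
# Storage for one 256x256 texture at the bit depths discussed above.
texels = 256 * 256

for name, bpp in [("VQ 2 bpp (DC)", 2),
                  ("CLUT 4 bpp", 4),
                  ("CLUT 8 bpp", 8),
                  ("S3TC-style 4 bpp (GC/Xbox)", 4),
                  ("raw 16-bit, no compression", 16)]:
    kb = texels * bpp / 8 / 1024
    print(f"{name:28s} {kb:6.1f} KB")
```

So 8 bpp CLUT costs twice the bytes of the 4 bpp S3TC-style formats for what was usually worse quality, which is where the "2-3x as much memory" figure comes from.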

The trouble for GC was that it had only 24 MB of main RAM. If a game relied on having textures loaded into main RAM, it could potentially run into trouble.

Imagine a PS2 game that (for the sake of argument) had 12 MB of textures in main RAM and 20 MB of "other data". The same game on GC might have only 4 MB left for textures. Even with GC's superior texture compression, it might have to reduce texture resolution to make everything fit. Xbox, meanwhile, could just drop anything it wanted into its 64 MB of memory, likely with a 2x2 increase in texture resolution.
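Worked out in Python, using the made-up budget from the paragraph above (and assuming the PS2 textures are 8 bpp CLUT, halved by GC's 4 bpp format at similar quality):

```python
# The hypothetical budget above, worked out.
other_data = 20              # MB of non-texture data (given above)
ps2_textures = 12            # MB of CLUT texture data on PS2

gc_budget = 24 - other_data  # GC main RAM left for textures: 4 MB
gc_needed = ps2_textures / 2 # 4 bpp vs 8 bpp: 6 MB for the same set
print(f"GC needs {gc_needed:.0f} MB, has {gc_budget} MB "
      f"({gc_needed / gc_budget:.1f}x over budget)")

xbox_budget = 64 - other_data        # 44 MB free on Xbox
doubled_res = gc_needed * 4          # 2x2 res increase = 4x the bytes
print(f"Xbox: {doubled_res:.0f} MB for a 2x2 res bump, "
      f"{xbox_budget} MB available")
```

Even compressed, the GC copy of the texture set is 1.5x over its leftover budget, while the Xbox could quadruple it and still have room.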
 
Yeah, I'm sure it's possible for the PS2 to have better textures when it's the lead platform and the game relies on main memory. But that's only if you don't utilize GC's eDRAM, which also supports compressed textures. If I'm not mistaken, PS2's eDRAM had no texture compression. I'd be a bit surprised, though, because it's Eurocom, who have done a lot of good work on Nintendo consoles. I'm pretty sure the GameCube version of Nightfire (also by Eurocom) is at least on par with the PS2 version visually, and it has more bots/enemies. And I know their Batman game looks significantly better on the Cube.

The PS2 has its advantages, but I know textures aren't one of them, assuming the most is being done with both consoles.

But yeah, that audio RAM in the Cube hurt it a bit.

---

Side note: I've watched a few lengthy RalliSport Challenge 2 videos, and that game is pretty impressive. The lighting and overall look is quite nice. I can't tell where the normal maps are, though; I couldn't find any development notes. Sucks that I don't have the old Xbox anymore and the 360 can't play it. I'd love to try it :/
 
Because the PS2 actually had the processing power on the CPU side to do it (~6 GFLOPS in its dedicated vector units vs. the GameCube's ~1.9, I think).
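Rough peak math behind those ballpark figures (a sketch: the Gekko number assumes its paired-single FMA at the commonly cited 485 MHz; the EE number is Sony's quoted peak for the whole chip, FPU plus both VUs):

```python
# Gekko: paired-single FMA = 2 singles x 2 flops = 4 flops/cycle.
gekko_gflops = 485e6 * 4 / 1e9          # ~1.94, matching the "1.9" above
ee_gflops = 6.2                         # Sony's quoted peak for the EE

print(f"Gekko ~{gekko_gflops:.2f} GFLOPS vs EE ~{ee_gflops} GFLOPS "
      f"(~{ee_gflops / gekko_gflops:.1f}x)")
```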

More detail here.

http://www.sega-16.com/forum/showth...mcast-Graphics&p=645458&viewfull=1#post645458

http://www.sega-16.com/forum/showth...mcast-Graphics&p=645473&viewfull=1#post645473

Great! Thank you for the links!

It's well known that the PS2 can produce far more polygons than the GC, and if you question that then you are seriously ill-informed. But the PS2 produced many times more polygons on screen because it had to: it's a multi-pass renderer. By and large, the same polygons were drawn multiple times using its immense polygon and pixel-drawing capabilities (in which the PS2 was by far the most powerful console of its generation, and quite probably will always be relatively the most powerful console in those two areas, since we use different techniques now). The end result was using several triangles to achieve the same look as one triangle on the other machines with single-pass multitexturing, etc.

In the PS2 thread, Corysama said that there's no need to calculate the polygons again for multipass; you just send them to the GS again. So it's the same polygon count. :D
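A toy cost model of that exchange (the triangle and layer counts are hypothetical, purely illustrative): multipass inflates the "polygons drawn" statistic once per texture layer, while the transform work stays constant.

```python
# Toy cost model: multipass redraws the same triangles once per texture
# layer, so "polygons drawn" scales with layers while transform doesn't.
def cost(tris, layers, single_pass):
    transformed = tris                          # each tri transformed once
    drawn = tris if single_pass else tris * layers
    return transformed, drawn

for name, sp in [("PS2 multipass", False), ("GC/Xbox single-pass", True)]:
    t, d = cost(tris=100_000, layers=3, single_pass=sp)
    print(f"{name:20s} transformed={t:,} drawn={d:,}")
```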
 
Yeah, I'm sure it's possible for the PS2 to have better textures when it's the lead platform and the game relies on main memory. But that's only if you don't utilize GC's eDRAM, which also supports compressed textures. If I'm not mistaken, PS2's eDRAM had no texture compression.
Yes, but on PS2 only 1 MB of eDRAM was used for textures; however, it was possible to rewrite that 1 MB up to 16 times per frame.
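Back-of-envelope on that figure (the 16-rewrites-per-frame number is from the post above; the 1.2 GB/s figure is the commonly cited peak of the 64-bit, 150 MHz EE-to-GS path):

```python
# 1 MB texture buffer in the GS's 4 MB eDRAM, refilled 16x per frame.
buffer_mb, rewrites, fps = 1, 16, 60

per_frame = buffer_mb * rewrites         # 16 MB of unique texels/frame
upload_gbs = per_frame * fps / 1000      # ~0.96 GB/s of uploads at 60 fps
bus_gbs = 8 * 150e6 / 1e9                # EE->GS path: 64-bit @ 150 MHz

print(f"{per_frame} MB/frame -> {upload_gbs:.2f} GB/s "
      f"of the {bus_gbs:.1f} GB/s bus")
```

So at 60 fps the full 16 rewrites would eat most of that bus just on texture uploads, which is why it was a ceiling rather than a routine practice.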

I'd be a bit surprised, though, because it's Eurocom, who have done a lot of good work on Nintendo consoles.

Eurocom made 16 games for PS2 and only 9 for GameCube, one of which was just a port. So Sphinx and the Cursed Mummy was Eurocom's 5th PS2 game and 4th GameCube game.

But yeah, that audio RAM in the Cube hurt it a bit.

AFAIK some studios used main RAM for sound instead.
 
According to Wikipedia (which could be wrong), Eurocom had done the same number of games on each platform by the time Sphinx came out. They did Crash Bandicoot: The Wrath of Cortex for GameCube while another studio handled the other versions.

Basically, you're saying some studios may not have used the GameCube's A-RAM, or at least not all of it?
 
Any comparison? I find that a bit odd considering that's a Eurocom game.
Screenshots from emulators.
http://screenshotcomparison.com/comparison/203773
[Paired emulator screenshots follow, alternating GameCube (gxpe78-*.png) and PS2 (ps2_*.png) captures of Sphinx and the Cursed Mummy.]
 
Texture compression is vastly overrated as a factor in this discussion. The important thing from a visual standpoint in that generation was by far the resolution of the textures, not their colour depth (and to a large extent it still is). In that regard all consoles were equal.
It is quite possible to get effective 2-bit or even 1-bit textures on PS2 by packing layers into the same texture data and swapping the palette (see the sketch at the end of this post).
The Dreamcast had room for only about 5 MB of textures per frame (and probably per presently loaded environment as well). If we're being generous, we could double that because of the 2 bpp compression capability, even though that is probably unrealistic.
That means it had a significantly smaller texture pool per frame than the PS2. Even then it was more than competitive with regard to textures against all the other consoles released until 2005. Just look at something like Sonic Adventure 2.
The thing that probably troubled developers with PS2 texturing was the perceived need to cram textures into single pages of eDRAM, resulting in a lot of 64x64 textures. It's true that it's an advantage to have the textures on the same page if you want to blend MIP maps, but it's only the top level that would be troublesome, and that's a small part of the total frame.
Alternatively, you could just use the z-buffer as an alpha mask for the first MIP level.
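A minimal Python sketch of the palette-swap packing mentioned above (the 4-shade palettes and layer values are made-up illustrations, not any game's actual data): two independent 2-bit layers live in one 4 bpp CLUT texture, and the active palette selects which one decodes.

```python
# Two independent 2-bit layers packed into one 4 bpp CLUT texture.
# Each stored 4-bit index holds layer A in its high 2 bits and layer B
# in its low 2 bits; the active palette decides which half is visible.
GRAYS = [0, 85, 170, 255]                 # 4 shades per 2-bit layer

# Palette A ignores the low bits, palette B ignores the high bits.
palette_a = [GRAYS[i >> 2] for i in range(16)]
palette_b = [GRAYS[i & 3] for i in range(16)]

layer_a, layer_b = 2, 1                   # two 2-bit texel values
index = (layer_a << 2) | layer_b          # one stored 4-bit texel

print(palette_a[index])                   # 170 -> decodes layer A
print(palette_b[index])                   # 85  -> decodes layer B
```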
 
Quite the difference on Sphinx. I wonder how the frame rates compared, though. I'd pay money for Digital Foundry to comb over the 6th generation for some retro head-to-heads :smile2:
 
I was just thinking that the GameCube had 8 dedicated hardware lights way back in 2001. Nintendo's previous console, the N64, had fully programmable hardware via microcode, whereas the GameCube had fixed-function hardware designed around the needs of Nintendo's in-house artists and developers. That then changed in the following generations because of the emphasis on flexible, fully programmable hardware.

And then this coming generation is seeing a return to dedicated hardware inside GPUs, namely new silicon for hardware ray tracing and cores for AI processing. And all of this is because of the ending of Moore's Law and therefore the need to return to dedicated fixed-function hardware for improvements in graphics.
 
Efficiency is key.
It will only become more important going forward as lithographic advances slow down even more. If you know what you want to do, if dedicated hardware is more efficient at doing it, and if it is used sufficiently frequently, then dedicated hardware is the way to go.
The problem, of course, is that taking resources from the more general hardware blocks may impose a bit of a limitation if rendering methods change substantially during the active life of the console. Can't say that is terribly likely at this point in time, nor that it is a major concern for gaming consoles in general, although the risk is real that specialized hardware resources will see limited use in multiplatform titles. Mobile SoCs are the obvious design example here. They use dedicated hardware for heavy general CPU use, light general CPU use, 3D rendering, traversing neural networks, handling still image data, encoding and decoding video, network handling, modem functions, audio, et cetera. And when a functional block isn't in use it can be shut down, saving energy versus forcing a more general resource to be active all the time.
In what ways this transfers to future console SoC design is a bit opaque to me. I can't really see that much has emerged as "new standards" that would merit dedicated hardware, seeing as VR hasn't really taken off. I remain doubtful that ray tracing will be such a case in a constrained environment (is somewhat nicer shadowing really enough?), but it remains to be seen. Compatibility between generations is desirable, and there is industry inertia to be considered.
That nVidia promotes its new hardware capabilities is not strange, but the relevance for dedicated gaming appliances might be limited.
 
And then this new generation coming is seeing a return to dedicated hardware inside GPUs, namely new silicon dedicated to hardware ray tracing and also cores for AI processing.
It never left. Texture samplers and ROPs are fixed-function, specialist blocks. Tensor cores aren't "AI processors" but particular number crunchers, no different in principle from the vector units in a CPU. All these processors consist of a number of optimized processing cores, and have done for decades, probably ever since the first FPU extensions were developed. Raw processors, serially loading binary registers and combining bits in loops, aren't ideal for a lot of workloads, so different types of processor are combined.

There's nothing new going on. Cores are balanced between flexibility and performance. We saw vertex and pixel units combined because they shared a lot of common elements, while the ROPs and geometry engines are their own thing.
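To illustrate the point with a toy Python example: the "specialized" version performs exactly the same multiply-accumulate arithmetic as the scalar loop, just organized in fixed-width blocks the way a SIMD or tensor unit would (the block width of 4 is arbitrary).

```python
# Same dot product two ways: scalar loop vs fixed-width blocks.
def scalar_dot(a, b):
    acc = 0.0
    for x, y in zip(a, b):          # one multiply-add per iteration
        acc += x * y
    return acc

def blocked_dot(a, b, width=4):
    acc = 0.0
    for i in range(0, len(a), width):   # 'width' lanes per step,
        acc += sum(x * y for x, y in    # like a SIMD/tensor unit
                   zip(a[i:i+width], b[i:i+width]))
    return acc

v = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
print(scalar_dot(v, v), blocked_dot(v, v))   # identical results
```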
 
The PS2 has its advantages, but I know textures aren't one of them, assuming the most is being done with both consoles.
It's not as simple as that. There are plenty of GC games with worse textures than their PS2 counterparts. Disc space could have been an issue. Or maybe development limitations (time/money) prevented a studio from properly resizing textures to take advantage of the hardware. I'm sure plenty of games had their textures converted to a GC-compatible compressed format with less regard for maintaining image quality versus the competition, and more regard for getting the game running and fitting all the data on the disc and in system RAM.
 
In ideal development conditions, it is that simple. When the Cube's eDRAM is properly used, it can have higher-res textures than the PS2. When its TEV is used, it can do nearly anything Xbox can, sans normal maps (which, when used, drastically reduced polygon counts anyway). Not to mention the Cube had good texture compression. Since all four consoles of that gen were so different, I think we should focus on what they do when games are tailor-made.

Fun fact: the gap between the 360's DVDs and the PS3's Blu-rays was greater than the gap between DVD and the Cube's mini discs, and developers like Treyarch said that limited the texture quality in COD: Black Ops. I have never read any dev comment on the mini discs; it just seems to be a popular thing to say was an issue.

Sure, if you just use the main memory on the Cube, the PS2 could potentially have better textures, but that wouldn't be proper utilization of the hardware. It was, however, a good point in the PS2's favor; I'm not detracting from it.

That was really the Cube's biggest issue, and it was corrected with the Wii, which had 64 MB of fast main RAM like Xbox, plus the Cube's original main RAM, PLUS eDRAM and a 110 MHz advantage in FSB speed over Xbox. The Wii is gen 6 on 'roids; honestly, the games look fine on my 55-inch via component.
 
Reading through the thread again, I will admit one thing: I agree with ERP's statement that the Cube's memory setup was over-engineered and they spent too much time worrying about latency. I wonder if they could have used 48 MB of DDR on a 96-bit bus, and whether that would have been cheaper? The slow audio RAM is one thing, but even the 1T-SRAM had a 600 MB/s deficit in speed vs. the PS2's Rambus memory. The Cube's CPU cache was really great, so I doubt such a change would have hurt latency much.
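The numbers behind that, worked out in Python (peak theoretical figures; the 96-bit DDR's data rate is a pure assumption for the hypothetical, matched to the 1T-SRAM's commonly cited effective rate):

```python
# GC 1T-SRAM: 64-bit @ 324 MHz effective (commonly cited 2.6 GB/s peak).
splash = 64 / 8 * 324e6 / 1e9
# PS2: two 16-bit RDRAM channels at 800 MT/s = 3.2 GB/s peak.
rdram = 2 * (16 / 8) * 800e6 / 1e9
print(f"GC {splash:.1f} GB/s vs PS2 {rdram:.1f} GB/s "
      f"(~{(rdram - splash) * 1000:.0f} MB/s deficit)")

# Hypothetical 48 MB / 96-bit DDR alternative (assumed same 324 MT/s):
ddr96 = 96 / 8 * 324e6 / 1e9            # ~3.9 GB/s peak
print(f"96-bit DDR sketch: {ddr96:.1f} GB/s")
```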
 
Xbox was much more capable than either in just about all areas, with GC coming in 2nd. PS2 could do ZOE2, though. I still like its graphics; gotta play the Steam or PS4 version too.
 
And the Cube has a nice list of games that somehow look better than Xbox despite the power gap: RE4 for the realism of its modeling (per-vertex lighting, though, not per-pixel; I checked), Metroid Prime and its poly counts, F-Zero, Star Fox Adventures at 60 fps, Mario Sunshine's water. Had Xbox had such a talent pool, and had there not been a focus on normal maps, this could have been different. But the Cube does have higher bandwidth and lower latency, so it's not a 100% win for Xbox; perhaps an unscientific 90%, I'll call it.

The Cube certainly got whooped in realistic racers, though; RalliSport Challenge 2, I think, shows the power of Xbox used intelligently. Then there's Black, which was probably too alpha-texture-heavy for the Cube.
 

WOW, I didn't know ZOE2 got a Steam release! Proud to say I'm one of the few who bought the game for PS2 in March 2003 when it hit US shelves. I think it was a pretty short production run, and it only got new life and recognition with the HD remasters.
 