Was GC more or less powerful than PS2? *spawn

Thanks for the explanation.

Too bad nobody has the balls to create a Neo-Geo-like premium system with premium prices in relatively low quantities. But I don't wanna hijack this thread. :)

Cheers
 
Naomi 2 would never have been cost-effective for a home system - it only made sense for arcades because it allowed Sega to use parts they were buying and making in volume for the Dreamcast and Naomi 1. Pity the Elan T&L unit didn't get used more; it was pretty amazing for the time.

There was never a DC upgrade planned either - the DC didn't even have a suitable expansion port to plug one into. The DC could have scaled up to be a considerably more capable system (the SH4 architecture could scale up to 400 MHz, for instance, and 250~300 MHz is within easy reach of DC overclockers), but Sega didn't have the money to bleed on hardware and they wanted something small and toy-like. Amazing by 1998 standards though, especially when you consider that the N64 came out only two years earlier in 1996.

How much more substantial a PSU or cooling system would've been necessary for 300 or 400 MHz? We can wonder the same about the Emotion Engine. It would've been fascinating if Dreamcast developers had been given, as with the PSP, the ability to use the full speed of the main processor at a later point via a firmware upgrade.
 
Too bad nobody has the balls to create a Neo-Geo-like premium system with premium prices in relatively low quantities. But I don't wanna hijack this thread. :)

The Neo Geo could support high hardware and software costs by generating most of its revenue from arcades, which unfortunately isn't an option any more. The Xbox 360 and PS3 also cost a lot to make when they were new, but they were subsidised by the vendors, and software now costs even more to make than it ever did on the Neo Geo - it's just that there are no carts to pay for and there are a lot more sales. The idea of an exotic and super high-end system is very appealing, but the likes of the PS2, Xbox, Xbox 360 and PS3 are probably the best we can hope for (and only half of those have been business successes!).

The GC represents a different approach, and one that has served Nintendo very well: carefully selected but somewhat conservative hardware that can launch at mainstream prices while selling at a profit. In all of this PS2 vs GC discussion I think it's easy to miss that Nintendo jumped in with a cheap, cost-effective, more balanced, and viable system from day one, and that they were never trying to build a PS2- or Xbox-style monster system.

How much more substantial a PSU or cooling system would've been necessary for 300 or 400 MHz? We can wonder the same about the Emotion Engine. It would've been fascinating if Dreamcast developers had been given, as with the PSP, the ability to use the full speed of the main processor at a later point via a firmware upgrade.

The very first Japanese DCs used a minuscule heatpipe to keep cool, but that was quickly dropped, leaving only a tiny fan. You can overclock just about any DC (except the final versions, whose BIOSes seem to reject it) to ~240 MHz on standard cooling, and there's a YouTube video of a guy running his at 270 MHz with a tiny heatpipe fitted (but still less substantial cooling than the GC or PS2). The SH4 spec sheet says the architecture scales up to 400 MHz, but that probably wouldn't have been realistic for volume production in 1998, even with significantly beefed-up cooling and power systems. As it was, I think the SH4 was originally planned for a 166 MHz introduction, but it appears that Sega pushed for 200 as they felt they needed that performance.

The PS2, IIRC, was originally planned to have its Emotion Engine run at 250 MHz, but either the cooling got uprated or manufacturing improved sufficiently for it to run at 300. Unlike the Xbox 360 and PS3, which both saw clock drops from original plans, they kept pushing with the PS2. It does seem like we've hit a power ceiling on consoles now.

The Dreamcast came at the start of a period of rapidly escalating power consumption for consoles that (as with the PC) has continued right up until the 200W 360 and PS3. The PS2 generation went roughly like this:

1998, Dreamcast: ~20W, 3cm fan
2000, PS2: ~40W, 6cm fan
2001, GC: ~30W, 5cm fan
2001, Xbox: ~80W, 8cm fan

Slapping a good chunk of metal on the CPU and using a man-sized fan would have allowed the DC to run a lot faster, but having a fan in there at all seemed daring in 1998. By 2001, even the GC's significantly higher power draw and beefier cooling seemed a little weak, especially compared to the thumping great tombstone that was the Xbox.
 
Yeah, that's true. :D But all this talk of clocks and power consumption (mostly by me, admittedly) has made me wonder something. The GC has 24 MB of main RAM, a 162 MHz GPU and a 485 MHz CPU. We know 48 MB of main RAM was doable because of the Triforce arcade board, that the GPU was originally meant to be 200 MHz, and it appears that the CPU could scale up to much higher frequencies, based on the Wii.

If Nintendo had gone with a 48 MB, 200/800 MHz console with a 4x full-size DVD drive, and made something considerably faster than the PS2 and much closer to the Xbox, would it actually have made the GC more competitive against the PS2, or less competitive in general (because it would have retailed at a higher price or had to be sold at a sizeable loss)? I don't actually see it changing the landscape that much, just losing Nintendo money.

The GC was Nintendo's worst-performing console in terms of market position, and it would certainly have been possible for Nintendo to make it faster and differentiate it from the PS2 more on performance (as the Xbox managed to do). It's also the last time that Nintendo were up with Sony/MS in terms of performance. It just seems interesting to consider the GC in relation to the Wii U and all the current hoopla about how much performance it will or won't have relative to its present and future competitors.

This is kind of OT too, but this seems like the best place for it, if indeed it is something worth pondering.
 
In my opinion, taking the whole system into consideration, the GC was a more powerful console than the PS2. Just look at the games: the AAA titles on GameCube (Star Fox Adventures, Metroid Prime, Wind Waker, etc.) are much nicer-looking than anything put out on the PS2.

The PS2 simply could not have produced as clean-looking a game as Wind Waker. A lot of concessions would have had to be made.
 
IMO it's a lot like the difference between the 360 and PS3. Sony puts one crazy piece of hardware in the box (the Emotion Engine for the PS2, the Cell processor for the PS3) and then puts a bunch of garbage in for the GPU and everything else. The GPU in the PS3 barely does hardware lighting; it's a joke.

Whereas the GC and Xbox 360 are a lot more balanced. Sony builds such bottlenecks into their system, just so they can claim some crazy number that never materializes into anything tangible in-game.

That's why ports always look worst on Sony consoles.
 
IMO it's a lot like the difference between the 360 and PS3. Sony puts one crazy piece of hardware in the box (the Emotion Engine for the PS2, the Cell processor for the PS3) and then puts a bunch of garbage in for the GPU and everything else. The GPU in the PS3 barely does hardware lighting; it's a joke.

Whereas the GC and Xbox 360 are a lot more balanced. Sony builds such bottlenecks into their system, just so they can claim some crazy number that never materializes into anything tangible in-game.

That's why ports always look worst on Sony consoles.

There are so many wrong things in that post.
 
The PS2 simply could not have produced as clean-looking a game as Wind Waker. A lot of concessions would have had to be made.
Uh, what actual facts do you have for making such a baseless statement? Okami, for example, is just one of oodles of good-looking cel-shaded games on the PS2; there are many other examples too.

The GPU in the PS3 barely does hardware lighting; it's a joke.
You are sadly misinformed, my friend. The PS3 GPU is a 9800-series PC GPU with the PCIe interface replaced with a Rambus Redwood interface, HDMI audio, and some other minor changes. Weighing in at around 300 million transistors, it's perfectly capable of doing lighting calculations and lots of other stuff as well.

That's why ports always look worst on Sony consoles.
That's a flawed over-simplification.
 
Uh, what actual facts do you have for making such a baseless statement? Okami, for example, is just one of oodles of good-looking cel-shaded games on the PS2; there are many other examples too.


You are sadly misinformed, my friend. The PS3 GPU is a 9800-series PC GPU with the PCIe interface replaced with a Rambus Redwood interface, HDMI audio, and some other minor changes. Weighing in at around 300 million transistors, it's perfectly capable of doing lighting calculations and lots of other stuff as well.


That's a flawed over-simplification.

9800? I assume a typo and you meant 7800?
 
You'd technically only need two to get 256 colours
And by using two 4-bit textures, you are now using up the memory you were trying to save by not using 8-bit textures in the first place. Like I said, you lose what you gain if you start replicating high-color textures by layering low-color textures (edit: obviously this doesn't apply if you layer textures of different resolutions, which even modern games do). If I recall correctly, both the GC and PS2 could only texture from eDRAM, and that's (a) why using palettized textures was so important, and (b) why the GC's native 6:1 S3TC was a big deal.

BTW, I'm not trying to be "hard" on 4-bit textures. They are what they are, the engineers made entirely reasonable design choices for the time, and a lot of developers worked well within the limitation. I just take issue with the claim that anything you could do with 24-bit textures on the GC could be done by layering 4-bit textures, because it was never done.
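To put rough numbers on that, here's a minimal C sketch of the memory math for a 256x256 texture (the 16-bit palette entries are just my assumption; exact palette formats vary by platform):

```c
/* Rough memory math for a 256x256 texture in different formats.
 * Assumes 16-bit palette entries; real palette formats vary by platform. */
#include <stdio.h>

int main(void)
{
    const int texels = 256 * 256;

    int rgb24    = texels * 3;            /* 24-bit true colour           */
    int pal8     = texels * 1 + 256 * 2;  /* 8-bit indices + 256 entries  */
    int pal4     = texels / 2 +  16 * 2;  /* 4-bit indices +  16 entries  */
    int two_pal4 = 2 * pal4;              /* layering two 4-bit textures  */
    int s3tc     = texels / 2;            /* S3TC/DXT1: 4 bits per texel  */

    printf("24-bit:        %6d bytes\n", rgb24);     /* 196608                 */
    printf("8-bit pal:     %6d bytes\n", pal8);      /*  66048                 */
    printf("4-bit pal:     %6d bytes\n", pal4);      /*  32800                 */
    printf("2 x 4-bit pal: %6d bytes\n", two_pal4);  /*  65600, ~same as 8-bit */
    printf("S3TC (6:1):    %6d bytes\n", s3tc);      /*  32768                 */
    return 0;
}
```

Two 4-bit layers end up costing about the same as one 8-bit palettized texture, while S3TC keeps the full 6:1 saving against true colour - which is basically the point above.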
 
You are sadly misinformed, my friend. The PS3 GPU is a 9800-series PC GPU with the PCIe interface replaced with a Rambus Redwood interface, HDMI audio, and some other minor changes. Weighing in at around 300 million transistors, it's perfectly capable of doing lighting calculations and lots of other stuff as well.

Where are you getting this? The PS3 uses an nVidia 7800-series derivative, with a halved GDDR3 interface.
 
I mixed up the model numbers. These cards came out a half-decade ago, forgive me for not keeping them all fresh in my mind.

That said, the 7800 series has no difficulty performing lighting calculations, of course.
 
blip might have been referring to the several dev presentations (Bizarre Creations, Naughty Dog, and DICE) on doing lighting on the SPEs.
 
He did say RSX sucked at lighting though, which isn't the case as demonstrated by Crysis 2.

Well, we also don't know whether Crytek might have done better with an SPU lighting implementation... It was also their first retail release for the system. The resolution reduction had its memory implications, but then it's not like the performance was always rock solid, was it? Anyway, the main point is that just because a number of games do lighting on the RSX doesn't mean it might not be better off being done on the SPUs, as evidenced by the developers listed above (and who knows how many more). You know... devs who have actually implemented lighting on the RSX before and then decided the SPUs were better off doing it (for game performance).

The SPUs can do much better light culling than RSX... The lighting is also done at FP16 on the SPUs so...
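For anyone curious what "light culling on the SPUs" roughly means in practice, here's a very generic C sketch of the screen-space tile binning idea - not actual Cell/SPU code, and the resolution, tile size and struct names are made up purely for illustration:

```c
/* Very rough, generic sketch of screen-space tile light culling.
 * The idea: bin lights into screen tiles so the shading pass only
 * evaluates lights that can actually touch each tile. */
#include <stdint.h>
#include <string.h>

#define TILE_SIZE     32
#define SCREEN_W      1280
#define SCREEN_H      720
#define TILES_X       ((SCREEN_W + TILE_SIZE - 1) / TILE_SIZE)
#define TILES_Y       ((SCREEN_H + TILE_SIZE - 1) / TILE_SIZE)
#define MAX_PER_TILE  32

/* Projected screen-space bounding rect of a light, in pixels. */
typedef struct { float x0, y0, x1, y1; } ScreenRect;

typedef struct {
    uint16_t count;
    uint16_t indices[MAX_PER_TILE];
} TileLightList;

/* Bin every light into the tiles its screen-space bounding rect overlaps. */
void cull_lights(const ScreenRect *lights, int num_lights,
                 TileLightList tiles[TILES_Y][TILES_X])
{
    memset(tiles, 0, sizeof(TileLightList) * TILES_X * TILES_Y);

    for (int i = 0; i < num_lights; ++i) {
        /* Convert the rect to tile coordinates and clamp to the screen. */
        int tx0 = (int)lights[i].x0 / TILE_SIZE;
        int ty0 = (int)lights[i].y0 / TILE_SIZE;
        int tx1 = (int)lights[i].x1 / TILE_SIZE;
        int ty1 = (int)lights[i].y1 / TILE_SIZE;

        if (tx0 < 0) tx0 = 0;
        if (ty0 < 0) ty0 = 0;
        if (tx1 >= TILES_X) tx1 = TILES_X - 1;
        if (ty1 >= TILES_Y) ty1 = TILES_Y - 1;

        for (int ty = ty0; ty <= ty1; ++ty)
            for (int tx = tx0; tx <= tx1; ++tx) {
                TileLightList *t = &tiles[ty][tx];
                if (t->count < MAX_PER_TILE)
                    t->indices[t->count++] = (uint16_t)i;
            }
    }
}
```

The shading pass then only evaluates the handful of lights binned into each tile instead of every light on screen, which is exactly the kind of wide, independent work that maps well onto the SPUs.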
 
SPE-assisted rendering is definitely a bonus, but the RSX itself is still pretty capable. It's good enough that Crytek weren't forced to do their shading on the SPEs in order to match the 360 version. It may not be optimal, but his claim that it barely does lighting, despite evidence to the contrary, is a bit far-fetched.
 
The SPUs can do much better light culling than RSX... The lighting is also done at FP16 on the SPUs so...
That doesn't make blip's comment in any way sane though. "RSX (7800) can barely do lighting." So all those people with a 7800 in their PC at the time had flat-shaded polygons in their games?? RSX comes in for some serious fictional bashing. As far as I'm aware the vertex setup is relatively weak, but in everything else it's a solid, typical, full-featured nVidia part.
 