[maven];879975 said:
Hmm, the final names seem to come from a Unix mindset rather than a Windows one. Perhaps Nintendo is using some Unix, Linux, or BeOS variant? I'd guess BeOS, since it isn't under the GPL like Linux and is already well optimized for PowerPC hardware. That would fit with Nintendo's recent comments about wanting to make the Wii while spending almost no money on the hardware and focusing solely on the controller/interface.
I also think the GPU is likely a semi-reworked Flipper. It's GC compatible, but it's not a straight duplicate, because that would just be too easy.
Still, I have to say the launch games and the previews of some games out there are not a good sign. Cube is proven, well-understood hardware, and if Hollywood were similar but notably superior, it should show.
Zelda does look pretty darned good for a Cube game, though, assuming the Cube version looks the same. It's sorta on the same level as RE4, IMO.
Anyone noticed banding yet? I don't think I have.
I doubt that the GPU differs in anything other than performance. The games haven't shown anything the GameCube didn't do, just more of it, and there are even some GC effects they haven't touched yet. Remember Miyamoto's comments about spending almost nothing on hardware development. The fact that several Ubisoft games received major graphical downgrades once Nintendo released the final devkits kind of backs this up. Neither Red Steel nor Rayman was originally on par with next-gen graphics, but they were what you'd expect from the natural hardware step up from GameCube to Wii while keeping the die small and the cost low: Radeon 9600 Pro level graphics (give or take a factor of 10^0.2). Yet the final games were downgraded to what you'd expect out of a GameCube, or maybe a multiplatform Xbox game.
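For anyone wondering what that "give or take 10^0.2" fudge factor actually comes to, a quick check:

```python
# 10^0.2 as a symmetric fudge factor around "Radeon 9600 Pro level"
fudge = 10 ** 0.2
print(f"upper: {fudge:.2f}x, lower: {1 / fudge:.2f}x")
# upper: 1.58x, lower: 0.63x
```

So roughly anywhere from two-thirds of a 9600 Pro to one-and-a-half of one.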
Zelda is not on the same level as RE4. The first major boss of Zelda looks similar to a boss in RE4, and trust me, even just from memory I can tell the RE4 boss looked vastly superior. (BTW, is Zelda 30fps or 60fps? I believe 30, though it's completely stable.)
And I've noticed tons of banding in Zelda, though I don't have component cables yet, so that could be part of the reason. I've also noticed that Zelda has a blur filter in effect for most of the game (especially in the shadow world) that works nicely to cover up jaggies. I heard Microsoft was counting effects like these as AA for 360 games, so it's nice to see Nintendo has already adopted Microsoft's standard of AA. It will be interesting to see if this effect is retained for the Cube version.
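To be clear about what I mean by a blur filter passing for AA, here's the gist of the trick in Python/NumPy. The actual filter Zelda uses is anyone's guess; this is just the cheapest possible stand-in:

```python
import numpy as np

def cheap_fake_aa(frame: np.ndarray) -> np.ndarray:
    """3x3 box blur over an (H, W, C) frame: softens stair-step edges
    at the cost of overall sharpness. Not real AA, just the
    jaggie-hiding trick described above."""
    padded = np.pad(frame, ((1, 1), (1, 1), (0, 0)), mode="edge")
    h, w = frame.shape[:2]
    acc = np.zeros(frame.shape, dtype=np.float32)
    for dy in range(3):
        for dx in range(3):
            acc += padded[dy:dy + h, dx:dx + w]
    return (acc / 9).astype(frame.dtype)
```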
I don't believe the GameCube supported MSAA, but with the GC's main memory now integrated as eDRAM, would it be possible to do AA and similar effects for almost no cost?
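Nobody outside Nintendo knows how that eDRAM is carved up, but as a back-of-envelope check, assuming the full 24 MB that used to be the GC's 1T-SRAM main memory were usable as framebuffer (pure speculation on my part):

```python
# Would a 480p multisampled framebuffer fit in 24 MB of eDRAM?
W, H = 640, 480
BYTES_PER_SAMPLE = 4 + 4  # RGBA8 color + 24-bit Z / 8-bit stencil
for samples in (1, 2, 4):
    mb = W * H * BYTES_PER_SAMPLE * samples / 2**20
    print(f"{samples}x: {mb:.1f} MB")
# 1x: 2.3 MB, 2x: 4.7 MB, 4x: 9.4 MB -- all comfortably under 24 MB
```

So capacity at least wouldn't be the problem.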
If we assume a best-case scenario (a straight optical shrink to 90nm taking Flipper down to roughly 25% of its original die area), then what could have been added to make the die 105% larger than that shrunk Flipper?
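Putting hypothetical numbers on that (the baseline is arbitrary units, not a real mm² figure):

```python
# Arbitrary units, not real die measurements -- just framing the question.
flipper_180nm = 100.0
shrunk_90nm = flipper_180nm * (90 / 180) ** 2  # best-case shrink: 25% of the area
hollywood = shrunk_90nm * 2.05                 # 105% larger than the shrunk core
spare = hollywood - shrunk_90nm
print(f"shrunk Flipper: {shrunk_90nm:.1f}, Hollywood: {hollywood:.1f}, spare: {spare:.1f}")
# spare silicon ~= a whole second shrunk Flipper's worth of area
```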
More eDRAM, or 2 TEVs and 8 pixel pipelines (though that seems well beyond the performance current titles are showing, and a performance increase of that type would be easy to exploit; given the current games aren't even all maintaining 60fps, it's more likely eDRAM or features that take additional effort to exploit, as even a DX7-level GPU with nearly 2 gigapixels of fillrate should easily show that; quick math below).

Or it could even be a physics processor; early games are already showing better physics than I'd expect. Not great physics, but going from GameCube, where not a single game used physics to any notable extent, to Wii Sports having a fairly accurate (even if noticeably flawed) physics model at 60fps is interesting.

But even just offloading vertex shading to the GPU could have given devs more CPU time to play with. If it turns out to be vertex shaders, I hope they're very flexible (VS3.0 level), even if the pixel shaders are closer to PS1.0 level.
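Quick math on that fillrate figure (Hollywood's 243 MHz clock is widely reported; the pipe counts are just the two scenarios being compared):

```python
# Peak fillrate = pixel pipelines x core clock.
CLOCK_HZ = 243e6  # widely reported Hollywood clock
for pipes in (4, 8):
    print(f"{pipes} pipes: {pipes * CLOCK_HZ / 1e9:.2f} Gpix/s")
# 4 pipes: 0.97 Gpix/s; 8 pipes: 1.94 Gpix/s -- the "nearly 2 gigapixels"
```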
The only thing I found was a list of many different patents that might apply to the Wii. Some of it was about DRM and integrated shop systems, and other stuff was probably related to the Virtual Console.
Perhaps some extra hardware features to make the Wii hack-proof?
BTW, the Wii was supposed to have 512MB of flash RAM included. Where is it?
They must have had more to do with the Wii, however, to get their name on a chip inside it. Providing system updates isn't something you get your name on a chip for.
Why their name is on the GPU, however, I will never know.
Maybe they made the OS/firmware? Maybe they made the whole system chipset? Maybe the firmware is integrated into the GPU?