WiiGeePeeYou (Hollywood) - what IS it?

That's a point - thinking about it, if Hollywood has additional features but not much more clock, games developed on Cube hardware will show less improvement (than if it had the same features and more clock).
 
Hollywood is reminiscent of Xenos, with that RAM on package. The little GDDR3 chip actually makes me think of HyperMemory...
 
Has anyone watched the new Pokemon videos? There's an effect when this cat-thing gets hit with green fire that's way more than 50% beyond anything I've seen on the Gamecube. Some kinda distortion thing...looks kinda like the bump-mapped glass in Red Steel, but there are several layers of it.
 
What are the chances that the 1T-SRAM is more than 24MB? It looks almost as big as the GDDR3 chip.

That's something I was wondering and asked about earlier. Didn't the 3MB of 1T-SRAM in Flipper take up about one third of the die space? Considering it's on a 90nm process and is likely 1T-SRAM-Q (twice as dense as 1T-SRAM), should 24MB of it really be 94.5mm^2? I'd have thought it'd be about half that, but please correct me if I'm wrong. Unless they've used standard 1T-SRAM, but why, if Q is cheaper?
 
Hmm, the final names seem to come from a Unix mindset rather than a Windows one. Perhaps Nintendo is using some Unix, Linux, or BeOS variant? I'd guess BeOS, since it doesn't carry the GPL like Linux and is already well optimized for PowerPC hardware. That would fit with Nintendo's recent comments about wanting to make the Wii with almost no money spent on the hardware, focusing solely on the controller/interface.

I also think the GPU is likely a semi-reworked Flipper. It's GC compatible, but it's not a duplicate cuz that would just be too easy.

Still, I have to say the launch games and the previews of some games out there are not a good sign. Cube is proven and understood hardware, and if Hollywood were similar but notably superior, it should show.

Zelda does look pretty darned good for a Cube game though. Assuming the Cube version looks the same. It's sorta on the same level as RE4 IMO.

Anyone noticed banding yet? I don't think I have.

I doubt that the GPU differs in anything other than performance. The games haven't shown anything Gamecube didn't do, just more of it, and there are even some GC effects they haven't touched yet. Remember Miyamoto's comments on spending almost nothing on hardware development. That several Ubisoft games received major graphical downgrades once Nintendo released the final devkits kind of indicates this. Both Red Steel and Rayman originally weren't at next-gen level graphically, but they were what you'd expect from the natural hardware increase from Gamecube to Wii while maintaining a small form factor and a cheap cost. Radeon 9600 Pro level of graphics (give or take a factor of 10^0.2), yet the final games were downgraded to what you'd expect out of Gamecube, or maybe a multiplatform Xbox game.

Zelda is not on the same level as RE4. The first major boss of Zelda looks similar to a boss in RE4, and trust me, even just from memory I can tell the RE4 boss looked vastly superior. (BTW, is Zelda 30fps or 60fps? I believe 30, though it's completely stable.)

And I've noticed tons of banding in Zelda, though I don't have component cables yet, so that could be part of the reason. I've also noticed that Zelda has a blur filter in effect for most of the game (especially in the shadow world) that works nicely to cover up jaggies. I heard Microsoft was counting effects like these as AA for 360 games, so it's nice to see Nintendo has already adopted Microsoft's standard of AA. It will be interesting to see if this effect is retained for the Cube version.
I don't believe the Gamecube supported MSAA, but with the GC's main memory now integrated as eDRAM, would it be possible to do AA and similar effects for almost no cost?
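Just to put rough numbers on that, here's a back-of-the-envelope sketch. The buffer formats and 4x sample count are assumptions for illustration, not anything confirmed about Hollywood:

```python
# Would a multisampled 640x480 framebuffer fit in a hypothetical 24 MB eDRAM pool?
# Assumed formats: 4 bytes color + 4 bytes Z/stencil per sample, 4x multisampling.
WIDTH, HEIGHT = 640, 480
SAMPLES = 4
BYTES_PER_SAMPLE = 4 + 4  # color + depth (assumed)

buffer_bytes = WIDTH * HEIGHT * SAMPLES * BYTES_PER_SAMPLE
print(f"4x buffers: {buffer_bytes / 2**20:.1f} MB")  # ~9.4 MB, well under 24 MB
```

If the numbers are anywhere near that, the resolve traffic never leaves the package, which is what would make "almost no cost" plausible.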

If we assume a best-case scenario (a 25% difference taking Flipper to 90nm), then what could have been added to increase Flipper's size by 105%?

More eDRAM, or 2 TEVs and 8 pixel pipelines (though that seems well beyond the performance current titles are showing, and a performance increase of that type would be easy to exploit; given the current games aren't even all maintaining 60fps, it's more likely eDRAM or features that take additional effort to exploit, as even a DX7-level GPU with nearly 2 gigapixels of fillrate should easily show that). Or it could even be a physics processor; early games are already showing better physics than I'd expect. Not great physics, but going from Gamecube, where not a single game used physics to any notable extent, to seeing Wii Sports have a fairly accurate (even if noticeably flawed) physics model at 60fps is interesting. But even just offloading vertex shaders to the GPU could have given devs more CPU time to play with. If it turns out to be vertex shaders, I hope they're very flexible (VS3.0 level) even if the pixel shaders are closer to PS1.0 level.
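For scale, the fillrate figure works out like this (Flipper's 4 pipes at 162 MHz are known; 243 MHz is the commonly reported Hollywood clock, and 8 pipes is just the hypothetical above):

```python
# Peak fillrate: one pixel per pipeline per clock.
def fillrate_gpix(pipes, mhz):
    return pipes * mhz / 1000.0

print(fillrate_gpix(4, 162))  # Flipper: ~0.65 Gpix/s
print(fillrate_gpix(4, 243))  # Hollywood as a clocked-up Flipper: ~0.97 Gpix/s
print(fillrate_gpix(8, 243))  # hypothetical 8 pipes: ~1.94 Gpix/s, "nearly 2"
```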

The only thing I found was a list of many different patents that might apply to Wii, some of the stuff was about DRM and integrated shop systems and other stuff probably related to the Virtual Console.

Perhaps some extra hardware features to make the Wii hack-proof?
BTW, Wii was supposed to have 512MB of flash RAM included. Where is it?

They must have had more to do with the Wii than that to get their name on a chip inside it. Providing system updates isn't something you get your name on a chip for.
Why their name is on the GPU specifically, though, I will never know.

Maybe they made the OS/firmware? Maybe they made the whole system chipset? Maybe the firmware is integrated into the GPU?
 
Twice as dense as which type of 1T-SRAM? MoSys offers single and quad density 1T-SRAM. And 1T-SRAM-Q is the cheaper type as far as I know...
I believe the Q in 1T-SRAM-Q means quad density compared to traditional SRAM, but only double the density of normal 1T-SRAM.

94.5 mm^2 sounds right for 24 MB of 1T-SRAM-Q on 90 nm. At 130 nm, 1 Mb of 1T-SRAM-Q requires 1.05 mm^2. It looks like MoSys managed to squeeze even more transistors per mm^2 out of 90 nm, because even with ideal scaling that chip should come to 96.6 mm^2 for 24 MB of memory. Or perhaps the 1.05 mm^2 per Mb only applies to 1T-SRAM integrated alongside logic, and density is higher for standalone 1T-SRAM-Q.
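The arithmetic, for anyone who wants to check it (using the 1.05 mm^2/Mb figure above and ideal area scaling):

```python
# 24 MB of 1T-SRAM-Q, scaled from the 130 nm density figure quoted above.
mbits = 24 * 8                       # 24 MB = 192 Mb
area_130 = mbits * 1.05              # 201.6 mm^2 at 130 nm
area_90 = area_130 * (90 / 130)**2   # ideal shrink: ~96.6 mm^2
print(f"{area_90:.1f} mm^2 vs. 94.5 mm^2 measured")
```

So the measured die is actually a touch better than ideal scaling would predict.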
 
Here are some examples:

http://www.pokemon.co.jp/game/wii/pbr_sp/

Check out the green fire in this vid:

http://gonintendo.com/?p=8018

Hmm, those graphics look fairly on par (IMO; it's hard to compare) with the old Red Steel and Rayman graphics. A bit worse perhaps, but beyond anything on the Wii so far.
If it's still coming from Genius Sonority(?), then either they're the first company out making use of the final Wii devkits or they've improved their talent by quite a lot.
Well, Super Mario Galaxy is very nice looking as well, and even Wii Sports makes some good use of special effects and lighting, though overall I'd say Mario Power Tennis made better use of special effects while having more scene complexity.
 
Hmm, those graphics look fairly on par (IMO; it's hard to compare) with the old Red Steel and Rayman graphics. A bit worse perhaps, but beyond anything on the Wii so far.
If it's still coming from Genius Sonority(?), then either they're the first company out making use of the final Wii devkits or they've improved their talent by quite a lot.
Well, Super Mario Galaxy is very nice looking as well, and even Wii Sports makes some good use of special effects and lighting, though overall I'd say Mario Power Tennis made better use of special effects while having more scene complexity.
I consider it possible that Pokemon is indeed one of the first projects primarily developed on real Wii hardware. It comes a while after launch, and there's not all that much content. Even Wii Sports improved quite a bit after E3, but it's not exactly meant to be a graphics showcase, anyway.

And, I might be wrong, but I think we haven't seen a more recent build of Mario Galaxy or Metroid Prime 3 since E3. Judging by how Wii Sports improved, those two games should look quite a bit better now.

I'm not into conspiracy theories, but maybe they didn't show more recent builds for Zelda's sake? Wii definitely is quite a bit more powerful than Gamecube, and Zelda would probably look like shit compared to a "real" high-profile Wii game from a technical standpoint. But upgrading Zelda, redoing all the assets and such to even remotely exploit Wii's added horsepower, would have been too expensive and time-consuming...
 
Jeux-France has some Mario Party 8 screenies (seen on this GAF post), and they look pretty good. It's hard to judge actual IQ (AA, AF), but the lighting seems beyond anything the Cube could do.
 
I can see the distortion effect on that Pokemon video - some sort of frame buffer displacement distortion post-processing? I'm sure I've seen similar stuff on the cube but can't remember where - Metroid Prime heat haze maybe - but that Pokemon video does look better than what's currently on release.
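The basic idea, if it is framebuffer displacement, is simple enough to sketch. On GC/Wii it would presumably be an EFB copy re-sampled with perturbed coordinates (via indirect texturing); this numpy version is just an illustration of the concept, not actual TEV code:

```python
import numpy as np

def distort(frame, offsets, strength=8.0):
    """frame: HxWx3 image; offsets: HxWx2 map in [-1, 1], e.g. scrolling noise."""
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Shift each pixel's source coordinate by the offset map.
    sx = np.clip(xs + (offsets[..., 0] * strength).astype(int), 0, w - 1)
    sy = np.clip(ys + (offsets[..., 1] * strength).astype(int), 0, h - 1)
    return frame[sy, sx]
```

Run a few of these passes with different offset maps and strengths and you'd get the layered look in that video.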

I find it difficult to judge image quality from the sort of down-scaled low-bitrate videos on the net - is there anywhere with Wii video captured at original resolution and reasonable bit rate?
 
Are there 128MB chips? Just swap the 64 for the 128; then the cost wouldn't exactly have changed much.

AFAICT, no one makes 1024Mb chips. The limit is at 512Mb. Either they make bigger chips and thus less per wafer or they use a smaller manufacturing process and deal with other manufacturing issues. Either way, it's gonna cost more...
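The Mb/MB bookkeeping is the whole story there:

```python
chip_mbit = 512               # largest GDDR3 die in production (per the above)
chip_mbyte = chip_mbit // 8   # = 64 MB, i.e. Wii's single chip
print(128 // chip_mbyte)      # 2 chips for 128 MB: more board space, more cost
```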
 
Here are some examples:

http://www.pokemon.co.jp/game/wii/pbr_sp/

Check out the green fire in this vid:

http://gonintendo.com/?p=8018

I remember those kinds of translucency effects were also found in Final Fantasy: Crystal Chronicles (the effect the chalice made when you entered miasma).

Jeux-France has some Mario Party 8 screenies (seen on this GAF post), and they look pretty good. It's hard to judge actual IQ (AA, AF), but the lighting seems beyond anything the Cube could do.

There's no doubt that Wii lighting is superior overall to Cube lighting.
 
That several Ubisoft games received major graphical downgrades once Nintendo released the final devkits kind of indicates this. Both Red Steel and Rayman originally weren't at next-gen level graphically, but they were what you'd expect from the natural hardware increase from Gamecube to Wii while maintaining a small form factor and a cheap cost.

The question is: why were those games downgraded?
Because if there were new HW, devs would hardly have used any of it. Look at the 360: even using X800 kits, its launch games looked like XB ones, well below the render targets of GR3 (even the final game, with a 4-month delay), and now you have GoW.

HW needs time to be put to use, even if it's just raw power, so even if that's possible, don't expect it at launch. In this case it should be even harder, as they needed to have all those games playable since E3.

More eDRAM, or 2 TEVs and 8 pixel pipelines (though that seems well beyond the performance current titles are showing, and a performance increase of that type would be easy to exploit; given the current games aren't even all maintaining 60fps, it's more likely eDRAM or features that take additional effort to exploit, as even a DX7-level GPU with nearly 2 gigapixels of fillrate should easily show that). Or it could even be a physics processor; early games are already showing better physics than I'd expect. Not great physics, but going from Gamecube, where not a single game used physics to any notable extent, to seeing Wii Sports have a fairly accurate (even if noticeably flawed) physics model at 60fps is interesting. But even just offloading vertex shaders to the GPU could have given devs more CPU time to play with. If it turns out to be vertex shaders, I hope they're very flexible (VS3.0 level) even if the pixel shaders are closer to PS1.0 level.

You do have a hint at physics HW there.



Anyway, given how long it took devs to exploit GC HW (e.g. the TEV), I guess it will be a long time before Wii HW is well used even if it's just more of the same. Although I'd think there would be much more sense in the addition of some new HW like VS (even ERP(?) said that the TEV couldn't be fully exploited without those) or physics, than in more TEV units, which probably wouldn't give much more than a well-used (Factor 5 level) GC one limited by the CPU or VS, or more eDRAM, since the rez stays the same...

As for the CPU upgrades, I think it will probably be an upgrade of the FPU/SIMD from two-way to four-way SIMD, plus the compression tech from the 750CL.
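In peak-FLOPS terms (assuming the commonly cited clocks and counting a fused multiply-add as two ops per lane; the four-way figure is purely hypothetical):

```python
def peak_gflops(lanes, mhz, ops_per_lane=2):
    return lanes * ops_per_lane * mhz / 1000.0

print(peak_gflops(2, 485))  # Gekko, paired singles: ~1.94 GFLOPS
print(peak_gflops(2, 729))  # Broadway, same SIMD width: ~2.92 GFLOPS
print(peak_gflops(4, 729))  # hypothetical four-way: ~5.83 GFLOPS
```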
 
Wait, what's going on here? Hollywood's die is actually bigger than Broadway's? And the shrinkage doesn't jibe with the 90 nm process? Is it even a good idea to pair a GPU with a pretty good jump in abilities with a CPU with a very small jump (assuming this is what everyone's talking about)?
 