WiiGeePeeYou (Hollywood) what IS it ?

Looking at the SSBB trailer again, at first I have to disagree. I see nothing other than a mix of standard vertex lighting and textures. But on closer inspection, it actually seems to mix up lighting styles. Looking at Pikachu, in the cartoony-background level where he first appears, he's shaded a lot better than in the 'cross the road' type level, which seems to be mimicking the N64, I think. Which is...odd!

Odd is a good word for it. Personally, I am very interested in knowing how the lighting is done in games like Mario (e.g. the starting planet in the demos, with lots of green), because it is odd: it doesn't look like the usual flat-lit TnL games (like GC games), but it doesn't seem like there is any (advanced?) shading in it either, just some EMBM in some parts (e.g. boss fights), and nothing comparable to next-gen games.

I wonder if they added some (simple) per-pixel effect that, while not expensive, is enough to get this result, which has a much better sense of depth. This is also present in MP3 and Project Hammer, and a few other games also show better lighting.

Does anyone have a clue what they are using to do this?

Edit: this actually reminded me of some of the old (non-reflective) GeForce 2 per-pixel demos :???:.
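
For illustration, here's a minimal sketch of the kind of cheap per-pixel trick in question: EMBM-style shading, where a bump map perturbs the environment-map lookup before the result is added to the base lighting. The textures and numbers are made up, it's just to show the idea:

```python
# Toy illustration of EMBM-style shading, not actual Flipper/Hollywood TEV code.
# A bump map perturbs the environment-map lookup per pixel, which is cheap but
# adds a lot of apparent depth compared to plain per-vertex lighting.

def sample(tex, u, v):
    """Nearest-neighbour sample from a texture stored as a list of rows."""
    h, w = len(tex), len(tex[0])
    x = min(max(int(u * (w - 1)), 0), w - 1)
    y = min(max(int(v * (h - 1)), 0), h - 1)
    return tex[y][x]

def embm_pixel(base_color, bump_tex, env_tex, u, v, scale=0.05):
    """Perturb the env-map coordinates by the bump map, then add the result."""
    du, dv = sample(bump_tex, u, v)                # signed offsets in the bump map
    env = sample(env_tex, u + du * scale, v + dv * scale)
    # Additive combine, clamped to [0, 1] like a fixed-function stage would do.
    return tuple(min(1.0, c + e) for c, e in zip(base_color, env))

# Tiny made-up textures, just to show the call.
bump = [[(0.0, 0.0), (1.0, -1.0)], [(-1.0, 1.0), (0.0, 0.0)]]
env  = [[(0.1,) * 3, (0.4,) * 3], [(0.7,) * 3, (1.0,) * 3]]
print(embm_pixel((0.3, 0.3, 0.3), bump, env, 0.6, 0.6))
```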
 
Does anyone have a clue what they are using to do this?
One possible trick is a spherical/cubic lightmap, with dark underneath and lighter on top. This can be quite effective at times, used additively with other lighting. It's surprising how much can be achieved with a few texture layers.
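
To make that concrete, here's a minimal sketch of a hemisphere-style lightmap, looked up by the surface normal and added on top of the base colour (the values are arbitrary, just to show the idea):

```python
import math

# Toy sketch of a spherical "sky" lightmap: dark underneath, lighter on top,
# looked up by the surface normal and added to whatever base lighting exists.

def hemisphere_light(normal, ground=0.05, sky=0.6):
    """Blend between a ground and a sky intensity based on the normal's Y."""
    nx, ny, nz = normal
    length = math.sqrt(nx * nx + ny * ny + nz * nz) or 1.0
    t = 0.5 * (ny / length + 1.0)          # 0 = facing down, 1 = facing up
    return ground + (sky - ground) * t

def shade(base_color, normal):
    """Add the lightmap term to the base colour, clamped like a combine stage."""
    light = hemisphere_light(normal)
    return tuple(min(1.0, c + light) for c in base_color)

print(shade((0.2, 0.3, 0.2), (0.0, 1.0, 0.0)))   # facing up: brighter
print(shade((0.2, 0.3, 0.2), (0.0, -1.0, 0.0)))  # facing down: darker
```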
 
Forgive me if I'm incredibly mistaken, but in the SSBB video, that part with the sunset at the castle, is that HDR? It's after Metaknight appears and with the clouds and such.
 
zeckensack said:
88MB is just extremely little.
It's even less after you subtract the OS reserved memory :p

To be fair though, I think the memory should be plenty enough given the graphical targets (SDTV and processing power available). Relative to a GCN game, you are looking at 3-4x higher texture/polygon memory budget, easily.
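
Rough numbers behind that, assuming the commonly quoted pool sizes (GC: 24 MB 1T-SRAM + 16 MB ARAM; Wii: 24 MB MEM1 + 64 MB MEM2 = 88 MB) and leaving the unknown OS reservation as a parameter:

```python
# Back-of-the-envelope memory comparison. Pool sizes are the commonly quoted
# ones; the size of the OS reservation is a parameter, not a known figure.

GC_MAIN_MB, GC_ARAM_MB = 24, 16      # GameCube: fast 1T-SRAM + slow auxiliary RAM
WII_MEM1_MB, WII_MEM2_MB = 24, 64    # Wii: 1T-SRAM (MEM1) + GDDR3 (MEM2) = 88 MB

def budget_ratios(os_reserved_mb=0):
    wii_total = WII_MEM1_MB + WII_MEM2_MB - os_reserved_mb
    return (round(wii_total / GC_MAIN_MB, 2),                 # vs GC's fast pool
            round(wii_total / (GC_MAIN_MB + GC_ARAM_MB), 2))  # vs GC's total RAM

print(budget_ratios(0))    # -> (3.67, 2.2): the "3-4x" lines up with the fast pool
print(budget_ratios(16))   # hypothetical 16 MB OS reserve -> (3.0, 1.8)
```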
 
I think that IBM has doubled the number of FPU units (to get 3 times the performance of the Gekko). Why? Because a lot of games (Red Steel and Far Cry Vengeance) make very good use of lighting (lights that shine through windows, for example). For the GPU, I think they just added a WiFi controller and a little boost to the TEV to make the 8 texture layers more achievable. So we can conclude that the Wii will have 3 times more FPU power on the CPU (for better physics and better local lights than the Gekko) and twice the performance on the texture layers. For polygons, pixel fillrate and the integer unit (on the CPU), it's 1.5x more.

I hope that I'm right :D
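
Just to illustrate what "8 texture layers" means in practice, here is a toy model (not real TEV code) of a fixed-function combiner chaining one blend stage per layer:

```python
# Toy model of fixed-function multi-texturing: each stage takes the previous
# result plus one texture layer and combines them (modulate, add, ...).
# Only meant to illustrate "8 texture layers", not the actual TEV stages.

def modulate(a, b):
    return tuple(x * y for x, y in zip(a, b))

def add_clamped(a, b):
    return tuple(min(1.0, x + y) for x, y in zip(a, b))

def combine_layers(base, layers):
    """Run one combine stage per (operation, texel) layer, in order."""
    color = base
    for op, texel in layers:
        color = op(color, texel)
    return color

# A few hypothetical layers out of a possible eight.
layers = [
    (modulate,    (0.9, 0.8, 0.7)),     # base texture
    (modulate,    (0.95, 0.95, 0.95)),  # detail texture
    (add_clamped, (0.2, 0.2, 0.3)),     # additive lightmap
    (add_clamped, (0.1, 0.1, 0.1)),     # environment/specular term
]
print(combine_layers((1.0, 1.0, 1.0), layers))
```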
 

Thanks, I missed those. The 1T-SRAM must have at least 3.9 GB/s of bandwidth (50% more than GC's main RAM, from the 50% overclock). So I wonder what the GDDR3 memory has. Apparently the memory is rated at 700 MHz maximum and is on a 32-bit bus. The 1T-SRAM will be on a 2x multiplier with the GPU (like GC), and a 3x multiplier for the GDDR3 memory would put it over 700 MHz, so I suppose it has to be a 2.5x multiplier. That would make it 607 MHz and 4.85 GB/s of bandwidth. Correct or not, Faf? :)
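
For what it's worth, the arithmetic behind those numbers, assuming the commonly quoted GameCube baselines (162 MHz GPU, ~2.6 GB/s main RAM) and the 32-bit GDDR3 bus mentioned above; the clock multipliers are still speculation:

```python
# Reproducing the estimate above. GameCube baselines are the commonly quoted
# figures; the Wii clock multipliers are speculation, as in the post.

GC_GPU_MHZ = 162
GC_MAIN_BW_GBS = 2.6                          # ~324 MHz x 64-bit 1T-SRAM

wii_gpu_mhz = GC_GPU_MHZ * 1.5                # the "50% overclock": ~243 MHz
mem1_bw_gbs = GC_MAIN_BW_GBS * 1.5            # scales with the clock: ~3.9 GB/s

gddr3_mhz = wii_gpu_mhz * 2.5                 # assumed 2.5x multiplier: ~607 MHz
gddr3_bw_gbs = gddr3_mhz * 2 * 32 / 8 / 1000  # DDR on a 32-bit bus: ~4.86 GB/s

print(round(wii_gpu_mhz), round(mem1_bw_gbs, 2),
      round(gddr3_mhz, 1), round(gddr3_bw_gbs, 2))
# -> 243 3.9 607.5 4.86
```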
 
If the GDDR, external 1-T, GPU and CPU are connected via a crossbar in the GPU, does this mean the system would see a total bandwidth of ~8 GB/s, provided the CPU generally accesses a different memory to the GPU (e.g. CPU <-> 1T, GPU <-> GDDR)?
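
Summing the two estimates from the post above gives roughly that figure, assuming the crossbar really can service both paths at once:

```python
# Aggregate of the two estimates above; only valid if the crossbar can serve
# CPU<->MEM1 and GPU<->GDDR3 traffic concurrently without arbitration stalls.
mem1_bw_gbs, gddr3_bw_gbs = 3.9, 4.86
print(round(mem1_bw_gbs + gddr3_bw_gbs, 2))   # -> 8.76, i.e. roughly "~8 GB/s"
```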
 
I think that IBM has doubled the number of FPU units (to get 3 times the performance of the Gekko). Why? Because a lot of games (Red Steel and Far Cry Vengeance) make very good use of lighting (lights that shine through windows, for example). For the GPU, I think they just added a WiFi controller and a little boost to the TEV to make the 8 texture layers more achievable. So we can conclude that the Wii will have 3 times more FPU power on the CPU (for better physics and better local lights than the Gekko) and twice the performance on the texture layers. For polygons, pixel fillrate and the integer unit (on the CPU), it's 1.5x more.

I hope that I'm right :D
The WiFi controller is not part of the LSI as far as I know. Wii uses an integrated WLAN/Bluetooth chipset by Broadcom; it's on the WLAN module at the front of the system. I think you were misled by the BroadOn logo on the LSI? BroadOn is not a networking company; they develop DRM solutions, secure processors and whatnot. Lots of different, Wii-specific stuff.
 
The WiFi controller is not part of the LSI as far as I know. Wii uses an integrated WLAN/Bluetooth chipset by Broadcom; it's on the WLAN module at the front of the system. I think you were misled by the BroadOn logo on the LSI? BroadOn is not a networking company; they develop DRM solutions, secure processors and whatnot. Lots of different, Wii-specific stuff.

I didn't notice that, sorry :D However, for 249 dollars, having WiFi out of the box is insane when Microsoft and Sony don't offer it on 400- and 500-dollar consoles....

But the question is: what does the bigger size of Hollywood at 90 nm hide?
 
To be fair though, I think the memory should be plenty enough given the graphical targets (SDTV and processing power available).

That is interesting, as it means that (depending on how much the GC is limited by its memory) the Wii is at most 2.5-3x as powerful (since it only has 3-4x the memory for textures/polygons), like the first rumors said. It seems that the main improvements will mostly be in the quality of the textures/polygons and hopefully in a few more things (AI, physics and animation, please :LOL:).

Relative to a GCN game, you are looking at 3-4x higher texture/polygon memory budget, easily.

So the really good things are still to come, because until now there isn't a game with 3-4x the quality in textures.
 
It would be nice to have not only an upgrade of the TEV unit, but also an upgrade or total replacement of the XF (geometry engine) unit, for higher polygon counts. I did notice Metroid Prime 3's higher geometry over MP1/MP2.

Sadly, there's no chance of the Wii reaching the level of the CG/FMV scenes that Nintendo showed at SpaceWorld 2000 (Metroid, WaveRace, Rebirth), which featured per-pixel lighting, true motion blur, huge poly counts and large amounts of AA, all at 60 fps.
 
It would be nice to have not only an upgrade of the TEV unit, but also an upgrade or total replacement of the XF (geometry engine) unit, for higher polygon counts. I did notice Metroid Prime 3's higher geometry over MP1/MP2.

Sadly, there's no chance of the Wii reaching the level of the CG/FMV scenes that Nintendo showed at SpaceWorld 2000 (Metroid, WaveRace, Rebirth), which featured per-pixel lighting, true motion blur, huge poly counts and large amounts of AA, all at 60 fps.

In fact, the only thing needed on the XF unit is to improve the lighting. The beauty of the Flipper architecture is the combination of the geometry unit with lighting, bump mapping and all... I think that the Wii needs to have 8 hardware lights computed for free, so the polygon count can be much better :D

For the demos at SpaceWorld 2000, a part of Rebirth was calculated in real time ;)
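
On the 8 hardware lights point, here's a minimal sketch of what fixed-function per-vertex lighting with up to eight lights amounts to (purely illustrative, not the XF unit's actual pipeline):

```python
import math

# Toy per-vertex diffuse lighting with up to 8 lights, roughly the kind of
# work a fixed-function geometry/lighting unit does in hardware.

def normalize(v):
    length = math.sqrt(sum(c * c for c in v)) or 1.0
    return tuple(c / length for c in v)

def vertex_lighting(position, normal, lights, ambient=(0.1, 0.1, 0.1)):
    """Accumulate diffuse contributions from up to 8 point lights."""
    n = normalize(normal)
    r, g, b = ambient
    for light_pos, light_color in lights[:8]:       # hardware-style cap: 8 lights
        to_light = normalize(tuple(lp - p for lp, p in zip(light_pos, position)))
        ndotl = max(0.0, sum(x * y for x, y in zip(n, to_light)))
        r += light_color[0] * ndotl
        g += light_color[1] * ndotl
        b += light_color[2] * ndotl
    return (min(1.0, r), min(1.0, g), min(1.0, b))

lights = [((0.0, 5.0, 0.0), (0.8, 0.8, 0.7)),
          ((3.0, 1.0, 2.0), (0.2, 0.2, 0.4))]
print(vertex_lighting((0.0, 0.0, 0.0), (0.0, 1.0, 0.0), lights))
```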
 
That little clip of Metroid from Spaceworld doesn't look much better than what we saw on the Gamecube. Maybe your memory makes it look better than it is...or doesn't take into account that tech demos have a lot less overhead than in-game situations. And apparently, only one short part of the Rebirth video was on Gamecube...and that part doesn't look mind-blowing, either.
 
Is Faf a developer? If you've seen Wii hardware, can you elaborate on the GPU? Pipelines, possible dual core, etc?

I'm actually quite surprised that no one's leaked anything more than what's been said so far.
 
Is Faf a developer? If you've seen Wii hardware, can you elaborate on the GPU? Pipelines, possible dual core, etc?

I'm actually quite surprised that no one's leaked anything more than what's been said so far.

There are NDAs (non-disclosure agreements) on the hardware because Nintendo doesn't want the Wii compared with the Xbox 360 and PS3... and there is no dual core because it would consume so much power that it wouldn't be good for a "17 watts" system :D
 