WiiGeePeeYou (Hollywood) - what IS it?

We're 6 months away from the launch of the Nintendo Wii, and we still don't know exactly what Hollywood is.

Is it a Flipper clocked 1.5x faster?

Is it a new GPU with Flipper instructions/BC?

Something in between?


final Wii dev-kits are supposed to be out to developers by June, last I remember.

Architectural details of Hollywood are not public domain yet. Will they ever be?

Wiiiiiiiiii.... leak the info on the WiiGeePeeYou :D
 
I don't know, and last time I asked, Dave didn't know either, which makes me think it's the best kept secret ever! ;)

I would think it has embedded memory just like the NGC GPU, but that it is programmable, outputs 853*480 images, and I'd like it to have a standard color range this time around (8:8:8:8 or better, FP10 ;)).
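To put that wish-list in perspective, here's a quick back-of-envelope on framebuffer sizes. A sketch: the ~2 MB embedded framebuffer and the 24-bit color + 24-bit Z depths are the known Flipper/GameCube figures; the 853*480 and 8:8:8:8 numbers are just the wish above, not anything confirmed about Hollywood.

```python
# Rough framebuffer arithmetic: bytes for one color buffer plus one Z buffer.
def fb_bytes(width, height, color_bits, z_bits):
    return width * height * (color_bits + z_bits) // 8

MB = 1024 * 1024

# GameCube-style 640x480 with 24-bit color + 24-bit Z: fits in ~2 MB eDRAM.
print(fb_bytes(640, 480, 24, 24) / MB)   # ~1.76 MB

# The hoped-for 853x480 with 8:8:8:8 color + 24-bit Z would need more eDRAM.
print(fb_bytes(853, 480, 32, 24) / MB)   # ~2.73 MB
```

So a wider framebuffer with full 32-bit color is not free: it implies noticeably more embedded memory than Flipper carried, which is one reason the eDRAM mention on the official spec list is interesting.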

Oh! and it's called Hollywood, that we are sure of ! ;)
 
For me:

Voodoo3/Flipper>>>>VSA-100/Hollywood.

Improved architecture with new things, more effective and more powerful per clock cycle.
 
What we know officially from Nintendo (from the Japanese Wii site):

Hollywood designed by ATI
90nm
embedded DRAM
LSI
 
Well, couldn't they just add 4 more Flipper pipes, a little extra sauce here and there, something like that?
Btw, are they totally dumping the GC controller, or will it be kept?
 
overclocked said:
Well, couldn't they just add 4 more Flipper pipes, a little extra sauce here and there, something like that?
Btw, are they totally dumping the GC controller, or will it be kept?

There are four gamecube controller ports and two memory card slots under a flap on the top of the Wii.
 
I am guessing that they will use a Flipper variant (don't hold me to this), simply to keep the cost down so that GC emulation doesn't require much additional hardware.
 
overclocked said:
Btw, are they totally dumping the GC controller, or will it be kept?
I think you can plug in the GC controller, but the "classic" controller is this:

[image: photo of the Classic Controller]


It has three shoulder buttons like the GC controller.
 
Considering that it only has to output a 480p image - what we have seen of its graphics capabilities from E3 suggests a stone age GPU. Has anyone even seen a Wii game with even a hint of pixel shader effects yet?

Even if they licensed R350-level tech (Radeon 9800), they should have had graphics far superior to what we have seen it do so far. What is going on?
 
inefficient said:
What is going on?
I can only guess, but I'd say one of these two options is likely:
- They went way way cheap on the GPU or
- the final design was not ready in time for E3 / for the development pre-E3.

I really hope it's the latter.
 
inefficient said:
Considering that it only has to output a 480p image - what we have seen of its graphics capabilities from E3 suggests a stone age GPU. Has anyone even seen a Wii game with even a hint of pixel shader effects yet?

Even if they licensed R350-level tech (Radeon 9800), they should have had graphics far superior to what we have seen it do so far. What is going on?

The games were running on a GameCube with the Wiimote adapted to it.
 
I can only guess, but I'd say one of these two options is likely:
- They went way way cheap on the GPU or
- the final design was not ready in time for E3 / for the development pre-E3.

I really hope it's the latter.
But if the latter, what are developers developing for? How can they come this far in software, with the final hardware seemingly arriving only a few months before release, and hope to get anything like the potential out of the hardware? Or are we going to see a new standard in poor first-gen titles because the specs of the development kits were a quarter those of the final hardware? We've heard a lot about SDKs for XB360 and PS3 not being representative, but Wii's sounds in a totally different ball park, yet there's no noise!

I mean, I think a 9800 would handle so much more, polys and all, with AA at SDTV res, yet games don't appear to be targeting that level of performance. If Wii is that good in the hardware stakes, the launch games won't be anywhere near pushing it, barely even using it.

I think it's the low poly counts that strike me. You can add pixel shaders, fair enough, but they don't seem to have any more triangle capacity than current-gen. Does anyone make PC GPUs that can't handle more than that these days? I think the cheapest available components outperform the XB by two generations, don't they? A quick look at eBuyer finds GeForce FX 5xxx series cards at around £20, with 128 MB of RAM. The actual processors must be dirt cheap. Put one of these in a closed box and it ought to look fairly impressive - somewhere between the XB and Sega's Virtua Fighter on a 6800.

I really don't understand how Nintendo are handling the GPU side of things. I appreciate not using a normal PC part for BC, but wouldn't it be cheaper and more effective to put in a Flipper and an off-the-shelf GPU?!
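The "a 9800 would handle so much more at 480p" intuition is easy to sanity-check with back-of-envelope fill rates. A sketch only: the pipe counts and clocks are the public figures for each chip, one pixel per pipe per clock is the textbook simplification, and the 60 fps / 4x overdraw target is an assumption for illustration.

```python
# Peak pixel fill rate in Mpixels/s, assuming one pixel per pipe per clock.
def fill_rate_mpix(pipes, mhz):
    return pipes * mhz

flipper = fill_rate_mpix(4, 162)   # GameCube's Flipper: 648 Mpix/s
r350 = fill_rate_mpix(8, 380)      # Radeon 9800 (R350): 3040 Mpix/s

# Pixels needed for 640x480 at 60 fps with an assumed 4x overdraw:
needed = 640 * 480 * 60 * 4 / 1e6  # ~74 Mpix/s
print(flipper / needed)  # ~8.8x headroom
print(r350 / needed)     # ~41x headroom
```

Even Flipper has fill rate to spare at SDTV resolution on these crude numbers; an R350-class part would be drowning in headroom, which is why the E3 footage looked so underwhelming by comparison.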
 
inefficient said:
Considering that it only has to output a 480p image - what we have seen of its graphics capabilities from E3 suggests a stone age GPU. Has anyone even seen a Wii game with even a hint of pixel shader effects yet?

Even if they licensed R350-level tech (Radeon 9800), they should have had graphics far superior to what we have seen it do so far. What is going on?


I already saw a GC. ;)

BTW, I think they will go the low-price way.

About the tech, IMO it will be a pity if they don't explore it as far as they can. On the other hand, we have almost no idea what is going on in there (e.g. it seems it will have HW for physics); I personally guess that even an extension of the GC could do a good job.

Anyway, like I always said, unless it is overpriced or the controller is limited too much by the HW, I will not complain; I will like it or dislike it, and for someone with their strategy, price is crucial.

I mean, if it is 3x GC/XB in raw power with a few more features, with nice games and at $99, wouldn't you like it, or would you think it is a bad console? Personally I think I would like it, and (like I read somewhere) there would be a lot of houses with Wiis on Xmas.
 
Panajev2001a said:
No, Nintendo confirmed that inside those GameCube-like shells there was Wii hardware indeed.


They also said the HW isn't finished either; as far as we know, they only changed what was needed to run the new controllers (like the first/current(?) SDKs).
 
Panajev2001a said:
No, Nintendo confirmed that inside those GameCube-like shells there was Wii hardware indeed.

That's called a DevKit ;), and it's not a final one.
 
inefficient said:
Considering that it only has to output a 480p image - what what we have seen of its graphics capabilities from E3 suggest a stone age GPU. Has anyone even seen a Wii with even a hint of pixel shader effects yet?

Of course, far more than a hint. But, as PC999 says, we already saw more than a hint of pixel shader effects in the best looking GameCube games. Though I'm certainly seeing a lot more of it in certain early Wii games. For instance, ExciteTruck, developed by Monster Games (hardly a technically great developer), has lots of good effects. I'm hoping that the reason for its relatively impressive graphics is that Nintendo's second-party developers have had near-final dev kits much longer than any other developer. Which will hopefully mean that other, more technically gifted developers out there will catch up and overtake those kinds of graphics by launch (unless I'm really underestimating how technically savvy Monster Games are?).
 
Oh, it seems that Pana has returned.

I believe that Hollywood is just a SuperFlipper with new things like:

-Vertex shaders instead of the fixed-function geometry pipeline
-A DSP for full-screen anti-aliasing, motion blur and certain FX; ATI can take material from the Xenos daughter die.
-TMU and TEV separated, like the pixel shaders and TMUs in the Xenos mother die, or the TMUs and pixel shaders in ATI's Rx5x0 architecture for PC.

I am not expecting a SuperGPU like the ones in the 360 and PS3, but 5 years have passed and ATI must have done more than overclock it.
 
Let's say the current Wii development kits are a GameCube 1.5x, using Broadway and a Flipper at 1.5x, but the actual Hollywood GPU is a significant improvement with modern ATI technology in it - with elements of R5xx or even R3xx technology (still in a custom, Nintendo-exclusive GPU that's inexpensive). Then developers and gamers would see a NICE improvement in 2nd-gen Wii games arriving in Q3 and Q4 2007, while for 2006 and early 2007 we're stuck with GameCube 1.5 visuals. I hope that's the case. I hope Hollywood isn't simply a Flipper 1.5x, given that Flipper is year-1999 technology.
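For reference, the "GameCube 1.5x" rumor in numbers. The GameCube clocks are the published ones; the Wii figures here are just those clocks scaled by the rumored multiplier, not confirmed specs.

```python
# GameCube's published clocks.
GEKKO_MHZ = 486     # CPU (Gekko)
FLIPPER_MHZ = 162   # GPU (Flipper)

SCALE = 1.5  # the rumored "GameCube 1.5x" multiplier

print(GEKKO_MHZ * SCALE)    # 729.0 MHz - rumored Broadway clock
print(FLIPPER_MHZ * SCALE)  # 243.0 MHz - rumored Hollywood clock
```

Those scaled figures line up with the numbers floating around in press reports, which is exactly why the "overclocked Flipper" theory refuses to die.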


Urian said:
Oh, it seems that Pana has returned.

I believe that Hollywood is just a SuperFlipper with new things like:

-Vertex shaders instead of the fixed-function geometry pipeline
-A DSP for full-screen anti-aliasing, motion blur and certain FX; ATI can take material from the Xenos daughter die.
-TMU and TEV separated, like the pixel shaders and TMUs in the Xenos mother die, or the TMUs and pixel shaders in ATI's Rx5x0 architecture for PC.

I am not expecting a SuperGPU like the ones in the 360 and PS3, but 5 years have passed and ATI must have done more than overclock it.

Excellent little post, Urian. I can't argue with that, and certainly hope you're right :)
 
Personally I doubt we will see any tech from PC parts: not only is it probably expensive, I guess it would be very hard to integrate into a Flipper-like/extension chip, plus those parts are made for 300-600 MHz chips instead of 240 MHz (if IGN is right).

I think that things like (e.g.) upgraded and more numerous TEV units, better and faster EMBM(-like) effects, new compression formats, or some special HW for specific FX like shadows are much more probable than anything else. Given the results of a very badly used TEV unit (like everything else) on GC, one should see very good improvements, even over the XB, IMO.

IMO Hollywood will be to Flipper what the X1900 is to the X800: much faster and more advanced, but still based on the same architecture.
 