WiiGeePeeYou (Hollywood) what IS it ?

I will not tolerate anyone posting random, stupid, unfounded crap in here. Part of discussing is not only showing respect for other participants, but also listening to them when they actually are more knowledgeable on the topics at hand. Moreover, please also make sure that if you're using conjectures, estimates, etc., you include how you derived them.

Anyone wishing to lead a technical discussion is more than welcome, though.

P.S.: Sorry, Shifty, for deleting your post even though it had some good points in it; it was a follow-up to already-deleted postings.
 
Broadway is a PowerPC 750CXe-derived core with extra instructions @ 729 MHz; Gekko is the same CPU @ 485 MHz.
Hollywood, from everything we've heard until now, is the same architecture as Flipper but clocked up from 162 MHz to 243 MHz.
Apart from this thread I've never heard of Hollywood having 2 TMUs.

4 pipelines x 1 TMU x 243 MHz = 972 MTexel/s.
4 ROPs x 243 MHz = 972 MPixel/s.
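To make that arithmetic explicit, here's the same math as a tiny Python sketch; the 4x1 pipe/TMU configuration and the 162/243 MHz clocks are just the assumptions used in this thread, not confirmed Hollywood specs.

```python
# Theoretical fillrate arithmetic for a Flipper-style part.
# All figures are this thread's assumptions, not confirmed Hollywood specs.

def pixel_fillrate_mpix(rops: int, clock_mhz: float) -> float:
    """One pixel per ROP per clock -> MPixel/s."""
    return rops * clock_mhz

def texel_rate_mtex(pipes: int, tmus_per_pipe: int, clock_mhz: float) -> float:
    """One texel per TMU per clock -> MTexel/s."""
    return pipes * tmus_per_pipe * clock_mhz

# Flipper at 162 MHz vs. the assumed "Flipper at 243 MHz" Hollywood.
for clock in (162.0, 243.0):
    print(clock, pixel_fillrate_mpix(4, clock), texel_rate_mtex(4, 1, clock))
# 162.0 648.0 648.0
# 243.0 972.0 972.0
```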

Broadway isn't simply Gekko at 729 MHz. At the very least it has small improvements here and there. There's really no arguing with this when you look at the recently released PowerPC 750CL. The 750CL is basically Gekko on a 90nm process (possibly slightly improved over Gekko, but worst case it's just the same as Gekko) and is 16 mm^2. Broadway is on the same 90nm process and is 19 mm^2. If both CPUs were simply Gekko on a 90nm process then both would be identical in size.
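As a quick sanity check of that area argument, using only the die sizes quoted above (which may be approximate):

```python
# Die areas quoted above, both on the same 90 nm process.
area_750cl_mm2 = 16.0    # PowerPC 750CL, treated as "Gekko at 90 nm"
area_broadway_mm2 = 19.0

extra_mm2 = area_broadway_mm2 - area_750cl_mm2
extra_pct = 100.0 * extra_mm2 / area_750cl_mm2
print(f"Broadway is {extra_mm2:.0f} mm^2 (~{extra_pct:.0f}%) larger")  # ~3 mm^2, ~19%
```

Not a huge delta, but on the same process it has to be paid for by something beyond a straight Gekko shrink.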

With Hollywood it's even more obvious that it's not simply Flipper on a 90nm process. I'm sure if you've read any decent amount of this thread you'd have seen the reason for that. But if you haven't, well then let's just say there is a very large discrepancy (far larger than in the case of Gekko/Broadway) between the size you'd expect Flipper on 90nm to be and the size of Hollywood on the same process. Needless to say, it's hard to see any reason for such a discrepancy other than a large amount of extra transistors (large compared to the size of Flipper in the first place, of course). Apparently the extra transistors in Hollywood aren't spent on a vertex shader or a significantly improved pixel shader. So it would seem pretty likely that they are being spent on multiplying certain parts of the GPU, likely pixel pipes/TUs and possibly T&L. Factor 5's comments on Wii having an "insane fillrate to play with" seem to back this up. Obviously they'll be talking about an insane fillrate in GC/Xbox terms, but either way Flipper + 50% clock speed doesn't offer that.
 

Even then, unless they are multiplying by 3, that would still be at most 50% bigger, while our worst-case estimations gave it 2x the size. There should be something more, maybe just an upgrade to those parts, or something not gfx-related.

Or there wasn't any SDK support for them until the middle of last year.

According to Goto's reasoning, it isn't dependent on SDKs (it's about power/heat requirements).
 
Putting aside for a moment what you think about the Wii technically, what do you think about the Wii's power strategically? Was it smart for them to position themselves between last gen (GC, PS2, Xbox) and current gen (360, PS3)?
Personally I think by positioning themselves halfway they get the best of both worlds without all the traps of either.
I think riding off the coat-tails of the PS2 is kinda smart. Sony spent the time, money, and effort to build the PS2; now Wii can benefit from it to a degree. You have this huge installed base for the PS2 and still a great amount of 3rd-party support. If the Wii can tap into this and cherry-pick the best games, it could be good for the Wii, Wii owners, and 3rd-party devs who would be making these games for the PS2 anyway. They could potentially also steal some of the PS2 installed base who aren't ready to make the full jump to next gen yet. 3rd-party devs get an extra revenue stream for less than the cost of making an exclusive, and Wii owners get games they might not have otherwise gotten. This is of course assuming that the ports are good, but that's always a concern no matter the platform.
By contrast, you won't see the 360 getting many, if any, PS2 ports at all. This might seem like a good thing to some, but I happen to think the PS2 is still a great system and will probably get some good games. I think, simply for image reasons and to satisfy the expectations of next-gen owners, devs won't be porting PS2 games to 360, and it will miss out on some potentially great games.
At the same time you get a more powerful system, more features, a redesign, and a new controller. A sorta next-gen experience without all the drawbacks of full next gen. You lose some games of course, but there are always compromises. Overall though, I think in the long run being able to pull from both last gen and current gen will yield more in the way of games.
 
So it would seem pretty likely that they are being spent on multiplying certain parts of the GPU, likely pixel pipes/TUs and possibly T&L. Factor 5's comments on Wii having an "insane fillrate to play with" seem to back this up. Obviously they'll be talking about an insane fillrate in GC/Xbox terms, but either way Flipper + 50% clock speed doesn't offer that.

Wasn't XGPU nearly always bandwidth-limited anyway? It's pretty likely that you couldn't ever fully tap its fillrate in game, while it was much more plausible on GC because of the embedded framebuffer. What I'm saying is, Wii might not need to have the highest theoretical fillrate to have "insane" achievable fillrate compared to last-gen systems.

Personally, if Hollywood was just Flipper with 3x the fill/shading power, I would think you could make some pretty nice-looking games, assuming the 480p limit. I've been pretty disappointed so far with what developers have done graphically on Wii, but the system reminds me of the Atari VCS in regard to software. You have to use a bit of imagination with the graphics, but the games that are fun are instant classics, and there are a lot of turds.
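To put a rough number on the bandwidth point above, here's a back-of-envelope sketch. It assumes the commonly quoted Xbox figures (NV2A at 233 MHz with 4 pixel pipes, 6.4 GB/s of shared UMA bandwidth) and a simple color write plus Z read/write per pixel, with no compression and ignoring texture traffic, so treat it as illustrative only.

```python
# Back-of-envelope: theoretical vs. bandwidth-limited fillrate for a UMA GPU.
# Figures are the commonly quoted Xbox/NV2A numbers, used here as assumptions.

pipes = 4
clock_hz = 233e6
theoretical_mpix = pipes * clock_hz / 1e6                # ~932 MPixel/s

bandwidth_bytes_per_s = 6.4e9                            # shared with CPU, textures, etc.
bytes_per_pixel = 4 + 4 + 4                              # color write + Z read + Z write
bandwidth_limited_mpix = bandwidth_bytes_per_s / bytes_per_pixel / 1e6   # ~533 MPixel/s

print(theoretical_mpix, bandwidth_limited_mpix)
```

Under those simplified assumptions the achievable fillrate sits well below the theoretical peak, which is exactly the situation an embedded framebuffer avoids.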
 
If comparing it to last gen, wouldn't the comparison be with PS2? If Wii has insane fillrate, is that suggesting more than PS2? Can you have less fillrate than a last-gen machine and still be insane? I guess 'insane fillrate' generally means 'enough that we'll never run out doing what we want.' The PS2's 100 screens per frame would be insane on Wii (2 GPixels a second), especially if they don't need to use so much overdraw for multipass rendering.

Can we not make a guess at available fillrate? Looking at clocks, eDRAM, and guesstimates, can't we deduce a ballpark figure? Anything more than a few GPixels a second would be more overkill than insane, I think.
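A back-of-envelope version of that '2 GPixels a second' figure, assuming 640x480 at 60 fps for the frame size and rate:

```python
# What "100 screens per frame" of overdraw costs at SD resolution.
width, height, fps, overdraw = 640, 480, 60, 100
pixels_per_second = width * height * fps * overdraw
print(pixels_per_second / 1e9)   # ~1.84 GPixel/s
```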
 
All of our 100-page discussion was about this.

Hey, the guys from ArtX and the guys from 3dfx originally came from SGI, didn't they?
The Flipper design is similar to the original Voodoo 1 design, and the difficulty of the architecture (from an improvement standpoint) is the same.

The Voodoo used low-level Glide as an interface, and because of it they were not able to implement 32-bit.
The only way they were able to scale was to add additional pixel/texture units with dedicated memory (the Voodoo 2 had 3 chips with a 3x64-bit interface).
 

By 32-bit, do you mean 32-bit color or a 32-bit memory bus? Because there were Voodoo products with both. The VSA-100-based cards definitely had 32-bit color support in Glide, and I think Rampage was to support both as well, even way back at its original missed release date.
 
If comparing it to last gen, wouldn't the comparison be with PS2? If Wii has insane fillrate, is that suggesting more than PS2? Can you have less fillrate than a last-gen machine and still be insane? I guess 'insane fillrate' generally means 'enough that we'll never run out doing what we want.' The PS2's 100 screens per frame would be insane on Wii (2 GPixels a second), especially if they don't need to use so much overdraw for multipass rendering.

Can we not make a guess at available fillrate? Looking at clocks, eDRAM, and guesstimates, can't we deduce a ballpark figure? Anything more than a few GPixels a second would be more overkill than insane, I think.

You can compare it to whatever last-gen machine you want. GC might have been fill-bound in many situations, but it's not like it was fill-limited to the point that it handicapped its performance compared to the other machines (PS2 and XB) in the grand scheme of things. PS2's fillrate took a nosedive with textures, right (16 pipes @ 147 MHz, but half the fillrate with textures*)? If we assume 2-3x the fillrate of GC, we'd be looking at an about equal (at 2x) or better texel fillrate than PS2 anyway.
Anyway, that still goes to my point that Wii may not have a fillrate number that exceeds what PS2 or XB had last generation, but may be able to achieve its theoretical fillrate in game.
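Rough numbers for that comparison; the GS and Flipper figures below are the commonly quoted ones, and the 2-3x multiplier is just this thread's guess for Hollywood.

```python
# Commonly quoted last-gen figures, used here as assumptions.
ps2_pixel_mpix = 16 * 147            # GS: 16 pipes @ 147 MHz -> 2352 MPixel/s
ps2_texel_mtex = ps2_pixel_mpix / 2  # roughly halved when texturing -> 1176 MTexel/s

gc_texel_mtex = 4 * 1 * 162          # Flipper: 4 pipes x 1 TMU @ 162 MHz -> 648 MTexel/s

for mult in (2, 3):                  # the thread's guessed Hollywood multipliers
    print(mult, gc_texel_mtex * mult, "vs PS2", ps2_texel_mtex)
# 2x -> 1296 MTexel/s (roughly PS2 level), 3x -> 1944 MTexel/s (clearly above it)
```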

By 32-bit, do you mean 32-bit color or a 32-bit memory bus? Because there were Voodoo products with both. The VSA-100-based cards definitely had 32-bit color support in Glide, and I think Rampage was to support both as well, even way back at its original missed release date.
The Voodoo3 had internal support for 32-bit as well, though it could only write to a 16-bit framebuffer. This actually caused a few software issues when an app requested 32-bit. I remember specifically a fishtank screensaver that would chug along at 1-2 FPS if you had your desktop set to 32-bit, but would scream along at 60 FPS in 16-bit.



*This is something I hadn't thought of before. How does the PS2 handle multiple textures? I mean, it takes 2 pipes to draw a texel, right? It was my understanding that you'd use part of the pixel pipeline (the ROP, I'd assume) as a TMU and another pipe to draw the pixel. So does it take 3 pipes to draw a pixel with 2 texels (1 pipe to write and 2 as TMUs?), or 4 (2 pipes as TMUs with 2 rendering pipes, combined on output), or 2 pipes and 2 passes?
 
By 32-bit, do you mean 32-bit color or a 32-bit memory bus? Because there were Voodoo products with both. The VSA-100-based cards definitely had 32-bit color support in Glide, and I think Rampage was to support both as well, even way back at its original missed release date.

I mean 32-bit support.
The Voodoo 1/2 had a 2x64-bit / 3x64-bit memory interface.
The V4 had 32-bit color depth, but because of this the Glide support was very limited, and the V3, due to its unified memory, had crippled Glide support.

So, my point: if you want 100% compatibility but in parallel you want improved speed, the solution can be the same one that was used in the V2: add an additional texture unit with dedicated memory (or 2, or 3), and you immediately solve many issues. For legacy programs, the software cannot see the additional memory/units, but for a bump-mapped/shadow-mapped/environment-mapped Wii program you can push several more textures, and the bandwidth usage is better; you don't have to move the shadow maps etc. out to main memory.
 
Ironically, the first page of this thread is perhaps the most informative. And the page with the shot of the board and chip dies. This is one hell of a circular thread. :cool:
 

Yes, and until somebody puts the Wii SDK on a P2P network, we will not see any improvement.
So, because of that, I check the P2P networks for the Wii SDK on a weekly basis. :)
 
The Wii appears to be designed to provide Xbox 360-like graphics at SD resolutions.

What?!! No, no, no. Nothing we've seen so far, or even looking at the specs, would indicate that. Even the best games we've seen so far, like Mario Galaxy and SSBB, have annoying jaggies and the like. Wii may not be an overclocked GC, but I certainly can't feel it's a 360 at SD res either.
 
Is there an advantage for the CPU to be so close to the GPU?

Wii: [board photo: wiitopsy_ss_14.jpg]

360: [board photo: normal_motherboard.jpg]
 
Ironically, the first page of this thread is perhaps the most informative. And the page with the shot of the board and chip dies. This is one hell of a circular thread. :cool:
One of the problems here is that discussion is continuing without much new info, and in a trillion posts it's hard to find the info that may have gone before. Perhaps the first post should be edited into a summary of known facts and best guesses for reference, so people don't ask for info that we've had already?
 
Good analysis on the RE4 graphics engine.

The point is that in RE4 they did all of the lighting with textures; the game does not contain any real lights (but the PC version has normals for every vertex).
Wario World contains light sources, but those use shadow maps (there are a few scenes where you can see the big pixels in the shadow map).
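Written as simplified shading equations, just to make the distinction concrete (the exact math these games use isn't documented in the thread, so this is only illustrative):

```latex
% Baked lighting (RE4-style): everything pre-computed into the textures
C = T_{\mathrm{albedo}}(u,v) \cdot L_{\mathrm{baked}}(u,v)

% Dynamic light with a shadow map (Wario World-style)
C = T_{\mathrm{albedo}}(u,v) \cdot \max(0,\ N \cdot L) \cdot s_{\mathrm{shadow}}
```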
 