WiiGeePeeYou (Hollywood) what IS it?

I hope they still have the same "talent" with HW that they showed in GC :LOL:. Anyway, that is good-looking; if they can support a good number of those in a more complex scene, it will be much better than most are expecting.

Actually...

New footage from IGN!:
http://media.wii.ign.com/media/818/818481/vids_1.html

Looks like they're hitting target renders (the bloom does look more reasonable, but the models and lighting themselves are nothing short of what was shown in the very first video).

Some effects still need a little work, though (mounds disappear a half-second before switching to the next screen). But other than that, the environments themselves don't look bad. Some of the modeling looks just as good as Sonic 360/PS3.
 
GF3 might have the flexibility edge, but Rebel Strike + 50% on a Geforce 3? Yeah right.

What's the size of the 7100 die, how much bigger would it be with 3 MB of 1T-SRAM, and how does its manufacturing process compare to Hollywood's? How well would it run if it were emulating a Flipper? Also, the memory on that video card you showed me is GDDR2. Wii uses 64 MB GDDR3 and 24 MB 1T-SRAM, both of which are more expensive than GDDR2.

I was more trying to show that for around the same cost as Nintendo is supposedly paying for its GPU + 64 MB of GDDR3, you could get a retail video CARD with more memory and a vastly more advanced, decently more powerful GPU (and the GDDR2 is clocked faster than the Wii's GDDR3, I believe).
I'm not arguing that Nintendo should have picked a 7100 or similar GPU; I'm arguing that the analyst report is wrong about the cost of the GPU. I wouldn't say it's too radical to argue that, considering they don't even take into account the cost of the Wiimote, sensor bar, wireless chips and antennae, and various other miscellaneous parts.
 
GF3 might have the flexibility edge, but Rebel Strike + 50% on a Geforce 3? Yeah right.

Why not? GeForce 3 has much more RAM, more bandwidth, a lot more fillrate, a vertex shader, and better pixel shader hardware than Flipper. I imagine GeForce 3 could rival the Xbox because it has its RAM all to itself.

Anyone care to guess how large R360 would be on 90nm? :)
 
Actually...

New footage from IGN!:
http://media.wii.ign.com/media/818/818481/vids_1.html

Looks like they're hitting target renders (the bloom does look more reasonable, but the models and lighting themselves are nothing short of what was shown in the very first video).

Some effects still need a little work, though (mounds disappear a half-second before switching to the next screen). But other than that, the environments themselves don't look bad. Some of the modeling looks just as good as Sonic 360/PS3.

Looks really nice, such a shame it's only in 4:3
 
Come on, man :), let the defense mode rest a bit.

Fox5 is making a reasonable comparison. It's not perfect (as comparisons rarely are), but he's just questioning the GPU price in the article. And since we don't have a lot of details about the Wii GPU, he is pointing at PC parts that are more advanced yet close to the same price range, not to mention featuring more memory and more up-to-date effects.

So what's the problem? :)
 
GF3 might have the flexibility edge, but Rebel Strike + 50% on a Geforce 3? Yeah right.

What's the size of the 7100 die, how much bigger would it be with 3 MB of 1T-SRAM, and how does its manufacturing process compare to Hollywood's? How well would it run if it were emulating a Flipper? Also, the memory on that video card you showed me is GDDR2. Wii uses 64 MB GDDR3 and 24 MB 1T-SRAM, both of which are more expensive than GDDR2.

It should be a good deal smaller than Hollywood, as Hollywood (72 mm^2) is almost as big as a 7300 (76 mm^2), and if the eDRAM used to be 1/3-1/2 of the die in Flipper, it should now be around 1/6-1/4 of it, i.e. 12-18 mm^2.
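For reference, a quick back-of-envelope sketch of that estimate (my own illustration, assuming ideal area scaling from 180nm to 90nm; eDRAM in practice shrinks worse than logic, which is why the 12-18 mm^2 guess above sits a bit higher than these ideal numbers):

Code:
# Rough die-area sanity check for the eDRAM estimate above.
# Assumes ideal area scaling (area ~ feature_size^2); real eDRAM
# shrinks worse, so treat these as lower bounds.

flipper_area = 106.0    # mm^2 at 180nm
hollywood_area = 72.0   # mm^2 at 90nm
shrink = (180 / 90) ** 2  # ideal 180nm -> 90nm area reduction: 4x

for frac in (1 / 3, 1 / 2):  # guessed eDRAM share of Flipper's die
    edram_90nm = flipper_area * frac / shrink
    print(f"eDRAM at {frac:.0%} of Flipper -> ~{edram_90nm:.0f} mm^2 "
          f"({edram_90nm / hollywood_area:.0%} of Hollywood)")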

Anyway, one can complain at will, as no one can find any good reason not to have had a significantly more powerful and attractive Wii, since it would have been very cheap to obtain.

(If only Hollywood could at least do what a 7300 can... :cry:)

Actually...

New footage from IGN!:
http://media.wii.ign.com/media/818/818481/vids_1.html

Looks like they're hitting target renders (the bloom does look more reasonable, but the models and lighting themselves are nothing short of what was shown in the very first video).

Some effects still need a little work, though (mounds disappear a half-second before switching to the next screen). But other than that, the environments themselves don't look bad. Some of the modeling looks just as good as Sonic 360/PS3.

Personally, I think it's still a long way from the target renders in every respect. If this one, like their past games, ends up among the worst-looking on Wii, then we should see some great-looking games in the future; but it does seem to be on par with the other teams' work, which means that something this good-looking will probably be limited to very simple scenes.
 

Beats me, but PC devs have had 5 years to do it, and they haven't. Considering that even the highest-end GF3 has less than half the vertex-crunching power of the Xbox, that there seems to be general agreement that if you design entirely around Flipper's strengths you can do a few things you can't do on Xbox, and that Rebel Strike was designed entirely around Flipper's strengths, if you merely OC'd Flipper to 243 MHz (faster than GF3), you'd have a chip that you could certainly get more performance out of than GF3 in at least a few contexts. Not to mention whatever Hollywood's extra transistors are for. Hollywood should have ~30 GB/s total to the eDRAM, too.
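(As a side note, the ~30 GB/s figure drops out of the usually quoted GameCube eDRAM numbers scaled by the same +50% clock; a minimal sketch of my own, assuming the bus widths stay the same so bandwidth scales with clock:)

Code:
# Where "~30 GB/s total to the eDRAM" can come from: GameCube's
# commonly quoted embedded-memory bandwidths, scaled by the clock
# bump. Assumes unchanged bus widths (bandwidth ~ clock).

scale = 243 / 162  # Hollywood clock / Flipper clock = 1.5x

fb_bw  = 7.6   # GB/s, Flipper's embedded framebuffer
tex_bw = 10.4  # GB/s, Flipper's embedded texture cache

total = (fb_bw + tex_bw) * scale
print(f"Scaled eDRAM bandwidth: ~{total:.0f} GB/s")  # ~27 GB/s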

Maybe it's Flipper's better bandwidth. I think I've read that you can get a lot more performance out of those Ti-era Nvidia cards with better RAM, but I could be wrong.

Anyway, like I said...without knowing how much it actually costs to manufacture the silicon on the 7100, it's kind of silly to compare it to whatever Nintendo's supposedly paying for the silicon in Wii. It would make a lot more sense to try to figure out where this analyst came up with their costs. If for example, it's based on what NEC charges various companies to make dies of a certain size, then you really can't argue with it, no matter how powerful a GF 7100 may or may not be.
 
I would think that the 24 MB of 1T-SRAM is included in the cost of the graphics chip. And the graphics chip also uses special multi-die packaging that is very likely much more expensive than the regular single-die packaging used for PC video cards. The graphics chip also uses quite a lot of eDRAM, requiring a special process that should be a bit more expensive than the regular logic process used by Nvidia's PC chipsets.
 
Actually...

New footage from IGN!:
http://media.wii.ign.com/media/818/818481/vids_1.html

Looks like they're hitting target renders (the bloom does look more reasonable, but the models and lighting themselves are nothing short of what was shown in the very first video).
The graphics aren't bad, but they're totally missing the quality of the original renders. There's no GI-class illumination going on here; it's just standard illumination techniques. A nice-enough-looking title, but it rather suggests that those super-duper original renderings we were getting, before much at all was known about Wii, were just PR images.
 
Why not? GeForce 3 has much more RAM, more bandwidth, a lot more fillrate, a vertex shader, and better pixel shader hardware than Flipper. I imagine GeForce 3 could rival the Xbox because it has its RAM all to itself.

Anyone care to guess how large R360 would be on 90nm? :)

Wii's GPU has a lot more bandwidth (8 GB/s for main RAM and at least 30 GB/s for embedded RAM), more pixel fillrate (972 Mpixels/s minimum) and a lot more T&L power (at least 23-30 million polygons/s possible in-game) compared to a GeForce 3. Also, depending on the possible upgrades over Flipper, it could have more texture fillrate, even higher pixel fillrate, even more embedded memory bandwidth and a lot more T&L power as well.
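(Those figures are easy to reconstruct; a minimal sketch of my own, assuming Hollywood keeps Flipper's 4 pixel pipes at 243 MHz, and taking the 15-20 M polygons/s commonly claimed for the best GC titles as the in-game baseline:)

Code:
# Sanity check on the quoted numbers, assuming Hollywood is
# Flipper's layout (4 pixel pipelines) clocked at 243 MHz.

clk = 243e6  # Hz
pipes = 4
print(f"Pixel fillrate: {clk * pipes / 1e6:.0f} Mpixels/s")  # 972

# In-game T&L scaled by the same 1.5x clock bump, starting from
# the 15-20 M polys/s claimed for top GameCube titles.
for gc_polys in (15e6, 20e6):
    print(f"{gc_polys / 1e6:.0f} M * 1.5 -> "
          f"{gc_polys * 1.5 / 1e6:.1f} M polys/s")  # 22.5-30 M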
 
Wii's GPU has a lot more bandwidth (8 GB/s for main RAM and at least 30 GB/s for embedded RAM), more pixel fillrate (972 Mpixels/s minimum) and a lot more T&L power (at least 23-30 million polygons/s possible in-game) compared to a GeForce 3. Also, depending on the possible upgrades over Flipper, it could have more texture fillrate, even higher pixel fillrate, even more embedded memory bandwidth and a lot more T&L power as well.

Well, we might eventually see what it can do, assuming another decent game comes out (Zelda is it for quite a while!). I just don't know, though... We don't even know the clock speed for certain! (Do we?) I'm afraid the pixel shader hardware is really very limited (Flipper-esque) and that we'll be stuck with simple lighting/shadowing and mainly pure polygon detail instead of fancy texturing. Kinda like the overblown Rebel Strike. Who really cares how much hardwired T&L capability it has if the CPU gets stuck with the fancier effects? None of the preview shots showcase any notable increases in quality over Flipper, IMO. And, man, if the thing still dithers... :???: One would think they coulda changed at least that for Wii Zelda.

A GeForce 3 at 640x480 32-bit would certainly have massively better IQ. Flipper has awful texture filtering and... that dither.
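(For anyone who hasn't seen it up close, "that dither" is ordered dithering: quantizing to a shallower framebuffer with a position-dependent threshold. A toy sketch of my own; Flipper's actual dither kernel isn't public:)

Code:
# Toy ordered (Bayer) dither: the crosshatch pattern you get when
# a framebuffer stores fewer bits than the source color. Sketch
# only; Flipper's real dither kernel isn't public.

BAYER4 = [[ 0,  8,  2, 10],
          [12,  4, 14,  6],
          [ 3, 11,  1,  9],
          [15,  7, 13,  5]]

def dither(value, x, y, bits=5):
    # value in [0, 255]; quantize to 'bits' per channel with a
    # screen-position threshold, so flat gradients turn patterned.
    levels = (1 << bits) - 1
    threshold = (BAYER4[y % 4][x % 4] + 0.5) / 16.0
    return min(levels, int(value / 255.0 * levels + threshold))

# One scanline of a left-to-right gradient, quantized to 5 bits:
print([dither(v, x, 0) for x, v in enumerate(range(0, 256, 16))])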
 
Well, we might eventually see what it can do, assuming another decent game comes out (Zelda is it for quite a while!). I just don't know, though... We don't even know the clock speed for certain! (Do we?) I'm afraid the pixel shader hardware is really very limited (Flipper-esque) and that we'll be stuck with simple lighting/shadowing and mainly pure polygon detail instead of fancy texturing. Kinda like the overblown Rebel Strike. Who really cares how much hardwired T&L capability it has if the CPU gets stuck with the fancier effects? None of the preview shots showcase any notable increases in quality over Flipper, IMO. And, man, if the thing still dithers... :???: One would think they coulda changed at least that for Wii Zelda.


Ah, c'mon. Between a DX7-style pixel pipeline and a DX8-style VS/PS 1.1 pipeline, the difference is not that big. The other thing is that with a good, fast framebuffer you can do more interesting and nicer things than with a PS 1.1 pipeline.
(The other point is that the pixel shader is a tool for programming a complex rendering pipeline. So the PS is not a mystical thing; it amounts to a complex texturing unit.)
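(To put that concretely: a toy model of my own, not real hardware, of why a fixed-function combiner cascade and a PS 1.1 program end up expressing much the same per-pixel arithmetic:)

Code:
# Toy per-pixel model: each "stage" is a modulate or lerp, which
# covers most of what a DX7 combiner cascade or a PS 1.1 program
# (or Flipper's TEV stages) actually computes. RGB in [0, 1].

def modulate(a, b):
    return tuple(x * y for x, y in zip(a, b))

def lerp(a, b, t):
    return tuple(x + (y - x) * t for x, y in zip(a, b))

diffuse  = (0.9, 0.8, 0.7)   # vertex lighting
base_tex = (0.5, 0.5, 1.0)   # base texture sample

lit = modulate(diffuse, base_tex)   # stage 1: classic DX7 setup
env = (1.0, 1.0, 1.0)               # "env map" sample
out = lerp(lit, env, 0.3)           # stage 2: masked env blend
print(tuple(round(c, 3) for c in out))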
 
A,c'mon.Between the dx7 style pixel pipeline and a dx8 style vs/ps1.1 pipeline the diference is not that big.The other thing is with a good and fast frame buffer you can do more interesting and nice thing than with a ps1.1 pipeline.
(teh other thing is the pixel shader is tool to program a complex rendering pipeline.So the ps thing is not a mistical thing,but it mean a complex texturing unit)

Well I would hope that Wii isn't limited to DirectX 8-level stuff either. But it may not even be that flexible.

One thing that bothers me about Cube games (and makes them look like PS2 games) is the lack of quality texture filtering and mip mapping. The Voodoo1 could do passable mip mapping. Most Cube games don't seem to bother and you get horrible aliasing in the distance as a result. Wii Zelda is the same. So fancy texturing unit my arse! Now I don't doubt that the hardware can do this (N64 could!), but there's obviously some big bad limitation in there that's preventing them from using it.
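(And the fix being asked for really is cheap; a minimal box-filter mipmap chain, a sketch of my own in plain Python with no GPU assumed, looks like this:)

Code:
# Minimal box-filter mipmap chain: each level averages 2x2 texels
# of the level above. This is the distance-aliasing fix the post
# says many Cube games skipped. Texture is a square 2^n grayscale
# grid stored as a list of lists.

def next_mip(tex):
    n = len(tex) // 2
    return [[(tex[2*y][2*x] + tex[2*y][2*x+1] +
              tex[2*y+1][2*x] + tex[2*y+1][2*x+1]) / 4.0
             for x in range(n)] for y in range(n)]

def mip_chain(tex):
    chain = [tex]
    while len(chain[-1]) > 1:
        chain.append(next_mip(chain[-1]))
    return chain

# 4x4 checkerboard -> levels 4x4, 2x2, 1x1; the 1x1 level is the
# overall average, which is what distant texels should show.
checker = [[float((x + y) % 2) for x in range(4)] for y in range(4)]
for level in mip_chain(checker):
    print(level)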
 
For kicks 'n' giggles:

Flipper: 106 mm^2, 180nm
Hollywood: 72 mm^2, 90nm
GF 7300 LE: 77 mm^2, 90nm

Remember also that Nintendo can't sell a bad Hollywood in a "Wii GS."
 
Well I would hope that Wii isn't limited to DirectX 8-level stuff either. But it may not even be that flexible.

One thing that bothers me about Cube games (and makes them look like PS2 games) is the lack of quality texture filtering and mip mapping. The Voodoo1 could do passable mip mapping. Most Cube games don't seem to bother and you get horrible aliasing in the distance as a result. Wii Zelda is the same. So fancy texturing unit my arse! Now I don't doubt that the hardware can do this (N64 could!), but there's obviously some big bad limitation in there that's preventing them from using it.

Voodoo 1: 2 MB framebuffer / 2 MB texture buffer (so it had double the texture buffer).
The main point of Flipper is the embedded 24 MB of 1T memory.
In the Xbox 360 you can see a similar 10 MB framebuffer, and its speed is extreme.

Oh, and the other thing: Nintendo has a patent for on-the-fly mipmap generation. :)

The biggest issue is the small texture buffer and the time needed for good optimization. But if it is the leading hardware, the developers will have enough time and resources to do all of the low-level optimization.

And the other thing is that on the new core you have enough space to do many interesting things.
 
a lot more T&L power (at least 23-30 million polygons/s possible in-game) compared to a GeForce 3


Where did you get that number? Or are you already assuming significant upgrades to the T&L unit?

Who really cares how much hardwired T&L capability it has if the CPU gets stuck with the fancier effects?


There is a much bigger addition of transistors in the GPU than in the CPU, so if the feature set is the same, wouldn't that mean a much more CPU-limited console, in a feature set already too dependent on the CPU for its own good?

Anyway, they can hardwire the fancier effects too (like those shown in the PR screenshots of Pokemon or RS); in fact, they already did in Flipper (like EMBM or even self-shadowing, although that incarnation is too limited).

This is a console with a GPU that has more transistors in logic than XGPU, plus eDRAM; with more transistors in the CPU than XCPU; with more memory and much more bandwidth, higher clock speeds, and a much, much more efficient architecture than Xbox; and it is designed by some of the best graphics/CPU engineers in the world. So I really think it would be very strange if it is, or looks, overall inferior to the Xbox (you can always throw the GeForce FX in my face). So I do find it relatively easy to believe the ATI comments (the ones about it being a new architecture, not the "tip of the iceberg" ones).



None of the preview shots showcase any notable increases in quality over Flipper, IMO. And, man, if the thing still dithers... :???:

How many launch games, in the history of consoles, ever did, when developed with just 3 months on hardware that resembled the final HW (besides SW:RL)?
 
Wii's GPU has a lot more bandwidth (8 GB/s for main RAM and at least 30 GB/s for embedded RAM), more pixel fillrate (972 Mpixels/s minimum) and a lot more T&L power (at least 23-30 million polygons/s possible in-game) compared to a GeForce 3. Also, depending on the possible upgrades over Flipper, it could have more texture fillrate, even higher pixel fillrate, even more embedded memory bandwidth and a lot more T&L power as well.

I took a random guess on the GeForce 3 + 50% performance, just assuming the games would be fillrate-limited.
The 7100 has a 1.4 gigapixel/s fillrate, blows away the GeForce 3's T&L power (so if you're arguing that a +50% GeForce 3 couldn't handle Rebel Strike's poly counts... well, it's kind of irrelevant, since in that one area the 7100 should be substantially more powerful), and has a more efficient memory crossbar than the GeForce 3.

And a GeForce 3, at least when paired with a fast CPU (say, around a 2 GHz Athlon) and 256 MB of RAM, could run Doom 3 about as well as the Xbox. I wouldn't be surprised if that were true for Half-Life 2 as well.

For kicks 'n' giggles:

Flipper: 106 mm^2, 180nm
Hollywood: 72 mm^2, 90nm
GF 7300 LE: 77 mm^2, 90nm

Remember also that Nintendo can't sell a bad Hollywood in a "Wii GS."

I doubt that Nvidia is having trouble making those low-end chips.
Anyhow, if the 7100 is on 90nm, then through extrapolation (I couldn't find the actual die size) it should be between 50 mm^2 and 60 mm^2. Smaller if it's on 80nm, which I believe many of Nvidia's low-end chips are. So assuming every Hollywood is without flaw, they could have one failed 7100 for every 3 or 4 they make.
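(The "one failed 7100 per 3 or 4" bit follows from straight area math, assuming die cost is proportional to area and, as above, perfect Hollywood yields; the 3-4 figure corresponds to the smaller end of the extrapolated range. A sketch:)

Code:
# Cost-parity yield math: how many 7100s can fail per wafer before
# the cost per good die exceeds Hollywood's. Assumes die cost is
# proportional to area and Hollywood yields perfectly.

hollywood = 72.0  # mm^2
for g7100 in (50.0, 55.0, 60.0):  # extrapolated 7100 die sizes
    tolerable = 1.0 - g7100 / hollywood  # scrappable fraction
    print(f"{g7100:.0f} mm^2 -> can scrap {tolerable:.0%}, "
          f"i.e. ~1 in {1 / tolerable:.0f}")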

That said, I believe there is more to the Wii's GPU than we've seen.
My personal belief is that it has either:
A. Physics hardware... I doubt it.
B. Severely beefed-up vertex processors, possibly improved pixel shaders. Additionally, Nintendo would have a custom API to use the vertex shaders for physics calculations. ATI was doing research in that area, and it'd be the most sensible graphics feature for Nintendo to include when they're not focused on increasing visible detail.
Or C. Tons and tons of DRM hardware to keep the system from being hacked and mass-produced illegally.
 
The main point of Flipper is the embedded 24 MB of 1T memory.
In the Xbox 360 you can see a similar 10 MB framebuffer, and its speed is extreme.

Flipper has about 3 MB of embedded 1T-SRAM: 1 MB of embedded texture memory and 2 MB of embedded framebuffer; the total embedded 1T-SRAM is actually 3.12 MB.

The 24 MB of 1T-SRAM is external memory in GameCube.

In Wii, the 24 MB of 1T-SRAM seems to be part of the Hollywood GPU package, but not actually embedded; it's almost certainly still external. It's not like Xbox 360's 10 MB eDRAM daughter chip.


random thoughts, ramble:

Let's assume that there is NOT much of an upgrade from Flipper to Hollywood. Let us assume it is only a tweaked, somewhat beefed-up Flipper with 50% more core clockspeed, making for a 2x increase overall, plus a few more effects/features, but nothing really significantly more powerful; overall, just a bit more powerful than NV2A, the Xbox's GPU. Okay, then that means that with WiiHD / Wii 2 / Super Wii / whatever in 5 years, Nintendo games could get a MASSIVE upgrade in graphics by going with a fairly powerful GPU next time, even if not a cutting-edge one. That way, Nintendo can make it seem like they've made an absolutely enormous leap in graphics while going with a modest GPU for the time: a GPU that's more powerful than anything out today, even the highest end, but not high-end by the time the next Nintendo console comes out.

Wii 2 should be able to comfortably handle 1080p without the compromises that Xbox 360 and PS3 have to make to hit that resolution. If Nintendo makes their HD games in 720p, that's even better, since about 3x more power will be there whatever the GPU is, as 1080p needs roughly 3x the performance of 720p. Just give me a load of anti-aliasing and 3D motion blur.
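(Pixel-count check on that: 1080p is 2.25x the raw pixels of 720p; the "roughly 3x" presumably folds in AA and effects overhead:)

Code:
# Raw pixel-count ratios between the resolutions mentioned above.
res = {"480p": (640, 480), "720p": (1280, 720), "1080p": (1920, 1080)}
px = {name: w * h for name, (w, h) in res.items()}

print(f"1080p / 720p = {px['1080p'] / px['720p']:.2f}x pixels")  # 2.25x
print(f"720p  / 480p = {px['720p'] / px['480p']:.2f}x pixels")   # 3.00x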

The GameCube and Wii both use narrow memory buses AFAIK, with nothing wider than 64-bit; I think they use a combination of 32-bit and 64-bit buses, not even 128-bit. Nintendo should go with 256-bit buses next time and continue to use embedded memory, but with embedded memory bandwidth greater than that of Xbox 360.

The G3-based CPUs used in GameCube and Wii (Gekko, Broadway) will have to be totally replaced with something modern: dual-core, G6- or POWER6-based, if not quad-core, though much less than XCPU3 and a next-gen CELL. And a GPU, again, that's significantly more powerful than G80 and R600, but, as I said, it won't need to be near the absolute state of the art.
 
Anyhow, if the 7100 is on 90nm, then through extrapolation (I couldn't find the actual die size) it should be between 50 mm^2 and 60 mm^2.

The 7100 is more powerful than the 7300 LE, which is why it has replaced it. However, I could only find numbers for the 7300 LE. Even accounting for 3 MB of eDRAM, it would seem that there's more to Wii's GPU than Flipper's logic. And now that you know how big Hollywood's die is, you need to find out how much similarly sized chips from NEC cost other companies, instead of ranting about how Wii doesn't have enough fillrate. I'm gonna guess that when fabbing custom chips, NEC doesn't really give a crap what the application is... the size of the chip and the cost of the process likely matter way more.

Regarding Rebel Strike, I'm the opposite: As a gamer, I don't really care what's in the machine as long as what comes out of the machine looks nice. Only on B3D will people say "Yeah, but this isn't being done with a programmable vertex shader, therefore, I don't like it."

None of the preview shots showcase any notable increases in quality over Flipper, IMO.

Then you either don't pay much attention to Gamecube, or you don't notice the little things. Having been playing Baten Kaitos for the past coupla months (it makes quite good use of the Cube gfx hardware, with battle situations comparable to Pokemon), I've noticed quite a quality upgrade in the Pokemon videos, most notably the lighting and special effects. BK is easily one of the prettiest games of its kind on the Cube, and Pokemon is significantly better. If there are all these games on Gamecube that look just like what we're seeing out of the handful of nicest-looking Wii games, what are they? Out of the 41 Cube games I own, I certainly haven't seen any.

Zelda's a terrible benchmark, since it's graphically unchanged from the Gamecube. BTW, more than a few games did a fine job with mip-mapping and texture filtering on the Cube. For whatever reason, probably because they were using every last byte of memory to cram more textures in, a few notable titles did not. For the time being, Red Steel's probably the best indicator we've got, and the more polished sections of the game look great.

Some of you need to actually play more Cube games before talking about what crap it is.
 