WiiGeePeeYou (Hollywood) what IS it?

If the TEV can do normal mapping, why does it have to work alongside the indirect texture unit...

The indirect texturing unit is part of the TEV. However, normal mapping on Flipper is slow (about 3x slower than it would be on a DX8 part at the same clock speed, IIRC), and Flipper doesn't have all that much fillrate to begin with. If DOT3 bump mapping was ever done in a Gamecube game, it was in very limited instances.
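For anyone wondering what DOT3 actually computes per texel, here's a minimal CPU-side sketch (the float math and the names are just mine for illustration; the TEV does the equivalent in fixed point):

#include <stdint.h>

/* Minimal sketch of the per-texel math behind DOT3 bump mapping.
 * The normal map and the light vector are both stored as RGB8, with
 * each channel mapping [0,255] onto [-1,1]. Names are illustrative. */
typedef struct { uint8_t r, g, b; } Rgb8;

static float expand(uint8_t c)            /* [0,255] -> [-1,1] */
{
    return c / 127.5f - 1.0f;
}

/* Clamped N.L term, i.e. what a DOT3 combiner stage produces per texel;
 * the result then modulates the base texture color. */
static float dot3_term(Rgb8 normal_texel, Rgb8 light_vec)
{
    float d = expand(normal_texel.r) * expand(light_vec.r)
            + expand(normal_texel.g) * expand(light_vec.g)
            + expand(normal_texel.b) * expand(light_vec.b);
    return d < 0.0f ? 0.0f : d;
}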
 
6 months away from the launch of Nintendo Wii, and we still don't know exactly what Hollywood is.

is it a Flipper clocked 1.5x faster?

is it a new GPU with Flipper instructions/BC?

something in between?


final Wii dev-kits are supposed to be out to developers by June, last I remember.

architectural details of Hollywood... not public domain yet. will they ever be?

Wiiiiiiiiii.... leak the info on the WiiGeePeeYou :D



From what I can make out of it:

It's likely to be a Radeon X1000 series GPU with a reduced clock. Most likely a customized X1300. Although less powerful, it may have a more advanced GPU structure than the X360's.

If it was a Flipper derivative: Flipper was made on 180nm, Hollywood uses 90nm, and at 243MHz that would make it about 2x the power (some rough numbers at the end of this post).

I thought it was just a faster Flipper, but after seeing developers' comments on the Wii it's most likely something more up to date.
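For what the clock bump alone buys, some quick napkin math, assuming the commonly quoted 162MHz Flipper and 243MHz Hollywood clocks and the same 4 pixel pipelines (the pipe count carrying over is my assumption):

#include <stdio.h>

/* Rough napkin math only: clock ratio and peak fillrate if the pipeline
 * count stays the same. It says nothing about what extra logic the 90nm
 * process might have made room for. */
int main(void)
{
    const double flipper_mhz = 162.0, hollywood_mhz = 243.0;
    const int pixel_pipes = 4;                 /* assumed identical */

    printf("clock ratio: %.2fx\n", hollywood_mhz / flipper_mhz);             /* 1.50x */
    printf("Flipper peak fill:   %.0f Mpixels/s\n", pixel_pipes * flipper_mhz);
    printf("Hollywood peak fill: %.0f Mpixels/s\n", pixel_pipes * hollywood_mhz);
    return 0;
}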
 
Just a note about that Disney game pic. In the latest Nintendo Power, the caption accompanying that pic says:
"A screen shot mock-up illustrates the game's potential graphic style. The look may evolve as the game moves towards its fall release."

It's not being sold as a Wii screen shot.
 
Just a note about that Disney game pic. In the latest Nintendo Power, the caption accompanying that pic says:
"A screen shot mock-up illustrates the game's potential graphic style. The look may evolve as the game moves towards its fall release."

It's not being sold as a Wii screen shot.

Translation: There won't be any self-shadowing in the final game.

And to whoever said Wii's GPU looks like a downclocked DX9-class Radeon: it's not. Seriously, the console's been out for over two months. We need to stop with the fantasies.
 
Whatever, who knows what they will pull off with the Wii. Bottom line is, if you bought the Wii for advanced graphics on par with the 360/PS3, you haven't been paying attention.
 
The Unity 3D engine is being brought over to the Wii. Not sure how big of a deal this is.

http://forum.unity3d.com/viewtopic.php?p=29412#29412

Seriously, the console's been out for over two months.

While I don't think the GPU is super powerful, I also think you're a little hasty judging the system based on 2 months. I mean, as I recall, the first 2 months for the 360 weren't exactly all that impressive either. Like you said, the Gamecube can self-shadow, so there's no reason to think the Wii can't. It's not like most 3rd parties are putting any real effort into the graphics. I mean, a Godzilla game has bump-mapped snow, while SSX Blur -- a snowboarding game -- does not.
 
And to whoever said Wii's GPU looks like a downclocked DX9-class Radeon: it's not. Seriously, the console's been out for over two months. We need to stop with the fantasies.

If you clock the X1300 from 450MHz down to 243MHz, take away the memory, and replace it with 24MB of 1T-SRAM and 64MB of GDDR3 (about a quarter of the speed), you will have something roughly around the performance of the Wii. An assumption based on what is known and this guy's comments:

The Ubisoft guy said it's between the "Radeon X1400 and the Radeon X1600, and the CPU to between the AthlonXP 2400+ and the AthlonXP 3000+".

Not much of a fantasy.

The Mobility Radeon X1400 uses the same GPU as the X1300 (just with lower power consumption).

Makes sense, since the Wii would need HyperMemory-style technology to make good use of its low amount of memory.

just an assumption.

Not a fantasy
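Putting the downclock idea above into the same kind of napkin math (assuming the usually quoted 4-pipe, ~450MHz stock X1300; all figures here are assumptions, not measurements):

#include <stdio.h>

/* Rough napkin math for the "downclocked X1300" idea: how much of the
 * stock part's peak fillrate survives at Hollywood's clock. */
int main(void)
{
    const double stock_mhz = 450.0, downclocked_mhz = 243.0;
    const int pixel_pipes = 4;

    printf("downclock factor: %.2fx\n", downclocked_mhz / stock_mhz);        /* 0.54x */
    printf("X1300 @ 450 MHz: %.0f Mpixels/s peak fill\n", pixel_pipes * stock_mhz);
    printf("X1300 @ 243 MHz: %.0f Mpixels/s peak fill\n", pixel_pipes * downclocked_mhz);
    return 0;
}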
 
The Ubisoft guy said it's between the "Radeon X1400 and the Radeon X1600, and the CPU to between the AthlonXP 2400+ and the AthlonXP 3000+".

Not much of a fantasy.
Unless you're talking about some totally different Ubi guy than the one I'm thinking of, wasn't what he said a complete fabrication made up by some unknown webmaster?

Peace.
 
From what I can make out of it:

It's likely to be a Radeon X1000 series GPU with a reduced clock. Most likely a customized X1300. Although less powerful, it may have a more advanced GPU structure than the X360's.
How do they manage GC backwards compatibility then? Emulation?? I think everything so far points to Hollywood being a Flipper derivative. Suggestions of a Flipper with double TEVs and maybe some extra features make a lot of sense to me. An X1n00 or any other standard GPU is impossible due to perfect GC BC.
 
How do they manage GC backwards compatibility then? Emulation?? I think everything so far points to Hollywood being a Flipper derivative. Suggestions of a Flipper with double TEVs and maybe some extra features make a lot of sense to me. An X1n00 or any other standard GPU is impossible due to perfect GC BC.

If Nintendo had kept Gamecube code high-level only (via OpenGL), it's possible something not Flipper-based could have been used, but it's obvious the GPU is at least based on Flipper.
 
Yeah, I've been under the impression that GC never touched normal mapping. It can certainly do simpler bump mapping methods, though. So could my Matrox G400, lol.

Still wish N had stuck an RV410 in there and scrapped the GC compatibility. Means little to me and probably the majority, honestly. Sure sells machines, though. Undoubtedly the Virtual Console doesn't need those transistors wasted on Cube hardware to play N64/SNES/etc.
 
Hah. That's why Hollywood is supposedly "too big"... It's got an N64 integrated on the die.

How's that for a totally grabbed-out-of-the-air explanation with no evidence to back it up? :cool:

Anyway, never mind how unlikely my idea is: how else can it be emulated perfectly at full speed using that tiny CPU, when today's PCs can't do it on multi-GHz processors?

Peace.
 
Hah. That's why Hollywood is supposedly "too big"... It's got an N64 integrated on the die.

How's that for a totally grabbed-out-of-the-air explanation with no evidence to back it up? :cool:

Anyway, never mind how unlikely my idea is: how else can it be emulated perfectly at full speed using that tiny CPU, when today's PCs can't do it on multi-GHz processors?

Peace.

The N64's CPU is a 32-bit RISC CPU (the GPU is 64-bit).

The Wii's is 64-bit RISC, like Xenon and Cell.

A Pentium 4 is a 32-bit CISC.

It is much easier for the PowerPC to emulate the N64 CPU than... almost everything.

The Wii's 729MHz PowerPC is likely to be more powerful than a 2GHz Pentium 4 in gaming performance.

CISC processors are crap at emulating RISC.
 
The N64's CPU is a 32-bit RISC CPU (the GPU is 64-bit).

The Wii's is 64-bit RISC, like Xenon and Cell.

A Pentium 4 is a 32-bit CISC.

It is much easier for the PowerPC to emulate the N64 CPU than... almost everything.

The Wii's 729MHz PowerPC is likely to be more powerful than a 2GHz Pentium 4 in gaming performance.

CISC processors are crap at emulating RISC.

The N64 is powered by an NEC VR4300 CPU. It has a 64-bit instruction set, 64-bit internal data paths, and 64-bit registers. Please tell me how it's not a 64-bit CPU.

And the GC emulated the N64 for the Zelda bonus discs, so there's no reason to think the Wii couldn't do the same.
 
The N64 did use a 64-bit processor. It's the Gamecube that did not. Well, 32-bit and 64-bit integer. The Xbox used a 32-bit processor as well, while the Dreamcast and PS2 used a 128-bit processor. Bits don't matter anymore. Now, because of shaders, neither do polygon counts, really.
 
The N64 is powered by an NEC VR4300 CPU. It has a 64-bit instruction set, 64-bit internal data paths, and 64-bit registers. Please tell me how it's not a 64-bit CPU.

And the GC emulated the N64 for the Zelda bonus discs, so there's no reason to think the Wii couldn't do the same.

It was reduced due to cost... let's call it "customized".

It is based on the 64-bit MIPS but it is actually 32-bit.

The 64 name can only account for the fact that the CPU is based on the 64-bit VR4300 and that the GPU is 64-bit.

[Flashback: I remember about 10 years ago I was reading an N64 mag and a developer said "if the N64 was 32-bit it wouldn't make a difference"... now I see what he was getting at.]
 
The N64 did use a 64-bit processor. It's the Gamecube that did not. Well, 32-bit and 64-bit integer. The Xbox used a 32-bit processor as well, while the Dreamcast and PS2 used a 128-bit processor. Bits don't matter anymore. Now, because of shaders, neither do polygon counts, really.

This statement is all sorts of messed up.

It was reduced due to cost... let's call it "customized".

It is based on the 64-bit MIPS but it is actually 32-bit.

The 64 name can only account for the fact that the CPU is based on the 64-bit VR4300 and that the GPU is 64-bit.

[Flashback: I remember about 10 years ago I was reading an N64 mag and a developer said "if the N64 was 32-bit it wouldn't make a difference"... now I see what he was getting at.]
The only thing "32-bit" about the N64's CPU is its external data bus. You can call it whatever you like, but it's 64-bit internally.
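To make the "64-bit internally" point concrete, here's a minimal sketch of how an interpreter would model the VR4300's registers (the instruction encodings are standard MIPS, but the struct/function names are mine and this is nowhere near a real emulator):

#include <stdint.h>

/* The VR4300's general-purpose registers are 64 bits wide. Even the
 * "32-bit" ops like ADDU sign-extend their result into the full 64-bit
 * register, while DADDU is a true 64-bit add. That's why most N64 code
 * could stick with 32-bit ops and lose nothing. */
typedef struct {
    uint64_t gpr[32];
    uint64_t pc;
} Vr4300;

/* Assumes the opcode field (bits 31..26) has already been decoded as 0,
 * i.e. an R-type/SPECIAL instruction. */
static void execute_r_type(Vr4300 *cpu, uint32_t instr)
{
    uint32_t rs = (instr >> 21) & 31;
    uint32_t rt = (instr >> 16) & 31;
    uint32_t rd = (instr >> 11) & 31;

    switch (instr & 63) {                      /* funct field */
    case 0x21: {                               /* ADDU: 32-bit add */
        uint32_t sum = (uint32_t)cpu->gpr[rs] + (uint32_t)cpu->gpr[rt];
        cpu->gpr[rd] = (uint64_t)(int64_t)(int32_t)sum;   /* sign-extend */
        break;
    }
    case 0x2d:                                 /* DADDU: full 64-bit add */
        cpu->gpr[rd] = cpu->gpr[rs] + cpu->gpr[rt];
        break;
    }
    cpu->pc += 4;
}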
 