WiiGeePeeYou (Hollywood): what IS it?

Status
Not open for further replies.
Anybody notice there's a tiny 3rd die on Hollywood? It's the small blue rectangle.;)
http://pc.watch.impress.co.jp/docs/2006/1201/ninten01.jpg
My guess would be mask ROM with the BIOS. With the mask ROM on the same multi-chip package, it is pretty much impossible to solder a modchip or something similar to it. On the other hand, it is still easy to change the BIOS if an update is needed, because you don't have to change the main die.
If it isn't mask ROM: maybe they didn't integrate the audio DSP on the same die, and that tiny die is the DSP?
 
I spent some time with the GC SDK trying to figure out the concept behind the Wii architecture.

It is interesting. If you think about the GC architecture, it is easy to see that it is a bandwidth-limited system.
The 24 MB of 1T-SRAM has about 2.6 GB/s of bandwidth to Flipper, but half of it is used by the CPU. That means that if you are using 12 MB of graphics assets at 30 fps, each texture can only be moved to the texture buffer about twice per frame on average.

That means you have to render not in Z order but in texture order, or you have to shrink the textures as much as possible (eliminate mipmaps, use the smallest textures possible, use compressed low-quality textures).
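The bandwidth budget above can be sanity-checked with a quick back-of-envelope sketch. All inputs (2.6 GB/s bus, half taken by the CPU, 12 MB of assets, 30 fps) are the poster's own figures, not official specs:

```python
# Rough per-frame texture bandwidth budget, using the post's figures.
# These numbers are the poster's estimates, not official Nintendo specs.

def texture_refills_per_frame(bus_gb_s, cpu_share, asset_mb, fps):
    """How many times per frame the whole asset set could cross the bus."""
    gpu_bytes_per_sec = bus_gb_s * 1e9 * (1 - cpu_share)
    bytes_per_frame = gpu_bytes_per_sec / fps
    return bytes_per_frame / (asset_mb * 1e6)

refills = texture_refills_per_frame(2.6, 0.5, 12, 30)
print(f"~{refills:.1f} full asset-set transfers per frame")
```

The raw arithmetic gives roughly 3.6 transfers per frame; framebuffer, Z and vertex traffic eat into the same bus, which is presumably why the post lands on the lower figure of about two.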


This can be the reason for the 1T-SRAM integrated into the Hollywood chip. With this architecture they can use the bandwidth to the 1T-SRAM more effectively, move the CPU traffic and (possibly) the vertex traffic to the GDDR3, and (possibly) put a wider bus on the 1T-SRAM (128-256 bits wide?).

For me it makes sense. This architecture (vertex buffers and programs in main memory, a 128- or 256-bit bus to the 1T-SRAM) can make an "easy to develop" system with a "not that bad" result.
 
I spent some time with the GC SDK trying to figure out the concept behind the Wii architecture.

It is interesting. If you think about the GC architecture, it is easy to see that it is a bandwidth-limited system.

We have heard completely different things from developers. Developers often mentioned that not having to care about bandwidth or cache is one of the reasons why GC is easy to work with. Gekko had a lot of cache: 256KB of L2 doesn't seem like much by today's standards, but compared to Xbox (just 128KB) and especially PS2 (no L2 cache at all, small L1 cache) it is a lot. Flipper had a big texture cache and an internal framebuffer. Also, 1T-SRAM's real-world bandwidth is way closer to its peak bandwidth than regular DRAM's.

The 24 MB of 1T-SRAM has about 2.6 GB/s of bandwidth to Flipper, but half of it is used by the CPU. That means that if you are using 12 MB of graphics assets at 30 fps, each texture can only be moved to the texture buffer about twice per frame on average.

Often you don't need to move the whole texture to the GPU because only a part of it is visible. 1T-SRAM is especially helpful here.

That means you have to render not in Z order but in texture order, or you have to shrink the textures as much as possible (eliminate mipmaps, use the smallest textures possible, use compressed low-quality textures).

Eliminating mipmaps would be stupid. Often only the lower-resolution levels of the mipmap are used, or you only need a tiny part of the texture at the highest resolution. Mipmaps save bandwidth here. Rendering in texture order is also something PS2 developers have been doing for ages.
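The storage side of that trade-off is easy to make concrete: a full mip chain adds only about a third to a texture's footprint, in exchange for far fewer texel fetches on distant surfaces. A minimal sketch:

```python
# Total texels in a texture plus its full mipmap chain.
# Shows that the whole chain adds only ~33% to the base texture's size.

def mip_chain_texels(width, height):
    total = 0
    while width >= 1 and height >= 1:
        total += width * height
        width //= 2
        height //= 2
    return total

base = 256 * 256
full = mip_chain_texels(256, 256)
print(f"mip chain overhead: {full / base - 1:.1%}")  # ~33.3%
```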

This can be the reason for the 1T-SRAM integrated into the Hollywood chip. With this architecture they can use the bandwidth to the 1T-SRAM more effectively, move the CPU traffic and (possibly) the vertex traffic to the GDDR3, and (possibly) put a wider bus on the 1T-SRAM (128-256 bits wide?).
Well, even without making the pipe to the 1T-SRAM wider, the Wii gets much more bandwidth than the GC. The memory clock is higher than on GC, and you get a lot of additional bandwidth from the GDDR3.
 
It is 30%

Thanks

No doubt, but Flipper as seen in GC has 30% of the die occupied by eDRAM.

I think he means that Hollywood has enough space for 3 times the logic of Flipper as well as the eDRAM. Flipper should be somewhere around 28mm^2 on a 90nm process. The 3MB of 1T-SRAM should take up about 10mm^2 according to NEC, leaving around 18mm^2 for the 25 million logic transistors. Hollywood is 72mm^2; remove the memory and you have around 62mm^2 for logic, three and a half times as much as Flipper (87 million logic transistors?).

Obviously that is just speculation based on some facts and some estimates, so nobody jump on me for trying to paint this as 100% fact. But it does seem that there must be a lot of extra transistors in Hollywood in comparison to Flipper.
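The estimate above can be reproduced mechanically. Every input here (28mm^2 Flipper logic at 90nm, 10mm^2 for the 1T-SRAM, a 72mm^2 Hollywood die) is the poster's guesswork, not a measured figure:

```python
# Scale Flipper's assumed logic density up to Hollywood's remaining die
# area. All inputs are the post's estimates, not measurements.

def logic_estimate(die_mm2, sram_mm2, ref_logic_mm2, ref_mtransistors):
    """Apply a reference block's transistor density to the non-SRAM area."""
    logic_mm2 = die_mm2 - sram_mm2
    density = ref_mtransistors / ref_logic_mm2  # million transistors per mm^2
    return logic_mm2, logic_mm2 * density

area, mtrans = logic_estimate(die_mm2=72, sram_mm2=10,
                              ref_logic_mm2=18, ref_mtransistors=25)
print(f"{area} mm^2 of logic, ~{mtrans:.0f}M transistors")  # 62 mm^2, ~86M
```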
 
Bomlat, can you talk about what kind of (hardwired?) effects there are on the GC and how easy or hard it is to do them? I mean, we heard there are things like self-shadowing (via EMBM), but almost nobody uses it, and according to ERP it isn't easy to use in the real world. But we sometimes hear about others that we don't even know are true or not. It would be very nice to know what is really possible with Flipper.


I think he means that Hollywood has enough space for 3 times the logic of Flipper as well as the eDRAM. Flipper should be somewhere around 28mm^2 on a 90nm process. The 3MB of 1T-SRAM would take up about 10mm^2 according to NEC, leaving around 18mm^2 for the 25 million logic transistors. Hollywood is 72mm^2; remove the memory and you have around 62mm^2 for logic, three and a half times as much as Flipper (87 million logic transistors?).


Obviously that is just speculation based on some facts and some estimates, so nobody jump on me for trying to paint this as 100% fact. But it does seem that there must be a lot of extra transistors in Hollywood in comparison to Flipper.


Interesting. I was writing a post a different way, but it would have ended with the same result (depending on how optimistic you are about the process). Anyway...

Just in logic there is about 60mm^2 (plus 11mm^2 of eDRAM) in Hollywood, compared to about 70mm^2 of logic in Flipper on a 180nm process (assuming, IIRC, about 30-33% of Flipper's space is eDRAM, i.e. 30+mm^2, three times bigger). It is very interesting that in logic both chips have almost the same area, even though on this process Hollywood could afford at least 3.5x as many transistors (I guess that with such low power consumption, leakage isn't a big problem either). And since GPUs scale much better than CPUs (see the earlier example of G70 to G71, from 110nm to 90nm; that example alone would imply a lot more inside Hollywood, since it scales almost linearly), this can only mean good things. I would be surprised if there weren't at least 2.5x more transistors, which would mean about 65M transistors for logic (most of them probably used for graphics or, eventually, for some work that would otherwise be done on the CPU). IIRC, that is equal to or more than what the Xbox had for the XGPU+MCPX.

We always end up at the same question: what is using so many transistors? :D
 
Is it possible that Hollywood has more than 3MB of eDRAM? Say 5MB? Also, is it possible that the bus bandwidth between the 24MB eDRAM module and the GPU has been increased, similar to the X360 GPU?
 
We have heard completely different things from developers. Developers often mentioned that not having to care about bandwidth or cache is one of the reasons why GC is easy to work with. Gekko had a lot of cache: 256KB of L2 doesn't seem like much by today's standards, but compared to Xbox (just 128KB) and especially PS2 (no L2 cache at all, small L1 cache) it is a lot. Flipper had a big texture cache and an internal framebuffer. Also, 1T-SRAM's real-world bandwidth is way closer to its peak bandwidth than regular DRAM's.

The first Spyro does not show this. I think the issue it had was due to the bandwidth limitation between the cache and the main 1T-SRAM. It would be interesting to see whether the frame drops remain on the Wii or not.


Often you don't need to move the whole texture to the GPU because only a part of it is visible. 1T-SRAM is especially helpful here.
the point is that you can

Eliminating mipmaps would be stupid. Often only the lower-resolution levels of the mipmap are used, or you only need a tiny part of the texture at the highest resolution. Mipmaps save bandwidth here. Rendering in texture order is also something PS2 developers have been doing for ages.
Yes, the only issue is that you decrease the effectiveness of the early Z check.
And the PS2 has the same issue: the bandwidth between the GS and main memory is limited, so they have to give up part of the rendering speed for some bandwidth saving.
But you are right. The sampling effect that looked ugly in RE4 was in reality due to the lack of AA (the objects were composed of many polygons).

Well, even without making the pipe to the 1T-SRAM wider, the Wii gets much more bandwidth than the GC. The memory clock is higher than on GC, and you get a lot of additional bandwidth from the GDDR3.

OK, the point is: why did they put the 1T-SRAM into the core? There can be a few reasons:
- cheaper motherboard
- higher bandwidth between the 1T-SRAM and Hollywood (usable for AA or for render-to-vertex-buffer applications; and the higher the bandwidth, the easier it is to manage the cache)
- lower latency, which is good in any case


An interesting test would be the following:
1. Check the power consumption difference between the Wii and GC versions of Twilight Princess.
2. Check it with other programs.

Possible discovery: if there is a difference in power consumption, the system frequency has to be different.

3. Check Spyro on the Wii. If the clock speed is not decreased but you can still see some frame drops, the bandwidth increase is only due to the frequency. If the frequency is decreased on the Wii but the frame drops are smaller, or if the frequency is the same on the Wii but there are no frame drops, the bus width of the 1T-SRAM is increased.
 
Is it possible that Hollywood has more than 3MB of eDRAM? Say 5MB? Also, is it possible that the bus bandwidth between the 24MB eDRAM module and the GPU has been increased, similar to the X360 GPU?

Of course it is because:

Wii copies 360 with NEC eDRAM
Author: Wil Harris
Published: 20th June 2006
http://www.bit-tech.net/news/2006/06/20/Wii_copies_360_with_NEC_eDRAM/


The Wii will be out in November.

The Nintendo Wii will use the same NEC eDRAM for its graphics processing as the Xbox 360, NEC has revealed today.

10MB of fast RAM, embedded in the graphics chip, allows enough buffer space for anti-aliasing to be added to graphics 'for free'.

The same technology is in Xenos, the graphics chip for the Xbox 360. That chip was developed by ATI - so is the 'Hollywood' chip that powers the graphics inside the Wii.

Whilst some have been speculating that the Wii will lack the visual quality of the Xbox 360 and the PS3, this latest announcement seems to suggest that Nintendo is serious about graphics.

What's slightly odd is that the Wii is rumoured to lack high-definition outputs, which is really where anti-aliasing is needed - AA on standard definition isn't really a good use of hardware. Could this mean the eDRAM is being used for something else?

Games journalists at E3 were generally underwhelmed with the Wii's graphics, but this is because Nintendo had the games running on Revolution-ised Gamecubes with the Wiimote attachment.

&


NEC: We Do Wii RAM too!
By: Jared Black

KAWASAKI, Japan, SANTA CLARA, Calif., June 19, 2006, NEC Electronics today announced that Nintendo Co., Ltd. has selected NEC Electronics’ 90-nanometer (nm) CMOS-compatible embedded DRAM (eDRAM) technology for WiiTM, its innovative new video game console. Designed to provide advanced graphics functions for this new gaming platform, the new system LSI chips with eDRAM will be manufactured using advanced technologies on NEC Yamagata’s 300-millimeter (mm) production lines.

Embedded DRAM technology integrates DRAM on the same chip with logic circuits, and is viewed as an optimal solution for three-dimensional (3D) graphics acceleration systems and other applications that need to process high bandwidth data using low power. In the past, the integration of an eDRAM structure with a standard CMOS process proved challenging. NEC Electronics achieved its fully CMOS-compatible eDRAM technology by integrating a metal-insulator-metal 2 (MIM2) stacked DRAM capacitor on the company’s standard CMOS process.

NEC Electronics first introduced MIM2 technology on 90 nm eDRAM in 2005, and volume production started that same year. The technology requires a material with a high dielectric constant to be placed between two electrodes, and a large charge to be maintained on the capacitor with low leakage and a small cell size. NEC Electronics has achieved this by 1) using a MIM structure for the electrodes in the DRAM cell to achieve lower resistance values and higher data processing speeds, 2) using cobalt-silicide (CoSi) DRAM cell transistors to increase driving performance, and 3) using zirconium-oxide (ZrO2) in the capacitance layer (ahead of other vendors) to increase capacitance of the unit area. These and other breakthroughs have allowed NEC Electronics to develop eDRAM chips using its most advanced 90 nm process and to secure its roadmap to 55 nm eDRAM and beyond.

Available now with NEC Electronics’ unique 90 nm CMOS-compatible eDRAM process, the ASICs deliver significant advantages that promise to continue along the technological roadmap toward 55 nm processes and beyond.

NEC Electronics selected MoSys® as the DRAM macro design partner for the Wii devices because MoSys is experienced in implementing 1T-SRAM® macros on NEC Electronics’ eDRAM process. MoSys designed the circuits and layout of high-speed 1T-SRAM macros on NEC Electronics’ 90 nm CMOS-compatible eDRAM technology.

The adoption of these eDRAM ASICs for Nintendo’s new game console is a clear vote of confidence in NEC Electronics’ eDRAM technology. The company is committed to delivering more value-added features to spur the adoption of its CMOS-compatible eDRAM solutions across a wide range of applications.

More information about NEC Electronics’ eDRAM can be found at http://www.necel.com/process/en/edram.html.

Posted: 06/19/2006 10:37:22 PST
http://wii.vggen.com/news/news.php?id=1421
 
My guess would be mask ROM with the BIOS. With the mask ROM on the same multi-chip package, it is pretty much impossible to solder a modchip or something similar to it. On the other hand, it is still easy to change the BIOS if an update is needed, because you don't have to change the main die.
If it isn't mask ROM: maybe they didn't integrate the audio DSP on the same die, and that tiny die is the DSP?

I don't think it's the sound DSP since there would be no reason to have it on a separate die. It might be some sort of security logic from BroadON or it could be communication/arbitration logic between the 24MB eDRAM module and GPU. Or maybe it's a tiny PPU coprocessor.:oops:
 
Having looked at many PROMs in old electronics (yeah, I'm a geezer that likes old stuff, so sue me :cool:), I can say they always have that same shiny uniform appearance you see on the little extra chip.

I think it's some sort of ROM; whether flash or something else, who can say. It's covered in epoxy in any case, so whatever it is, it's not going to make a lot of heat.

Peace.
 
Having looked at many PROMs in old electronics (yeah, I'm a geezer that likes old stuff, so sue me :cool:), I can say they always have that same shiny uniform appearance you see on the little extra chip.

I think it's some sort of ROM; whether flash or something else, who can say. It's covered in epoxy in any case, so whatever it is, it's not going to make a lot of heat.

Peace.

Mask ROM by Macronix?
 
MDX, I really wouldn't take any notice of Bit-Tech's article; they don't seem to have much of a clue what they're talking about in this case. Just look at the title of the article, "Wii copies 360 with NEC eDRAM". That alone shows their ignorance, and then there's this:

bittech said:
What's slightly odd is that the Wii is rumoured to lack high-definition outputs, which is really where anti-aliasing is needed - AA on standard definition isn't really a good use of hardware.

I'm speechless :LOL:
 
MDX, I really wouldn't take any notice of Bit-Tech's article; they don't seem to have much of a clue what they're talking about in this case. Just look at the title of the article, "Wii copies 360 with NEC eDRAM". That alone shows their ignorance, and then there's this:

You've got a good point, because the NEC press release doesn't state how many MB were put in.
So I'm being cautiously optimistic :neutral:
But that said, funnily enough, an earlier rumor that was actually discussed in this same thread made me start to wonder if there was something to the Wii having more than 3MB of eDRAM available:

Remember this:
http://www.beyond3d.com/forum/showthread.php?t=30935&page=7

EN: As we saw in many current-gen titles, 480p can result in significant jaggies. Can we expect AA and AF from Ubisoft's Wii titles?
XP: Red steel will have 4x Antialiasing and 8x Anisotropic Filtering.

EN: Could you possibly tell us anything specific about the Wii GPU?
XP: I can tell you that it has double the number of pixel pipelines (of gamecube), and that it processes physics. It really takes a huge load off of the cpu.

EN: Is the T&L setup fixed function like it was on the gamecube?
XP: No, fully programmable.

EN: How much Edram does it have?
XP: 2MB for the framebuffer 2MB for the Zbuffer and 4MB for texture cache. Unfortunately I am out of time, id like to come back for another interview sometime.
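For what it's worth, the 2MB framebuffer / 2MB Z split quoted above is at least internally plausible for 480p. A quick sketch, assuming 32-bit color and 24-bit Z (illustrative choices on my part, not confirmed formats):

```python
# Does a single 640x480 frame fit in the quoted 2 MB buffers?
# 32-bit color and 24-bit Z are assumptions for illustration only.

def buffer_bytes(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel

color = buffer_bytes(640, 480, 4)  # 1,228,800 bytes
z = buffer_bytes(640, 480, 3)      # 921,600 bytes
limit = 2 * 2**20                  # 2 MB
print(color <= limit, z <= limit)  # True True
```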


Ok, Mr. Poix stated about eight MB in total. Mr. Poix, I found out, is a real person working for Ubisoft:

That game, a favorite of Mr. Jackson's, was directed and designed by Michel Ancel of Ubisoft, who is also overseeing the making of the King Kong video game. In Montpellier, Mr. Ancel works with a staff of 30 game designers, animators and programmers. For King Kong, the staff swelled to 80, said Xavier Poix, a studio director and producer of the King Kong game, which is budgeted at more than $20 million.

The more expensive games can cost up to $25 million to make. But if successful, they can turn into big money-makers: Halo and its sequel sold 135 million copies and generated sales of $600 million in North America. Ubisoft did not disclose financial details for King Kong, but Mr. Jackson will receive a percentage of the game's profits, an unusual arrangement for a movie director.
In April 2004, Mr. Poix, Mr. Ancel and a team of game designers and cinematographers flew to New Zealand to meet Mr. Jackson.
http://research.techkwondo.com/files/kingkong_blurs_line_between_films_and_games.pdf

So, I don't know if the interview was real or has been officially debunked, but there were several things Mr. Poix said in the interview that sounded like they could be true.
For instance:

EN: Will Sam Fisher be making a return to Nintendo's home console?
XP: At some point.

EN: Speaking of Prince of Persia, will we be seeing more acrobatic platforming adventures on Wii?
XP: No comment.

EN: Will Rayman on the Wii be a port of the current-gen game with new controls, or will it have new art assets for Wii?
XP: Its not a port, other than that i cannot say.

This supposed interview took place around early June, and the press announcement by NEC was mid June.

Sure, this is all circumstantial, but at the end of the day, wouldn't it just be probable that the Wii has more eDRAM than the GameCube originally had?
 
Yeah, I think most here will remember the Ubisoft interview saga. Unfortunately it was made up by some attention-seeking idiot. First Ubisoft denied the interview, which in itself didn't prove anything, but then the source of the interview disappeared. Also, some of the things said were definitely false: Red Steel has no anti-aliasing or anisotropic filtering, and no way is Broadway as powerful as an Athlon XP 2500. I wish the interview had been true, but unfortunately it wasn't :(

Sure, this is all circumstantial, but at the end of the day, wouldn't it just be probable that the Wii has more eDRAM than the GameCube originally had?

I'd definitely consider that a possibility if not for one thing: no games so far seem to have anti-aliasing. If Wii had enough eDRAM to perform anti-aliasing, then that kind of thing could be added easily even to a game initially designed on GC hardware. That's why I think the extra die space is being taken up only by extra logic transistors.

It would be great if Wii had, say, 6MB of eDRAM, because that should only take about 20mm^2, which would still leave enough die space to potentially have 2-3 times the amount of logic as Flipper. But I just don't see it, unfortunately.
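The area trade-off can be sketched numerically, assuming (as the earlier posts do) that NEC's 90nm 1T-SRAM scales roughly linearly from 10mm^2 for 3MB, on a 72mm^2 Hollywood die:

```python
# Trade-off between eDRAM capacity and die area left for logic.
# The 3 MB -> 10 mm^2 density and 72 mm^2 die are the thread's estimates.

def edram_area_mm2(capacity_mb, ref_mb=3, ref_mm2=10):
    """Linear scaling of 1T-SRAM area with capacity."""
    return capacity_mb / ref_mb * ref_mm2

DIE_MM2 = 72
for mb in (3, 6):
    mem = edram_area_mm2(mb)
    print(f"{mb} MB -> {mem:.0f} mm^2 memory, {DIE_MM2 - mem:.0f} mm^2 for logic")
```

With 6MB the logic budget drops from about 62mm^2 to about 52mm^2, still nearly three Flippers' worth of logic area, which matches the 2-3x figure in the post.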
 
It would be great if Wii had, say, 6MB of eDRAM, because that should only take about 20mm^2, which would still leave enough die space to potentially have 2-3 times the amount of logic as Flipper. But I just don't see it, unfortunately.

Nintendo said it doesn't have more eDRAM than the GC. Otherwise it would have to be capable of 720p.
 