WiiGeePeeYou (Hollywood) what IS it?

pc999 said:
BTW I finally got the chance to see this again (still very bad quality); does anyone think this can be real time (the driving parts)? I think they can/will possibly be.

http://video.google.com/videoplay?docid=-4835745581631650859&q=Disaster:+day+of+crisis


I think it could possibly be realtime, after seeing Biohazard 4 / Resident Evil 4 cutscenes on Gamecube.


I'm not saying that the 'Disaster: Day of Crisis' teaser was realtime, I'm just saying it's possible.

The close-up of the spiky-haired guy could've been pre-rendered or realtime.
 
News:

http://www.planetgamecube.com/newsArt.cfm?artid=11735
MoSys' 1T-SRAM(R) Embedded Memory Technology Enables Nintendo's Next Leap in Video Games; High Performance, High Density 1T-SRAM Powers Upcoming Wii Home Game Console


http://www.planetgamecube.com/newsArt.cfm?artid=11736
Nintendo's New Wii(TM) Video Game Console Uses NEC Electronics' Embedded DRAM
The console's CMOS-compatible embedded DRAM will complement the 1T-SRAM from MoSys, also announced today.

KAWASAKI, Japan, SANTA CLARA, Calif., June 19, 2006

NEC Electronics today announced that Nintendo Co., Ltd. has selected NEC Electronics' 90-nanometer (nm) CMOS-compatible embedded DRAM (eDRAM) technology for Wii(TM), its innovative new video game console. Designed to provide advanced graphics functions for this new gaming platform, the new system LSI chips with eDRAM will be manufactured using advanced technologies on NEC Yamagata's 300-millimeter (mm) production lines.

Embedded DRAM technology integrates DRAM on the same chip with logic circuits, and is viewed as an optimal solution for three-dimensional (3D) graphics acceleration systems and other applications that need to process high bandwidth data using low power. In the past, the integration of an eDRAM structure with a standard CMOS process proved challenging. NEC Electronics achieved its fully CMOS-compatible eDRAM technology by integrating a metal-insulator-metal 2 (MIM2) stacked DRAM capacitor on the company's standard CMOS process.

NEC Electronics first introduced MIM2 technology on 90 nm eDRAM in 2005, and volume production started that same year. The technology requires a material with a high dielectric constant to be placed between two electrodes, and a large charge to be maintained on the capacitor with low leakage and a small cell size. NEC Electronics has achieved this by 1) using a MIM structure for the electrodes in the DRAM cell to achieve lower resistance values and higher data processing speeds, 2) using cobalt-silicide (CoSi) DRAM cell transistors to increase driving performance, and 3) using zirconium-oxide (ZrO2) in the capacitance layer (ahead of other vendors) to increase capacitance of the unit area. These and other breakthroughs have allowed NEC Electronics to develop eDRAM chips using its most advanced 90 nm process and to secure its roadmap to 55 nm eDRAM and beyond.

Available now with NEC Electronics' unique 90 nm CMOS-compatible eDRAM process, the ASICs deliver significant advantages that promise to continue along the technological roadmap toward 55 nm processes and beyond.
 
hm... Why jump straight to 55nm? Is it just a quirk in their manufacturing technology (i.e. they could do 45nm, but it's not quite mature yet, so they'll back off a bit and do 55nm with a relatively minor tweak)?
 
I don't think they'll jump straight to 55 nm. It's just that right now they have transitions to 55 nm on their roadmap. So it'll probably go from 90 nm to 70 nm to 65 nm and 55 nm, and from there, continue to shrink.
 
http://gear.ign.com/articles/713/713254p1.html

The announcements confirm IGN's previous reports that the Wii would make use of 1T-SRAM both in an embedded and individual application. Our most up-to-date specs promise 16MB of eDRAM (integrated in NEC's LSI chips) and 88MB of 1T-SRAM (the "additional external memory chip"), for a total of 104MB of system RAM, not counting the allegedly accessible 512MB of Flash RAM or the ATI Hollywood GPU's on-board memory, which is said to amount to 3MB.

This thing is really badly written, and it proves false the Jessica article that said it only has 8MB of eDRAM (joking :LOL:)

Although those 16MB of eDRAM separated from the other 3 could make some sense if there is more than just a GPU in there (e.g. physics)?
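A quick Python sketch of how IGN's rumoured pools add up (every figure here is just the rumour quoted above, nothing confirmed):

# Rough sanity check of IGN's rumoured Wii memory pools.
pools_mb = {
    "eDRAM on NEC's LSI": 16,
    "external 1T-SRAM": 88,
}
print(sum(pools_mb.values()))  # -> 104, IGN's "total system RAM" figure
# IGN excludes the 512MB of Flash and the rumoured 3MB on the GPU itself.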
 
Now waiiiiit just a minute. I thought the GPU was on an LSI. Or have I gotten completely hosed by acronyms again? What's the point of this eDRAM if it's not hanging out on the same die as the GPU? What the heck is it embedded in? I think IGN's just plain confused, and this eDRAM is in fact one of the Wii's improvements on the Cube's design. It also bespeaks a low logic transistor count on the GPU, since Microsoft had to have a 2nd die--er, "daughter die"--just to get only 10 MB of eDRAM. I don't pay attention to any of IGN's commentary on the news they receive, because I don't believe their Nintendo editors understand technology in the slightest. For example, on Matt's blog, he was wondering why the speakers for his home theater system aren't wireless.

Anyway, 16 MB sounds like overkill to me unless they've got a much larger texture cache this time around.
 
Yes, I did think it was odd that you would still need additional eDRAM on the GPU if you already had 16MB of eDRAM. That amount of RAM would be pointless to use as system RAM, and probably wasted as a sort of L3 cache.
The figure seems very high even for the GPU that is rumoured to power the Wii.
Some clarification is definitely needed.
 
The announcements confirm IGN's previous reports that the Wii would make use of 1T-SRAM both in an embedded and individual application. Our most up-to-date specs promise 16MB of eDRAM (integrated in NEC's LSI chips) and 88MB of 1T-SRAM (the "additional external memory chip"), for a total of 104MB of system RAM, not counting the allegedly accessible 512MB of Flash RAM or the ATI Hollywood GPU's on-board memory, which is said to amount to 3MB.

Have they any dignity?

First they talked about 88MB of 1T-SRAM and 16MB of A-RAM; after that they said 3MB of eDRAM and 88MB of 1T-SRAM, and the A-RAM disappeared; now they are talking about 16MB of internal RAM in the GPU when they never mentioned it before, and when they had described 24MB of internal RAM as a type of external RAM.

And everyone knew that 1T-SRAM was going to be the RAM in the Wii before IGN said anything about their supposed specs.
 
IGN wrote:

Our most up-to-date specs promise 16MB of eDRAM (integrated in NEC's LSI chips)

16 MB of 1T-SRAM on the Hollywood LSI, huh? Laughing my ass off. They said basically the SAME thing about Flipper in 1999, that it would have 8 to 16 MB of embedded memory.

May 21, 1999 IGN wrote:
http://cube.ign.com/articles/068/068223p1.html
According to NEC's associate vice president, systems integration, Junshi Yamaguchi, the Nintendo graphics chipset will use as low as 8MBs and as high as 16MBs of on-board embedded DRAM.

I will believe it when I know it's true. I don't trust IGN, or blogs, or most websites anymore.
(Beyond3D is still good though!)
 
Just to point out that Moore's Law would give 12MB of eDRAM, and since 1T-SRAM-Q is twice as dense, up to 24MB of eDRAM would be possible at the same price as Flipper's.

Although I find this very, very hard to believe IMO.
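The back-of-the-envelope version of that, purely as a sketch (the number of density doublings and the 2x 1T-SRAM-Q figure are assumptions for illustration, not known specs):

# Rough scaling from Flipper's embedded memory.
flipper_embedded_mb = 3    # Flipper's embedded 1T-SRAM
density_doublings = 2      # assume roughly two density doublings getting to 90nm
same_cost_mb = flipper_embedded_mb * 2 ** density_doublings
print(same_cost_mb)        # -> 12MB at roughly the same die area/cost
print(same_cost_mb * 2)    # -> 24MB if 1T-SRAM-Q really is twice as dense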
 
I'll sum up how IGN Nintendo's reporting has been on Wii:

1. Get some kind of "fact" from developers or manufacturers.
2. Completely ignore any relevant information that would inform what that piece of information might actually mean.
3. Pull factoids together using a process of collective ignorance.
4. Write a news article.

Moral of the story: never trust Mac users to tell you anything about technology.
 
fearsomepirate said:
I'll sum up how IGN Nintendo's reporting has been on Wii:

1. Get some kind of "fact" from developers or manufacturers.
2. Completely ignore any relevant information that would inform what that piece of information might actually mean.
3. Pull factoids together using a process of collective ignorance.
4. Write a news article.

Moral of the story: never trust Mac users to tell you anything about technology.

I am going to cry, I am a Mac user.

(Urian Crying)

You have hurt my heart, I want to die, *sob, *sob.
 
Perhaps the memory is divided as such for Gamecube compatibility. Nintendo wants lots of on-chip memory for Hollywood, so they include Flipper's 3 MB, then add to that what used to be Gamecube's A-RAM.

What I don't understand is why we would be seeing 88 MB of 1T-SRAM. It makes sense that it is Gamecube's 24 MB of memory, plus another 64 MB. But if you're going to do that, why not just put two 64 MB modules in and call it good? For backwards compatibility you just hide the extra 104 MB from applications.
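Spelling out the arithmetic behind that (the 24MB + 64MB split and the two-module layout are just my reading of the rumour, not a known memory map):

# Two hypothetical ways to reach the rumoured capacity.
gc_1t_sram_mb = 24                    # Gamecube's original main 1T-SRAM
rumoured_wii_mb = gc_1t_sram_mb + 64  # -> 88, matching the IGN figure
two_modules_mb = 2 * 64               # -> 128 with two off-the-shelf 64MB modules
hidden_in_gc_mode_mb = two_modules_mb - gc_1t_sram_mb
print(rumoured_wii_mb, two_modules_mb, hidden_in_gc_mode_mb)  # 88 128 104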
 
What on earth would be the point of that kind of memory configuration? I mean, why have 16MB of memory embedded in the GPU if it's no faster than the 88MB of external memory? What a total waste of money. And a separate, faster 3MB also embedded in the GPU? I think IGN have confused themselves yet again.
 
Teasy said:
What on earth would be the point of that kind of memory configuration? I mean, why have 16MB of memory embedded in the GPU if it's no faster than the 88MB of external memory? What a total waste of money. And a separate, faster 3MB also embedded in the GPU? I think IGN have confused themselves yet again.

Not sure if he is talking about BW or just latency; the first doesn't make sense but the second does.

Still a very strange configuration, unless it is for other functions (not the buffer or texture cache), like geometry, or unless it can be accessed from any part of the GPU. But would it really be useful for that kind of thing? The other thing I can think of is other HW that is also integrated into the GPU (e.g., like I said, physics).

What else would make this make any sense?
 
Yeah, well...I've not seen her in quite a while and I've been online all day for the past coupla days. Sorry. :cry:

Anyway, while we're on the topic of what the GC can/can't do, I wanted to mention that a reasonable number of Gamecube exclusives had a nifty depth-of-field effect. And honestly, the only reason I'm not a Mac user is money.
 