A Summary of the Huge Wii Thread

Status
Not open for further replies.
It's their newest low-power, high-density 90nm eDRAM process, so it's not 1T-SRAM-R.

http://www.necel.com/process/en/edramoptions.html



First of all, what does the 24MB of 1T-SRAM have to do with anything? Second, according to NEC's numbers, Hollywood is a straight die shrink/clock increase, since 30% of the die is used for the 3MB of eDRAM. This is the low-power, higher-density eDRAM process we are talking about too, not their low-density, high-performance one. Believe what you want to believe, but when Wii games don't show significant improvement over GC, don't pretend you didn't hear it here first. ;)


Sonic and the Secret Rings shows a significant improvement over the GC's visuals. Try playing it.
 
Again. Play it on your own TV. It looks better in person than it does in any video available online. You'll be impressed.

Yes, it's an improvement, but not a significant one. Nothing an overclocked Flipper couldn't do. ;)

Again I sense that ERP wants the speculation to stop because in the end it's all nonsense and he knows it. He just can't say anything because of NDAs.
 
It's their newest low-power, high-density 90nm eDRAM process, so it's not 1T-SRAM-R.

http://www.necel.com/process/en/edramoptions.html

Look at the table and you can see the cell-size comparison of the NED3 process vs the UX6D process.

http://www.necel.com/process/en/edramprocess.html



First of all, what does the 24MB of 1T-SRAM have to do with anything? Second, according to NEC's numbers, Hollywood is a straight die shrink/clock increase, since 30% of the die is used for the 3MB of eDRAM. This is the low-power, higher-density eDRAM process we are talking about too, not their low-density, high-performance one. Believe what you want to believe, but when Wii games don't show significant improvement over GC, don't pretend you didn't hear it here first. ;)
Did you bother reading your own links?
NEC said:
A 15 x 15mm die can incorporate as much as 256 Mb, for example, assuming that the eDRAM occupies half the chip's area.
When you do the math (32 MB in half of 225 mm^2, i.e. 112.5 mm^2), you find that each 1 MB of eDRAM occupies roughly 3.5 mm^2. So your own evidence points to 15% of Hollywood's die being the 3 MB of eDRAM.
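A quick sketch of that arithmetic, using only the numbers from NEC's quoted example:

```python
# NEC's stated example: a 15 x 15 mm die holds 256 Mb (32 MB) of eDRAM,
# assuming the eDRAM occupies half the chip's area.
die_mm2 = 15 * 15          # 225 mm^2 total die area
edram_mm2 = die_mm2 / 2    # 112.5 mm^2 for the 32 MB of eDRAM
per_mb = edram_mm2 / 32    # ~3.5 mm^2 per MB
three_mb = 3 * per_mb      # ~10.5 mm^2 for a Flipper-sized 3 MB
print(per_mb, three_mb)
```

Against a roughly 70 mm^2 Hollywood die (the size implied elsewhere in this thread, not an official figure), 10.5 mm^2 is about 15%.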
 
Guys, friendly mod advice:

The deus-ex-game / screenshot argumentation has got to stop, or else this will turn into WeeGeePeeU part 2. Don't replace logic and arguments with screenshots or game titles.
 
According to NEC, at 90nm 32MB of eDRAM takes up 225mm^2. That means for every 1MB of eDRAM you'd need roughly 7mm^2. There's no way you could fit 18MB of eDRAM into 70mm^2 plus Flipper logic. Actually, 3MB of eDRAM takes up 21mm^2 at 90nm, which is exactly 30% of the area of Hollywood, which means it IS indeed an overclocked Flipper. :eek:

So there you have it, I guess we can stop the nonsense now. Hollywood IS a simple die shrink/clock increase. :mad:

No, according to NEC 32MB of eDRAM takes up 112.5mm^2 on a 90nm process (roughly 3.5mm^2 per 1MB). But even if you hadn't been wrong, that still wouldn't have meant that Hollywood was simply an overclocked Flipper, since you'd still have around 51mm^2 of space for the rest of the chip, which is still well over twice the size that Flipper minus eDRAM should be at 90nm. As it is, 3MB of NEC's eDRAM takes up 10.5mm^2 according to NEC (15% of Hollywood's die), leaving 62mm^2 for the rest of the chip.
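The two competing density readings can be put side by side. The ~72.5 mm^2 Hollywood die size below is an assumption reverse-engineered from this thread's own 51/62 mm^2 figures, not an official number:

```python
# Compare the two readings of NEC's 32 MB, 90nm eDRAM example:
#   225.0 mm^2 -> mistaken reading (eDRAM fills the whole example die)
#   112.5 mm^2 -> NEC's actual figure (eDRAM fills half the die)
hollywood_mm2 = 72.5  # Hollywood die size implied by this thread (assumption)
logic_left = {}
for total_mm2 in (225.0, 112.5):
    per_mb = total_mm2 / 32
    logic_left[total_mm2] = hollywood_mm2 - 3 * per_mb  # die minus 3 MB eDRAM
print(logic_left)  # roughly {225.0: 51.4, 112.5: 62.0}
```

Either way, tens of mm^2 remain for logic, which is the point of the rebuttal: even the wrong density figure leaves far more room than a shrunk Flipper needs.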
 
Haha..ok I guess that means my theory that it's simply a die shrink/overclock with more eDRAM is still a valid possibility. :p
 
Haha..ok I guess that means my theory that it's simply a die shrink/overclock with more eDRAM is still a valid possibility. :p

How much eDRAM do you suggest the Wii has?!? And what would be the point of including so much?
I think it's way more likely that the large difference comes from something else. Your math was way off base in your previous statement, so why should this massive-eDRAM prediction hold any more water?

edit: I'm asking not bitching, though it sure looks like bitching :smile:
 
This is what we should know about Hollywood based on all the research gathered.

- It is not an overclocked Flipper (it's too big)
- It is custom-built for the Wii (says ATI)
- It is in between the Xbox's and the 360's GPUs in performance. Probably right in the middle.
- The 24MB of the GC's main memory is now located on the GPU's die.
- There is eDRAM on Hollywood, but no one is sure how much (could be more than the 3MB on Flipper)
- There are 2 or more dies on Hollywood (Napa and Vegas)
- The GC's "ARAM" seems to have been replaced by 24MB of 1T-SRAM.
- Hollywood was made on the 90nm process
- Most likely clocked at 243MHz
 
- It is in between the Xbox's and the 360's GPUs in performance. Probably right in the middle.
There is no way that Hollywood's fillrate, vertex, and shader performance are at the midpoint between Xbox and X360. Even a doubled-up Flipper at the known clockspeed would only be equivalent to the XGPU in fillrate and still have weaker shaders.
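That fillrate claim can be sanity-checked against commonly cited (unofficial) specs. All the pipe, TMU, and clock numbers below are assumptions drawn from public spec sheets, not from this thread:

```python
# Peak texel fillrate = pipes x TMUs-per-pipe x clock (MHz) -> Mtexels/s.
# All figures are commonly cited, unofficial numbers (assumptions).
def mtexels(pipes, tmus_per_pipe, mhz):
    return pipes * tmus_per_pipe * mhz

flipper         = mtexels(4, 1, 162)   # GameCube Flipper: 648
doubled_flipper = mtexels(8, 1, 243)   # hypothetical 2x Flipper at Wii clock: 1944
nv2a            = mtexels(4, 2, 233)   # Xbox NV2A: 1864
xenos           = mtexels(16, 1, 500)  # X360 Xenos (simplified): 8000
midpoint        = (nv2a + xenos) / 2   # 4932, far beyond a doubled Flipper
print(doubled_flipper, nv2a, midpoint)
```

Under these assumptions a doubled Flipper at 243MHz (~1944 Mtexels/s) lands roughly at the Xbox's texel fill, nowhere near the Xbox/X360 midpoint.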
The 24MB of the GC's main memory is now located on the GPU's die.
I thought it was on a separate die that lived under the same heat spreader. Was I wrong on this?
The GC's "ARAM" seems to have been replaced by 24MB of 1T-SRAM.
No, the ARAM was replaced by 64 MB of GDDR3. The 24 MB of 1T-SRAM on Wii "replaces," uh, the 24 MB of 1T-SRAM on the GC.
 
There is no way that Hollywood's fillrate, vertex, and shader performance are at the midpoint between Xbox and X360. Even a doubled-up Flipper at the known clockspeed would only be equivalent to the XGPU in fillrate and still have weaker shaders.

I thought it was on a separate die that lived under the same heat spreader. Was I wrong on this?
No, the ARAM was replaced by 64 MB of GDDR3. The 24 MB of 1T-SRAM on Wii "replaces," uh, the 24 MB of 1T-SRAM on the GC.

I guess you could see it that way, but that brings me to ask: how many developers are using the extra 64MB? If many of them are used to developing for the GameCube, then maybe they are relying on the 3MB eDRAM + 24MB 1T-SRAM setup and not venturing past what is familiar to them. So basically they are seeing a GameCube with a better GPU.

But anyway, I don't think you can compare the ARAM to the GDDR3:

In addition to its 24MB of system RAM, the GameCube contains 16MB that is normally used to hold audio data buffers. The CPU cannot address the audio RAM directly, so this memory cannot be used like normal system memory, but it is possible to use the audio RAM as a swap device. (from wiki)

So as far as I understand it, the GDDR3 is the main RAM of the Wii, just as the 1T-SRAM was the main RAM for the Cube, and it can be accessed by both the CPU and GPU.

ARAM wasn't replaced, it was just gotten rid of, unless you want to say it was replaced by the flash memory.
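The pools being argued over can be tallied up. The layout below is a hedged summary of this thread's descriptions, not an official memory map:

```python
# Memory pools in MB, per the descriptions in this thread (assumptions)
gamecube = {
    "1T-SRAM main memory": 24,
    "ARAM (audio, not CPU-addressable)": 16,
    "eDRAM on Flipper": 3,
}
wii = {
    "1T-SRAM main memory": 24,
    "GDDR3 (general-purpose, CPU- and GPU-accessible)": 64,
    "eDRAM on Hollywood": 3,
}
print(sum(gamecube.values()), sum(wii.values()))  # 43 vs 91 MB total
```

The disagreement is essentially about labels: whether the GDDR3 "replaces" the ARAM slot or is simply a new general-purpose pool alongside the familiar 24MB + 3MB setup.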
 
How much eDRAM do you suggest the Wii has?!? And what would be the point of including so much?
I think it's way more likely that the large difference comes from something else. Your math was way off base in your previous statement, so why should this massive-eDRAM prediction hold any more water?

edit: I'm asking not bitching, though it sure looks like bitching :smile:

Going from 3MB to 6MB is not massive. ;)

Ok... but did they need 3-4 years to do it?

That's where your theory goes in the toilet.

And how do you know ATI wasn't developing something else that was scrapped or saved for a future console after Wii? BTW, how do you know they "needed" 3-4 years? Maybe they wanted to wait and launch the console at a strategic time? Plenty of possibilities. Why are you so obsessed with some magical untapped transistors in Hollywood, using a Sonic game to futilely try to push your point?

It also doesn't explain why devs didn't get Hollywood sooner rather than later in early devkits.

There could be many reasons why Hollywood was late. You choose to believe it was some difficult-to-add "magic logic". My reason is this: it's an overclocked Flipper, so there was no "need" to get final GPUs to developers using GC devkits... make sense? Look at the games, do they look like they were developed on GPUs with "magic logic"?

IMO, at the beginning of development of the Wii console, Nintendo planned on having a powerful console, but somewhere along the line they felt that competing with Sony and MS didn't make business sense, so they basically just overclocked the CPU/GPU.
 
Well, an X1600 is about 60% of Hollywood in die area, according to my awful hasty icky measurement. Broadway looks to be about 50% the size of a single core 90nm A64.

Just fun observations from within paint shop pro.
 
There could be many reasons why Hollywood was late. You choose to believe it was some difficult-to-add "magic logic". My reason is this: it's an overclocked Flipper, so there was no "need" to get final GPUs to developers using GC devkits... make sense? Look at the games, do they look like they were developed on GPUs with "magic logic"?

IMO, at the beginning of development of the Wii console, Nintendo planned on having a powerful console, but somewhere along the line they felt that competing with Sony and MS didn't make business sense, so they basically just overclocked the CPU/GPU.

What is this magic you speak of? Hollywood is architecturally based on Flipper. Using Flipper in early devkits is a no-brainer if your final hardware isn't ready. Adding an additional TEV unit and pixel pipelines is one of many reasons.

I'm done though; this is a rerun that's not worth it. Especially when "magic logic" enters the thread. What a way to derail a thread.
 
Using Flipper in early devkits is a no-brainer if your final hardware isn't ready. Adding an additional TEV unit and pixel pipelines is one of many reasons.

That's contrary to the evidence that you are using to support your own theory. Why would it be late from adding some simple TEV units and pixel pipelines? According to your evidence it was worked on for 3-4 years, so you're saying ATI couldn't do this in 3-4 years? This isn't "magic logic" that they're adding, so why did it take them so long?
 