WiiGeePeeYou (Hollywood) what IS it ?

INKster said:
The Xbox 360 also has GDDR3 as main memory.

So far Nintendo has announced all their partners and what they would provide. There's been zero mention of GDDR3. So that sounds really bogus to me.
 
http://www.maxconsole.net/?mode=news&newsid=8802

Even more BS? Or is it?


*UPDATE* The dev mailed us and said he was shocked to see all these cries of 'fake', so he provided some more info to show he's not bluffing...

Broadway CPU

Broadway is Wii's CPU. Broadway functionality and specifications are as follows.

• Operating speed: 729 MHz
• Bus to main memory: 243 MHz, 64 bits (maximum bandwidth: 1.9 gigabytes/sec)
• 32-kilobyte 8-way set-associative L1 instruction cache
• 32-kilobyte 8-way set-associative L1 data cache (can set up 16-kilobyte data scratch pad)
• Superscalar microprocessor with six execution units (floating-point unit, branching unit, system register unit, load/store unit, two integer units)
• DMA unit (15-entry DMA request queue) used by 16-kilobyte data scratch pad
• Write-gather buffer for writing graphics command lists to the graphics chip
• Onboard 256-kilobyte 2-way set-associative L2 integrated cache
• Two 32-bit integer units (IU)
• One floating point unit (FPU) (supports single precision (32-bit) and double precision (64-bit))
• The FPU supports paired single floating point (FP/PS)
• The FPU supports paired single multiply add (ps_madd). Most FP/PS instructions can be issued in each cycle and completed in three cycles.
• Fixed-point to floating-point conversion can be performed at the same time as FPU register load and store, with no loss in performance.
• The branch unit supports static branch prediction and dynamic branch prediction.
• When an instruction is stalled on data, the next instruction can be issued and executed. All instructions maintain program logic and will complete in the correct program order.
• Supports three L2 cache fetch modes: 32-Byte, 64-Byte, and 128-Byte.
• Supports these bus pipeline depth levels: level 2, level 3, and level 4.
Reference Information: Broadway is upward compatible with Nintendo GameCube’s CPU (Gekko).
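As a quick sanity check on the bus figure above, the quoted maximum follows directly from the clock and bus width (a rough sketch; it assumes one 64-bit transfer per clock and the decimal-GB convention the document appears to use):

```python
# Sanity check: does 243 MHz x 64 bits give the quoted 1.9 GB/s?
bus_clock_hz = 243e6        # 243 MHz bus to main memory
bus_width_bytes = 64 // 8   # 64-bit bus

bandwidth_gbs = bus_clock_hz * bus_width_bytes / 1e9
print(bandwidth_gbs)  # 1.944 -> matches the "maximum bandwidth: 1.9 GB/s" claim
```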

Hollywood GPU

Hollywood is a system LSI composed of a GPU and internal main memory (MEM1). Hollywood is clocked at 243 MHz. Its internal memory consists of 3 megabytes of embedded graphics memory and 24 megabytes of high speed main memory.

Hollywood includes the following.
• Graphics processing unit (with 3 megabytes of eDRAM)
• Audio DSP
• I/O Bridge
• 24 megabytes of internal main memory
• Internal main memory operates at 486 MHz.
Maximum bandwidth between Hollywood and internal main memory: 3.9 gigabytes per second
• Possible to locate a program here
Reference Information: Hollywood is similar to Nintendo GameCube’s Flipper and Splash components.

External Main Memory (MEM2)

Wii uses 64 megabytes of GDDR3 (MEM2) as external main memory. Like internal main memory, MEM2 can be accessed directly from Broadway and the GPU at high speed and has a peak bandwidth of 4 gigabytes/sec. Programs can also be placed in MEM2.

Reference Information: Nintendo GameCube ARAM is used as auxiliary memory for the DSP. The CPU and GPU did not have direct access to it.
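Taken together, the document describes this memory map (a summary sketch; the pool labels follow the document, but the totals are my own tally):

```python
# Memory pools as described in the alleged leak.
pools = {
    "MEM1 (1T-SRAM, internal)": {"size_mb": 24, "bw_gb_s": 3.9},
    "MEM2 (GDDR3, external)":   {"size_mb": 64, "bw_gb_s": 4.0},
    "eDRAM (graphics)":         {"size_mb": 3,  "bw_gb_s": None},  # bandwidth not stated
}

general_purpose_mb = sum(
    p["size_mb"] for name, p in pools.items() if "eDRAM" not in name
)
print(general_purpose_mb)  # 88 MB of CPU/GPU-addressable RAM, plus 3 MB of eDRAM
```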

http://www.neogaf.com/forum/showpost.php?p=3942430&postcount=254

I can confirm that all this information is from "RVL_overview.pdf" (dated March 22, 2006), which is included in the latest available revision of the Wii SDK.
 
Those specs are so low I'm almost frightened for Nintendo.

Sure, having specs like that will make it really affordable at launch. They will probably even make profit on it right out of the gate.

But as far as being able to compete with PS3 and Xbox 360... I have to argue it's not even in the same product category anymore. It's as if Sony and MS are building sports cars and Nintendo is building really nice bicycles.
 
Seems a slightly weird setup to me. There's 64 MB GDDR with 4 GB/s. Broadway has 1.9 GB/s access to this. Hollywood has 24 MB internal RAM at 3.9 GB/s + 3 MB eDRAM. Hollywood can access this at 486 MHz, 3.9 GB/s. That's a 64-bit bus? The eDRAM is less than a third the size of Xenos's, for a screen that's at most a third the resolution. May be less than that, such as PAL 640x512 or higher. Thus for the same activities as Xenos you'd need to use tiling. This would suggest no HDR or AA if the full buffer is to fit in eDRAM? And the main memory MEM2 would feed into MEM1 for use by GPU and audio? So this design listed is a two-pool NUMA, single direction MEM2 >> MEM1, and the CPU can't read from the GPU? In other words, the CPU has 64 MB available, the GPU+audio has 64+24 (excluding eDRAM workspace). Where does the 1T-SRAM come in? The 24 MB GPU RAM? That'd be hardware compatible with GC for BC, no?

To me, it seems quite a complex system where in this day and age you could get faster more simply for not much more cost, surely. What's wrong with 256 MB GDDR3 at 20+ GB/s shared, or somesuch? It seems to be adding some of the PS3's memory complexity, having to manage data in different locations, but without the advantage that brings in having more BW. Though if the 1T-SRAM is noticeably faster and yet costly (it wasn't an issue with GC), I guess a substantial quantity of 1T-SRAM may have been prohibitive.

Overall, it does look extremely low spec! I'm sure everyone would like more than this from a new console. I can't help but feel it's little more than someone upping the current GC specs to create something plausible, but at the other end of the scale that's almost all we've been hearing - GC+.
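The question above about fitting the whole framebuffer in eDRAM is easy to put rough numbers on (a back-of-the-envelope sketch; the 6 bytes/pixel figure assumes GC-style 24-bit colour plus 24-bit Z and is my assumption, not from the document):

```python
EDRAM_BYTES = 3 * 1024 * 1024  # 3 MB of embedded graphics memory

def framebuffer_bytes(width, height, bytes_per_pixel=6, samples=1):
    """Colour + Z buffer size for one frame, with optional multisampling."""
    return width * height * bytes_per_pixel * samples

print(framebuffer_bytes(640, 480) <= EDRAM_BYTES)             # True: ~1.8 MB fits
print(framebuffer_bytes(640, 480, samples=2) <= EDRAM_BYTES)  # False: 2x AA needs ~3.5 MB
```

So a plain 480-line buffer fits with room to spare, but even 2x multisampling at these assumed precisions would already force tiling.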
 
INKster said:
The Xbox 360 also has GDDR3 as main memory.

What works for one system doesn't necessarily work for another though. GC was about low latency and high efficiency, and Wii is supposed to follow that idea. Switching extremely low latency main memory (1T-SRAM in GC) for much higher latency main memory with only slightly more bandwidth would be a terrible idea. It may even make Wii's main memory slower than GC's... which is just one reason why this rumour is obviously untrue.

Nightz said:
Probably fake, but those specs don't look too far off either.
Looking at the games, Wii can't do nice shaders, self-shadowing, parallax mapping, normal mapping or even AA. It's probably a DX7-level GPU.

GC could do nice self shadowing, normal mapping and some pretty nice shader effects. You're not looking at the right games if you think Wii can't..
 
Teasy said:
What works for one system doesn't necessarily work for another though. GC was about low latency and high efficiency, and Wii is supposed to follow that idea. Switching extremely low latency main memory (1T-SRAM in GC) for much higher latency main memory with only slightly more bandwidth would be a terrible idea. It may even make Wii's main memory slower than GC's... which is just one reason why this rumour is obviously untrue.

Why do you think that they switched 1T-SRAM to GDDR3? They switched 16 MB DRAM to 64MB of GDDR3.

I believe that these specs are very accurate.
 
ZiFF said:
These specs do not suggest that they switched 1T-SRAM to GDDR3. They switched 16 MB DRAM to 64MB of GDDR3.

That rumour claims that the 64MB GDDR3 is to be used as main memory, A-Ram was never considered main memory...
 
Shifty Geezer said:
Seems a slightly weird setup to me. There's 64 MB GDDR with 4 GB/s. Broadway has 1.9 GB/s access to this. Hollywood has 24 MB internal RAM at 3.9 GB/s + 3 MB eDRAM.

It would be more than slightly weird. How on earth could they even get the 64MB pool to have 4GB/s bandwidth and the 24MB pool 3.9GB/s? Run the 24MB 1T-SRAM at a 2x multiplier and the 64MB GDDR3 at a 1.03x multiplier? :LOL:
 
Teasy said:
That rumour claims that the 64MB GDDR3 is to be used as main memory, A-Ram was never considered main memory...

Hollywood GPU

Hollywood is a system LSI composed of a GPU and internal main memory (MEM1). Hollywood is clocked at 243 MHz. Its internal memory consists of 3 megabytes of embedded graphics memory and 24 megabytes of high speed main memory.

External Main Memory (MEM2)

Wii uses 64 megabytes of GDDR3 (MEM2) as external main memory. Like internal main memory, MEM2 can be accessed directly from Broadway and the GPU at high speed and has a peak bandwidth of 4 gigabytes/sec. Programs can also be placed in MEM2.

I really don't see what is so hard to understand here. Only thing changed from GC is 3MB edram and 16MB DRAM -> 64MB GDDR.
 
I'd guess that's a rounding error, like the 3.9 is actually 3.888 GB/s if it's a multiple of the 486 MHz clockspeed (which even then isn't right, as 1 mega in bytes is not the same as 1 mega in hertz). The numbers seem to work okay, with them being multiples of each other, if you are a bit loose with fitting them together: 486 MHz x 8 bytes is about 3.6 GB/s counting bytes properly, but approximately 3.9 if you don't. Being loose with definitions doesn't add much by way of credibility to a technical document though!
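The arithmetic behind that rounding argument, spelled out (a sketch; the 64-bit bus width is inferred from the surrounding discussion, not stated in this post):

```python
# Where the 3.9 GB/s figure plausibly comes from:
# 486 MHz x 64-bit (8-byte) bus, quoted in decimal gigabytes.
raw_bytes_per_sec = 486e6 * 8               # 3.888e9 bytes/sec

print(round(raw_bytes_per_sec / 1e9, 3))    # 3.888 -> quoted as "3.9 GB/s"
print(round(raw_bytes_per_sec / 2**30, 3))  # 3.621 -> the stricter binary-GB value
```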
 
If you go to NeoGAF there is an updated list of the specs, and also various people there claiming they have seen the same documents.
 
ZiFF said:
I really don't see what is so hard to understand here. Only thing changed from GC is 3MB edram and 16MB DRAM -> 64MB GDDR.

I was under the impression that only the CPU could access the GDDR3 memory, if the GDDR3 memory is used mainly for less latency dependent tasks like graphics while the entire 24MB of 1T-Sram is used for latency dependent code then that makes more sense. Though I still have my reservations, like for instance the strange difference in bandwidth between the GDDR3 and 1T-Sram. By the way what do you mean when you say that the 3MB of eDram is different from GC?

Shifty Geezer said:
I'd guess that's a rounding error, like the 3.9 is actually 3.888 GB/s if it's a multiple of the 486 MHz clockspeed (which even then isn't right, as 1 mega in bytes is not the same as 1 mega in hertz). The numbers seem to work okay, with them being multiples of each other, if you are a bit loose with fitting them together: 486 MHz x 8 bytes is about 3.6 GB/s counting bytes properly, but approximately 3.9 if you don't. Being loose with definitions doesn't add much by way of credibility to a technical document though!

Yeah, I know that rounding errors often occur in tech documents/articles and that bandwidth definitions are often used loosely. But a rounding error wouldn't explain why a Nintendo document would say that the 24MB of RAM has 3.9GB/s and the 64MB pool has 4GB/s. Both are on the same width bus and using the same memory controller, so surely both would be listed either as 3.9GB/s or 4GB/s?

I'll say one thing though: if these were the real Wii specs then Nintendo must have completely lost their marbles. Not only because the system would be underpowered in the most extreme way possible, but also because they'd have paid ATI and IBM to effectively do nothing. Not only that, but it would have taken them until late 2006 to manage to do nothing... remember that IBM and ATI only recently managed to finish the Broadway/Hollywood design and finally include it in a final Wii dev kit. How can it take that long to complete a 50% overclocked 90nm version of a 180nm chip you designed 6 years earlier?? I'm not even going to go into the tiny amount of money this system should cost to put together vs what Nintendo seem to be suggesting for the launch price, and the fact that Nintendo themselves have said they will make a small loss on hardware at launch. Nope, none of this adds up; not even Nintendo do things that bizarrely.
 
Isn't the whole point of having on-chip memory to obtain some kind of ridiculous bandwidth? Why on earth would 24 MB of embedded 1T-SRAM have only 3.9 GB/s of bandwidth? That would seem to almost defeat the purpose of embedded memory.

IGN, which despite poor interpretation of the facts reports actual facts, recently confirmed that the rumble motor is single-intensity. This forum post says it's variable intensity. I'm gonna go with IGN. And how come he overlooked the speaker and the on-controller memory?

ATI and Retro have already referred to Hollywood as a new architecture; I somehow doubt this means "GC architecture with a crapload of low-bandwidth eDRAM."
 
Teasy said:
Not only that but it would have taken them until late 2006 to manage to do nothing... remember that IBM and ATI only recently managed to finish the Broadway/Hollywood design and finally include it in a final Wii dev kit. How can it take that long to complete a 50% overclocked 90nm version of a 180nm chip you designed 6 years earlier??
That to me is what makes this OC'd GC idea preposterous. Unless IBM and ATi only had one guy working on the changes in their lunch breaks! :p
 
This sounds bogus to me. I mean, I can't even remember how many false rumours we've been hearing about Wii / Revolution over the past 1 1/2 years. It's just ridiculous. We've heard more credible ones, we've heard incredible setups with 6 chips and more. But in the end they've had one thing in common they were completely false.

IGN is the only one who seems to actually have access to the dev documentation, and all we've got from them so far is the clockspeeds. Even IGN's alleged RAM configuration seems off. First it was 3 MB eDRAM, then 16 MB, and now it's 16+3 MB eDRAM.

I am really sceptical. I need proof, not just a couple of posters on GAF who claim to be insiders. BTW, if there are so many insiders, how come we don't know jack about Wii? How can Nintendo keep it a secret?
 
for what it's worth, let's recall what one mosys press release read:

mosys said:
The newest 1T-SRAM implementations embedded within the Wii console are fabricated using NEC Electronics' advanced 90nm CMOS-compatible embedded DRAM process technology. These high speed and ultra low latency memories are used as the main embedded memory on the graphics chip and in an additional external memory chip.

compare to this:

supposed leaked specs said:
Hollywood GPU

Hollywood is a system LSI composed of a GPU and internal main memory (MEM1). Hollywood is clocked at 243 MHz. Its internal memory consists of 3 megabytes of embedded graphics memory and 24 megabytes of high speed main memory.

this does not add up at all. the press release talks about edram and an external IC.
 
Shifty Geezer said:
That to me is what makes this OC'd GC idea preposterous. Unless IBM and ATi only had one guy working on the changes in their lunch breaks! :p

Close! It was actually me...while I was doing my Master's in math, on the side I was trying to teach myself about computer architecture. It was hard, but ATI was giving me beer money. In the end, though, it was mostly over my head, so I told them to just stick 24 MB of RAM on the Flipper and overclock it as much as they could.
 
darkblu said:
for what it's worth, let's recall what one mosys press release read:



compare to this:



this does not add up at all. the press release talks about edram and an external IC.

Also, if you read up on 1T-SRAM-R, which is IIRC the memory being implemented: it has a clockspeed of 650MHz and a DDR-like design for much higher bandwidth.
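For comparison, the bandwidth a 650 MHz DDR-style interface would imply (a sketch; the 64-bit interface width is my assumption, not from the post):

```python
# Implied peak bandwidth of a 650 MHz, DDR-style memory interface.
clock_hz = 650e6
transfers_per_clock = 2   # DDR: two transfers per cycle
width_bytes = 8           # assumed 64-bit interface

bandwidth_gb_s = clock_hz * transfers_per_clock * width_bytes / 1e9
print(bandwidth_gb_s)  # 10.4 -- well above the leak's 3.9 GB/s figure
```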
 
Posted by Gamedev on engadget.com

I'm a game developer. Here are some specs so you can make an informed choice at launch , if Nintendo don't supply them:

* It's essentially a GameCube, no one has actual devkits yet, everyone is using GCs for their dev
* GPU has no shaders, fixed function, nothing new (it is an upgraded flipper)
* Unit has 88MB of memory, 24 for video/dsp and 64 for main external mem with 512 of flash mem for downloadable content, no HDD
* Clock speeds on GPU and CPU are approx 1.5x that of the GC
* Broadway CPU has a choice of 2-, 3- or 4-deep bus pipeline as opposed to Gekko's 2
* Hollywood is an LSI with DSP and GPU onboard with 2MB of eDRAM for framebuffer. No HD video fillrates here, and that means no 1080i let alone 1080p; highest is 480p, which all last-gen units could do.
* Bus bandwidth is a peak 1.9GB/s, up from GC's 1.3GB/s

That's it... take these specs as you will. At least search around the net before you make comments here. You'll find some of these things confirmed already, as game developers speak to the press about their experience with Wii development.

One more thing:

The 64MB of external RAM is GDDR3, the 24MB is 1TSRAM.

GameDev

Whoops! Sorry that was meant to be 3MB of eDRAM... 2 is sooo close to 3 on the keyboard! :)

GameDev
http://laptops.engadget.com/2006/07/30/a-few-more-wii-details-on-the-table/#c1821493
 