Nintendo Wii specs (courtesy of maxconsole.net)

Maxconsole.net apparently has the full Nintendo Wii specs.

Broadway CPU

Broadway is Wii's CPU. Broadway functionality and specifications are as follows.

• Operating speed: 729 MHz
• Bus to main memory: 243 MHz, 64 bits (maximum bandwidth: 1.9 gigabytes/sec)
• 32-kilobyte 8-way set-associative L1 instruction cache
• 32-kilobyte 8-way set-associative L1 data cache (can set up 16-kilobyte data scratch pad)
• Superscalar microprocessor with six execution units (floating-point unit, branching unit, system register unit, load/store unit, two integer units)
• DMA unit (15-entry DMA request queue) used by 16-kilobyte data scratch pad
• Write-gather buffer for writing graphics command lists to the graphics chip
• Onboard 256-kilobyte 2-way set-associative L2 integrated cache
• Two 32-bit integer units (IU)
• One floating point unit (FPU) (supports single precision (32-bit) and double precision (64-bit))
• The FPU supports paired single floating point (FP/PS)
• The FPU supports paired single multiply add (ps_madd). Most FP/PS instructions can be issued in each cycle and completed in three cycles. (See the sketch after this list.)
• Fixed-point to floating-point conversion can be performed at the same time as FPU register load and store, with no loss in performance.
• The branch unit supports static branch prediction and dynamic branch prediction.
• When an instruction is stalled on data, the next instruction can be issued and executed. All instructions maintain program logic and will complete in the correct program order.
• Supports three L2 cache fetch modes: 32-Byte, 64-Byte, and 128-Byte.
• Supports these bus pipeline depth levels: level 2, level 3, and level 4.
Reference Information: Broadway is upward compatible with Nintendo GameCube’s CPU (Gekko).
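
For illustration, here's a rough sketch in Python of what a paired-single multiply-add does: each FP/PS register holds two single-precision values, and one instruction performs a multiply-add on both halves at once. This is just the semantics, not actual Gekko/Broadway code.

# Each paired-single register holds two 32-bit floats (ps0, ps1).
# ps_madd computes a*b + c independently on both halves in one instruction.
def ps_madd(a, b, c):
    # a, b, c are (ps0, ps1) tuples of Python floats standing in for
    # single-precision values
    return (a[0] * b[0] + c[0], a[1] * b[1] + c[1])

print(ps_madd((1.0, 2.0), (3.0, 4.0), (0.5, 0.5)))  # -> (3.5, 8.5)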

Much more info about the rest of the system is available at the site. At first glance, it seems to match the speculation that's been going around here.
 
And its GPU.

Hollywood GPU

Hollywood is a system LSI composed of a GPU and internal main memory (MEM1). Hollywood is clocked at 243 MHz. Its internal memory consists of 3 megabytes of embedded graphics memory and 24 megabytes of high speed main memory.

Hollywood includes the following.
• Graphics processing unit (with 3 megabytes of eDRAM)
• Audio DSP
• I/O Bridge
• 24 megabytes of internal main memory
• Internal main memory operates at 486 MHz.
• Maximum bandwidth between Hollywood and internal main memory: 3.9 gigabytes per second (see the check after this list)
• Possible to locate a program here
Reference Information: Hollywood is similar to Nintendo GameCube’s Flipper and Splash components.
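
A quick sanity check of that bandwidth figure, assuming a 64-bit path to the internal main memory (the bus width isn't stated for MEM1, so that part is an assumption):

# 486 MHz effective rate on an assumed 64-bit (8-byte) interface
clock_hz = 486e6
bus_bytes = 8
print(clock_hz * bus_bytes / 1e9)   # ~3.89, quoted as 3.9 gigabytes/sec
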
I'm going to guess this isn't very good.
 
Powderkeg said:
LOL

OK, for everything except the frame buffer that thing is drastically bandwidth limited.

Better?

not really. because the framebuffer is such a negligible BW contributor. it's really easy to forget about it. i mean, i still can't figure out for the life of me why MS bothered with all the trouble to implement one in the 360, poor shmucks..
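
(In case the sarcasm isn't obvious: the framebuffer is normally the single biggest bandwidth consumer. A rough back-of-envelope, assuming Hollywood keeps Flipper's 4 pixel pipelines and a 24-bit colour / 24-bit Z framebuffer; the per-pixel byte counts are illustrative, not from the leak:)

# Peak framebuffer traffic estimate
clock_hz = 243e6
pixel_pipes = 4                                # Flipper-style assumption
peak_pixels = clock_hz * pixel_pipes           # ~972 Mpixels/s peak fill
# Per pixel: Z read + Z write + colour write (3 bytes each),
# plus a colour read when blending
for bytes_per_pixel in (9, 12):
    print(peak_pixels * bytes_per_pixel / 1e9, "GB/s")
# ~8.7-11.7 GB/s of raw framebuffer traffic, which the eDRAM keeps off
# the external bus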
 
Powderkeg said:
LOL
OK, for everything except the frame buffer that thing is drastically bandwidth limited.
Better?

Except that the GameCube uses the EDRAM for texturing and the framebuffer.
All the external bandwidth is used for is paging the textures into the EDRAM, plus CPU traffic.
 
ERP said:
Except that the GameCube uses the EDRAM for texturing and the framebuffer.
All the external bandwidth is used for is paging the textures into the EDRAM, plus CPU traffic.


The GameCube uses 1MB of its EDRAM for texture cache.

The Wii and Gamecube apparently have the exact same amount of EDRAM. What are the odds the Wii will use the exact same 1MB texture cache?


And how much difference did that 1 whole MB make? Never enough to make it look better than the Xbox, which lacked EDRAM entirely.
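
For reference, the commonly cited split of the GameCube's 3 MB of eDRAM is roughly 2 MB of embedded framebuffer plus the 1 MB texture cache; whether Wii keeps that split is an assumption. A quick check that a 640x480 buffer with 24-bit colour and 24-bit Z fits in the framebuffer portion:

# Embedded framebuffer usage at 640x480, 24-bit colour + 24-bit Z
width, height = 640, 480
color_bytes, z_bytes = 3, 3
efb_bytes = width * height * (color_bytes + z_bytes)
print(efb_bytes / 2**20)   # ~1.76 MB, fits in the ~2 MB eFB portion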
 
darkblu said:
not really. because the framebuffer is such a negligible BW contributor. it's really easy to forget about it. i mean, i still can't figure out for the life of me why MS bothered with all the trouble to implement one in the 360, poor shmucks..


Neither can Sony, who apparently feels it's not necessary at all. And strangely enough, no one here seems to doubt Sony on their decision. Do you?
 
Powderkeg said:
And how much difference did that 1 whole MB make? Never enough to make it look better than the Xbox, which lacked EDRAM entirely.

Well, obviously MS liked the idea. Otherwise the X360 wouldn't have it, now would it?

Judging from the specs, GCN should have been a lot weaker than the Xbox, but as a matter of fact, it was a lot better than expected.
 
Powderkeg said:
Neither can Sony, who apparently feels it's not necessary at all. And strangely enough, no one here seems to doubt Sony on their decision. Do you?

maybe because they don't have to? in case you missed that, we're talking of dedicated framebuffer memory. and the only console which did not have such recently was the acclaimed xbox, which, judging by MS' new design, was really an architecture to build upon.
 
Powderkeg said:
The GameCube uses 1MB of its EDRAM for texture cache.

The Wii and Gamecube apparently have the exact same amount of EDRAM. What are the odds the Wii will use the exact same 1MB texture cache?


And how much difference did that 1 whole MB make? Never enough to make it look better than the Xbox, which lacked EDRAM entirely.

But I wouldn't blame that on lack of bandwidth, which was what was being discussed.
 
Reference Information: Hollywood is similar to Nintendo GameCube’s Flipper and Splash components.

What the heck was the Splash? I believe the audio component was called Wave or something like that.

Edit:
OK, found out that apparently the Splash component of the GameCube was its system RAM plus A-RAM.
 
Powderkeg said:
And how much difference did that 1 whole MB make? Never enough to make it look better than the Xbox, which lacked EDRAM entirely.

Oh FFS, can you give the Wii-bashing a rest already? Considering how much cheaper the GC was than the Xbox, I'd say the integrated framebuffer and texture cache (along with other architectural design decisions) did a pretty good job of getting things done, particularly in games tailored for the Cube.

Besides, the specs in the OP are wrong: main memory B/W in GC was more than what is listed for Wii despite a lower clock speed; the person quoted as the source forgot the memory is double-pumped. Memory B/W is hence twice the figure stated.
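
To spell out that claim, assuming the 64-bit width listed in the OP stays the same:

# 243 MHz, 64-bit, single data rate vs double-pumped (DDR)
sdr = 243e6 * 8        # ~1.94e9 bytes/s, the 1.9 GB/s in the leaked specs
ddr = sdr * 2          # ~3.89e9 bytes/s, twice the stated figure
print(sdr / 1e9, ddr / 1e9)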
 
Powderkeg said:
And how much difference did that 1 whole MB make? Never enough to make it look better than the Xbox, which lacked EDRAM entirely.

More like, how much difference did that extra 10 GB/s for framebuffer effects and texturing make? Answer: enough that a system with a much, much simpler and 30% slower CPU and about 40% less total RAM was able to output relatively comparable graphics and maintain the most consistent framerates of any of the 3 consoles. And it didn't lose money. ;-)

P.S. Don't forget the 6x compression for 24-bit textures.
P.P.S. It's probably not a good idea to lecture ERP on how the Gamecube works.
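
(The 6x figure presumably refers to the S3TC-style CMPR texture format, which packs a 4x4 texel block into 64 bits; treating that as the format in question is an assumption here.)

# Block compression ratio for 24-bit textures
bits_per_block = 64
texels_per_block = 4 * 4
bits_per_texel = bits_per_block / texels_per_block   # 4 bits/texel
print(24 / bits_per_texel)                           # 6.0 -> the "6x" ratio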
 
pc999 said:
ERP/Darkblu, does the GC really lack (particularly) in BW?

well, ERP would really be the one to comment on that, as my cube is still a homebrew-virgin due to some unfortunate circumstances having to do with two defective hb-assisting devices from one otherwise well-respected 3rd party vendor.. :|


ps: is it just me or does something from the alleged leaked specs sound really funny:

• Superscalar microprocessor with six execution units (floating-point unit, branching unit, system register unit, load/store unit, two integer units)

since when do CPUs have a 'system registers unit' when citing superscalarity? ..or maybe somebody was trying to add a little 'something' from themselves to the otherwise 100% (overclocked) Gekko that those specs depict?
 
darkblu said:
ps: is it just me or does something from the alleged leaked specs sound really funny:

• Superscalar microprocessor with six execution units (floating-point unit, branching unit, system register unit, load/store unit, two integer units)

since when do CPUs have a 'system registers unit' when citing superscalarity? ..or maybe somebody was trying to add a little 'something' from themselves to the otherwise 100% (overclocked) Gekko that those specs depict?

The specs listed for Broadway come straight from the PowerPC 750CXe spec sheet. That particular list is from section 1.1:

750CX/CXe/CXr datasheet.
 
Some things just sound very fishy, to the degree that, if that is a genuine leak at all, I suspect those bits are "poison pills" to allow tracing back to the source of the leak.

I mean, why would you list embedded memory and then list internal memory? And yeah, I can figure out what is probably meant (multi-chip package), but no one formulating a spec sheet would do it that way IMO, unless it was done on purpose.

True or not, this is at least realistic. Gone are the days of the obvious wish list.
 