Wii U hardware discussion and investigation *rename

How's it going guys?

lherre drops a lot of hints on NeoGAF and he knows what he's talking about, I think. He's a dev, Ubisoft I imagine.

He says the Wii U CPU is triple core.

http://www.neogaf.com/forum/showpost.php?p=29784657&postcount=4165

This is pretty exciting. I definitely believe this new info from espresso and lherre is likely to be true, and it's the first hard info we've had.

It's probably a triple-core, higher-clocked Wii CPU.

I'm quoting this post because it was the catalyst for the following "interview" I'm linking to. I joined to post these in case some of you missed it. In it lherre gives what info he can and I geared the questions around the GPU.

This is where it starts.
http://www.neogaf.com/forum/showthread.php?p=29833512#post29833512

And this is where it essentially finishes at least with my responses.
http://www.neogaf.com/forum/showthread.php?p=29863782#post29863782

If you want to get the full context of it, just start with the first link and read through the exchanges beyond mine and lherre's.
 
Thanks.
I should have written 785G, but that would only be slightly less wrong.

But I still wonder about the interface. Do we expect it to be coherent? Might we see a unified address space with common 32-bit pointers?
It's something like the FlexIO in the PS3, but actually working and more useful.
 
So you should have posted the first sentence above my quote, or quoted rpg.314. I'm not trying to offend you, just try to get it right next time ;)

And now - hush hush - BTT :smile:



My rough guess (relative to the Xbox 360 part, i.e. Xbox 360 = 1):
Xenon 1 - Wii U CPU 1.x
Xenos 1 - Wii U GPU 2
Xbox 360 RAM 1 - Wii U RAM 3
Xbox 360 eDRAM 1 - Wii U eDRAM 3.x
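
To put purely hypothetical numbers on those multipliers, here's a quick back-of-the-envelope sketch applying them to commonly cited Xbox 360 figures. The "1.x" and "3.x" entries are read as roughly 1.5x and 3.5x just to get an output, so every result below is speculation, not a spec.

```python
# Back-of-the-envelope sketch: apply the guessed "Xbox 360 = 1" multipliers above
# to commonly cited 360 figures. The multipliers (and therefore every output)
# are pure speculation, not leaked specs.
xbox360 = {
    "CPU peak GFLOPS (Xenon)": 115.0,
    "GPU peak GFLOPS (Xenos)": 240.0,
    "Main RAM (MB)": 512,
    "eDRAM (MB)": 10,
}

# "1.x" and "3.x" read here as ~1.5x and ~3.5x just to get a number out.
guess_multiplier = {
    "CPU peak GFLOPS (Xenon)": 1.5,
    "GPU peak GFLOPS (Xenos)": 2.0,
    "Main RAM (MB)": 3.0,
    "eDRAM (MB)": 3.5,
}

for key, base in xbox360.items():
    print(f"{key}: {base} -> ~{base * guess_multiplier[key]:.0f}")
```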


Nice try, I would go this way too. From what I understand, you believe in the hypothesis that the GPU has 400 SIMD / stream processors?

More questions come to mind... just thoughts...

But might Nintendo (I know they have a different culture, etc.), based on the impressions developers are giving, surprise us with 640 or even 800 stream processors?

I read that the launch is scheduled for April 2012. Could they still "delay" the production schedule with the companies responsible for manufacturing the next-gen console (until November / December?)?

I read once (I remember many discussions) that the beta and final Xbox 360 SDKs were ready about 5 to 6 months before its release, so my hypothesis is that the Wii U specs may still have changed since then.
 
Nice try, I would go this way too. From what I understand, you believe in the hypothesis that the GPU has 400 SIMD / stream processors?

More questions come to mind... just thoughts...

But might Nintendo (I know they have a different culture, etc.), based on the impressions developers are giving, surprise us with 640 or even 800 stream processors?

400 - 480, yes. I really can't think of more if it is in fact a SoC. Maybe 640 would be possible, but 800 is IMO out of sight (I am in no way an expert or insider, this is just based on looking @ transistor count / die size and the fact that it will be on 45nm).
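
For anyone wondering what that kind of eyeballing looks like in practice, here's a rough, non-insider sketch: fit die area against shader count using public 55nm RV7xx figures, then apply an idealized 55nm-to-45nm shrink. Both the linear fit and the shrink factor are crude assumptions, so the outputs are ballpark at best.

```python
# Crude estimate only: fit die area vs. shader count from public 55nm RV7xx
# parts, then assume area scales with the square of the feature size for an
# ideal 55nm -> 45nm shrink. Real designs don't scale this cleanly.
rv730_sps, rv730_mm2 = 320, 146.0   # Radeon HD 4670 (RV730), 55nm
rv770_sps, rv770_mm2 = 800, 256.0   # Radeon HD 4870 (RV770), 55nm

# Linear fit: area = base + per_sp * SPs (ROPs, memory controllers, etc. lumped into "base")
per_sp = (rv770_mm2 - rv730_mm2) / (rv770_sps - rv730_sps)
base = rv730_mm2 - per_sp * rv730_sps

shrink = (45.0 / 55.0) ** 2   # idealized area scaling for the node change

for sps in (400, 480, 640, 800):
    est_55nm = base + per_sp * sps
    print(f"{sps} SPs: ~{est_55nm:.0f} mm^2 at 55nm, ~{est_55nm * shrink:.0f} mm^2 at 45nm")
```

Add a multi-core CPU to the larger of those figures and an 800-SP SoC at 45nm starts looking very big for a small, low-power console, which is why 800 feels out of reach to me.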
 
400 - 480, yes. I really can't think of more if it is in fact a SoC. Maybe 640 would be possible, but 800 is IMO out of sight (I am in no way an expert or insider, this is just based on looking @ transistor count / die size and the fact that it will be on 45nm).

You're thinking of the CPU. Unless I missed it along the way there has been no information provided about the size of the GPU.
 
There's nothing ruling out CPU and GPU on the same die yet, and it could have a number of benefits. Even IBM's wording about what it is they're fabbing is vague, perhaps intentionally so...
 
There's nothing ruling out CPU and GPU on the same die yet, and it could have a number of benefits. Even IBM's wording about what it is they're fabbing is vague, perhaps intentionally so...

Same die at 32nm, anyone? Like the 360 S, or the PS2 Slim (EE + GS, and later eDRAM, on the same die)?
 
If you check out what I posted earlier, it won't be as easy to commit to it being any R700. Plus there are other things that show "borrowing" from at least the Evergreen level. But yeah, 40nm would be the most likely node used.

Very interesting that the Wii U GPU is said to be a Frankenstein of a GPU (mixing many cores, SIMDs, etc.). It would be interesting to see something similar, further developed, in the APUs ahead if Sony or MS adopt them.
 
There's nothing ruling out CPU and GPU on the same die yet, and it could have a number of benefits. Even IBM's wording about what it is they're fabbing is vague, perhaps intentionally so...

From IBM's press release:
Armonk, NY, USA - 07 Jun 2011: IBM (NYSE: IBM) today announced that it will provide the microprocessors that will serve as the heart of the new Wii U™ system from Nintendo. Unveiled today at the E3 trade show, Nintendo plans for its new console to hit store shelves in 2012.
The all-new, Power-based microprocessor will pack some of IBM's most advanced technology into an energy-saving silicon package that will power Nintendo's brand new entertainment experience for consumers worldwide. IBM's unique embedded DRAM, for example, is capable of feeding the multi-core processor large chunks of data to make for a smooth entertainment experience.

IBM plans to produce millions of chips for Nintendo featuring IBM Silicon on Insulator (SOI) technology at 45 nanometers (45 billionths of a meter). The custom-designed chips will be made at IBM's state-of-the-art 300mm semiconductor development and manufacturing facility in East Fishkill, N.Y.
Incidentally, that's pretty much the only solid information we have directly from the horse's mouth. I don't see much vagueness to be honest, they could hardly be expected to say "but please note that we will not integrate an AMD GPU on the die" even if they won't. It's a press release about an IBM processor, they won't be talking about what the product isn't! Of course, that does leave room for the idea that it might include a GPU, but...
If the die does contain a GPU, it would be a first on SOI as far as I know, and somewhat newsworthy, and as IBM wants to bang their custom silicon drum I'd expect them to talk about it.
Making a full custom chip like that with IP from two sides, and making it work, is one heck of a lot more complex than letting either company make their own chip, on familiar process nodes, and then letting the chips talk over an agreed upon interface. MUCH easier from a collaborative standpoint, less likely to run into deadline crushing snags, the individual dies are smaller and it is much safer to assume decent yields, and a problem on one end doesn't hold up either design work or production on the other.
It is not a coincidence that the 360 didn't integrate CPU and GPU until two process nodes down the line.
While nothing official explicitly denies integrated CPU and GPU, there is nothing that confirms it either:
From AMD's press release:
“We greatly value our synergistic relationship with the AMD design team. The AMD custom graphics processor delivers the best of AMD’s world-class graphics expertise. AMD will support our vision of innovating play through unique entertainment experiences," said Genyo Takeda, senior managing director, Integrated Research & Development of Nintendo Co. Ltd.

Also note that there is NO mention of IBM in AMD's press release and NO mention of AMD in IBM's. For such a noteworthy collaborative effort as a joint design of a novel processor in service of a mutual customer, that's simply unheard of.
 
Really, I believe the Wii U GPU could be a 780G derivative, with chipset functions (storage, USB, misc.).
It has 320 or 400 SPs, with 512MB of DDR3 on a 64-bit bus as a sideport. That is slow, but deal with it.
I/O (SATA, USB, etc.) is AMBA AHB based, so most likely handled by an ARM coprocessor.
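
Just to show how slow: a quick bandwidth check on that 64-bit DDR3 sideport idea. The data rates below are hypothetical examples, not known Wii U specs.

```python
# Peak bandwidth of a 64-bit DDR3 interface at a few hypothetical data rates.
bus_width_bytes = 64 / 8  # 8 bytes per transfer

for mt_per_s in (1066, 1333, 1600):                 # DDR3-1066/1333/1600
    gb_per_s = mt_per_s * 1e6 * bus_width_bytes / 1e9
    print(f"DDR3-{mt_per_s} x 64-bit: {gb_per_s:.1f} GB/s peak")

# For scale, the Xbox 360's 128-bit GDDR3 at 700 MHz (1400 MT/s effective)
# gives about 22.4 GB/s, so a 64-bit DDR3 sideport really is slow for graphics.
```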
 
Very interesting that the Wii U GPU is said to be a Frankenstein of a GPU (mixing many cores, SIMDs, etc.). It would be interesting to see something similar, further developed, in the APUs ahead if Sony or MS adopt them.

I agree. I'm not as technically advanced as most of you here (I've been working to improve that after a long time away from tech), but with him saying that only one thing in the final kit could probably be taken as an R700, my first inclination right now would be the SPs, with 800 SPs being an amount (although high) that could cause him to say something like that. The second would be memory, as I get the feeling GDDR5 won't be used.

From IBM's press release:

Incidentally, that's pretty much the only solid information we have directly from the horse's mouth. I don't see much vagueness to be honest, they could hardly be expected to say "but please note that we will not integrate an AMD GPU on the die" even if they won't. It's a press release about an IBM processor, they won't be talking about what the product isn't! Of course, that does leave room for the idea that it might include a GPU, but...
If the die does contain a GPU, it would be a first on SOI as far as I know, and somewhat newsworthy, and as IBM wants to bang their custom silicon drum I'd expect them to talk about it.
Making a full custom chip like that with IP from two sides, and making it work, is one heck of a lot more complex than letting either company make their own chip, on familiar process nodes, and then letting the chips talk over an agreed upon interface. MUCH easier from a collaborative standpoint, less likely to run into deadline crushing snags, the individual dies are smaller and it is much safer to assume decent yields, and a problem on one end doesn't hold up either design work or production on the other.
It is not a coincidence that the 360 didn't integrate CPU and GPU until two process nodes down the line.
While nothing official explicitly denies integrated CPU and GPU, there is nothing that confirms it either:
From AMD's press release:


Also note that there is NO mention of IBM in AMD's press release and NO mention of AMD in IBM's. For such a noteworthy collaborative effort as a joint design of a novel processor in service of a mutual customer, that's simply unheard of.

True. But Nintendo loves their NDAs. I can see something where Nintendo's CPU/GPU is based off the XCGPU, or vice versa, considering Xenon used PPEs yet came out before the PS3. I know wsippel and I have debated this elsewhere, but incorporating some of his previous views, I could see something where the CPU has a large amount of eDRAM for L2 cache (16MB, say) and there would also be another pool, like 1T-SRAM (24-32MB), playing the role of Xenos' eDRAM.

I once read somewhere, and I need to find it for my sanity, that IBM's process for implementing eDRAM on a chip would allow either 1 or 3 gigabits (or maybe it was listed as gibibits; I can't remember that either, but it doesn't seem to matter much) to be placed on a chip. With the A2 already having 8MB of L2 cache, I would assume doubling that amount is a reasonable task.
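
As a quick unit check on those figures (assuming I'm remembering the gigabit numbers right), here's the gigabit-to-megabyte conversion and how the cache sizes mentioned above compare:

```python
# Unit check: convert the quoted eDRAM budgets from (gi)gabits to (me/mebi)bytes,
# and the cache sizes from MiB back to megabits, to see how much headroom there is.
def gbit_to_mb(gbit):
    return gbit * 1000**3 / 8 / 1000**2      # decimal prefixes

def gibit_to_mib(gibit):
    return gibit * 1024 / 8                  # binary prefixes

for g in (1, 3):
    print(f"{g} Gbit = {gbit_to_mb(g):.0f} MB (decimal) or {gibit_to_mib(g):.0f} MiB (binary)")

# 8 MiB of L2 is only 64 Mibit; doubling it to 16 MiB is 128 Mibit, a small
# fraction of even the 1 Gbit budget.
print(f"8 MiB = {8 * 8} Mibit, 16 MiB = {16 * 8} Mibit")
```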
 
Same die at 32nm, anyone? Like the 360 S, or the PS2 Slim (EE + GS, and later eDRAM, on the same die)?

If it's not a CGPU then I guess they could, but I rather feel that Nintendo won't be in a hurry to shrink or combine, and may never do it unless heat and cooling issues encourage them to do so.

Compared to the 360S and the PS3 Slim, the WiiU will already be small, low power consumption and low heat output, and the benefits Nintendo could get from a shrink should be small compared to the 90nm and 65 nm PS360s. Perhaps smaller than for the current PS360s too.

AFAIK, Nintendo didn't shrink the N64, GC or Wii even though they could have. But when your case is small, you don't generate much heat, your PSU and regulation hardware is already cheap, and you already have small fans and cheap, quiet cooling, then the cost benefits aren't there and the penalty of using a newer, more expensive process isn't worth it.

Unless Nintendo have accidentally made something that's a bit of a first gen Xbox 360 (overheating, needing 4 tons of heatsinks + emergency heatsink, self destructing, fans maxxing out and causing ear-bleed) I think Nintendo will be happy to sit back and let the launch system run its course. Mind you, there have been those stories about overheating ... if they can't sort them I guess it could be a downclock now or a shrink later (or both).

I don't see much vagueness to be honest, they could hardly be expected to say "but please note that we will not integrate an AMD GPU on the die" even if they won't. It's a press release about an IBM processor, they won't be talking about what the product isn't! Of course, that does leave room for the idea that it might include a GPU, but...
If the die does contain a GPU, it would be a first on SOI as far as I know, and somewhat newsworthy, and as IBM wants to bang their custom silicon drum I'd expect them to talk about it.
Making a full custom chip like that with IP from two sides, and making it work, is one heck of a lot more complex than letting either company make their own chip, on familiar process nodes, and then letting the chips talk over an agreed upon interface. MUCH easier from a collaborative standpoint, less likely to run into deadline crushing snags, the individual dies are smaller and it is much safer to assume decent yields, and a problem on one end doesn't hold up either design work or production on the other.

I still find it odd that IBM didn't refer to making a CPU or "central processor"; they talk about making a processor that is the "heart" of the system. This is a press release for general consumption too. They have no problem talking about having made the Gamecube "central processor" in the same press release. That feels like deliberate vagueness to me. But perhaps I'm reading too much into this.

I think it'd be a pity if the GPU couldn't make use of all the eDRAM that's supposed to be on the IBM processor at the "heart" of the Wii U.

I thought Llano was SOI btw?

It is not a coincidence that the 360 didn't integrate CPU and GPU until two process nodes down the line.
While nothing official explicitly denies integrated CPU and GPU, there is nothing that confirms it either:
From AMD's press release:

I don't think a Wii U CGPU would be as big as an all-in-one 90 nm or 65 nm 360. AMD didn't have experience developing Fusion products back then either. IBM have also cut their teeth on the "45nm CPU + GPU SoC" now.

I don't think AMD (or IBM) would dare confirm something that Nintendo wanted secret! But they wouldn't be dishonest about it either. Has anyone been able to directly ask AMD or IBM about any of this?

Also note that there is NO mention of IBM in AMD's press release and NO mention of AMD in IBM's. For such a noteworthy collaborative effort as a joint design of a novel processor in service of a mutual customer, that's simply unheard of.

In the case of a CGPU, AMD wouldn't be able to mention it for the same reason IBM mumbled something about making the microprocessor which is the heart of the WiiU (but yeah they made the central processor for the Gamecube). However cool it might be, if Nintendo wanted it secret (and Nintendo like secrets) it won't be talked about.
 
If it's not a CGPU then I guess they could, but I rather feel that Nintendo won't be in a hurry to shrink or combine, and may never do it unless heat and cooling issues encourage them to do so.

Compared to the 360S and the PS3 Slim, the WiiU will already be small, low power consumption and low heat output, and the benefits Nintendo could get from a shrink should be small compared to the 90nm and 65 nm PS360s. Perhaps smaller than for the current PS360s too.

AFAIK, Nintendo didn't shrink the N64, GC or Wii even though they could have. But when your case is small, you don't generate much heat, your PSU and regulation hardware is already cheap, and you already have small fans and cheap, quiet cooling, then the cost benefits aren't there and the penalty of using a newer, more expensive process isn't worth it.

Unless Nintendo have accidentally made something that's a bit of a first gen Xbox 360 (overheating, needing 4 tons of heatsinks + emergency heatsink, self destructing, fans maxxing out and causing ear-bleed) I think Nintendo will be happy to sit back and let the launch system run its course. Mind you, there have been those stories about overheating ... if they can't sort them I guess it could be a downclock now or a shrink later (or both).

Makes sense. If you start out small, you can't really shrink it significantly anyway. Also, if your console isn't designed to have long legs, it doesn't make sense to spend money on a shrink even if you could.
 
So is there any credence to this 28nm process for MoSys 1T-SRAM?

What about MoSys Bandwidth Engine?

“Today, we have voice signals and data signals and video and so on, all travelling through these various system in data packets, and there are something like 12 to 14 different instances where a single data packet must be temporarily stored in memory while the packet information is analyzed. This is where the Bandwidth Engine excels, because it’s four times faster than traditional memory approaches. And by combining one ultra-high speed 10Gbps Serdes with four 1TSRAM blocks to create a Bandwidth Engine IC, we’re able to deliver four times the throughput of existing devices with two-to-four times the density and just 40 percent of the power requirement. All of this enables a system cost reduction of up to 50 percent, which is pretty significant.”

for a SoC or SoP?


http://siliconcowboy.wordpress.com/2010/06/15/mosys-fires-up-the-bandwidth-engine/
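
To put those serial-link numbers in perspective, here's some illustrative arithmetic only; it ignores protocol details, approximates encoding overhead with a simple factor, and is not a description of the Bandwidth Engine's internals.

```python
# Rough bandwidth of a single 10 Gbps SerDes lane, and how many such lanes
# would be needed to match a familiar parallel memory bus. Overheads are
# approximated with a simple 8b/10b-style factor; real links differ.
lane_gbps = 10.0
raw_gb_per_s = lane_gbps / 8                 # 1.25 GB/s raw per lane
effective_gb_per_s = raw_gb_per_s * 0.8      # assumed encoding efficiency

target_gb_per_s = 22.4                       # Xbox 360 GDDR3 bandwidth, for scale
lanes_needed = target_gb_per_s / effective_gb_per_s

print(f"Per lane: {raw_gb_per_s:.2f} GB/s raw, ~{effective_gb_per_s:.2f} GB/s effective")
print(f"Lanes needed to match {target_gb_per_s} GB/s: ~{lanes_needed:.0f}")
```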
 
MoSys 1T-SRAM it is, then?


I guess this nullifies the chances of getting high amounts of RAM (>=2GB) using dirt-cheap G/DDR3.

Though it's still unknown if the 1T-SRAM will be for graphics, system memory, or both.
 