Wii U hardware discussion and investigation

Status
Not open for further replies.
I know the 5650s are Redwood (I have one in my laptop). They're all right... but going higher than my laptop's panel resolution (1366x768) is a no-go with that thing, though mine is a lower-clocked DDR3 model. The fully clocked Redwoods (i.e. 650 MHz instead of my 450 MHz) with GDDR5 could handle that better.
 
Seems to me that the WiiU will have to be using a DX11-class GPU. There's pretty much no doubt that both the next Xbox and the next PlayStation will be at that level - as will PC for at least a few more years - so all games will be built with that base level of technology.

If the WiiU is DX11-capable then ports will be as simple as scaling back assets/resolution/framerate, but if it's DX10 then things are going to get a lot more difficult.
 
Maybe the WiiU is following a non-traditional trajectory like Nintendo's mobile line, where you supersede a previous product while bridging backwards to it - i.e. a shortened life cycle. In that case Nintendo may be looking at 2012-2015 as the prime window: leveraging PS3/360 software in 2012 and 2013 plus the stuff that trickles in past that, and knowing that many PS4/Xbox3 titles in 2014-2015 will be using older engines, the WiiU install base may influence publishers to support it initially to boost sales (if MS/Sony launch in 2014 the WiiU should already have 20M units in the wild, which pubs will not ignore).

There may be no reason to go out of their way to support DX11.x, as they have no intent of supporting software that far out - by then the WiiZ may be ready.
 
Nintendo needs at least 3-5 years of solid sales to cover all the R&D costs they have been racking up - even longer if the launch price is below cost (not likely). If this thing does not completely flop, I doubt they intend to launch a successor before 2017.

I think it would be super hard to go against the industry trajectory when the other platforms have already established install bases in the tens of millions. They have all the games and services, and cheaper pricing.
 
Seems to me that the WiiU will have to be using a DX11-class GPU. There's pretty much no doubt that both the next Xbox and the next PlayStation will be at that level - as will PC for at least a few more years - so all games will be built with that base level of technology.

If the WiiU is DX11-capable then ports will be as simple as scaling back assets/resolution/framerate, but if it's DX10 then things are going to get a lot more difficult.

Maybe the WiiU is following a non-traditional trajectory like Nintendo's mobile line, where you supersede a previous product while bridging backwards to it - i.e. a shortened life cycle. In that case Nintendo may be looking at 2012-2015 as the prime window: leveraging PS3/360 software in 2012 and 2013 plus the stuff that trickles in past that, and knowing that many PS4/Xbox3 titles in 2014-2015 will be using older engines, the WiiU install base may influence publishers to support it initially to boost sales (if MS/Sony launch in 2014 the WiiU should already have 20M units in the wild, which pubs will not ignore).

There may be no reason to go out of their way to support DX11.x, as they have no intent of supporting software that far out - by then the WiiZ may be ready.

Both of these are something I see as very likely.
 
Seems to me that WiiU will have to be using a DX11 class GPU. There's pretty much no doubt that both the next xbox and the next playstation will be at that level - as will PC for at least a few more years so all games will be built with that base level of technology.

If WiiU is DX11 capable then ports will be as simple as scaling back assets/resolution/framerate but if it's DX10 then things are going to get a lot more difficult.


I really have my fingers crossed that Wii U' GPU is DX11 class. That would be wonderful.
 
Nintendo needs at least 3-5 years of solid sales to cover all the R&D costs they have been racking up - even longer if the launch price is below cost (not likely). If this thing does not completely flop, I doubt they intend to launch a successor before 2017.

I think it would be super hard to go against the industry trajectory when the other platforms have already established install bases in the tens of millions. They have all the games and services, and cheaper pricing.

No they don't. A quick Google tells me the 2008-2009 R&D budget was ~450 million. In 2011 Nintendo made a 2 billion profit. So no, they won't need ~2/3 of the WiiU's lifecycle just to earn back R&D.
 
No they don't. A quick Google tells me the 2008-2009 R&D budget was ~450 million. In 2011 Nintendo made a 2 billion profit. So no, they won't need ~2/3 of the WiiU's lifecycle just to earn back R&D.

What does previous profit have to do with the Wii U? 2008-2009 R&D isn't even fully relevant to the Wii U. Yes, they don't need that long if you expect them to sell it at initial 3DS margins.
 
Assuming the chip they worked on is indeed the Wii U GPU, we'd be looking at a 40nm chip with a GDDR5 controller and ~625 million transistors - roughly in line with Redwood, which seems to fit the "2x 360" rumor popping up every once in a while.

If those 625M transistors are also carrying the dedicated DSP, the (rumoured) Cortex-A class ARM CPU for I/O and "sleep activity", embedded DRAM and other stuff, then what's left for the GPU itself is really disappointing...

Unless it's clocked at >1GHz or something.
 
What does previous profit have to do with the Wii U? 2008-2009 R&D isn't even fully relevant to the Wii U. Yes, they don't need that long if you expect them to sell it at initial 3DS margins.

It's not, but R&D spending probably didn't multiply just for WiiU development. So your claim that Nintendo needs 3-5 years just to earn back Wii U development is simply not true.
 
If those 625M transistors are also carrying the dedicated DSP, the (rumoured) Cortex-A class ARM CPU for I/O and "sleep activity", embedded DRAM and other stuff, then what's left for the GPU itself is really disappointing...

Why on God's green earth would you assume all that? (A Cortex-A5 isn't terribly heavy btw, but even an ARM11 is overkill for what you outline, and requires a grand total of 1 mm² on 40nm.)
I distinctly remember the fully custom PS3 GPU, as both Kutaragi and Jen-Hsun assured us it was. It turned out to be an almost plain vanilla product with some CPU-GPU communication bolted on. That will be needed on a Nintendo GPU as well; on the other hand there is limited need for PCI Express, and even less for IEEE 754 rounding compliance...

We don't know that the implied part is a Nintendo product, nor do we know if the part is intended to be used exactly as described in the case study, or if specific functional blocks are readily modified, and a plethora of other things. However, if we assume that the described part is indeed the Nintendo GPU, then we can safely dismiss GPU eDRAM, for instance. Why would a GPU of those dimensions, on a GDDR5 memory subsystem, need eDRAM? The CPU will have it - we know that from the horse's mouth, IBM. But the GPU?

If they are integrating stuff onto the GPU, it is likely the above mentioned CPU<->GPU bus, plus the usual housekeeping. Nothing that takes a lot of die area. It just isn't needed assuming a dedicated die for the CPU, and GDDR5 memory. (I'm assuming that whatever is needed for over-the-air communication with the tablet controller resides on its own chip.)

It is seductively easy to assume that wsippel's lead is indeed Nintendo's upcoming GPU, but it seems very premature to take it for granted. I sincerely doubt we'll ever know for sure. Even a year after the release of the 3DS, the real specs for its GPU still aren't public knowledge.
 
If those 625M transistors are also carrying the dedicated DSP, the (rumoured) Cortex-A class ARM CPU for I/O and "sleep activity", embedded DRAM and other stuff, then what's left for the GPU itself is really disappointing...

Unless it's clocked at >1GHz or something.

Would be right around the RV730 I predicted right along, which is 514m transistors.

It is seductively easy to assume that wsippel's lead is indeed Nintendo's upcoming GPU, but it seems very premature to take it for granted. I sincerely doubt we'll ever know for sure. Even a year after the release of the 3DS, the real specs for its GPU still aren't public knowledge.

Ehh, we know it pretty well, don't we? Anyway, there's more interest in consoles, so I assume we'll get more knowledge; we know the PS3, 360 and Wii well enough, even if those specs aren't public.

We will have a good handle on what's in all the consoles no matter how much they try to keep it secret. The iPad would be another example... Apple doesn't like to divulge its specs, yet we know them.
 
If those 625M transistors are also carrying the dedicated DSP, (rumoured) Cortex-A level ARM CPU for I/O and "sleep activity", embedded DRAM and other stuff, then it's really disappointing for what's left for the GPU itself..

Unless it's clocked at >1GHz or something.

All that crap you outlay is probably ~25M transistors or so; I don't think it cuts significantly into the budget. Other than the eDRAM, that is, whose size is unknown. If it's the rumored 32MB, that leaves ~350M transistors for the GPU, which maybe puts it in range of a console-modified 320:32:8 ~4670 product. Note that this is almost 2x the transistor count of Xenos, and they can probably achieve close to 2x the performance or so between clocks, more eDRAM and whatnot.

I'm thinking that if they're targeting 720p, as I suspect, it might only have 16MB of eDRAM, which brings you up to ~470M transistors - in range of a console-modified Redwood or maybe even a Turks product. Those could be 3x the performance of Xenos.
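A quick sketch of the arithmetic behind those GPU-budget figures (the ~1 transistor per eDRAM bit and the ~25M figure for the miscellaneous blocks are assumptions for illustration, not confirmed specs):

```python
# Rough transistor-budget check for the rumoured 625M die.
# Assumptions (not confirmed hardware specs): ~1 transistor per eDRAM
# bit (a 1T1C cell, ignoring array overhead) and ~25M transistors for
# the DSP/ARM/housekeeping blocks mentioned above.

TOTAL_TRANSISTORS = 625_000_000
MISC_TRANSISTORS = 25_000_000

def edram_transistors(megabytes):
    """Approximate transistor count of an eDRAM block at ~1T per bit."""
    return megabytes * 1024 * 1024 * 8

for mb in (32, 16):
    gpu_budget = TOTAL_TRANSISTORS - MISC_TRANSISTORS - edram_transistors(mb)
    print(f"{mb} MB eDRAM -> ~{gpu_budget / 1e6:.0f}M transistors for the GPU")
```

This lands at ~332M and ~466M transistors respectively, close to the ~350M and ~470M figures in the post once you round.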
 
All that crap you outlay is probably ~25M transistors or so; I don't think it cuts significantly into the budget.

A Cortex-A9 alone is ~26M transistors. The rumoured ARM CPU afaik is Marvell's quad-core Armada XP, so expect some ~100M transistors for the (secondary) CPU part.

Again, this is only if the Armada XP rumours are correct, which may not be, of course.
 
A Cortex-A9 alone is ~26M transistors. The rumoured ARM CPU afaik is Marvell's quad-core Armada XP, so expect some ~100M transistors for the (secondary) CPU part.

Again, this is only if the Armada XP rumours are correct, which may not be, of course.

Lol wat? Why would you need a 100M-transistor quad-core A9 for a standby network processor? Look up the Starlet core in the Wii - it's an ARM9E, and at least my expectation is that it's something similar. What was this quad ARM rumor?
 
What was this quad ARM rumor?

From a 2010 Engadget article:

Marvell's been teasing potent little processors for over a year now, but we've yet to see the firm's Armada appear in anything we'd actually want... but co-founder Sehat Sutardja just let slip that Marvell silicon will power a genuine game console of some sort. "Approximately 15% of the sequential increase [in quarterly sales] was due to the initial production revenue from our ARMADA application processors, primarily as a result of a major customer preparing to launch a new gaming platform," he told investors in a conference call last week, which roughly translates to "We just sold a load of processors for a new game console, yo" if our business-speak is correct. While there's absolutely nothing connecting this transaction to Nintendo's 3DS (which was confirmed to have a Pica200 GPU), we honestly can't think of a single other game platform slated to launch anytime soon -- so don't be surprised if there's a quad-core Armada 600 under that variably-stereoscopic hood.


http://www.engadget.com/2010/08/24/marvell-major-customer-launching-new-game-platform/

From there speculation went to the next Wii at the time.
 
Lol wat? Why would you need a 100M-transistor quad-core A9 for a standby network processor? Look up the Starlet core in the Wii - it's an ARM9E, and at least my expectation is that it's something similar. What was this quad ARM rumor?

An ARM9 wouldn't be able to send video+sound streams to the tablet controllers, so it's pretty obvious that either Nintendo leaves that to the main CPU or they'll have to significantly upgrade the functionality/performance in Starlet 2.

There's the 2010 claim by Marvell's president that significant shipments of Armada XP units were going to "a new gaming platform", and there's the claim that people at AMD have been calling the Wii U's GPU a System-on-Chip, which suggests it has an integrated application processor of some kind.

Check out page 22 on this very same thread.
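For a sense of why the tablet stream is beyond an ARM9-class core, here's the raw-bandwidth arithmetic (the 854x480 panel resolution, 60 fps target and ~50 Mbps usable 802.11n throughput are my assumptions for illustration, not confirmed specs):

```python
# Raw (uncompressed) bandwidth of the rumoured tablet video stream,
# versus a plausible Wi-Fi link. All figures here are assumptions.

WIDTH, HEIGHT = 854, 480   # assumed tablet panel resolution
FPS = 60                   # assumed frame rate
BITS_PER_PIXEL = 24        # 8-bit RGB

raw_mbps = WIDTH * HEIGHT * FPS * BITS_PER_PIXEL / 1e6
print(f"uncompressed stream: ~{raw_mbps:.0f} Mbps")

USABLE_LINK_MBPS = 50      # assumed usable 802.11n throughput
ratio = raw_mbps / USABLE_LINK_MBPS
print(f"requires ~{ratio:.0f}:1 real-time compression")
```

That works out to roughly 590 Mbps raw, i.e. ~12:1 sustained real-time video compression just to fit the link - the kind of job that needs a hardware encoder or a far beefier CPU than a Wii-era ARM9E running a software codec.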
 
An ARM9 wouldn't be able to send video+sound streams to the tablet controllers, so it's pretty obvious that either Nintendo leaves that to the main CPU or they'll have to significantly upgrade the functionality/performance in Starlet 2.

There's the 2010 claim by Marvell's president that significant shipments of Armada XP units were going to "a new gaming platform", and there's the claim that people at AMD have been calling the Wii U's GPU a System-on-Chip, which suggests it has an integrated application processor of some kind.

Check out page 22 on this very same thread.
They called Hollywood an SoC too. I wouldn't put too much stock in the ARM-in-a-console thing. I think Microsoft was the original contractor for those ARM units, but has moved on to a different target.
 