Wii U hardware discussion and investigation
So then you'd probably be looking at a 40nm GPU from AMD that is in the range of 40-50W TDP.

You can see that the 5570 series kind of fits (with some headroom for clock tweaks around 650MHz) within that TDP range. That's pretty much where the 400ALU idea came from. Mind you, it's already in the 600M+ transistor area.

If Nintendo wants something basically off-the-shelf, cheap and lower power:
http://www.anandtech.com/show/4307/amd-launches-radeon-e6760

35W including 1GB of GDDR5, 480 ALUs at 600MHz, and specifically designed for embedded platforms with a long support lifecycle.
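As a quick sanity check on those numbers, here's a back-of-the-envelope peak-throughput estimate (a sketch assuming the usual 2 FLOPs per ALU per clock for a multiply-add, not an official AMD figure):

    # Rough peak shader throughput for the Radeon E6760 (specs from the
    # AnandTech link above; 2 FLOPs/ALU/clock assumes one MAD per cycle).
    alus = 480
    clock_ghz = 0.6  # 600 MHz

    peak_gflops = alus * 2 * clock_ghz
    print(f"E6760 peak: {peak_gflops:.0f} GFLOPs")  # -> 576 GFLOPs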

Then team that up with 4 or so PowerPC 476FP cores and some eDRAM on a chip.

I don't see the point in Nintendo going with something weaker than that.
 
I don't see the point in Nintendo going with something weaker than that.

Sounds about right. Just adjust the clocks really. They could go with a higher clock and fewer ALUs or vice versa. Ballpark.


edit:

We also don't know what the characteristics of the CPU will be. OOOE + high clocks notwithstanding...

But anyways, I'll reiterate that the overall TDP does need to be pretty reasonable compared to what the 360S was dealing out, and I'm sure Nintendo will want to keep fan speeds low enough.

edit2:

Quite interesting to see the different TDP ratings for the 6670. Seems kind of odd that bumping the clocks to 800MHz and 4GHz GDDR5 results in a 66W TDP, up from 35W, while going down to 650MHz for the 6570 still yields a 60W TDP.


http://www.anandtech.com/show/4278/amds-radeon-hd-6670-radeon-hd-6570
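One plausible explanation is that the clock bump came with a voltage bump; to first order, dynamic power scales with frequency times voltage squared. A rough sketch (the voltage values below are invented placeholders, since AMD doesn't publish them):

    # First-order dynamic-power scaling: P ~ f * V^2. The voltages are
    # made-up placeholders to illustrate how a modest voltage increase on
    # top of 600 -> 800 MHz could nearly double the TDP.
    def scaled_power(p_base_w, f_base_mhz, f_new_mhz, v_base, v_new):
        return p_base_w * (f_new_mhz / f_base_mhz) * (v_new / v_base) ** 2

    print(f"{scaled_power(35.0, 600, 800, 1.00, 1.15):.0f} W")  # ~62 W, near the 66W rating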
 
BGASSASSIN: What was your original prediction, from your 'sources', of Wii U hardware power? Certainly the sheepish comments coming from Ninty don't exactly fill its fans with confidence... I'm really trying to think how they could design a GPU some 7 years later that's, best case, 'on par' with an Xbox 360... I mean Ninty didn't even try to drum up excitement... seriously, if it was more powerful in any way he would have said so, knowing full well that if he did it would produce more sales... knowing that, he didn't, which speaks for itself really.

Still, I can't even fathom how they could make it worse... even last year's Llano was more powerful... logic tells me it's IMPOSSIBLE for it not to be better... it really isn't... iPad 4 will smoke Wii U... that's embarrassing.

A quick look on AMD's website at the 4670: 320 ALUs, 1GB RAM... that alone would smoke a 360 in a console form factor... I mean it HAS to be better... AMD doesn't make GPUs that CAN be worse... I bet it's all some kind of trick... baffling.

AMD 4670:
-514M transistors
-DX10.1
-Shader Model 4.1
-320 shaders/ALUs
-8 ROPs
-32 TMUs
-750MHz
-128-bit bus with 1GB DDR3 @ 32GB/s
-Retailed at £55, Sept '08

That compares with a comparatively anaemic Xenos, with half the TMUs, 2/3 the shaders, 3/4 the frequency, DX9.0c/Shader Model 3.0, and half the RAM with 3/4 the bandwidth... and that came out in 2005. Seriously, they CAN'T make a worse GPU in 2012 if they tried.
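Putting rough numbers on the bandwidth part of that comparison (the Xenos figure assumes the commonly quoted 700MHz GDDR3 on a 128-bit bus; treat both as ballpark):

    # Memory bandwidth = bus width in bytes * effective transfer rate.
    def bandwidth_gbs(bus_bits, effective_mts):
        return bus_bits / 8 * effective_mts / 1000

    hd4670 = bandwidth_gbs(128, 2000)  # 1GHz DDR3, 2000 MT/s effective
    xenos = bandwidth_gbs(128, 1400)   # 700MHz GDDR3, 1400 MT/s effective
    print(f"HD 4670: {hd4670:.1f} GB/s, Xenos: {xenos:.1f} GB/s")
    # HD 4670: 32.0 GB/s, Xenos: 22.4 GB/s -- roughly the "3/4 bandwidth" above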

My original prediction came from looking at what was in the dev kit. Nintendo didn't give any GPU specs in their target specs, so that's all I had to work with. So the thought process for me was what would the GPU look like based on what was in the kit? That began to change once I heard things about what the final would look like so as time passed I started considering how a modern GPU with Nintendo characteristics would look.

Pasting what I linked earlier.

I better add some context to this. I wouldn't call it a conclusion. I would say it's no more than a hypothetical scenario out of what could easily be many scenarios IMO. This is based on the Wii U GPU (last I heard) having some unknown (by me) "Nintendo-designed" feature(s). Those unknowns could be anything. I think my version is slightly different from wsippel's. If these unknowns were fixed and/or programmable functions (no TEVs), they would have silicon dedicated to them. From there the GPU would have something like 400-480 ALUs. Run-of-the-mill ports would most likely only take advantage of the 400-480 ALUs, and in turn there would not be much of a difference, if any, between Wii U and PS360 games. Games built from the ground up would incorporate these functions on top of the ALUs, allowing for a more noticeable difference. So if this custom GPU were say 600MHz, just with the ALUs you're looking at 480-576 GFLOPs (2-2.4x Xenos). Games that incorporated the other hardwired functions while using the shaders would make the GPU equivalent to a 1 TFLOP or more GPU.

Again I just want to reiterate this is just a hypothetical of what the GPU might look like if it didn't resemble a "traditional" GPU that had 640-800 ALUs.
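For what it's worth, the arithmetic in that scenario checks out against the commonly quoted 240 GFLOPs peak for Xenos:

    # Sanity check on the hypothetical above: 400-480 ALUs at 600 MHz with
    # 2 FLOPs/ALU/clock, versus Xenos' commonly quoted 240 GFLOPs peak.
    XENOS_GFLOPS = 240.0
    for alus in (400, 480):
        gflops = alus * 2 * 0.6  # 600 MHz
        print(f"{alus} ALUs -> {gflops:.0f} GFLOPs ({gflops / XENOS_GFLOPS:.1f}x Xenos)")
    # 400 ALUs -> 480 GFLOPs (2.0x Xenos)
    # 480 ALUs -> 576 GFLOPs (2.4x Xenos)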

Now this is me thinking based on "primitive" ideology, but there does seem to be something to this.

Honestly I'm not even concerned about the GPU. Some have tried to view the "Wii U is weaker than PS360" talk as contradicting others who have said it was more powerful. I never really saw that. Ever since Arkam's posts, it sounded like the issue was more on the CPU side. And since then things have continued to point in that direction. To me it meant that either the CPU was a bottleneck, or the ones complaining were porting without bothering to optimize anything and in turn blaming Wii U for their own work. Given that there are also comments complimenting Wii U, to me that points to the latter explanation for those complaining.

Even then, the CPU shouldn't be an issue. The console has an ARM I/O controller and an audio DSP, and the console and controller both have hardware codecs for the stream.

And since the Gamecube I don't get hung up on Nintendo comments, or lack thereof, in that regard. After all they said the GC was capable of 6-12M polys/sec and Factor 5 was able to achieve 20M.
 
Ha ha... you could grab the world's top super-brains, give them a budget of say £500 million and 7 years, with a challenge to bring out a WORSE console than the Xbox 360 in 2012... they would FAIL :LOL:

Ninty on the other hand are dab hands at pulling poor components out of the bag! :p Check out the 3DS's processing power... lol... genius how they manage to sell that crap.
I can't wait for the final specs, though at this point I'm ready for anything...
But if indeed they can't match or exceed, in every such scenario I'm going to have a good laugh.
They'd better be super profitable on the device and have spent peanuts on R&D.
It sounds almost impossible... but if they pull it off... their R&D team needs to be sent cupcakes, plenty...
I mean, if they're on a budget there's no thinking twice: a POWER A2 module, tiny and cool, stuck to an RV730, a 128-bit bus, 1GB of GDDR5, and call it a day...
I don't even know why they bothered with eDRAM... talk about spending time and silicon on something far from critical if you basically have no horsepower...
I mean, a POWER A2 module comes with 4MB of L2; they could move to 2MB, pass on the eDRAM and produce it more cheaply using bulk @ 40nm (Samsung, TSMC).

It really sounds impossible to fail, but I still haven't dared answer Shifty's poll on the matter; my logic conflicts with my feelings...
 
If Nintendo wants something basically off-the-shelf, cheap and lower power:
http://www.anandtech.com/show/4307/amd-launches-radeon-e6760

35W including 1GB of GDDR5, 480 ALUs at 600MHz, and specifically designed for embedded platforms with a long support lifecycle.

Then team that up with 4 or so PowerPC 476FP cores and some eDRAM on a chip.

I don't see the point in Nintendo going with something weaker than that.

35W may well be too high given the WiiU's form factor, and that GPU would be way too big to be integrated into a 45nm CPU/GPU SoC (assuming they'd want to go that route).

Just a thought, but a SoC and/or customised GPU might explain why Nintendo were using relatively old GPUs in their early development kits. They could represent the point on the Radeon line where the technology branched off to get Nintendo specific customisations and then go to IBM for work on integrating into a single package. Or maybe Nintendo were originally planning on a standalone GPU (back in 2009) and something convinced them to go for something smaller and cheaper.

Until someone sees inside the WiiU or we hear about a GPU fab partner I'll be keeping the SoC dream alive!
 
35W may well be too high given the WiiU's form factor, and that GPU would be way too big to be integrated into a 45nm CPU/GPU SoC (assuming they'd want to go that route).

Even if Nintendo are taking the SoC route (and I'm not convinced), I don't see why such a GPU would be too big.

Look at Apple's A5X for instance, 163mm2 on a potentially very similar 45nm process, and able to fit into an iPad form factor at a BoM cost of US$23.

A SoC containing 4 PowerPC 476FP cores, an underclocked Turks GPU, and whatever other misc DSPs and IO are required, would probably be around the same size.
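As a very rough die-budget sketch for that idea (the 476FP core area is IBM's published 45nm figure, the Turks area is its standalone 40nm die naively scaled up, and the miscellaneous overhead is a pure guess):

    # Naive die-area budget for a 4x 476FP + Turks SoC at 45nm; area is
    # assumed to scale with the square of the linear feature size.
    turks_40nm = 118.0                        # mm^2, standalone Turks die (TSMC 40nm)
    turks_45nm = turks_40nm * (45 / 40) ** 2  # ~149 mm^2 after crude scaling

    cores = 4 * 3.6  # mm^2, IBM's figure for one 476FP core at 45nm
    misc = 15.0      # mm^2, guessed DSP/IO/memory-controller overhead

    print(f"~{turks_45nm + cores + misc:.0f} mm^2")  # ~179 mm^2, A5X territory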


Just a thought, but a SoC and/or customised GPU might explain why Nintendo were using relatively old GPUs in their early development kits. They could represent the point on the Radeon line where the technology branched off to get Nintendo specific customisations and then go to IBM for work on integrating into a single package..

Although I don't think it is likely, it would sure be interesting if what you say is true, and AMD used the opportunity to develop a separate very-low-power Radeon architecture for the Wii U, which they would then use as a direct competitor for PowerVR, Adreno, Tegra's GeForce, etc.
Just as Tegra's shader cores are supposedly semi-based on the NV40 architecture, a future AMD very-low-power GPU architecture could have the RV7xx architecture as its basis.
 
The size of the PS3 slim doesn't compare that badly to the Wii U when you consider the PS3 slim consumes up to 120W, and has to contain a 250W power adaptor and a 2.5" hard drive in a removable metal caddy as well.

And a small HTPC such as the ASRock Core 100HT (e.g. http://www.legitreviews.com/article/1568/1/) is only 20% larger in volume than the Wii U, despite being built with a degree of user access and servicing in mind.
 
Even if Nintendo are taking the SoC route (and I'm not convinced), I don't see why such a GPU would be too big.

Look at Apple's A5X for instance, 163mm2 on a potentially very similar 45nm process, and able to fit into an iPad form factor at a BoM cost of US$23.

A SoC containing 4 PowerPC 476FP cores, an underclocked Turks GPU, and whatever other misc DSPs and IO are required, would probably be around the same size.

I know it's not particularly scientific, but here's my thinking:

Look at the GPU on a 32nm Llano, then imagine it on a 45nm process (just the GPU and IO, ignore the bulky CPU). And then add the rumoured 32MB of eDRAM to it.

If Turks were viable in a 45nm, eDRAM-heavy SoC then I think you have to wonder why something weaker is taking up about half of Llano.

http://pc.watch.impress.co.jp/docs/column/kaigai/20101104_404182.html
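To put rough numbers on that (Llano's die is about 228mm2 at 32nm; the "half the die" GPU share is eyeballed from the die shot in the link above):

    # Crude scaling of Llano's GPU half from 32nm to 45nm, assuming area
    # grows with the square of the linear feature size.
    llano_die = 228.0       # mm^2 at GlobalFoundries 32nm
    gpu_share = 0.5         # eyeballed GPU + IO fraction of the die
    scale = (45 / 32) ** 2  # ~1.98x area penalty going to 45nm

    print(f"~{llano_die * gpu_share * scale:.0f} mm^2 before any eDRAM")  # ~225 mm^2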

Although I don't think it is likely, it would sure be interesting if what you say is true, and AMD used the opportunity to develop a separate very-low-power Radeon architecture for the Wii U, which they would then use as a direct competitor for PowerVR, Adreno, Tegra's GeForce, etc.
Just as Tegra's shader cores are supposedly semi-based on the NV40 architecture, a future AMD very-low-power GPU architecture could have the RV7xx architecture as its basis.

I was just thinking of them doing a one-off sideline GPU for Nintendo (kind of like Xenos) but I guess any experience they gained would feed back into the main Radeon line (again, a bit like Xenos I guess). That has reminded me of something though, and that's Bobcat.

Bobcat was all about making low power processors that could, using tools, be painlessly scaled in and adapted for different manufacturing processes. Doesn't seem to have worked out so well with their 28nm stuff (which has abandoned GF and jumped to TSMC, incurring a large delay) but 3X whatever is in Zacate (or even 2X with sufficient clocks) could easily be a match for the 360. Maybesortof ...
 
That has reminded me of something though, and that's Bobcat.

Bobcat was all about making low power processors that could, using tools, be painlessly scaled in and adapted for different manufacturing processes. Doesn't seem to have worked out so well with their 28nm stuff (which has abandoned GF and jumped to TSMC, incurring a large delay) but 3X whatever is in Zacate (or even 2X with sufficient clocks) could easily be a match for the 360. Maybesortof ...

One of the few things we do know is that IBM is providing the CPU component for the Nintendo Wii U on a 45nm SOI process, and that it features IBM's eDRAM technology in some way.

Anyway, the PowerPC 476FP, based on its specifications, seems to have a similar design methodology to Bobcat but is smaller (3.6mm2 on 45nm IBM vs. possibly 4.6mm2 on 40nm TSMC), has lower power consumption (1.6W at 1.6GHz on 45nm IBM vs. ?), and greater theoretical ILP.
Using the Power ISA rather than x86 is probably an advantage for Nintendo, considering matters of backwards compatibility and porting from the Xbox 360 and PS3.
 
Oh, I wasn't talking about using x86; I was talking about using AMD's graphics technology, memory controller and IO, and incorporating that into a processor manufactured on IBM's 45nm process (seeing as we know that's what the CPU and probably the eDRAM are being made on). Pretty much what the hugely successful 360S CGPU is, only with the eDRAM on there as well (less power, less silicon, simpler package).
 
Ha ha... you could grab the world's top super-brains, give them a budget of say £500 million and 7 years, with a challenge to bring out a WORSE console than the Xbox 360 in 2012... they would FAIL :LOL:

Ninty on the other hand are dab hands at pulling poor components out of the bag! :p Check out the 3DS's processing power... lol... genius how they manage to sell that crap.
Doesn't the 3DS even have dedicated VRAM and a second CPU locked out (recently unlocked with newer devkits)? I wouldn't exactly call them using poor components (otherwise why would they go out of their way to stick in "exotic" stuff like 1T-SRAM?), but some of their design choices are... unique, to say the least.

I'm kinda confused as to whether or not they are going for an SoC design.
 
Doesn't the 3DS even have dedicated VRAM and a second CPU locked out (recently unlocked with newer devkits)? I wouldn't exactly call them using poor components (otherwise why would they go out of their way to stick in "exotic" stuff like 1T-SRAM?), but some of their design choices are... unique, to say the least.

I'm kinda confused as to whether or not they are going for an SoC design.

The technology in the 3DS is abysmal... seriously... it may have 2 CPUs, but they originated from 2003 or something and are clocked at a pathetic rate.
The screen resolution/size is poor... the battery life is poor... and the GPU... well, that originates from 2006!! :eek:

BGASSASSIN: Yeah, the CPU is going to be important... it will be very interesting to see what happens with this console... seriously, they seem to milk their generous fans a bit too often... there's only so much that can be done with software... the devs won't thank Ninty if they have to redesign ports :cry:

If any of the rumours about the power of the PS4/720 are to be believed, then this may be even worse than the Wii > PS360 gap.

The DS/Wii had genius features that were brand new... Nintendo did a fantastic job of marketing and created a little niche for itself... however, game quality = revenue tells its own story...
The point being, Ninty was able to gloss over the poor hardware with revolutionary gimmicks... unless there's something they haven't told us, they won't have that advantage this time.
 
The focus on eDRAM seems ill advised... really, with something that could be akin to an RV730 (or less...), slow GDDR5 should have provided enough bandwidth.
I guess Nintendo wants to use some of the cheapest DDR3 around. As a trade-off they go with a more expensive process that includes eDRAM. They also go for a less dense process (45nm vs 40nm).

Altogether, I no longer know what they are doing as far as hardware is concerned. There are workloads where CPU clock speed is what matters (as stated by many clever people here) and other situations where SIMD power is what counts. Three cores with low clocks (and so low throughput) won't cut it.

OoO may make the thing convenient, but it won't cut it...
eDRAM may provide more bandwidth, but if the shader power isn't there...
Looking at the system they wanted to put together, I wonder how they came to the conclusion that bandwidth was the main issue...
 
BGASSASSIN: Yeah, the CPU is going to be important... it will be very interesting to see what happens with this console... seriously, they seem to milk their generous fans a bit too often... there's only so much that can be done with software... the devs won't thank Ninty if they have to redesign ports :cry:

If any of the rumours about the power of the PS4/720 are to be believed, then this may be even worse than the Wii > PS360 gap.

The DS/Wii had genius features that were brand new... Nintendo did a fantastic job of marketing and created a little niche for itself... however, game quality = revenue tells its own story...
The point being, Ninty was able to gloss over the poor hardware with revolutionary gimmicks... unless there's something they haven't told us, they won't have that advantage this time.

IMO even if Wii U retained "DX10.1-level" shaders (which I don't expect), it wouldn't be as bad as the architectural differences Wii had vs PS360. There are two things that Nintendo needs to do that I also expect them to do.

1. Make sure 1st party and 3rd party exclusives look great.

2. Make sure the hardware is capable of receiving ports in the future.

Number 2 will show how much they listened to developers. Though after the software drought they began to see with Wii, common business sense would suggest they will try to avoid that happening again and do "just enough" with their hardware to achieve that.
 
35W may well be too high given the WiiU's form factor

People are saying that they will be disappointed if the WiiU is underpowered compared to the PS360, and wonder how Nintendo could develop something like that. But I think you bring up a point that many forget, and that's the small form factor and likely very low power requirements.

I personally would find it very impressive if Nintendo developed a console with processing power similar to the 360's but at one third to one half of the 360's power consumption. People need to consider that system design is an optimization of several metrics, not just processing power, and that a console can be cutting edge without being an uber processing machine.
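In performance-per-watt terms that would be a real jump. A quick sketch, taking a launch 360 at roughly 180W at the wall (an approximate figure, not from this thread) and a hypothetical Wii U at equal performance:

    # Relative performance-per-watt at equal performance but lower power.
    XBOX360_LAUNCH_W = 180.0     # approximate wall power of a launch 360
    for wiiu_w in (60.0, 90.0):  # one third to one half, per the post above
        print(f"{wiiu_w:.0f} W -> {XBOX360_LAUNCH_W / wiiu_w:.1f}x the perf/W")
    # 60 W -> 3.0x the perf/W
    # 90 W -> 2.0x the perf/W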
 
People are saying that they will be disappointed if the WiiU is underpowered compared to the PS360, and wonder how Nintendo could develop something like that. But I think you bring up a point that many forget, and that's the small form factor and likely very low power requirements.

I personally would find it very impressive if Nintendo developed a console with processing power similar to the 360's but at one third to one half of the 360's power consumption. People need to consider that system design is an optimization of several metrics, not just processing power, and that a console can be cutting edge without being an uber processing machine.

Honestly I wouldn't because IMO MS and Sony could be more efficient in their cooling/case design.
 
Anyway back on topic..

IMO even if Wii U retained "DX10.1-level" shaders (which I don't expect), it wouldn't be as bad as the architectural differences Wii had vs PS360. There are two things that Nintendo needs to do that I also expect them to do.

1. Make sure 1st party and 3rd party exclusives look great.

2. Make sure the hardware is capable of receiving ports in the future.

Number 2 will show how much they listened to developers. Though after the software drought they began to see with Wii, common business sense would suggest they will try to avoid that happening again and do "just enough" with their hardware to achieve that.
I have no doubt that the games will look "good"... current-gen games look "good"... it depends on what you define as looking "good". For 12 months they will look like current gen: long in the tooth, but "good"... then when the PS720 arrives they will start to look "last gen" = "not good". Don't forget Sony and Microsoft also have gimmicks, so the 'niche' no longer exists...
The second point is related to the first... if they haven't listened to the devs and the ports can't be done properly, then they are screwed... sales this time won't be as good as the Wii's... there will be even less incentive to make the effort. For that NOT to happen, we need to see a 5570/6670 GPU + a quad-core OoO POWER6 derivative and at least 1GB of RAM... with decent bandwidth, 80-100GB/s.
I personally would find it very impressive if Nintendo developed a console with processing power similar to the 360's but at one third to one half of the 360's power consumption. People need to consider that system design is an optimization of several metrics, not just processing power, and that a console can be cutting edge without being an uber processing machine.
No, it can't be cutting edge without advanced processing... because apart from the controller input, processing is all there is to designing the hardware that makes the difference... that doesn't mean they need to go crazy, just a nice balance where they use modern parts without going power mad... there is a little pocket ready-made for Ninty to survive in, a bit like AMD... but they need to get the balance right... bringing in hardware that's at best 'on par' with 7-year-old consoles isn't it.
Here, check out these few posts in the other thread to see what's possible in that form factor... really, you don't need to work at NASA to be able to pull it off in 2012:
http://forum.beyond3d.com/showthread.php?t=31379&page=448
 
I have no doubt that the games will look "good"... current-gen games look "good"... it depends on what you define as looking "good". For 12 months they will look like current gen: long in the tooth, but "good"... then when the PS720 arrives they will start to look "last gen" = "not good". Don't forget Sony and Microsoft also have gimmicks, so the 'niche' no longer exists...
The second point is related to the first... if they haven't listened to the devs and the ports can't be done properly, then they are screwed... sales this time won't be as good as the Wii's... there will be even less incentive to make the effort. For that NOT to happen, we need to see a 5570/6670 GPU + a quad-core OoO POWER6 derivative and at least 1GB of RAM... with decent bandwidth, 80-100GB/s.

Even if they do well and aren't screwed with ports, I don't expect Wii U to sell as much as Wii. However I expect the console market to shrink next gen anyway, but that's another story.

POWER6 is in-order, so we can eliminate that, and from what I understand three OoO cores are still the target. There was a memory target range with 1.5GB (at least last year) being the max, and lherre indicated Nintendo was going with the max. I also don't see it having that much BW, but we'll see. The target eDRAM amount was 32MB, so I don't know if that is still the same or saw an increase. Then of course there's wondering how this GPU is going to turn out. That might be part of why Nintendo didn't give GPU specs in the early target specs.
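For context on that 32MB figure, it would comfortably hold a full 1080p render target set on-die. A back-of-the-envelope tally (assuming 32-bit colour and 32-bit depth/stencil with no MSAA; a simplified scenario, not a known Wii U configuration):

    # Does a basic 1080p render setup fit in 32MB of eDRAM? Assumes 4 bytes
    # per pixel for colour and for depth/stencil, with no multisampling.
    width, height, bytes_per_pixel = 1920, 1080, 4
    buffer_mb = width * height * bytes_per_pixel / 2**20  # ~7.9 MB each

    total_mb = buffer_mb * 3  # front + back + depth/stencil
    print(f"{total_mb:.1f} MB of the 32 MB target")  # ~23.7 MB, with headroom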
 