Wii U hardware discussion and investigation *rename

Okay, and now that the console has been out for 7 years how much non-GPU stuff has been done on Xenos? Probably not an awful lot. That's much more telling than anything anyone may have claimed about XBox 360 a long time ago.

My point in question here was not so much whether it has been done, but rather that it was used in defense of the 360 CPU. But as I can see from other posts it turns out that it probably has been used to some degree at least. Though I doubt it's something that devs would always be willing to tell the press about if they did, for various reasons.

R7xx is where all the rumors have been pointing for a long time. 3-4 years old does not qualify as almost current gen. I've also not heard of any extra features except for the eDRAM. Could you provide any sources on this?

Nevermind what exact version the GPU is, it is very likely at least five years younger than the 360 GPU, with at least a doubling of the shader units, most probably many more.
And no, I'm sorry I don't have time to look for sources right now, and they would probably not satisfy you anyway because they are mostly (well grounded) rumours and interpretations from the official information. If you are really interested you should go googling yourself.

If you want to believe that the GPU will be used to pick up the slack in lack of vector FP on the CPU then that's where those extra shader units will be going instead of towards improving the graphics. You can't have it both ways.

Well, to some degree you can. That's the advantage of programmability. But of course it would be a balancing act for individual situations, just like the SPEs in Cell.

These sorts of posts really irk me. It's basically a personal attack against a perceived prejudice without naming names or presenting evidence in support of the view. Who are these people? I recall talk of stuff like physics on Xenos as a possibility, but I'm not sure it was ever presented as a sure thing. I certainly couldn't name anyone proposing/supporting that idea. And as it pans out, they would have been wrong, it seems. Xenos does graphics, not GPGPU. So anyone who thought Xenos was able to do physics back in 2005 would, in all fairness, be entitled to change their mind as it turns out their belief in GPGPU was unjustified back then.

It can never be a personal attack if I don't mention names. It would just be a matter of using the search function to see who I'm talking about, though.
Again, it's not important whether they turned out to be right, but that they are not inclined to say the same thing for Wii U, which has a newer architecture and is most likely, to some degree, better prepared for GPGPU-like functionality.

Except we don't know the (practical) BW of the eDRAM.
Well, why not give Nintendo the benefit of doubt for the time being?
 
That's the thing, I don't doubt we will see GPU compute used on the Wii U for some things, just as it's been used on the 360.

What I don't believe is that Nintendo gave the WiiU a weak-sauce CPU because they planned for GPGPU to "take over" all the work in "CPU centric" PS360 ports. Nintendo gave the WiiU a cheap CPU because they wanted to put a cheap CPU in the Wii. :D


Probably true, to an extent.

I think (and this is pure cold-blooded speculation) that their design brief was to produce a machine capable of receiving current gen ports, be backwards compatible with Wii and have an architecture/feature set that wouldn't lock it out of receiving next gen ports (obviously with some downgrading) as happened with the Wii - which, lest we forget, couldn't run the most popular 3rd party engines of the day used by most major studios.

They produced this, and then probably set about "trimming the fat" so to speak: fine tuning, tweaking and cutting down 'excess' until the bare minimum remained which could still fulfill the specification set out in the design brief, at the cheapest price. Of course this runs the risk of ending up with a product which is too bespoke and alienates a large portion of 3rd party developers who don't have the time/resources or financial incentive to try to get their game ported over from a completely different, well established development environment, onto a platform with no room to manoeuvre and no extra grunt to power through the problems.

And even if the next gen of consoles follow the same layout (relying on the GPU to take some work off the CPU) and even if the WiiU can run UE4, CryEngine3 etc just fine, they may well have gimped the CPU so much that it's still too much of a hassle to port games, even downgraded significantly from much more muscley* systems.

/wild speculation.

*that's my new favourite term.
 
If the eDRAM is nothing more than embedded DDR3 (or equivalent) going over the DDR3 bus then it probably would have been cheaper for them to make it external.

I get function's argument that in time DDR3 would go up in price as it became scarce but this isn't actually how things have worked. DDR and DDR2 DIMMs for motherboards are scarce and expensive but the chips themselves are still made and made very inexpensively - for instance a lot of very cheap handhelds and tablets favor DDR2 over DDR3 or LPDDR2 because it's cheaper. It's currently being made by a variety of Chinese and Taiwanese manufacturers, many of whom aren't even making DDR3 yet (or weren't last I checked).
 
If the eDRAM is nothing more than embedded DDR3 (or equivalent) going over the DDR3 bus then it probably would have been cheaper for them to make it external.

Looking at the end of last year, when 28nm TSMC prices were sky high, you could reportedly get a 12" wafer for around $4,000 to $5,000*. With 35mm^2 of edram, assuming the worst case of $5,000, and assuming shitty yields of 50% for a processor as small as the WiiU's, that would see you at around $1.20 for that 32 MB of embedded memory.

(* http://www.xbitlabs.com/news/other/...g_on_28nm_Wafers_Due_to_Increased_Demand.html )

And that was a year ago when prices were higher, and for 28nm instead of the dated 40nm process the WiiU is likely to be made on. And according to Nvidia's presentation on transition between process nodes (from early this year iirc) the cost would be a fraction of that of the 28nm node last year.

Granted, I'm drunk, but even now that compares favourably in terms of cost per MB/s to using 8 x 256MB chips once you factor in a single $ for a more complex motherboard for the double width memory bus. And in 5 years things would only get worse and worse if you'd chosen the 8 memory chips and 128-bit bus over the edram.
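
For anyone wanting to sanity-check that kind of estimate, here's a quick Python sketch of the wafer maths using the same illustrative inputs ($5,000 per 300 mm wafer, 35 mm^2 of edram, 50% yield). How you account for edge dies and where you apply the yield loss easily shifts the per-chip figure by a few dollars either way, so treat the output as ballpark only:

```python
# Rough edram cost-per-console sketch. All inputs are the illustrative
# assumptions from the post above, not confirmed figures.
import math

wafer_diameter_mm = 300      # 12" wafer
wafer_price_usd = 5000.0     # worst-case 28nm wafer price cited
edram_area_mm2 = 35.0        # assumed edram area on the die
yield_rate = 0.5             # pessimistic yield assumption

wafer_area_mm2 = math.pi * (wafer_diameter_mm / 2) ** 2   # ~70,700 mm^2
cost_per_mm2 = wafer_price_usd / wafer_area_mm2

# Charging the cost of failed dies to the good ones (divide by yield):
edram_cost = cost_per_mm2 * edram_area_mm2 / yield_rate
print(f"~${edram_cost:.2f} of edram silicon per good console chip")
```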

I get function's argument that in time DDR3 would go up in price as it became scarce but this isn't actually how things have worked. DDR and DDR2 DIMMs for motherboards are scarce and expensive but the chips themselves are still made and made very inexpensively - for instance a lot of very cheap handhelds and tablets favor DDR2 over DDR3 or LPDDR2 because it's cheaper. It's currently being made by a variety of Chinese and Taiwanese manufacturers, many of whom aren't even making DDR3 yet (or weren't last I checked).

The only figures I have to go on are from DRAM Exchange (http://www.dramexchange.com) but they show 1GB of 800 MHz DDR2 (8 x 128MB) going for well over half the price of 4GB of 1600 MHz DDR3 (8 x 512MB), and 1GB of DDR3 1333 MHz (8 x 128MB) being around half the price of the massively slower DDR2 alternative.

Unless you have some alternative figures I can only conclude that DDR2 offers much worse value compared to DDR3 (which is 2~2.5X the value even *excluding* the huge gains in BW) and that things are just going to keep getting worse as volume drops (as was the case for DDR1).

(LPDDR2 seems to be used primarily for low voltage and power consumption in mobile devices, where DDR3 doesn't yet seem to be able to compete, so it can justify a higher price).
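
To make that value comparison concrete, here's a trivial sketch with hypothetical placeholder prices (not actual DRAMeXchange quotes) chosen only to match the relative relationship described above:

```python
# Hypothetical prices matching "1GB of DDR2-800 costs well over half the
# price of 4GB of DDR3-1600"; placeholders for illustration only.
ddr2_price_1gb = 9.0    # hypothetical: 8 x 128MB DDR2-800
ddr3_price_4gb = 16.0   # hypothetical: 8 x 512MB DDR3-1600

ddr2_per_gb = ddr2_price_1gb / 1
ddr3_per_gb = ddr3_price_4gb / 4

print(f"DDR2: ${ddr2_per_gb:.2f}/GB, DDR3: ${ddr3_per_gb:.2f}/GB")
print(f"DDR3 gives ~{ddr2_per_gb / ddr3_per_gb:.2f}x the capacity per dollar, "
      "before even counting its bandwidth advantage")
```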

It may be that the WiiU has some super fast pool of edram (but developers have been unable to use it properly for some reason), but even if it doesn't, I think edram would pay for itself over a double width external bus with double the memory chips over the long term.
 
Nevermind what exact version the GPU is, it is very likely at least five years younger than the 360 GPU, with at least a doubling of the shader units, most probably many more.

Wait! Don't drink the Kool-Ai ...
...

:(

Probably true, to an extent.

I think (and this is pure cold-blooded speculation) that their design brief was to produce a machine capable of receiving current gen ports, be backwards compatible with Wii and have an architecture/feature set that wouldn't lock it out of receiving next gen ports (obviously with some downgrading) as happened with the Wii - which, lest we forget, couldn't run the most popular 3rd party engines of the day used by most major studios.

They produced this, and then probably set about "trimming the fat" so to speak: fine tuning, tweaking and cutting down 'excess' until the bare minimum remained which could still fulfill the specification set out in the design brief, at the cheapest price. Of course this runs the risk of ending up with a product which is too bespoke and alienates a large portion of 3rd party developers who don't have the time/resources or financial incentive to try to get their game ported over from a completely different, well established development environment, onto a platform with no room to manoeuvre and no extra grunt to power through the problems.

And even if the next gen of consoles follow the same layout (relying on the GPU to take some work off the CPU) and even if the WiiU can run UE4, CryEngine3 etc just fine, they may well have gimped the CPU so much that it's still too much of a hassle to port games, even downgraded significantly from much more muscley* systems.

/wild speculation.

*that's my new favourite term.

I think your last point about them possibly gimping the system too much to receive "next gen" ports is probably the case. There's only so far an engine can practically scale even with the same feature set, and a game running on that engine will usually scale less. With the CPU being as it is, Nintendo have already ruled the WiiU out of receiving some current generation ports, while other ports have struggled. Next gen systems are likely to have several times the CPU power and several times the GPU power, so it's not like the WiiU will have much room for GPGPU to save the day, even if it were possible for some routines to be ported from the CPU to the GPU.

*There's only so much muscle you can lose before you can't even pick up the bar. ;)
 
Perhaps you're right, but I can't help thinking that some of the die area on the edram is used primarily to reduce costs related to the main memory bus and to allow for Wii BC (in the absence of a fast CPU and GPU). The Xbox 360's software emulation of Xbox 1 has perhaps made emulation powered BC look easy, but that was a massive and ambitious effort by someone who appears to be a genius OS guy, and it was by no means the complete solution that Nintendo have typically gone for in their portable consoles and in the Wii and WiiU.
Well, with regard to cost saving, and looking at what seems to be the performance of the chip, I really wonder if eDRAM is cost effective. There are plenty of cards (GPUs) that are dirt cheap and ship with a 128-bit bus. I'm lazy right now, but a while ago (a couple of months) Alstrong posted a link with serious estimates of the BOM of various AMD GPUs. It would be interesting to look at the price of the mobo for a low end GPU shipping with a 128-bit bus (there could be a slight difference depending on memory type, but in the grand scheme of things I guess we could discard it as an epsilon).

WRT the 360 and BC, indeed it seems like it was quite a "tour de force". Hasn't BKillian stated that it was achieved through a virtual machine? Anyway, pretty impressive stuff, over my head ;).

Back to Nintendo, I wonder whether, had they stuck with a PPC with a really close ISA, BC would have proved such a big task to achieve in software. Most likely far from trivial, but possibly worth the investment vs grounding the design too much in the last decade (or prior to that).
Regarding pure performance, Trinity desktop processors on a 128-bit DDR3 bus seem to offer far more performance than the WiiU and Xbox 360 (massively, massively more on the CPU front), so I do think it has to be a cost thing and a BC thing.
The scary part is that I never actually dug much into what Broadway and Gekko consisted of, but I hope Exophase is right and IBM did some, for now undisclosed, work on Espresso.
Damn, they should perform significantly slower than the PPC 47x series quite a few people were expecting Nintendo to use - quite significantly :(
It would not have matched AMD's big cores for sure, but the picture would be less ugly. I would assert, though someone like Exophase could confirm whether I've got it right, that those PPC 47x cores should perform per cycle pretty close to the PPC 7447 / G4 derivative in the benchmark ERP provided (putting aside FP performance). They should also be pretty close to, possibly better than, Bobcat.

But there are other issues, like clock speed. I'm not sure my memory serves me right, but I believe those CPUs burn 1.5 W @ 1.6GHz. I have a tough time understanding why Nintendo would stick to Broadway, and I'm puzzled by the clock speed; I really hoped for 1.6GHz, as per IBM data it seems achievable in a really reasonable power envelope :cry:


It's very kind of you to think my opinion is worth listening to, but you shouldn't put me on the same level as Exophase! You probably know more about processors and low level performance issues than I do - I know you follow this closely on B3D. I'm really just a console warrior who came to B3D a long time ago and over the years has gradually given it up (no doubt influenced by the atmosphere here) and actually started trying to learn stuff.
Well, no offense, but I knew that, unlike Exophase, you're not in the semiconductor business; I was not putting you on the same level. Though I do value your opinion, as I find that you make sense more often than not and I can't spot any bias in your posts, which I greatly appreciate :)
As for knowledge, well, I'm not sure I know more than you do; we might read exactly the same things. I have close to no academic knowledge on the matter; actually I learned more by reading the old articles on Ars Technica (the kind of encyclopedia they had, now pretty tough to access) than I did in my short "electronics lessons", whatever those included (damn, I'm getting old lol).

Let's say that we are both fit for Dominik D's signature... :LOL: I don't mean that in any sarcastic way; there is nothing wrong with it as long as one is honest in his opinions, acknowledges when he is completely off, doesn't pretend to be what he is not, and so on.
I think, having followed Nintendo for 20+ years, that you could be correct and that there may have been other options that would have given Nintendo more performance for a similar cost per unit manufactured. But Nintendo value familiarity (who doesn't?) and they also understand the value of the right level of backwards compatibility for certain customers. Being a conservative company, I think they plan BC in at an early stage (unlike MS) and plan to do it cheaply (unlike Sony, who just include an almost complete version of the old system).
I do get that BC is important to them, but I'm not qualified enough to know if achieving BC was as much of a challenge as it was for MSFT moving from the Xbox to the 360.

I don't know what the R&D costs of doing that would be, but I get the feeling with Nintendo that they are also very conservative with R&D as part of their approach to minimising risks. You saw it with the N64 (originally offered to and tweaked by someone else) and the Wii (an overclocked GC, almost). I don't think Nintendo would spend hundreds of millions of dollars on a custom architecture like MS or Sony would.
I feel like, again, BC is the determining factor here. As far as R&D is concerned, I could almost defend the point that building a SoC out of existing elements (namely PPC 470s and a Radeon from the 5xxx or 6xxx series, all on TSMC's 40 nm process) might have been a lesser effort.
They went through the effort of "enhancing" Broadway (even though we don't know the extent of the changes, it looks pretty minimal), they have a custom GPU which includes eDRAM, they had to use an MCM, etc.
I'm really not sure all of that cost them less money, or will cost them less to produce: more expensive wafers and lithography, more chips to test, more assembly and testing, etc.

Really, I think that 12 (11?) years after the GC was introduced they have had time to think about and design a software solution for BC, and to make design decisions freed from this constraint. I mean, if they release a new system in 4/5 years, will they again design it with a close to 20 year old design in mind? Imho, they should have been planning around that issue for a while.
Do you know what that 3rd tiny die on the WiiU package is? I don't. What the hell is that? I think it's likely that MS or Sony would have spent the cash to integrate that component - whatever it is - into another chip from day one.
No idea; Anandtech did not know either, but I don't think it is memory. Weirdly enough I've been wondering if it could be the north bridge/memory controller ( :LOL: weird, I know, but the idea has been floating around my head for some reason).

I don't know for sure, but using an older process like 40nm, and taken over a period of 6~8 years (when DDR3 will be expensive), I think getting a slightly larger GPU to minimise the number of memory chips and board complexity will probably pay for itself.
I doubt it myself; DDR3, as someone pointed out, should be available for quite a while from many cheap providers (Chinese foundries, etc.), and I'm not sure that a 128-bit bus is that much of an expense at this point in time.

Yeah, I don't have a problem with Nintendo wanting to be competitive either, but I think with the WiiU they may have missed an opportunity by being a little too conservative on the hardware. A faster CPU and a relatively small bump in everything else would have seen them laughing off the PS360 and giving the impression (even if it wasn't true) that they could perhaps handle PS4720 ports.
Indeed, I believe they could have been both cheaper and better. I could see some core gamers with significant buying power falling for it if it had provided them with the best graphics, even for only one year. Damn, that would have been pretty easy to achieve.

I agree, it wouldn't have taken much to outperform the PS360 and if it had gotten more users, more developers and more engines on board then it couldn't have hurt. Nintendo seem to think (right or wrong) that money committed to building hardware is dead money and so they seem reluctant to do it.
Well, they seem to think that their brand power is somehow "immortal". I would be a tad more wary: any users they don't rally may be locked into another system (or app store, soon) and unwilling to move, and they should not dismiss the possibility that their user base might erode too, with few chances to reverse the trend looking forward (see the 3DS and how casuals rely on their phones more and more for occasional gaming). To me they are in fact making a dangerous bet, avoiding risk in the short term and making a fair amount of money instead of facing a more serious trend in the mid term (handheld erosion, smart TVs, various boxes, tablets and the interaction between those devices).
Imo they needed to secure core gamers; the biggest publishers have been loud about how the market needed a new system, etc. They would have got the support.

Anyway, sometimes I wonder if they are just minimizing risks, taking the easy money while ultimately knowing that they will at some point exit the hardware business and become a software publisher; one has to wonder.

I think cost (and design cost) is part of the reason for the sucky design. I agree with you and Mize btw; Nintendo should have got AMD to design them a console and given them a larger power budget (maybe 45W); it would have been a single chip on a volume process and it would have crushed the PS360.
Well, the sad part is that I'm not sure beating the PS360 was impossible within their power budget, but indeed 45 watts would have made sure that they beat them and could receive ports of high profile PC games for a couple of years. Once again, against their initial claims, it seems Nintendo has designed the system mostly for themselves.
As for outsourcing the design, I really wonder. I'm not sure how the 3DS compares in GPU power with contemporary SoCs in the embedded space, but it seems that Chinese companies are definitely delivering goodness on a budget too. I just read about the new AllWinner A31 SoC and it is pretty impressive. Archos sells the GamePad for $150 (and they don't subsidize their hardware) and the thing includes two A9s and a Mali-400 MP4.
Actually, it kind of pushes me off topic, as I'm close to thinking that Sony's choices for the PSV weren't that great either.
 
Hey guys, I finally made an account. Been tracking the Console Forums for many, many months and I want to get in on this action about the Wii U. I read somewhere (can't find the source though, but it was really good) that the Wii U can achieve better results than the PS3 and Xbox 360 even if the clock speeds are slower. It is about performance and efficiency that can be done under the clock speed, so that is where GPGPU comes in along with the eDRAM. Am I right? Not a huge tech guru like people here, but I like putting out my opinions is all.
 
It is about performance and efficiency that can be done under the clock speed, so that is where GPGPU comes in along with the eDRAM. Am I right? Not a huge tech guru like people here but I like putting out my opinions is all.
That's a whole great big debate that pages of this thread are talking about. ;) The chances of that are looking very remote at the moment. Nintendo's performance and efficiency technologies have been turned to making a small, low power console rather than a PS360 beater. Had Nintendo used the same architecture but in a more aggressive system with higher power draw and more heat, they could have got better performance than the PS360 at the same wattage as those consoles.
 
If they are doing vertex shading/animation on the GPU instead of using the CPU to help like the PS3/360 do, does that count as GPGPU in the context of the current generation? It's sort of like saying a game is '1080p' when that is the output resolution, when the real question is about the rendering resolution.
 
Nintendo's performance and efficiency technologies have been turned to making a small, low power console (...)

And cheap.
First and foremost, cheap. Cheap to develop, cheap to manufacture and with the ability to get even cheaper with a possible 28nm shrink.

So... I don't think Nintendo's main concern was saving the planet by shaving off the 20W or so of extra power draw that would have made the system twice as fast.
Their first and foremost concern was to make cheap ICs even though they're charging the price of a true next-gen console.

As a former potential customer, that drove me away from the console and I might be getting a PS3 this Christmas instead because it's cheaper, the games catalog is gigantic with quality titles being sold for cheap, and they look as good as the Wii U will ever be able to output. Plus, I'd put more money into the capabilities of a PS3+Vita combo than into a Wii U, which costs almost the same.
But that's just me, and offtopic. So sorry about this paragraph.
It's just a comment on the consequences of Nintendo becoming the corporeal embodiment of Uncle Scrooge.
 
And cheap.
Not that cheap as they are getting the same sort of performance as PS360 from a box that apparently costs a lot more. If they were getting the same performance from a $150 or less console, then the claim could be the hardware is a good performance/price choice.
 
Not that cheap as they are getting the same sort of performance as PS360 from a box that apparently costs a lot more. If they were getting the same performance from a $150 or less console, then the claim could be the hardware is a good performance/price choice.

Aren't you forgetting about the GamePad cost now?
 
Aren't you forgetting about the GamePad cost now?
No, because the gamepad isn't going to cost $100 over and above an XB360 or PS3 controller. A $300 Wii U Basic isn't $150 of hardware and $150 of controller, and if it had a $20 controller like PS360, the Wii U Basic wouldn't be priced at $170. If we generously add $50 for the added cost of the controller features, a $300 Wii U is $250 of CPU, GPU, drive, case, etc., versus $200 for an XB360. Wii U is certainly not a cheap hardware choice for Nintendo with a more cost effective price/performance ratio than PS360 (unless Sony and MS are taking massive hits on loss-making hardware!).

Which is pretty remarkable when you consider how small and cheap the CPU must be. It's surprising that Nintendo can't be making a profit on each unit sold. Unless they are lying when they say a console is profitable with one game, such as factoring R&D costs into each unit or something odd.
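
Spelling out the rough arithmetic behind that comparison (the $50 controller premium is just the generous guess from above, not a known BOM figure):

```python
# Sticker-price breakdown sketch. The $50 GamePad premium is an assumed
# (generous) figure from the post above, not a known BOM number.
wiiu_basic_price = 300
gamepad_premium = 50        # assumed extra cost over a normal controller
xb360_price = 200

wiiu_console_share = wiiu_basic_price - gamepad_premium
print(f"Wii U console-only share: ${wiiu_console_share} vs ${xb360_price} for an XB360")
```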
 
Not that cheap as they are getting the same sort of performance as PS360 from a box that apparently costs a lot more. If they were getting the same performance from a $150 or less console, then the claim could be the hardware is a good performance/price choice.

Do you actually believe they're selling the console at a loss?
Or that they would be losing money if they sold the "Premium" version for $250?

They're not. The BOM on this thing should be really low. One can tell from the tiny ICs made on an old process, the small-ish screen with a really low resolution, the single-touch resistive panel (there are multi-touch capable resistive panels BTW), the tiny heatsink, the tiny fan, the license-less optical drive, the small amount of slow mass storage, the tiny battery in the controller, etc.

Any claim from Nintendo saying they're losing money on each Wii U sale is either complete bull or they're counting in R&D, marketing and distribution costs (which makes it a completely bull statement either way).
There's no way the BOM on that console is over $200.
 
Hey guys, I finally made an account.
Hello and welcome to the forum, Mr. Dead.

It is about performance and efficiency that can be done under the clock speed, so that is where GPGPU comes in along with the eDRAM. Am I right? Not a huge tech guru like people here but I like putting out my opinions is all.
Word of advice... B3D IS a fairly high-level/techy board, and thus there's not a whole lot of room for opinions about a piece of hardware's performance. There's this notion going around in some circles that opinions can't be wrong - well, this is incorrect. A piece of hardware will have a certain performance envelope, and all the opinioning in the world can't change that - so anyone still arguing about it at that point has crossed over into fanboyism, and such users tend to get themselves banned pretty quick. Not saying you're a fanboy of course... :)

As for your question, yes, more efficient hardware can absolutely accomplish more with less clock speed, but the jury is still out on whether the wuu actually is all that much more efficient or not; so far it has not shown much proof of that, and rather seems weaker than the PS360, which have already been out on the market for 6/7 years by now as we know.

As for GPGPU, it is not a one-stop solution to all problems. Traditionally, even highly programmable GPUs of the DirectX 10 and 11 era have been strongly optimized towards graphics rendering tasks and simply can't be efficiently adapted to any and all computing tasks a game might call for. In order to be more flexible and efficient at a wider variety of computational tasks, GPUs must be equipped with quite a bit of additional hardware, which adds extra weight in transistors and a corresponding dollar cost in manufacturing. Even so, there are still many tasks that even the best GPGPUs are just plain unsuited for. With the hardware we suspect Nintendo has chosen for the wuu (AMD Radeon HD 4000-class), there was not a whole lot of effort spent in designing the chip for GPGPU, and more recent Radeons would have been a better starting point if Nintendo intended to modify the chip for such tasks. It's therefore unlikely that GPGPU will ever play any significant role for the wuu.
 
the Wii U can achieve better results than the PS3 and Xbox 360 even if the clock speeds are slower. It is about performance and efficiency that can be done under the clock speed, so that is where GPGPU comes in along with the eDRAM. Am I right?

No. It has yet to be shown that the Wuu can produce better results in current gen games that use any amount of processor for game logic or that it can do any decent GPGPU at all.
 