Wii U hardware discussion and investigation

Status
Not open for further replies.
Well, many games are not CPU but GPU bound; those could still look better. And as said before, we don't really know, but it's certainly not unlikely that the GPU in the Wii U is better suited to GPGPU work, which could take over some of the stuff currently done on the CPU in the PS3/360. Look at the Far Cry 3 discussion on the PS3, for instance, where SPURS jobs are triggered from the GPU and then run on the SPEs. The Wii U GPU could perhaps keep doing all of that entirely on the GPU side, leaving the CPU with a lot less work.
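The idea of keeping such jobs on the GPU can be sketched with a toy example. This is a made-up distance cull, not anything from Far Cry 3 or any real engine: the point is just that each element is independent, which is exactly what lets a job map onto GPU threads instead of CPU/SPE jobs.

```python
# Toy sketch of a data-parallel job that could run as CPU/SPE work or
# as a GPU kernel. All names and numbers here are illustrative.

def cull_cpu(distances, max_dist):
    """Serial CPU-style loop: one object after another."""
    visible = []
    for i, d in enumerate(distances):
        if d <= max_dist:
            visible.append(i)
    return visible

def cull_kernel(i, distances, max_dist):
    """What a single GPU 'thread' would do: test exactly one object."""
    return distances[i] <= max_dist

def cull_gpu_style(distances, max_dist):
    """Launch one 'thread' per object; a real GPU runs these in parallel."""
    return [i for i in range(len(distances))
            if cull_kernel(i, distances, max_dist)]

objects = [10.0, 250.0, 55.0, 999.0, 120.0]
assert cull_cpu(objects, 200.0) == cull_gpu_style(objects, 200.0)
```

Because no iteration depends on any other, the whole job can stay on the GPU, which is the scenario described above.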

Launch games are relatively meaningless, but they do show that currently there are bottlenecks that developers are struggling with.

I feel like the usual "CPU bound or GPU bound" dichotomy is too simplistic. A game will switch through a bunch of different kinds of workloads. A weak CPU can have an impact on minimum frame rates even when the average is barely affected.

It would seem that workloads that were already being targeted for deployment on Cell SPEs or CPU threads could be easier to push to GPGPU, although not necessarily trivial, especially with pretty old GPU hardware that isn't really as flexible as it is today. The question is, do we know for sure that (at least some) games aren't already doing this?
 
I think the elephant in the room is that the WiiU is a piss-weak machine unless you compare it only with the Wii and PS360. The discussion seems to have been directed along this line (perhaps subconsciously) as a kind of diversion from the fact that for a mains-powered console - ostensibly trying to attract core gamers - it quite amazingly struggles to stand nose to nose with a core gamer system launched seven years earlier.

It's not like the 360 can't do GPGPU either (sebbbi has already talked about doing it in Trials Evolution), or that the 360 is a DX9 machine and WiiU is DX10.1 so it automagically wins everything, or anything like that.

And I don't think Nintendo are banking on GPGPU anyway. Nintendo are banking on being able to make a profit from a platform that is, in terms of processing, cheap (and therefore weak), and that's it. And good luck to them - it worked with the Wii. And yes, I still want one, and yes, I'm still excited about Wuu Zelda, but dammit, the WiiU is the way it is because Nintendo don't value powerful hardware, not because of GPGPU / unused edram / low latency / lazy devs not optimising for OoOE etc etc.
 
And it should be a more advanced OoO design. Broadway just makes the cut to earn this designation.

It (Broadway) doesn't seem like a more advanced OoO design than Bobcat to me. It looks like there's nothing more than a single reservation station slot in front of each execution unit, or two in the case of the load/store unit. This is combined with a six-entry (in-order) completion queue that balances the six-entry instruction queue and the six (total, across five execution units) reservation station slots. That's also balanced by some minimal register renaming: there are a total of 6 GPR + 6 FPR + CR, LR, and CTR rename registers.

So I'd actually call this pretty minimally OoO and far behind Bobcat's reordering capabilities.

In terms of peak execution width it's somewhat similar - both are bottle-necked by dual-dispatch per cycle, although Broadway can fold a branch (not sure if Bobcat can). On the other hand, it only has a single load/store unit (Bobcat can co-issue a load and store) which can be a pretty major deficiency.
 
And I don't think Nintendo are banking on GPGPU anyway. Nintendo are banking on being able to make a profit from a platform that is, in terms of processing, cheap (and therefore weak), and that's it. And good luck to them - it worked with the Wii. And yes, I still want one, and yes, I'm still excited about Wuu Zelda, but dammit, the WiiU is the way it is because Nintendo don't value powerful hardware, not because of GPGPU / unused edram / low latency / lazy devs not optimising for OoOE etc etc.

Of course Nintendo values power.
But who is going to buy it?

What makes more sense, selling 75 million units of one console every 10 to 12 years
or 150 million units from two consoles each every 10 to 12 years?

When it comes down to it, Nintendo is only a game company.
They have to take into consideration more than just how powerful to make their machines.
Their machines need to be profitable, they need to be affordable, and they need to attract new customers to continue to stay in business.

Their first party titles have all shown a marked increase in visuals with each console generation. As long as they can continue doing so, you can't criticize them for taking smaller iterations with their hardware.

I used to make fun of the small black-and-white Game Boy when I had the big, powerful, full-color Lynx. I could never figure out why the Game Boy was the better-selling machine. But guess who is still in business - not Atari. Now we've got Sony on the ropes.
 
That makes more sense now :D But I don't regret my post... too much. Maybe someone wanted to know >_>

I did!

I tried reading an old M68000 assembly programming book earlier this year (dunno why) and it pretty much kicked my ass (I have no assembly programming experience, I'm a hobbyist C++/C# guy), but maybe I learned a couple of things too. Seeing Bobcat compared to another small, low-power processor is not only fun - I think I actually understood some of it too! Hurrah!
 
Of course Nintendo values power.
But who is going to buy it?

What makes more sense, selling 75 million units of one console every 10 to 12 years
or 150 million units from two consoles each every 10 to 12 years?

When it comes down to it, Nintendo is only a game company.
They have to take into consideration more than just how powerful to make their machines.
Their machines need to be profitable, they need to be affordable, and they need to attract new customers to continue to stay in business.

Their first party titles have all shown a marked increase in visuals with each console generation. As long as they can continue doing so, you can't criticize them for taking smaller iterations with their hardware.

I used to make fun of the small black-and-white Game Boy when I had the big, powerful, full-color Lynx. I could never figure out why the Game Boy was the better-selling machine. But guess who is still in business - not Atari. Now we've got Sony on the ropes.

I don't have a problem with Nintendo releasing weak and outdated hardware to try and make money. I just think it's odd that so many seem to deny this is what Nintendo are doing, or seek to justify it beyond Nintendo's profit motives.

I think it's also worth saying that it's not a binary "WiiU" or "PS4720" situation. Nintendo could have included another CPU core, or better CPU cores (break the bank on a behemoth like Bobcat), or used DDR3-1866 memory, or added additional shaders (the GPU is not a big chip by console standards, even with the embedded memory), or gone batshit-insane-mega-power crazy and used a 6 cm fan instead of a 5 cm fan and pushed the CPU clock all the way up to a dizzying 1.3 GHz. Nintendo didn't because they - as a business - do not value powerful hardware. They don't think it brings in the returns necessary to justify itself. This isn't a recent thing either - look at the SNES (incredibly weak CPU), the N64 (hardware Sega turned down two years earlier) or even the Gamecube (well below the Xbox). Even before they made the Wii / WiiU, Nintendo knew hardware had to pay for itself and that R&D costs shouldn't commit you to a platform for a decade to make your money back.

Nintendo are right to do what they think they should, but it doesn't mean that the hardware isn't .... what it is.
 
The question would be whether those 80 million Wii owners are willing to pay $300 for outdated hardware again, based on their satisfaction with the Wii. Most early adopters are most likely die-hard Nintendo fans, and I doubt those make up even 1/3 of the Wii user base. And if they want to sell to people outside of the Nintendo fanbase, or people who never cared for Nintendo franchises, what can they attract them with? It won't be getting next-gen console games or looking even close, there's no Blu-ray playback, and it's 300 bucks with a tiny library that's 90% sloppy ports from the current gen. Even assuming porting gets better and the Wii U starts getting superior versions of multiplatform games, people would rather get (or already own) a 360 or PS3 that plays those same games, along with its 6-7 years of back catalogue.
 
Okay, can we contain Wii U talk in this thread to just the technology in it, and not Nintendo's business plan etc. There are plenty of other threads for that! ;)
 
GPU bound games? for example?

I would imagine that quite a lot of games are fill rate bound by this point in the life cycle.

That's why native resolutions have been dropping slowly, COD being a prime example of this.

I don't have a problem with Nintendo releasing weak and outdated hardware to try and make money. I just think it's odd that so many seem to deny this is what Nintendo are doing, or seek to justify it beyond Nintendo's profit motives.

Oh, they are doing that - is it because they're worried about failing and ending up like Sega, or because they care more about the money?

Either way, with the sheer profit they made from the Wii and are still currently making on the Wii, they could have completely gone all out on the Wii U; after all, they can afford to!
 
I used to make fun of the small black-and-white Game Boy when I had the big, powerful, full-color Lynx. I could never figure out why the Game Boy was the better-selling machine. But guess who is still in business - not Atari. Now we've got Sony on the ropes.

I saw a first-gen Lynx a few days ago in a store that sells and buys used stuff, with a collection of about 12 or 15 games in a bag made for them. My reaction was first jaw-dropping: it was the first time I'd actually seen one. Then I saw it was really huge: bigger than a PSP or a Game Gear, maybe wider than the Wuu pad. Then I remembered it probably had a low battery life, like the Game Gear - the standard back then was disposable batteries.

The Game Boy had a good battery life on regular batteries (over 10 hours, and four batteries rather than six); rechargeable ones were probably quite expensive 20 years ago and were some crappy NiCad, plus with disposable batteries you can just play after not using it for one week or six months.
It had a great form factor too, and when it came to the games, they were fun (funnily, the smooth scrolling relies on the LCD blur). Tetris, Gargoyle's Quest, whatever (I liked the Batman game), then a few years later what was probably the greatest Zelda game.
Then it even played the Pokémon crap for the kids, but that was not my cup of tea.
 
I think the elephant in the room is that the WiiU is a piss-weak machine unless you compare it only with the Wii and PS360. The discussion seems to have been directed along this line (perhaps subconsciously) as a kind of diversion from the fact that for a mains-powered console - ostensibly trying to attract core gamers - it quite amazingly struggles to stand nose to nose with a core gamer system launched seven years earlier.

I don't have a problem with Nintendo releasing weak and outdated hardware to try and make money. I just think it's odd that so many seem to deny this is what Nintendo are doing, or seek to justify it beyond Nintendo's profit motives.

I think it's also worth saying that it's not a binary "WiiU" or "PS4720" situation. Nintendo could have included another CPU core, or better CPU cores (break the bank on a behemoth like Bobcat), or used DDR3-1866 memory, or added additional shaders (the GPU is not a big chip by console standards, even with the embedded memory), or gone batshit-insane-mega-power crazy and used a 6 cm fan instead of a 5 cm fan and pushed the CPU clock all the way up to a dizzying 1.3 GHz. Nintendo didn't because they - as a business - do not value powerful hardware. They don't think it brings in the returns necessary to justify itself. This isn't a recent thing either - look at the SNES (incredibly weak CPU), the N64 (hardware Sega turned down two years earlier) or even the Gamecube (well below the Xbox). Even before they made the Wii / WiiU, Nintendo knew hardware had to pay for itself and that R&D costs shouldn't commit you to a platform for a decade to make your money back.

Nintendo are right to do what they think they should, but it doesn't mean that the hardware isn't .... what it is.
For me the sad part is that I don't think cost explains what the system lacks.
If you look at the overall picture, they have used 189 mm^2 of silicon (33 for the CPU and 156 for the GPU). That's not too shabby, not to mention that the CPU seems to use eDRAM and the GPU includes eDRAM, so it is not the cheapest process available.

What I think is disheartening is that the enhanced Broadway core might be in the same ballpark as IBM's PPC 470s, which are more modern CPU designs. Had they chosen those CPUs, they might have been able to produce them on a possibly cheaper process and wafers.
Those CPUs are, according to IBM's own numbers, ~4 mm^2 on their 45 nm process; it could be less on a TSMC process. They might have had to give up some cache, but I'm not sure about the implications for performance, as those processors are "better" (wider, more advanced out-of-order execution, etc.).
Those CPUs also use an interconnect fabric that makes connecting PCI-type devices possible (from the IBM paper).
To me it sounds like those CPU cores, along with the matching interconnect, could have allowed Nintendo to design a SoC without having to spend humongous amounts on R&D. I could be wrong, though; people like you or Exophase would have a better opinion on the matter.

On the other side you have the GPU, significantly bigger than Redwood (104 mm^2) or Turks (118 mm^2). I think it is not pushing too far to assume that Nintendo could have put both a quad core based on PPC 470s and a Redwood/Turks within a chip no bigger than the GPU they ended up using. It would have been produced using TSMC's 40 nm process.

I think it is not pushing too far either to assume that going with a 128-bit bus to even DDR3-1600 might have been cheaper than using two chips on more expensive processes, having a bigger silicon budget, and using an MCM.
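The die-area argument above can be sanity-checked with the figures quoted in this thread (the ~4 mm^2 PPC 470 number is IBM's 45 nm estimate; caches and glue logic are excluded, so treat this as a rough lower bound, not a design):

```python
# Rough die-area sanity check using the mm^2 figures quoted in this thread.
wiiu_cpu = 33
wiiu_gpu = 156          # includes eDRAM
total = wiiu_cpu + wiiu_gpu
assert total == 189     # matches the "189 mm^2" figure above

ppc470_core = 4         # ~4 mm^2 per core at 45 nm, per IBM
quad_ppc470 = 4 * ppc470_core
redwood, turks = 104, 118

# A quad PPC 470 plus a Redwood- or Turks-class GPU would still come in
# under the area of the Wii U GPU die alone.
print(quad_ppc470 + redwood)   # 120
print(quad_ppc470 + turks)     # 134
```

Even the larger Turks-based combination lands well under the 156 mm^2 the Wii U GPU die occupies on its own, which is the point being made.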

The whole thing is that I've no issue with Nintendo's business model and their will to not engage in the fight on specs and to largely subsidize their systems. I've an issue with them closing off parts of the market to themselves based on what seem to be bad design choices.

I think that Nintendo could have been king of the hill for one year, and that would not have hurt them in any way - quite the contrary. It might also have resulted in more high-profile studios jumping on board. EA, for example, seems to be moving more franchises to the FB2 engine going forward, and it seems that it won't be ported to the WiiU.

Really, as I see it, cost is not the issue; the issue is sucky design. Like I said months ago, and Mize concurred lately, they should no longer design hardware themselves and should move to, for example, Google's practices on the matter.

I believe that the WiiU (a superior one on top of it) could have launched at $299.
I think they should have passed on two SKUs and had a better basic SKU.
Looking at some early game troubles, I think they should have included a bit more flash reserved for caching (8 more GB).
 
I don't think that would fit with their latency objectives.
Higher clocked, more expensive RAM actually tends to give better latencies compared to standard bulk RAM. There's several articles over at Anandtech for example that shows this.
 
Or at the least the latency stays the same. CAS latency is expressed in clock cycles, so if you have DDR3-1333 CL9, DDR3-1600 CL10 or CL11, and DDR3-1866 CL11 (and DDR3-1066 CL7), the latency is roughly the same or a bit better, with marginal benefits when the memory controller runs faster.

Incidentally, DDR2-800 CL5 or DDR-400 CL2.5 is still in around the same ballpark in latency, so processors need great caches and design to improve on their predecessors, or must just rely on multithreading.
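The "roughly the same latency in nanoseconds" claim is easy to check: CAS latency in ns is the CL cycle count divided by the I/O clock, which is half the DDR transfer rate. A quick sketch using the speed grades mentioned above:

```python
# CAS latency in nanoseconds = CL cycles / memory clock.
# The memory clock is half the DDR transfer rate (e.g. DDR3-1600 -> 800 MHz).
def cas_ns(ddr_rate_mt_s, cl):
    clock_mhz = ddr_rate_mt_s / 2
    return cl / clock_mhz * 1000   # convert to ns

for rate, cl in [(400, 2.5), (800, 5), (1066, 7),
                 (1333, 9), (1600, 10), (1866, 11)]:
    print(f"DDR-{rate} CL{cl}: {cas_ns(rate, cl):.1f} ns")
```

Every pairing lands in roughly the 11.8-13.5 ns window, from DDR-400 CL2.5 all the way to DDR3-1866 CL11, which is exactly the point: the cycle counts grow, the absolute latency barely moves.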
 
For me the sad part is that I don't think cost explains what the system lacks.
If you look at the overall picture, they have used 189 mm^2 of silicon (33 for the CPU and 156 for the GPU). That's not too shabby, not to mention that the CPU seems to use eDRAM and the GPU includes eDRAM, so it is not the cheapest process available.

Perhaps you're right, but I can't help thinking that some of the die area on the edram is used primarily to reduce costs related to the main memory bus and to allow for Wii BC (in the absence of a fast CPU and GPU). The Xbox 360's software emulation of the Xbox 1 has perhaps made emulation-powered BC look easy, but that was a massive and ambitious effort by someone who appears to be a genius OS guy, and it was by no means the complete solution that Nintendo have typically gone for in their portable consoles and in the Wii and WiiU.

Regarding pure performance, Trinity desktop processors on a 128-bit DDR3 bus seem to offer far more performance than the WiiU and Xbox 360 (massively, massively more on the CPU front), so I do think it has to be a cost thing and BC thing.

What I think is disheartening is that the enhanced Broadway core might be in the same ballpark as IBM's PPC 470s, which are more modern CPU designs. Had they chosen those CPUs, they might have been able to produce them on a possibly cheaper process and wafers.
Those CPUs are, according to IBM's own numbers, ~4 mm^2 on their 45 nm process; it could be less on a TSMC process. They might have had to give up some cache, but I'm not sure about the implications for performance, as those processors are "better" (wider, more advanced out-of-order execution, etc.).
Those CPUs also use an interconnect fabric that makes connecting PCI-type devices possible (from the IBM paper).
To me it sounds like those CPU cores, along with the matching interconnect, could have allowed Nintendo to design a SoC without having to spend humongous amounts on R&D. I could be wrong, though; people like you or Exophase would have a better opinion on the matter.

It's very kind of you to think my opinion is worth listening to, but you shouldn't put me on the same level as Exophase! You probably know more about processors and low-level performance issues than I do - I know you follow this closely on B3D. I'm really just a console warrior who came to B3D a long time ago and over the years has gradually given it up (no doubt influenced by the atmosphere here) and actually started trying to learn stuff.

I think, having followed Nintendo for 20+ years, that you could be correct and that there may have been other options that would have given Nintendo more performance for a similar cost per unit manufactured. But Nintendo value familiarity (who doesn't?), and they also understand the value of the right level of backwards compatibility for certain customers. Being a conservative company, I think they plan BC in at an early stage (unlike MS) and plan to do it cheaply (unlike Sony, who just include an almost complete version of the old system).

On the other side you have the GPU, significantly bigger than Redwood (104 mm^2) or Turks (118 mm^2). I think it is not pushing too far to assume that Nintendo could have put both a quad core based on PPC 470s and a Redwood/Turks within a chip no bigger than the GPU they ended up using. It would have been produced using TSMC's 40 nm process.

I don't know what the R&D costs of doing that would be, but I get the feeling that Nintendo are also very conservative with R&D as part of their approach to minimising risks. You saw it with the N64 (originally offered to, and tweaked by, someone else) and the Wii (an overclocked GC, almost). I don't think Nintendo would spend hundreds of millions of dollars on a custom architecture like MS or Sony would.

Do you know what that 3rd tiny die on the WiiU package is? I don't. What the hell is that? I think it's likely that MS or Sony would have spent the cash to integrate that component - whatever it is - into another chip from day one.

I think it is not pushing too far either to assume that going with a 128-bit bus to even DDR3-1600 might have been cheaper than using two chips on more expensive processes, having a bigger silicon budget, and using an MCM.

I don't know for sure, but on an older process like 40 nm, and taken over a period of 6-8 years (when DDR3 will be expensive), I think getting a slightly larger GPU to minimise the number of memory chips and board complexity will probably pay for itself.

The whole thing is that I've no issue with Nintendo's business model and their will to not engage in the fight on specs and to largely subsidize their systems. I've an issue with them closing off parts of the market to themselves based on what seem to be bad design choices.

Yeah, I don't have a problem with Nintendo wanting to be competitive either, but I think with the WiiU they may have missed an opportunity by being a little too conservative on the hardware. A faster CPU and a relatively small bump in everything else would have seen them laughing off the PS360 and giving the impression (even if it wasn't true) that they could perhaps handle PS4720 ports.

Of course, I thought the Wii was going to sell less than half what it did and come third so I'm certainly no Oracle ... :D

I think that Nintendo could have been king of the hill for one year, and that would not have hurt them in any way - quite the contrary. It might also have resulted in more high-profile studios jumping on board. EA, for example, seems to be moving more franchises to the FB2 engine going forward, and it seems that it won't be ported to the WiiU.

I agree, it wouldn't have taken much to outperform the PS360 and if it had gotten more users, more developers and more engines on board then it couldn't have hurt. Nintendo seem to think (right or wrong) that money committed to building hardware is dead money and so they seem reluctant to do it.

Really, as I see it, cost is not the issue; the issue is sucky design. Like I said months ago, and Mize concurred lately, they should no longer design hardware themselves and should move to, for example, Google's practices on the matter.

I think cost (and design cost) is part of the reason for sucky design. I agree with you and Mize btw; Nintendo should have got AMD to design them a console and given them a larger power budget (maybe 45W) and it would have been a single chip on a volume process and it would have crushed the PS360.

I believe that the WiiU (a superior one on top of it) could have launched at $299.
I think they should have passed on two SKUs and had a better basic SKU.
Looking at some early game troubles, I think they should have included a bit more flash reserved for caching (8 more GB).

Agree!
 
Higher clocked, more expensive RAM actually tends to give better latencies compared to standard bulk RAM. There's several articles over at Anandtech for example that shows this.

Well I read an article that stated the following:

The DDR4 memory interface will double the clock speed of earlier DDR3 devices, but some fundamental DRAM timing parameters will remain at the same number of nanoseconds, effectively doubling the number of memory clock cycles required for those timing parameters to elapse.

http://www.neogaf.com/forum/showpost.php?p=44811066&postcount=412
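Put in numbers, the quoted claim just says that if a timing parameter is fixed in nanoseconds, the cycle count must scale with the clock. The figures below are illustrative, not from any DDR4 datasheet:

```python
# If a timing parameter is fixed in nanoseconds, the number of memory
# clock cycles it takes scales linearly with the clock. Illustrative only.
def cycles_needed(latency_ns, ddr_rate_mt_s):
    clock_mhz = ddr_rate_mt_s / 2      # I/O clock is half the DDR rate
    return latency_ns * clock_mhz / 1000

fixed_ns = 13.75                       # e.g. DDR3-1600 CL11 -> 13.75 ns
print(cycles_needed(fixed_ns, 1600))   # 11.0 cycles on DDR3-1600
print(cycles_needed(fixed_ns, 3200))   # 22.0 cycles at double the rate
```

Double the transfer rate at the same absolute timing and the cycle count doubles, which is exactly what the quoted article describes for DDR4.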
 
Talk about shooting yourself in the foot if the clock speed is true. For the love of god, Nintendo, boost the speed up some and add VMX units or something. Geez, no wonder the 4A guy was so disappointed. While I'm sure plenty of multiplatform titles never pushed the Cell or Xenon to the max, having a highly refined memory system (supposedly) can only do so much. I'm a bit shocked the multiplatform titles on the Wii U are even capable of running at all. Any indications of huge CPU slowdown on any games like AC3 or Darksiders 2?
 