Wii U hardware discussion and investigation

Nintendo's typical art style is cartoonish, which doesn't place very high requirements on the hardware. Zelda: Wind Waker HD, which has been claimed to be native 1080p, doesn't come anywhere close to 60fps, nor does it use a whole lot in the way of textures.

The Wuu's main CPU is also very, very weak compared to even the last generation of consoles (roughly a tenth the flops of Cell, or on that order), the GPU is something akin to what you'll find in a budget laptop, and the system as a whole has very low main memory bandwidth (half that of the PS360's main RAM). So it's not a powerful design by any stretch, even when the eDRAM is taken into account.
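
To put a rough number on that flops gap, here's a back-of-envelope peak-FLOPS estimate. None of these inputs are official: the clocks are the commonly reported ones, Espresso is assumed to do paired-singles FMA (4 flops/cycle/core), and the Cell figure counts only the six SPEs available to games, ignoring the PPE:

```python
# Back-of-envelope theoretical peak single-precision FLOPS.
# All inputs are commonly reported estimates, not official specs.

def peak_gflops(cores: int, ghz: float, flops_per_cycle: int) -> float:
    return cores * ghz * flops_per_cycle

# Espresso: 3 cores @ ~1.24 GHz, paired-singles FMA = 2 lanes * 2 ops = 4 flops/cycle
espresso = peak_gflops(3, 1.24, 4)       # ~14.9 GFLOPS

# Cell (PS3): 6 game-usable SPEs @ 3.2 GHz, 4-wide FMA = 8 flops/cycle
cell_spes = peak_gflops(6, 3.2, 8)       # ~153.6 GFLOPS

print(f"Espresso ~{espresso:.1f} GFLOPS, Cell SPEs ~{cell_spes:.1f} GFLOPS")
print(f"ratio ~{cell_spes / espresso:.1f}x")   # ~10x, in line with the claim above
```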

I have heard statements like that for a long time, and while I agree that the Wii U is no powerhouse by any stretch of the imagination, it's obvious that a lot of the criticism isn't founded in reality. How does a console that is supposedly completely gimped by memory bandwidth, with a terrible CPU, run Need for Speed MW better than either the PS3 or 360? Even Mass Effect 3 outperformed the PS3 version. It's hard to believe a "port" not done by the original team could perform so well if the hardware were severely bottlenecked by memory bandwidth and a weak CPU. Memory bandwidth is obviously not a problem, even if people want to look at the 12.8 GB/s from the main memory pool and chalk it up as gimped. Not a single developer has ever once complained about the memory performance.
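
For what it's worth, that 12.8 GB/s figure falls straight out of the reported memory configuration, a 64-bit interface to DDR3-1600 (reported in this thread, not officially confirmed):

```python
# DDR3 bandwidth = bus width in bytes * transfer rate
bus_bytes = 64 / 8        # 64-bit interface
transfers = 1600e6        # DDR3-1600 runs at 1600 MT/s
print(bus_bytes * transfers / 1e9, "GB/s")   # 12.8 GB/s
```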

The CPU is obviously not very strong in floating-point performance; the 360 and PS3 were definitely stronger there. Developers obviously took advantage of what the Cell and Xenon were good at, and used them to assist the GPU with some graphics rendering. Asking the Espresso to handle that same workload would probably not work out too well. I think developers who leaned very heavily on the CPUs of the 360 and PS3 to render graphics will not find much advantage in the Wii U hardware. I think it's safe to say that regardless of SPU counts, developers have found the Wii U's GPU to be more powerful, and certainly more modern, than the ones powering the current-generation consoles. Frozenbyte, for example, runs Trine 2 almost exclusively on the GPU and does very little on the CPU, so it was pretty easy for them to identify the Wii U as the stronger console for their game.

A member here once mentioned that they could never truly use all three cores on the Xenon because of the limited L2 cache; the Espresso, on the other hand, has tons of L2 cache, so I do think developers will find multithreading on the Wii U to be far more beneficial, and necessary to get the most out of the system. Developers have been optimizing for the Cell and Xenon for so many years now that many of the inefficiencies of those CPUs have been masked by code structured to avoid those potential stalls. I would love to hear from a developer on the differences and weaknesses between the Wii U and the current-gen consoles. I say current gen since it's pretty obvious that it's closer to those consoles in terms of outright power than it is to the new PS4 and X1.
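
To put numbers on the cache point: Xenon's 1 MB of L2 is shared by three cores running six hardware threads, while Espresso is widely reported (from die-shot analysis, not official specs) to carry 3 MB split 512 KB / 2 MB / 512 KB across its three cores. Treating those reported figures as assumptions:

```python
# Rough L2-per-hardware-thread comparison (reported figures, not official)
xenon_l2_kb = 1024                  # 1 MB, shared between all cores
xenon_threads = 3 * 2               # 3 cores, 2-way SMT
espresso_l2_kb = 512 + 2048 + 512   # per-core caches as reported from die shots
espresso_cores = 3                  # one thread per core, no SMT

print(f"Xenon:    ~{xenon_l2_kb / xenon_threads:.0f} KB of L2 per thread")   # ~171 KB
print(f"Espresso: ~{espresso_l2_kb / espresso_cores:.0f} KB of L2 per core") # ~1024 KB
```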
 
It's worth noting that Trine 2 is yet another side-scroller with very limited overdraw and complete control of everything that goes on in every scene at all times. It's easy to allocate enough rendering power for all the effects you want to draw when the player can't freely turn the camera.

First-person shooters are much harder to make run smoothly, due to deeper levels and free camera control. For example, Metroid Prime masterfully handled some very complex levels for its time, with ragdolled enemies, lots of particles, transparencies and special effects at 60fps, through careful level design: blocking areas off into separate rooms limited overdraw.

I'd kill for a true HD re-release of the MP trilogy on the wuu. ...Well, not literally, obviously. But you know what I mean. The game could use the pad for scanning and/or the map.
 
Is anyone disputing that a 2D game is inherently easier to render? That's not to say a developer can't ramp up the effects and layers to the point where it's very taxing, but given the same level of graphical fidelity, a 2D game is far less taxing on the hardware.

I agree about the Metroid Prime games; I loved them and have the Trilogy on Wii. They weren't only some of the best-looking games on GC, but also held a solid 60fps with limited dips. Further proof that the final product on any given console has more to do with the talent and budget of the developer; hardware limitations are only a piece of the pie. Remember early on with the Wii, there were lots of PS2 ports that actually looked and ran like crap. No one in their right mind would claim that the PS2 was more powerful than the Wii, but the Wii was getting lots of quick and dirty ports. It's hard to blame publishers for not devoting a ton of resources to Wii U ports though. Even their most popular titles have failed to sell decent numbers. Black Ops 2, Mass Effect 3, and Need For Speed MW were all solid ports that did very low numbers on Wii U. I am sure publishers were skeptical of the Nintendo audience to begin with, and abysmal sales of games that sell very well on Xbox and PlayStation probably caused publishers to scale back the resources devoted to Wii U ports.
 
I have heard statements like that for a long time, and while I agree that the Wii U is no powerhouse by any stretch of the imagination, it's obvious that a lot of the criticism isn't founded in reality. How does a console that is supposedly completely gimped by memory bandwidth, with a terrible CPU, run Need for Speed MW better than either the PS3 or 360?

They had to cut the number of cars in multiplayer by something like 25%. That's most likely a CPU thing.

I don't think that BW is a particular issue for the Wii U, except for in high BW frame buffer ops where it consistently loses to the 360. The real issue is that the hardware is as weak as fuck. And that's a technical term, used to describe when a platform from several years and nodes and generations of architecture later delivers less overall performance than the bloody ancient platforms it is competing directly (emphasis: directly) against but for a much higher price.
 
They had to cut the number of cars in multiplayer by something like 25%. That's most likely a CPU thing.

I don't think that BW is a particular issue for the Wii U, except for in high BW frame buffer ops where it consistently loses to the 360. The real issue is that the hardware is as weak as fuck. And that's a technical term, used to describe when a platform from several years and nodes and generations of architecture later delivers less overall performance than the bloody ancient platforms it is competing directly (emphasis: directly) against but for a much higher price.
...at a tenth of the powerdraw, with correspondingly small size and low noise, offering dual-screen gaming, while remaining backwards compatible with the best-selling platform of last generation.

Not that this has done Nintendo a whole lot of good in the marketplace.
 
All of which are points I've made many times.

What it does in terms of process node, power draw etc. is all very competitive with the much older PS360. But it continues to offer sub-Xbox 360 performance in multiplatform games, only at a higher price. And that's not good enough.

Pre-launch, Nintendo themselves went on TV in America and made a huge deal about how not only would their multiplatform games be the best versions, they would look like different games. And the early Project Cafe slides show "unparalleled next gen performance at current gen price" and "easy portability from ... Xbox 360". Well, it can't be that easy, because almost everything runs worse.

http://www.sidequesting.com/2011/04/the-next-nintendo-what-we-think-we-know/

Oops?

Dual screen gaming has so far been a dud - even in Nintendo's hands (which is very disappointing), although that is outside the bounds of this thread.

When you judge the Wii U against the market it has to exist in - the market Nintendo chose to compete in especially once they abandoned waggle - there's no way that the machine's performance can be seen as anything other than awful.

Edittus: And to say it again, the problem isn't IBM's, AMD's or Nintendo's engineering skills, or the CPU arch or the GPU arch, all of which have no doubt done what was expected of them. It's whoever decided to launch against their competition with this product at this level of performance.
 
I'm not convinced that building the wuu properly (i.e. DX11.x, proper SIMD for the CPU, 2GB+ of 128-bit DDR3 @ 1600-2100 main memory) would have been significantly more expensive. None of this is exotic tech by ANY stretch. All of it is a year or years old at this point, and was so even when the wuu launched.
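
For scale, here's what the suggested main-memory config would have bought, using the same bandwidth formula as earlier (illustrative figures):

```python
def ddr3_gbps(bus_bits: int, mts: int) -> float:
    """Peak DDR3 bandwidth in GB/s from bus width and transfer rate."""
    return bus_bits / 8 * mts / 1e3

print(ddr3_gbps(64, 1600))    # 12.8 GB/s - what the wuu actually shipped with
print(ddr3_gbps(128, 1600))   # 25.6 GB/s - 128-bit DDR3-1600
print(ddr3_gbps(128, 2133))   # ~34.1 GB/s - 128-bit DDR3-2133
```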

It would have been a system capable of running current-gen console games scaled down in resolution and fidelity. The current wuu just can't do that without sacrificing so much that it would require a complete rebuild of the game from scratch. No DX11 means you can't use the same renderer, the same art assets and so on. The weaksauce CPU can't run the same code straight up either. Having to make the same game twice - double the workload, with far less impressive results - all for a platform that just isn't selling... Who's willing to go through that?

Nintendo can make its own games for its own system, so they don't suffer, but third parties aren't going to be very interested. And they're not; that much is obvious.
 
Not every product that plays games needs to be cutting edge, fellas. Let's face it, there are tablets that are far more advanced than the latest Kindle Fire, but that doesn't make the Kindle Fire a bad product, and certainly not an unpopular one. I think it's safe to say that if the Wii U didn't have the touch-screen controller it could be selling for $199 right now. Just because you don't personally value the GamePad doesn't change the fact that it's a significant factor in the price of the Wii U. Nintendo did a terrible job of identifying their target audience with the Wii U. The Wii consumer is not a long-term customer; they are mostly gone, and even consumers who are fans of titles like Mario and Zelda may be holding off for a cheaper price on the hardware, and a bigger library of Nintendo's first-party games, before they make the purchase.

I think it's safe to say budget ports are hardly an indication of a console's peak abilities. The PS3 suffered from shoddy ports for years, and that didn't hinder the system in the slightest when Uncharted, The Last of Us, or Killzone 3 showcased some of the most advanced graphics of the generation. Look at Bayonetta 2 on Wii U compared to Bayonetta on 360 and PS3: it's a significant improvement over the original. Is the Wii U as powerful as a PS4 or X1? Obviously not, but that won't stop it from having some of the best-looking games this generation. Rayman Legends may not be a "technical" masterpiece, but it's certainly one of the best-looking games I have ever played. Mario 3D World is perhaps the cleanest 3D platforming game I have ever laid my eyes on.

Even getting back to the hardware itself, it always takes a dedicated approach to get the most out of any console. The comment about Need for Speed MW only having 6 players online because of the CPU is laughable. Black Ops 2 (a launch title) had 18 players in Ground War and the framerate was still held above 30fps most of the time. My guess is that Criterion knew the Wii U build would likely have a pretty low number of gamers playing online, and having a low number in each match gives players more potential lobbies to join. Criterion didn't even really start focusing on Need For Speed MW for Wii U until after Christmas last year. They spent about two months working full time on the project and it turned out pretty darn good. Every time a port comes to Wii U with performance issues, the haters come out and blame the hardware, when it's really the software that is just poorly optimized for the hardware at hand. It's fair to criticize the Wii U, but let's at least look at things objectively. Does it make any sense for Need for Speed MW to be limited to six players because of the CPU when Black Ops 2 had 18 players running at a higher framerate on the same hardware?
 
Meanwhile, I don't think it's safe to say any of that. Those points there are a matter of opinion.

As for artistic impression, that is entirely subjective and has no place whatsoever in this TECHNICAL thread.

Really, you're going to go with the Lazy Developers defense?

The fact that Rayman Legends targeted the WiiU as the lead launch platform, got delayed very early in the WiiU's lifespan until it could launch on other platforms, and that UbiSoft's follow-up isn't targeting the WiiU, says all that really matters about that platform. It's not worth it from an economic standpoint. It's not like Nintendo said it would be -- it's not easy to port games to.
 
I have heard statements like that for a long time, and while I agree that the Wii U is no powerhouse by any stretch of the imagination, it's obvious that a lot of the criticism isn't founded in reality. How does a console that is supposedly completely gimped by memory bandwidth, with a terrible CPU, run Need for Speed MW better than either the PS3 or 360? Even Mass Effect 3 outperformed the PS3 version. It's hard to believe a "port" not done by the original team could perform so well if the hardware were severely bottlenecked by memory bandwidth and a weak CPU. Memory bandwidth is obviously not a problem, even if people want to look at the 12.8 GB/s from the main memory pool and chalk it up as gimped. Not a single developer has ever once complained about the memory performance.

The CPU is obviously not very strong in floating-point performance; the 360 and PS3 were definitely stronger there. Developers obviously took advantage of what the Cell and Xenon were good at, and used them to assist the GPU with some graphics rendering. Asking the Espresso to handle that same workload would probably not work out too well. I think developers who leaned very heavily on the CPUs of the 360 and PS3 to render graphics will not find much advantage in the Wii U hardware. (...)

Developers have been optimizing for the Cell and Xenon for so many years now that many of the inefficiencies of those CPUs have been masked by code structured to avoid those potential stalls. (...)

The PS360 CPUs are extraordinarily wasteful. Take the Xenon, for instance: it's worse than the Pentium 4 in having a lot of paper flops and exploiting very little of them. Those flops are only usable in limited, specific workloads (e.g. the Pentium 4 was very good at encoding video and crap at pretty much everything else).
It seems to me the only way to use those paper flops was to add friendly workloads to the games (texture decompression/recompression jobs, audio work, etc., whatever fits), stuff that will work less well on the Wii U CPU. Otherwise, much of the braindead-ness of the Xenon is inescapable even after optimizing for it.

The thing is, it's as if Nintendo put a 1.2GHz Athlon 64 up against a 3.2GHz Pentium 4: the Athlon is a ton more efficient - it draws a tenth the power and does most of the job its competitor does - yet the Pentium 4 comes out a bit on top.
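
That analogy is really just the identity performance ≈ clock × IPC. With purely illustrative IPC numbers (made up for the example, not measured), a much slower clock can indeed get most of the way there:

```python
# perf ~= clock (GHz) * average IPC; the IPC values here are purely illustrative
athlon_style = 1.2 * 2.0   # efficient out-of-order core: low clock, high IPC
p4_style     = 3.2 * 0.9   # deep-pipeline core: high clock, stall-prone, low IPC

print(f"Athlon-style: {athlon_style:.2f}  P4-style: {p4_style:.2f}")
print(f"ratio: {athlon_style / p4_style:.0%}")  # ~83% of the perf at ~38% of the clock
```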

Regarding memory bandwidth, the Wii U GPU is a couple of generations newer (X360 has R500, Wii U has RV7xx) and benefits from more advanced bandwidth-saving techniques; ATI did a very decent job of extracting usable performance from bandwidth-starved GPUs with that tech. I'm thinking of the Radeon HD3200 and HD4200 (both chipset graphics with very low bandwidth), and the Radeon 4650 with its DDR2.

All in all the Wii U is somewhat elegant, but hell... The day I got my hands on it I ran a couple of games in tablet streaming mode, and I didn't really like the controller: it's so wide it's somewhat uncomfortable, so it's moot for me anyway. Hands too far apart, and it weighs significantly more than a 1989 Game Boy. I do not want.
Maybe if they trim 200 grams from it (an OLED or other post-LCD display, lower power use, a smaller battery...) it will be more desirable. I'm sure we'll see a GamePad refresh eventually.
 
Meanwhile, I don't think it's safe to say any of that. Those points there are a matter of opinion.

As for artistic impression, that is entirely subjective and has no place whatsoever in this TECHNICAL thread.

Really, you're going to go with the Lazy Developers defense?

The fact that Rayman Legends targeted the WiiU as the lead launch platform, got delayed very early in the WiiU's lifespan until it could launch on other platforms, and that UbiSoft's follow-up isn't targeting the WiiU, says all that really matters about that platform. It's not worth it from an economic standpoint. It's not like Nintendo said it would be -- it's not easy to port games to.

Then say something technical? So you think the ports were given as much effort as the 360 versions? Really? :rolleyes: They are typically not even ported by the original team; that's not an opinion, that is a fact. A budget port does not mean the developer was lazy, but that resources and time were a limiting factor. Over six years of tweaking and optimizing software for the 360/PS3 has its benefits.

There is a thread talking about the insane fill rate the PS2 had for its time, thanks to the 4MB of eDRAM. Wii U has 32MB of eDRAM... Obviously 32MB would give a console that targets 720p for the majority of games the same fillrate benefits that the PS2 had thanks to its 4MB of eDRAM.

People still want to point at the tri-core 1.25GHz Espresso as a weak link, but the reality is that the processor will actually allow developers to effectively scale the workload over all three cores thanks to the large amount of L2 cache, the lack of which hindered the Xenon. Flops performance might not be great, but integer performance could very well be much better than either the Cell or Xenon.

I would love to have a developer really break down the Wii U: where are the strengths and weaknesses? What are the real-world compromises that have to be made because of those weaknesses?
 
There is a thread talking about the insane fill rate the PS2 had for its time, thanks to the 4MB of eDRAM.
Thanks to the insane bandwidth of the eDRAM @ 48 GB/s.

Wii U has 32MB of eDRAM...
Which, although not officially substantiated, seems to be something like 32 GB/s (see the rest of this thread). The eDRAM in Wii U is apparently there as a cost-cutting measure, not a performance driver: Nintendo could go really cheap on the main system BW (12.8 GB/s) and still get reasonable overall BW, comparable to PS360. Ergo, the eDRAM in Wii U is not comparable, in terms of what it brings to the system, to the eDRAM of the GS in the PS2. There's no similarly massive fillrate, overdraw or particle power in Wii U.
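
The PS2 comparison is easy to sanity-check with some simplified arithmetic. The GS's usual published figures are 16 pixel pipes at ~147 MHz; sustaining that fillrate with alpha blending and a Z read-modify-write costs on the order of 16 bytes per pixel (the per-pixel accounting here is a simplification):

```python
# How much bandwidth does the GS's peak fillrate actually consume? (simplified)
pipes = 16
clock_hz = 147.456e6
fillrate = pipes * clock_hz              # ~2.36 Gpixels/s

# 32-bit colour read + write (blending) plus 32-bit Z read + write
bytes_per_pixel = 4 + 4 + 4 + 4

print(fillrate * bytes_per_pixel / 1e9, "GB/s")  # ~37.7 of the GS's 48 GB/s
```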
 
Not every product that plays games needs to be cutting edge, fellas... I think it's safe to say that if the Wii U didn't have the touch-screen controller it could be selling for $199 right now...
But it does have one and it isn't, so it's more expensive than the products Nintendo identified as direct rivals but is technically weaker than them.

I think it's safe to say budget ports are hardly an indication of a console's peak abilities. Look at Bayonetta 2 on Wii U compared to Bayonetta on 360 and PS3.
It's not out yet, so we can't compare the two in any meaningful fashion.

Rayman Legends may not be a "technical" masterpiece, but it's certainly one of the best-looking games I have ever played.
Subjective, and the game is on those other consoles with no noticeable drop in quality.

The comment about Need for Speed MW only having 6 players online because of the CPU is laughable.
Why?

Black Ops 2 (a launch title) had 18 players in Ground War and the framerate was still held above 30fps most of the time.
Wow, a game known for 60 fps on X360/PS3, and you're boasting of >30 FPS most of the time???

My guess is that Criterion knew the Wii U build would likely have a pretty low number of gamers playing online, and having a low number in each match gives players more potential lobbies to join. Criterion didn't even really start focusing on Need For Speed MW for Wii U until after Christmas last year. They spent about two months working full time on the project and it turned out pretty darn good. Every time a port comes to Wii U with performance issues, the haters come out and blame the hardware, when it's really the software that is just poorly optimized for the hardware at hand. It's fair to criticize the Wii U, but let's at least look at things objectively. Does it make any sense for Need for Speed MW to be limited to six players because of the CPU when Black Ops 2 had 18 players running at a higher framerate on the same hardware?

So you have nothing to add beyond bizarre theories (lower player counts = more lobbies, really?) and apples/oranges comparisons to a game that performs worse on the Wii U than the other consoles it launched on?

Nintendo made their h/w choices to suit Nintendo and their existing s/w skillsets, and it's working for them, as the latest Mario and Pikmin attest. But expecting 3rd parties to staff up teams just to serve the vanishingly small Wii U market is madness. Nintendo consoles are for Nintendo games, and nothing about the Wii U suggests otherwise, especially with their bloody-minded refusal to implement modern hardware conventions (SIMD, DX11/OGL 4.0, high-b/w memory access, etc.).
 
Half the powerdraw, not tenth

...at a tenth of the powerdraw, with correspondingly small size and low noise, offering dual-screen gaming, while remaining backwards compatible with the best-selling platform of last generation.

Not that this has done Nintendo a whole lot of good in the marketplace.

You are really exaggerating here.

Half, not tenth. At the same die size, the PS3 Super Slim uses 70W measured in games, compared to the Wii U's 35W.
 
They created a new bus size for eDRAM that is out of the norm just to lower bandwidth? I do agree that this was a cost-saving measure to get the performance they wanted without more expensive memory, but seeing as how eDRAM isn't cheap, it would likely have been cheaper to go with a 128-bit bus to DDR3 if they were only going to get 32 GB/s from the eDRAM. Who the hell came up with this 32 GB/s theory?

The CPU limits the number of cars to 6 online in Need For Speed, when Mario Kart 8 will have at least 12 (Mario Kart Wii had 12) and run at 60fps? Hmm, doesn't really add up. My point about BO2 was more about the number of players; I have the game and there is no doubt that it doesn't hold 60fps in Ground War, and it doesn't on any console for that matter. The idea that the CPU is the limiting factor for 6 players online is not backed up by anything and makes no sense.
 
You are really exaggerating here.

Half, not tenth. At the same die size, the PS3 Super Slim uses 70W measured in games, compared to the Wii U's 35W.

No, your reading accuracy is lacking. Function specifically wrote, and I quoted, "several years and nodes" later in his rant. Both he and I are well aware that, compared at the same node, the WiiU enjoys roughly a factor-of-two advantage in power draw over the PS360.

The purpose of my addendum was to point out that the PS360 and the WiiU had hugely different design targets, and to imply that the basis of his criticism of the WiiU lies in not recognizing this, instead insisting on measuring it by a yardstick its designers simply were not using.

Whether this was a good move by Nintendo in the marketplace is a different question, but the mixup of the issues keeps pulling every damn thread concerning the hardware of the WiiU down, together with the fact that many here with an interest in tech simply feel that "more is better". Which right there disconnects them from the design ethos of the WiiU.

That so little of technical substance has surfaced since a couple of weeks after the die shot doesn't help the level of discourse either, obviously.
 
No, your reading accuracy is lacking. Function specifically wrote, and I quoted, "several years and nodes" later in his rant. Both he and I are well aware that, compared at the same node, the WiiU enjoys roughly a factor-of-two advantage in power draw over the PS360.

The purpose of my addendum was to point out that the PS360 and the WiiU had hugely different design targets, and to imply that the basis of his criticism of the WiiU lies in not recognizing this, instead insisting on measuring it by a yardstick its designers simply were not using.

Whether this was a good move by Nintendo in the marketplace is a different question, but the mixup of the issues keeps pulling every damn thread concerning the hardware of the WiiU down, together with the fact that many here with an interest in tech simply feel that "more is better". Which right there disconnects them from the design ethos of the WiiU.

That so little of technical substance has surfaced since a couple of weeks after the die shot doesn't help the level of discourse either, obviously.

Ok then. Tenth the powerdraw using some fuzzy logic.
 
Honestly if you don't have any new information about the console I'm not sure why you're posting in this thread. ATM there's little known, much guessed, and AFAIK nothing new.
 
No, your reading accuracy is lacking. Function specifically wrote, and I quoted, "several years and nodes" later in his rant. Both he and I are well aware that, compared at the same node, the WiiU enjoys roughly a factor-of-two advantage in power draw over the PS360.
TBH, last-gen consoles never drew 350W, not even from the wall, taking power supply inefficiencies into account. The 360 was, what, 175-ish W? Launch phat PS3s were 200-something IIRC. A far, far cry from 350W in any case. Also, doesn't the wuu top out at 40+ W in-game? 400W is the rated max capacity of the phat PS3's PSU, but the console never came close to that.

The purpose of my addendum was to point out that the PS360 and the WiiU had hugely different design targets, and to imply that the basis of his criticism of the WiiU lies in not recognizing this, instead insisting on measuring it by a yardstick its designers simply were not using.
The criticism is based on the common yardstick which everyone is using, and has to be seen from that perspective, not the wonky, non-standard yardstick Nintendo is using (which few are buying, with understandable reason).

together with the fact that many here with an interest in tech simply feel that "more is better". Which right there disconnects them from the design ethos of the WiiU.
You expect more is better simply from the passage of time. The wuu is weaker than the last-gen consoles, yet launched six or seven years later. It's not really a question of people mistakenly believing more is better; it's actually more a case of "what the f*** was Nintendo thinking?!"

It's like, here's our new thing, right? It's pretty much the same as the thing you've already got, and have had for the major part of a decade, except there's a tablet with a bad touchscreen on it (it's terrible actually) and no software. Please buy it.

When you launch something new, people expect something new, alright? Not a rehash of what everyone already owns.
 
They created a new bus size for eDRAM
I don't think there's any evidence for that. It's a theory postulated by a Nintendo fanboy who made a couple of correlations between vague PR comments and a Renesas tech sheet, and concluded their widest-bus eDRAM must be in Latte. That's the only argument I've seen, anyhow, and that's all my GoogleFu can find on the theory.

Who the hell came up with this 32 GB/s theory?
It's in this thread. After someone posted a truly crazy 500 GB/s theory (which they've added to Wikipedia, so it must be true...), people challenged it and two figures were presented, both < 35 GB/s, but also unsubstantiated. I don't think there's any decently supported theory for the BW. The only thing I'm comfortable recognising is that there isn't a stupid amount of BW in there (and so probably not a super-wide bus). That only made sense on PS2 because it used multipass rendering. Modern shader architectures don't need massive overdraw, so super-massive BW is of limited value if the rest of the system components aren't enough to use it. I can't see any evidence of super-high BW in games either (even just looking at first-party titles).
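
One sanity check in favour of the sub-35 GB/s figures: if Latte really has 8 ROPs at 550 MHz (both numbers come from the die-shot speculation in this thread, neither is confirmed), the eDRAM only needs to supply roughly that much bandwidth to keep the ROPs saturated, so anything like 500 GB/s would be pointless:

```python
# Bandwidth needed to saturate the ROPs (both input figures are speculative)
rops = 8
clock_hz = 550e6
pixels_per_sec = rops * clock_hz     # 4.4 Gpixels/s peak fillrate

bytes_per_pixel = 4 + 4              # 32-bit colour write + 32-bit Z write
print(pixels_per_sec * bytes_per_pixel / 1e9, "GB/s")   # ~35.2 GB/s
```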

There's an IGN thread where a lot of the contributors in this thread are still going at it, with speculation of 70 and 130 GB/s.
 