Wii U hardware discussion and investigation

UE4 first on iPad or Wii U by Epic? Wii U is in nowhere land. People are not going to put iPad UE4 mobile games on Wii U, and they aren't going to bring the high-end console stuff either.

:LOL:

You didn't read what Sweeney said then because you are trying to make Wii U sound like it's stuck in between two things. This is what Sweeney said.

Unreal Engine 4’s next-generation renderer targets DirectX 11 GPU’s and really starts to become interesting on hardware with 1+ TFLOPS of graphics performance, where it delivers some truly unprecedented capabilities. However, UE4 also includes a mainstream renderer targeting mass-market devices with a feature set that is appropriate there.
All he said is that the target is DX11 GPUs. Then he says it becomes interesting on more powerful GPUs. As you can clearly see, he does not say 1+ TFLOPS is the minimum requirement. Now, if that's a mistake, then that's one thing, but right here he gives no indication of what you're trying to say.
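For context on that 1+ TFLOPS figure, here's the usual back-of-the-envelope peak arithmetic (a sketch only; the Xenos numbers are public, and peak FLOPS say little about real-world performance anyway):

```c
#include <stdio.h>

/* Peak GFLOPS = ALU lanes * 2 ops/clock (multiply-add) * clock in GHz.
 * Xenos (XBox 360's GPU): 48 shaders * 5 ALU lanes = 240 lanes at 500 MHz. */
int main(void) {
    double xenos = 240 * 2 * 0.5;                       /* 240 GFLOPS */
    printf("Xenos peak: %.0f GFLOPS\n", xenos);
    printf("UE4 'interesting' threshold: %.1fx Xenos\n", 1000.0 / xenos);
    return 0;
}
```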
 
From NeoGAF

Thought I'd chime in with a few of the things we know about the Wii U's hardware from the speculation threads (and by "know" I mean info which has been confirmed by multiple reliable sources).

CPU

The Wii U's CPU is a three-core, dual-threaded, out-of-order IBM Power ISA processor with 3MB of eDRAM L2 cache. Superficially it looks pretty similar to the Xenon CPU in the XBox 360, but it's a completely new CPU, and there are a number of important differences from Xenon:

- Firstly, it supports out-of-order execution. Roughly speaking, this means that the processor can alter the order in which it executes instructions to operate more efficiently (there's a toy sketch of this just after the list). The benefit of this depends on the kind of code being run. Physics code, for example, wouldn't see much benefit from an out-of-order processor, whereas AI code should run significantly better. Out-of-order execution also generally improves the processor's ability to run poorly optimized code.

- Secondly, we have the larger cache (3MB vs 1MB). The Xenon's cache was actually pretty small for a processor running 6 threads at 3.2 GHz (roughly 170KB per thread if shared evenly), causing a lot of wasted cycles as threads wait for data to be fetched from main memory. The Wii U CPU's larger cache (512KB per thread by the same measure) should mean code runs much more efficiently in comparison, particularly when combined with the out-of-order execution.

- The Xenon processor used the VMX128 AltiVec unit (or SIMD unit), which was a modified version of IBM's then-standard VMX unit, with more gaming-specific instructions. It appears that the Wii U's CPU will feature a highly customized AltiVec unit itself, possibly based on the newer VSX unit (a plain-VMX example follows the list). This should substantially increase the efficiency of a lot of gaming-specific code, but the important thing is that, unlike the out-of-order execution and large cache, developers have to actively make use of the new AltiVec unit, and they have to really get to know how it operates to get the most out of it.

- The Wii U has a dedicated DSP for audio and a dedicated I/O processor. These relieve the CPU of a lot of work; for instance, there are XBox 360 games which require an entire core to handle audio.
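On the out-of-order point, a toy C sketch (illustrative only, obviously not real Wii U code): an in-order core like Xenon stalls at the first use of a cache-missing load, while an out-of-order core keeps the surrounding independent arithmetic moving in the meantime.

```c
struct node { float value; };

/* The load of node->value may miss cache (hundreds of cycles of
 * latency). An in-order core stalls at the first use of 'v'; an
 * out-of-order core executes the independent lines below while the
 * load is still in flight, hiding much of that latency. */
float score_node(const struct node *node, float a, float b) {
    float v = node->value;   /* potential long-latency load             */
    float w = a * b + a;     /* independent: can execute under the miss */
    float x = a * a - b;     /* independent: likewise                   */
    return v * w + x;        /* first real use of 'v'                   */
}
```

And on the SIMD point, this is what the bread-and-butter operation looks like in plain VMX/AltiVec intrinsics. Whatever Nintendo's customizations turn out to be, they'd sit on top of something like this (nothing here is Wii U-specific):

```c
#include <altivec.h>

/* Four-wide fused multiply-add, the core of most gaming SIMD code
 * (matrix/vector math, skinning, audio mixing). */
vector float madd4(vector float a, vector float b, vector float c) {
    return vec_madd(a, b, c);   /* a*b + c, per lane */
}
```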

The CPU should have quite a bit less raw power than the PS3's Cell, although the same will most likely be true for both the PS4 and next XBox. It will, however, be significantly easier to program for, and should be more effective at running a lot of code, for instance AI.

There aren't any reliable sources on the CPU's clock speed, but it's expected to be around 3.2 GHz or so.

GPU

The GPU is likely to be VLIW-based, with a pretty modern feature-set and 32MB of eDRAM. We don't have any reliable numbers on either SPU count or clock speed, but in bullshit multiplier comparisons to the Xenos (XBox 360's GPU), most indications are that it's closer to 2 or 3 times the raw power of Xenos, as opposed to the 1.5 times quoted in the OP. There are a few things we do know about the GPU though:

- The 32MB of eDRAM is the only hard number we have about the GPU. This is more than three times the size of the eDRAM framebuffer on Xenos, and should allow games to achieve either 720p with 4x AA or 1080p with no AA without having to do tiling (the need to tile AA'd HD images on the Xenos's framebuffer made its "free" AA a lot less free); the arithmetic is sketched just after this list. It's also possible (although unconfirmed) that the eDRAM is on-die with the GPU, as opposed to on-chip (and hence on another die). If true, this means that the eDRAM will have much lower latency and possibly much higher bandwidth than the XBox 360's set-up. Developers will have to actively make use of the eDRAM to get the most out of it, though.

- The GPU features a tessellator. However, we have no idea whether it's a 4000-series tessellator (ie not very good) or perhaps a more modern 6000-series tessellator (a lot better). Again, developers would have to actively make use of this in their game engines.

- The GPU is heavily customized and features some unique functionality. Although we don't have any reliable indications of what sort of functionality Nintendo has focused on, it's been speculated that it's related to lighting. Apparently games which make good use of this functionality should see substantial improvements in performance. More than any other feature of the console, though, developers really need to put in the effort to optimize their engines for the GPU's customizations to get the most out of them.

- The GPU has a customized API, based on OpenGL. Regular OpenGL code should run, but won't run very well and won't make any use of the GPU's custom features. Developers will need a good understanding of the GPU's API to get the most out of it.
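On the eDRAM sizing claim above, the quick arithmetic (a sketch assuming 32-bit colour plus 32-bit depth/stencil per pixel, which is typical for this generation; the actual formats are unconfirmed):

```c
#include <stdio.h>

/* Framebuffer footprint = width * height * 4 bytes, counted once for
 * colour and once for depth/stencil, times the AA sample count. */
static double fb_mbytes(int w, int h, int samples) {
    double colour = (double)w * h * 4 * samples;
    double depth  = (double)w * h * 4 * samples;
    return (colour + depth) / (1024.0 * 1024.0);
}

int main(void) {
    printf("720p, 4xAA:  %.1f MB\n", fb_mbytes(1280, 720, 4));  /* ~28.1 MB: fits in 32MB */
    printf("1080p, noAA: %.1f MB\n", fb_mbytes(1920, 1080, 1)); /* ~15.8 MB: fits easily  */
    /* The same 720p 4xAA target is ~28MB, ie 3 tiles on Xenos's 10MB. */
    return 0;
}
```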

RAM

It seems the console will have either 1.5GB or 2GB of unified RAM, with indications that Nintendo were targeting 1.5GB with earlier dev-kits and later increased that to 2GB. We don't know the kind of RAM being used, but most expect DDR3, probably with a 128-bit interface and a clock speed somewhere in the 750MHz to 1 GHz range, resulting in a bandwidth somewhat, but not significantly, higher than the XBox 360 and PS3. It's worth noting that the large CPU cache and GPU eDRAM somewhat mitigate the need for very high bandwidths. It's possible, but quite unlikely, that they're using GDDR5, which would mean a much higher bandwidth.
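To put rough numbers on that (assuming the rumoured 750MHz-1GHz refers to the DDR3 command clock, so the data rate is double that; these are guesses stacked on rumours, of course):

```c
#include <stdio.h>

/* Peak bandwidth = data rate (transfers/s) * bus width (bytes).
 * DDR3 transfers twice per clock: 750MHz-1GHz -> 1500-2000 MT/s.
 * A 128-bit bus moves 16 bytes per transfer. */
int main(void) {
    double bus_bytes = 128.0 / 8.0;
    printf("low end:  %.1f GB/s\n", 1500e6 * bus_bytes / 1e9);  /* 24.0 */
    printf("high end: %.1f GB/s\n", 2000e6 * bus_bytes / 1e9);  /* 32.0 */
    /* For comparison, XBox 360's GDDR3 is 22.4 GB/s and PS3's XDR is
     * 25.6 GB/s, hence "somewhat, but not significantly, higher". */
    return 0;
}
```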


Going by what we know about the console's hardware, it should be able to produce games which noticeably out-perform what's available on XBox 360 and PS3, so long as everything's properly optimized. Of course, performance will still be far behind the PS4 and next XBox. What we're seeing at E3 is unlikely to be well optimized for a number of reasons:

- "Final" dev-kits, with actual production hardware, only started to arrive to developers a few weeks ago. This would be too late for the E3 demos to make any real use of any improvements this final hardware may have brought. We know that these dev-kits brought a slight improvement in performance, but we don't know if there were any changes in functionality (eg to the eDRAM, which could indicate why we're seeing so little AA).

- Nintendo don't seem to have locked down the clock speeds yet, which makes it difficult for developers to properly optimize games for the hardware. As Nintendo now has final production hardware to do thermal testing on, final clock speeds should come pretty soon.

- For third party multi-plats, the XBox 360 and PS3 versions are going to sell the most (due to higher install-bases), so developers are going to put more resources towards those versions, and are likely to put the more talented team-members on XBox 360 and PS3 development as well. Because they can get PS360-grade performance out of the Wii U with a quick, poorly optimized port, most aren't going to bother putting the time and money into substantially improving the Wii U version.

- We've only seen launch-window titles, and launch-window titles that are about five months from completion, at that. I can only think of a single case where a game for new hardware was actually well optimized at this point before the launch of the console (Rogue Leader for GameCube).

- While third parties are unlikely to make good use of the hardware, Nintendo haven't shown any games from the first party studios most likely to really push the hardware (eg Retro, Monolith, EAD Tokyo, EAD Kyoto Group 3). These studios are the ones to watch for technically impressive games in the first couple of years of the Wii U's life.


Interestingly, the best-looking game that's been shown off thus far is probably Platinum's Project P-100. While people haven't been focusing on it from a technical perspective that much because of the art style, it's got great textures, good polygon detail, very nice lighting, good effects, a nice DoF effect, the IQ seems good and the framerate seems smooth. In some parts it also does nice 3D visuals on both the TV and controller screen. I wouldn't go so far as saying it looks substantially better than anything we've seen on PS360 (certainly not without seeing it in person), but it's definitely a nice looking game.
 
Interestingly, the best-looking game that's been shown off thus far is probably Platinum's Project P-100. While people haven't been focusing on it from a technical perspective that much because of the art style, it's got great textures, good polygon detail, very nice lighting, good effects, a nice DoF effect, the IQ seems good and the framerate seems smooth. In some parts it also does nice 3D visuals on both the TV and controller screen. I wouldn't go so far as saying it looks substantially better than anything we've seen on PS360 (certainly not without seeing it in person), but it's definitely a nice looking game.

It doesn't sound like a "real" next gen machine... :???:
 
Didn't the rumour say the big eDRAM was on the CPU?
While suboptimal, it makes sense: IBM takes care of the eDRAM. They already sell a CPU with 32MB of eDRAM, and are now even using it as L2 on part of their CPU range.
There would be one chip with one CPU die, one GPU die, and a very fast and wide interconnect between them.
 
The CPU should have quite a bit less raw power than the PS3's Cell, although the same will most likely be true for both the PS4 and next XBox. It will, however, be significantly easier to program for, and should be more effective at running a lot of code, for instance AI.
What does the Cell excel at that modern-day processors have not surpassed yet?
 
It's already obvious that Wii-U games will look just as dated compared to PS460 games as Wii games look compared to PS360 games today. We also know that Nintendo is not after high-end hardware anymore (not since the GC, which was an awesome little box in my opinion; they just failed on the 3rd party side), and after all, no system is perfect.
I'm not a programmer, so I would like to ask: isn't it best if the CPU is the slow part, if something must be sacrificed to cost effectiveness anyway? I mean, there is not much that can be done in situations where the fill rate is utterly slow, or if the texture cache is too small and has a very high latency (etc). The CPU is the part which the developers may affect and control the most, so that's the part where they have the best chance to counter the technical difficulties, right?
It's game development, and there are so many procedures, solutions and tricks one may use to "avoid" extensive real-time calculations, so isn't it best if the "problem" is at a "place" where you can actually solve it?

edit.: Or would it be better if the Wii-U had something like Ivy-Bridge with an IGP?
 

Isn't that the sum of only the best-looking rumours for the tech specs so far?


If the console is capable of that much, why didn't they bother making at least a 30-second tech demo or mini-game that would stop people from being convinced otherwise?
And why wouldn't they release GPU theoretical numbers that proved it was more than 2x more powerful than Xenos, along with all the eDRAM, etc?


That would at least keep the stocks from hitting record lows, and it would raise the expectation and mindshare, IMO.



It's already obvious that Wii-U games will look just as dated compared to PS460 games as Wii games look compared to PS360 games today. We also know that Nintendo is not after high-end hardware anymore (not since the GC, which was an awesome little box in my opinion; they just failed on the 3rd party side), and after all, no system is perfect.
I'm not a programmer, so I would like to ask: isn't it best if the CPU is the slow part, if something must be sacrificed to cost effectiveness anyway? I mean, there is not much that can be done in situations where the fill rate is utterly slow, or if the texture cache is too small and has a very high latency (etc). The CPU is the part which the developers may affect and control the most, so that's the part where they have the best chance to counter the technical difficulties, right?
It's game development, and there are so many procedures, solutions and tricks one may use to "avoid" extensive real-time calculations, so isn't it best if the "problem" is at a "place" where you can actually solve it?

The thing is, PS360 developers got used to doing lots of GPU stuff in the CPU. This happened more in the PS3 where the GPU lacked shader performance, but I think even the X360 does FXAA and other post-processing effects through the CPU.
If they go from that to an architecture with a CPU that can't do GPU stuff as effectively as the others (floating-point performance, afaik), they may get the impression that the hardware is weaker.

Then again, I think an experienced console developer should be able to see through that at a first glance, so that may not be the case.

edit.: Or would it be better if the Wii-U had something like Ivy-Bridge with an IGP?

Llano or Trinity maybe, but definitely not Ivy-Bridge as its architecture is a lot more CPU-centric.
Nonetheless, it seems the 35W Trinity + 1866MHz DDR3 combo is able to do a lot more than what we're seeing on the Wii U.
A Trinity with a 50W power budget should be able to do even more.
 
Isn't that the sum of only the best-looking rumours for the tech specs so far?
Yeah, that's what it looks like to me as well. The Ninty fanboy in me wants to believe that rumor set, but the rationalist in me is just going to assume that the CPU is 3x friggin' Broadway and the rest of the hardware similarly anemic to match, until reliably proven otherwise.

If the console is capable of that much, why didn't they bother making at least a 30-second tech demo or mini-game that would stop people from being convinced otherwise?
And why wouldn't they release GPU theoretical numbers that proved it was more than 2x more powerful than Xenos, along with all the eDRAM, etc?
The only two reasons I can think of, and neither is good, are that Nintendo's either stupid, not wanting to blow their secrets for no rational reason, OR that they've got something to hide.

Nintendo's always been a very secretive company, holding its cards to its chest as a matter of tradition. However, sometimes you shouldn't do that. Sometimes you should trumpet and wave your arms and say hey hey, look at me, see how great I am. Even if maybe it's not entirely true. Apple knows this; they're secretive, but they also self-promote. Nintendo's secretive... and nothing.

As for the hiding bit... They seem to be under the mistaken impression that if you don't reveal just how bad your hardware is, it won't matter. Except it does, of course. So the smart company wouldn't aim for the exact rock-bottom lowest specs you could possibly get away with, but rather realistically a bit higher, so that your product can stand up for itself in comparison to the competition for some years at least without embarrassing everyone involved.

That would at least keep the stocks from hitting record lows, and it would raise the expectation and mindshare, IMO.
This is also true.
 
So the smart company wouldn't aim for the exact rock-bottom lowest specs you could possibly get away with, but rather realistically a bit higher, so that your product can stand up for itself in comparison to the competition for some years at least without embarrassing everyone involved.
If you look at things with a different mindset, specs aren't important. How often do you compare the processing power of various cuddly, mechanical pet toys? Or the performance of individual parts in a radio-controlled flying toy? From the perspective of a toy manufacturer, Nintendo could happily ignore the underlying hardware without it being at odds with the toy business. Of course, the console business is a bit different, and Nintendo were happy to talk hardware when they were top dog, so this isn't a clear-cut confusion on their part. Still, the notion that there's something wrong with hiding the hardware comes down to the perspective of who's asking.
 
The thing is, PS360 developers got used to doing lots of GPU stuff in the CPU. This happened more in the PS3 where the GPU lacked shader performance, but I think even the X360 does FXAA and other post-processing effects through the CPU.
If they go from that to an architecture with a CPU that can't do GPU stuff as effectively as the others (floating-point performance, afaik), they may get the impression that the hardware is weaker.
I know that they are offloading a lot to the CPU on the PS360, but my question was about Nintendo's hardware choice on a ~$300 budget system, and whether the situation would have been "better" with a stronger CPU and a cheaper GPU or not (assuming that Nintendo got the best possible price for the chip while negotiating with AMD).
I also understand that priorities can change a lot when going after top performance (like how it will probably happen with the PS460 if they don't go after cloud stuff or something), but, in my opinion, they have a different case with such a "low" budget. The more you increase the resolution, the more games tend to be GPU bound (limited), so perhaps they decided to save a little more money on the CPU in favor of a stronger GPU?


Then again, I think an experienced console developer should be able to see through that at a first glance, so that may not be the case.
I'm pretty sure they do; this simply can't be a question when talking about professional "A" developer teams.


Llano or Trinity maybe, but definitely not Ivy-Bridge as its architecture is a lot more CPU-centric.
Nonetheless, it seems the 35W Trinity + 1866MHz DDR3 combo is able to do a lot more than what we're seeing on the Wii U.
A Trinity with a 50W power budget should be able to do even more.
Looking at reviews on the professional hardware sites (e.g. Anandtech), there is not much difference between HD 4000 and Trinity when it comes to gaming. But I think you can't really compare things with the console world anyway, because MS will continue to hold performance down to 360 levels as much as possible until they have a stronger console again, so we can't really tell how much faster the latest IGPs could perform in consoles; we can only guess that they would do much better.
 
If you look at things with a different mindset, specs aren't important. How often do you compare the processing power of various cuddly, mechanical pet toys? Or the performance of individual parts in a radio-controlled flying toy? From the perspective of a toy manufacturer, Nintendo could happily ignore the underlying hardware without it being at odds with the toy business. Of course, the console business is a bit different, and Nintendo were happy to talk hardware when they were top dog, so this isn't a clear-cut confusion on their part. Still, the notion that there's something wrong with hiding the hardware comes down to the perspective of who's asking.

By the same token if it's "unimportant", then Nintendo shouldn't care to hide it. It literally shouldn't matter. Hiding it says it does.
 
By the same token if it's "unimportant", then Nintendo shouldn't care to hide it. It literally shouldn't matter. Hiding it says it does.
I think what Shifty Geezer meant is that they are releasing/showing a family car, not a sports car, so they are not talking about 0-100 acceleration figures, or how much horsepower, etc... it's a different PR approach.
 
Even family cars have plenty of relevant specs, but I guess we could go back and forth on that forever...

I'm just saying, Nintendo actively hides their specs for a reason.

Apple is the analogue, but Apple actually puts cutting edge hardware in their stuff, almost to their credit imo (they could probably get by, and more profitably, with less imo).

Like Apple, Nintendo wants to say "it's about the experience, not the hardware" (then again, as noted, Apple DOES put great hardware out).

I just don't think they're right, not in the market they're in. It works if you have a gimmick to attract casuals. I don't think a tablet is it this time.

To be fair, I can see everybody hiding their specs soon enough, just as Sony stopped listing PS3 clock speeds at some point before last gen. Gives your competitors less to go on, reduces spec wars in the minds of ignorant consumers who don't understand said specs, etc.

Microsoft was happy to tout 360 specs because it all sounded blazing fast. Who knows if that even lasts.
 
Even family cars have plenty of relevant specs, but I guess we could go back and forth on that forever...

I'm just saying, Nintendo actively hides their specs for a reason.

Apple is the analogue, but Apple actually puts cutting edge hardware in their stuff, almost to their credit imo (they could probably get by, and more profitably, with less imo).

Like Apple, Nintendo wants to say "it's about the experience, not the hardware" (then again, as noted, Apple DOES put great hardware out).

I just don't think they're right, not in the market they're in. It works if you have a gimmick to attract casuals. I don't think a tablet is it this time.

To be fair, I can see everybody hiding their specs soon enough, just as Sony stopped listing PS3 clock speeds at some point before last gen. Gives your competitors less to go on, reduces spec wars in the minds of ignorant consumers who don't understand said specs, etc.

Microsoft was happy to tout 360 specs because it all sounded blazing fast. Who knows if that even lasts.
Again, it doesn't matter from their point of view, imho. I'm not saying whether they are right or wrong, I'm just guessing that it's their PR approach to the target audience. Do you care how many MHz the signal processor in your TV is running at, or what kind of CPU you have in your car? They are targeting a segment of the market who wouldn't understand those numbers even if Nintendo gave them, and looking at how many Wii units they've shipped since 2006, that segment is pretty large.
Yes, we could say that new things like Smart-TVs with cloud gaming, phones and tablets will take over that market, but that would be nothing more than guessing atm =]
 
The thing is, PS360 developers got used to doing lots of GPU stuff in the CPU. This happened more in the PS3 where the GPU lacked shader performance, but I think even the X360 does FXAA and other post-processing effects through the CPU.

Pretty sure that FXAA is done on Xenos, and I have not read about any other effects being done on the 360 CPU.
 
By the same token if it's "unimportant", then Nintendo shouldn't care to hide it. It literally shouldn't matter. Hiding it says it does.
They're not hiding it though. It's just not a relevant value. Look at this random listing on Amazon:
http://www.amazon.co.uk/Emotion-Pet...f=sr_1_2?s=kids&ie=UTF8&qid=1339591629&sr=1-2

There's no mention of its tech specs or sensor types used. And why should there be? Does it work? Yes. That's all the end user cares about. No-one buys a Hasbro cuddly toy over a Disney cuddly toy by comparing specs. By that same token, what's in the box in any console is immaterial when it's what's on the outside that matters. Who cares if PS3 is one iggleflop or three iggleflops as long as its games look good and play well? Now, as consoles also straddle the computer/electronics business, where technical specifications are a deciding factor between devices that serve the same purpose (two devices of the same price with one offering better specs), there is interest in specs (often misplaced, such as buying cameras on megapixel count while overlooking the image quality determined by the lens). But a toy manufacturer isn't hiding their specs if they are selling a product, especially a unique one. Nintendo never released Wii specs AFAIK and that didn't affect the purchasing habits of those who wanted the toy they were offering. The only people who care enough about the specs of Wuu will never buy it if it doesn't look good enough, no matter what the specs. So why bother?
 
If you look at things with a different mindset, specs aren't important. How often do you compare the processing power of various cuddly, mechanical pet toys? Or the performance of individual parts in a radio-controlled flying toy? From the perspective of a toy manufacturer, Nintendo could happily ignore the underlying hardware without it being at odds with the toy business. Of course, the console business is a bit different, and Nintendo were happy to talk hardware when they were top dog, so this isn't a clear-cut confusion on their part. Still, the notion that there's something wrong with hiding the hardware comes down to the perspective of who's asking.

This. I think Nintendo is actively going for a very different kind of gamer now. This will make their console not that interesting to people like us, but it doesn't make their approach a failure.

When the customer has never bought another console before, has never seen a good graphical comparison, and frankly wouldn't be that interested in one if she/he had seen one, getting the MSRP down is a lot more important than high specs.
 
If you look at things with a different mindset, specs aren't important. How often do you compare the processing power of various cuddly, mechanical pet toys?
I know this, and so does most everybody else, except that specs DO matter. ...If you want to attract 3rd party developers, that is.

And Nintendo claimed that this time they want to do that. With Wii, Nintendo seemed to be happy with their own catalogue of diverse titles, including a Zelda and a Metroid and two Mario Galaxies, and a big bunch of mostly lame-ass party gaming crap from third parties. A number of attempts at serious, hardcore Wii games failed spectacularly, largely because Wii was seen as a not very serious gaming console by the serious, hardcore gaming crowd.

If Nintendo really means it when they say they want to fix that, they can't throw together a console made of bottom-dredge from the cheapest, most low-spec barrel they could find.
 
I know this, and so does most everybody else, except that specs DO matter. ...If you want to attract 3rd party developers, that is.
I'm sure Nintendo aren't keeping the specs a secret from developers. ;) And TBH I reckon specs are about the last thing developers will care about, unless you can boast about them and prove you have something of worth to attract the hardcore gamers. A dev I was speaking with the other day was telling me no-one they knew was developing for Vita as they don't see a market there, despite its awesome specs. Nintendo needs to convince devs that they can develop for Wuu economically and will have a large, game-buying install base. I don't see that that'll be facilitated by telling the world how uninspired their hardware is.
 