Wii U hardware discussion and investigation *rename

Status
Not open for further replies.
I think that is the main problem... I imagine most people were expecting something at least a little better than the 360.

Initial impression is that it is gimped by the CPU and low bandwidth...

It can be better than the X360.
A good, modern GPU with 32 MB of eDRAM and a fast, direct link to the CPU could run rings around the combined performance of the X360.
 
I think that is the main problem... I imagine most people were expecting something at least a little better than the 360.

Initial impression is that it is gimped by the CPU and low bandwidth...

Well, those two are low points in relation to PS360.
Where it would seem that the WiiU has advantages is in data paths on chip and between the chips on the MCM, latencies, and cache/local storage sizes. We still don't know how it stacks up in GPU ALU capabilities, but it would seem reasonable that it has advantages there as well.

It is notable that the strong sides of the WiiU don't really pay off with ports - simply giving it a fairly low-ball modern GPU and investing the saved money in a 128-bit bus worth of GDDR5 would have paid off better for current multi-platform titles. Of course, it's still doubtful just how many resources would have been poured into actually producing assets beyond what the HD twins justified, in order to produce something better just for the WiiU, if indeed Nintendo had chosen this approach.

So maybe they are right to go for a more economical balance, both in terms of hardware cost and power draw. It allows PS360 ports comfortably enough, and allows them benefits going forward. It's damnably difficult without hardware specifics to make a decent evaluation of the potential of the system, and even with such specifics, one would have to be well versed in different rendering approaches and their cost under different memory hierarchies to be able to engage in any comparative estimations.

I haven't really done any graphics programming since GL became "Open". But I would be curious to hear someone with current graphics dirt under their fingernails ruminate a bit on approaches that would work well on a system generally similar to the size, shape and colour of the WiiU (*cough*) and how that relates to how things are typically done today.
 
Obviously the Wii U is capable of running ports, but if it's at an excess cost of quality, it could put a bit of a dent in the system's ability to sell multiplatform software and how dedicated developers will be in building "from the ground up" Wii U versions of games.

*If* 64 bit DDR3 is confirmed, then I think we will all be fairly disappointed. 128 bit and/or GDDR5 would've certainly given the system some growth, even with an RV730 level GPU. With 32 MB of eDRAM, can we still expect "almost free" MSAA and 720p z-buffering a la Xenos? Going with a current gen level of capability was already a pretty damning approach in the first place, and I don't see Nintendo getting away with it again. They better have the games and software to back their system up.
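On the 32 MB eDRAM question, a quick back-of-the-envelope check suggests why it changes the picture versus Xenos. This sketch assumes 32-bit colour plus 32-bit depth/stencil per sample - a common configuration, not a confirmed Wii U format:

```python
# Rough framebuffer sizing, assuming 4 bytes colour + 4 bytes
# depth/stencil per sample (hypothetical; actual formats unknown).

def framebuffer_bytes(width, height, msaa_samples, bytes_per_sample=8):
    """Total render-target footprint in bytes."""
    return width * height * msaa_samples * bytes_per_sample

MiB = 1024 * 1024

# Xenos had 10 MiB of eDRAM, forcing tiling for 720p with 4x MSAA.
target_4x = framebuffer_bytes(1280, 720, 4) / MiB   # ~28.1 MiB

# A 32 MiB pool could hold that whole target untiled.
print(f"720p 4x MSAA target: {target_4x:.1f} MiB")
print(f"720p no MSAA: {framebuffer_bytes(1280, 720, 1) / MiB:.1f} MiB")
```

Under those assumptions a full 720p 4x MSAA target fits in 32 MiB without the tiling passes Xenos needed, which is what "almost free" MSAA would hinge on.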
 
Well, in my case (again), Nintendo's system is not attractive enough to make me an early adopter.

I'll wait, but by then the chance is high that the next gen will have arrived...
 
The Wii U is using roughly as much power as the PS360.

Say they got the most processing power they could out of that power budget; then if the next-gen machines from MS/Sony are to be faster than the Wii U, they will have to use more power as well.

I mean something like two to three times the power draw of the Wii U for three to four times the performance.

So say 200 W of power usage and a 400 W PSU for the PS4/Xbox 3.
 
I think that is the main problem... I imagine most people were expecting something at least a little better than the 360.

Initial impression is that it is gimped by the CPU and low bandwidth...

I had an executive from a major publisher describe it to me as 90% of a PS3. That was 2 months ago.

I don't see the point of "a little better": either you try to make a technical statement, or you position yourself for current-gen ports and make it as cost-effective as possible.
 
I had an executive from a major publisher describe it to me as 90% of a PS3. That was 2 months ago.

I don't see the point of "a little better": either you try to make a technical statement, or you position yourself for current-gen ports and make it as cost-effective as possible.

~15GB/S to main memory + eDRAM does fit with this analysis. Even if the GPU is more bandwidth efficient than Xenos/RSX it still won't fully compensate for the reality that there is less bandwidth to go around.
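For reference, peak bandwidth falls out of bus width times transfer rate. The PS3/360 figures below are the public specs; the Wii U line assumes the 64-bit DDR3-1600 configuration discussed later in the thread, which is an estimate, not a confirmed number:

```python
# Peak bandwidth = bus width (in bytes) x transfer rate.
def ddr_bandwidth_gbs(bus_bits, transfers_per_sec):
    return bus_bits / 8 * transfers_per_sec / 1e9

x360_gddr3 = ddr_bandwidth_gbs(128, 1.4e9)  # 22.4 GB/s, unified pool
ps3_xdr    = ddr_bandwidth_gbs(64, 3.2e9)   # 25.6 GB/s, Cell's XDR
ps3_gddr3  = ddr_bandwidth_gbs(128, 1.4e9)  # 22.4 GB/s, RSX's GDDR3
wiiu_ddr3  = ddr_bandwidth_gbs(64, 1.6e9)   # 12.8 GB/s (estimate)
```

Even granting a more bandwidth-efficient GPU, the estimated Wii U figure is roughly half of either PS360 pool, which is the gap the eDRAM has to cover.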
 
The Wii U is using roughly as much power as the PS360.

Say they got the most processing power they could out of that power budget; then if the next-gen machines from MS/Sony are to be faster than the Wii U, they will have to use more power as well.

I mean something like two to three times the power draw of the Wii U for three to four times the performance.

So say 200 W of power usage and a 400 W PSU for the PS4/Xbox 3.

You can easily feed an Ivy Bridge CPU and a strong Kepler GPU with a fanless tiny picoPSU, and such a system would destroy the Wii U or the PS360 by a huge margin if programmed directly, without an OS like Windows. (I've built such systems and I can tell you that the total TDP is so low, you can cool it silently.)
Well, I can see that it would perhaps be too expensive to use such high-quality parts in mass production, but I still think that 400 W for a console would be too much nowadays.
 
I had an executive from a major publisher describe it to me as 90% of a PS3. That was 2 months ago.
By the same token, how many % of a PS3 is the 360? 70? 130? 94? 156?
Such numbers are useless. The system in its entirety is much too complex to be expressed as a single figure of merit. (Unless of course you port code over and see how fast it runs. But - that is obviously a flawed way to assess something new.)

I don't see the point of "a little better": either you try to make a technical statement, or you position yourself for current-gen ports and make it as cost-effective as possible.
I said pretty much the same thing above, but there are caveats.
One is that while multi-platform titles are important, the identity of a platform is defined by its exclusive content. And this has been, and is, particularly so for Nintendo. It will be interesting to see how exclusive titles look a couple of years down the line, and if there is a marked difference compared to the ports.

Another is that technical statements are not the only ones that can be made, nor that they have to involve rendering performance. For instance, the Wii brought motion control and a focus on accessibility to the table. It is hard to imagine anyone today arguing that a table tennis game, or a golf game, or a sword fighting game and so on is better played with a classical controller, regardless of rendering power. And some of the things allowed by the WiiU controller are also orthogonal to rendering performance.

And of course a technical statement could be - "We feel that noisy power hogs are an abomination in the living room and less than desirable generally". :)

While this is Beyond3D, and a focus on graphical prowess is understandable, that particular tunnel vision has led to spectacular mispredictions of how the market will react to new consoles. The WiiU is obviously made to be able to receive ports, but it is equally obvious that it is not its main thrust, or Nintendo wouldn't have spent the time and money on the Wuublet in the first place.
 
The point here is that the general public has already settled into the impression that it's significantly faster and more powerful than the current consoles. So any port that's inferior to the PS3/X360 version gets attributed to lazy devs and publishers, and not to Nintendo being cheap with the system.

It's the 360 vs. PS3 situation all over again.
 
Anand thinks the DDR is 800 MHz: https://twitter.com/anandshimpi/status/270289178242195456

He's also been tweeting about his teardown; he tore the heat spreader off. He said he damaged the dies but was able to get measurements, which will be in his article.

The funny thing is it's all downclocked to hell, with buses and ROPs cut to a quarter because of the design. The costs for Nintendo aren't any lower... even Nintendo magic couldn't bend the laws of physics.
 
We still don't know the eDRAM bandwidth and size, or whether it has the ROPs integrated. That could reduce the load on the main memory pool somewhat, but the memory speed is probably still a big bottleneck.
 
Anand said:
There are four 4Gb (512MB) Hynix DDR3-800 devices surrounding the Wii U's MCM (Multi Chip Module). Memory is shared between the CPU and GPU, and if I'm decoding the DRAM part numbers correctly it looks like these are 16-bit devices giving the Wii U a total of 6.4GB/s of peak memory bandwidth. That doesn't sound like a lot, but the Wii U is supposed to have a good amount of eDRAM for both the CPU and GPU to use.

http://www.anandtech.com/show/6465/nintendo-wii-u-teardown

Edit: As noted below, it looks like he forgot to take the DDR into account.
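The edit makes sense if you redo the arithmetic. "DDR3-800" already names the transfer rate, so Anand's 6.4 GB/s corresponds to reading 800 as single-data-rate transfers; if 800 MHz is instead the I/O clock, double data rate doubles the figure. A sketch, assuming the four 16-bit devices he describes:

```python
# Four 16-bit DDR3 devices form a 64-bit bus = 8 bytes per transfer.
bus_bytes = 4 * 16 // 8

# Treating 800e6 as the transfer rate gives Anand's original number:
single_rate = bus_bytes * 800e6 / 1e9       # 6.4 GB/s

# Counting both clock edges (double data rate) doubles it:
double_rate = bus_bytes * 2 * 800e6 / 1e9   # 12.8 GB/s
```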
 
That would be absurdly low; even 12.8 GB/s is seriously pushing it.
I can't imagine that working, as apparently BLOPS 2 is able to run at 60 fps at some reasonable resolutions. It'd be insane to have only, what, ~100 MB/frame of data traffic? That would mean all textures, shadow maps and other data would have to be cached in the eDRAM...
Although screenshots seem to indicate that at least on multiplayer levels the shadows are completely static, so they're clearly bandwidth-starved - but still, at least the ~200 MB/frame that 12.8 GB/s allows has to be there.
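The per-frame figures in the post above are just peak bandwidth divided by frame rate:

```python
# Per-frame traffic budget at a fixed frame rate.
def mb_per_frame(bandwidth_gbs, fps):
    return bandwidth_gbs * 1000 / fps  # GB/s -> MB per frame

budget_64bit  = mb_per_frame(6.4, 60)   # ~107 MB/frame
budget_128bit = mb_per_frame(12.8, 60)  # ~213 MB/frame
```

At 6.4 GB/s a 60 fps title gets roughly 107 MB of total main-memory traffic per frame; 12.8 GB/s roughly doubles that to ~213 MB, which is the "at least ~200 MB/frame" figure.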
 