Wii U hardware discussion and investigation

I'm not aware of BLOPs 2 on the Wii U having higher res shadows, and DF doesn't mention it. Early screenshots showed the opposite, but that was probably fixed before release. Do you have a particular comparison in mind?

Nothing about the Wii U version is better than the 360's. Some Nintendo fans imagined it looked better, but they were quickly corrected by DF and by trusted members like darkx, who had both the 360 and Wii U versions and analyzed them.
 
Same resolution, and the Wii U won't have the copy-out overhead that the 360 does for at least 2 buffers.

(Duplicate info, didn't see terraincognita's post.) One thing DF didn't mention, and this is according to Timothy Lottes, is that Frozenbyte also moved to the PC-spec FXAA on the Wii U. Trine 2 is actually one of the games that makes me doubt the 160 ALU theory. But I think we really need to see more multi-platform games on Wii U to get a better grasp of its processing power and bottlenecks.


Also, is it possible that some of these games aren't GPU bound, but are choosing not to increase the res/AA/etc. for other reasons? For example, if a game is CPU bottlenecked on Wii U, maybe the developer is choosing to keep the GPU headroom to avoid the possibility of compound bottlenecks, which would result in even larger and more frequent framerate drops. I feel like Batman: AC could be this way. If the developer was GPU bound on the Wii U, why would they have added FXAA?
 
Actually, it is a higher resolution some of the time. Frozenbyte removed the dynamic resolution in the Wii U version, so it is a fixed 720p. They also moved to the PC-spec FXAA on the Wii U. This is actually one of the games that makes me doubt the 160 ALU theory. But I think we really need to see more multi-platform games on Wii U to get a better grasp of its processing power and bottlenecks.

Trine 2 would be a perfect example: if it runs much better on an HD 5550 than on the Wii U, then we can rule out 320 SPs, IMO.
 
It's been brought up like a million times now. If the Wii U games were consistently CPU limited (including due to insufficient bandwidth) and had GPU resources to spare, they'd be running at higher resolutions. CPU utilization doesn't scale with resolution.

Another thing that's been brought up a million times is that not only is the CPU weak, but so is the memory bandwidth.

The thing has a friggin' ~12GB/s of bandwidth for the CPU and GPU. Of course the console can't do higher resolutions across the board.
 
Nintendo don't really have to be limited to commercially available configurations. RV730 (a.k.a. Mario) actually has a configuration of 320 shaders to 32 TMUs. If you're going to be running Xbox ports you're probably going to want 8 ROPs - and the ROPs are independent of the SIMD pipelines.

I'm not talking about TMUs; I'm talking about the ratio of pixels calculated by the shader clusters versus the number of ROPs available to store them in the framebuffer. Obviously the ROPs 'limit' the bandwidth to memory, so when you add more shader units they can only be fully utilized if the shader program is long enough. The other way around, adding more ROPs or decreasing the number of shader units leads to the ROPs idling, because the shader units can't finish the program fast enough to feed them.
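
As a rough illustration of that trade-off, here's a minimal sketch. The 550 MHz clock and the 8-ROP count are assumptions taken from elsewhere in the thread, not confirmed specs, and VLIW issue details are ignored for simplicity:

```python
# Minimal sketch of the ALU:ROP balance argument (assumed 550 MHz, 8 ROPs).
CLOCK_HZ = 550e6
ROPS = 8

peak_fill = ROPS * CLOCK_HZ  # max pixels/s the ROPs can write: 4.4 Gpix/s

for alus in (160, 320):
    issue_rate = alus * CLOCK_HZ        # scalar ALU ops issued per second
    breakeven = issue_rate / peak_fill  # shader length at which ALUs saturate
    print(f"{alus} ALUs: ROP-bound for shaders under ~{breakeven:.0f} ops/pixel")
# -> ~20 ops/pixel with 160 ALUs, ~40 with 320. Shorter shaders leave ALUs
#    idle (ROP-bound); longer ones leave ROPs idle (ALU-bound), which is the
#    ratio trade-off described above.
```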

That depends on what you anticipate your workload being. If your projections show you'll be texture bound with 80 shaders to 4 TMUs, you'd probably want a different ratio. RV730 has 40 shaders to 4 TMUs, and AMD's low-end GPUs maintained this ratio right up to the HD 6350.
I get your point. I assume that 4 TMUs per cluster can blend 2 texels in one clock, twice over. That makes sense with 4 or 5 ALUs per SIMD row. Am I wrong here?

I'm not aware of BLOPs 2 on the Wii U having higher res shadows, and DF doesn't mention it. Early screenshots showed the opposite, but that was probably fixed before release. Do you have a particular comparison in mind?

http://www.youtube.com/watch?v=nBFT2rACfWA Skip to 1:56 and check the mic's shadow. I saw other footage of the same thing too, but I can't seem to find it anymore.

CPU and GPU are on the same package but different dies, with different clock speeds, going by Marcan's figures. Wouldn't it be interesting if the eDRAM was clocked differently to the GPU? Renesas eDRAM is supposed to go up to 800 MHz!
Yes, but why would Nintendo go through the hassle of adding additional logic to interface the buses when they can just as well run the GPU at 1/3 or 1/2 the CPU's clock?
 
Another thing that's been brought up a million times is that not only is the CPU weak, but so is the memory bandwidth.

The thing has a friggin' ~12GB/s of bandwidth for the CPU and GPU. Of course the console can't do higher resolutions across the board.

Yeah, and that's not good when compared to the discrete GPUs that have 32MB of embedded DRAM.

Wait, there are no discrete GPUs with 32MB of embedded DRAM?

Never mind.

We don't know yet if 12.8GB/s is insufficient for textures (whatever they can't use the eDRAM for) after taking away whatever the CPU needs, and assuming that the CPU can (and does) stream geometry directly to the GPU via a dedicated channel. But I'm skeptical that it is, especially if there's even moderate texture compression and overdraw elimination (deferred renderer, depth-only pass, etc.). For 720p@30Hz you'd be looking at about 463 bytes of texturing per screen pixel to saturate 12.8GB/s. That's a whole lot of texturing.
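
For reference, a quick sketch of where that ~463 bytes/pixel figure comes from; it's straightforward arithmetic with no assumptions beyond the numbers already quoted:

```python
# Bandwidth available per shaded pixel at 720p30, as quoted above.
bandwidth = 12.8e9                 # bytes/s of main-memory bandwidth
pixels_per_sec = 1280 * 720 * 30   # screen pixels drawn per second
print(bandwidth / pixels_per_sec)  # ~463 bytes per screen pixel per frame
```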
 
function, you do have quite a comprehensive theory for Wuu's lack of performance. I don't, however, see any proof that this bottleneck lies in the shader cores. (BTW, it's still likely those are R700 DirectX 10.1 parts at heart; I don't know why people keep comparing it to DirectX 11.)

On the subject of the lack of increased resolution, this is what I read after a quick search. Is it incorrect?

In the most general terms, your CPU limits you at lower resolutions and your GPU limits you at higher resolutions (and/or higher detail settings, more AA, etc.).

Basically, at low resolutions your GPU isn't working that hard, so it's more about how many frames your CPU can push through. As your resolution/detail/AA is increased, though, the GPU has to work harder and harder to keep up with the CPU and eventually becomes the limiting factor.

http://forums.anandtech.com/showpost.php?p=32393537&postcount=6
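
To make that concrete, here's a toy frame-time model; the 25 ms CPU cost and 15 ns/pixel GPU cost are made-up numbers, purely for illustration:

```python
# Hypothetical frame-time model: the slower of CPU and GPU sets the pace.
def frame_ms(pixels, cpu_ms=25.0, gpu_ns_per_pixel=15.0):
    gpu_ms = pixels * gpu_ns_per_pixel / 1e6
    return max(cpu_ms, gpu_ms)

for w, h in [(640, 360), (1280, 720), (1920, 1080)]:
    print(f"{w}x{h}: {frame_ms(w * h):.1f} ms/frame")
# Below ~1.67 Mpixels the 25 ms CPU cost dominates and frame time is flat,
# which is why raising resolution is 'free' when you're CPU limited.
```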

If anything, what we've heard is that devs consider the CPU a bottleneck. I don't think Nintendo were aiming for a resolution jump over the 360; the system isn't designed for that. Other limitations, such as in alpha effects, could be coming from the eDRAM. If it is only, say, 70.4 GB/s, and it is expected to act as a frame buffer, local render target, and even MEM1 for the CPU, it stops looking like an unlimited resource.

And now there's talk of a laughably ridiculous lack of documentation on Nintendo's part? (I expect we'll hear more about that soon.) Nintendo's tools aren't known to be the best either, so porting to an architecture that is somewhat better in some ways and somewhat worse in others won't be trivial. It's certainly more of a pain in the arse than bringing a game from PC to 360.
 
Brazos vs. Latte

Hello.

I measured the logic area (the areas without a visible structure) of "Brazos" and "Latte" in the illustration on page 181, the picture posted by "Gipsel".

The difference between the two logic areas of "Brazos" (40nm, 40 SPs per block) and "Latte" is ~12.4% (Brazos > Latte).

1.) So if "Latte" is at 40nm, then it is more like ~320 SPs?

2.) If "Latte" is at 55nm or so, then it is more like ~160 SPs?

3.) Or the scaling of the illustration is wrong.

And why is everyone talking about a ~15W GPU in the Wii U and not a 20W or 25W GPU? Are the CPU and other components so energy hungry?
 
And why is everyone talking about a ~15W GPU in the Wii U and not a 20W or 25W GPU? Are the CPU and other components so energy hungry?

First, please don't double-post, it's poor forum etiquette and will probably anger the mods...

As for your question, the wuu draws a total of 38-ish watts from the WALL SOCKET while running a game with a disc spinning in the drive. So assuming an 85% efficient power supply, that leaves about 32W to power ALL components in the console, not just the CPU and GPU.

The BR drive probably draws no more than 6-8W or so; there are no thick power cables attached to it, just a ribbon connector. There are two wifi modules in there, one for the tablet and one for an internet connection. I've seen estimates of a watt each; that might be a little high, I dunno, but for argument's sake let's go with that.

The CPU is only like 4W in the wuu, innit? We're already down to ~20W left of our budget. There's still the cooling fan (1-1.5W maybe), 8 RAM devices, and various miscellaneous ICs and circuitry left to account for. So estimating ~15W for the entire GPU is probably not too far off, considering some of the 20W we had left to work with will be lost in the onboard voltage regulators...
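
Summing that up as a sketch; every figure below is an estimate from this post, not a measurement, and the RAM/misc allowance is a hypothetical placeholder:

```python
# Rough power-budget arithmetic from the estimates above.
wall = 38.0          # W at the socket while gaming, disc spinning
board = wall * 0.85  # ~32.3 W after an assumed 85%-efficient PSU
drive = 7.0          # middle of the 6-8 W BR-drive guess
wifi = 2.0           # two modules at ~1 W each
cpu = 4.0
fan = 1.25           # middle of the 1-1.5 W guess
ram_and_misc = 3.0   # hypothetical allowance for 8 RAM devices + misc ICs
gpu = board - drive - wifi - cpu - fan - ram_and_misc
print(f"~{gpu:.0f} W left for the GPU before regulator losses")  # ~15 W
```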
 
Apology

Sorry for the double-post. It is my first time ever using a forum. I had problems with my first post and realized too late that it was only delayed. :oops:
 
First, please don't double-post, it's poor forum etiquette and will probably anger the mods...

As for your question, the wuu draws a total of 38-ish watts from the WALL SOCKET while running a game with a disc spinning in the drive. So assuming an 85% efficient power supply, that leaves about 32W to power ALL components in the console, not just the CPU and GPU.

The BR drive probably draws no more than 6-8W or so; there are no thick power cables attached to it, just a ribbon connector. There are two wifi modules in there, one for the tablet and one for an internet connection. I've seen estimates of a watt each; that might be a little high, I dunno, but for argument's sake let's go with that.

The CPU is only like 4W in the wuu, innit? We're already down to ~20W left of our budget. There's still the cooling fan (1-1.5W maybe), 8 RAM devices, and various miscellaneous ICs and circuitry left to account for. So estimating ~15W for the entire GPU is probably not too far off, considering some of the 20W we had left to work with will be lost in the onboard voltage regulators...
From AnandTech: http://www.anandtech.com/show/6465/nintendo-wii-u-teardown

Wii U power consumption:
- Wii U menu (no disc in drive): 31.2W
- Wii U menu (disc in drive): 32.8W
- Super Mario U: 33.0W

The disc is constantly spinning in the Wii U, IIRC. With your estimate, that would give the Wii U GPU ~6.5W more to work with, the total being ~21.5W. Considering TDP != power consumption, we are likely looking at at least a 25W TDP part. That's 7 GFLOPS/W with 160 ALUs or 12.8 GFLOPS/W with 320 ALUs. Both should be reasonable for a 55nm or 40nm R700 part, and in line with what we've seen from the HD 4670 at 8.1 GFLOPS/W on 55nm.
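
As a sanity check on those GFLOPS/W figures, here's what GPU clocks they imply, assuming the usual 2 FLOPs per ALU per clock and taking the 25W TDP estimate above at face value:

```python
# Back-solving the quoted GFLOPS/W figures for the GPU clock they imply.
def implied_clock_mhz(gflops_per_watt, tdp_w, alus):
    return gflops_per_watt * tdp_w * 1e3 / (alus * 2)  # 2 FLOPs/ALU/clock (MADD)

print(implied_clock_mhz(7.0, 25, 160))   # ~547 MHz for the 160 ALU case
print(implied_clock_mhz(12.8, 25, 320))  # ~500 MHz for the 320 ALU case
```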
 
From AnandTech: http://www.anandtech.com/show/6465/nintendo-wii-u-teardown

Wii U power consumption:
- Wii U menu (no disc in drive): 31.2W
- Wii U menu (disc in drive): 32.8W
- Super Mario U: 33.0W

The disc is constantly spinning in the Wii U, IIRC. With your estimate, that would give the Wii U GPU ~6.5W more to work with, the total being ~21.5W.

?
Why can we discount the disc's power usage when it is constantly spinning? Or maybe I'm reading it wrong.
 
?
Why can we discount the disc's power usage when it is constantly spinning? Or maybe I'm reading it wrong.

It's not discounted; it is counted at 1.6W in the AnandTech article. I took the 6-8W estimate, subtracted the real-world figure, and ended up with ~4.4-6.4W freed up that could, by his estimate, be left over for the GPU.
 
?
Why can we discount the disc's power usage when it is constantly spinning? Or maybe I'm reading it wrong.

The difference between having a disc spinning or not is ~1.6W.

Honestly, it was a bit of an exaggeration to assume a BR drive would ever consume 8W. This isn't 1997, where we needed >10,000 RPM CD-ROMs to get acceptable transfer rates.
Or 1.5W for a 40mm fan that's probably running below 1000 RPM.

So IMO it's a safe bet that the GPU is getting 20W or more.
 
He's mistaken then!

To be absolutely clear, it's about vastly lower frame rates and/or resolutions (the two are somewhat interchangeable) than on supposedly similarly powerful ~352 GFLOP, 8 ROP PC parts such as the HD 5550. The Xbox 360 <-> PC multi-platform games are just a standout indicator of how massively a 320 shader, 8 ROP part can outperform the 360, even in a less than optimal setting.

To emphasise: it's *not* about the Wii U being slightly slower or faster than the Xbox 360. If a Wii U port had a slightly higher frame rate, or a resolution a few percent higher, it wouldn't change anything. We should actually expect that sometimes - its ROPs and TMUs are faster, after all, and even the HD 6450 can give the 360 a run for its money. If, for example, Wii U ports started to run at a 50-75% higher resolution while still maintaining the same frame rate, then we'd be looking at compelling evidence of HD 5550 or "320 shader" performance.

So nope, the argument isn't based entirely on "lower frame-rates" (and I hope he didn't mean "than the 360")!

Edit: And you could actually have a 160 shader GPU that was stronger than the PS360. If the GDDR5 variant of HD 6450 had 16 TMUs and 8 ROPs it would probably be a very good candidate.

That card is clocked at 750MHz and has 240 GFLOPS available to it... I've never denied that 160 SPs is a possibility, but that card doesn't help your argument: it runs COD4 at unplayable framerates while offering nearly 50% more power than the Wii U, while the Wii U handles Black Ops 2 at the same configuration as the 360, with frame drops during alpha-heavy scenes. It is just not a good comparison, IMO.
 
First, please don't double-post, it's poor forum etiquette and will probably anger the mods...

As for your question, the wuu draws a total of 38ish watts from the WALL SOCKET while running a game, with a disc spinning in the drive. So assuming 85% efficient power supply, that leaves about 32W to power ALL components in the console, not just the CPU and GPU.

The BR drive probably draws no more than 6-8W or so, there's no thick power cables attaching to it, just a ribbon connector. There's two wifi modules in there, one for the tablet and one for an internet connection, I've seen estimates of a watt each, might be a little high I dunno, but for argument's sake let's go with that.

CPU is only like 4W in the wuu innit? We're already down to ~20W left of our budget. There's still the cooling fan (1-1.5W maybe), 8 RAM devices and various miscellaneous ICs and circuitry left to account for. So estimating ~15W power for the entire GPU is probably not too far off, considering some of the 20W we had left to work with will be lost in the onboard voltage regulators...
Here are some numbers we have been working with. Take it you did some testing to get 38 watts?

- From the wall: 33W max @ 90% PSU = ~30W; now 32W (+2W)
- Disc drive: ~4W; now 1.6W (+2.4W). We have photos of the Blu-ray drive itself; AnandTech posted them in their teardown.
- WiFi: ~0.5W; now 2W (-1.5W)
- Fan: 1W
- What's left: ~18W; now ~19W

My guesses (anyone have hard numbers?):
- 2GB DDR3 RAM: ~2W
- Flash storage: 0.5W

That leaves about 15W for the whole GPU chip; now ~17W.

This is the math we were using for the CPU: if Gekko ran at 486MHz and had a TDP of 4.9W at 180nm, then a 729MHz Broadway core @ 90nm draws 3.675 watts per core, and a 1.25GHz Espresso core (1.4x the clock speed of Broadway) at 45nm draws 2.5725 watts. That works out to closer to 8 watts for the whole CPU (7.7175, to be exact).
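
That scaling math, reproduced as a sketch: it assumes power scales linearly with clock and halves per full node shrink, which is a rough rule of thumb rather than measured data, and it uses the post's own 1.4x clock factor:

```python
# Reproducing the Gekko -> Broadway -> Espresso power-scaling estimate above.
gekko_w, gekko_mhz = 4.9, 486.0               # 180 nm baseline
broadway_w = gekko_w * (729 / gekko_mhz) / 2  # 90 nm: clock up, one full shrink
espresso_core_w = broadway_w * 1.4 / 2        # 45 nm: the post's 1.4x clock factor
print(broadway_w)           # 3.675 W per core
print(espresso_core_w * 3)  # ~7.72 W for Espresso's three cores
```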

I didn't have a fan listed, and your wifi numbers are double mine, but I have no idea what the correct number is; I was just guessing.
 
Check this out: http://www.youtube.com/watch?v=0eH3DmUgokk

One of the first games from a dev that I respect technically for always getting a lot out of the hardware, and I'm thinking it may raise respect for Wii U hardware a tiny bit.

It also brings a co-op mode to this game using the Wii U pad, something that I, as a father and a gamer who often plays with less experienced gamers, really love to see more of...

Of course I'm hoping to see modes like this come to other platforms as well (maybe even PS3 + Vita mode), but if this kind of thing becomes common practice on Wii U versions and the next-gen platforms won't have something similar, it could definitely be a system seller.

Anyway, that was off-topic - but technically, this game looks pretty good on Wii U. It's using all the PC's high-res textures and has better lighting as well, from the looks of things, and it seems to be one of the first games that is quite a bit better on Wii U than on PS3 or 360. It will be interesting to see a more detailed comparison by the time the game ships (in March). In particular, I'm interested to see how the framerate holds up.
 
Here are some numbers we have been working with. Take it you did some testing to get 38 watts?

- From the wall: 33W max @ 90% PSU = ~30W; now 32W (+2W)
- Disc drive: ~4W; now 1.6W (+2.4W). We have photos of the Blu-ray drive itself; AnandTech posted them in their teardown.
- WiFi: ~0.5W; now 2W (-1.5W)
- Fan: 1W
- What's left: ~18W; now ~19W

My guesses (anyone have hard numbers?):
- 2GB DDR3 RAM: ~2W
- Flash storage: 0.5W

That leaves about 15W for the whole GPU chip; now ~17W.

This is the math we were using for the CPU: if Gekko ran at 486MHz and had a TDP of 4.9W at 180nm, then a 729MHz Broadway core @ 90nm draws 3.675 watts per core, and a 1.25GHz Espresso core (1.4x the clock speed of Broadway) at 45nm draws 2.5725 watts. That works out to closer to 8 watts for the whole CPU (7.7175, to be exact).

I didn't have a fan listed, and your wifi numbers are double mine, but I have no idea what the correct number is; I was just guessing.

WiFi numbers at 0.5W are likely correct: http://www.raspberrypi.org/phpBB3/viewtopic.php?p=95973 (0.49W for a WiFi-N dongle).

Also, flash storage wouldn't need energy from the PSU, at least not unless it was being used. This should move it back up to ~19 watts, which fits pretty well in a 25W TDP.
 
So we have a couple of developers now who are using higher-resolution textures in the Wii U version of a game (vs. Xbox 360/PS3). You don't need more shader resources (ALUs/TMUs/ROPs) to handle higher-resolution textures; it simply takes more bandwidth. This means they would probably not be anywhere close to bandwidth limited if they used the same texture quality as the Xbox 360/PS3.
 