Wii U hardware discussion and investigation

I don't remember anyone talking about GPGPU and the X360. I'm not even sure the term existed back in 2005-2006.

Come on! :D Doing stuff other than transformation and pixels with the GPU has been thought about for a long time. Heck, even the GS in the PS2 was at some point experimented with because of its (at the time) extreme fillrate.
GPGPU is just the current buzzword.
I remember people talking about doing animation, physics, culling, etc. on the 360 GPU. This was feasible on a console for the first time because of the higher programmability.

How can you be sure?

Well, of course not a hundred percent sure, but every source (incl. official ones) tells us about an almost current gen Radeon GPU with some extra features. Just the big increase in shader units and the greater bandwidth alone should guarantee better performance.
 
Come on! :D Doing stuff other than transformation and pixels with the GPU has been thought about for a long time. Heck, even the GS in the PS2 was at some point experimented with because of its (at the time) extreme fillrate.
GPGPU is just the current buzzword.
I remember people talking about doing animation, physics, culling, etc. on the 360 GPU. This was feasible on a console for the first time because of the higher programmability.

Okay, and now that the console has been out for 7 years, how much non-graphics work has actually been done on Xenos? Probably not an awful lot. That's much more telling than anything anyone may have claimed about XBox 360 a long time ago.

Well, of course not a hundred percent sure, but every source (incl. official ones) tells us about an almost current gen Radeon GPU with some extra features. Just the big increase in shader units and the greater bandwidth alone should guarantee better performance.

R7xx is where all the rumors have been pointing for a long time. 3-4 years old does not qualify as almost current gen. I've also not heard of any extra features except for the eDRAM. Could you provide any sources on this?

If you want to believe that the GPU will be used to pick up the slack from the CPU's lack of vector FP, then that's where those extra shader units will be going instead of towards improving the graphics. You can't have it both ways.
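
For a rough sense of scale, here's the back-of-envelope peak ALU math, using the figures the rumors point at (320 R7xx-style ALUs at 550 MHz is a rumor, not a confirmed spec):

    #include <stdio.h>

    int main(void) {
        /* Xenos: 48 unified shader units, vec4+scalar co-issue (~10 flops/clock), 500 MHz */
        double xenos_gflops = 48 * 10 * 0.500;  /* ~240 GFLOPS */
        /* Rumored Wii U GPU: 320 R7xx-style ALUs, 2 flops/clock (MADD), 550 MHz -- unconfirmed */
        double wiiu_gflops = 320 * 2 * 0.550;   /* ~352 GFLOPS */
        printf("Xenos ~%.0f GFLOPS vs rumored Wii U ~%.0f GFLOPS (%.2fx)\n",
               xenos_gflops, wiiu_gflops, wiiu_gflops / xenos_gflops);
        return 0;
    }

If those rumored figures hold, that's roughly 1.5x Xenos's peak, and every ALU cycle spent on compute comes out of that same budget rather than going towards better graphics.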
 
I must say, I saw a video of the GamePad in action on Miiverse and it's pretty cool. Each game has a community, it's easy to ask for help, and everything is so well done. It's like Wii Facebook. I'm surprised the battery is so bad though, and no multitouch is kinda sad at this point in time. I'm guessing the Wii U's hardware potential should exceed PS360 in the long run, and that's good enough for Nintendo.

Next gen engines are all about scaling down to mobile, so I'm guessing many multiplatform titles will release on Orbis/Durango, PS360, Wii U, iPhone/iPad, Android, and Win 8 phones.
 
Now let's say that this time around you can texture from eDRAM. With Wii U's limited main RAM bandwidth this is desirable. In fact, it could be entirely possible that you must texture from eDRAM, which AFAICR was the case in Flipper/Hollywood, and which would make a big bandwidth hit on it unavoidable. In this case it's easy to see how Wii U's GPU eDRAM bandwidth could end up a limiter long before Xenos's, while still providing tremendously more than what main RAM alone does.

That would make the WiiU some of the worst designed console hardware ever.

The EDRAM is not going to be big enough to hold a whole level's textures in it, which means you will have to load textures from the DDR3 RAM, thus eating twice the bandwidth for no reason (if it only runs at the same speed as the DDR3 RAM).
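
To put the "twice the bandwidth" point in concrete terms: every byte streamed this way is read once from DDR3 and then written once into eDRAM before the GPU ever samples it. A toy calculation (the per-frame figure is invented purely for illustration):

    #include <stdio.h>

    int main(void) {
        double mb_per_frame = 8.0;  /* hypothetical new texture data per frame */
        double fps = 60.0;
        /* the same bytes cross two buses: a DDR3 read plus an eDRAM write */
        double ddr3_read_gbs = mb_per_frame * fps / 1024.0;
        double edram_write_gbs = ddr3_read_gbs;
        printf("DDR3 read: %.2f GB/s, eDRAM write: %.2f GB/s -> data moved twice\n",
               ddr3_read_gbs, edram_write_gbs);
        return 0;
    }

How much that double movement actually hurts depends entirely on how big the per-frame figure really is.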

The FXAA in games like Arkham City is probably done in GPGPU; I don't think they'd be able to work that into the already overburdened CPU and main RAM load. This suggests at least a possibility of being able to texture from eDRAM. I don't think they'd add it if it hurt framerates further on a game that is already compromised; in this case the GPU probably had enough spare fillrate and ALU power, but there could be additional overhead from the extra resolve plus the main RAM texturing bandwidth that would be involved if it can't texture from eDRAM.

If Wii U's GPU eDRAM were as capable as Xenos's then we should be seeing at least 2xMSAA on ports that didn't have it on XBox 360; it'd be free given that Wii U's GPU has more than 2x the eDRAM.

FXAA is a post process pixel shader, nothing makes it "GPGPU".
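
For reference on the quoted MSAA point, the render target footprint math (720p, 32-bit color, 32-bit Z, no tiling; the 32 MiB figure is the rumored Wii U pool, not a confirmed spec):

    #include <stdio.h>

    int main(void) {
        int w = 1280, h = 720, samples = 2;    /* 720p with 2xMSAA */
        int bytes_color = 4, bytes_depth = 4;  /* 32-bit color + 32-bit Z per sample */
        double mib = (double)w * h * samples * (bytes_color + bytes_depth)
                     / (1024.0 * 1024.0);
        printf("720p %dxMSAA color+Z: %.1f MiB (Xenos: 10 MiB, rumored Wii U: 32 MiB)\n",
               samples, mib);
        return 0;
    }

About 14 MiB: too big for Xenos's 10 MiB without tiling, but a comfortable fit in a rumored 32 MiB pool, which is what makes its absence in the ports notable.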
 
That would make the WiiU some of the worst designed console hardware ever.

The EDRAM is not going to be big enough to hold a whole level's textures in it, which means you will have to load textures from the DDR3 RAM, thus eating twice the bandwidth for no reason (if it only runs at the same speed as the DDR3 RAM).

It's only twice the bandwidth if you get zero reuse out of any texture. I don't know what texturing is like in games these days, but I wouldn't expect that to always be the case. Of course you mention an entire level's textures, but what's relevant is what's needed in one frame, because the texture assets will change relatively slowly.

Making it capable of rendering from both eDRAM and main RAM at full speed adds complexity to the hardware, which is one reason why Xenos didn't allow texturing from eDRAM at all.

Look at PS2, PSP, Gamecube, and Wii - texturing from relatively limited eDRAM is standard if not the only option. Or even better, look at N64, which could only texture from a measly 4KB RAM. So I'm pretty sure this wouldn't put Wii U in the running for worst.

Of course I don't think the eDRAM runs at the same speed as the DDR3, that doesn't really make sense.

FXAA is a post process pixel shader, nothing makes it "GPGPU".

That's purely semantics. I described it that way because on some other platforms (like PS3) it might actually make the most sense to do frame post-processing off the GPU.
 
That's purely semantics. I described it that way because on some other platforms (like PS3) it might actually make the most sense to do frame post-processing off the GPU.

FXAA doesn't fall under general purpose computing though; it's graphics work that would go on the GPU on any platform other than PS3. They would never have considered putting it on the Wii U CPU in the first place.
 
It's just image processing. PPAA is actually a pretty good real world example of a successful GPGPU application. It's not something GPU designers planned for, but it's the kind of job that can be efficiently adapted to a GPU's limitations. The PPAA revolution started with an Intel engineer doing MLAA on a CPU; it was adapted to the Cell by Pandemic before evolving into a version well suited to running on a GPU.
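
To make "just image processing" concrete, here's a heavily simplified luma edge blend in C - the kind of per-pixel neighbourhood work PPAA boils down to. This is a toy sketch, not the real FXAA or MLAA algorithms (those do directional edge searches rather than a plain blend):

    /* Toy post-process AA: blend a pixel toward its neighbours when local
       luma contrast suggests an edge. Illustrative only. */
    void toy_ppaa(const float *in, float *out, int width, int height) {
        for (int y = 1; y < height - 1; y++) {
            for (int x = 1; x < width - 1; x++) {
                int i = y * width + x;
                float c  = in[i];
                float up = in[i - width], dn = in[i + width];
                float lt = in[i - 1],     rt = in[i + 1];
                float lo = c, hi = c;
                float nb[4] = { up, dn, lt, rt };
                for (int k = 0; k < 4; k++) {
                    if (nb[k] < lo) lo = nb[k];
                    if (nb[k] > hi) hi = nb[k];
                }
                if (hi - lo > 0.1f)  /* contrast above threshold: smooth the edge */
                    out[i] = 0.5f * c + 0.125f * (up + dn + lt + rt);
                else
                    out[i] = c;      /* flat area: pass through */
            }
        }
    }

The same inner loop maps naturally onto an SPU kernel or a pixel shader, which is the whole point: it's per-pixel filtering whatever label you hang on it.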
 
It's just image processing. PPAA is actually a pretty good real world example of a successful GPGPU application. It's not something GPU designers planned for, but it's the kind of job that can be efficiently adapted to a GPU's limitations. The PPAA revolution started with an Intel engineer doing MLAA on a CPU; it was adapted to the Cell by Pandemic before evolving into a version well suited to running on a GPU.

I would hardly call it a GPGPU application. If that is the case, what about edge detect shaders? Gaussian blur? Do they also count?

Just because a shader technique first came about as a CPU-processed effect does not mean that FXAA counts as GPGPU.
If that were the case then TnL would also count as GPGPU (or any number of rendering techniques).
 
I must say, I saw a video of the GamePad in action on Miiverse and it's pretty cool. Each game has a community, it's easy to ask for help, and everything is so well done. It's like Wii Facebook. I'm surprised the battery is so bad though, and no multitouch is kinda sad at this point in time. I'm guessing the Wii U's hardware potential should exceed PS360 in the long run, and that's good enough for Nintendo.

Next gen engines are all about scaling down to mobile, so I'm guessing many multiplatform titles will release on Orbis/Durango, PS360, Wii U, iPhone/iPad, Android, and Win 8 phones.


I could see them going multitouch if multitouch (capacitive) touchscreens become more durable. As usual, Nintendo has to cater to the lowest common denominator: "Mr Butterfingers McImbecile", who will likely wedge his GamePad into his TV screen at some point, or jump up and down on the screen (possibly in frustration over the OS loading times ;)), and capacitive screens are far less durable than resistive ones. Resistive screens are tougher and don't wear out as quickly (although either type is unlikely to reach its wear-down date in its lifetime - they both last pretty long). Once multitouch resistive screens are mainstream, it won't be a problem of course.
 
I find it slightly comical that some of the same people who were touting the 360's GPGPU capabilities to help its CPU against Cell are now downplaying the (most likely) better implementation of the same idea for the Wii U.
One can't help but think it's just ill will towards the name.

These sorts of posts really irk me. It's basically a personal attack against a perceived prejudice without naming names or presenting evidence in support of the view. Who are these people? I recall talk of stuff like physics on Xenos as a possibility, but I'm not sure it was ever presented as a sure thing. I certainly couldn't name anyone proposing/supporting that idea. And as it pans out, they would have been wrong, it seems. Xenos does graphics, not GPGPU. So anyone who thought Xenos was able to do physics back in 2005 would, in all fairness, be able to change their mind as it turns out their belief in GPGPU was unjustified.

If the rumored large eDRAM pools on both dies are true, the smallish bandwidth will be more than made up for AND it will allow the design to scale better with improvements in fabbing.

Except we don't know the (practical) BW of the eDRAM.
 
Not completely true though. There are a number of things being done as 'GPGPU' functions on Xenos. There was a lot of work done to get HD DVD support running for instance, as Sebbbi explained, and apparently GPGPU logic is also used for Kinect analysis?
 
You can't just add bandwidth together like this. If eDRAM is used both for render targets and textures it's basically like a cache.

Isn't that what the PC, the PS3, and the Dreamcast do though: store render targets and textures in video RAM? The Dreamcast couldn't even texture from main RAM either, come to think of it. If the WiiU is using edram for render targets, back buffer, Z, and some textures, and accessing the rest from main memory, then it'll be in a similar (though not identical) situation to the PS3.

There's extra bandwidth overhead in transferring data between main RAM and eDRAM to fill it with textures and resolve render targets (unless the video output comes straight from eDRAM, in which case that takes bandwidth instead). I don't know how big a typical scene texture footprint is, but if you're streaming it into an eDRAM texture cache you'll need to basically double buffer it, with some part reserved for the incoming textures. This could cut down the usable space by a fair amount.

From what I can remember of sebbbi's detailed description of virtual texturing, you're looking at single figure MBs of textures required per frame, normally with a high proportion of re-use between subsequent frames. So the write bandwidth used copying new texture data into the edram probably wouldn't be very high, so long as you were only copying in what you needed (and maybe a little extra).

And once you had your textures in edram, you shouldn't be in any worse a situation than when texturing from PS3 vram. Unless it has less bandwidth, of course. :???:
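
Putting rough numbers on sebbbi's figure (treating "single figure MBs" as 5 MB of new texture data per frame, purely illustrative):

    #include <stdio.h>

    int main(void) {
        double new_mb_per_frame = 5.0;  /* illustrative "single figure MBs" */
        double fps = 60.0;
        double gbs = new_mb_per_frame * fps / 1000.0;
        printf("Streaming %.0f MB of new texture data per frame at %.0f fps: ~%.2f GB/s\n",
               new_mb_per_frame, fps, gbs);
        return 0;
    }

That's ~0.3 GB/s of writes - a rounding error next to any plausible eDRAM bandwidth.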

Of course, it's possible that there's dedicated hardware that'll manage the eDRAM as an outer level texture cache.

Well, without it you'd require developers to basically rewrite parts of their engines to manage textures in a virtual-texture-like way, in order to fit all the required texture data for each frame into a relatively tiny chunk of memory. So if the WiiU can only texture from edram then I'd have to presume Nintendo have added some rather fancy texture caching logic.

Without such custom hardware (and the ability to control it or not use it) I'm inclined to agree with TheD - I think only being able to texture from edram would be a distinct negative.
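
If such logic existed, it would presumably behave like an outer-level tile cache. A minimal software sketch of the idea (entirely hypothetical hardware; the tile size, slot count, and DMA are all made up):

    #define NUM_SLOTS 256  /* e.g. 64 KB tiles in a 16 MB eDRAM region -- invented numbers */

    typedef struct { int tile_id; unsigned last_used; } Slot;

    static Slot slots[NUM_SLOTS];  /* tile_id 0 means "empty"; real ids start at 1 */
    static unsigned tick;

    /* Return the eDRAM slot holding tile_id, evicting the least recently
       used slot (and fetching the tile from main RAM) on a miss. */
    int lookup_tile(int tile_id) {
        int lru = 0;
        for (int i = 0; i < NUM_SLOTS; i++) {
            if (slots[i].tile_id == tile_id) {  /* hit: texture straight from eDRAM */
                slots[i].last_used = ++tick;
                return i;
            }
            if (slots[i].last_used < slots[lru].last_used)
                lru = i;
        }
        /* miss: DMA the tile in from DDR3 (transfer omitted in this sketch) */
        slots[lru].tile_id = tile_id;
        slots[lru].last_used = ++tick;
        return lru;
    }

The win is exactly the cache argument above: as long as the per-frame working set fits, most texture reads never have to touch DDR3.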

But isn't that what we're seeing, that in BO2 Wii U is typically behind XBox 360 and ahead of PS3?

Apparently not, according to Digital Foundry: "It's clear that plenty of characters and full-screen transparencies are particular Achilles Heels for the Wii U". The video and frame rate graphs show the WiiU tanking compared to even the PS3 under these circumstances. Given that the PS3 has only ~21 GB/s of video memory bandwidth - and that it's doing at least some texturing from there (perhaps most or all of it) - you have to ask what's going on with the WiiU.

The WiiU appears to be losing out to the PS3 and even Trinity and Llano here - and this is what caused me to compare the PS3 and APUs to the WiiU and look at their various bandwidths. If the WiiU really does have a pool of high bandwidth edram then you have to ask what it's doing in BO2.

I don't know how it's done in games but ideally I'd expect you to be post-processing frame N while frame N+1 is being rasterized, if you have the space for it.

That would make sense on the PS3, where you'd be doing it on the CPU, but the 360 wouldn't be able to do it because of its limited edram. And the WiiU games are probably just doing what the 360 does, given that these are ports and launch titles etc. Processing two frames at once on the GPU would only seem to make the rendering time of two frames less predictable, but I guess if part of the GPU is going idle it could be more efficient.
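
The overlap itself is just a ping-pong between two render targets. A trivial scheduling sketch (the render/post-process functions are stand-in stubs; on real hardware the overlap comes from the GPU command stream, not from code like this):

    #include <stdio.h>

    typedef struct { int id; } Target;

    /* hypothetical stand-ins for the actual GPU work */
    static void kick_render(Target *t, int frame)  { printf("rasterize frame %d into buffer %d\n", frame, t->id); }
    static void post_process(Target *t, int frame) { printf("post-process frame %d from buffer %d\n", frame, t->id); }

    int main(void) {
        Target buf[2] = { {0}, {1} };
        for (int n = 1; n <= 4; n++) {
            kick_render(&buf[n & 1], n);  /* frame n rasterizes into one target... */
            if (n > 1)                    /* ...while the previous frame gets filtered */
                post_process(&buf[(n & 1) ^ 1], n - 1);
        }
        return 0;
    }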

Edit:

Of course I don't think the eDRAM runs at the same speed as the DDR3, that doesn't really make sense.

You're right, it doesn't make sense! But neither does a 2012 console losing to the PS3 in bandwidth bottlenecked fillrate situations. That makes even less sense!

Nintendo wanting to cost save on a 128-bit off-chip bus (and the 8 ram chips that would require) makes sense though, as does the flexibility of a full read/write edram pool and using an existing memory controller instead of getting a custom super fat bus. And regardless of the actual specifics of the edram I've got a strong feeling it's going to be much closer to the PS3 BW than 360 BW.
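
The cost trade-off in numbers (assuming plain DDR3-1600, which matches the widely reported 12.8 GB/s figure for the WiiU; the 128-bit alternative is the hypothetical Nintendo avoided):

    #include <stdio.h>

    int main(void) {
        double mts = 1600e6;            /* DDR3-1600: 1600 MT/s per pin */
        double bw64  = mts * 8  / 1e9;  /* 64-bit bus (4 x16 chips)  -> GB/s */
        double bw128 = mts * 16 / 1e9;  /* 128-bit bus (8 x16 chips) -> GB/s */
        printf("64-bit: %.1f GB/s, 128-bit: %.1f GB/s\n", bw64, bw128);
        return 0;
    }

Half the chips and half the board routing buys half the bandwidth, with the edram expected to make up the difference.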
 
Not completely true though. There are a number of things being done as 'GPGPU' functions on Xenos. There was a lot of work done to get HD DVD support running for instance, as Sebbbi explained, and apparently GPGPU logic is also used for Kinect analysis?

That's the thing, I don't doubt we will see GPU compute used on the WiiU for some things, just as it's been used on the 360.

What I don't believe is that Nintendo gave the WiiU a weak-sauce CPU because they planned for GPGPU to "take over" all the work in "CPU centric" PS360 ports. Nintendo gave the WiiU a cheap CPU because they wanted to put a cheap CPU in the WiiU. :D
 
I must say, I saw a video of the GamePad in action on Miiverse and it's pretty cool. Each game has a community, it's easy to ask for help, and everything is so well done. It's like Wii Facebook. I'm surprised the battery is so bad though, and no multitouch is kinda sad at this point in time. I'm guessing the Wii U's hardware potential should exceed PS360 in the long run, and that's good enough for Nintendo.

I'm okay with no multitouch; the DS and 3DS don't have it either. Multitouch can even be a liability: how can you even know you have to two-finger something, especially if you have no experience with multitouch cell phones and tablets? Even then, I guess the feature is only used by most people for zooming in/out.
 
The speed drop could be related to AI / particle system calculations overloading the CPU, too. AI could require a lot of raycasting for line of sight and cover-related stuff, which is not transferable to the GPU.

Particles could maybe utilize the GPU though, and it might be that no one had the time to move the code there. Or maybe it's just the characters - apparently Mass Effect 3 also has serious slowdowns on the Citadel.
 
Not completely true though. There are a number of things being done as 'GPGPU' functions on Xenos. There was a lot of work done to get HD DVD support running for instance, as Sebbbi explained, and apparently GPGPU logic is also used for Kinect analysis?

I don't know, I think there is a DSP in the Kinect? One of the main features is also the massive amount of work that was precomputed: they took zillions of pictures and number-crunched them to grow their algorithm, so the thing is partly a case of "according to the data's signature, we're in spatial configuration number 0xAE9434B".
 
I could see them going multitouch if multitouch (capacative) touchscreens become more durable. As usual, Nintendo has to cater to the lowest common denominator: "Mr Butterfingers McImbecile" who will likely wedge his GamePad into his TV screen at some point, or jump up and down on the screen (possible in frustration over the OS loading times ;)) and Capacitive screen are far less duarble than resistive. Resistive screens are tougher and dont wear out as quickly (although its unlikely to reach its wear-down date in its lifetime in either case - they both last pretty long). Once multitouch resistive screens are mainstream, it wont be a problem of course.

A resistive screen is probably more durable because of its plastic screen, but plastic wears down more easily than glass.
Anyway, you can use plastic instead of glass for a capacitive screen.

And I don't believe they will introduce multitouch because there is no point for them to add it.
 
That's the thing, I don't doubt we will see GPU compute used on the WiiU for some things, just as it's been used on the 360.

What I don't believe is that Nintendo gave the WiiU a weak-sauce CPU because they planned for GPGPU to "take over" all the work in "CPU centric" PS360 ports. Nintendo gave the WiiU a cheap CPU because they wanted to put a cheap CPU in the WiiU. :D

Cheap, low-power, and backwards compatible with Wii. The benefits, at least, are clear. ;)
 
A resistive screen is probably more durable because of its plastic screen, but plastic wears down more easily than glass.
Anyway, you can use plastic instead of glass for a capacitive screen.

And I don't believe they will introduce multitouch because there is no point for them to add it.

Might be the case. I don't know, I just know that it's more durable and cheaper. That's why commercial products (airport information screens, hotel help desks, seat-back screens on trains/planes, children's touch screen devices) usually use (until very recently at least) resistive touchscreens. Not saying it was the correct choice for Nintendo - that'll just be why they used them and why they probably won't switch to capacitive anytime soon - unless various things change.

And like you say - they don't need to right now, as it's unlikely there's enough need for it on a console. Portable is a different story. I'd imagine the next DS handheld will have it (I hope)*


*Not really an Edit: No it won't, who am I kidding?
 