But isn't that what we're seeing, that in BO2 Wii U is typically behind XBox 360 and ahead of PS3?
Performance is sometimes similar to PS3, but stressing scenes are much worse.
I don't remember anyone talking about GPGPU and the X360. I'm not even sure if the term was invented at all in 2005-2006.
How can you be sure?
Come on! Doing stuff other than transformation and pixels with the GPU has been thought about for a long time. Heck, even the GS in PS2 was at some point experimented with because of its (at the time) extreme fillrate.
GPGPU is just the current buzzword.
I remember people talking about doing animation, physics, culling, etc. on the 360 GPU. This was for the first time feasible on a console because of the higher programmability.
Well, of course not a hundred percent sure, but every source (incl. official ones) tells us about an almost current-gen Radeon GPU with some extra features. Just the great increase in shader units and greater bandwidth alone should guarantee better performance.
Now let's say that this time around you can texture from eDRAM. With Wii U's limited main RAM bandwidth this is desirable. In fact, it could be entirely possible that you must texture from eDRAM, which AFAICR was the case with Flipper/Hollywood, and which would make a big bandwidth hit on it unavoidable. In this case it's easy to see how Wii U's GPU eDRAM bandwidth could end up a limiter long before Xenos's, while still providing tremendously more than what main RAM alone does.
The FXAA in games like Arkham City is probably done in GPGPU; I don't think they'd be able to work that into the already overburdened CPU and main RAM load. This suggests at least a possibility of being able to texture from eDRAM. I don't think they'd add it if it hurt framerates further on a game that is already compromised; in this case the GPU probably had enough spare fillrate and ALU power but there could be additional overhead introduced in the extra resolve plus texturing bandwidth from main RAM that'd be involved if it can't texture from eDRAM.
If Wii U's GPU eDRAM were as capable as Xenos's then we should be seeing at least 2xMSAA on ports that didn't have it on XBox 360; it'd be free given that Wii U's GPU has more than 2x the eDRAM.
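As a back-of-envelope check (the 10 MB Xenos figure is public; the 32 MB Wii U pool is the rumored number, not a confirmed spec), the framebuffer arithmetic can be sketched like this:

```python
# Back-of-envelope framebuffer sizing at 720p. Assumptions, not confirmed
# specs: 10 MB of eDRAM on Xenos, a rumored 32 MB pool on Wii U.
def framebuffer_mb(width, height, msaa_samples, bytes_per_pixel=4):
    # Colour buffer plus depth/stencil, both multisampled.
    color = width * height * msaa_samples * bytes_per_pixel
    depth = width * height * msaa_samples * bytes_per_pixel
    return (color + depth) / (1024 * 1024)

for samples in (1, 2, 4):
    print(f"{samples}x MSAA @ 720p: {framebuffer_mb(1280, 720, samples):.1f} MB")
```

2x MSAA at 720p lands around 14 MB: too big for Xenos's 10 MB without tiling, but comfortably inside a 32 MB pool, which is why it would be "free" in the sense above.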
That would make the WiiU some of the worst designed console hardware ever.
The eDRAM is not going to be big enough to hold a whole level's textures, which means you will have to load textures from the DDR3 RAM, thus eating twice the bandwidth for no reason (if it really does run at only the same speed as the DDR3 RAM).
FXAA is a post process pixel shader, nothing makes it "GPGPU".
That's purely semantics. I described it that way because on some other platforms (like PS3) it might actually make the most sense to do frame post-processing off the GPU.
It's just image processing. PPAA is actually a pretty good real-world example of a successful GPGPU application. It's not actually something GPU designers planned for, but it's the kind of job that can efficiently be adapted to a GPU's limitations. The PPAA revolution started with an Intel engineer doing this MLAA stuff on a CPU; it was adapted to the Cell by Pandemic before being evolved to a version well suited to running on a GPU.
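A minimal sketch of why this kind of filter ports so well: the core step is per-pixel luma discontinuity detection, where every output depends only on a small fixed neighbourhood (the function names and threshold below are illustrative, not taken from any shipping implementation):

```python
# Sketch of the first step of an FXAA/MLAA-style filter: per-pixel luma
# edge detection. Each output pixel depends only on a tiny neighbourhood,
# which is exactly the embarrassingly parallel access pattern GPUs like.
def luma(rgb):
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def edge_mask(image, threshold=0.1):
    # image: 2D list of (r, g, b) tuples with components in [0, 1]
    h, w = len(image), len(image[0])
    mask = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            l = luma(image[y][x])
            # Simple discontinuity test against right and bottom neighbours.
            if x + 1 < w and abs(l - luma(image[y][x + 1])) > threshold:
                mask[y][x] = True
            if y + 1 < h and abs(l - luma(image[y + 1][x])) > threshold:
                mask[y][x] = True
    return mask
```

A real filter then blends along the detected edges, but that step has the same locality, which is why the technique moved so easily from CPU to SPU to pixel shader.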
I must say, I saw a video of the gamepad in action on miiverse and it's pretty cool. Each game has a community, it's easy to ask for help and everything is so well done. It's like wii facebook. I'm surprised the battery is so bad though and no multi touch is kinda sad at this point in time. I'm guessing later on the wii U's hardware potential should exceed ps360 in the long run and that's good enough for nintendo.
Next gen engines are all about scaling down to mobile so I'm guessing many multi platform titles will release on orbis/durango, ps360, wii U, iphone/ipad, android and win 8 phones.
These sorts of posts really irk me. It's basically a personal attack against a perceived prejudice without naming names or presenting evidence in support of the view. Who are these people? I recall talk of stuff like physics on Xenos as a possibility, but I'm not sure it was ever presented as a sure thing. I certainly couldn't name anyone proposing/supporting that idea. And as it pans out, they would have been wrong, it seems. Xenos does graphics, not GPGPU. So anyone who thought Xenos would be able to do physics back in 2005 could in all fairness change their mind, as it turns out that belief in GPGPU was unjustified then.

I find it slightly comical that some of the same people who were touting the 360's GPGPU capabilities to help its CPU against Cell are now downplaying the (most likely) better implementation of the same idea for the Wii U.
One can't help but think it's just ill will towards the name.
Except we don't know the (practical) BW of the eDRAM.

If the rumored large eDRAM pools on both dies are true, the smallish bandwidth will be more than made up for, AND it will allow the design to scale better with improvements in fabbing.
You can't just add bandwidth together like this. If eDRAM is used both for render targets and textures it's basically like a cache.
There's extra bandwidth overhead transferring to and from main RAM to eDRAM to fill it with textures and resolve render targets (unless the video output comes straight from eDRAM, in which case that takes bandwidth instead). I don't know how big a typical scene texture footprint is, but if you're streaming it into an eDRAM texture cache you'll need to basically double buffer it to have some part reserved for the incoming textures. This could cut down the amount of space by a fair amount.
Of course, it's possible that there's dedicated hardware that'll manage the eDRAM as an outer level texture cache.
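To make the double-buffering cost concrete, a toy calculation (all figures are hypothetical, e.g. a rumored 32 MB pool and a ~7 MB 720p colour+depth target):

```python
# Toy model: how much eDRAM is left for a streamed texture cache once
# render targets are allocated, if incoming textures need a second buffer
# while the current set is still being sampled. All sizes are hypothetical,
# not confirmed Wii U specs.
def usable_texture_cache_mb(edram_mb, rendertarget_mb):
    leftover = edram_mb - rendertarget_mb
    return leftover / 2  # halved by the double buffering

# e.g. a rumored 32 MB pool minus a ~7 MB 720p target:
print(usable_texture_cache_mb(32, 7))  # 12.5 MB actually usable per frame
```

So even under generous assumptions, only a fraction of the pool would hold textures the GPU can sample from at any one moment, which is the "fair amount" of space lost mentioned above.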
I don't know how it's done in games but ideally I'd expect you to be post-processing frame N while frame N+1 is being rasterized, if you have the space for it.
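The overlap can be pictured with a toy scheduler (purely illustrative; real engines pipeline at much finer granularity):

```python
# Toy two-stage pipeline: rasterize frame N+1 while post-processing frame N.
# Note that two frames are alive at once, which is where the extra buffer
# space requirement comes from.
def pipeline(num_frames):
    steps = []
    for n in range(num_frames):
        step = ["raster F%d" % n]
        if n > 0:
            step.append("post F%d" % (n - 1))  # overlaps with rasterization
        steps.append(step)
    steps.append(["post F%d" % (num_frames - 1)])  # drain the pipeline
    return steps

for step in pipeline(3):
    print(" + ".join(step))
```

For three frames this prints one raster-only step, two overlapped steps, and a final drain step, i.e. post-processing adds latency but, with enough buffer space, little throughput cost.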
Of course I don't think the eDRAM runs at the same speed as the DDR3, that doesn't really make sense.
Not completely true though. There are a number of things being done as 'GPGPU' functions on Xenos. There was a lot of work done to get HD DVD support running for instance, as Sebbbi explained, and apparently GPGPU logic is also used for Kinect analysis?
I could see them going multitouch if multitouch (capacitive) touchscreens become more durable. As usual, Nintendo has to cater to the lowest common denominator: "Mr Butterfingers McImbecile", who will likely wedge his GamePad into his TV screen at some point, or jump up and down on the screen (possibly in frustration over the OS loading times), and capacitive screens are far less durable than resistive. Resistive screens are tougher and don't wear out as quickly (although it's unlikely to reach its wear-down date in its lifetime in either case - they both last pretty long). Once multitouch resistive screens are mainstream, it won't be a problem of course.
That's the thing, I don't doubt we will see GPU compute used on the WiiU for some things, just as it's been used on the 360.
What I don't believe is that Nintendo gave the WiiU a weak-sauce CPU because they planned for GPGPU to "take over" all the work in "CPU centric" PS360 ports. Nintendo gave the WiiU a cheap CPU because they wanted to put a cheap CPU in the WiiU.
A resistive screen is probably more durable because of its plastic surface, but plastic wears down more easily than glass.
Anyway, you can use plastic instead of glass for a capacitive screen.
And I don't believe they will introduce multitouch because there is no point for them to add it.