Wii U hardware discussion and investigation *rename

That argument makes zero sense.
"That argument makes zero sense to you" <-- FTFY

People weren't exactly unfamiliar with 3D graphics by the time the 360 came out. We already had 2 previous generations of 3D polygonal graphics.

You claim MS was building from scratch when transitioning from SD to HD. But Nintendo isn't doing the same with THEIR OWN title? What has Nintendo themselves built up from the Wii that'll allow them to seamlessly transition into HD? You think Nintendo would be licensing engines from 3rd party developers?
You obviously have no idea (or maybe had one once, but completely forgot by now?) of the process which brings us the final visuals of a game. It's not polygons at SD vs HD! Developers pre-bake a s**tton of visual enhancements into the scene and into the textures nowadays, a lot more than they did in 2005. They also use various deferred and other rendering techniques, and 3D algorithms which simply did not exist at the time, not to mention the insane boom of post-process solutions (e.g. SSAO, FXAA, etc.) available nowadays.
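For what it's worth, here's a tiny sketch of the kind of cheap full-screen post-process pass being talked about: a luminance-driven edge blend in the spirit of FXAA. This is purely my own illustration (plain Python/NumPy standing in for a pixel shader, frame assumed to be float RGB in [0, 1]), not anyone's shipping implementation:

```python
import numpy as np

def luma(rgb):
    # Rec. 601 luma weights; (H, W, 3) float RGB in [0, 1] -> (H, W) luminance
    return rgb @ np.array([0.299, 0.587, 0.114])

def fxaa_like(img, edge_threshold=0.1):
    """Blend pixels toward their 4-neighbour average wherever local luminance contrast is high."""
    l = luma(img)
    pad = np.pad(l, 1, mode="edge")
    n, s = pad[:-2, 1:-1], pad[2:, 1:-1]           # neighbours above / below
    w, e = pad[1:-1, :-2], pad[1:-1, 2:]           # neighbours left / right
    contrast = np.maximum.reduce([l, n, s, w, e]) - np.minimum.reduce([l, n, s, w, e])
    padc = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    blur = (padc[:-2, 1:-1] + padc[2:, 1:-1] + padc[1:-1, :-2] + padc[1:-1, 2:]) / 4.0
    mask = (contrast > edge_threshold)[..., None]  # only touch high-contrast (aliased) pixels
    return np.where(mask, (img + blur) / 2.0, img)
```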
Compilers, libraries and engines indeed develop amazingly over time, and consoles usually "get more powerful" as they get older, but there are still a lot of parameters defining certain limits, limits acting as guidelines for the professionals working with the hardware. For example: the maximum speed of the rasterizer won't change; there's nothing you can do about it. The conversation in this thread tries to guess at and figure out the limits and capabilities of the GPU in the Wii-U, not the limits of the developers.
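Just to make that concrete, here's the sort of napkin math that falls out of a fixed rasterizer/ROP limit. The ROP count and clock below are placeholders I pulled out of thin air for illustration, not confirmed Wii-U numbers:

```python
# Placeholder figures, NOT confirmed Wii U specs -- the point is the shape of the calculation.
rops       = 8          # assumed render output units
clock_hz   = 550e6      # assumed GPU clock
fps        = 60
frame_px   = 1280 * 720 # 720p

fill_rate    = rops * clock_hz          # peak pixels written per second
frame_budget = fill_rate / fps          # pixels writable per frame at 60 fps
overdraw     = frame_budget / frame_px  # how many times each screen pixel can be touched, at peak

print(f"peak fill rate: {fill_rate / 1e9:.1f} Gpix/s")
print(f"per-frame budget: {frame_budget / 1e6:.1f} Mpix -> ~{overdraw:.0f}x overdraw at 720p60")
```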

You act like the jump from N64/PSX to GC/PS2 didn't come with hurdles, or even its own experience from the previous generation to draw upon. Batman isn't even being made on final hardware.
Rogue Squadron 2 was a launch title on the GameCube, and you can easily compare its visuals and engine speed with the very last games on the machine. Backing up your argument with examples instead of logical reasoning and facts never works, imho.
 
This is just from me eyeing the screenshots, but I really don't see the concern over the Wii U GPU being underpowered at this point in time.
The 'launch titles' argument doesn't hold any more assuming Nintendo are using conventional hardware, because 3rd party developers already know exactly how to use that hardware. Comparing ME3 or AC3 should give a reasonable approximation of hardware performance in terms of creating a 'DX9' game, and they aren't showing major advantages on Wuu. Thus Wuu is looking to be in the same performance bracket as PS360, instead of being substantially faster which was an opportunity Nintendo had. And not being substantially faster, is Wuu going to have a strong enough USP to pull gamers away from PS360? If they were for a year the most powerful console in the world by far, offering more PC-like performance than existing console performance, they'd have stood out much better. Instead, Nintendo seem to be joining this generation late, just when we're about to transition. How much support will Wuu then get? So 'underpowered for the upcoming generation' seems a valid description to me. Not that that'll necessarily bother Nintendo any more than Wii being underpowered did, but it will almost certainly mean they won't get the core gamers they want, who'll prefer the better graphics of the new boxes.
 
Comparing ME3 or AC3 should give a reasonable approximation of hardware performance in terms of creating a 'DX9' game, and they aren't showing major advantages on Wuu. Thus Wuu is looking to be in the same performance bracket as PS360, instead of being substantially faster which was an opportunity Nintendo had.

What really made me sad is Pikmin 3, tbh. If you take away the soft shadows, you end up with Wii-like DX7 graphics running at high res. That game is a perfect opportunity to show some very high quality DX10 stuff on a system with "lots of" memory. Small "organic" maps, where you could precalculate and pre-bake some impressive but still clean visuals into the scene, bringing the Pikmin-ish vegetation to life, and they could even top it with some gorgeous post-process effects. While it was indeed a bit cute, I did not see anything next-gen-like there.:cry:
I still play and enjoy old 16-bit games several times every week, and I honestly couldn't care less about graphics when it's about fun and gameplay rather than graphics whoring or eye-candy (which I also love). So I understand... but ignoring the development of the competition this much is a very risky move, methinks.

 
You can't play the "unfamiliar with the hardware" card with every machine. Titles get better as generations progress because people have more time to iterate on tech and improve tools and content generation techniques, not because they magically discover things about the hardware they didn't know.
Actually you CAN do that with launch titles for every single machine in every generation, because these titles will have been developed without access to final hardware for the (sometimes vast) majority of its development time. Hardware manufacturers release dev systems with an approximated level of performance, but, especially initially, that hardware is pretty much entirely different on both hardware and software levels compared to the final version.

The first dev systems for Xbox 360 were multicore Power Mac G5s with add-on Radeon graphics boards. Other than also using PPC chips (hugely different ones, with out-of-order execution and a different FPU) and an AMD-made GPU, nothing was the same.

If WiiU is on par with current gen machines all those tools and techniques are in place already, and the tech won't be different enough to matter.
Lots of ifs there, in that sentence. ;)

The problem is, of course, guessing final performance and performance characteristics from the performance of the devkit; not exactly an easy task. Overestimate, or have final performance get tuned down at the last minute, and then you're in a real bind. It's easier for you, and much safer for your stress levels, to underestimate. ...Which of course means launch games will look worse than what the hardware is actually capable of.

Also, all hardware has its own quirks and performance characteristics. Finding those quirks isn't always a straightforward, logical matter. Not everything is written down in the documentation; sometimes you have to program for the hardware, notice that one implementation is much slower or faster than another, and adjust game code accordingly. All this takes time; you don't find all this stuff out in the few months that final hardware is available, it takes years. That's why later-gen games look better than first-gen games, even today. It's not as if an Unreal Engine SDK being available for a new console from day one will fix all that and let launch games look as good as final-gen titles. No wai.

This reminds me of Wii where for two or three years people were insisting that it was on par with 360
What people? You must be talking about completely clueless idiots populating boards for clueless idiots (so why were you there, listening to them? :p); you certainly did not find any such people here. I myself have never heard anyone even suggest something that absurd about the Wii.
 
How do you reach that position from what forumaccount has stated?

Just because the WiiU is new doesn't mean that developers will un-learn everything they've learned developing on the PS360, or that they'll have to go back to Unreal Engine 2, or that artists will go back to PS2 era experience and tools.

If the WiiU is in the same ballpark as PS360 in terms of capability then the huge bump in graphics that came over several years (with developers learning how to use that level of capability and getting the tools to allow it) will already have happened. The WiiU will be much closer to the best it will ever do than the PS360 were at launch.

It's not about "un-learning". It's that just because they understand these things now, compared to back then, doesn't mean they can't continue to improve on those techniques. His last two posts suggest that because they understand them now, they can't improve any more.
 
[Attached images: CAT-DEV-V1-600x394.jpg, CAT-DEV-V1-2-600x395.jpg — photos of the first Wii U devkits]


http://www.vgleaks.com/wii-u-first-devkits-cat-dev-v1-v2/

Is this the first time pictures of the first Wii U devkits have been leaked?
 
That'd be silly. The PPC CPU in the Wii is the same as the one used in the GC; it's a design that's like ten-plus years old by now. That anyone would use it as a base for new designs is laughable at best. Even at 90nm, as used by the launch Wii, it only ate like 16 sqmm of silicon; ludicrous to consider for a new design, even if there are three of them now. Shrunk down to today's rather coarse 40nm, that tri-core CPU would be so small it couldn't fit the pads it needs for I/O...

hm... Well, 19mm^2.

You may or may not be underestimating the size implications, because you have to consider the differences in cache amount. Below, I have some pretty rough math for how it might look if we did assume the same cores, so bear with me as you read along.

On a side note, I'd guess the asymmetric L2 for WiiU that we've been hearing about has something to do with trying to make the chip more square-like (or just a happy coincidence). And there'd probably need to be some bloating of the pipelines to hit higher clocks too, but I'm not sure how that will affect die size on the whole.

This is a pretty roundabout way of finding how large L2 is, but... Xenon was 176mm^2 @90nm. Here's also a die shot of it @ 90nm. http://www.ibm.com/developerworks/power/library/pa-fpfxbox/figure1.gif

The four large groups of purple are the L2, adding up to 1MB, so each group is 256kB (the smaller ones are probably redundancy or for testing?). Anyways, that image would imply (after all the pixel/mm^2 calcs) that 256kB is approximately 5.25mm^2 @90nm. I'd be hard pressed to think that the L2 density is different considering it's still IBM and 90nm, but feel free to correct me (though the rest is based on what I think is a fair assumption).

If we were to naively triple the Wii cores, we'd get ~57mm^2 @ 90nm. Then we need to add an extra ~2.25MB of L2 to match the total L2 we've heard for WiiU, i.e. 3MB minus the 3*256kB that already exists across the Wii cores. That extra L2 works out to roughly 47mm^2 @ 90nm.

So a triple-core variant of the Wii CPU with the additional L2 would come in somewhere over 100mm^2 @90nm. Now, there's going to be extra stuff for multicore, hitting higher clocks, etc., but my main point is that shrinking >100mm^2 @90nm down to 45nm seems a lot more feasible (not so pad-limited).
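Restating that estimate as bare arithmetic (every number here is one of the rough assumptions above, not a measured value):

```python
# Every figure here is one of the rough assumptions above, not a measurement.
broadway_mm2     = 19.0       # Wii CPU ("Broadway") including its 256 kB L2, at 90 nm
mm2_per_256kb_l2 = 5.25       # eyeballed from the Xenon die shot, 90 nm SRAM
total_l2_kb      = 3 * 1024   # rumoured 3 MB total L2 for the Wii U CPU
existing_l2_kb   = 3 * 256    # the 256 kB already inside each of the three cores

three_cores = 3 * broadway_mm2                                          # ~57 mm^2
extra_l2    = (total_l2_kb - existing_l2_kb) / 256 * mm2_per_256kb_l2   # ~47 mm^2
print(f"3 cores: {three_cores:.0f} mm^2 + extra L2: {extra_l2:.0f} mm^2 "
      f"= ~{three_cores + extra_l2:.0f} mm^2 at 90 nm")
```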

------

Of course, this could be all wrong, so carry on. :p I find it rather hard to believe that Nintendo would do such a thing. Besides, we've heard differently from IBM last year (even if vague).
 
If the Wuu CPU is in fact using eDRAM as cache, even 3MB is not going to eat vastly more die area than the current Wii SRAM cache... Say, +50% extra at the high end? Someone surely knows this better than me, but methinks it won't be all that much. Combine that with process shrinks and the Wuu CPU could well end up even tinier than the Wii's already minuscule chip.
 
If the Wuu CPU is in fact using eDRAM as cache, even 3MB is not going to eat vastly more die area than the current Wii SRAM cache... Say, +50% extra at the high end? Someone surely knows this better than me, but methinks it won't be all that much. Combine that with process shrinks and the Wuu CPU could well end up even tinier than the Wii's already minuscule chip.

Right. Actually, if we use the eDRAM density from the Power7 die, the 3MB eDRAM L2 should end up being around 7mm^2.

If we remove the 256kB L2 from Wii CPU, then we're left with around 14mm^2. Triple that... 42mm^2. One-quarter ideal reduction for 90nm->65nm->45nm... ~11mm^2. Add a bunch of fluff for multicore and other funny business with clocks...

Something around the same size? :p Anyways...
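Spelled out, in case my mental math is off (all of these are the assumptions above, nothing measured):

```python
# Assumptions only: 19 mm^2 Broadway, ~5.25 mm^2 per 256 kB of 90 nm SRAM,
# and the ~7 mm^2 guess for 3 MB of 45 nm eDRAM derived loosely from Power7.
core_no_l2_90nm  = 19.0 - 5.25           # Broadway minus its 256 kB L2 -> ~14 mm^2
three_cores_90nm = 3 * core_no_l2_90nm   # ~42 mm^2 of core logic at 90 nm
three_cores_45nm = three_cores_90nm / 4  # ideal scaling over two full nodes -> ~10-11 mm^2
edram_l2_45nm    = 7.0                   # 3 MB eDRAM L2 at 45 nm (Power7-derived guess)
print(f"~{three_cores_45nm + edram_l2_45nm:.0f} mm^2 before multicore glue and clock-related bloat")
```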
 
Actually you CAN do that with launch titles for every single machine in every generation, because these titles will have been developed without access to final hardware for the (sometimes vast) majority of its development time.
I disagree. Every generation prior to this, there has been a sea-change in how the hardware was developed for, because the rendering pipelines were very different. In the earliest days devs had to code very low level to access the hardware. 3D introduced new ways of coding. PS2's generation brought about wholly different rendering architectures. This gen brought about a whole change from the PS2 generation. But now everything has settled down to sending triangles to a GPU to be rasterised and shaded using conventional, years-old shader languages. Unless you are doing something pretty radical, like maybe tiling over limited eDRAM, or coding really low level and managing to eke out extra performance (which I question is possible on GPUs), working with one AMD/nVidia GPU is much like working with another. To the point where the same engine can be run on another GPU without too much performance drop due to hardware variations. Yes, optimisation is necessary to get from 25 to 30 fps from coarse ports, but it's not going to be the difference between launch games looking 5 years older than contemporary tech. How could ME3 be rendered at PS360 quality on Wuu if the hardware is 10x the power, thanks to lack of optimisation? Only if Nintendo have their own shader language and rendering pipeline and provide a middleware porting engine for quick and dirty ports. That, or the Wuu has a monster GPU but the whole game is written in OpenCL. :p If Wuu has any normal AMD GPU in there, then DX9-level games from devs with years of experience in DX9 will be able to make a good job of it.
 
I acknowledge your stance, Shifty, but I still believe my point is well founded. :) Rendering pipelines and whatnot aside, at the end of the day you still have to shoot for a specific hardware performance target, and aiming too high means a lot more extra work than aiming too low.

Like with Crysis way back when, the game looked fabulous with everything turned up to max, but when your system couldn't handle the load and you started dialing settings back, the end result often ended up looking rather crap, certainly way less pretty than games that were built specifically for roughly your level of performance.
 
On a side note, I'd guess the asymmetric L2 for WiiU that we've been hearing about has something to do with trying to make the chip more square-like (or just a happy coincidence).

The part in bold is an interesting take. The core with 2MB of cache is supposed to be a "master core" (which I forgot) to the other two cores. So I don't know if it's intentional that they are making it more square-like.
 
The part in bold is an interesting take. The core with 2MB of cache is supposed to be a "master core" (which I forgot) to the other two cores. So I don't know if it's intentional that they are making it more square-like.

What makes it a "master core"? Is it specifically 4x the cache of the slaves or is it just "more cache"? Something else entirely?
 
It's not about "un-learning". It's that just because they understand these things now, compared to back then, doesn't mean they can't continue to improve on those techniques. His last two posts suggest that because they understand them now, they can't improve any more.
Well I think what forumAccount means is pretty clear.
SMP and programmable shaders are no longer unknown territory. There is a lot of experience now wrt how to do things right.
Does that mean the system will be tapped out in a year? No, but there is a lot less headroom than in 2005.
The next big step is moving away from the fixed graphics pipeline using GPU compute capabilities.
Devs are still experimenting here. DICE implemented their tiled deferred rendering through compute shaders (toy sketch of the idea below).
It's doable to some extent on the 360, even more so on the WiiU, but then it's a matter of computing power. Programmable GPUs are nice, but they ain't a magic bullet; they need raw muscle, I guess, to push further.
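For anyone unfamiliar with what "tiled deferred through compute" buys you, here's a toy CPU-side sketch of the light-culling half of the idea (plain Python standing in for a compute shader; the tile size and the light representation are just illustrative, not anything DICE actually ships):

```python
# Cut the screen into tiles and keep, per tile, only the lights whose screen-space circle of
# influence overlaps it, so the shading pass only loops over a short per-tile light list.
from dataclasses import dataclass

TILE = 16  # pixels per tile side, a typical choice

@dataclass
class Light:
    x: float       # screen-space centre, pixels
    y: float
    radius: float  # screen-space radius of influence, pixels

def cull_lights(width, height, lights):
    tiles_x = (width + TILE - 1) // TILE
    tiles_y = (height + TILE - 1) // TILE
    bins = [[[] for _ in range(tiles_x)] for _ in range(tiles_y)]
    for idx, l in enumerate(lights):
        # tile range overlapped by the light's bounding square
        x0 = max(int((l.x - l.radius) // TILE), 0)
        x1 = min(int((l.x + l.radius) // TILE), tiles_x - 1)
        y0 = max(int((l.y - l.radius) // TILE), 0)
        y1 = min(int((l.y + l.radius) // TILE), tiles_y - 1)
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                bins[ty][tx].append(idx)
    return bins

# e.g. a 720p frame with two small lights: most tiles end up with an empty light list
bins = cull_lights(1280, 720, [Light(100, 100, 40), Light(900, 500, 80)])
```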

Say you wisely selected your sources and the GPU in the WiiU has 8-10 SIMDs clocked really low. That's twice the power of the PS360, more in real terms (quick back-of-envelope below).
The system should have impressive fill rate thanks to the lots of eDRAM.
It has a nice amount of RAM (1.5GB).
I can definitely see the system keeping up with the PC world for 2 or 3 years.
I just ordered a refurbished HP Pavilion for $500; it has an HD 6750M inside (and an A8-3500M). I expect it to play "not that new" games well and to somehow keep up for, say, 2 years.
I don't expect it to run them at crazy quality, but I expect it to run most games. I'm not actually interested in the most demanding games anyway.
I would put the WiiU there (weaker CPU, less RAM, but crazy fill rate).
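The back-of-envelope behind that "twice the PS360" guess, assuming an R700/Evergreen-style VLIW5 layout; the SIMD count and clock are hypothetical, only the Xenos figure is the commonly cited one:

```python
# Hypothetical Wii U side: SIMD count and clock are made up to illustrate "8-10 SIMDs,
# clocked really low", assuming a VLIW5 layout (16 lanes x 5 ALUs per SIMD).
simds         = 8
alus_per_simd = 16 * 5
clock_hz      = 400e6
wiiu_gflops   = simds * alus_per_simd * 2 * clock_hz / 1e9   # MADD counted as 2 flops/ALU/clock

# Xenos (Xbox 360): 48 five-wide shader units at 500 MHz -- the commonly cited 240 GFLOPS.
xenos_gflops  = 48 * 5 * 2 * 500e6 / 1e9

print(f"hypothetical Wii U GPU: {wiiu_gflops:.0f} GFLOPS vs Xenos: {xenos_gflops:.0f} GFLOPS")
# -> 512 vs 240, i.e. roughly the 2x suggested, before any per-clock efficiency differences
```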

The strongest limitation may prove to be the lack of persistent storage. Nintendo may need to come up with the same solution as MS and allow installs to SD card / USB key. I think this shortcoming may already have cost them DICE's support.
I believe that whatever Nintendo pushes out on the system is irrelevant; for core gamers it's about third party support.
Nintendo needs to get Unreal up and running well. They have to get EA and Activision on board fast. That's if they want to have a chance of attracting some "core" gamers to the system.
 
Actually you CAN do that with launch titles for every single machine in every generation, because these titles will have been developed without access to final hardware for the (sometimes vast) majority of its development time. Hardware manufacturers release dev systems with an approximated level of performance, but, especially initially, that hardware is pretty much entirely different on both hardware and software levels compared to the final version.

I worked on a launch-window title for Wii; it wasn't any different from GameCube. We had final dev hardware for the entire development cycle. No quirks. No new tech to learn, just different performance characteristics and a new split-memory strategy.

Was our next title better? Yes. Was it because we discovered new stuff about the machine? Not really. It was because we had time to iterate.

All this takes time; you don't find all this stuff out in the few months that final hardware is available, it takes years. That's why later-gen games look better than first-gen games, even today. It's not as if an Unreal Engine SDK being available for a new console from day one will fix all that and let launch games look as good as final-gen titles. No wai.

The issue people have is that they're trying to make relative comparisons to 360 and PS3, which is a huge mistake to start with. Even making relative comparisons between PS3 and 360 is challenging. But people insist on making these mistakes, so let's go with it.

Some developers have mentioned challenges porting 360 and PS3 games to WiiU. Launch titles are looking underwhelming according to some. I see a few potential causes suggested for this.

* Maybe the developers are incompetent and/or unfamiliar with the hardware.

* Maybe this is just because developers don't have the final hardware and the final hardware will be powerful enough to resolve their problems.

* Maybe the hardware isn't capable of handling 360 and PS3 games without significant changes.

I'm suggesting that believing the first one is dumb. That leaves the second and third options. You seem to be suggesting the second. I think that's a little optimistic myself, but if one is a Nintendo fan they could be forgiven for choosing optimism over history.

What people? You must be talking about completely clueless idiots populating boards for clueless idiots (so why were you there, listening to them? :p); you certainly did not find any such people here. I myself have never heard anyone even suggest something that absurd about the Wii.

Not clueless idiots, just non-developers speculating wildly. I listen to them because they amuse me. I wouldn't find anyone like that on this forum either, of course not. Certainly not multiple threads full.
 
Reading some of these interesting posts, maybe the Wii U's games might turn out better than the sum of the PS3's and 360's titles from launch through 2012; it's a thought. I mean, the Wii U is benefiting from the upgrades Epic has made from then until now (FXAA, improved lighting shaders, yada yada, etc.).

Compile all that and you can basically say that what took developers the PS3's and 360's entire lifespan to finally reach took the Wii U its first few months to achieve, and you can expect the Wii U to benefit even more as future engines get established.

However, this is not to say that the PS3 and 360 will be out of commission by the time the Wii U hits its first year on the market; they will most likely still be around, since the tech for all three platforms is roughly in the same tier... so 5 consoles to choose from by 2014.:smile2:
 
It's not about "un-learning". It's that just because they understand these things now, compared to back then, doesn't mean they can't continue to improve on those techniques. His last two posts suggest that because they understand them now, they can't improve any more.

Improvements come tiny and add up over years. PS3 and 360 games come with many many years of refinements at this point. If WiiU is on par technically with those machines its games will already have many or most of those years of refinements built in. This lowers the expectations for the scope of future improvements.
 
What makes it a "master core"? Is it specifically 4x the cache of the slaves or is it just "more cache"? Something else entirely?

About the best way I can say it is "pseudo-Cell", based on what lherre and someone else have said.

Does that mean the system will be tapped out in a year? No, but there is a lot less headroom than in 2005.

But this is my point. There is still headroom, and that's ignoring the customizations made by Nintendo.

Improvements come tiny and add up over years. PS3 and 360 games come with many many years of refinements at this point. If WiiU is on par technically with those machines its games will already have many or most of those years of refinements built in. This lowers the expectations for the scope of future improvements.

This I agree with. My issue is that based on what we know (and don't know) on the hardware it's too soon to say "that's it".
 
Unless there is something in there that lets things be done in a fundamentally different way than 360/PS3, then I wouldn't expect huge strides over the launch titles.

Most of the 360/PS3 improvements have been core rendering algorithms and understanding the production process at that scale.

The same learning curve will exist for PS4/720.

Sure people will get a better handle on what works and what doesn't with WiiU, but I wouldn't expect huge strides.
 