Wii U hardware discussion and investigation *rename

Discussion in 'Console Technology' started by AlBran, Jul 29, 2011.

Thread Status:
Not open for further replies.
  1. Ika

    Ika
    Newcomer

    Joined:
    Jun 3, 2012
    Messages:
    62
    Likes Received:
    1
    "That argument makes zero sense to you" <-- FTFY

    You obviously have no idea (or maybe had once, but have completely forgotten by now?) of the process that brings us the final visuals of a game. It's not polygons at SD vs HD! Developers pre-bake a s**tton of visual enhancements into the scene and into the textures nowadays, a lot more than they did in 2005. They also use various deferred and other rendering techniques, and 3D algorithms which simply did not exist at that time, let alone the insane boom of post-process solutions (e.g. SSAO, FXAA, etc.) available nowadays.
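Since post-process effects like SSAO came up: here is a deliberately toy sketch of the basic screen-space ambient occlusion idea, written in plain Python/NumPy rather than a shader. Everything here (the fixed sample offsets, the bias value, the function name) is illustrative, not any real engine's implementation.

```python
# Toy SSAO: darken pixels whose sampled neighbours sit closer to the
# camera than they do. Real implementations use a randomized 3D sample
# kernel and run per-pixel on the GPU; this is just the core idea.
import numpy as np

def toy_ssao(depth, radius=2, bias=0.02):
    """depth: 2D array of linear depths (smaller = closer).
    Returns an occlusion factor per pixel in [0, 1] (1 = fully open)."""
    h, w = depth.shape
    occlusion = np.zeros_like(depth)
    # Fixed sample offsets standing in for a random kernel.
    offsets = [(-radius, 0), (radius, 0), (0, -radius), (0, radius),
               (-radius, -radius), (radius, radius)]
    for dy, dx in offsets:
        # Clamp sample coordinates to the screen edges.
        ys = np.clip(np.arange(h) + dy, 0, h - 1)
        xs = np.clip(np.arange(w) + dx, 0, w - 1)
        sample = depth[np.ix_(ys, xs)]
        # A neighbour noticeably closer than us occludes us.
        occlusion += (sample < depth - bias).astype(depth.dtype)
    return 1.0 - occlusion / len(offsets)

# A flat plane gets no occlusion; a pixel pushed back behind its
# neighbours (the bottom of a pit) gets fully occluded.
depth = np.full((16, 16), 1.0)
depth[8, 8] = 1.5
ao = toy_ssao(depth)
```

The same loop maps almost directly to a fragment or compute shader, which is why effects like this became cheap enough to bolt onto existing renderers.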
    Compilers, libraries and engines do indeed develop amazingly over time, and consoles usually get much more out of the hardware as they get older, but there are still a lot of parameters defining certain limits, limits acting as guidelines for the professionals working with the hardware. For example: the maximum speed of the rasterizer won't change; there's nothing you can do about it. The conversation in this thread tries to guess and figure out the limits and capabilities of the GPU in the Wii-U, not the limits of the developers.

    Rogue Squadron 2 was a launch title on the Gamecube, and you can easily compare its visuals and engine speed with the very last games on the machine. Backing up your argument with examples instead of logical reasoning and facts never works, imho.
     
  2. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,597
    Likes Received:
    11,003
    Location:
    Under my bridge
    The 'launch titles' argument doesn't hold any more, assuming Nintendo are using conventional hardware, because 3rd-party porting teams know exactly how to use that hardware. Comparing ME3 or AC3 should give a reasonable approximation of hardware performance in terms of creating a 'DX9' game, and they aren't showing major advantages on Wuu. Thus Wuu looks to be in the same performance bracket as PS360, instead of being substantially faster, which was an opportunity Nintendo had. And not being substantially faster, is Wuu going to have a strong enough USP to pull gamers away from PS360? If they were, for a year, by far the most powerful console in the world, offering more PC-like performance than the existing consoles, they'd have stood out much better. Instead, Nintendo seem to be joining this generation late, just when we're about to transition. How much support will Wuu then get? So 'underpowered for the upcoming generation' seems a valid description to me. Not that that'll necessarily bother Nintendo any more than Wii being underpowered did, but it almost certainly means they won't get the core gamers they want, who'll prefer the better graphics of the new boxes.
     
  3. Ika

    Ika
    Newcomer

    Joined:
    Jun 3, 2012
    Messages:
    62
    Likes Received:
    1
    What really made me sad is Picmin3, tbh. If you take away the soft shadows, you end up with Wii-like DX7 graphics running at high res. That game was a perfect opportunity to show some very high quality DX10 stuff on a system with "lots of" memory. Small "organic" maps, where you could precalculate and pre-bake some impressive but still clean visuals into the scene, bringing the picmin-ish vegetation to life, and they could even top it with some gorgeous post-process effects. While it was indeed a bit cute, I did not see anything nextgen-like there.:sad:
    I still play and enjoy old 16-bit games several times every week, and I honestly couldn't care less about graphics when it's about fun and gameplay rather than graphics whoring or eye-candy (which I also love). So I understand... but ignoring the progress of the competition this much is a very risky move, methinks.

    edit: Picmin == Pikmin:oops:
     
    #1203 Ika, Jun 10, 2012
    Last edited by a moderator: Jun 10, 2012
  4. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,172
    Location:
    La-la land
    Actually you CAN do that with launch titles for every single machine in every generation, because these titles will have been developed without access to final hardware for the (sometimes vast) majority of their development time. Hardware manufacturers release dev systems with an approximated level of performance, but especially initially that hardware is pretty much entirely different on both the hardware and software levels compared to the final version.

    The first dev systems for Xbox 360 were dual-processor Power Mac G5s with add-on Radeon graphics boards. Other than still using PPC chips (hugely different ones, though, with out-of-order execution and a different FPU), and an ATI-made GPU, nothing was the same.

    Lots of ifs there, in that sentence. ;)

    The problem is, of course, guessing final performance and performance characteristics from the performance of the devkit; not exactly an easy task. Overestimate, or have final performance get tuned down at the last minute, and you're in a real bind. It's easier, and much safer for your stress levels, to underestimate... which of course means launch games will look worse than what the hardware is actually capable of.

    Also, all hardware has its own quirks and performance characteristics. Finding those quirks isn't always a straightforward logical matter. Not everything is written down in the documentation; sometimes you have to program for the hardware, notice that one implementation is much slower or faster than another, and adjust game code accordingly. All this takes time. You don't find all this stuff out in the few months that final hardware is available; it takes years. That's why late-gen games look better than first-gen games, even today. It's not as if an Unreal Engine SDK being available for a new console from day one will fix all that and let launch games look as good as end-of-gen ones. No wai.

    What people? You must be talking about completely clueless idiots populating boards for clueless idiots (so why were you there, listening to them? :razz:); you certainly did not find any such people here. I myself have never heard anyone even suggest something that absurd about the Wii.
     
    #1204 Grall, Jun 10, 2012
    Last edited by a moderator: Jun 10, 2012
  5. panyvino

    Newcomer

    Joined:
    Jun 10, 2012
    Messages:
    1
    Likes Received:
    0
    So the WiiU CPU is not more powerful than the PS3's and Xbox 360's? And is the graphics card known?
     
  6. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
    It's not about "un-learning". It's that just because they understand these things now, compared to back then, doesn't mean they can't continue to improve on those techniques. His last two posts suggest that because they understand them now, they can't improve any more.
     
  7. onQ

    onQ
    Veteran

    Joined:
    Mar 4, 2010
    Messages:
    1,540
    Likes Received:
    55
  8. AlBran

    AlBran Ferro-Fibrous
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,645
    Likes Received:
    5,754
    Location:
    ಠ_ಠ
    hm... Well, 19mm^2.

    You may or may not be underestimating the size implication, because you have to consider the differences in cache amount. Below is some pretty rough math for how it might look if we assumed the same cores, so bear with me as you read along.

    On a side note, I'd guess the asymmetric L2 for WiiU that we've been hearing about has something to do with trying to make the chip more square-like (or just a happy coincidence). And there'd probably need to be some bloating of the pipelines to hit higher clocks too, but I'm not sure how that will affect die size on the whole.

    This is a pretty roundabout way of finding out how large the L2 is, but... Xenon was 176mm^2 @ 90nm. Here's a die shot of it @ 90nm: http://www.ibm.com/developerworks/power/library/pa-fpfxbox/figure1.gif

    The four large purple groups are the L2, adding up to 1MB, so each group is 256kB (the smaller ones are probably redundant or for testing?). Anyway, that image implies (after all the pixel/mm^2 calcs) that 256kB is approximately 5.25mm^2 @ 90nm. I'd be hard pressed to think the L2 density is different, considering it's still IBM and 90nm, but feel free to correct me (though the rest is based on what I think is a fair assumption).

    If we were to naively triple the Wii cores, we'd get ~57mm^2 @ 90nm. Then we need to add an extra 2.25MB of L2 to match the total L2 we've heard for the WiiU, i.e. 3MB - 3*256kB (which already exists per Wii core). At ~5.25mm^2 per 256kB, that extra L2 turns out to be ~47mm^2 @ 90nm.

    So a triple-core variant of the Wii CPU with the additional L2 would be in excess of 100mm^2 @ 90nm. Now, there's going to be extra stuff for multicore, hitting higher clocks, etc., but my main point is that shrinking >100mm^2 @ 90nm down to 45nm seems a lot more feasible (not so pad-limited).
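The estimate above can be written out as arithmetic. All inputs are the post's own rough figures (Broadway at ~19mm^2 and ~5.25mm^2 per 256kB of IBM L2, both at 90nm), not measured numbers; note that 3MB - 3*256kB works out to 2.25MB, i.e. nine extra 256kB blocks.

```python
# Back-of-envelope die-area estimate for a naive triple-Broadway chip
# at 90nm, using the rough per-block L2 figure from the Xenon die shot.
WII_CORE_MM2 = 19.0       # Broadway (Wii CPU) at 90nm, incl. its 256kB L2
L2_MM2_PER_256KB = 5.25   # estimated from the Xenon die shot, 90nm

tripled_cores = 3 * WII_CORE_MM2                      # ~57 mm^2

# Extra L2 needed to reach the rumoured 3MB total, beyond the
# 256kB each Wii-style core already carries:
extra_l2_kb = 3 * 1024 - 3 * 256                      # 2304 kB = 2.25MB
extra_l2_mm2 = (extra_l2_kb / 256) * L2_MM2_PER_256KB # nine blocks

total_90nm = tripled_cores + extra_l2_mm2
print(f"extra L2: {extra_l2_mm2:.2f} mm^2, total: {total_90nm:.2f} mm^2 @90nm")
```

Either way the conclusion holds: the 90nm total is comfortably over 100mm^2, so a shrink to 45nm would not leave the chip badly pad-limited.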

    ------

    Of course, this could all be wrong, so carry on. :p I find it rather implausible that Nintendo would do such a thing. Besides, we heard differently from IBM last year (even if it was vague).
     
  9. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,172
    Location:
    La-la land
    If the Wuu CPU is in fact using eDRAM as cache, even 3MB is not going to eat vastly more die area than the current Wii SRAM cache... say, +50% at the high end? Someone surely knows this better than me, but methinks it won't be all that much. Combine that with process shrinks and the Wuu CPU could well end up even tinier than Wii's already minuscule chip.
     
  10. AlBran

    AlBran Ferro-Fibrous
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,645
    Likes Received:
    5,754
    Location:
    ಠ_ಠ
    Right. Actually, if we look at the Power7 die, the 3MB eDRAM L2 should end up being around 7mm^2.

    If we remove the 256kB L2 from Wii CPU, then we're left with around 14mm^2. Triple that... 42mm^2. One-quarter ideal reduction for 90nm->65nm->45nm... ~11mm^2. Add a bunch of fluff for multicore and other funny business with clocks...

    Something around the same size? :p Anyways...
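Spelling that shrink arithmetic out (every input is a rough figure from the discussion above — the ~19mm^2 Broadway die, the ~5.25mm^2 SRAM L2 block, and the ~7mm^2 eyeballed from the Power7 eDRAM — not a measured number):

```python
# Rough 45nm size estimate for a tripled Wii core with a 3MB eDRAM L2.
WII_CPU_MM2 = 19.0     # Broadway at 90nm, including its 256kB SRAM L2
L2_256KB_MM2 = 5.25    # that SRAM L2 block at 90nm (from the Xenon shot)
EDRAM_3MB_MM2 = 7.0    # 3MB of eDRAM L2 at 45nm, eyeballed from Power7

logic = WII_CPU_MM2 - L2_256KB_MM2   # core logic minus cache: ~13.75 mm^2
tripled = 3 * logic                  # ~41 mm^2 of logic at 90nm

# Two full-node shrinks (90nm -> 65nm -> 45nm) at an ideal 2x area
# reduction each leave one quarter of the original area:
shrunk = tripled / 4                 # ~10 mm^2 of logic at 45nm

total_45nm = shrunk + EDRAM_3MB_MM2  # add the eDRAM L2 back in
print(f"~{total_45nm:.0f} mm^2 at 45nm vs ~19 mm^2 for Wii's Broadway")
```

The ideal quarter-area scaling never happens in practice (I/O pads and analog blocks barely shrink), so the real figure would sit somewhat above this, which is consistent with "something around the same size" as Broadway.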
     
  11. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,597
    Likes Received:
    11,003
    Location:
    Under my bridge
    I disagree. In every generation prior to this one, there was a sea-change in how the hardware was developed for, because the rendering pipelines were very different. In the earliest days devs had to code very low-level to access the hardware. 3D introduced new ways of coding. PS2's generation brought about wholly different rendering architectures. This gen brought a whole change from the PS2 generation. But now everything has settled down to sending triangles to a GPU to be rasterised and shaded using conventional, years-old shader languages. Unless you are doing something pretty radical, like maybe tiling over limited eDRAM, or coding really low-level and managing to eke out extra performance (which I question is possible on GPUs), working with one AMD/nVidia GPU is much like working with another. To the point where the same engine can be run on another GPU without too much performance drop due to hardware variations. Yes, optimisation is necessary to get from 25 to 30 fps from coarse ports, but it's not going to be the difference between launch games looking 5 years older than contemporary tech. How could ME3 be rendered at PS360 quality on Wuu, if the hardware were 10x the power, purely through lack of optimisation? Only if Nintendo have their own shader language and rendering pipeline and provide a middleware porting engine for quick and dirty ports. That, or the Wuu has a monster GPU but the whole game is written in OpenCL. :p If Wuu has any normal AMD GPU in there, then devs with years of experience in DX9 will be able to make a good job of DX9-level games on it.
     
  12. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,172
    Location:
    La-la land
    I acknowledge your stance, Shifty, but I still believe my point is well founded. :) Rendering pipelines and whatnot aside, at the end of the day you still have to shoot for a specific hardware performance target, and aiming too high means a lot more extra work than aiming too low.

    Like with Crysis way back when: the game looked fabulous with everything turned up to max, but when your system couldn't handle the load and you started dialing settings back, the end result often looked rather crap, certainly way less pretty than games built specifically for roughly your level of performance.
     
  13. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
    The part in bold is an interesting take. The core with 2MB of cache is supposed to be a "master core" (the details of which I forget) to the other two cores. So I don't know if it's intentional that they're making it more square-like.
     
  14. AlBran

    AlBran Ferro-Fibrous
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,645
    Likes Received:
    5,754
    Location:
    ಠ_ಠ
    What makes it a "master core"? Is it specifically 4x the cache of the slaves or is it just "more cache"? Something else entirely?
     
  15. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,723
    Likes Received:
    193
    Location:
    Stateless
    Well, I think what forumaccount means is pretty clear.
    SMP and programmable shaders are no longer unknown territory. There is a lot of experience now with regard to how to do things right.
    Does that mean the system will be tapped out in a year? No, but there is a lot less headroom than in 2005.
    The next big step is moving away from the fixed graphics pipeline by using the GPU's compute capabilities.
    Devs are still experimenting here. DICE implemented their tiled deferred rendering through compute shaders.
    It's doable to some extent on the 360, even more so on the WiiU, but then it's a matter of computing power. Programmable GPUs are nice, but they ain't a magic bullet; they need raw muscle, I guess, to push further.
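Since DICE's tiled deferred rendering came up: here is a minimal CPU-side sketch of the tile light-culling idea at the heart of such renderers, in plain Python for illustration. The tile size, the screen-space bounding-box test, and the function name are all simplifications of mine, not DICE's actual compute-shader implementation.

```python
# Toy tiled light culling: split the screen into fixed-size tiles and
# keep, per tile, only the lights whose screen-space bounds touch it.
# Shading then loops over a short per-tile light list instead of every
# light in the scene; on GPU this runs as one compute-shader group per tile.
TILE = 16  # pixels per tile side

def cull_lights(screen_w, screen_h, lights):
    """lights: list of (x, y, radius) in screen space.
    Returns {(tile_x, tile_y): [light indices]} for non-empty tiles."""
    tiles = {}
    for i, (lx, ly, r) in enumerate(lights):
        # Range of tiles overlapped by the light's bounding square.
        x0, x1 = (lx - r) // TILE, (lx + r) // TILE
        y0, y1 = (ly - r) // TILE, (ly + r) // TILE
        for ty in range(max(y0, 0), min(y1, screen_h // TILE - 1) + 1):
            for tx in range(max(x0, 0), min(x1, screen_w // TILE - 1) + 1):
                tiles.setdefault((tx, ty), []).append(i)
    return tiles

# A small light lands in one tile; a big one spills across several,
# but no tile ever sees lights that cannot reach it.
tiles = cull_lights(64, 64, [(8, 8, 4), (40, 40, 20)])
```

The payoff is that per-pixel shading cost scales with the lights per tile rather than the lights per frame, which is what makes many-light deferred shading feasible on compute-capable GPUs.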

    Say you've wisely selected your sources and the GPU in the WiiU has 8-10 SIMDs clocked really low. That's twice the power of the PS360, more in real terms.
    The system should have impressive fill rate thanks to all the eDRAM.
    It has a nice amount of RAM (1.5GB).
    I can definitely see the system keeping up with the PC world for 2 or 3 years.
    I just ordered a refurbished HP Pavilion for $500; it has an HD6750M inside (and an A8-3500M). I expect it to play not-that-new games well and to somehow keep up for, say, 2 years.
    I don't expect it to run at crazy quality, but I expect it to run most games. I'm not actually interested in the most demanding games.
    I would put the WiiU there (weaker CPU, less RAM, but crazy fill rate).

    The strongest limitation may prove to be the lack of persistent storage. Nintendo may need to come up with the same solution as MS and allow installs on SD card / USB key. I think this lack may have already cost them DICE's support.
    I believe that whatever Nintendo pushes out on the system is irrelevant; for core gamers it's about third-party support.
    Nintendo needs to get Unreal up and running well. They have to get EA and Activision on board fast. That is, if they want a chance of attracting some "core" gamers to the system.
     
    #1215 liolio, Jun 10, 2012
    Last edited by a moderator: Jun 10, 2012
  16. forumaccount

    Newcomer

    Joined:
    Jan 30, 2009
    Messages:
    140
    Likes Received:
    86
    I worked on a launch-window title for Wii; it wasn't any different from Gamecube. We had final dev hardware for the entire development cycle. No quirks. No new tech to learn, just different performance characteristics and a new split-memory strategy.

    Was our next title better? Yes. Was it because we discovered new stuff about the machine? Not really. It was because we had time to iterate.

    The issue people have is that they're trying to make relative comparisons to 360 and PS3, which is a huge mistake to start with. Even making relative comparisons between PS3 and 360 is challenging. But people insist on making these mistakes, so let's go with it.

    Some developers have mentioned challenges porting 360 and PS3 games to WiiU. Launch titles are looking underwhelming according to some. I see a few potential causes suggested for this.

    * Maybe the developers are incompetent and/or unfamiliar with the hardware.

    * Maybe this is just because developers don't have the final hardware and the final hardware will be powerful enough to resolve their problems.

    * Maybe the hardware isn't capable of handling 360 and PS3 games without significant changes.

    I'm suggesting that believing the first one is dumb. That leaves the second and third options. You seem to be suggesting the second. I think that's a little optimistic myself, but if one is a Nintendo fan they could be forgiven for choosing optimism over history.

    Not clueless idiots, just non-developers speculating wildly. I listen to them because they amuse me. I wouldn't find anyone like that on this forum either, of course not. Certainly not multiple threads full.
     
  17. COPS N RAPPERS

    Regular

    Joined:
    Nov 2, 2008
    Messages:
    957
    Likes Received:
    32
    Reading some of these interesting posts: maybe the Wii-U's games might turn out better than the sum of the PS3's and 360's titles from launch through 2012; it's a thought. I mean, the Wii-U is benefiting from the upgrades Epic has made from then until now (FXAA, improved lighting shaders, yada yada, etc...).

    Compile all that and you can basically say that what took developers the PS3's and 360's entire lifespans to finally reach took the Wii-U its first few months to achieve, and you can expect the Wii-U to benefit even more as future engines are established.

    However, this is not to say that the PS3 and 360 will be out of commission by the time the Wii-U hits its first year of being out. They will most likely still be around, since the tech for all three platforms is roughly in the same ballpark..........so 5 consoles to choose from by 2014.:smile2:
     
  18. forumaccount

    Newcomer

    Joined:
    Jan 30, 2009
    Messages:
    140
    Likes Received:
    86
    Improvements come in tiny increments and add up over years. PS3 and 360 games come with many, many years of refinements at this point. If the WiiU is technically on par with those machines, its games will already have many or most of those years of refinements built in. This lowers the expectations for the scope of future improvements.
     
  19. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
    About the best way I can describe it is "pseudo-Cell", based on what lherre and someone else have said.

    But this is my point. There is still headroom, and that ignores the customizations made by Nintendo.

    This I agree with. My issue is that based on what we know (and don't know) on the hardware it's too soon to say "that's it".
     
    #1219 bgassassin, Jun 10, 2012
    Last edited by a moderator: Jun 10, 2012
  20. ERP

    ERP Moderator
    Moderator Veteran

    Joined:
    Feb 11, 2002
    Messages:
    3,669
    Likes Received:
    49
    Location:
    Redmond, WA
    Unless there is something in there that lets things be done in a fundamentally different way than 360/PS3, then I wouldn't expect huge strides over the launch titles.

    Most of the 360/PS3 improvements have been core rendering algorithms and understanding the production process at that scale.

    The same learning curve will exist for PS4/720.

    Sure people will get a better handle on what works and what doesn't with WiiU, but I wouldn't expect huge strides.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.