Wii U hardware discussion and investigation *rename

Discussion in 'Console Technology' started by TheAlSpark, Jul 29, 2011.

Thread Status:
Not open for further replies.
  1. Exophase

    Veteran

    Joined:
    Mar 25, 2010
    Messages:
    2,406
    Likes Received:
    430
    Location:
    Cleveland, OH
    I think we can all agree at this point (and probably would have for years) that Nintendo makes console hardware for Nintendo. Or at least this is clearly the case for Wii, DS, 3DS, and now Wii U. This has two obvious implications - the hardware gets the new peripherals that support their game ideas and the raw power is only as much as they feel they comfortably want to utilize for that generation (or maybe just a little more). But it seems like there could be more nuances to this.

    Nintendo, as a first-party software developer, is very used to the Broadway core at this point. And a lot of their developers probably also have little experience developing on Xbox 360 and PS3, since their work is exclusive. Giving them three of those cores, clocked twice as fast and with a better L2 cache, is going to look like a huge improvement, and more than enough to facilitate a generational improvement in their games.

    It's totally reasonable that Nintendo would want to stick with the levels of tech that they are for their own games, since AAA titles on other platforms cost far more to develop. So right now Nintendo is probably making much more profit per game sold, taking much less risk (although most of their titles are inherently pretty safe anyway), and selling almost as many copies. Given that, they could be dead last in the console race and probably still consider the console an overwhelming success if they can keep selling the same quantity of first-party titles. And I don't think a weaker GPU and CPU threatens that; if anything threatens it, it'd be Nintendo losing franchise appeal and stagnating. But that's a very different problem.

    In this light the alleged design actually makes a ton of sense. It's enough to support the budget levels they want to spend on games. It's an architecture they're very familiar with and have a lot of tool support in place for, so it costs little to move to. You get cheaper BC (putting a separate Broadway on the die might not be a huge area expense, but it'd definitely cost some engineering time) and you get a good licensing deal from IBM, who seems about as eager to promote eDRAM on CPUs as to push newer CPUs.

    I could actually see going for better hardware as being a minor disadvantage in some ways, even ignoring costs. Letting third parties produce much better looking games would put more pressure on Nintendo to increase their development efforts. And keeping the generational boosts in check gives them more room to grow next time. Eventually everyone will hit a wall so it makes sense that Nintendo would want to take things as slowly as possible. It could be that Nintendo has a backup plan to release the next console early if this one is in trouble and they think it'll help. It's possible they even had such a plan for Wii but went with it because they thought they could get away with it (and did).

    Sure, Nintendo could have probably offered the same performance using even smaller and lower power more modern cores, but if more performance isn't even desirable why bother? The power consumption difference is negligible with the GPU taking the bulk of it and the die area is also pretty negligible and possibly already pad limited (could in fact be a reason why they stuck with a 64-bit DRAM interface).

    But yeah, this is all a big downer for third parties and probably not good for the industry at large. But none of that necessarily matters that much to Nintendo. What I find really mind-boggling is why they'd hamper the battery life on the controller. They seem really determined to screw up on battery life these days, which is pretty disappointing given the history with their older handhelds. There is no way that the dollar they saved on the battery is justifiable given the bad PR and bad reaction they'll get from users. People want devices they don't have to constantly remember to charge. No one wants to reserve a wall socket for their controller (or be plugged into the wall, for that matter). All it'll take is the controller dying enough times to make some owners want to use it less, and while they're using it less they end up buying fewer games.
     
  2. stiftl

    Newcomer

    Joined:
    Jun 24, 2006
    Messages:
    118
    Likes Received:
    11
    I don't know for sure, but I think it may be too late :wink:
     
  3. N_B

    N_B
    Regular

    Joined:
    Sep 14, 2009
    Messages:
    684
    Likes Received:
    0
    Location:
    New Zealand
    COD and Halo raised resolution in their most recent installments ;)
     
  4. haihoo

    Newcomer

    Joined:
    Nov 2, 2012
    Messages:
    14
    Likes Received:
    0
    Are some people here really thinking they can value a CPU by its clock speed alone?

    I thought the people here had some knowledge of PC tech?
     
  5. almighty

    Banned

    Joined:
    Dec 17, 2006
    Messages:
    2,469
    Likes Received:
    5
    At the cost of things like SSAO and dynamic lighting...

    And I'm pretty sure Blops 2 is lower than Blops 1?
     
  6. DoctorFouad

    Newcomer

    Joined:
    Sep 30, 2011
    Messages:
    195
    Likes Received:
    0
    That's Nintendo's choice. It is mainly a 3-core, single-threaded GC/Wii CPU with more cache and a higher frequency (and maybe out-of-order, who knows). How else, and based on what criteria, do you want people to value it? :roll: With this CPU you are not going to get miracles at 1.24 GHz, especially since we know its silicon budget....
     
  7. TheD

    Newcomer

    Joined:
    Nov 16, 2008
    Messages:
    214
    Likes Received:
    0
    No one is just straight comparing clock speed.

    We are knowledgeable enough to know that a very old CPU design with limited OoOE is unlikely to be able to compete with a CPU with about 2x the transistor count (assuming similar density) running at nearly 3x the clock speed.

    BTW: This has nothing to do with PC.
     
    #3647 TheD, Nov 30, 2012
    Last edited by a moderator: Nov 30, 2012
  8. haihoo

    Newcomer

    Joined:
    Nov 2, 2012
    Messages:
    14
    Likes Received:
    0
    No one expects miracles from the Wii U CPU, but to say it is "slower" than the Xbox 360's Xenon based on clock speed is nonsense, if you look at the technical details of Xenon and the known specifications of the Wii U CPU.

    The CPU is truly the weakest part of the Wii U system. But with the DSP and the fast connection to the GPU and eDRAM, it has some big advantages.

    In fact, with optimised code, even the CPU alone should show more power than Xenon in games, but not much more. Code ported from the Xbox 360 to the Wii U really needs optimising; for now, most games are straight ports of Xbox 360 games and use timings that are good on an Xbox 360 but don't work efficiently on the Wii U.

    The thing is: a straight port from the Xbox 360 is obviously very easy. The Darksiders 2 developers mention that it took only five weeks and three people. And most developers only wanted to get their Wii U game finished before launch. Optimising graphics engines for the Wii U will take some time, but once that is done the Wii U will show its full potential.
     
  9. TheD

    Newcomer

    Joined:
    Nov 16, 2008
    Messages:
    214
    Likes Received:
    0
    Nice selective quoting :roll:
    The parts you missed:
    He did not say it was slower based just on clock speed!
     
    #3649 TheD, Nov 30, 2012
    Last edited by a moderator: Nov 30, 2012
  10. Gerry

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    803
    Likes Received:
    170
    Nobody has said this. It's a classic strawman.

    How on earth do you come to that conclusion? I mean, it may or may not be true, but you seem to have simply stuck your finger in the air and come up with a complete guess.
     
  11. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    Based on what?
     
  12. Squeak

    Veteran

    Joined:
    Jul 13, 2002
    Messages:
    1,262
    Likes Received:
    32
    Location:
    Denmark
    Some things to consider before saying anything about the CPU.


    • The cores are a very different architecture from the PPE-style cores. Among other things, they have OoOE, a shorter pipeline, and a larger, different instruction set.
    • The OS runs on a separate ARM processor. AFAIR, on Xenon the OS uses an entire thread on one of the cores.
    • Sound, while not a huge part of the processing, is handled by a DSP, freeing up even more resources.
    • The cache is much larger.

    This means the CPU is good at doing all of the non-vector/SIMD stuff, leaving that work to the GPU.

    The CPU could well be as powerful as, or more powerful than, the equivalent in the PS3 or 360 at what it is designated to do.

    It bears repeating because some people still don't seem to get it: clock rate alone is not a very useful way of comparing architectures that are not very similar.

    Also, high clock rates are one of the chief causes of heat. The lower the clock rate you can run while maintaining performance, the better: you need less cooling hardware, get less noise, and end up with a system that lasts longer and has fewer failures. Etc.
     
  13. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    lherre dropped the tidbit that the Wii U GPU clock was raised from 400 MHz to 550 MHz over its development.

    http://www.neogaf.com/forum/showpost.php?p=44912749&postcount=2163

    Kind of interesting; maybe that's the genesis of some of those wild fanboi stories of vastly improved Wii U performance in various dev kits.

    Also, I guess it kind of double-confirms the GPU clock.

    So hey, I guess Wii U performance did actually move in a positive direction at some point :razz:
     
  14. sebbbi

    Veteran

    Joined:
    Nov 14, 2007
    Messages:
    2,924
    Likes Received:
    5,296
    Location:
    Helsinki, Finland
    A2 has excellent SIMD peak throughput (256 bit SIMD with FMA x 16 cores) and excellent memory/cache performance (4x SMT memory latency hiding + huge 32 MB L2 cache). Both of these factors are very important for game performance.

    Yes, A2 relies a lot on parallelism (multithreading, SMT, SIMD and FMA are all forms of parallelism). But games often tend to have lots of parallelism. A2 has hardware transactional memory support built into its big 32 MB L2 cache, which makes it easier to write efficient multithreaded code.

    I wouldn't compare A2 to Cell (someone made that comparison earlier), because A2 has unified memory access, automatic (very large) caches and a single instruction set (Cell's SPUs had a separate instruction set). On Cell you had to manually move memory from/to the different computational units (and ensure synchronization and data correctness). A2 does all of that automatically, and it has transactional memory as well, to make things even more efficient/easier (even compared to PC CPUs).
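    The SIMD figures above can be turned into a back-of-envelope peak number. This is a sketch only: the 1.6 GHz clock and double-precision lane width are my assumptions, not stated in the post.

```python
# Rough peak-FLOPS estimate from the figures quoted above.
# Assumptions (not in the post): 1.6 GHz clock, 64-bit (double) lanes,
# and an FMA counted as 2 floating-point operations.
cores = 16
simd_bits = 256
lane_bits = 64                # double precision
flops_per_fma = 2             # multiply + add
clock_ghz = 1.6               # assumed A2 clock

lanes = simd_bits // lane_bits                      # 4 lanes per core
flops_per_cycle_per_core = lanes * flops_per_fma    # 8 flops/cycle/core
peak_gflops = cores * flops_per_cycle_per_core * clock_ghz
print(peak_gflops)  # 204.8
```

    Peak figures like this say nothing about achieved throughput, of course; they only bound what the SIMD units could do if fully fed.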
     
  15. DoctorFouad

    Newcomer

    Joined:
    Sep 30, 2011
    Messages:
    195
    Likes Received:
    0

    but we know the following:

    1- The Wii U CPU bears a lot of similarities to a very old CPU design from the 1999-2000 era, namely the GC/Wii CPU. Whatever improvements IBM brought to the table, the Wii U CPU should still be considered partly an ancient CPU. It is not a completely new design, so we should not expect unrealistic levels of improvement in efficiency.

    2- The Wii U CPU has 3 single-threaded cores at 1.24 GHz; the Xbox 360 CPU has 3 dual-threaded cores at 3.2 GHz. Unless the Wii U CPU cores are 6 times more efficient than the Xbox 360 cores (very unlikely, bearing in mind my first point), it is safe to assume that the Xbox 360 CPU is more powerful than the Wii U CPU. I would even say that if the Wii U CPU turns out in practice to be not a lot less powerful than the Xbox 360 CPU, that should still be considered a technological achievement for IBM (improving an ancient CPU design, and at only 1.24 GHz).
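    The tally behind point 2 can be written out explicitly. This is only the naive "GHz x hardware threads" metric using the clock, core, and thread counts quoted above; it ignores per-clock efficiency entirely, which is exactly why it is disputed.

```python
# Naive "GHz x hardware threads" comparison of the two CPUs.
# This metric assumes equal work per thread-GHz, so it is at best an
# upper bound on the real gap.
wiiu_cores, wiiu_threads_per_core, wiiu_ghz = 3, 1, 1.24
x360_cores, x360_threads_per_core, x360_ghz = 3, 2, 3.2

wiiu_thread_ghz = wiiu_cores * wiiu_threads_per_core * wiiu_ghz   # ~3.72
x360_thread_ghz = x360_cores * x360_threads_per_core * x360_ghz   # ~19.2
print(round(x360_thread_ghz / wiiu_thread_ghz, 2))  # 5.16
# Counting SMT threads as full cores flatters Xenon: both threads on a
# core share one set of execution units, so the real ratio depends on IPC.
```

    Note the naive ratio comes out nearer 5.2x than 6x even before accounting for IPC.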
     
    #3655 DoctorFouad, Nov 30, 2012
    Last edited by a moderator: Nov 30, 2012
  16. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    Incorrect. It may be that the difference between the original performance of the 1999 CPU and twice that performance is a simple tweak to the execution units. Being 'based on' something doesn't tell us anything about how efficient the final design is. We'd need low-level details to know that.

    In total throughput, Xenon will be more powerful. But in terms of executing game code, a lot depends on the code devs are using. I believe Xenon will be more powerful because I believe devs are writing optimised code that makes efficient use of the processor, but it's wrong to compare the straight numbers. GHz*threads is not at all accurate!
     
  17. Squeak

    Veteran

    Joined:
    Jul 13, 2002
    Messages:
    1,262
    Likes Received:
    32
    Location:
    Denmark
    DoctorFouad:

    There is a lot of "ancient" technology in the computing world, on both the hardware and software side (I'd say most real ideas in use in the field today, not fluff or variations, are over 20-30 years old).
    That a core is old doesn't necessarily equate with worse (the PPE core is very likely also based on old stuff). For example, the cores of the ARM and Intel ISAs date back to the mid-80s, with key ideas and mechanisms going back much further.
    On the contrary, an older design often means a more tightly and frugally designed core, so you can fit a lot of auxiliary stuff or other cores on the same die and run it at higher speeds.

    Xenon can't quite be equated with five 1.6 GHz CPUs. There is still an overhead for multithreading, and some tasks aren't suited to it (there is a reason people use separate cores and not just single CPUs with huge register sets).
    Espresso could well be faster at branching and in cases where IPC counts (some would call it general-purpose code).
    Xenon also handles a lot of SIMD, whereas on the Wii U the more advanced GPU would be better for all of that.
     
  18. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    Plus, isn't the 2nd thread on the 360's cores sort of a "light" or hyperthreading-type thread? It isn't a true 100% full thread, just something to grab some unused execution resources, IIRC.

    FWIW, the Wii U CPU is 3 cores at 1.7x the Wii CPU clock. So 3 x 1.7 = 5.1x the Wii CPU. Then with more cache maybe you can call it 6x. I got that from a GAF post, but I guess 6x isn't a horrible generational leap, though back in the PS360 day it wasn't a great one either.
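    The ~5.1x arithmetic above is just core count times the per-core clock ratio. As a sketch, using the commonly cited Broadway clock of 729 MHz (an assumption here, not stated in the post) and the ~1.24 GHz Espresso clock discussed in this thread:

```python
# The "~5.1x Wii" multiplier: per-core clock ratio times core count.
# Assumed clocks: Broadway (Wii) 729 MHz, Espresso (Wii U) ~1243 MHz.
broadway_mhz = 729
espresso_mhz = 1243
cores = 3

clock_ratio = espresso_mhz / broadway_mhz       # ~1.705, i.e. the "1.7x"
naive_multiplier = cores * clock_ratio          # ~5.1
print(round(naive_multiplier, 1))  # 5.1
# This assumes unchanged per-clock performance and perfect 3-core scaling;
# the larger cache could push the real-world figure either way.
```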
     
  19. dumbo11

    Regular

    Joined:
    Apr 21, 2010
    Messages:
    440
    Likes Received:
    7
    In some ways, I think you could make a comparison between OoOE and this dual-threading?
     
  20. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    That's what threading is. If you have a thread that has its own parallel code stream and execution units, you've got a core. ;) Threading is about optimising use of execution units by running multiple streams of code through the processor. Depending on the number and type of execution units, threading can have no benefits at all.

    For Wii U's CPU capability, we need to know firstly the peak throughput in terms of execution units, and then compare its efficiency vs. code running on Xenon.
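    That two-step comparison can be sketched as a toy model: effective throughput is peak execution throughput scaled by how efficiently real game code keeps the units busy. Every number below is an illustrative placeholder, not a measured value for either CPU.

```python
# Toy model: effective throughput = peak ops/cycle x clock x efficiency.
# All parameter values are hypothetical, chosen only to illustrate the point.
def effective_throughput(peak_ops_per_cycle, clock_ghz, efficiency):
    """Giga-ops/s actually achieved, given utilisation of execution units."""
    return peak_ops_per_cycle * clock_ghz * efficiency

# A high-clock core at low utilisation can land close to a lower-clock
# core at high utilisation:
high_clock = effective_throughput(peak_ops_per_cycle=2, clock_ghz=3.2, efficiency=0.3)
low_clock  = effective_throughput(peak_ops_per_cycle=2, clock_ghz=1.24, efficiency=0.8)
print(round(high_clock, 2), round(low_clock, 2))  # 1.92 1.98
```

    The point of the model is only that raw clock comparisons collapse once efficiency on real code differs enough between the two designs.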
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.