Wii U hardware discussion and investigation *rename

Discussion in 'Console Technology' started by TheAlSpark, Jul 29, 2011.

Thread Status:
Not open for further replies.
  1. Exophase

    Veteran

    Joined:
    Mar 25, 2010
    Messages:
    2,406
    Likes Received:
    430
    Location:
    Cleveland, OH
    I feel like the usual "CPU bound or GPU bound" dichotomy is too simplistic. A game will switch through a bunch of different kinds of workloads. A weak CPU can have an impact on minimum frame rates even when the average is barely affected.

    It would seem that workloads that were already being targeted for deployment on Cell SPEs or CPU threads could be easier to push to GPGPU, although not necessarily trivial, especially with pretty old GPU hardware that isn't really as flexible as it is today. The question is, do we know for sure that (at least some) games aren't already doing this?
     
  2. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,411
    Location:
    Wrong thread
    I think the elephant in the room is that the WiiU is a piss-weak machine unless you compare it only with the Wii and PS360. The discussion seems to have been directed along this line (perhaps subconsciously) as a kind of diversion from the fact that for a mains-powered console - ostensibly trying to attract core gamers - it quite amazingly struggles to stand nose to nose with a core gamer system launched seven years earlier.

    It's not like the 360 can't do GPGPU either (sebbbi has already talked about doing it in Trials Evolution), or that the 360 is a DX9 machine and WiiU is DX 10.1 so it automagically wins everything, or anything like that.

    And I don't think Nintendo are banking on GPGPU anyway. Nintendo are banking on being able to make a profit from a platform that is, in terms of processing, cheap (and therefore weak) and that's it. And good luck to them, it worked with the Wii. And yes, I still want one, and yes, I'm still excited about Wuu Zelda, but dammit, the WiiU is the way it is because Nintendo don't value powerful hardware, not because of GPGPU / unused edram / low latency / lazy devs not optimising for OoOE etc etc.
     
    #3622 function, Nov 29, 2012
    Last edited by a moderator: Nov 29, 2012
  3. Exophase

    Veteran

    Joined:
    Mar 25, 2010
    Messages:
    2,406
    Likes Received:
    430
    Location:
    Cleveland, OH
    It (Broadway) doesn't seem like a more advanced OoO design than Bobcat to me. It looks like there's nothing more than a single reservation station slot in front of each execution unit, or two in the case of the load/store unit. This is combined with a six-entry (in-order) completion queue that balances the six-entry instruction queue and six (total, across five execution units) reservation stations. That's balanced with some minimal register renaming: there are a total of 6 GPR + 6 FPR + CR, LR, and CTR rename registers.

    So I'd actually call this pretty minimally OoO and far behind Bobcat's reordering capabilities.

    In terms of peak execution width it's somewhat similar - both are bottle-necked by dual-dispatch per cycle, although Broadway can fold a branch (not sure if Bobcat can). On the other hand, it only has a single load/store unit (Bobcat can co-issue a load and store) which can be a pretty major deficiency.
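
    As a rough illustration of how small that reorder window is, here is a toy tally of the resources quoted above (every number here is taken from the post's description, so treat it as an approximation rather than a confirmed spec):

```python
# Toy tally of Broadway's out-of-order resources, using the approximate
# figures from the discussion above: one reservation slot per execution
# unit, two for the load/store unit, and a 6-entry completion queue.
reservation_slots = {
    "integer_1": 1,
    "integer_2": 1,
    "system": 1,
    "fpu": 1,
    "load_store": 2,
}
rename_registers = {"GPR": 6, "FPR": 6, "CR": 1, "LR": 1, "CTR": 1}

completion_queue = 6  # in-order retirement window, 6 entries
total_slots = sum(reservation_slots.values())

# Every issued instruction needs a completion-queue entry, so the number
# of instructions in flight past dispatch is capped by the queue depth,
# no matter how the reservation slots are spread across the units.
in_flight_cap = completion_queue

print(total_slots)    # 6 reservation slots total
print(in_flight_cap)  # at most 6 instructions in flight
```

    A window that shallow is why "minimally OoO" fits: the machine can slip past a stalled instruction or two, but nothing like the reordering a deeper scheduler allows.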
     
  4. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,411
    Location:
    Wrong thread
    I think he was saying that Bobcat seems like a more advanced OoO design (in addition to clocking higher).
     
  5. Exophase

    Veteran

    Joined:
    Mar 25, 2010
    Messages:
    2,406
    Likes Received:
    430
    Location:
    Cleveland, OH
    That makes more sense now :D But I don't regret my post... too much. Maybe someone wanted to know >_>
     
  6. MDX

    MDX
    Newcomer

    Joined:
    Nov 28, 2006
    Messages:
    206
    Likes Received:
    0
    Of course Nintendo values power.
    But who is going to buy it?

    What makes more sense, selling 75 million units of one console every 10 to 12 years
    or 150 million units from two consoles each every 10 to 12 years?

    When it comes down to it, Nintendo is only a game company.
    They have to take into consideration more than just how powerful to make their machines.
    Their machines need to be profitable, they need to be affordable, and they need to attract new customers to continue to stay in business.

    Their first party titles have all shown a marked increase in visuals with each console generation. As long as they can continue doing so, you can't criticize them for taking smaller iterations with their hardware.

    I used to make fun of the small black-and-white Game Boy when I had the big, powerful, full-color Lynx. I could never figure out why the Game Boy was the better selling machine. But guess who is still in business - not Atari. Now we've got Sony on the ropes.
     
  7. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,411
    Location:
    Wrong thread
    I did!

    I tried reading an old M68000 assembly programming book earlier this year (dunno why) and it pretty much kicked my ass (I have no assembly programming experience, I'm a hobbyist C++/C# guy), but I maybe learned a couple of things too. Seeing Bobcat compared to another small, low-power processor is not only fun, I think I actually understood some of it too! Hurrah!
     
  8. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,411
    Location:
    Wrong thread
    I don't have a problem with Nintendo releasing weak and outdated hardware to try and make money. I just think it's odd that so many seem to deny this is what Nintendo are doing, or seek to justify it beyond Nintendo's profit motives.

    I think it's also worth saying that it's not a binary "WiiU" or "PS4720" situation. Nintendo could have included another CPU core, or better CPU cores (break the bank on a behemoth like Bobcat), or used DDR3 1866 memory, or added additional shaders (the GPU is not a big chip by console standards even with the embedded memory), or gone batshit-insane-mega-power crazy and used a 6 cm fan instead of a 5 cm fan and pushed the CPU clock all the way up to a dizzying 1.3 GHz. Nintendo didn't because they - as a business - do not value powerful hardware. They don't think it brings in the returns necessary to justify itself. This isn't a recent thing either - look at the SNES (incredibly weak CPU), the N64 (hardware Sega turned down two years earlier) or even the Gamecube (well below the Xbox). Even before they made the Wii / WiiU, Nintendo knew hardware had to pay for itself and that R&D costs shouldn't commit you to a platform for a decade to make your money back.

    Nintendo are right to do what they think they should, but it doesn't mean that the hardware isn't .... what it is.
     
  9. Gitaroo

    Veteran

    Joined:
    Nov 10, 2007
    Messages:
    1,921
    Likes Received:
    62
    The question is whether these 80 million Wii owners would be willing to pay $300 for outdated hardware again, based on their satisfaction with the Wii. Most early adopters are likely die-hard Nintendo fans, and I doubt they make up even a third of the Wii user base. And if Nintendo want to sell to people outside their fanbase, or people who never cared for Nintendo franchises, what can they attract them with? It won't be getting next-gen console games or look even close, there's no Blu-ray playback, and it's 300 bucks with a tiny library of which 90% will be sloppy ports from the current gen. Even assuming porting gets better and the Wii U starts getting superior versions of multiplatform games, wouldn't people rather get (or already own) a 360 or PS3 that plays those same games, along with its 6-7 years of back catalogue?
     
  10. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    Okay, can we contain Wii U talk in this thread to just the technology in it, and not Nintendo's business plan etc. There are plenty of other threads for that! ;)
     
  11. almighty

    Banned

    Joined:
    Dec 17, 2006
    Messages:
    2,469
    Likes Received:
    5
    I would imagine that quite a lot of games are fill rate bound by this point in the life cycle.

    That's why native resolutions have been dropping slowly, COD being a prime example of this.

    Oh, they are doing that. Is it because they're worried about failing and ending up like Sega, or because they care more about the money?

    Either way, with the sheer profit they made from the Wii - and are still currently making on the Wii - they could have gone completely all out on the Wii U. After all, they can afford to!
     
  12. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    I saw a first-gen Lynx a few days ago in a store that sells and buys used stuff, with a collection of about 12 or 15 games in a bag made for them. My first reaction was jaw-dropping: it was the first time I'd actually seen one. Then I saw it was really huge - bigger than a PSP or a Game Gear, maybe wider than the Wuu pad. Then I remembered it probably had a low battery life, like the Game Gear - the standard back then was disposable batteries.

    The Game Boy had a good battery life on regular batteries (over 10 hours, and four batteries rather than six); rechargeables were probably quite expensive 20 years ago and were some crappy NiCd, plus with disposable batteries you can just pick it up and play after not using it for a week or six months.
    It had a great form factor too, and when it came to the games, they were fun (funnily enough, the smooth scrolling relies on the LCD blur). Tetris, Gargoyle's Quest, whatever (I liked the Batman game), then a few years later what was probably the greatest Zelda game.
    Then it even played the Pokémon crap for the kids, but that was not my cup of tea.
     
  13. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,724
    Likes Received:
    195
    Location:
    Stateless
    For me the sad part is that I don't think cost explains the system's shortcomings.
    If you look at the overall picture, they have used 189 mm^2 of silicon (33 for the CPU and 156 for the GPU). That's not too shabby, not to mention that the CPU seems to use eDRAM and the GPU includes eDRAM, so it is not the cheapest process available.

    What I find disheartening is that the enhanced Broadway cores might be in the same ballpark as IBM's PPC 470s, which are more modern CPU designs. Had they chosen those CPUs, they might have been able to produce them on a possibly cheaper process and wafers.
    Those CPUs are, according to IBM's own numbers, ~4 mm^2 on their 45 nm process, and could be less on a TSMC process. They might have had to give up some cache, but I'm not sure what the implication for performance would be, as those processors are "better" (wider, more advanced out-of-order execution, etc.).
    Those CPUs also use an interconnect fabric that makes connecting PCI-type devices possible (from the IBM paper).
    To me it sounds like those CPU cores, along with the matching interconnect, could have let Nintendo design a SoC without spending humongous amounts on R&D. I could be wrong, though; people like you or Exophase would have a better opinion on the matter.

    On the other side you have the GPU, significantly bigger than Redwood (104 mm^2) or Turks (118 mm^2). I don't think it is pushing too far to assume that Nintendo could have put a quad core based on the PPC 470 and a Redwood/Turks within a chip no bigger than the GPU they ended up using, produced on TSMC's 40 nm process.

    I don't think it is pushing too far either to assume that going with a 128-bit bus to plain DDR3-1600 might have been cheaper than using two chips on a more expensive process, a bigger silicon budget, and an MCM.

    The thing is, I have no issue with Nintendo's business model and their unwillingness to fight on specs or to heavily subsidize their systems. I have an issue with them closing off parts of the market to themselves through what seem to be bad design choices.

    I think Nintendo could have been king of the hill for a year, and that would not have hurt them in any way, quite the contrary. It might also have resulted in more big studios jumping on board. EA, for example, seems to be moving more franchises to the Frostbite 2 engine going forward, and it seems that engine won't be ported to the WiiU.

    Really, as I see it, cost is not the issue; the issue is sucky design. As I said months ago, and Mize concurred lately, they should no longer design hardware themselves and should move to, for example, Google's practices on the matter.

    I believe that the WiiU (a superior one, on top of it) could have launched at $299.
    I think they should have passed on two SKUs and offered a better basic SKU.
    Looking at some early game troubles, I think they should have included a bit more flash reserved for caching (8 more GB).
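
    A quick back-of-the-envelope on the die-area argument above, using only the figures quoted in the post (all rough estimates, and deliberately ignoring caches, eDRAM and I/O, which would eat into the margin):

```python
# Die sizes quoted in the post above, in mm^2; treat all of them as
# rough estimates rather than confirmed numbers.
wiiu_cpu = 33
wiiu_gpu = 156
actual_total = wiiu_cpu + wiiu_gpu   # silicon Nintendo actually used, two dies

ppc470_core = 4                      # IBM's figure for one core at 45 nm
quad_470 = 4 * ppc470_core           # a hypothetical quad-core cluster
redwood = 104                        # Redwood die size at 40 nm
hypothetical_soc = quad_470 + redwood  # single-chip alternative, cores + GPU

print(actual_total)        # 189 mm^2 across two dies
print(hypothetical_soc)    # 120 mm^2 before caches, eDRAM and I/O
```

    Even allowing generous headroom for caches and uncore, the sketch suggests the alternative would fit comfortably inside the GPU die alone, which is the crux of the post's argument.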
     
  14. MDX

    MDX
    Newcomer

    Joined:
    Nov 28, 2006
    Messages:
    206
    Likes Received:
    0
    I don't think that would fit with their latency objectives.
     
  15. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,176
    Location:
    La-la land
    Higher-clocked, more expensive RAM actually tends to give better latencies compared to standard bulk RAM. There are several articles over at Anandtech, for example, that show this.
     
  16. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    Or at least the latency stays the same. "CAS" latency is expressed in clock cycles, so if you have DDR3-1333 CL9, 1600 CL10 or CL11, and 1866 CL11 (and 1066 CL7), the latency in absolute time is roughly the same or a bit better, with marginal benefits when the memory controller runs faster.

    Incidentally, DDR2-800 CL5 or DDR-400 CL2.5 is still in about the same ballpark in latency, so processors need great caches and designs to improve on their predecessors, or must just rely on multithreading.
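
    The arithmetic behind that: CAS latency in nanoseconds is the CL count times the DRAM clock period, and the DRAM clock is half the transfer rate (DDR moves two transfers per clock). A quick sketch over the configurations mentioned:

```python
def cas_ns(transfer_rate_mts: float, cl: float) -> float:
    """CAS latency in ns: CL cycles at a clock of (transfer rate / 2)."""
    clock_period_ns = 2000.0 / transfer_rate_mts  # 2 transfers per clock
    return cl * clock_period_ns

configs = [
    ("DDR-400 CL2.5", 400, 2.5),
    ("DDR2-800 CL5", 800, 5),
    ("DDR3-1066 CL7", 1066, 7),
    ("DDR3-1333 CL9", 1333, 9),
    ("DDR3-1600 CL10", 1600, 10),
    ("DDR3-1600 CL11", 1600, 11),
    ("DDR3-1866 CL11", 1866, 11),
]
for name, rate, cl in configs:
    print(f"{name}: {cas_ns(rate, cl):.1f} ns")
# Every configuration lands in roughly the 11.5-14 ns range; the faster
# grades sit at the low end, despite their higher CL numbers.
```

    Which is the point both posts are making: cycle counts rise with clock speed, but the wall-clock latency has barely moved across three DRAM generations.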
     
  17. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,411
    Location:
    Wrong thread
    Perhaps you're right, but I can't help thinking that some of the die area spent on the edram is there primarily to reduce costs related to the main memory bus and to allow for Wii BC (in the absence of a fast CPU and GPU). The Xbox 360's software emulation of the Xbox 1 has perhaps made emulation-powered BC look easy, but that was a massive and ambitious effort by someone who appears to be a genius OS guy, and it was by no means the complete solution that Nintendo have typically gone for in their portable consoles and in the Wii and WiiU.

    Regarding pure performance, Trinity desktop processors on a 128-bit DDR3 bus seem to offer far more performance than the WiiU and Xbox 360 (massively, massively more on the CPU front), so I do think it has to be a cost thing and BC thing.

    It's very kind of you to think my opinion is worth listening to, but you shouldn't put me on the same level as Exophase! You probably know more about processors and low-level performance issues than I do - I know you follow this closely on B3D. I'm really just a console warrior who came to B3D a long time ago, gradually gave it up over the years (no doubt influenced by the atmosphere here), and actually started trying to learn stuff.

    I think, having followed Nintendo for 20+ years, that you could be correct and that there may have been other options that would have given Nintendo more performance for a similar cost per unit manufactured. But Nintendo value familiarity (who doesn't?), and they also understand the value of the right level of backwards compatibility for certain customers. Being a conservative company, I think they plan BC in at an early stage (unlike MS) and plan to do it cheaply (unlike Sony, who just include an almost complete version of the old system).

    I don't know what the R&D costs of doing that would be, but I get the feeling that Nintendo are also very conservative with R&D as part of their approach to minimising risk. You saw it with the N64 (originally offered to and tweaked by someone else) and the Wii (an overclocked GC, almost). I don't think Nintendo would spend hundreds of millions of dollars on a custom architecture like MS or Sony would.

    Do you know what that 3rd tiny die on the WiiU package is? I don't. What the hell is that? I think it's likely that MS or Sony would have spent the cash to integrate that component - whatever it is - into another chip from day one.

    I don't know for sure, but on an older process like 40nm, and taken over a period of 6 ~ 8 years (when DDR3 will be expensive), I think a slightly larger GPU that minimises the number of memory chips and board complexity will probably pay for itself.

    Yeah, I don't have a problem with Nintendo wanting to be competitive either, but I think with the Wii U they may have missed an opportunity by being a little too conservative on the hardware. A faster CPU and a relatively small bump in everything else would have seen them laughing off the PS360 and giving the impression (even if it wasn't true) that they could perhaps handle PS4720 ports.

    Of course, I thought the Wii was going to sell less than half what it did and come third so I'm certainly no Oracle ... :D

    I agree, it wouldn't have taken much to outperform the PS360 and if it had gotten more users, more developers and more engines on board then it couldn't have hurt. Nintendo seem to think (right or wrong) that money committed to building hardware is dead money and so they seem reluctant to do it.

    I think cost (and design cost) is part of the reason for sucky design. I agree with you and Mize btw; Nintendo should have got AMD to design them a console and given them a larger power budget (maybe 45W) and it would have been a single chip on a volume process and it would have crushed the PS360.

    Agree!
     
  18. MDX

    MDX
    Newcomer

    Joined:
    Nov 28, 2006
    Messages:
    206
    Likes Received:
    0
    Well I read an article that stated the following:

    http://www.neogaf.com/forum/showpost.php?p=44811066&postcount=412
     
  19. Gipsel

    Veteran

    Joined:
    Jan 4, 2010
    Messages:
    1,620
    Likes Received:
    264
    Location:
    Hamburg, Germany
    There is no contradiction. Grall talked about the latency in units of time (i.e. nanoseconds), not in clock cycles. The latter of course goes up roughly linearly with clock speed (often a bit slower, i.e. it results in a slight improvement of the latency measured in time) if nothing else changes fundamentally.
     
  20. Mobius1aic

    Mobius1aic Quo vadis?
    Veteran

    Joined:
    Oct 30, 2007
    Messages:
    1,715
    Likes Received:
    293
    Talk about shooting yourself in the foot if the clock speed is true. For the love of god, Nintendo, boost the speed up some and add VMX units or something. Geez, no wonder the 4A guy was so disappointed. While I'm sure plenty of multiplatform titles never pushed the Cell or Xenon to the max, having a highly refined memory system (supposedly) can only do so much. I'm a bit shocked the multiplatform titles on the Wii U are even capable of running at all. Any indication of huge CPU slowdown in any games like AC3 or Darksiders 2?
     