Wii U hardware discussion and investigation *rename

Discussion in 'Console Technology' started by TheAlSpark, Jul 29, 2011.

Thread Status:
Not open for further replies.
  1. wsippel

    Newcomer

    Joined:
    Nov 24, 2006
    Messages:
    229
    Likes Received:
    0
    Rumor has it that the eDRAM isn't just a framebuffer. Depending on what else you're using it for, you might simply not have enough space left for real AA (Batman and Trine use FXAA).
     
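To see why a framebuffer-plus-extras eDRAM pool might not leave room for "real" (multisample) AA, here's some back-of-the-envelope arithmetic. The 32 MB pool size is the commonly rumored figure, not a confirmed spec, and the sketch assumes a plain 32-bit color + 32-bit depth target per sample:

```python
# Rough framebuffer budget arithmetic. Assumes a 32 MB eDRAM pool (rumored,
# not confirmed) and 4 bytes color + 4 bytes depth/stencil per sample.

def fb_bytes(width, height, bpp_color=4, bpp_depth=4, samples=1):
    """Color + depth size for one render target, in bytes."""
    return width * height * (bpp_color + bpp_depth) * samples

EDRAM = 32 * 1024 * 1024  # assumed pool size

no_aa  = fb_bytes(1280, 720)             # ~7.0 MB
msaa4x = fb_bytes(1280, 720, samples=4)  # ~28.1 MB

print(f"720p, no AA:   {no_aa / 2**20:.1f} MB")
print(f"720p, 4x MSAA: {msaa4x / 2**20:.1f} MB "
      f"(leaves {(EDRAM - msaa4x) / 2**20:.1f} MB for everything else)")
```

A 4x MSAA target alone eats nearly the whole assumed pool, while FXAA is a post-process on the resolved buffer and needs no extra sample storage, which would fit the Batman/Trine observation.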
  2. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
This has so much the ring of truth, because it's just so Nintendo to shatter hardware expectations, in the wrong direction.

I'd pretty much bet money it's the case.
     
  3. french toast

    Veteran

    Joined:
    Jan 5, 2012
    Messages:
    1,667
    Likes Received:
    9
    Location:
    Leicestershire - England
This issue of multi-platform will hurt Nintendo. They should have set out a GPU/CPU spec 50% better than the current HD consoles with 2GB of RAM, with the emphasis on future cross-platform compatibility.

That way they could have been top dog for a year to build a good install base, then made the Wii U an attractive, easy port in the future when it isn't the lead platform.
     
  4. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
It doesn't even seem like decent hardware is costly (same goes for Microsoft if they are actually considering Cape Verde over Pitcairn; I mean, Pitcairn is 210mm²).

Cape Verde pushes a teraflop plus and is 123mm². Too much? I'm sure they could have grabbed teraflop hardware from the R700 line even cheaper.

It's just that Nintendo is not interested, just like last time with Wii.
     
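The "teraflop plus" claim is easy to sanity-check with peak-FLOPS arithmetic: GCN ALUs do one FMA (2 ops) per cycle. The ALU counts and clocks below are the retail HD 7770 GHz Edition and HD 7870 figures; an actual console bin could clock differently:

```python
# Peak single-precision throughput: ALUs x 2 ops (FMA) x clock.
def peak_gflops(alus, clock_mhz):
    return alus * 2 * clock_mhz / 1000

# Cape Verde as shipped in the HD 7770 GHz Edition: 640 ALUs @ 1000 MHz
print(peak_gflops(640, 1000))   # 1280.0 GFLOPS -> "a teraflop plus"
# Pitcairn as shipped in the HD 7870: 1280 ALUs @ 1000 MHz
print(peak_gflops(1280, 1000))  # 2560.0 GFLOPS
```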
  5. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    Yep, just like Wii. Only this time they haven't even got BC as an excuse. We'll have to see what their launch price is like to see if they could have been more adventurous.
     
  6. wsippel

    Newcomer

    Joined:
    Nov 24, 2006
    Messages:
    229
    Likes Received:
    0
The problem is that Espresso also said the chip would be as fast as Xenon. I have no idea how you'd ever reach that level of performance with a low-clocked (closer to 729MHz than 3.2GHz, also according to Espresso), single-threaded (again: Espresso) three-core 750.
     
  7. Does it make any tiny bit of sense to just glue together three 10-year-old CPUs?

    I'm pretty sure that general performance/transistor should be quite a bit higher on newer architectures.
    Would IBM sell a PowerPC 750 architecture license for much less than a Power6 or Power7?

The Wii might have been an up-clocked GameCube with more memory (though some said they doubled the number of TEV units?), so they could afford to just reuse the same core, but the Wii U is a whole different animal. Would they save that much money by skipping technological advances here?
     
  8. Prophecy2k

    Veteran

    Joined:
    Dec 17, 2007
    Messages:
    2,468
    Likes Received:
    379
    Location:
    The land that time forgot
From what I hear (and I trust my source on this as reliable), the CPU chip in the Wii U is almost certainly weaker than Xenon and Cell, which is why some devs have outright refused to port their existing (in-development) PS360 games to it. I believe Nintendo intended the GPU compute support to make up for their weak CPU choice, but the CPU performance is so bad that devs would be required to refactor their entire engines to get stuff working on the GPU instead, and for many devs that would be impossible on their current project schedules.
    :wink:
     
  9. wsippel

    Newcomer

    Joined:
    Nov 24, 2006
    Messages:
    229
    Likes Received:
    0
    Interesting. So the system isn't all that conventional after all, huh?
     
    #1249 wsippel, Jun 11, 2012
    Last edited by a moderator: Jun 11, 2012
  10. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
    Well, that's... just sad (though things pointed in this direction). I wonder if that was the primary focus of the tweaking period the 5th kit went through.

    EDIT: Also goes back to what I said about Nintendo and their pursuit of "balance" in their hardware. Looks like it bit them even worse than I originally thought.

    EDIT 2: But I am glad to see the emphasis on a GPGPU. I didn't expect that from them.
     
    #1250 bgassassin, Jun 11, 2012
    Last edited by a moderator: Jun 11, 2012
  11. steviep

    Newcomer

    Joined:
    Mar 2, 2012
    Messages:
    141
    Likes Received:
    0
If the CPU is literally Broadway x3 at 45nm plus more cache, this would be the case.

If it is, however, something in the 470 line, then this wouldn't really be the case. The IBM embedded stuff like the 470S would be deceptive in that regard: your friend could see a clock rate of "1.6GHz" or "1.8GHz" or something and assume that it's much worse than the 3.2GHz Xenon, but in reality the OOOE 470 line would thrash Xenon in most general-purpose code. Just looking at it on paper (i.e. "holy crap, it's only 1.6GHz!?!"), though, wouldn't give any non-programmer that idea, and they'd instantly assume it's a far weaker unit.
     
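The clock-versus-IPC point can be made concrete with a toy calculation. The sustained-IPC numbers below are illustrative assumptions for branchy general-purpose code, not measurements of Xenon or the PPC 470:

```python
# Illustrative only: effective throughput = clock x sustained IPC.
# IPC figures are assumed for the sake of argument, not measured.
def effective_mips(clock_ghz, sustained_ipc):
    """Millions of instructions retired per second, rounded."""
    return round(clock_ghz * sustained_ipc * 1000)

# In-order core stalling on cache misses and branches vs. an
# out-of-order core at half the clock hiding much of that latency.
in_order_3200mhz = effective_mips(3.2, 0.2)
ooo_1600mhz      = effective_mips(1.6, 0.8)
print(in_order_3200mhz, ooo_1600mhz)  # the lower-clocked core wins 2:1
```

On paper the 3.2GHz part looks twice as fast; under these (assumed) sustained-IPC figures it retires half the instructions.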
  12. babybumb

    Regular

    Joined:
    Dec 9, 2011
    Messages:
    609
    Likes Received:
    24
Now true believers will claim the GPU is so much better it doesn't matter... COMPUTE shaders, believe!

The Wii U will crash and burn next spring if support is much better on cheaper machines. Nintendo can't afford this CPU.
     
  13. steviep

    Newcomer

    Joined:
    Mar 2, 2012
    Messages:
    141
    Likes Received:
    0
And Jaguar cores in the other consoles would make them right, wouldn't they?

By your account, the Wii U has been crashing and burning for what... two years now?
     
  14. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
    You might want to fix that sig unless you believe Tim Sweeney is also laughable. :wink:
     
  15. Warchild

    Newcomer

    Joined:
    Jun 8, 2011
    Messages:
    219
    Likes Received:
    119
So much for ease of porting games. Has Nintendo finalized the CPU yet, or could we see a change before it launches? If not, why would Nintendo do this?
     
  16. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,176
    Location:
    La-la land
Maybe they're all masochists on the Nintendo board, and they love to have pain and suffering inflicted upon themselves?

I can't say that the prospect of the GameCube CPU making yet another appearance in a console appeals to me. Iwata's an electronics engineer, and I have always respected him for that, but has the man simply gone INSANE? The basic design of that CPU is like fifteen years old now, or close to it anyway. Surely there are ARM cores today with higher performance both per watt and per transistor than that old PPC nag.
     
I don't believe even for a second that Nintendo decided to bet on GPGPU, given its current state of development and adoption.

    I do believe that the presence of a fairly advanced dedicated sound DSP (X-Fi-like?), along with a souped-up "Starlet 2", allowed for a simpler CPU. Maybe the "SoC" that many people keep talking about does have a fairly powerful ARM CPU (multi-core?) in it that could offload some tasks from the skimpy "main" CPU.

    That would totally cock-block ports from the X360 and PS3, though, which would go against Nintendo's main selling points for the new console... again.
     
  18. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    'Weak' CPU is relative. Xenon and Cell are pretty weak for conventional code, but strong in floating point. It'd be possible for a less potent FP performer to still be stronger in useful code.

    If devs are offloading CPU vector maths onto the GPU, that would account for results being well below what the hardware could do as devs have to refactor their code once again. As if XB360 and PS3 ports weren't trouble enough. I can also see why they wouldn't bother with Wuu - the market for the same titles will be a tiny fraction of PS360's market. Is it really worth the effort? And if you're considering developing for Wuu and chatting with other dev mates in other studios you hear that they aren't going to bother, that only snowballs the effect.

    Of course, GPGPU would go against Nintendo's claims that they want to make the platform dev friendly. It'd also be a strange change for them.
     
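The refactoring cost described above is worth making concrete: CPU-friendly per-object code doesn't map onto a GPU compute pass as-is. This toy sketch (all names illustrative, not from any real engine) contrasts the two shapes of the same update:

```python
# CPU-style: a scalar loop over heterogeneous objects with a branch.
# Easy to write, but a poor fit for wide data-parallel hardware.
def update_cpu(objects, dt):
    for o in objects:
        if o["alive"]:
            o["x"] += o["vx"] * dt

# GPU-style: structure-of-arrays, one uniform operation over the whole
# batch, with the branch folded into arithmetic (a 0/1 mask replaces `if`).
# This is the layout a compute shader dispatch wants.
def update_gpu_style(xs, vxs, alive, dt):
    return [x + vx * dt * a for x, vx, a in zip(xs, vxs, alive)]
```

Moving an engine from the first form to the second means reworking data layouts and control flow throughout, which is exactly the schedule-killing refactor being described.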
  19. stiftl

    Newcomer

    Joined:
    Jun 24, 2006
    Messages:
    118
    Likes Received:
    11
Pulling this totally out of my a**, but what are the chances that the C1 core (the one with 2MB of L2 cache) is in fact a module with 4 cores, or at least 4-thread SMT? What else would the huge L2 be good for?
     
  20. Nightz

    Newcomer

    Joined:
    Sep 25, 2003
    Messages:
    240
    Likes Received:
    19
    This is just insane, if true.

In an ideal world Nintendo would just give IBM and ATI a set price point and let them use their best technologies to come up with hardware without interfering. Nintendo aren't really at the cutting edge of 3D tech, and their internal hardware dev team arguably don't always make the right choices with it.
     


  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.