Wii U hardware discussion and investigation *rename

Discussion in 'Console Technology' started by AlNets, Jul 29, 2011.

  1. function

    function Wrong thread
    Veteran

    Joined:
    Mar 27, 2003
    Messages:
    4,197
    Question for anyone that knows: is anyone other than IBM fabbing large, relatively fast processors with large amounts of edram on them yet?

    Also: do we know for definite that the GPU has direct access to the edram?
     
  2. Earendil

    Newcomer

    Joined:
    Feb 8, 2012
    Messages:
    56
    @Shifty Geezer

    Thanks
     
  3. wsippel

    Newcomer

    Joined:
    Nov 24, 2006
    Messages:
    229
    It is an SoC, but then again, so was Hollywood. Hollywood consisted of the GPU, an ARM core responsible for IO and security, and an audio DSP. The main CPU wasn't part of the SoC (and the ARM core couldn't access the GPU, but that's a different story).
     
    #523 wsippel, Feb 16, 2012
    Last edited by a moderator: Feb 16, 2012
  4. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    11,256
    In that case every chip is an SoC, but that's not what's meant by the term. Are you saying the CPU and GPU are on one die?
     
  5. darkblu

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,642
    As large as IBM's? Unlikely - the 1Mb edram macro making that possible is IBM tech they've been perfecting for a few POWER generations now - POWER4 already relied on a massive (off-die) edram L3. I suppose IBM are the de facto market leader ATM.

    We don't, but I'd expect the GPU's access to be at least as 'direct' as Xenos'. For reference, Yamato (Xenos' little cousin in the handheld space) has far more direct access to its sram/edram tile buffer.
     
  6. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    5,930
    And if it does, how will you look?
    Without hard proof, and admittedly being driven by some "anti-spec" bias, you can only go so far in defending your point of view.

    This discussion reignites whenever there's a new wave of rumours and repeats itself for about a week. The same people with the same arguments...
    It's getting really repetitive...
     
  7. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    11,256
    Won't E3 be grand though? :razz:

    I've believed all along that we'll get a real picture of the Wii U BEFORE E3, via leaks of either game footage or specs, but who knows, I guess.

    And yeah, if it is 640 SPs or more, obviously I'll admit I was "wrong" (though if it's exactly 640 clocked at 400 MHz or less, it gets into a gray area whether that's impressive to me).
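
    To put rough numbers on that gray area (Python; the 640 SP / 400 MHz figures are just the rumoured values, and the usual 2 FLOPS per lane per clock for a MADD is assumed):

    Code:
    # Hypothetical 640-SP part at 400 MHz vs Xenos
    # (48 ALUs x 5 lanes x 2 ops x 500 MHz).
    wiiu  = 640 * 2 * 400e6 / 1e9      # 512.0 GFLOPS
    xenos = 48 * 5 * 2 * 500e6 / 1e9   # 240.0 GFLOPS
    print(wiiu / xenos)                # ~2.13x on paper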

    I guess the other thing I'd worry about is that even if the GPU is OK, it may be crippled by slow RAM, considering the leaks point at DDR3.
     
  8. AlNets

    AlNets ¯\_(ツ)_/¯
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    17,818
    Location:
    Polaris
    Well, presumably the slow ram is mitigated by the edram for the frame buffer (possibly texture ops as well).
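
    Some rough numbers on why that helps (all assumptions: a 720p/60 target, 4 B colour + 4 B Z per pixel, ~3x overdraw, and a 64-bit DDR3-1600 main bus - none of this is confirmed):

    Code:
    # Framebuffer traffic at 720p60 vs an assumed 64-bit DDR3-1600 bus.
    # Per pixel: 4 B colour + 4 B Z, read+write, ~3x overdraw; x4 for 4xMSAA.
    px      = 1280 * 720
    fb      = px * (4 + 4) * 2 * 3 * 60 / 1e9   # ~2.7 GB/s
    fb_msaa = fb * 4                            # ~10.6 GB/s
    bus     = (64 / 8) * 1.6e9 / 1e9            # 12.8 GB/s
    print(fb, fb_msaa, bus)

    With MSAA the framebuffer alone could eat most of such a bus, so keeping it in edram would leave almost all of main memory bandwidth for textures and the CPU.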
     
  9. function

    function Wrong thread
    Veteran

    Joined:
    Mar 27, 2003
    Messages:
    4,197
    Thanks. I keep coming back to the idea of an IBM-manufactured CGPU with the edram included on die, even though that's currently an unpopular hypothesis. We know IBM are making the CPU and edram, there's currently no GPU fab named, AMD are big into their Fusion stuff and worked on the 360S, and putting everything on one chip should be the most power-efficient way of combining everything, and so ... ?

    This is baseless speculation again, but I could quite imagine the Wii U CPU having a lower theoretical peak FLOPS than Xenon while being faster in practice, and for less developer effort.
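
    On paper that comparison might look something like this (the Xenon side uses the usual 4-wide VMX128 FMA peak; the Wii U side is pure guesswork - say three Broadway-style cores at 729 MHz with 2-wide paired singles):

    Code:
    # Theoretical peak = cores x clock x SIMD width x 2 (FMA).
    xenon = 3 * 3.2e9 * 4 * 2 / 1e9   # 76.8 GFLOPS peak
    wiiu  = 3 * 729e6 * 2 * 2 / 1e9   # ~8.7 GFLOPS peak (guess)
    print(xenon, wiiu)

    Peak figures like these say nothing about in-order vs out-of-order execution or pipeline depth, which is exactly why "faster in practice" is plausible despite the gap.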
     
  10. wsippel

    Newcomer

    Joined:
    Nov 24, 2006
    Messages:
    229
    No idea. I know a former AMD engineer is referring to the chip as an "SoC", and there's once again an ARM core on the GPU die. That's all I know. Someone from IBM's Entertainment Processors division was working on a 32nm VLSI project, which might refer to the Wii U chip, or to a 360 shrink or Oban.
     
  11. Teasy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    4,563
    Location:
    Newcastle
    How's that? Every chip certainly doesn't include a graphics core, a CPU core, an audio DSP and a significant amount of embedded RAM like Hollywood does.

    The leaks? You're talking as if all the leaks point at DDR3 - they don't.
     
    #531 Teasy, Feb 19, 2012
    Last edited by a moderator: Feb 19, 2012
  12. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    5,930
    Maybe the "SoC" term is used for the fact that it has an applications processor, memory controller, embedded memory and GPU - even though the applications processor never gets to be used for applications per se.

    Could that be a confirmation that it has a "Starlet 2"?
    For the extra dedicated graphics+sound upstream, maybe a Cortex A5?
     
  13. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    There was this rumor all the way back in 2010.

    The PR I've seen from Marvell in the past about the processor makes it sound like something Nintendo would want.
     
  14. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,363
    It's too weak - a damn slow CPU won't give you 1.5x an Xbox 360.
    Would you want to make a console with a four-core Atom? I guess not. It also comes with a useless GPU.
     
  15. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    :?:

    It would be used as the I/O controller like the ARM core (Starlet) in Wii.
     
  16. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    5,930
    It also connects pretty well to AMD's last conference, where there were subtle claims that they're working on interconnecting their GPUs with ARM CPUs. That could be a byproduct of their current efforts with the Wii U's "SoC".

    How efficient would it be to have a dedicated ARM CPU for running the graphics driver, memory controller and I/O?
    Would a very fast interconnect between a slow ARM CPU and the GPU be better than a slower interconnect between a fast PowerPC and the GPU, for some tasks?


    Looking at Marvell's portfolio, I can't really find anything that "fits". The Armada 500 and 600 are out of the question, since they all have 3D GPUs.
    Armada 168 is only ARMv5 (though that's just like Starlet, so BC would be maintained), but it does go all the way to 1 GHz.



    It would be a replacement for the ARM9 in the Wii, not the PowerPC. Besides, the main CPU is already confirmed to be coming from IBM.
    You might want to google for "Wii Starlet" to see what we're talking about.
     
  17. stiftl

    Newcomer

    Joined:
    Jun 24, 2006
    Messages:
    106
    This sounds like something Nintendo would do.
     
  18. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Actually the quad-core being talked about in the rumor article is the Armada XP. This is the one I saw the PR on, and it sounded like something Nintendo would want for the Wii U.

    http://www.marvell.com/embedded-processors/armada-xp/

    It also has a dual-core version.
     
    #538 bgassassin, Feb 20, 2012
    Last edited by a moderator: Feb 20, 2012
  19. DeadlyNinja

    Veteran

    Joined:
    Jan 3, 2007
    Messages:
    1,221
    I hate to say this, but I think you guys are going to be pretty disappointed this E3. I have a feeling Nintendo is not going to release their specs. They didn't do so for the Wii, and even with fairly respectable hardware like the 3DS, all we can rely on are leaks - vague leaks at that. I don't think Nintendo will give out specs for the Wii U, whether it's a billion times the power of current gen or a billion times weaker than your calculator.
     
  20. function

    function Wrong thread
    Veteran

    Joined:
    Mar 27, 2003
    Messages:
    4,197
    Nintendo seem to give out their specs when they make them look good.

    That's why they don't often give out their specs. :eek:
     