Wii U hardware discussion and investigation *rename

Discussion in 'Console Technology' started by AlBran, Jul 29, 2011.

Thread Status:
Not open for further replies.
  1. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
    How's it going guys?

    I'm quoting this post because it was the catalyst for the following "interview" I'm linking to. I joined to post these in case some of you missed it. In it lherre gives what info he can and I geared the questions around the GPU.

    This is where it starts.
    http://www.neogaf.com/forum/showthread.php?p=29833512#post29833512

    And this is where it essentially finishes at least with my responses.
    http://www.neogaf.com/forum/showthread.php?p=29863782#post29863782

    If you want to get the full context of it, just start with the first link and read through the exchanges beyond mine and lherre's.
     
  2. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    thanks.
    I should have written 785G, but that would only have been slightly less wrong.

    but still I wonder about the interface. do we expect it to be coherent? may we see a unified address space with common 32bit pointers?
    it's something like the FlexIO in the PS3, but actually working and more useful.
     
    #182 Blazkowicz, Aug 12, 2011
    Last edited by a moderator: Aug 12, 2011
  3. Heinrich4

    Regular

    Joined:
    Aug 11, 2005
    Messages:
    596
    Likes Received:
    9
    Location:
    Rio de Janeiro,Brazil

    Nice try. I would go this way too, and from what I understand you believe the hypothesis that the GPU has 400 SIMD / stream processors?

    More questions come to mind... just thoughts...

    But might Nintendo (I know they have a different culture, etc.), based on the impressions developers have given, surprise us with 640 or even 800 SIMDs?

    I read that the launch is scheduled for April 2012. Could they still "delay" the production schedule with the companies responsible for manufacturing their next-gen console (until November/December)?

    I read once (I remember many discussions) that the beta and final Xbox 360 SDKs were ready about 5 to 6 months before release, so I'd hypothesize that the Wii U may have changed its specs since last time.
     
  4. stiftl

    Newcomer

    Joined:
    Jun 24, 2006
    Messages:
    109
    Likes Received:
    9
    400 - 480, yes. I really can't think of more if it is in fact a SoC. Maybe 640 would be possible, but 800 is IMO out of sight (I am in no way an expert or insider, this is just based on looking @ transistor count / die size and the fact that it will be on 45nm).
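A rough sketch of the kind of die-size reasoning above, scaling linearly from the public die areas of contemporary AMD parts (RV770: 800 SPs at ~256 mm² on 55nm; RV740: 640 SPs at ~137 mm² on 40nm). The figures and the linear-scaling assumption are mine, not from the thread, and this ignores fixed-function blocks, memory interfaces, and any eDRAM, so it is at best an upper bound:

```python
# Back-of-envelope: how many stream processors plausibly fit in a given
# die-area budget, scaling from known AMD parts. All figures are public
# approximations; linear scaling is a deliberate oversimplification.

KNOWN_PARTS = {
    # name: (stream processors, die area in mm^2, process node in nm)
    "RV770": (800, 256.0, 55),
    "RV740": (640, 137.0, 40),
}

def estimate_sp_count(budget_mm2, reference="RV740"):
    """Naive linear estimate: SPs per mm^2 of the reference die,
    multiplied by the available budget. Overestimates, since it
    treats the whole die as shader array."""
    sps, area, _node = KNOWN_PARTS[reference]
    return int(budget_mm2 * sps / area)

for budget in (60, 100, 140):
    print(budget, "mm^2 ->", estimate_sp_count(budget), "SPs (upper bound)")
```

Even with generous budgets the estimate stays well under 800 SPs unless the GPU die is RV740-sized or larger, which matches the "640 maybe, 800 out of sight" intuition.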
     
  5. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
    You're thinking of the CPU. Unless I missed it along the way there has been no information provided about the size of the GPU.
     
  6. function

    function None functional
    Legend Veteran

    Joined:
    Mar 27, 2003
    Messages:
    5,135
    Likes Received:
    2,248
    Location:
    Wrong thread
    There's nothing ruling out CPU and GPU on the same die yet, and it could have a number of benefits. Even IBM's wording about what it is they're fabbing is vague, perhaps intentionally so...
     
  7. Heinrich4

    Regular

    Joined:
    Aug 11, 2005
    Messages:
    596
    Likes Received:
    9
    Location:
    Rio de Janeiro,Brazil
  8. Heinrich4

    Regular

    Joined:
    Aug 11, 2005
    Messages:
    596
    Likes Received:
    9
    Location:
    Rio de Janeiro,Brazil
    Same die at 32nm, anyone? Like the 360 S, or the PS2 Slim (EE + GS and, later, eDRAM on the same die)?
     
  9. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
    If you check out what I posted earlier, it won't be as easy to commit to it being any R700. Plus there are other things that show "borrowing" from at least the level of Evergreen. But yeah, 40nm would be the most likely process node.
     
  10. Heinrich4

    Regular

    Joined:
    Aug 11, 2005
    Messages:
    596
    Likes Received:
    9
    Location:
    Rio de Janeiro,Brazil

    Very interesting that the Wii U's GPU was described as a Frankenstein (a mix of many cores, SIMDs, etc.). It would be interesting to see something similar, further developed, in the APUs ahead if Sony or MS adopt them.
     
  11. brain_stew

    Regular

    Joined:
    Jun 4, 2006
    Messages:
    556
    Likes Received:
    0
    :wink:
     
  12. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    I'm thinking we may see both an eDRAM L2 and an eDRAM "big chunk", so that the L2 can be low-latency and tuned enough.
     
  13. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,059
    Likes Received:
    1,021
    From IBM's press release
    Incidentally, that's pretty much the only solid information we have directly from the horse's mouth. I don't see much vagueness to be honest; they could hardly be expected to say "but please note that we will not integrate an AMD GPU on the die" even if they won't. It's a press release about an IBM processor, they won't be talking about what the product isn't! Of course, that does leave room for the idea that it might include a GPU, but...
    If the die does contain a GPU, it would be a first on SOI as far as I know, and somewhat newsworthy, and as IBM wants to bang their custom silicon drum I'd expect them to talk about it.
    Making a full custom chip like that with IP from two sides, and making it work, is one heck of a lot more complex than letting either company make their own chip, on familiar process nodes, and then letting the chips talk over an agreed upon interface. MUCH easier from a collaborative standpoint, less likely to run into deadline crushing snags, the individual dies are smaller and it is much safer to assume decent yields, and a problem on one end doesn't hold up either design work or production on the other.
    It is not a coincidence that the 360 didn't integrate CPU and GPU until two process nodes down the line.
    While nothing official explicitly denies integrated CPU and GPU, there is nothing that confirms it either:
    From AMD's press release
    Also note that there is NO mention of IBM in AMD's press release and NO mention of AMD in IBM's. For such a noteworthy collaborative effort as a joint design of a novel processor in service of a mutual customer, that's simply unheard of.
     
  14. wsippel

    Newcomer

    Joined:
    Nov 24, 2006
    Messages:
    229
    Likes Received:
    0
    I/O (SATA, USB etc) is AMBA AHB based, so most likely handled by an ARM coprocessor.
     
  15. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
    I agree. I'm not as technically advanced as most of you here (I've been working to improve that after a long time away from tech knowledge), but with him saying only one thing in the final kit could probably be taken as an R700, my first inclination right now would be the SPs, with 800 SPs being an amount (although high) that could cause him to say something like that. The second would be memory, as I get the feeling GDDR5 won't be used.

    True. But Nintendo loves their NDAs a lot. I can see something where Nintendo's CPU/GPU is based off the XCGPU, or vice versa, considering Xenon used PPEs yet came out before the PS3. I know wsippel and I have debated this elsewhere, but incorporating some of his previous views I could see something where the CPU has a large amount of eDRAM for L2 cache (16MB, so to speak) and there would also be another pool of 1T-SRAM (24-32MB) like Xenos' eDRAM.

    I once read somewhere, and I need to find it for my sanity, that IBM's process for implementing the eDRAM on the chip would allow either 1 or 3 Gigabits (or maybe it was listed as Gibibits, can't remember that either but doesn't seem to matter too much) to be placed on a chip. With the A2 already having 8MB for L2 cache, I would assume doubling that amount is a reasonable task.
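A quick unit sanity-check of my own on those figures (assuming the binary-unit convention; the conversion, not the capacity claims, is the point): even the smaller 1 Gigabit figure dwarfs the cache sizes being discussed, since the A2's 8 MB L2 is only 64 Megabits.

```python
# Unit conversions for the eDRAM figures discussed above.
# Binary convention assumed: 1 gigabit = 1024 megabits; 8 bits per byte.

def gbit_to_mbyte(gbit):
    """Convert a capacity in gigabits to megabytes."""
    return gbit * 1024 / 8

def mbyte_to_mbit(mbyte):
    """Convert a capacity in megabytes to megabits."""
    return mbyte * 8

print(gbit_to_mbyte(1))    # 1 Gbit -> 128.0 MB
print(gbit_to_mbyte(3))    # 3 Gbit -> 384.0 MB
print(mbyte_to_mbit(8))    # the A2's 8 MB L2 is only 64 Mbit
print(mbyte_to_mbit(16))   # doubling to 16 MB is still only 128 Mbit
```

So whatever the exact per-chip limit was, doubling an 8 MB L2 sits comfortably inside it, which supports the "reasonable task" conclusion.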
     
  16. function

    function None functional
    Legend Veteran

    Joined:
    Mar 27, 2003
    Messages:
    5,135
    Likes Received:
    2,248
    Location:
    Wrong thread
    If it's not a CGPU then I guess they could, but I rather feel that Nintendo won't be in a hurry to shrink or combine, and may never do it unless heat and cooling issues encourage them to do so.

    Compared to the 360 S and the PS3 Slim, the WiiU will already be small, with low power consumption and low heat output, and the benefits Nintendo could get from a shrink should be small compared to the 90nm and 65nm PS360s. Perhaps smaller than for the current PS360s too.

    AFAIK, Nintendo didn't shrink the N64, GC or Wii even though they could have. But when your case is small, you don't generate much heat, your PSU and regulation stuff is already cheap and you already have small fans and cheap, quiet cooling then the cost benefits aren't there and so the penalty of using a newer and more expensive process isn't worth it.

    Unless Nintendo have accidentally made something that's a bit of a first-gen Xbox 360 (overheating, needing 4 tons of heatsinks + emergency heatsink, self-destructing, fans maxing out and causing ear-bleed) I think Nintendo will be happy to sit back and let the launch system run its course. Mind you, there have been those stories about overheating... if they can't sort them I guess it could be a downclock now or a shrink later (or both).

    I still find it odd that IBM didn't refer to making a CPU or "central processor"; they talk about making a processor that is the "heart" of the system. This is a press release for general consumption too. They have no problem talking about having made the Gamecube "central processor" in the same press release. That feels like deliberate vague-ness to me. But perhaps I'm reading too much into this.

    I think it'd be a pity if the GPU couldn't make use of all the eDRAM that's supposed to be on the IBM processor at the "heart" of the Wii U.

    I thought Llano was SOI btw?

    I don't think a WiiU CGPU would be as big as an all-in-one 90 nm or 65 nm 360. AMD didn't have experience of developing Fusion products then either. IBM have also cut their teeth on the "45nm CPU + GPU SoC" now.

    I don't think AMD (or IBM) would dare confirm something that Nintendo wanted secret! But they wouldn't be dishonest about it either. Has anyone been able to directly ask AMD or IBM about any of this?

    In the case of a CGPU, AMD wouldn't be able to mention it for the same reason IBM mumbled something about making the microprocessor which is the heart of the WiiU (but yeah they made the central processor for the Gamecube). However cool it might be, if Nintendo wanted it secret (and Nintendo like secrets) it won't be talked about.
     
  17. RudeCurve

    Banned

    Joined:
    Jun 1, 2008
    Messages:
    2,831
    Likes Received:
    0
    Makes sense. If you start out small, you can't really shrink it much further. Also, if your console isn't designed to have long legs, it doesn't make sense to spend money on a shrink even if you could.
     
  18. sfried

    Regular

    Joined:
    Apr 9, 2006
    Messages:
    542
    Likes Received:
    2
  19. MDX

    MDX
    Newcomer

    Joined:
    Nov 28, 2006
    Messages:
    206
    Likes Received:
    0
    What about MoSys Bandwidth Engine?

    for a SoC or SoP?


    http://siliconcowboy.wordpress.com/2010/06/15/mosys-fires-up-the-bandwidth-engine/
     
  20. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,965
    Likes Received:
    4,554
    MoSys 1T-SRAM it is, then?


    I guess this nullifies the chances of getting high amounts of RAM (>=2GB) using dirt-cheap G/DDR3.

    Though it's still unknown whether the 1T-SRAM will be for graphics, the system, or both.
     
