Wii U hardware discussion and investigation *rename

Discussion in 'Console Technology' started by TheAlSpark, Jul 29, 2011.

Thread Status:
Not open for further replies.
  1. Nightz

    Newcomer

    Joined:
    Sep 25, 2003
    Messages:
    240
    Likes Received:
    19
Do we know yet how much embedded RAM there is, and what the bandwidth on it might be?
     
  2. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
Nope. Rumours have consistently placed it at 32 MB, but given the POWER7 confusion, that may be a figure from nowhere. It's what I believe is in there, though. If it's significantly less (e.g. 10 or 16 MB), Wii U will have lost much of its main redeeming feature as regards system design.
     
  3. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    18,762
    Likes Received:
    2,639
    Location:
    Maastricht, The Netherlands
  4. babybumb

    Regular

    Joined:
    Dec 9, 2011
    Messages:
    609
    Likes Received:
    24
Is the eDRAM shared with the CPU?

I expected the memory bandwidth to be slightly up from the 360's to help with the tablet burden. But this is really something else, and it will badly affect performance.

The GPU is somewhat irrelevant now other than "features".
     
  5. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    Wow, this entire launch seems like a disaster.

As pointed out on GAF, it's now entirely possible Wii U will be in a worse position relative to PS4/720 than Wii was to 360/PS3 (considering it appears Wii U may be at or below current gen, whereas presumably Wii was >Xbox and certainly >PS2).

    I did say way back when I figured Nintendo would find a way to badly bottleneck the thing...

    And there's a 5GB, one hour plus download out of the box for basic functionality. If Microsoft launched a console in this state the internet would have burned down from the rage.
     
  6. wsippel

    Newcomer

    Joined:
    Nov 24, 2006
    Messages:
    229
    Likes Received:
    0
I don't know what it's for, but there seems to be another memory chip on the bottom side of the PCB, close to the MCM, manufactured by Hynix.


    The VGleaks specs are apparently spot-on and pretty much copied and pasted from the Nintendo SDSG website (warioworld.com). Assuming they are indeed correct, it's 32MB, and those 32MB are the "main RAM" - MEM1, as Nintendo calls it. The 2GB DDR3 are secondary memory (MEM2).
     
  7. Ika

    Ika
    Newcomer Subscriber

    Joined:
    Jun 3, 2012
    Messages:
    74
    Likes Received:
    17
Nice picture, thanks. Looks like the Wii-U uses some anisotropic filtering at least, because it looks a bit sharper.
     
  8. bomlat

    Regular

    Joined:
    Nov 5, 2006
    Messages:
    327
    Likes Received:
    0

Wii main RAM latency: roughly 6 ns random access, with the texture + framebuffer RAM requiring 4 ns latency.

So based on this, there has to be 32 MB of eDRAM in the GPU, at least at 500 MHz, and with at least a 4096-bit wide memory interface.

Simply put, the main memory latency is not good enough for Wii hardware emulation (it is slower than the embedded 24 MB of eDRAM in the Wii).
     
  9. Hey Alstrong became an internet celebrity for calculating the total bandwidth :)

Yeah, the floor textures seem a bit sharper, but the lower-res shadowmaps are a lot more noticeable, making the Wii U version look worse.
I don't know what resolution the X360 is outputting (sub-720p or not?), but it's certain that the Wii U isn't doing any better, which is a big letdown by itself.
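For reference, the total-bandwidth figure mentioned above is easy to rederive. A minimal sketch, assuming the rumoured 64-bit DDR3-1600 interface (neither figure is confirmed):

```python
# Peak theoretical bandwidth of a DDR3 interface.
# Assumptions (not confirmed hardware facts): 64-bit bus, 1600 MT/s.
def peak_bandwidth_gb_s(bus_width_bits, transfers_per_s):
    """Peak bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return bus_width_bits / 8 * transfers_per_s / 1e9

# 64-bit bus at DDR3-1600 (1.6e9 transfers per second):
print(peak_bandwidth_gb_s(64, 1.6e9))  # 12.8
```

That 12.8 GB/s is a peak figure; achievable bandwidth under mixed CPU/GPU access patterns would be noticeably lower.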
     
  10. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    That's not proven anywhere. All the components are on the same package, but we don't know how they are connected.
     
  11. Gipsel

    Veteran

    Joined:
    Jan 4, 2010
    Messages:
    1,620
    Likes Received:
    264
    Location:
    Hamburg, Germany
I doubt it. DDR3-1600, or whatever it is, has quite a bit higher latency for random access, especially if you don't just look at the memory but also include the memory controller. It is probably closer to 60 ns than 6 ns :roll:. GPUs can usually only dream of 60 ns memory latency; theirs is far higher (and probably for the Wii U GPU, too).

And stop making things up with no basis other than some almost irrelevant latency numbers!
     
  12. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,360
    Likes Received:
    1,377
    Indeed.
    I can understand if they have gone with a 64-bit interface to DDR3, but damn, that makes the achievable results extremely dependent on how well the MEM1 memory pool is utilized. That's bound to bite them, hard, on ports where the focus understandably will be on things other than ripping out and rewriting otherwise perfectly well functioning game code.
    I wonder if it would be possible to get a thread going on eDRAM usage and consequences.

    The WiiU design is extremely GPU-centric compared to any previous console design. It will be interesting to see how its future competitors are balanced.
     
  13. wsippel

    Newcomer

    Joined:
    Nov 24, 2006
    Messages:
    229
    Likes Received:
    0
    The latency figures come from Nintendo:

    http://www.nintendo.de/Kundenservic...Technische-Daten/Technische-Daten-619165.html
     
  14. Gipsel

    Veteran

    Joined:
    Jan 4, 2010
    Messages:
    1,620
    Likes Received:
    264
    Location:
    Hamburg, Germany
Yes [edit: you linked some GameCube specs, but close enough; the higher clock speed of the Wii reduced the latency to ~4 ns]. But he was answering a post about the Wii U's 64-bit DDR3 main memory (that's why I thought he referred to the Wii U; I probably stopped reading his latency posts very thoroughly :oops:). He tries to establish requirements for the emulation of the old Wii and claims the latencies are too high to get it done. I think this leads nowhere.

And btw., the Wii also had 64MB of GDDR3 as its largest memory pool, with very likely a much higher latency. Even the 24MB of 1T-SRAM probably had a higher latency than the much smaller texture and framebuffer pools (over a certain size, latency tends to be dominated by the wiring, not the memory cells themselves), especially since it is not on-die but has to be accessed over an external interface to a second die.
    left: Broadway, right: Hollywood with its components Vegas (GPU with 3 MB integrated 1T-SRAM) on top and Napa (1T-SRAM chip with 24MB) on the bottom

     
    #3214 Gipsel, Nov 18, 2012
    Last edited by a moderator: Nov 18, 2012
  15. TheLump

    Regular

    Joined:
    Jul 13, 2012
    Messages:
    280
    Likes Received:
    9

Yeah, the launch hasn't exactly been peachy, has it? That download and the slow OS aren't the best first impressions! Let's hope they sort the latter out.

Must admit I fail to see how the RAM speed changes Wii U's position going into the next gen that drastically. Not saying it doesn't matter - but it's what a lot of people were claiming it would be, isn't it?


Edit: Plus I think it's confirmed now that it's a 1GB download. Still massive though.
     
    #3215 TheLump, Nov 18, 2012
    Last edited by a moderator: Nov 18, 2012
  16. bomlat

    Regular

    Joined:
    Nov 5, 2006
    Messages:
    327
    Likes Received:
    0
Great, the Wii's 24 MB of embedded memory has less than 10 ns latency.

So it is not possible to use the main memory pool for Wii hardware emulation.

But it means the eDRAM has to be big enough to fit the Wii's 24 MB, so the minimum size is 32 MB (if we consider the framebuffer + the texture buffer as well).

There are four possible scenarios after this:
a) They use the texture cache as the texture buffer for legacy Wii games; in that case the minimum bandwidth is defined by the framebuffer data width: 64 bit per 1 MB of eDRAM, so a 2048-bit interface at 500 MHz (minimum).
b) They use the eDRAM for the texture buffer as well, and instead of 1 MB they use 4 MB (the low 256 KB of each megabyte) with 128 bit per 1 MB: a 4096-bit interface at 500 MHz minimum.
c) 2 MB texture + 2 MB frame + 24 MB main + 4 MB free → 256 bit per MB: an 8192-bit interface at 500 MHz minimum.
d) 1 MB texture + 2 MB frame + 24 MB main + 5 MB free → 512 bit per MB: a 16384-bit interface at 500 MHz minimum.

All of this assuming they didn't implement some fancy pre-compiler for the Wii games, with per-game microcode management.
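For what it's worth, the bus widths in those scenarios map to peak bandwidth like this. A sketch, assuming a single-data-rate on-die interface at the 500 MHz floor the post proposes (the widths and clock are the post's hypotheticals, not known hardware):

```python
# Peak bandwidth implied by an on-die eDRAM bus of a given width.
# Assumptions: single data rate, 500 MHz clock (both hypothetical).
def edram_bandwidth_gb_s(bus_width_bits, clock_mhz):
    """Peak bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return bus_width_bits / 8 * clock_mhz * 1e6 / 1e9

for width in (2048, 4096, 8192, 16384):
    print(width, edram_bandwidth_gb_s(width, 500))
# 2048 -> 128 GB/s, 4096 -> 256 GB/s, 8192 -> 512 GB/s, 16384 -> 1024 GB/s
```

Even the narrowest scenario would give an order of magnitude more bandwidth than the rumoured 64-bit DDR3 main memory.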
     
  17. bomlat

    Regular

    Joined:
    Nov 5, 2006
    Messages:
    327
    Likes Received:
    0
It doesn't make any sense.
The GC was designed to be able to read/write 4 times its memory every frame.
In this case the Wii U can read only a fraction of the main memory every frame.

Why did they implement this much memory?
It is simply a waste of money; 512 MB should be enough.
Most of the memory will just serve as a cache for disc reading.
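The per-frame arithmetic behind that claim can be sketched quickly. Assuming the rumoured figures (12.8 GB/s from a 64-bit DDR3-1600 bus, a 2 GB pool, 60 fps; none of these are confirmed), the bus can actually touch only about a tenth of the pool per frame:

```python
# How much of the main memory pool the bus can touch in one frame.
# Assumptions (rumoured, not confirmed): 12.8 GB/s peak bandwidth,
# 2 GB DDR3 pool, 60 fps target.
bandwidth_gb_s = 12.8        # assumed peak main-memory bandwidth
frame_time_s = 1 / 60.0      # one frame at 60 fps
mem_pool_gb = 2.0            # rumoured DDR3 pool size

gb_per_frame = bandwidth_gb_s * frame_time_s  # data movable per frame
fraction = gb_per_frame / mem_pool_gb         # share of the pool touched
print(round(gb_per_frame, 3), round(fraction, 3))  # 0.213 0.107
```

So the bulk of the 2 GB cannot be streamed through every frame regardless; it is only useful as resident working set and cache.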
     
  18. Gipsel

    Veteran

    Joined:
    Jan 4, 2010
    Messages:
    1,620
    Likes Received:
    264
    Location:
    Hamburg, Germany
    Your "arguments" are not making any sense. :roll:
     
  19. XpiderMX

    Veteran

    Joined:
    Mar 14, 2012
    Messages:
    1,768
    Likes Received:
    0
    Is there any "disassembly"? iFixIt?
     
  20. Ika

    Ika
    Newcomer Subscriber

    Joined:
    Jun 3, 2012
    Messages:
    74
    Likes Received:
    17
I'm not here to state whether the Wii-U is more powerful or not, because I can't provide proof for you, but I'm sure that these games are using ported engines, engines not built from scratch with the Wii-U in mind.
The other consoles, where the majority of the games and engines will come from at the beginning, are all GPU-bound hardware, while the Wii-U is a CPU-bound/limited console, so one would need an entirely different approach and different solutions to get the best out of it. (It would be nice to see Rage on the Wii-U, for example; I wonder if it would be possible to use GPU transcoding to "help" the CPU, like it's possible on the PC.)

I'm sure that sooner or later devs will learn how to achieve the same or better visuals than what you can see on the Xbox 360 or the PS3, because the GPU is probably more powerful (I assume).
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.