Wii U hardware discussion and investigation *rename

Discussion in 'Console Technology' started by TheAlSpark, Jul 29, 2011.

Thread Status:
Not open for further replies.
  1. Zeross

    Regular

    Joined:
    Jun 3, 2002
    Messages:
    289
    Likes Received:
    26
    Location:
    France
    I would say earlier than 2015, at least for tablets. Check this article: http://www.anandtech.com/show/6877/the-great-equalizer-part-3

    An iPad 4 is already competitive with the G7x GPU in modern benchmarks with complex shaders. Consoles still have the advantage of being fixed hardware with much lower-level access, and thus a better level of optimization (and we know the Cell comes in handy to help the RSX), so this is not an apples-to-apples comparison, but it gives a broad idea.

    If the iPad 5 in 2013 doubles the graphics performance of the iPad 4 and the iPad 6 does the same in 2014, the PS3/360/Wii U triplet will be left in the dust.
     
  2. IMO, there will be no need for eDRAM:
    [image] [image]

    By 2015 we'll probably have LPDDR4 mature enough to do 34 GB/s, and not much later WideIO2 could double that.
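As a sanity check on that 34 GB/s figure, here's the peak-bandwidth arithmetic, assuming a 64-bit interface at LPDDR4's eventual top rate of 4266 MT/s (both parameters are my assumptions, not from the post):

```python
# Peak memory bandwidth = transfers/s x bytes per transfer.
# A 64-bit LPDDR4 interface at 4266 MT/s is an assumed configuration.
def bandwidth_gbs(transfer_rate_mts, bus_width_bits):
    """Peak bandwidth in GB/s for a given transfer rate and bus width."""
    return transfer_rate_mts * 1e6 * (bus_width_bits / 8) / 1e9

print(bandwidth_gbs(4266, 64))  # ~34.1 GB/s
```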
     
  3. Pressure

    Veteran

    Joined:
    Mar 30, 2004
    Messages:
    1,655
    Likes Received:
    593
    Doesn't the iPad 4th generation already have 17 GB/s bandwidth?

    [image]
     
  4. Apple A5X and A6X use a quad-channel memory controller.

    That said, the A5X uses quad-channel LPDDR2 800MT/s for 12.8GB/s and the A6X uses quad-channel LPDDR2 1066MT/s for 17GB/s.

    I don't know of any other popular mobile SoC that uses quad-channel memory, though.
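Those figures line up with the simple peak-bandwidth formula; a quick check, assuming four 32-bit channels (a 128-bit effective bus):

```python
# Peak bandwidth of a quad-channel LPDDR2 setup:
# channels x channel width x transfer rate (topology as assumed above).
def peak_gbs(mts, channels=4, channel_bits=32):
    """Peak bandwidth in GB/s for a multi-channel memory interface."""
    return mts * 1e6 * channels * channel_bits / 8 / 1e9

print(peak_gbs(800))   # A5X: 12.8 GB/s
print(peak_gbs(1066))  # A6X: ~17.1 GB/s
```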
     
  5. MystWalker

    Newcomer

    Joined:
    Sep 19, 2012
    Messages:
    21
    Likes Received:
    0
    I didn't see this posted yet, but it's pretty interesting. According to this, applications and games don't have direct access to the 32MB of eDRAM (MEM1) on the GPU.

    http://www.vgleaks.com/wii-u-memory-map/

    I assume, then, it must act as a (very) large cache and cannot be directly written to or read from as many of us had assumed. Surely, the eDRAM would have to be accessible by the GPU in some way though, even if it is completely managed by the graphics driver (not sure if this is the right term). I can't see 360/PS3 games running as well as they are if the GPU was confined to only working out of MEM2. What do the rest of you think the implications of this could be?
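For scale, here's rough render-target arithmetic against those 32MB (my numbers, not from the leak): a 720p 32-bit color buffer plus a 32-bit depth/stencil buffer comes to about 7 MiB, so MEM1 has room for several such buffers however it ends up being exposed.

```python
# Rough 720p render-target sizing vs. Wii U's 32 MB MEM1.
# Illustrative arithmetic only, not from the vgleaks memory map.
def buffer_mib(width, height, bytes_per_pixel):
    """Size of one render target in MiB."""
    return width * height * bytes_per_pixel / (1024 * 1024)

color = buffer_mib(1280, 720, 4)  # 32-bit color: ~3.5 MiB
depth = buffer_mib(1280, 720, 4)  # 32-bit depth/stencil: ~3.5 MiB
print(color + depth)              # ~7.0 MiB of 32 MiB
```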
     
  6. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    You still need stacked memory on an interposer if you want to get really fast (and it's a bit more power efficient too). Though that chip might need to be in an Ouya or similar to avoid being throttled down by power and heat limitations.
    I bet such a mobile chip made in the late 2010s would be about half as fast as an Xbox One.
    (Yep, it's the WideIO2 you mention, which is not really classical memory; I feel it's halfway between external memory and eDRAM.)
     
  7. Exophase

    Veteran

    Joined:
    Mar 25, 2010
    Messages:
    2,406
    Likes Received:
    430
    Location:
    Cleveland, OH
    Being managed by the graphics library doesn't actually say that much. It could still be possible to explicitly allocate render buffers and textures into it. It's also possible the library can return raw pointers to objects allocated in it.

    It's pretty messed up that the Wii U has 2GB of memory but somehow doesn't make it all accessible in the virtual address space, despite having plenty of room for it.
     
  8. Thesuffering

    Newcomer

    Joined:
    Jan 29, 2013
    Messages:
    5
    Likes Received:
    0
    http://hdwarriors.com/wii-u-specs-f...gpu-several-generations-ahead-of-current-gen/

    These guys will do an interview with Shin'en's Manfred Linzner soon. While I don't doubt the Wii U is no match for the PS4 or the Xbone, I think it has stronger hardware than the PS3 or 360, as shown by EA's ports from its Chameleon engine being better graphically.
     
  9. Brad Grenz

    Brad Grenz Philosopher & Poet
    Veteran

    Joined:
    Mar 3, 2005
    Messages:
    2,531
    Likes Received:
    2
    Location:
    Oregon
    This isn't really news. Xbox 360 was a first-generation unified shader design; the Wii U uses a fourth-generation iteration of the idea, ergo it is "several generations ahead". That doesn't mean it is significantly more powerful. Otherwise they're saying really obvious things. No shit you should use the eDRAM. And you shouldn't ignore the L2 caches? Not exactly profound.
     
  10. Exophase

    Veteran

    Joined:
    Mar 25, 2010
    Messages:
    2,406
    Likes Received:
    430
    Location:
    Cleveland, OH
    Nobody ignores L2 cache, since it works on your behalf automatically. But most programmers don't carefully size their buffers and tune their algorithms to try to fit within a fixed L2 size. They may have for Xbox 360/PS3, again out of necessity, but moving to Wii U the cache size and partitioning are quite different. The L2 is probably also substantially lower latency, meaning you can afford to lean on it more directly instead of trying to size for the L1 cache.

    This might seem obvious, but it can be a lot of work to really get it working at its best.
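A minimal sketch of the kind of tuning being described, blocking a loop so each working set fits in a fixed cache (BLOCK is an illustrative tile size, not a real console figure):

```python
# Blocked (tiled) matrix transpose: a classic example of sizing a
# working set to a fixed cache. BLOCK is an assumed tile size.
BLOCK = 32

def transpose_blocked(m):
    """Transpose a square matrix tile by tile."""
    n = len(m)
    out = [[0] * n for _ in range(n)]
    for i0 in range(0, n, BLOCK):           # walk the matrix in tiles
        for j0 in range(0, n, BLOCK):
            for i in range(i0, min(i0 + BLOCK, n)):
                for j in range(j0, min(j0 + BLOCK, n)):
                    out[j][i] = m[i][j]     # both tiles stay cache-resident
    return out
```

On real hardware, the win is that each source and destination tile is reused while still resident instead of streaming the whole matrix past the cache on every pass.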
     
  11. ERP

    ERP
    Veteran

    Joined:
    Feb 11, 2002
    Messages:
    3,669
    Likes Received:
    49
    Location:
    Redmond, WA
    You don't really optimize for cache size, unless you're still working on N64 or PS1 with really basic cache architectures.

    You optimize for cache LINE size, and occasionally you choose to prefetch data or to not write to the cache to prevent a read.
    Assuming cache lines are the same size, you would likely end up with exactly the same set of optimizations.
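To illustrate the cache-line point: with an assumed 64-byte line and 4-byte elements, sequential access touches far fewer lines than a strided walk over the same number of elements, which is why access pattern matters more than raw cache capacity:

```python
LINE = 64  # assumed, typical cache line size in bytes
ELEM = 4   # element size in bytes

def lines_touched(indices):
    """Number of distinct cache lines hit by these element indices."""
    return len({(i * ELEM) // LINE for i in indices})

n = 1024
print(lines_touched(range(n)))               # sequential: 64 lines
print(lines_touched(range(0, n * 16, 16)))   # stride 16: one line per element
```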
     
  12. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,176
    Location:
    La-la land
    The quote in your post reads entirely like PR fluff aimed at hyperventilating wuu fanboys. How a 12GB/s memory subsystem, CPU cores from the 1990s, and a GPU with shader cores already half a decade old fit together "perfectly" to form something supposedly 'several generations ahead' is totally beyond me. Nothing (and I do mean literally nothing) shown on the wuu so far suggests anything of the sort.

    In fact there's not a single game shown yet that even comes close to the best on PS360, much less 'several generations ahead.' This is all lies and BS.
     
  13. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    4,080
    Likes Received:
    997
    Location:
    Planet Earth.
    From Wii to Wii U there are several GPU generations of difference.
    From PS360 there's also a number of GPU generations of difference; when does "several" start? Was the translation accurate?
    (PS3 is a GF7/G70; Wii U is supposedly HD4-5, so that's 4-5 generations apart.)

    As for the Wii U not matching PS360, the memory subsystem is quite different, and I assume that those 32MiB fast RAM are critical to get high performance...

    I agree that so far ports have, more often than not, been unimpressive :(
     
  14. Thesuffering

    Newcomer

    Joined:
    Jan 29, 2013
    Messages:
    5
    Likes Received:
    0
    First of all, he says the GPU is several generations ahead, referring to the PS3 and 360 GPUs. Each year is like a generation in GPU terms, right? (Or am I wrong about that?) A 4650 or 4670 (http://www.eurogamer.net/articles/df-hardware-wii-u-graphics-power-finally-revealed) is a few generations ahead of the 360, which is considered roughly GeForce 7800/7900 level. As for the CPU, I agree it's rubbish.

    Actually, a few multiplatform games perform better on the Wii U according to the devs themselves, namely NFS and Deus Ex: HR. As for games pushing the system, I don't know; maybe the free-roaming X from Monolith Soft, considering the original Xenoblade Chronicles looked nothing really special on the Wii.

    [image]
    [image]

    Not the best image of either of them, but they were the best I could find unless you wanted emulator shots.
     
  15. SedentaryJourney

    Regular

    Joined:
    Mar 13, 2003
    Messages:
    478
    Likes Received:
    28
    I have a hard time seeing any GPU advantage the Wii U might possess. It seems the Wii U struggles to keep parity on certain titles, and the few titles where it does well aren't significantly better than PS360. NFS:MW is almost identical aside from higher-resolution textures.

    If this is the level of performance Nintendo were expecting from the Wii U, I don't see why they emphasized energy efficiency over cost efficiency. I feel that at $50-$100 less they could have easily sold out to the early adopters instead of hovering under the 4 million mark.
     
  16. pc999

    Veteran

    Joined:
    Mar 13, 2004
    Messages:
    3,628
    Likes Received:
    31
    Location:
    Portugal
    Visuals today depend more on the budget of the game than on the machine, anyway...


    Well, isn't the Xbox GPU a pre-2600 Radeon part, and the Wii U GPU a 4600 part?

    So nothing new. The CPU, according to some, is relatively on par, but let's remember that the GPU also does more of the CPU's jobs, and there is an audio DSP which should offload some of the CPU's work too. So in the end I would expect them to extract a bit more as well.

    So no wonder it should be at the very least on par with PS360, though it's quite a different architecture even at a glance.

    Personally, I would expect that any 360 game could run at 720p with a better framerate, higher-res textures, and a few visual, AI, and physics extras. Nothing big, but noticeably better, if they take the time to do it.


    Still, I think that some interesting art and exclusive games could do wonders, even lower-budget ones.
     
  17. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    I'd say the Wii U GPU is two generations ahead of the X360, and three generations ahead of the PS3. Yes, the Xbox 360's GPU is one generation better than the PS3's. Or is it?
    See, you had ATI's R300/R420 gen, and NVIDIA's NV4x (including the PS3's GPU), which is quite a gen better than it, but ATI's R500 is one gen better than NV4x while being only one gen better than R300 (ignoring the R520/580 side gen).

    Wii U is R700 gen, so two gens better than R500. But R600 (including both the Radeon HD 2000 and 3000 series) is only an evolution of R500, and R700 an evolution of the R600 gen (same stuff with fixed ROPs and GDDR5 support). So maybe it's only one and a half gens better.

    Sorry if that all reads stupid; I think it is. I would say Nintendo took the bandwidth utilisation improvements of having a newer GPU and used them to make a bandwidth-starved console that still holds its own (even if barely) against the previous consoles, just to save money.
     
  18. Squilliam

    Squilliam Beyond3d isn't defined yet
    Veteran

    Joined:
    Jan 11, 2008
    Messages:
    3,495
    Likes Received:
    114
    Location:
    New Zealand
    I would say that they are about one gen ahead of the PS3, half a gen ahead of the 360 and one behind the PS4/One. I consider the move to unified shaders to be one generation and GCN to be a generation ahead once again. The Xbox 360 is like unified shaders 1.0 whereas the Wii U is like version 1.5.
     
  19. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    People need to clarify what they mean by generation. Annual refreshes are different to 'DirectX' level capability improvements. SM3 parts fit into the same generation in that respect, and SM4 is another gen. Ultimately I think using the term generations causes more trouble than it's worth. How many architectural advances is Wuu over XB360 relative to AMD's product line-up since 2005?
     
  20. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,176
    Location:
    La-la land
    IMO, the only generation that matters in this scenario is the DX10 advantage the wuu supposedly has, but what game shown so far actually leverages that? Of Nintendo's own software, the best-looking game is Pikmin, and what in that could not be rendered just as well on current consoles? Since 3rd-party support is almost nonexistent, I'm sure finding better examples from another developer would be very difficult. Rayman is also a very pretty game, but it is releasing on other consoles, so probably no difference in graphics there either.
     
