Wii U hardware discussion and investigation *rename

Going by roadmaps/goals as stated and their (IMHO surprising) ability to meet those goals so far, I'd say that by this time of year in 2015, mobile devices will have surpassed the triplet (PS3/360/Wuu).

If I had to give odds: 100% tablets meet/exceed; 85% top shelf phones meet/exceed

I would say earlier than 2015, at least for tablets. Check this article: http://www.anandtech.com/show/6877/the-great-equalizer-part-3

An iPad 4 is already competitive with a G7x GPU in modern benchmarks with complex shaders. Consoles still have the advantage of being fixed hardware with much lower-level access, and thus get a better level of optimization (and we know the Cell comes in handy to help the RSX), so this is not an apples-to-apples comparison, but it gives a broad idea.

If the iPad 5 in 2013 doubles the graphics performance of the iPad 4 and the iPad 6 does the same in 2014, the PS3/360/Wii U triplet will be left in the dust.
 
That's possibly true. BW on Wii U is pitiful. If any mobile chipsets go the eDRAM route, they are likely to surpass it.

IMO, there will be no need for eDRAM:
http://i.imgur.com/sVcpauA.jpg
http://i.imgur.com/FjsyYGs.jpg


By 2015 we'll probably have LPDDR4 matured enough to do 34GB/s and not much time later, WideIO2 could double that.
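
For reference, the 34GB/s figure works out if you assume a 64-bit interface running at LPDDR4's top planned data rate of 4266 MT/s (the bus width is my assumption, and these are peak figures, not sustained):

#include <stdio.h>

/* peak bandwidth (GB/s) = bus width in bytes * transfer rate in GT/s */
int main(void)
{
    double lpddr4  = (64 / 8.0) * 4.266;  /* 64-bit @ 4266 MT/s ~= 34.1 GB/s */
    double wideio2 = lpddr4 * 2.0;        /* the "double that" claim for Wide I/O 2 */
    printf("LPDDR4:   %.1f GB/s\n", lpddr4);
    printf("WideIO2:  %.1f GB/s\n", wideio2);
    return 0;
}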
 
Doesn't the iPad 4th generation already have 17 GB/s bandwidth?

Apple A5X and A6X use a quad-channel memory controller.

That said, the A5X uses quad-channel LPDDR2 at 800 MT/s for 12.8 GB/s, and the A6X uses quad-channel LPDDR2 at 1066 MT/s for 17 GB/s.

I don't know of any other popular mobile SoC that uses quad-channel memory, though.
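
Quick sanity check of those figures (assuming 32-bit channels, which is what the quad-channel arrangement implies):

#include <stdio.h>

/* peak bandwidth (GB/s) = channels * channel width in bytes * MT/s / 1000 */
static double peak_gbs(int channels, int bits, int mts)
{
    return channels * (bits / 8.0) * mts / 1000.0;
}

int main(void)
{
    printf("A5X (4 x 32-bit LPDDR2-800):  %.1f GB/s\n", peak_gbs(4, 32, 800));  /* 12.8 */
    printf("A6X (4 x 32-bit LPDDR2-1066): %.1f GB/s\n", peak_gbs(4, 32, 1066)); /* ~17.1 */
    return 0;
}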
 
I didn't see this posted yet, but it's pretty interesting. According to this, applications and games don't have direct access to the 32MB of eDRAM (MEM1) on the GPU.

http://www.vgleaks.com/wii-u-memory-map/

I assume, then, it must act as a (very) large cache and cannot be directly written to or read from as many of us had assumed. Surely, the eDRAM would have to be accessible by the GPU in some way though, even if it is completely managed by the graphics driver (not sure if this is the right term). I can't see 360/PS3 games running as well as they are if the GPU was confined to only working out of MEM2. What do the rest of you think the implications of this could be?
 
IMO, there will be no need for eDRAM:
http://i.imgur.com/sVcpauA.jpg http://i.imgur.com/FjsyYGs.jpg

By 2015 we'll probably have LPDDR4 matured enough to do 34GB/s and not much time later, WideIO2 could double that.

You still need stacked memory on an interposer if you want to get really fast (and it's a bit more power efficient too), though such a chip might need to sit in an Ouya or similar box so it doesn't get throttled down by power and heat limitations.
I bet such a mobile chip made in the late 2010s would end up about half as fast as an Xbox One.
(Yep, it's the Wide IO 2 you mention, which is not really classical memory; I feel like it's halfway between external memory and eDRAM.)
 
I didn't see this posted yet, but it's pretty interesting. According to this, applications and games don't have direct access to the 32MB of eDRAM (MEM1) on the GPU.

http://www.vgleaks.com/wii-u-memory-map/

I assume, then, it must act as a (very) large cache and cannot be directly written to or read from as many of us had assumed. Surely, the eDRAM would have to be accessible by the GPU in some way though, even if it is completely managed by the graphics driver (not sure if this is the right term). I can't see 360/PS3 games running as well as they are if the GPU was confined to only working out of MEM2. What do the rest of you think the implications of this could be?

Being managed by the graphics library doesn't actually say that much. It could still be possible to explicitly allocate render buffers and textures into it. It's also possible the library can return raw pointers to objects allocated in it.

It's pretty messed up that the Wii U has 2GB of physical memory but somehow doesn't make it all accessible in the virtual address space, despite having plenty of room for it.
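
Purely as an illustration of the explicit-allocation point above (hypothetical function and type names, not the actual Wii U SDK), a "library-managed" MEM1 could still expose placement to the application along these lines:

/* Hypothetical sketch only: made-up API, not the real SDK. The point is that
 * a pool managed by the graphics library can still let the app request
 * placement in MEM1 and hand back a raw pointer to the allocation. */
#include <stddef.h>

typedef enum { POOL_MEM1_EDRAM, POOL_MEM2_DDR3 } mem_pool_t;

/* provided by the (hypothetical) graphics library */
extern void *gfx_alloc_surface(size_t size, size_t align, mem_pool_t pool);
extern void  gfx_set_color_buffer(void *surface);

static void *create_backbuffer(void)
{
    /* 1280x720 at 32bpp is ~3.5 MB, which fits comfortably in 32 MB of MEM1 */
    void *color = gfx_alloc_surface(1280 * 720 * 4, 4096, POOL_MEM1_EDRAM);
    if (color)
        gfx_set_color_buffer(color);  /* the library tracks it, the app keeps the pointer */
    return color;
}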
 
When I reached out to Shin'en Multimedia to interview them, I wanted to gain a better understanding of how they go about putting such a great-looking game as Nano Assault Neo on the Wii U, to find out more about the Wii U's inner workings, and how it 'punches above its weight'. Thanks to Manfred Linzner and the Shin'en team, I do understand a bit better. And wow, am I impressed with how hard-working and humble they are. But I know, I know: you want to hear about the capability. Here is the first interesting tidbit. There is much more to come from this interview!
Iran White: Are there any crucial modern GPU features that the Wii U is lacking?
Linzner:
“The Wii U GPU is several generations ahead of the current gen. It allows many things that were not possible on consoles before. If you develop for Wii U you have to take advantage of these possibilities, otherwise your performance is of course limited. Also your engine layout needs to be different. You need to take advantage of the large shared memory of the Wii U, the huge and very fast EDRAM section and the big CPU caches in the cores. Especially the workings of the CPU caches are very important to master. Otherwise you can lose a magnitude of power for cache relevant parts of your code. In the end the Wii U specs fit perfectly together and make a very efficient console when used right.”

http://hdwarriors.com/wii-u-specs-f...gpu-several-generations-ahead-of-current-gen/

These guys will do an interview with Shin'en's Manfred Linzner soon. While I don't doubt the Wii U is no match for the PS4 or the Xbox One, I think it has stronger hardware than the PS3 or 360, as shown by EA's ports built on its Chameleon engine being better graphically.
 
This isn't really news. The Xbox 360 was a first-generation unified shader design; the Wii U uses a fourth-generation iteration of the idea. Ergo it is "several generations ahead". That doesn't mean it is significantly more powerful. Otherwise they're saying really obvious things. No shit you should use the EDRAM. And you shouldn't ignore the L2 caches? Not exactly profound.
 
And you shouldn't ignore the L2 caches? Not exactly profound.

Nobody ignores the L2 cache, since it works on your behalf automatically. But most programmers don't carefully size their buffers and tune their algorithms to try to fit within a fixed L2 size. They may have for Xbox 360/PS3, again out of necessity, but moving to Wii U the cache size and partitioning are quite different. The L2 is probably also substantially lower latency, meaning you can afford to lean on it more directly instead of trying to size for the L1 cache.

This might seem obvious, but it can take a lot of work to really get it working well.
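
A generic (not Wii U specific) sketch of what that buffer sizing looks like in practice: run multi-pass work in chunks small enough to stay resident in the L2 you're targeting, rather than sweeping the whole data set on each pass. The chunk size here is just an assumed example.

#include <stddef.h>

/* Chunk sized to an (assumed) L2 budget; on Espresso you would tune this
 * per core, since the reported cache sizes differ between the cores. */
#define CHUNK_BYTES  (256 * 1024)
#define CHUNK_FLOATS (CHUNK_BYTES / sizeof(float))

static void pass_scale(float *p, size_t n) { for (size_t i = 0; i < n; i++) p[i] *= 2.0f; }
static void pass_bias (float *p, size_t n) { for (size_t i = 0; i < n; i++) p[i] += 1.0f; }

void process_blocked(float *data, size_t count)
{
    for (size_t off = 0; off < count; off += CHUNK_FLOATS) {
        size_t n = (count - off < CHUNK_FLOATS) ? (count - off) : CHUNK_FLOATS;
        pass_scale(data + off, n);  /* chunk is now hot in cache...          */
        pass_bias (data + off, n);  /* ...so the second pass hits, not DRAM  */
    }
}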
 
Nobody ignores the L2 cache, since it works on your behalf automatically. But most programmers don't carefully size their buffers and tune their algorithms to try to fit within a fixed L2 size. They may have for Xbox 360/PS3, again out of necessity, but moving to Wii U the cache size and partitioning are quite different. The L2 is probably also substantially lower latency, meaning you can afford to lean on it more directly instead of trying to size for the L1 cache.

This might seem obvious, but it can take a lot of work to really get it working well.

You don't really optimize for cache size, unless you're still working on the N64 or PS1 with really basic cache architectures.

You optimize for cache LINE size, and occasionally you choose to prefetch data, or not to write to the cache in order to prevent a read.
Assuming cache lines are the same size, you would likely end up with exactly the same set of optimizations.
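
For what it's worth, a small sketch of those line-level tweaks in C (GCC/Clang builtins; the 32-byte line size is just an assumption for illustration):

#include <stddef.h>

#define CACHE_LINE 32  /* assumed line size; adjust per target */

/* Keep the hot fields of one element inside a single line, and align the
 * array so elements never straddle two lines. */
struct particle {
    float pos[3];
    float vel[3];
    float pad[2];               /* pads the struct to 32 bytes */
} __attribute__((aligned(CACHE_LINE)));

void integrate(struct particle *p, size_t n, float dt)
{
    for (size_t i = 0; i < n; i++) {
        /* hint upcoming lines into cache a few elements ahead
           (prefetching past the end of the array is harmless) */
        __builtin_prefetch(&p[i + 4], 1, 0);
        p[i].pos[0] += p[i].vel[0] * dt;
        p[i].pos[1] += p[i].vel[1] * dt;
        p[i].pos[2] += p[i].vel[2] * dt;
    }
}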
 
These guys will do an interview with Shin'en's Manfred Linzner soon.
The quote in your post reads entirely like PR fluff aimed at hyperventilating wuu fanboys. How a 12GB/s memory subsystem, CPU cores from the 1990s, and a GPU with shader cores that are already half a decade old fit together "perfectly" to form something supposedly 'several generations ahead' is totally beyond me. Nothing (and I do mean literally nothing) shown on wuu so far suggests anything of the sort.

In fact there's not a single game shown yet that even comes close to the best on PS360, much less 'several generations ahead.' This is all lies and BS.
 
The quote in your post reads entirely like PR fluff aimed at hyperventilating wuu fanboys. How a 12GB/s memory subsystem, CPU cores from the 1990s, and a GPU with shader cores that are already half a decade old fit together "perfectly" to form something supposedly 'several generations ahead' is totally beyond me. Nothing (and I do mean literally nothing) shown on wuu so far suggests anything of the sort.

In fact there's not a single game shown yet that even comes close to the best on PS360, much less 'several generations ahead.' This is all lies and BS.

From Wii to Wii U there's a difference of several GPU generations.
From PS360 there's a difference of a number of GPU generations too; when does "several" start? Was the translation accurate?
(PS3 is a GF7/G70, Wii U is supposedly HD4-5, so that's 4-5 generations apart.)

As for the Wii U not matching PS360, the memory subsystem is quite different, and I assume those 32MiB of fast RAM are critical to getting high performance...

I agree that so far ports have, more often than not, been unimpressive :(
 
The quote in your post reads entirely like PR fluff aimed at hyperventilating wuu fanboys. How a 12GB/s memory subsystem, CPU cores from the 1990s, and a GPU with shader cores that are already half a decade old fit together "perfectly" to form something supposedly 'several generations ahead' is totally beyond me. Nothing (and I do mean literally nothing) shown on wuu so far suggests anything of the sort.

In fact there's not a single game shown yet that even comes close to the best on PS360, much less 'several generations ahead.' This is all lies and BS.

First of all, he says the GPU is several generations ahead, and he is referring to the PS3 and 360 GPUs. Each year is like a generation in GPU years, right? (Or am I wrong about that?) A 4650 or 4670 (http://www.eurogamer.net/articles/df-hardware-wii-u-graphics-power-finally-revealed) is a few generations ahead of the 360, which is considered roughly GeForce 7800/7900 level. As for the CPU, I agree it's rubbish.

Actually, a few multiplatform games perform better on the Wii U according to the devs themselves; these games are NFS and Deus Ex: HR. As for games pushing the system, I don't know, maybe the free-roaming X from Monolith Soft, if you consider that the original Xenoblade Chronicles looked nothing really special on the Wii.

[screenshots: X (Wii U) and Xenoblade Chronicles (Wii)]


Not the best images of either of them, but they were the best I could find unless you wanted emulator shots.
 
I have a hard time seeing any GPU advantage the Wii U might possess. It seems the Wii U struggles to keep parity on certain titles and the few titles where it does well aren't significantly better than PS360. NFS:MW is almost identical aside from higher resolution textures.

If this is the level of performance Nintendo were expecting from the Wii U, I don't see why they emphasized energy efficiency over cost efficiency. I feel that for $50 - $100 less they could have easily sold out to the early adopters instead of hovering under the 4 million mark.
 
Visuals today depend more on the game's budget than on the machine, anyway...


Well, isn't the Xbox 360 GPU a pre-HD 2600 Radeon part, and the Wii U GPU a 4600-series part?

So nothing new. The CPU, according to some, is relatively on par, but let's remember that the GPU also takes over more of the CPU's jobs, and there is an audio processor which should offload some of the CPU work too. So in the end I would expect them to extract a bit more as well.

So it's no wonder it should be at the very least on par with PS360, though it's quite a different architecture even at a glance.

Personally, I would expect that any 360 game could run at 720p with a better framerate, higher-res textures and a few visual, AI, and physics extras. Nothing big, but noticeably better, if they take the time to do it.


Still, I think that some interesting art and exclusive games could do wonders even on lower budgets.
 
I'd say the Wii U GPU is two generations ahead of the X360, and three generations ahead of the PS3. Yes, the Xbox 360's GPU is one generation better than the PS3's. Or maybe it isn't.
See, you had ATI's R300/R420 gen, and NVIDIA's NV4x (including the PS3), which is a good generation better than it, but ATI's R500 is one gen better than NV4x while being only one gen better than R300 (ignoring the R520/580 side gen).

Wii U is R700 gen, so two gens better than R500. But R600 (including both the Radeon HD 2000 and 3000 series) is only an evolution of R500, and R700 is an evolution of the R600 gen (the same stuff with fixed ROPs and GDDR5 support). So maybe it's only one and a half gens better.

Sorry if that all reads stupid; I think it is. I would say Nintendo took the bandwidth-utilisation improvements of having a newer GPU and used them to make a bandwidth-starved console that still holds its own (even if barely) against the previous consoles, just to save money.
 
I would say that they are about one gen ahead of the PS3, half a gen ahead of the 360 and one behind the PS4/One. I consider the move to unified shaders to be one generation and GCN to be a generation ahead once again. The Xbox 360 is like unified shaders 1.0 whereas the Wii U is like version 1.5.
 
People need to clarify what they mean by generation. Annual refreshes are different to 'DirectX' level capability improvements. SM3 parts fit into the same generation in that respect, and SM4 is another gen. Ultimately I think using the term generations causes more trouble than it's worth. How many architectural advances is Wuu over XB360 relative to AMD's product line-up since 2005?
 
IMO, the only generation that matters in this scenario is the DX10-level advantage wuu supposedly has, but what game shown so far actually leverages that? Of Nintendo's own software, the best-looking game is Pikmin, and what in that could not be rendered just as well on current consoles? Since 3rd-party support is almost nonexistent, I'm sure finding better examples from another developer would be very difficult. Rayman is also a very pretty game, but it is releasing on other consoles, so there's probably no difference in graphics there either.
 