Wii U 'Has A Horrible, Slow CPU' Says Metro Last Light Dev

So it's starting to look like the WiiU CPU might not even be faster than Xenon in every way.

But that's not really the most worrying thing IMO - a video Ika linked in the Digital Foundry thread shows Darksiders 2 taking much longer to load than the Xbox 360 installed version, and it even manages to be slower than the vanilla Xbox 360 DVD version.

http://www.youtube.com/watch?v=5nha4XiXnSg

Anyone want to take a punt on the speed of the Blu-ray-based (but officially not Blu-ray) optical drive in the WiiU? 2x CLV like the PS3 and most BD players? I'd have expected something like a 4x CAV drive, but that should be faster than the Xbox 360's DVD drive IIRC. Presumably it's not a data layout and seeking issue when they've got so much more space on the disc.
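For reference, some back-of-the-envelope throughput numbers using the standard base rates (DVD 1x ~ 1.385 MB/s, Blu-ray 1x ~ 4.5 MB/s); the WiiU's actual multiplier is exactly the unknown being punted on here, so the figures below are illustrative:

Code:
/* Back-of-the-envelope optical drive throughput. The WiiU multiplier is
 * the unknown being guessed at above; these figures are illustrative. */
#include <stdio.h>

int main(void) {
    const double dvd_1x = 1.385; /* MB/s */
    const double bd_1x  = 4.5;   /* MB/s */

    printf("360 DVD, 12x CAV (outer edge): %.1f MB/s\n", 12 * dvd_1x);
    printf("PS3 BD, 2x CLV:                %.1f MB/s\n",  2 * bd_1x);
    printf("WiiU, if 4x CAV (outer edge):  %.1f MB/s\n",  4 * bd_1x);
    return 0;
}

A 4x CAV disc at the outer edge (18 MB/s) would indeed edge out a 12x DVD (16.6 MB/s), which is why the Darksiders 2 result looks so odd.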

From NG: http://www.neogaf.com/forum/showpost.php?p=44519731&postcount=56

Posting anonymously, just because.

Speaking as a developer who's worked on the PS3, the Xbox 360 and the WiiU: the CPU on the WiiU has some nice things on it, but it's not as powerful as the Xbox 360 chip. I think N went to IBM and asked them: 'What's cheap to put on the chip?' and IBM said 'Well, we have this sh*t that no-one wants,' and N said 'We'll take it.' It does have better branch prediction than the PPC cores in the PS3 and Xbox 360.

The Espresso chip doesn't have any sort of vector processing. It does have paired singles, but they're a pain, a real pain, to use. The floating point registers are 64-bit doubles, so when people talked about paired singles I assumed you split the register in two. No, the registers are actually 96 bits wide: a double plus a single. To load one you have to load in your first single, do a merge operation to move it to the upper 32 bits, and then load in your second one. This makes stack usage explode, because saving a floating point register in the callee takes three operations and 12 bytes, no matter what.
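To make that cost concrete, here's a minimal sketch of the arithmetic in C; the register counts are illustrative, not the actual Wii U ABI:

Code:
/* Rough model of the callee-save cost described above: each FPR holds a
 * 64-bit double plus an extra 32-bit single, so spilling one register is
 * an 8-byte store, a merge/extract, and a 4-byte store -- three ops and
 * 12 bytes. The register counts are illustrative, not the real ABI. */
#include <stdio.h>

enum { BYTES_PER_FPR_SPILL = 8 + 4, OPS_PER_FPR_SPILL = 3 };

int main(void) {
    for (int fprs = 4; fprs <= 16; fprs += 4)
        printf("%2d callee-saved FPRs -> %3d stack bytes, %2d instructions\n",
               fprs, fprs * BYTES_PER_FPR_SPILL, fprs * OPS_PER_FPR_SPILL);
    return 0;
}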

While the WiiU has 1 gig of RAM available for the game to use, the RAM is slow. The cache on the chip is also slow. We tested memory bandwidth between cache and main memory on the Xbox 360 and the WiiU. Main memory access on the Xbox 360 is about 2x-4x as fast as accessing the cache on the WiiU. Yes, I mean that the RAM external to the chip on the Xbox 360 is faster than the cache memory on the WiiU. I don't remember the full results, but I think we figured out that accessing the hard drive on the Xbox 360 was faster than the RAM on the WiiU too.

The optical drive is also slow. I don't know for sure but it feels like the same drive that went into the PS3. And on the PS3 we used the hard drive to cache things to improve load speeds. Without a hard drive on the WiiU we can't do that.

I won't go into the OS, and the programming environment, but let me just say I hate programming for Windows, and I prefer programming on the Xbox360 to the WiiU.

While the GPU in the WiiU is better (probably because ATI doesn't make anything worse these days), they don't have the CPU and RAM to back it up. Who knows, maybe things will get better after launch, but I'm glad to leave the WiiU behind.
If it really is 2x without HDD for installs...
 
How can the cache be slower than main memory access on the 360, or main memory be slower than a hard disk access?

That doesn't compute for me.
 
How can the cache be slower than main memory access on the 360, or main memory be slower than a hard disk access?

That doesn't compute for me.
I wonder if that person is speaking of bandwidth and not latencies (I can't see how the latter would make sense in any way).
A while ago I researched the characteristics of the L2 interconnect in the PPC 470. The interconnect provides up to 25.6 GB/s of bandwidth, and that's with the interconnect at its highest speed, i.e. 800 MHz for a core running at 1.6 GHz.
As the CPU is running slower than that, and the interconnect works at 1/2, 1/3 or 1/4 of the core clock speed, could it be that they measured less bandwidth to the L2 than the 10.8 GB/s Xenon has to the northbridge and, down the line, the GDDR3? Either way the hardware is buggy :(
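Working the estimate through: 25.6 GB/s at 800 MHz implies 32 bytes/cycle on the interconnect. Scaling that to a ~1.2 GHz core (the clock used later in this thread; an assumption, not a confirmed spec) at the documented dividers:

Code:
/* Worked version of the estimate above: 25.6 GB/s at 800 MHz = 32 B/cycle.
 * The 1.2 GHz core clock is an assumption from later in this thread. */
#include <stdio.h>

int main(void) {
    const double bytes_per_cycle = 25.6e9 / 800e6; /* = 32 B/cycle */
    const double core_hz = 1.2e9;

    for (int div = 2; div <= 4; div++) {
        double gbps = bytes_per_cycle * (core_hz / div) / 1e9;
        printf("core/%d = %3.0f MHz -> %4.1f GB/s (Xenon FSB: 10.8 GB/s)\n",
               div, core_hz / div / 1e6, gbps);
    }
    return 0;
}

At the 1/4 divider that comes out to 9.6 GB/s, which would indeed fall below Xenon's 10.8 GB/s.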

For the HDD vs RAM I don't know either how much bandwidth the HDD provides, but could it really provide more than 12 GB/s in the best cases?
EDIT: I did search for HDD transfer rates, and I fail to see how this could happen... either way the thing is completely or partially fake, or the chip Nintendo is selling is badly broken/buggy, which would be a failure of epic proportions.
Avoiding the gloom and doom, I would go for fake or partially fake, but the noise about hard-locking systems is still scary (and might be the real reason behind the noise about bricked systems, i.e. the update being slow is not actually why the systems brick).
 
Maybe these RAM-beating speeds are when reading out of the HDD's cache? But even then the interface to the HDD would be nowhere close to allowing multiple GB/s.

Maybe he mixed up "RAM" with the flash "memory"? *cough*

Yeah, something like this makes the most sense, and is easily plausible.
 
From NG: http://www.neogaf.com/forum/showpost.php?p=44519731&postcount=56

If it really is 2x without HDD for installs...

I strongly doubt that. This example is not a good one - it's on-demand streaming, which would never show the speed of the drive properly, as the transfer speed takes a little while to get going (even a 10x drive starts out at 4x). Streaming-wise, you may also see longer latency if, for instance, it stops spinning the disc when not in use - a typical power-saving measure, but very bad for streaming games, and something Nintendo could fix.

That the RAM is slow and that the cache pipeline sucks, though - who knows, that may be true. It could be a design issue, or an engine issue where memory reads are shared with the GPU and the GPU has most of the bandwidth, or, maybe more likely, a matter of lower clock speeds (CPU, bus and cache), which has more of a chance of not being fixable. But it's early days with new hardware - workarounds for bottlenecks are often found after a while.
 
Even a top-end device (edit: SSD) on SATA 6Gb/s is going to have 16 times less bandwidth than main memory.
Thanks, though I did research HDD transfer rates myself... after posting, which I should not have done... stupid me.
Yeah, something like this makes the most sense, and is easily plausible.
So how about the bandwidth to the cache vs the RAM?
I don't remember whether Xenon's 10.8 GB/s to main RAM is combined read and write, read or write, or 10.8 GB/s read plus 10.8 GB/s write.

To get there (say 10.8 GB/s), and using IBM's data for the PPC 47x, for a core running at 1.2 GHz the interconnect would have to work at or below ~337 MHz, which could mean that the interconnect operates at 1/4 of the core clock speed.

If Xenon's total bandwidth to main RAM is 21.6 GB/s, then all it takes is for the interconnect to work at half speed (that gets you around 19.2 GB/s to the caches).
 
As far as I knew, GPGPU is designed to take the reliance off the CPU and allow the GPU to be used for some of the processes the CPU would usually be required for, more fully taking advantage of parallel processing (using OpenCL). In theory this somewhat negates the need for a more powerful CPU. My thinking was that if the other two new consoles also went down the same route of a GPGPU-based system, then porting from them would be more straightforward than if the games being ported were based on platforms with a heavy reliance on a powerful CPU.

See, no need to put any words there ;)

GPGPU is only good for some very limited cases, which apparently exclude even in-game physics. You certainly can't use it for gameplay-related tasks like AI, pathfinding, or the simulation of various game systems like weather, economy, and so on.
It's more about very limited physics-based simulations of particle systems or water, or image processing - stuff that's more like an extra feature in a game and not a principal element. So it can't be viewed as a CPU replacement, only as a sort of helping hand (but more like a very strong little finger).

I also never really heard anything about Nintendo having any related ideas before reading it here as an apology for the weak CPU. Is there any official info on that? Or is it just someone seeing that the original ATI GPU offers support, so it must be a Wii U feature as well?
 
See, no need to put any words there ;)

GPGPU is only good for some very limited cases, which apparently exclude even in-game physics. You certainly can't use it for gameplay-related tasks like AI, pathfinding, or the simulation of various game systems like weather, economy, and so on.
It's more about very limited physics-based simulations of particle systems or water, or image processing - stuff that's more like an extra feature in a game and not a principal element. So it can't be viewed as a CPU replacement, only as a sort of helping hand (but more like a very strong little finger).

I also never really heard anything about Nintendo having any related ideas before reading it here as an apology for the weak CPU. Is there any official info on that? Or is it just someone seeing that the original ATI GPU offers support, so it must be a Wii U feature as well?

OK, but I wasn't referring to it as a magic bullet ;) (namely because I'm not sure enough about what it does - as demonstrated!). Thanks for explaining. There was so much hype surrounding it a few years ago with regard to gaming - how it would change devices, etc. - that it's confusing for a layperson such as myself...

Anyway, it was mentioned by Nintendo themselves in a Nintendo Direct. They specifically said GPGPU :???:
 
Usually the approach should be to not believe any kind of hype :) And if it was Nintendo talking about it, well that was just hype as well.

But GPGPU is, for example, good for us in the CGI business at times; there's some rendering stuff and there are specialized FX simulation tools that can utilize GPUs. Heck, even Weta's lighting system, used from Avatar onwards, is primarily GPU-based, before the cached data gets fed to RenderMan.
But in the end everyone prefers good old fashioned CPUs in the render farm, there's just no replacement for that...
 
If there were no fear that Nintendo will have the PS2 equivalent of next gen, people would not care about how powerful it is. I don't see anybody getting hung up on the OUYA specs.
The number of threads focusing on the WiiU clearly shows people fear the WiiU's influence on MS and Sony if it's popular. The Wii has done this already.

Most of the threads are lamenting its lack of power or its buggy launch problems. I fail to see how that is good. You have threads like these CPU comments dominating a shocking amount of the Wii U discussion on NeoGAF and here.

From day one the Wii U has been hugely dogged with questions about how powerful it is, in a way the Wii avoided. It's not good news for it, because to me it shows it's playing in the core space (which the Wii did not). In the core space, power is key. If the Wii U is going to compete in the core space, it's a goner IMO.
 
Yeah, there's practically nothing about the touch screen features at all in any discussions. And as has been mentioned, SmartGlass can already provide most of its features basically for free - if you have a tablet or a smartphone.
 
So how about the bandwidth to the cache vs the RAM?

Memory bandwidth measurements can be really touchy. If you look at mobile SoCs you'll see that bandwidth tests for them often come nowhere remotely close to peak... sometimes they only get a small fraction. One reason for this is cache policies causing extra cycles on misses (for instance, if you also have to evict some other line, or if you can't stream straight from L2 without first taking miss penalties from L1, etc.). Another reason is that they could be struggling to hide all of the latency of the access. These simpler cores tend to have weaker hardware prefetching and need more software prefetching to hide latency, but even then it might not be fully attainable using only one core. Or it might never be sustainable at all, if the cache simply doesn't support enough outstanding misses.
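For illustration, here's a minimal single-threaded streaming-read test of the sort being described (POSIX timing; buffer size is illustrative, and a real measurement would need warm-up runs, core pinning, and prefetch tuning):

Code:
/* Minimal sketch of a streaming-read bandwidth test. A naive loop like
 * this routinely lands well below peak -- it exposes miss latency,
 * prefetcher quality and cache policy rather than raw DRAM speed. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void) {
    const size_t n = (64u * 1024 * 1024) / sizeof(long); /* 64 MiB */
    long *buf = malloc(n * sizeof *buf);
    if (!buf) return 1;
    for (size_t i = 0; i < n; i++) buf[i] = (long)i;     /* warm pages */

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    volatile long sink = 0;
    for (size_t i = 0; i < n; i++) sink += buf[i];       /* stream reads */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double sec = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    printf("%.2f GB/s (naive, single thread)\n", n * sizeof *buf / sec / 1e9);
    free(buf);
    return 0;
}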
 
I disagree; I wish consoles would just go back to being exactly that...

That's why I loved the PS1 & PS2 so much: no frills, no stupid gimmicks... just put your discs in and play.

Now it seems that if your console doesn't have the ability to flash advertising in your consumers' faces, it's not doing its job properly.

You can have your personal opinion, that's fine. But if your opinion is out of step with what the majority of users are asking for, you are unlikely to be catered to.

I'd say the features that Nintendo has put into the WiiU (belatedly, because the Wii was simply incapable of providing those features) demonstrate that their customer research shows being a simple unitasker won't fly in this era. Most telling is that one of their most highly anticipated apps is TVii. Of course, that's not something that actually exists at this time. But it shows the push away from being a unitasker, and Nintendo's acknowledgement that it's a necessity.
 
Again, read my post. I'm not trying to defend their decisions, I'm just trying to find the justification. Unless you think there is none?

Isn't that what I said above? I do believe I mentioned how I admire your ability to steadfastly attempt to find reasons and hope for what is clearly a lost cause: a totally flubbed attempt at a console launch.

By the way, on my side of the pond, "justify" and "defend" are synonyms.

Just as you said earlier, "it's not a popularity contest, it's a numbers game" - in terms of the impact of the number of WiiUs sold on the willingness of developers to produce games.

Also, synonyms.

Strawman? No. This is rather you responding to each criticism of your over-reaching optimism by saying "I'm not doing X; I'm just doing X."
 
You're putting words in my mouth there. I haven't said it was a magic bullet. I was just saying what Nintendo was probably banking on.

So, am I really the only one who thinks this is hysterical?

It's not a magic bullet, it's just what Nintendo was probably banking on in order to make their console function at anything resembling a decent level? Not a magic bullet, just the keystone of the design.
 
I wonder what Nintendo's goal was when designing the Wii U's internals: a decent GPU, an underpowered CPU, and limited memory bandwidth. I thought N was aiming for PS360 visuals at 1080p, but that might not be possible given all the limitations developers face.
Nintendo may have gone too far being too cost conscious.
 
Nintendo may have gone too far being too cost conscious.
The Wuu isn't even cheap, so it's just an overall fail of beyond epic proportions. I expected the hardware to be flawed (software too, tbh), but that everything is this much of a fail is just... staggering.

So many YEARS they've had to work on this system; what on earth have they been doing? Been roaring drunk the whole time or what? It seems like they've pissed away most of the time and just thrown something - anything! - together, largely in a panic after Wii sales started dropping. The concept of the Wuu, with the tablet and so on, may have existed for a while, but the hardware is so bizarrely underpowered and just plain BAD (like the shitty CPU with its incredibly wonky 96-bit FPU registers and no SIMD, which has been standard on PCs for almost a decade and a half now)... what's one to think?

I can't think of any explanation that makes sense other than they were all fucking drunk off of their asses when they designed this piece of crap.
 
I've almost never seen a consumer pick a product for socially conscious reasons, save a bit of Fair Trade. It's not a high priority except in a niche demographic. People will pick based on higher-priority factors (cost, performance, quality, aesthetics, popularity, etc.) and then, if it happens to be socially conscious, give themselves a pat on the back. Nintendo have been performing miserably in Greenpeace's corporate greenness ratings for ages, haven't cared one jot, and haven't suffered in market success as a result.

Yeah - especially among gamers, who are generally about as far from the 'save the earth' crowd as is possible.

And anyway, electricity is dirt cheap: if I ran my 46" LCD TV for 8 hours each day it'd only cost me $80 a year, and my MacBook's yearly power bill comes to something like $10.
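Those figures check out with plausible (assumed, not quoted) inputs, e.g. a ~150 W TV and a ~$0.18/kWh tariff:

Code:
/* Sanity-checking the power bill figures above. The wattages and tariff
 * are assumptions, not values from the post. */
#include <stdio.h>

static double yearly_usd(double watts, double hours_per_day, double usd_per_kwh) {
    return watts / 1000.0 * hours_per_day * 365 * usd_per_kwh;
}

int main(void) {
    printf("46\" LCD @ 150 W, 8 h/day: $%.0f/yr\n", yearly_usd(150, 8, 0.18));
    printf("MacBook @ 20 W, 8 h/day:  $%.0f/yr\n", yearly_usd(20, 8, 0.18));
    return 0;
}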

Compare this to the cost of water, heating, fuel, food, housing, telephony, etc., and it's pretty much amazing value considering how vital it is.
 