Wii U 'Has A Horrible, Slow CPU' Says Metro Last Light Dev

Isn't that what I said above? I do believe I mentioned how I admire your ability to steadfastly find reasons and hope for what is clearly a lost cause: a totally flubbed attempt at a console launch.

By the way, on my side of the pond, "justify" and "defend" are synonyms.

Just as you said earlier, "it's not a popularity contest, it's a numbers game," in terms of the impact the number of WiiUs sold has on developers' willingness to produce games.

Also, synonyms.

Strawman? No. This is rather you responding to each criticism of your over-reaching optimism by saying "I'm not doing X. I'm just doing X."

There's a distinct difference between "justifying" and "trying to find their justification". Why would I be justifying it? I didn't design it.

It's a strawman scenario because you're implying I'm being overly optimistic when I'm not actually claiming anything will happen. So you're wrong, and you're basing your whole argument on that incorrect assumption about what I'm saying. I'm just discussing the situation and trying to do that in a constructive manner; I've no vested interest in the outcome one way or the other. You don't seem capable of that, and you're convinced I'm saying things I'm not. You seem to think I agree with Nintendo's design choices, when I've said the opposite several times.

Just calm down.

So, am I really the only one who thinks this is hysterical?

It's not a magic bullet; it's just what Nintendo was probably banking on in order to make their console function at anything resembling a decent level. Not a magic bullet, just the keystone of the design.

It being a part of their design (which it is) and it being some magical solution to all their problems (which it isn't) are totally different things, friend. I suggest you look up what the term "magic bullet" means before trying to make an argument out of nothing.
 
Which speaks to the ridiculousness of the entire idea, let alone the notion that it was a marketing consideration Nintendo based their design decisions on. But again, I give Lumpy credit for attempting to find any bright spot, or any potential angle that makes the WiiU design a logical, intelligent decision.

The Lump is engaging in what's called 'moving the goalposts'.
For example, he'll say the 720/PS4 aren't going to be that much better than the Wii U; we'll say the leaked devkit specs show they'll be at least an order of magnitude more powerful; then he'll counter with 'on paper yes, but actually the differences won't be that noticeable' (ignoring the fact that the weak CPU in the Wii U will severely limit the gameplay experiences possible compared to the PS4/720, as DICE has just pointed out).

Similarly, he can no longer point to any benefits the Wii U may have had in terms of performance over the 360/PS3 (since knowledge of the slow RAM and CPU nullifies the better GPU), and given the myriad issues with the software at launch, he can't say it offers a particularly hassle-free or pleasant user experience.

So, instead, he has to point to increasingly esoteric and insignificant 'benefits' such as the few quid a year it'll save on the user's power bill, or construct unlikely, worst-case scenarios for the 720/PS4 which'll mean the Wii U will come out on top.

To me, it all smacks of drinking too much of the Nintendo Kool-Aid and not seeing things as they are (just like all the PS3 users who complain of 'lazy devs' when some MP title runs better on 360).
 
I just can't believe the same company that brought us the GameCube, which was a brilliantly designed piece of hardware, came up with the Wii U...
 
The Lump is engaging in what's called 'moving the goalposts'.
For example, he'll say the 720/PS4 aren't going to be that much better than the Wii U; we'll say the leaked devkit specs show they'll be at least an order of magnitude more powerful; then he'll counter with 'on paper yes, but actually the differences won't be that noticeable' (ignoring the fact that the weak CPU in the Wii U will severely limit the gameplay experiences possible compared to the PS4/720, as DICE has just pointed out).

Similarly, he can no longer point to any benefits the Wii U may have had in terms of performance over the 360/PS3 (since knowledge of the slow RAM and CPU nullifies the better GPU), and given the myriad issues with the software at launch, he can't say it offers a particularly hassle-free or pleasant user experience.

So, instead, he has to point to increasingly esoteric and insignificant 'benefits' such as the few quid a year it'll save on the user's power bill, or construct unlikely, worst-case scenarios for the 720/PS4 which'll mean the Wii U will come out on top.

To me, it all smacks of drinking too much of the Nintendo Kool-Aid and not seeing things as they are (just like all the PS3 users who complain of 'lazy devs' when some MP title runs better on 360).

Lol, you're just being silly now. I haven't said any of that. I've said the opposite in some cases.

Read my posts before replying because you're having an argument with thin air.


I'm fully aware the PS4 and 720 will be an order of magnitude more powerful and will allow for a big leap in graphical performance. The only thing I've said on that subject was that the leap from this gen to the next won't be as big as the one from last gen to the current one. That wasn't relating to the WiiU in any way, and it doesn't change the fact that the WiiU will be comfortably outpaced and will likely end up in the same scenario as the Wii was this generation.

Your first paragraph is the exact definition of a strawman argument. You're saying I said something which I didn't, then arguing against it to drive your own opinion home.

I also wasn't arguing that the "eco friendliness" was a selling point or that it even existed (as I said, if you'd bothered reading); I was simply explaining how environmentally friendly hardware doesn't benefit the individual consumer in any way. I then got into a separate discussion about how buying energy-efficient products is quite popular in the UK, how it's a marketing tool, and how it's an important factor in modern product design. I never said it was a reason to buy a WiiU, or that an average gamer will care.

You can make up what I'm saying all you like, but I've been saying the above this whole time.


I'll accept an apology when you're ready ;)
 
I was thinking about the cache bandwidth some more, and was wondering if anyone can provide more information on the following...

I remember there being rumors that the CPU would contain 2MB + 2x512KB of L2 cache. 3MB of SRAM L2 on a 33mm^2 45nm die seems like it'd be a tight fit. Very rough numbers, but Apple's A5's 1MB of (not particularly fast or wide) L2 cache took up about 1/10th of its 122mm^2 die area. Samsung's 45nm should be very similar to IBM's, and given those numbers 3MB of L2 cache would take up about the entire die area, which obviously can't be the case.
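As a quick sanity check on that estimate (the per-MB density is extrapolated from the A5 figure above, and the 33mm^2 die size is just the rumor, so this is only a back-of-envelope sketch):

```python
# Back-of-envelope: does 3MB of SRAM L2 fit on a ~33mm^2 45nm die?
# Assumption from the A5 figure above: 1MB of L2 ~ 1/10 of a 122mm^2 die.
area_per_mb = 122.0 / 10        # ~12.2 mm^2 per MB of SRAM L2 (very rough)
rumored_l2_mb = 2 + 2 * 0.5     # rumored 2MB + 2x512KB
die_area_mm2 = 33.0             # rumored Wii U CPU die size

l2_area = rumored_l2_mb * area_per_mb
print(f"{l2_area:.1f} mm^2 of SRAM L2 vs a {die_area_mm2:.0f} mm^2 die")
# -> ~36.6 mm^2: more than the whole die, so a pure-SRAM 3MB L2 can't fit.
```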

I also remember comments saying that the GPU and CPU both have eDRAM. Given this, if it really does have 3MB of L2 cache, it could be implemented as eDRAM. That would mean much higher latency than a conventional SRAM-based L2. For a GPU, eDRAM is fine, and for an L3 cache it's less problematic, especially if you have strong OoOE and good prefetchers. But if Nintendo is basically using a tweaked Broadway as the base CPU, its capabilities in these areas will be limited, and a really high latency L2 cache could be a poor fit.
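To see why that would hurt, here's a toy average-memory-access-time model; every cycle count and miss rate in it is an illustrative assumption, not a measured Wii U number:

```python
# Toy AMAT model: average access time = L1 hit + L1 miss rate * (L2 hit +
# L2 miss rate * memory penalty). All cycle counts and miss rates below are
# illustrative assumptions, not measured Wii U figures.
def amat(l1_hit, l1_miss, l2_hit, l2_miss, mem):
    return l1_hit + l1_miss * (l2_hit + l2_miss * mem)

sram_l2  = amat(l1_hit=2, l1_miss=0.05, l2_hit=12, l2_miss=0.2, mem=200)
edram_l2 = amat(l1_hit=2, l1_miss=0.05, l2_hit=40, l2_miss=0.2, mem=200)
print(f"SRAM L2: {sram_l2:.1f} cycles, eDRAM L2: {edram_l2:.1f} cycles")
# A deep OoOE core with good prefetchers can overlap much of that extra
# latency; a short-window 750-class core mostly just stalls for it.
```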

That could be why devs are currently struggling to utilize its bandwidth.

I just can't believe the same company that brought us the GameCube, which was a brilliantly designed piece of hardware, came up with the Wii U...

Kind of an ironic thing to say if the Wii U's CPU really is a not-very-heavily-modified Broadway core (much like the Wii's CPU/GPU were a minor update of the GameCube's). Seems like Nintendo's problem is that they can't move away from the GameCube's design...
 
To me, it all smacks of drinking too much of the Nintendo Kool-Aid and not seeing things as they are (just like all the PS3 users who complain of 'lazy devs' when some MP title runs better on 360).

Clearly. And I admire his ability to hold firm with his belief that this abortion will somehow manage to hold its own against next gen consoles, while it is being shown to be lacking against the current generation.

It really is cute to say "Okay, so it's not as powerful as the next Gen, but the next gen won't see a graphical leap so that is irrelevant."

"Okay so the CPU is underpowered, but they're really betting on offloading most of those tasks to their superior GPU and it's a benefit that the CPU is underpowered because it saves on energy costs and that's an important selling point!"

"You guys are looking at it all wrong. There won't be a lot of AAA games next generation because they cost too much, so the WiiU will actually have an advantage by not being expected to be so great and being able to have a huge catalog of crappy games."

"I'm not trying to defend their decisions or say this console is a good one, I'm just trying to have a discussion on why they made the design decisions they did."

I think the best answer to the last one has already been given. Their engineers were rip roaring drunk for the past three years.

Or, honestly, a better answer if Lumpy actually wants an explanation:

The "Wii kids" are all adolescents and teenagers now. The "Wii Parents" who bought those "Wii kids" the Wii did so out of nostalgia, most likely not from the Gamecube but from the N64 or earlier consoles. The "Wii Parents" have now bought those "Wii adolescents and teenagers" a PS360 in the meantime as they grew from Kids so they could play CoD and MW and Madden online with their friends.

Those "Wii Kids" aren't old enough to have children of their own yet, so there's no motivating factor for them to buy a system for their non-existent offspring.

In short, the reason Nintendo made the design decisions they did was because they knew the console wouldn't sell well because of the generational gap. Yet they also knew they had to release a new console in order to stay relevant and maintain some level of revenue (or even profit).

So they shortchanged the WiiU purposefully because of the generational gap and the lack of a market, preferring not to take risks on loss leading hardware with this current offering, realizing they'll have a better opportunity in 5 years or so when those "Wii kids" become "Wii Parents" and the ole' nostalgia kicks in and they are making purchasing decisions for their own children.

Makes as much sense as any of Lumpy's justifications for their actions. To me, anyway.
 
Ok I've broken this down into chewable chunks for you, as you're clearly incapable of remembering who's said what or reading a post correctly. I probably won't bother after this, as you seem to have decided who I am and what I'm saying already.

Clearly. And I admire his ability to hold firm with his belief that this abortion will somehow manage to hold its own against next gen consoles, while it is being shown to be lacking against the current generation.

I don't think this at all. Never said that either.

It really is cute to say "Okay, so it's not as powerful as the next Gen, but the next gen won't see a graphical leap so that is irrelevant."

I don't think this at all. Never said it either.

"Okay so the CPU is underpowered, but they're really betting on offloading most of those tasks to their superior GPU and it's a benefit that the CPU is underpowered because it saves on energy costs and that's an important selling point!"

Don't think this. Never said it. You're misquoting me like a cheap tabloid paper.

"You guys are looking at it all wrong. There won't be a lot of AAA games next generation because they cost too much, so the WiiU will actually have an advantage by not being expected to be so great and being able to have a huge catalog of crappy games."


Also never said this. I've said the fact it'll probably have a decent install base might help it along in the next year, but it will still go the same way as the Wii (which also had decent 3rd-party support for its first year or so), as they've gimped the CPU too much.

"I'm not trying to defend their decisions or say this console is a good one, I'm just trying to have a discussion on why they made the design decisions they did."

And what's wrong with that? It's a forum!


Anyway, nice of you to ignore my last post and not bother to answer what I've called you up on. Hopefully you'll read this, realise you're mistaken about what I'm saying, and move on. I don't want to derail the thread any more.
 
Lol, you're just being silly now. I haven't said any of that. I've said the opposite in some cases.

Read my posts before replying because you're having an argument with thin air.

I was commenting on RancidLunchmeat's comment about you always being able to find some spin or conjure some unlikely scenario that makes Nintendo out to be competent and the Wii U actually a great product.

For example, with regard to Wii U power vs the PS4/720, you certainly started off concerned with the effect of all the media about the Wii U being 'humiliated' by the PS4/720, arguing they were merely going by putative 'on paper' differences and the 'real world' differences wouldn't matter so much:

http://forum.beyond3d.com/showpost.php?p=1678045&postcount=3075
http://forum.beyond3d.com/showpost.php?p=1678047&postcount=3076

Then as the discussion proceeded, you ended up admitting there'll be noticeable graphical differences between the 720 and the Wii U, but that they won't be as pronounced as last gen (which is obvious of course, and no one will disagree with that):

Will the 720 wipe the floor with the WiiU in the real world? Well, not in the same way the 360 could vs the Wii, that's for sure. Not only do diminishing returns come into play, but there simply isn't as much of a difference between the two as there was last gen. Sure, the 720 will comfortably outpace the WiiU (and probably the PS4) in every area processing-wise, but I'm not so sure we'll see a 'generational' difference as with last gen.
http://forum.beyond3d.com/showpost.php?p=1679149&postcount=299

Or look at your debate with Shifty and Brit on how Nintendo's social/online system stacks up to PSN and XBL.

http://forum.beyond3d.com/showthread.php?p=1678101#post1678101
http://forum.beyond3d.com/showthread.php?p=1678110#post1678110
http://forum.beyond3d.com/showthread.php?p=1678944#post1678944
http://forum.beyond3d.com/showthread.php?p=1678957#post1678957
http://forum.beyond3d.com/showthread.php?p=1678958#post1678958
 
"Okay so the CPU is underpowered, but they're really betting on offloading most of those tasks to their superior GPU and it's a benefit that the CPU is underpowered because it saves on energy costs and that's an important selling point!"

Yeah: since, of course, he can't actually admit that the hardware is gimped, he has to put some spin on it to make it seem like it isn't actually how it looks and that GPGPU is the future. Of course, then Laa-Yosh pointed out that GPGPU can't completely compensate for a weak CPU.

But rather than accepting, in light of this, that the CPU (and slow RAM) gimp the system, he'll just ignore it and move on to extolling the next spurious benefit the Wii U has that we're simply not seeing.
 
I was commenting on RancidLunchmeat's comment about you always being able to find some spin or conjure some unlikely scenario that makes Nintendo out to be competent and the Wii U actually a great product.

For example, with regard to Wii U power vs the PS4/720, you certainly started off concerned with the effect of all the media about the Wii U being 'humiliated' by the PS4/720, arguing they were merely going by putative 'on paper' differences and the 'real world' differences wouldn't matter so much:

http://forum.beyond3d.com/showpost.php?p=1678045&postcount=3075
http://forum.beyond3d.com/showpost.php?p=1678047&postcount=3076

Then as the discussion proceeded, you ended up admitting there'll be noticeable graphical differences between the 720 and the Wii U, but that they won't be as pronounced as last gen (which is obvious of course, and no one will disagree with that):


http://forum.beyond3d.com/showpost.php?p=1679149&postcount=299

Or look at your debate with Shifty and Brit on how Nintendo's social/online system stacks up to PSN and XBL.

http://forum.beyond3d.com/showthread.php?p=1678101#post1678101
http://forum.beyond3d.com/showthread.php?p=1678110#post1678110
http://forum.beyond3d.com/showthread.php?p=1678944#post1678944
http://forum.beyond3d.com/showthread.php?p=1678957#post1678957
http://forum.beyond3d.com/showthread.php?p=1678958#post1678958


I'm not sure that backs up your argument or has anything to do with you claiming I've said things I haven't or believe things I don't. You've just listed a load of posts I've made about the WiiU, none of which show me saying the WiiU will compete graphically next gen, or that the WiiU's hardware design is brilliant, or that the WiiU's CPU isn't shit, or that not having AAA games is a good thing, or that it'll sell loads 'cos it's low wattage, or that GPGPU is the future and is the WiiU's magic bullet. 'Cos I never said any of that. You just made it up in your head.


And this part you quoted
Will the 720 wipe the floor with the WiiU in the real world? Well, not in the same way the 360 could vs the Wii, that's for sure. Not only do diminishing returns come into play, but there simply isn't as much of a difference between the two as there was last gen. Sure, the 720 will comfortably outpace the WiiU (and probably the PS4) in every area processing-wise, but I'm not so sure we'll see a 'generational' difference as with last gen.

I don't see what's wrong with that. In fact, you've concurred with me in your above post, where you said "it's obvious and no one would disagree with that" [that the leap won't be as big as before]. But either way, that's unrelated to the things you're accusing me of saying.

Yeah: since, of course, he can't actually admit that the hardware is gimped, he has to put some spin on it to make it seem like it isn't actually how it looks and that GPGPU is the future. Of course, then Laa-Yosh pointed out that GPGPU can't completely compensate for a weak CPU.

But rather than accepting, in light of this, that the CPU (and slow RAM) gimp the system, he'll just ignore it and move on to extolling the next spurious benefit the Wii U has that we're simply not seeing.


I've said several times in this very exchange that the CPU has gimped the system. And if you don't believe me, look! I said it just then!

There are no spurious benefits you're not seeing. No one is saying that. If I've given you the impression that I think there are, then you're mistaken. You're just arguing with yourself.
 
And this part you quoted

I don't see what's wrong with that. In fact, you've concurred with me in your above post, where you said "it's obvious and no one would disagree with that" [that the leap won't be as big as before]. But either way, that's unrelated to the things you're accusing me of saying.

The point is, you started off by saying the 720 isn't going to have significantly better graphics or a generational leap in visuals over the Wii U; later you changed that to 'it won't be as pronounced as from last gen to this gen', which is true, but not quite the same as your earlier position (otherwise why would I have started arguing with you in the first place?).


I've said several times in this very exchange that the CPU has gimped the system. And if you don't believe me, look! I said it just then!

I can only find this recent post where you say that: http://forum.beyond3d.com/showthread.php?p=1681208#post1681208

In fact, it must be a position you've taken up quite recently, since just previously you attempted to spin the DF ME3 face-off feature as countering the comments of the Metro devs:
http://forum.beyond3d.com/showpost.php?p=1681067&postcount=113

They actually cite the WiiU as performing better than the PS3 version overall (not really an achievement), but given the PS3 is usually considered pretty CPU-heavy spec-wise, it's an interesting insight into the difficulty of judging a system's capability based on ports. It at least gives some perspective to the Metro devs' comments on the CPU. It can obviously handle more than some are giving it credit for, unless the GPU is taking all the load off? Who knows.

Until Brit and Arwin explained that the article didn't show any such thing:

The part I find damning to Nintendo is the following: "but across the overall run of play it's somewhat disappointing to see a vintage 2005 console with slower GPU and less RAM match and indeed exceed the quality of the experience found on the brand new Nintendo hardware."

Obviously if a game like Mass Effect isn't CPU bound in the least, then a weaker CPU isn't going to matter. And in multi-platform cases, the PS3 is only at a CPU advantage when CPU tasks are efficiently divided over all its cores, which, because they are heterogeneous and not that easy to work with, still doesn't always happen.

Lead platforms will always do best. If the Wii U wanted parity or better, it needed to spec better in all areas. Clearly, that's not what it's doing.

But otherwise, great, glad you agree they gimped it with the CPU.

There are no spurious benefits you're not seeing. No one is saying that. If I've given you the impression that I think there are, then you're mistaken. You're just arguing with yourself.

Well, it's not just me; RancidLunchmeat seems to think so too (and I'm sure others do as well).

Your position is very different from that of someone like Grall, for instance, who was also clearly looking forward to the Wii U and is now (quite understandably) disappointed by Nintendo's ineptness.

You come across as if you believe that any questionable decision made by Nintendo, or any non-feature of the Wii U, must have some benefit to it, and you seem to go to great lengths to suggest what these 'benefits' may be or to construct unlikely scenarios in which they might become real.
 
Bottom line? Nintendo isn't interested in selling consoles to people who want cutting edge technology.

They want to sell consoles based upon nostalgia and their unique IP.

Their unique IP all just happens to be cartoony fare that sells primarily to kids.

The problem Nintendo fans have is that they've grown up wanting to play the unique Nintendo IP while at the same time wanting to play CoD and MW and Madden, etc. And now they want to do it in a cohesive online environment: something MS leads at, something Sony realized it needed to improve on with the PS3, and something Nintendo is only now trying to get into because they were focused on the live-friends-in-the-living-room experience.

PC gamers, as Lumpy has repeatedly categorized himself, have all the more reason (not less) to wish and hope and ask Santa for Nintendo to pull this off, because the PC already offers the majority of what the PS360 offers and oftentimes does it in a superior fashion.

So PC gamers have the least reason to buy a PS360, and the most reason to buy a new Nintendo console that offers a different experience.

The rationalization that "I'm a PC gamer, not a Nintendo fan" is actually extremely hypocritical and counterintuitive.
 
Decent GPU, underpowered CPU and limiting memory bandwidth.
I wouldn't call the memory bandwidth limiting yet. For the CPU it should be more than enough (if you can actually get it there, which is an entirely different matter), and for the GPU the 32MB of eDRAM should really help a lot, imho. The Xbox 360 couldn't fit all that much into its eDRAM: at 1080p with no AA, little more than the depth buffer, with no hope of also fitting any color buffer fully in there (though granted, I think it had more clever schemes so it could handle partial buffers). With 32MB of eDRAM, quite a bit more main memory traffic should be avoidable. At least it doesn't look as limiting to me as the seemingly slow CPU.
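Some rough render-target math to illustrate (assuming plain 32-bit color and depth buffers at 1080p, no AA or tiling tricks):

```python
# How much of a 1080p frame fits in eDRAM? Simplified: no AA, no tiling,
# 32-bit color and 32-bit depth assumed.
W, H = 1920, 1080
MB = 2**20
color = W * H * 4 / MB   # ~7.9 MB
depth = W * H * 4 / MB   # ~7.9 MB

print(f"color {color:.1f} MB + depth {depth:.1f} MB = {color + depth:.1f} MB")
# Xbox 360's 10 MB eDRAM: barely the depth buffer alone at 1080p.
# Wii U's 32 MB eDRAM: color + depth (~15.8 MB) fit with room to spare.
```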
 
I also remember comments saying that the GPU and CPU both have eDRAM. Given this, if it really does have 3MB of L2 cache, it could be implemented as eDRAM. That would mean much higher latency than a conventional SRAM-based L2. For a GPU, eDRAM is fine, and for an L3 cache it's less problematic, especially if you have strong OoOE and good prefetchers. But if Nintendo is basically using a tweaked Broadway as the base CPU, its capabilities in these areas will be limited, and a really high latency L2 cache could be a poor fit.

That could be why devs are currently struggling to utilize its bandwidth.

http://en.wikipedia.org/wiki/Blue_Gene#Blue_Gene.2FQ

IBM makes a 1.6GHz CPU with eDRAM L2 for supercomputers, one of which reached 16 sustained petaflops. It's built on 45nm and has DDR3 controllers.

You have a lot of similarity with the Wii U CPU right there. I believe the CPU designs share common blood, too. What the Wii U CPU appears to lack is the nice SIMD (though it has to have something, maybe half the width) and the four-way multithreading (and even the DDR3 controllers, if we believe this topic).

Very high perf/watt there, but absolute perf isn't necessarily very good. Supercomputers are made for heavily parallel code, where high perf/watt and interconnects are somewhat more important than CPU speed.
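For scale, a quick check of those figures using the publicly reported BlueGene/Q chip configuration and Sequoia's node count:

```python
# BlueGene/Q peak-FLOPS sanity check (public figures: 16 compute cores per
# chip, 4-wide double-precision QPX with FMA, 1.6GHz; Sequoia node count).
cores, qpx_width, fma, ghz = 16, 4, 2, 1.6

per_chip_gflops = cores * qpx_width * fma * ghz
print(f"{per_chip_gflops:.1f} GFLOPS per chip")          # 204.8

sequoia_nodes = 96 * 1024                                # 96 racks x 1024
print(f"{per_chip_gflops * sequoia_nodes / 1e6:.1f} PFLOPS peak")
# ~20.1 PFLOPS peak, consistent with the ~16 sustained petaflops above.
```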

The Broadway rumour needs to die; the design dates back to the mid-'90s.
 
Well, what's the basis for saying it's PPC A2 derived instead? The Broadway rumor is supported by a developer saying that it's using paired singles for SIMD, which makes sense given Nintendo's historic decisions for backwards compatibility. It'd be pretty bizarre to retrofit some other CPU with this capability. And why would they remove SMT from a design that clearly benefits from it?

That said, would a 1.6GHz BlueGene/Q core with pared-down SIMD and no SMT really be a big improvement over a 1.6GHz Broadway with a redesigned L2 cache? We're talking about a pretty modest in-order design with a fairly long pipeline; I wouldn't be surprised at all if Broadway tends to be faster clock for clock in simple integer operations (given competitive caches). The SMT and heavy FP capabilities are the driving point of the processor.
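A crude sketch of that FP comparison, ignoring the single vs double precision mismatch and real-world issue limits, so treat it as an upper bound only:

```python
# Crude per-core peak FP comparison at 1.6GHz. Precision differences and
# actual issue rates are ignored; this is an upper-bound sketch only.
def peak_gflops(simd_width, flops_per_lane, ghz):
    return simd_width * flops_per_lane * ghz

broadway_ps = peak_gflops(2, 2, 1.6)   # paired singles with FMA: 6.4 GFLOPS
a2_qpx      = peak_gflops(4, 2, 1.6)   # full QPX with FMA: 12.8 GFLOPS
print(broadway_ps, a2_qpx)
# Only ~2x on paper; with QPX pared down and SMT removed (as speculated
# above), much of A2's advantage over a 750-class core would evaporate.
```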

And on my original point regarding the latency of using eDRAM as L2 - with 18 4-way cores all sharing the same 32MB you have a ton more opportunity for latency hiding than in Wii U's case. So even if it really is PPC A2 derived it'd still run a risk of having to deal with worse L2 latency than may be ideal.

So I'm curious, when exactly does the design become too old to use? Not in 2006, but in 2012? If you're going to tell me Broadway is too old then I'm going to counter with BlueGene/Q being too new; a chip that debuted in 2012 being used as the CPU in a 2012 Nintendo console is reality bending. This is the company that used ARM11s in their 2011 handheld.
 
Yeah, especially among gamers, who are generally about as far from the 'save the earth' crowd as is possible.

And anyway, electricity is dirt cheap: if I ran my 46" LCD TV for 8 hours each day it'd only cost me $80 a year, and my MacBook's yearly power bill comes to something like $10.

Compare this to the cost of water, heating, fuel, food, housing, telephony etc., and it's pretty much amazing value considering how vital it is.


To be fair, it's "dirt cheap" in the USA. I want to say in Europe it can be like 4x as expensive.

I pay about 10c/kWh; I believe I read some European on NeoGAF say they paid 40c.

Which makes me wonder how they afford basics like central cooling, but then again, they probably don't...

But hey, they brag a lot online about how crappy our cell phone data plans are compared to theirs :p
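For reference, the arithmetic behind the yearly-cost figures quoted above is simple; the ~270W TV draw here is an assumed value that happens to land near the quoted $80:

```python
# Annual electricity cost = watts/1000 * hours/day * 365 * price per kWh.
def annual_cost_usd(watts, hours_per_day, usd_per_kwh):
    return watts / 1000 * hours_per_day * 365 * usd_per_kwh

# ~270W for an older 46" LCD and 10c/kWh are assumed values, not specs.
print(f"TV:      ${annual_cost_usd(270, 8, 0.10):.0f}/yr")  # ~$79
print(f"MacBook: ${annual_cost_usd(35, 8, 0.10):.0f}/yr")   # ~$10
```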
 
I thought I had linked to this article, which has an identical section on BlueGene/Q.
http://en.wikipedia.org/wiki/PowerPC_A2

Well, A2 is only a core after all, and two processors use it: PowerEN (made for high-traffic networking, combining router and web-frontend loads) and BlueGene/Q.
The first one debuted in early 2010.

I take your word that you get better latency hiding with the four-way SMT and such; I didn't necessarily disagree with you about the high latency.
It just seems to me the Wii U CPU may be closer to A2 than Broadway, at least utilizing a lot of its tech (the cache hierarchy is significant stuff), even though it may be some kind of stop-gap chip. And yes, retrofitting what looks like 3DNow! onto a modern design is probably not a big deal; Gekko/Broadway itself added a SIMD to an existing design.
 
To be fair, it's "dirt cheap" in the USA. I want to say in Europe it can be like 4x as expensive.

I pay about 10c/kWh; I believe I read some European on NeoGAF say they paid 40c.

Which makes me wonder how they afford basics like central cooling, but then again, they probably don't...

But hey, they brag a lot online about how crappy our cell phone data plans are compared to theirs :p

I was bitten by the electricity bill: the kWh is about 10 cents here too, but there are a lot of bullshit taxes on top, and the fine print says they're kWh-based, so the real cost is tripled.
Still, I live in a European country with low electricity costs, thanks to the nuclear reactors and a historical state monopoly (which is being chipped away).

Cooling? You can close the wooden or metal shutters during the summer day, especially at maximum heat/sunshine, then at night you leave everything wide open. A/C cooling is only for datacenters, malls, and one room for the elderly.
 
btw, what's the problem with paired singles? I've just looked this up in the manual of the PPC 750CL (which is supposedly very, very similar), and the chip appears to have "load/store paired singles" instructions, so you wouldn't have to do the load/merge/load thing. The manual suggests a normal single load also broadcasts the value to both the high and low parts (which looks quite useful to me, because "vec op scalar" is quite common, so you can avoid the shuffles (merges)). Paired loads/stores can even convert from/to 8/16-bit ints on the fly.
The "simd" instructions look quite reasonable overall to me, it can do mad, it's got some horizontal instructions (add lower and upper part) etc. What's lacking in comparison to other simd instruction sets is as far as I can see the ability to handle integers, and of course it's only 2-wide not 4-wide (though it should be mentioned until intel core and amd k10 x86 chips only really had physically 2-wide simd units too). Not sure how fast those ops are though...
And of course, being a PPC 750 derivative, it probably can't hide the fact that it's a very, very old design (it first appeared in 1997...).
 
To be fair, it's "dirt cheap" in the USA. I want to say in Europe it can be like 4x as expensive.

I pay about 10c/kWh; I believe I read some European on NeoGAF say they paid 40c.

Which makes me wonder how they afford basics like central cooling, but then again, they probably don't...

But hey, they brag a lot online about how crappy our cell phone data plans are compared to theirs :p

OT but:

Why would you need central cooling if most of Europe doesn't even get 4 weeks of 25°C+ weather a year? I don't think there are even regions where the weather is hot enough for most of the year to need central cooling.

Instead we have central heating, which uses gas boilers to pump hot water around the house.
 