Does Cell Have Any Other Advantages Over XCPU Other Than FLOPS?

Lazy8s said:
Mobile phones are the highest volume market for processors, closing in on a billion devices a year and with a lot of China still left to tap. For the next several years, ARM and SuperH are already positioned to dominate the CPU side, and MBX/SGX have it for the co-processor, projected to power hundreds of millions of units annually (while some people apparently still wait for PowerVR to make a "comeback".)

Other cost-sensitive embedded markets like the millions of cars which will eventually ship with info/nav systems installed and the millions of digital TVs and consumer electronics which will need video processing are also already on their way to going to ARM, SuperH, and PowerVR in their latest models.
ARM, SuperH, and PowerVR lack the absolute performance of Cell, and perhaps the relative performance/watt too. Will ARM get enough performance for future automobiles with image recognition for collision avoidance, or intelligent suspension control by road surface simulation?
 
ihamoitc2005 said:
It is better to make sensible statements when possible, my friend, than childish comments and insults. It is you who are hurt by such comments.

Well this should be fun. Somehow I don't feel any pain. In fact, comments like: "Apparently you enjoy being a tard" are somewhat refreshing and relieve the stress of conversing with someone such as yourself who is such a tard.



Again you resort to childish insults. This is a bad sign of your character and it is wise to change this.

To you they may be insults and hurt. That is just because you haven't come around to the truth of the universe yet nor gotten a clue. When you do, you'll realize that it was spoken in an attempt to wake you up out of your moronic slumber and get help for your issues. This for you must be a painful process but once you finally get educated and are capable of thinking independently and deductively, you'll be much better off.

As for Xenos, as I said in previous post, picking out only portion of Xenos means that Xenos is no longer Xenos but new "hack job" chip of your imagination. So it seems you are unsatisfied with Xenos as companion processor of PPE and prefer new chip using select components of Xenos. It is ok to change your mind but it is better to change mind with apology instead of insults.

And picking the SPE and solely the SPE for your comparison is any different, how? As I believe has been already said, get a f*cking clue.



This is a very silly and wrong statement. You should not embarrass yourself like this, so I ask you to read the SPE documentation more closely next time.

I ask you to get a clue. The SPE without the EIB, L2 and associated coherency logic, memory controller, misc. chip control logic, and of course the PPE can't do anything. It can't even execute a single instruction without the EIB. Yet you want to take the computational performance per area of a single SPE and compare it to the computational performance of the entire Xenos/C1 die. This is, as should be obvious to anyone skilled in the art or with half a brain, an unfair, illogical, and incorrect comparison.

Aaron Spink
speaking for myself inc.
 
ihamoitc2005 said:
SPE (single vector processor):
25.6 Gflops ...
from 14.5 Sq. mm = 1.77 Gflops/Sq. mm
from 21m transistors = 1.22 Gflops/million transistors

Xenos (as vector processor, no edram):
192 Gflops ...
from ~200 (?) Sq. mm = .96 Gflops/Sq. mm
from 232m transistors = .83 Gflops/million transistors
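Taking the quoted figures at face value (they are the post's own numbers, not verified specifications), the density ratios can be reproduced directly:

```python
# Performance-density figures as quoted in the post above. The Xenos
# numbers (192 GFLOPS, ~200 mm^2) are the poster's own and are disputed
# elsewhere in the thread, so treat these as arithmetic only.
spe_gflops, spe_mm2, spe_mtrans = 25.6, 14.5, 21
xenos_gflops, xenos_mm2, xenos_mtrans = 192.0, 200.0, 232

print(round(spe_gflops / spe_mm2, 2))        # SPE GFLOPS per mm^2
print(round(spe_gflops / spe_mtrans, 2))     # SPE GFLOPS per million transistors
print(round(xenos_gflops / xenos_mm2, 2))    # Xenos GFLOPS per mm^2
print(round(xenos_gflops / xenos_mtrans, 2)) # Xenos GFLOPS per million transistors
```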

Oh and you might, just might, want to use the correct flops number for Xenos. Just a guess.

Aaron Spink
speaking for myself inc.
 
aaronspink, what are you trying to prove?

I don't know what you are trying to prove, aaronspink, when you and a lot of people on this forum know that the combined power of the SPEs is much more powerful than the XCPU... stop wasting your time and energy.
 
supa50 said:
I don't know what you are trying to prove, aaronspink, when you and a lot of people on this forum know that the combined power of the SPEs is much more powerful than the XCPU... stop wasting your time and energy.

The issue currently being discussed isn't the SPEs vs Xenon but the SPEs vs Xenos/C1. Follow the conversation please.

And the combined power of a trillion 8086 is faster than a P4. But that and a dollar will get you a bag of chips.

Aaron Spink
speaking for myself inc.
 
one said:
ARM, SuperH, and PowerVR lack the absolute performance of Cell, and perhaps the relative performance/watt too. Will ARM get enough performance for future automobiles with image recognition for collision avoidance, or intelligent suspension control by road surface simulation?

But ARM suits the market now, whereas Cell does not. So Cell doesn't become viable until ARM beefs up enough that it has the same power envelope as Cell?
 
aaronspink said:

Your argument seems to be working a little something like this:

You: SPUs are no good because they aren't real processors.
Us: What do you mean they're not real.
You: It needs other logic to operate. Er... I invoke Turing! And use some big words I hope you won't understand!
Us: (ignoring the verbal sleight of hand) What, you mean like every other chip in existence?
You: Er... no. I have decided on absolutely no evidence at all that this case is different, because otherwise I would have no argument.
Us: But any differences are just design choices, not inherent flaws.
You: This isn't true but I can't tell you why... besides, there are many things it can't do. Like: {list of badly defined examples}
Us: But it can do those things. It *is doing* those things. Right here, right now. Some of them are even in this manual you obviously haven't bothered reading.
You: Ahhhh... but it's doing those things using {mumble mumble} technology not {cough} technology like a real CPU would. So it's not a real CPU.
Us: That's just semantics!
You: {distracting insult}

Rinse, repeat...

You're almost implying that I could take an ordinary CPU out of its packaging, put it on my table and it'd be sitting there processing away on its own like magic. Cell meanwhile needs all this extra stuff to help it, like electricity, and is therefore crippled beyond repair.

I'm not sure what your point is. All you're really saying is "Cell's architecture is different to other current designs and it does things in different ways". Which is kind of stating the blindingly obvious. You haven't managed to come up with anything an SPU can't do, except by using semantics to attempt to limit the definition of the task or method.

So what if an SPU *instruction* can't directly access memory, but instead has to have that memory copied locally first? The compiler could even hide that if it wanted to. Lots of CPUs have limited instruction sets that make the programmer/compiler jump through hoops to use efficiently. Lots of new architectures required programmers to do things differently in order to work efficiently. Otherwise we'd still all be using 8 bit values and doing floating point calculations in software.
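The copy-in-then-compute flow described above can be sketched with a toy model (plain Python standing in for DMA and the local store; every name here is illustrative, none of them are real SPU API calls):

```python
# Toy model of the SPU constraint discussed above: the "processor" may
# only compute on data after it has been explicitly copied into a small
# local store, mirroring how an SPE DMAs main memory into its 256 KB
# local store. All names are illustrative, not real SPU intrinsics.

LOCAL_STORE_SIZE = 256 * 1024  # bytes, as on a real SPE

def dma_get(main_memory, offset, length):
    """Stand-in for a DMA transfer: copy a chunk into 'local store'."""
    return main_memory[offset:offset + length]

def process_stream(main_memory, chunk=4096):
    """Double-buffered sum: fetch chunk i+1 while 'computing' on chunk i.
    This overlap is exactly what a compiler or runtime could hide."""
    assert 2 * chunk <= LOCAL_STORE_SIZE  # both buffers must fit (1 element ~ 1 byte here)
    total = 0
    pending = dma_get(main_memory, 0, chunk)               # prefetch first chunk
    for offset in range(chunk, len(main_memory) + chunk, chunk):
        current = pending
        if offset < len(main_memory):
            pending = dma_get(main_memory, offset, chunk)  # overlap next fetch
        total += sum(current)                              # compute only on the local copy
    return total
```

The result is identical to a direct loop over main memory; the only difference is the staging through local buffers, which is the "hoop" being discussed.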

So to summarise, Cell is not flawed, just different.
 
aaronspink, I notice, is on some kind of crusade against CELL, and I'm not sure why. I guess he is here to save us from ourselves through some ambiguous education program. Just a few more insights from him, and we will all be saved from the big nasty piece of garbage that CELL is.

Kind of reminds me of all the PS2 Emotion Engine haters, and yet in the end some really good results have come out of it. Now the EE has nowhere near the overall capability, ease of programmability, and of course overall performance of CELL, so they should be able to produce some amazing things with it. But then again, we have been reminded that if you do not synchronize your data transfers, the whole thing comes to a grinding halt. Never assume developers can ever solve these problems, as they want kindergarten ease of programmability here! Developer intelligence you see is going down with time! They are on the verge of being semi-retarded.

btw, I should know about the EE haters, as I was one of them, but after recently picking up a PS2, I'm amazed at what it can do. All that with the "crappy" Graphics Synthesizer which I thought was flawed for so long also.
 
Edge said:
Developer intelligence you see is going down with time! They are on the verge of being semi-retarded.

I can barely tie my own shoe laces. I had to get my mum to type this post for me.
 
aaron may or may not be on a crusade, but it's a matter of fact that the truth often comes by way of an intelligent argument.

i know the console forum has lots of cell proponents, and i personally am more than curious about this chip, but nevertheless i always listen to what experienced and intelligent people have to say, even if it comes in a somewhat 'holier-than-thou' package.

and last but not least, many of the not so constructive posts aaron has made recently on this or similar topics have come as a result of provocations on the part of others, so before accusing anybody of being on a crusade, ask yourself if that is not a self-fulfilling prophecy.

bottom line being, can we get back to civilized discussion?
 
one said:
ARM, SuperH, and PowerVR lack the absolute performance of Cell, and perhaps the relative performance/watt too. Will ARM get enough performance for future automobiles with image recognition for collision avoidance, or intelligent suspension control by road surface simulation?

The issue with the cell phone market is that first you have to fit within the power envelope, and then you can start talking about performance. Currently ARM fits well within the cell phone power envelope. We're talking 1 W and below peak power consumption for this end of the market, which Cell isn't really geared at.

Cell might be able to find a market in the automotive sector but that will primarily have to come on a cost basis. Right now a lot of the processors available for the auto market handle the scenarios very well. We'll see if Toshiba, IBM, or Sony end up making a Cell with the extended operating range required for the automotive market. Right now they don't have a part that can really meet the requirements of the market but then again, they are focussed on the console market currently, as well they should be.

Aaron Spink
speaking for myself inc.
 
Indeed the processors currently in the systems work well, otherwise they wouldn't be there.

Cell, however, currently offers a lot of performance per watt and per mm². Cars with radar systems, visible-light recognition and other safety features are coming, and getting these running cheaply is something Cell can perhaps do well, as the SPEs are well designed for these sorts of tasks (small, repetitive, etc.). Its DSP nature and lack of out-of-order components also make it better than, say, a P4 for this work (in DSP work, timing consistency often matters more than sheer speed), as it doesn't do anything fancy: it does what it says on the box.

Comparing an SPE to the whole chip is of course silly; however, for the sake of the scenario you could assume that the data for the task is already loaded into the LS/cache of the processors (otherwise we need memory, a data source, etc., and then it gets silly, because if my data source is a massive radar and yours is a small camera then it's not really fair). The SPEs do come with an overhead to function (a PPE plus some glue logic), but on a per-SPE basis that's pretty small: each SPE is perhaps 50% larger once you account for that functionality (the SPEs are roughly 2/3 of the die, IIRC). That still gives, in your example (25.6 GFLOPS/SPE, 14.5 mm²), a performance ratio of >1 GFLOPS/mm², which isn't half bad, I would say.
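The overhead-adjusted figure works out as follows (the 50% per-SPE overhead is the post's own rough assumption, not a measured number):

```python
# Overhead-adjusted SPE density, per the reasoning above: charge each
# SPE extra area for its share of the PPE, EIB and other glue logic.
spe_gflops = 25.6   # per SPE, as quoted earlier in the thread
spe_mm2 = 14.5      # per SPE, as quoted earlier in the thread
overhead = 1.5      # assumed: each SPE "costs" ~50% more area in context

effective_mm2 = spe_mm2 * overhead     # ~21.75 mm^2 per SPE with overhead
density = spe_gflops / effective_mm2   # GFLOPS per mm^2 including overhead
print(round(density, 2))
```

Even with the shared logic charged against every SPE, the ratio stays above 1 GFLOPS/mm².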
 
MrWibble said:
I can barely tie my own shoe laces. I had to get my mum to type this post for me.

ROFL. Guess that's why they (the developers) can get paid six-figure salaries these days, because retards deserve the money.

I don't think they are getting dumber, I think it's harder to coordinate large teams now that we have large projects. Before, a single programmer could develop a game (Atari); now it's impossible.
 
For embedded markets with orders-of-magnitude tighter restrictions on cost/size/power/heat, such as portables and automotive systems, advanced near-future functions like image, voice, and condition analysis can be accelerated by SGX co-processing, with its highly scalable, data-local, multithreaded programmability that exceeds Shader Model 3.
 
compres said:
ROFL. Guess that's why they (the developers) can get paid six-figure salaries these days, because retards deserve the money.
Hmmm. If people got paid what they deserved, most managers would be on the bottom end of the salaries!
I don't think they are getting dumber, I think it's harder to coordinate large teams now that we have large projects. Before, a single programmer could develop a game (Atari); now it's impossible.
I actually think they are getting dumber. Not because devs that were smart are becoming stupid, but because more and more average people are entering the programming sector as a career move despite not being well suited to the job. There are plenty of stories I hear from my coder friends about the trashy code some of their colleagues write, and I knew when I went through uni a lot of classmates in the Comp Sci course who passed but aren't at all suited to programming, just as I'm not suited to Biochemistry despite having a degree in it.

It used to be that to be a programmer, you needed talent. Now, especially with the high turnaround of junior programmers, you just need a qualification and an ability to string instructions together, without being able to do that well. Someone will employ you (though now, with what seems to be a saturated market, it's no longer a given that a programmer will find employment, at least in the UK) and you can write mediocre code that, as part of a team, can bring down the average quality quite nicely from those veterans and talented individuals who really understand what they're doing. I think companies need programmers so badly that they can't be choosy: better some naff programmers than no programmers if you're a software developer. But because of this the mean average of developer talent is dropping.
 
Shifty Geezer said:
Hmmm. If people got paid what they deserved, most managers would be on the bottom end of the salaries!
I actually think they are getting dumber. Not because devs that were smart are becoming stupid, but because more and more average people are entering the programming sector as a career move despite not being well suited to the job. There are plenty of stories I hear from my coder friends about the trashy code some of their colleagues write, and I knew when I went through uni a lot of classmates in the Comp Sci course who passed but aren't at all suited to programming, just as I'm not suited to Biochemistry despite having a degree in it.

It used to be that to be a programmer, you needed talent. Now, especially with the high turnaround of junior programmers, you just need a qualification and an ability to string instructions together, without being able to do that well. Someone will employ you (though now, with what seems to be a saturated market, it's no longer a given that a programmer will find employment, at least in the UK) and you can write mediocre code that, as part of a team, can bring down the average quality quite nicely from those veterans and talented individuals who really understand what they're doing. I think companies need programmers so badly that they can't be choosy: better some naff programmers than no programmers if you're a software developer. But because of this the mean average of developer talent is dropping.


Hmm, I really wonder which category I'd fall into. Computer science isn't my major, but I am attempting a minor in it.
Back in high school I never got into independent coding (like the people who make mods for games), but I was easily the most advanced in my high school programming classes, even above the teacher. Most of the students could barely write a working program, even one that didn't do anything. That said, my code was still complete crap, so I wonder just how many people are cut out to be good programmers.

First year of Java I wrote an RPG battle system (using Java Swing for the graphics) which was basically just a ton of nested for loops. Second year of Java I just played around with Java 3D for my final project, which was a waste of time. And in my Visual Basic class I made a multiplayer version of Bomberman, which basically means I never bothered to try and program AI. I also never bothered to put walking animations in: I initially made the graphics an 8 by 8 grid, and the characters would walk in place and then move. (It would have been simple to rewrite how the graphics were displayed, since the 8x8 grid was just an arbitrary restriction, but I didn't bother.) Also, performance was crap on any computer with less than 512 KB of cache, I believe because I was declaring arrays of images inside the game loop (which I was doing because I got tired of having 3 different arrays of images when my graphics class only accepted 1).
 
Shifty Geezer said:
Hmmm. If people got paid what they deserved, most managers would be on the bottom end of the salaries!
I actually think they are getting dumber. Not because devs that were smart are becoming stupid, but because more and more average people are entering the programming sector as a career move despite not being well suited to the job. There are plenty of stories I hear from my coder friends about the trashy code some of their colleagues write, and I knew when I went through uni a lot of classmates in the Comp Sci course who passed but aren't at all suited to programming, just as I'm not suited to Biochemistry despite having a degree in it.

It used to be that to be a programmer, you needed talent. Now, especially with the high turnaround of junior programmers, you just need a qualification and an ability to string instructions together, without being able to do that well. Someone will employ you (though now, with what seems to be a saturated market, it's no longer a given that a programmer will find employment, at least in the UK) and you can write mediocre code that, as part of a team, can bring down the average quality quite nicely from those veterans and talented individuals who really understand what they're doing. I think companies need programmers so badly that they can't be choosy: better some naff programmers than no programmers if you're a software developer. But because of this the mean average of developer talent is dropping.

I agree with you 100%; it seems I was thinking about existing programmers getting dumber, not the new ones. Some of the blame I will put on those new languages, like .NET, which try to do everything for the programmer.

How much can we optimize Java, BTW? I would not use Java for performance-intensive tasks, but maybe I'm mistaken.
 
You run away again.

aaronspink said:
Well this should be fun. Somehow I don't feel any pain. In fact, comments like: "Apparently you enjoy being a tard" are somewhat refreshing and relieve the stress of conversing with someone such as yourself who is such a tard.

To you they may be insults and hurt. That is just because you haven't come around to the truth of the universe yet nor gotten a clue. When you do, you'll realize that it was spoken in an attempt to wake you up out of your moronic slumber and get help for your issues. This for you must be a painful process but once you finally get educated and are capable of thinking independently and deductively, you'll be much better off.

And picking the SPE and solely the SPE for your comparison is any different, how? As I believe has been already said, get a f*cking clue.

Read your post yourself and you will see how you must work to improve your character. It is unfortunate that you are unable to converse with logic and honor. If you feel you have "hurt" me, you do yourself some flattery, for in reality it is you who have been hurt by your uncivilized behaviour in a public forum.

I ask you to get a clue. The SPE without the EIB, L2 and associated coherency logic, memory controller, misc. chip control logic, and of course the PPE can't do anything. It can't even execute a single instruction without the EIB. Yet you want to take the computational performance per area of a single SPE and compare it to the computational performance of the entire Xenos/C1 die. This is, as should be obvious to anyone skilled in the art or with half a brain, an unfair, illogical, and incorrect comparison.

Once again you run away from reality: I presented the information you asked for and reality was not to your liking. I presented the facts to suit your own latest desired comparison of only Xenos (not including eDRAM) with the entire CELL, including all components such as the EIB and memory controller. I only leave out the CPU and L2 because those are common both to the setup of PPE and Xenos which you proposed and to the PPE and SPE(s) which is the design choice of STI.

As the number results of that comparison show, the CELL design of STI with 7 active SPEs is superior to Xenos in performance and capability, and the CELL design with 8 active SPEs has significantly more performance and capability even with the extra baggage of all the non-computational components added, increasing average die area per active SPE.

This has been shown many times yet you pretend I have not shown this. Flipping back through the pages of this thread will refresh your memory.

Of course, the original comparison you suggested was Xenos vs SPE, due to your claim that because Xenos is more "specialized" it has higher performance density than the SPE. In reality the SPE has almost double the performance density of Xenos and much more flexibility, so again you have been futile in proposing alternatives to the SPE as companion processor of the PPE.

If you have forgotten the numbers to this original comparison, then flip back to refresh your memory, I have posted it too many times and will refrain from reposting yet again.

Bottom line is you have now lied and changed your mind many times and resorted to using insults when your poor thinking has resulted in contradictory statements and failure to come to terms with your past mistaken statements. It will not surprise me if you respond with further savage comments, as it is clearly your nature at this stage of life; however, I suggest that one day you stop running away into self-justifying delusion to fulfill preconceived notions. Then you will begin to learn what you are trying to discuss, and then you will be able to speak without savagery and in the manner of civilization.
 
Why hasn't Cell caught on yet? When will Sony use it outside of the PS3? It seems to be good for TVs, DVD players, etc. Why did Apple turn it down? Why isn't IBM pushing it? Maybe there is a chance it's not all it's cracked up to be. Getting the 9th power out of it is maybe too costly for devs.
 
dukmahsik said:
Why hasn't Cell caught on yet? When will Sony use it outside of the PS3? It seems to be good for TVs, DVD players, etc. Why did Apple turn it down? Why isn't IBM pushing it? Maybe there is a chance it's not all it's cracked up to be. Getting the 9th power out of it is maybe too costly for devs.
That's an interesting question, but my new question is, why don't ATI use unified shader architecture in their primary business?
 