Why Do Cell Workstations Still Need Their Own Zip Code?

Titanio said:
It would be interesting to know if the new kits were 3.2GHz, though, given that the E3 kits were at least mostly 2.8GHz according to reports.
Our best guess comes from this thread
http://www.beyond3d.com/forum/showthread.php?t=31925
3.2 GHz chips are out there. I don't see why Sony wouldn't have any of them.

And as regards salvaging this thread, I see we've got it from a 1-star rating to 2 stars! Excellent work :mrgreen:
 
Titanio said:
It would be interesting to know if the new kits were 3.2GHz, though, given that the E3 kits were at least mostly 2.8GHz according to reports.
Well, it just shows the quality of most reports; AFAIK there has never been a dev kit at 2.8GHz, and certainly not the ones at E3. I personally set up something like 15 of the things at E3, so I have a fair idea of what was in them...
 
DeanoC said:
Well, it just shows the quality of most reports; AFAIK there has never been a dev kit at 2.8GHz.
IIRC, there can't even be such a speed in an actual machine. Verifying the chip in the lab is a different matter, but in all the examples I've seen of an actual workstation/server/devkit, the CPU speed is the same as the effective XDR bus speed. And XDR, at the moment, only comes in 100 MHz grades (100 MHz grades at the DRAM, which means 800 MHz at the bus). 2.8 GHz would imply a 50 MHz step.

The only report I know of that claimed a 2.8 GHz speed was a single weird interview with a purported developer who seemed to be off his rocker, and then the Inq. ran with it and made a series of FUD articles off it.
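To put a quick number on that (a rough sketch only; the 8x XDR data rate and the assumption that the CPU clock is tied 1:1 to the effective XDR bus speed come from the reasoning above, not from any official spec sheet):

Code:
XDR_MULTIPLIER = 8  # XDR transfers 8 bits per pin per DRAM clock (octal data rate)

for dram_clock_mhz in (300, 400, 500):  # XDR's 100 MHz DRAM grades
    effective_ghz = dram_clock_mhz * XDR_MULTIPLIER / 1000.0
    print(f"{dram_clock_mhz} MHz grade -> {effective_ghz:.1f} GHz effective bus / CPU clock")

# Prints 2.4, 3.2 and 4.0 GHz. A 2.8 GHz part would need a 350 MHz DRAM grade,
# i.e. exactly the 50 MHz step that XDR doesn't come in.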
 
DeanoC said:
Well, it just shows the quality of most reports; AFAIK there has never been a dev kit at 2.8GHz, and certainly not the ones at E3. I personally set up something like 15 of the things at E3, so I have a fair idea of what was in them...

Hah, well it just goes to show who I should and should not be listening to. Thanks much, this is useful to know.
 
ShootMyMonkey said:
IIRC, there can't even be such a speed in an actual machine. Verifying the chip in the lab is a different matter, but in all the examples I've seen of an actual workstation/server/devkit, the CPU speed is the same as the effective XDR bus speed. And XDR, at the moment, only comes in 100 MHz grades (100 MHz grades at the DRAM, which means 800 MHz at the bus). 2.8 GHz would imply a 50 MHz step.

The only report I know of that claimed a 2.8 GHz speed was a single weird interview with a purported developer who seemed to be off his rocker, and then the Inq. ran with it and made a series of FUD articles off it.

Interesting, I never really thought about that. It more or less confirms the notion that Cell won't be changing speeds before launch (if you're an optimist or a cynic who subscribes to the idea of that being a possibility, that is). It would have to be either 2.4GHz or 4GHz, and I doubt we'd be seeing such drastic changes in either direction at this point -- 3.2GHz seems to be more or less the only logical choice.
 
Finalised development kits

Hello,

Can someone confirm whether the final development kits are actually out with developers now?
- Final development kits as in hardware with the final CELL & RSX built to specification.
- Not concerned about the changing software levels on the kits.

If the kits do not contain the actual final RSX, what do they contain to emulate it? Dual 7900s?


Thanks,



Archy
 
archy121 said:
Hello,

Can someone confirm whether the final development kits are actually out with developers now?
- Final development kits as in hardware with the final CELL & RSX built to specification.
- Not concerned about the changing software levels on the kits.

If the kits do not contain the actual final RSX, what do they contain to emulate it? Dual 7900s?


Thanks,



Archy


Is no one answering the above because of NDAs, or is it my typos?




Archy
 
Demirug said:
Blade servers are nice but unfortunately useless if you need a bunch of image processors for your cave/wall. I would be more impressed by a Cell PCIe card. I heard that IBM has a lot of 6-SPU Cells and doesn't know what to do with them.

It is quite possible Toshiba will take these off their hands. Toshiba announced that they are going to put a Cell chip in every HDTV they manufacture, and it probably won't need to be as powerful as the PS3's.
 
I missed that announcement. Or are you referring to this old one?
http://homeentertainment.engadget.com/2005/01/05/toshiba-ces-cell-processors-for-all-tvs-in-2006-and-more
It's mid-2006 and there hasn't been any word of Cell in Toshiba's HDTVs. I can't find any such announcement on Toshiba's website, and I haven't heard any talk of Cell finally appearing in CE goods outside of the PS3. To date, after a couple of years of Cell talk, there hasn't been any indication of Cell actually appearing in CE goods. The various HD movie players, TVs, etc. from Toshiba and Sony are still using non-Cell tech. Though that market is supposed to exist for these processors, it doesn't yet look like either party is actually going to use them, with both making new devices without including Cell.
 
archy121 said:
Is no one answering the above because of NDAs, or is it my typos?

Archy

No, it's more likely NDAs than typos... positively confirming details about devkits (especially when discussing what might be the latest and/or final, which is a minefield for misinterpretation when comparing devkits and consumer hardware) is slightly dangerous territory...

I guess I'm OK saying that devkits have had RSX chips in some form for a while, in place of the graphics cards that the earlier units had. No unit has ever had a dual card setup - they only ever physically had one slot for a single card, and using two cards is non-trivial on the software side, so it wouldn't have been that useful for early work. Better to have a slower single card and know roughly what the scale-up will be, rather than try to use different hardware in an attempt to equal the power.
 
MrWibble said:
No, it's more likely NDAs than typos... positively confirming details about devkits (especially when discussing what might be the latest and/or final, which is a minefield for misinterpretation when comparing devkits and consumer hardware) is slightly dangerous territory...

I guess I'm OK saying that devkits have had RSX chips in some form for a while, in place of the graphics cards that the earlier units had. No unit has ever had a dual card setup - they only ever physically had one slot for a single card, and using two cards is non-trivial on the software side, so it wouldn't have been that useful for early work. Better to have a slower single card and know roughly what the scale-up will be, rather than try to use different hardware in an attempt to equal the power.

You are a true gent. Thanks for the reply.


Archy
 
Demirug said:
Blade servers are nice but unfortunately useless if you need a bunch of image processors for your cave/wall. I would be more impressed by a Cell PCIe card. I heard that IBM has a lot of 6-SPU Cells and doesn't know what to do with them.

http://www.mc.com/literature/literature_files/Cell_accelerator_board.pdf

$8000 :!:

(Attached image: CellPCI.jpg)
 
DeanA said:
Yeah, the current revision of devkit (DEH-R1040) is way, way quieter than earlier versions. I'd actually go as far as to say it's quieter than the last version of the PS2 dev hardware.

Dean

We've measured one at 72 dB at 1 metre, but that's prior to the fan control going in.
 
ERP said:
We've measured one at 72 dB at 1 metre, but that's prior to the fan control going in.
Yeah... everyone else in the office seems to have 1040s. Until the next rev of kit(!), I've got to make do with a 1010, and as such I find myself always apologising to the other guys in the room for turning the kit on.

Personally I can't wait to see a real (or debug) kit running. A noise level of only 29 dB is bloody impressive, whichever way you look at it...

Dean
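For a sense of scale between ERP's 72 dB figure and that 29 dB figure (back-of-the-envelope only, and assuming the two numbers are even measured comparably):

Code:
devkit_db = 72.0  # pre-fan-control DEH-R1040, measured at 1 metre
debug_db = 29.0   # the figure quoted above for a real/debug kit

diff_db = devkit_db - debug_db
pressure_ratio = 10 ** (diff_db / 20)  # sound pressure ratio
power_ratio = 10 ** (diff_db / 10)     # acoustic power ratio

print(f"{diff_db:.0f} dB quieter: ~{pressure_ratio:.0f}x less sound pressure, "
      f"~{power_ratio:.0f}x less acoustic power")
# ~43 dB quieter works out to roughly 140x less sound pressure and ~20,000x less acoustic power.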
 
MrWibble said:
No, it's more likely NDAs than typos... positively confirming details about devkits (especially when discussing what might be the latest and/or final, which is a minefield for misinterpretation when comparing devkits and consumer hardware) is slightly dangerous territory...

I guess I'm OK saying that devkits have had RSX chips in some form for a while, in place of the graphics cards that the earlier units had. No unit has ever had a dual card setup - they only ever physically had one slot for a single card, and using two cards is non-trivial on the software side, so it wouldn't have been that useful for early work. Better to have a slower single card and know roughly what the scale-up will be, rather than try to use different hardware in an attempt to equal the power.


I was just wondering: what do you mean by a dual card setup?
 