PS3 Development vs PS3 Console Systems

gosh

Newcomer
kaigai02l.gif


As you can see, the Cell processor will be 3.2GHz in both, but the development system will have twice as much memory as the console.

The development system will also have twice as much memory on the G70 as the console has on the RSX, and its bandwidth will be roughly three times the console's. This also confirms that the G70 is as powerful as the RSX because they would not put a weaker Graphics chip in development kits.

As you can also see, the final console will have an option to add an HDD, which the development system has as well, while the development system will not have a Blu-ray drive but the console will.
 
gosh said:
This also confirms that the G70 is as powerful as the RSX because they would not put a weaker Graphics chip in development kits.
This only confirms that the PS3 Evaluation System II has a G70. It confirms nothing else.
 
overclocked said:
I would think that the G70 used is the Quadro card; as a matter of fact I'm pretty sure that it is. :LOL:

I was thinking the same thing, but then I figured that in the end they don't really need to model stuff on the kit itself; they can model whatever they want with Maya or 3DSMax on a PC and then take it over. A Quadro only really helps with those apps, as far as I'm aware.
 
Speaking of this article by Goto: the southbridge chip in the PS3 Evaluation System (the current devkit version) is the same as the one in IBM's Cell blade server (PCI Express 4x), and XDR is at 2.4Gbps (75%), just like Cell at 2.4GHz.
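The 75% figure lines up with the clock. A quick sanity check (the 3.2Gbps XDR target rate is an assumption here, mirroring the announced 3.2GHz Cell clock):

```python
# Sanity check: the eval kit's Cell clock and XDR data rate are both
# 75% of the announced console targets. The 3200Mbps XDR target figure
# is an assumption, chosen to match the 3.2GHz Cell target.
target_cell_mhz = 3200
target_xdr_mbps = 3200

eval_cell_mhz = 2400
eval_xdr_mbps = 2400

print(eval_cell_mhz / target_cell_mhz)  # 0.75
print(eval_xdr_mbps / target_xdr_mbps)  # 0.75
```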
 
I was thinking the same thing, but then I figured that in the end they don't really need to model stuff on the kit itself; they can model whatever they want with Maya or 3DSMax on a PC and then take it over. A Quadro only really helps with those apps, as far as I'm aware.

Yes, but now they have a two-in-one solution, hehe... Nah, but I'm pretty sure that they use a Quadro card, which explains why there is more memory.
 
Cell is at 2.4GHz for the eval kits. Says so in the Sony slides. Not sure why they have 3.2GHz showing there. And I think the b/w through the southbridge is wrong. PEACE.
 
MechanizedDeath said:
Cell is at 2.4GHz for the eval kits. Says so in the Sony slides. Not sure why they have 3.2GHz showing there. And I think the b/w through the southbridge is wrong. PEACE.
It's definitely a typo... the article itself says it's at 2.4GHz for the CEB-2030.
kaigai60.jpg
 
Why so slow? If Cell was unveiled at 4 GHz, and ran even at 5 GHz, why are they managing little more than 2.4 GHz on current Cell chips? Yield trouble??
 
Shifty Geezer said:
Why so slow? If Cell was unveiled at 4 GHz, and ran even at 5 GHz, why are they managing little more than 2.4 GHz on current Cell chips? Yield trouble??
It's still in validation (Cell DD2 was taped out in Dec 2004)
 
gosh said:
This also confirms that the G70 is as powerful as the RSX because they would not put a weaker Graphics chip in development kits.
No, it doesn't; there's no information there at all that allows such a conclusion to be drawn. Besides, dev systems commonly have weaker graphics (and other) hardware than the final console.
 
MechanizedDeath said:
And I think the b/w through the southbridge is wrong. PEACE.
What do you mean 'wrong'?

It's a southbridge, not exactly fast. For example (picking the first motherboard maker Google turned up), according to VIA (http://www.via.com.tw/en/products/chipsets/flex-express/) their southbridge only has 0.5GB/s with 2x PCIe lanes.

Think about it: a PS3 requires a southbridge (for the optical disc, etc.) but doesn't require a northbridge (where you would normally stick a graphics card). So until the proper RSX turns up, it makes sense to just tack it onto the southbridge and live with the crap bandwidth...
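To put rough numbers on that: a first-generation PCIe lane carries about 250MB/s per direction after 8b/10b encoding, which is consistent with the VIA figure above. A quick sketch (the x4 and x16 lane counts are illustrative, matching the blade's PCIe 4x link and a standard graphics slot):

```python
# Rough PCIe 1.x bandwidth figures: 2.5GT/s per lane, 8b/10b encoding
# leaves ~250MB/s of payload bandwidth per lane, per direction.
LANE_MB_S = 250

def pcie_bw_gb_s(lanes):
    """Per-direction payload bandwidth in GB/s for a given lane count."""
    return lanes * LANE_MB_S / 1000

print(pcie_bw_gb_s(2))   # 0.5 -- matches VIA's 2-lane southbridge figure
print(pcie_bw_gb_s(4))   # 1.0 -- a PCIe 4x link like the Cell blade's
print(pcie_bw_gb_s(16))  # 4.0 -- a full x16 graphics slot
```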
 
DeanoC said:
MechanizedDeath said:
And I think the b/w through the southbridge is wrong. PEACE.
What do you mean 'wrong'?

It's a southbridge, not exactly fast. For example (picking the first motherboard maker Google turned up), according to VIA (http://www.via.com.tw/en/products/chipsets/flex-express/) their southbridge only has 0.5GB/s with 2x PCIe lanes.

Think about it, a PS3 requires a southbridge (for optical disk, etc.) but doesn't require a northbridge (where you would normally stick a graphics card). So until the proper RSX turns up it makes sense just to tack it on the southbridge and live with the crap bandwidth...
Thanks Deano. I thought someone translated the article and said it read 8GB combined up/down. PEACE.
 
Alstrong said:
When did they switch to 512MB of XDR?
Those are for the devkit only.
It's not uncommon for a devkit to have more RAM than the console.
 
MechanizedDeath said:
I thought someone translated the article and said it read 8GB combined up/down. PEACE.
Actually that part of the article is really making a comparison: a G70 on a PC using PCIe x16 has 8GB/s between CPU and GPU, with 4GB/s each way upstream and downstream. The current PS3 Evaluation System uses the 5GB/s southbridge, resulting in 2GB/s downstream. (I don't know where he gets these numbers; just quoting from the article.) The article then mentions that when FlexIO is implemented, it will be 20GB/s downstream and 15GB/s upstream. Compare that with the current (supposed) 2GB/s downstream...
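Putting the article's figures side by side (this is just a restatement of the numbers quoted above, in GB/s):

```python
# CPU<->GPU link bandwidth figures as quoted from the article (GB/s).
links = {
    "PC, PCIe x16 (per direction)":      4.0,
    "Eval kit southbridge (downstream)": 2.0,
    "FlexIO (downstream)":               20.0,
    "FlexIO (upstream)":                 15.0,
}

for name, bw in links.items():
    print(f"{name}: {bw} GB/s")

# FlexIO downstream vs. the current eval kit's downstream
ratio = links["FlexIO (downstream)"] / links["Eval kit southbridge (downstream)"]
print(ratio)  # 10.0 -- a 10x jump once FlexIO is in place
```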

Deano's reply also got me really curious about the southbridge. Why does it need such high bandwidth? Even though it is connected to the HDD, sound output, Ethernet and controller I/O, it seems like overkill. Am I reading too much into it to suspect that there is some additional function down there that is yet to be disclosed?
 
As I understand it the RSX's core is to be clocked at 550MHz while the 7800 GTX is clocked at 430MHz.

The RSX and Cell also have some special abilities when exchanging data between one another according to Sony etc.

There's no doubt more we don't know yet.

All in all I wouldn't conclude the 7800 GTX is on par with what the RSX will be. The parts may be siblings but I believe it will be easy to determine who's the big brother of the two.
 
Shifty Geezer said:
Why so slow? If Cell was unveiled at 4 GHz, and ran even at 5 GHz, why are they managing little more than 2.4 GHz on current Cell chips? Yield trouble??

I wouldn't call it yield trouble; more like normal development with budget concerns in mind. Pretty much every chip manufacturer clocks 90% of their chips below their maximum possible speed. It increases both yields and the reliability of the chips.

360 dev kits are similarly underclocked on both the CPU and GPU.
 
I read somewhere that it was nothing to do with yield concerns but had to do with heat output. They said that to have it clocked at 4GHz you would need a larger case or an exotic cooling solution.
 