PS3 Spec change possibility?

Well actually, for myself I see the two biggest variables as the RSX and/or the GDDR3 speeds. I agree that Cell and XDR are likely not going anywhere. But in the last month or so Samsung's new 1.1ns RAM has been setting the world on fire, and though it was too late for MS to consider it, I don't rule out entirely that Sony might switch to it for the bank of GDDR.

As for RSX, the clock speed is really the smallest of my knowledge gaps.
 
ROG27 said:
for some reason I get the feeling that just because the CELL and the memory structure were finalized at last May's E3...that RSX wasn't necessarily taped out and was still being tweaked...and is, in fact, PS3's biggest variable (with a low degree of possible variability, I might add). The CPU won't change, the memory won't change, but the RSX's specs may not have been presented in finalized form since it was/is still in development.

So, maybe the RSX has changed slightly since E3?

maybe not performance-wise but flexibility of capabilities, perhaps?


If RSX is changed, it will be to reduce the clock, not increase it.

Again, you have the specs. You've given the specs to developers.

If you increase the yields but don't change the specs, you make more money.
If you change the specs, you make less money.


I'll be more than happy to change this view if someone can give me a rational and logical explanation of how increasing RSX, memory, or Cell clock speeds would make more money for Sony.
 
xbdestroya said:
But in the last month or so Samsung's new 1.1ns RAM has been setting the world on fire, and though it was too late for MS to consider it, I don't rule out entirely that Sony might switch to it for the bank of GDDR.
.


That would require a complete redesign of RSX's memory controller. It's a bit late for that, don't you think?
 
Powderkeg said:
That would require a complete redesign of RSX's memory controller. It's a bit late for that, don't you think?

I wonder. It's the same memory the GTX512 uses, is it not? Why would they need a redesign? We're not talking GDDR4 here.
 
xbdestroya said:
I wonder. It's the same memory the GTX512 uses, is it not? Why would they need a redesign? We're not talking GDDR4 here.
Yeah, there would be no need for a mem controller redesign.
Unfortunately, I don't think Samsung can provide all the mem modules Sony needs ;)
And that kind of mem would be bloody expensive anyway...
 
xbdestroya said:
It's the same memory the GTX512 uses, is it not?

Is it?

If it is, then nevermind.

But the main thing is the latency change. There is a lot of predictive logic thrown into that memory controller and if you change the latency times significantly you may throw off that prediction process.

And I said may. I'm just guessing here so don't bite my head off if I'm wrong.
 
Powderkeg said:
If RSX is changed, it will be to reduce the clock, not increase it.

Again, you have the specs. You've given the specs to developers.

If you increase the yields but don't change the specs, you make more money.
If you change the specs, you make less money.


I'll be more than happy to change this view if someone can give me a rational and logical explanation of how increasing RSX, memory, or Cell clock speeds would make more money for Sony.

The question is, were the developers told the same specs as the public?

Also, the perception of power among the early-adopter crowd, and any architectural changes that increase ease of development (eliminate bottlenecks), would be the rational arguments you'd be looking for. Early adopters set trends, and happy devs are signed-on devs. More devs mean more content to appease a wider audience, which means more money.
 
nAo said:
Yeah, there would be no need for a mem controller redesign.
Unfortunately, I don't think Samsung can provide all the mem modules Sony needs ;)
And that kind of mem would be bloody expensive anyway...

I was thinking the supply might be a serious constraint as well. Still, when MS and Sony announced their use of GDDR3 at 1400MHz DDR, they weren't exactly bargain-hunting, as it was the best of its day. Now there's one better. ;)

Not to say the expense wouldn't be a consideration, but since a year or two down the line its price difference from 1400MHz parts would be marginal at most, it might make sense from a long-term perspective. But again I agree, for a Spring '06 launch, low quantities may simply make it infeasible anyway.
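For what it's worth, the speed-grade arithmetic behind "1.1ns" versus the announced 1400MHz DDR is simple to sketch. A hedged illustration (the function name and sample ratings are mine, not from any datasheet): GDDR3 parts are rated in nanoseconds per cycle, and DDR transfers data twice per clock.

```python
# Hedged sketch: converting a GDDR3 speed rating (ns per cycle) into
# a clock rate and an effective DDR data rate. Illustrative only.

def gddr_rates(rating_ns: float) -> tuple[float, float]:
    clock_mhz = 1000.0 / rating_ns      # one cycle per rating_ns
    effective_mhz = 2.0 * clock_mhz     # DDR moves data on both clock edges
    return clock_mhz, effective_mhz

# ~1.43ns parts -> ~700 MHz clock, the familiar "1400 MHz" effective rate
print(gddr_rates(1.43))
# 1.1ns parts -> ~909 MHz clock, ~1818 MHz effective
print(gddr_rates(1.1))
```

So the 1.1ns parts would be roughly a 30% bandwidth bump over the announced spec, which is why they're interesting at all.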
 
Well, an increase in RAM can happen, though I doubt it; 700MHz GDDR3 will still be pretty expensive, and I don't see them going with even faster RAM.
 
ROG27 said:
Also, the perception of power among the early-adopter crowd, and any architectural changes that increase ease of development (eliminate bottlenecks), would be the rational arguments you'd be looking for. Early adopters set trends, and happy devs are signed-on devs. More devs mean more content to appease a wider audience, which means more money.


Would a 20% increase in clock speed sell more consoles?

Would it sell more games?

If you can't definitively say yes to both, then it's moot. With better-than-expected yields, leaving the clock speed alone makes them more money. That is absolutely certain.

If you increase the clock, your yields will suffer. You've got to show proof positive that the reduced yield is made up for with increased sales. It might make the job of the developer a little easier, but he isn't going to not make his PS3 game just because the RSX clock is 550MHz and not 600MHz. Early adopters aren't going to cancel their preorders because RSX speeds weren't increased.

The expectation of these clock speeds has been set. There is absolutely no downside to keeping them that way.

Increasing them is a gamble that could cut into profits, with no certainty or even likelihood of an increase in sales to make up for it.
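The yield argument above boils down to per-die cost arithmetic. A toy sketch, assuming completely made-up wafer numbers (none of these are real Sony or NVIDIA figures), just to show why a worse yield raises the cost of every chip that ships:

```python
# Toy break-even sketch for the yield argument. All numbers are
# invented for illustration; they are not real manufacturing figures.

def cost_per_good_die(wafer_cost: float, dies_per_wafer: int,
                      yield_rate: float) -> float:
    # The wafer costs the same either way; only the number of
    # usable chips changes with yield.
    return wafer_cost / (dies_per_wafer * yield_rate)

base = cost_per_good_die(5000, 200, 0.60)    # conservative clock
bumped = cost_per_good_die(5000, 200, 0.45)  # higher clock, worse yield
print(base, bumped)  # every shipped chip now costs more to produce
```

Unless the clock bump sells enough extra units to cover that gap, the conservative clock wins, which is exactly the point being made.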
 
Powderkeg said:
Increasing them is a gamble that could cut into profits, with no certainty or even likelihood of an increase in sales to make up for it.
This makes sense; nonetheless, Sony increased PS2's CPU and GPU clock frequencies after the official specs were announced, IIRC.
 
Powderkeg said:
Would a 20% increase in clock speed sell more consoles?

Would it sell more games?

If you can't definitively say yes to both, then it's moot. With better-than-expected yields, leaving the clock speed alone makes them more money. That is absolutely certain.

If you increase the clock, your yields will suffer. You've got to show proof positive that the reduced yield is made up for with increased sales. It might make the job of the developer a little easier, but he isn't going to not make his PS3 game just because the RSX clock is 550MHz and not 600MHz. Early adopters aren't going to cancel their preorders because RSX speeds weren't increased.

The expectation of these clock speeds has been set. There is absolutely no downside to keeping them that way.

Increasing them is a gamble that could cut into profits, with no certainty or even likelihood of an increase in sales to make up for it.

Agreed on the performance end (MHz/FLOPS)...I was thinking more along the lines of how RSX handles processing. We don't know much about it yet...is it traditional?...is it pixel shaders only (vertex processing on Cell)?...is it an alternative to the unified shaders MS is touting?

I was, in my message prior, referring to architectural efficiency.
 
I see the main determiner of RSX's clockspeed as being voltage/power/heat concerns, not speed binning. For example, any Athlon64/Sempron can reach 2 GHz - you're not going to be increasing or decreasing your yields by launching chips in any range from 2 GHz on down. Now what we'd need to know is, similar to what Hofstee was saying in Version's post, what clockspeed can RSX reach before the power increase needed becomes lopsided against the speed increase gained?

I think they'll go for whatever speed allows them the max 'efficient' voltage while still staying within certain power/heat parameters. I don't see a speed wall as being the limiting factor, and thus likely not a big contributor to diminished yields if they bump it (or reduce it) from 550, because speed binning shouldn't play too large a role.

PS - Keep in mind that for myself, I'm not saying that any of these changes are likely at all - I'm simply presenting them as the *most* likely.
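The voltage/power trade-off described above follows the usual CMOS rule of thumb that dynamic power scales roughly with C·V²·f: frequency costs power linearly, but any extra voltage needed to reach that frequency costs power quadratically. A hedged sketch with illustrative numbers (the 1.1V figure is invented, not an RSX spec):

```python
# Rule-of-thumb sketch of dynamic power, P ~ C * V^2 * f, normalized
# to a 550 MHz / nominal-voltage baseline. Numbers are illustrative
# only; none of them are real RSX figures.

def relative_power(freq_mhz: float, volts: float,
                   base_freq: float = 550.0, base_volts: float = 1.0) -> float:
    # Linear in frequency, quadratic in voltage.
    return (freq_mhz / base_freq) * (volts / base_volts) ** 2

# Same voltage, +9% clock -> roughly +9% power
print(relative_power(600, 1.0))
# But if 600 MHz needs, say, 10% more voltage, power jumps ~32%
# for that same 9% speed gain -- the "lopsided" point above
print(relative_power(600, 1.1))
```

That quadratic voltage term is why there's a knee past which chasing clock speed stops being worth the power and heat it buys.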
 
@xbdestroya: I remember reading similar things about the choice of Cell's clockspeed - Sony took the highest possible clock rate at the lowest possible voltage, even though all of them could go higher on increased voltages. The other option would be *puts on sweaty shirt* "Holes, Holes, Holes... "

IMHO an additional 128 or 256MB would make the biggest impact, if you'd have to show the difference. Not saying it will happen, as it won't, but even first-gen games could effectively use it to have better textures/normal maps than Xbox equivalents.
 
xbdestroya said:
I see the main determiner of RSX's clockspeed as being voltage/power/heat concerns, not speed binning. For example, any Athlon64/Sempron can reach 2 GHz - you're not going to be increasing or decreasing your yields by launching chips in any range from 2 GHz on down. Now what we'd need to know is, similar to what Hofstee was saying in Version's post, what clockspeed can RSX reach before the power increase needed becomes lopsided against the speed increase gained?

I think they'll go for whatever speed allows them the max 'efficient' voltage while still staying within certain power/heat parameters. I don't see a speed wall as being the limiting factor, and thus likely not a big contributor to diminished yields if they bump it (or reduce it) from 550, because speed binning shouldn't play too large a role.

PS - Keep in mind that for myself, I'm not saying that any of these changes are likely at all - I'm simply presenting them as the *most* likely.

good point...we don't really have a good idea what the speed wall is at the 90 nm process...but if early blunderings are any indicator...700 to 800 mhz is no sweat.
 
ROG27 said:
good point...we don't really have a good idea what the speed wall is at the 90 nm process...but if early blunderings are any indicator...700 to 800 mhz is no sweat.

Uh... I would *hardly* call 700 or 800MHz no sweat though! It's true that the XTs have been able to reach 1GHz, but that is sweating, sweating, sweating! Even at 700 and 800, there's considerable sweat. Even pushing past 600MHz, the X1800XT is sweating hard. This is a power-hungry, incredibly hot chip.

But different chips, different results. Obviously Xenos is doing quite well on 90nm at 500MHz, since yields were so good across the range that they were actually thinking of increasing the clock... and this was a while ago! In the end, though, they opted not to (honestly, they might as well have done it).

And with the G70 reaching near 600MHz on single-slot cooling at 110nm, I'm expecting good things at 90nm from NVidia.

Still, RSX can't afford the kind of power and heat budget these PC chips have. Whatever SOI, low-k, and 90nm can give them in terms of heat/power savings, that'll be what determines the clock/voltage they're willing to push to.
 
Aye, the form factor of the PS3 is much smaller than that of the PC.

So is the power draw.


550MHz seems a good clock speed in terms of heat and power for a closed-box design.

Just like ATI is able to get the X1800XT to 625MHz but Xenos is only at 500MHz on the same process (heck, people are hitting 1GHz with the X1800XT).
 
Titanio said:
Anything's possible, but I highly doubt it.

They've already done their job in terms of establishing it as "the most powerful console" in terms of perception, IMO. So unless devs were really complaining about something, a la the PSP and its memory upgrade, I don't think we'll see much change - and most devs seem very happy with it from a hardware POV.

Perception doesn't really matter though, reality does.
 
nAo said:
This makes sense; nonetheless, Sony increased PS2's CPU and GPU clock frequencies after the official specs were announced, IIRC.
True, though they were kinda on the other side last time. They needed to keep up with later consoles. Now they are the later console. Will it be as important to further surpass previous consoles? I dunno. I suspect Sony PR already has enough ammunition for the superior hardware claims. And that's probably enough.


Bill said:
Perception doesn't really matter though, reality does.
Is that truth, or is that your perception of it?
 