GTX512/RSX analysis

JVD I completely disagree with you. Any spec boost - however small - would carry with it a psychological effect among the 'hardcore' far beyond what the boost itself would bring in real-life performance. It would be completely disingenuous for you to deny this, especially in an environment where everyone until now seemed to expect a spec decrease from Sony. And obviously power consumed goes up at higher frequencies - but at the same voltage, 550 to 570 would be basically no change at all, since dynamic power is linear in frequency at a fixed voltage.

I don't need to 'prove' anything because I am simply presenting a theory - and that theory makes sense. Since you bring up the Athlons, let me ask you, how many Athlons/Semprons can *not* reach 2 GHz? I doubt any. At 550 MHz, RSX's limiting factor bin-wise will be simply what voltage is required to get the majority of the chips there. And like I said before, if that speed can be 570 as readily as 550 - why would Sony not boost it? A question you still have not answered.

As for home systems, I actually run my 2800 at two speeds normally: 1100 MHz at 0.9v and 2100 MHz at 1.3v. I can tell you I have a *significantly* lower power draw and operating temp than you yourself do, actually able to cool my CPU passively at both voltages with my Zalman should I choose to. ;)
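The 'linear at equal voltage' claim above can be sketched with the standard first-order dynamic power model, P ≈ C·V²·f. The capacitance and voltage figures below are made-up placeholders chosen just to land near a plausible GPU wattage, not RSX specs:

```python
def dynamic_power(c_eff, volts, freq_hz):
    """First-order dynamic (switching) power estimate in watts."""
    return c_eff * volts ** 2 * freq_hz

C_EFF = 1.2e-7  # hypothetical effective switched capacitance (farads)
VOLTS = 1.2     # hypothetical core voltage

p_550 = dynamic_power(C_EFF, VOLTS, 550e6)
p_570 = dynamic_power(C_EFF, VOLTS, 570e6)
print(f"550 MHz: {p_550:.1f} W, 570 MHz: {p_570:.1f} W "
      f"(+{100 * (p_570 / p_550 - 1):.1f}%)")
# → 550 MHz: 95.0 W, 570 MHz: 98.5 W (+3.6%)
```

At a fixed voltage the two results differ only by the frequency ratio 570/550, which is exactly the linearity being argued for; a voltage bump, by contrast, enters squared.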
 
Based on this information and rumor Sony could improve RSX if it hasn't locked things down.
Especially the memory and some of the features.

It could also be that these features now being seen in the G70 and upcoming products are by-products of what is happening in RSX.

If Sony even announces next week that their final specs will be an estimated 10% better than the currently announced, it will divert more folks who aren't already set on buying a 360.

Interesting article.
Speng.
 
xbdestroya said:
JVD I completely disagree with you. Any spec boost - however small - would carry with it a psychological effect among the 'hardcore' far beyond what the boost itself would bring in real-life performance. It would be completely disingenuous for you to deny this, especially in an environment where everyone until now seemed to expect a spec decrease from Sony. And obviously power consumed goes up at higher frequencies - but at the same voltage, 550 to 570 would be basically no change at all, since dynamic power is linear in frequency at a fixed voltage.

I don't need to 'prove' anything because I am simply presenting a theory - and that theory makes sense. Since you bring up the Athlons, let me ask you, how many Athlons/Semprons can *not* reach 2 GHz? I doubt any. At 550 MHz, RSX's limiting factor bin-wise will be simply what voltage is required to get the majority of the chips there. And like I said before, if that speed can be 570 as readily as 550 - why would Sony not boost it? A question you still have not answered.

As for home systems, I actually run my 2800 at two speeds normally: 1100 MHz at 0.9v and 2100 MHz at 1.3v. I can tell you I have a *significantly* lower power draw and operating temp than you yourself do, actually able to cool my CPU passively at that voltage with my Zalman should I choose to. ;)

they already have a spec advantage , this boost you're talking about will end up costing them money for nothing. Hardcore people don't matter , sony fans are already crying as loud as they can about 2tflops and the power of the cell and how killzone is real time . Adding 20mhz to the gpu isn't going to change that for better or worse


As for why they wouldn't boost it , i've already explained many times , cooling , voltage draw , price , small or no performance gain in reality .

These are all reasons why they wont


As for cpu , we most likely have different cores. My cpu has its own performance. Same with the rsx . It will have its own differences in power draw at different clock speeds and it will be brand new on a new process .


There is no reason for sony to increase anything about the ps3 all it will do is add cost .
 
jvd said:
There is no reason for sony to increase anything about the ps3 all it will do is add cost .

We don't know the heat and power RSX generates - we also don't know under which constraints the RSX will be within the PS3 case. If the case and the airflow/cooling measures allow for a higher heat/power consumption than the initial 550MHz RSX would have allowed, I see no reason why this would cost Sony more. As a matter of fact, it makes sense to get the most out of your design and if that means higher clockrates without changing anything significant, I see all the reason they would.


:!:
Great thread btw Xbd. A shame people like jvd are well on the way to getting it closed, reducing the chance of civil / constructive discussion. :???:
 
jvd said:
they already have a spec advantage , this boost you're talking about will end up costing them money for nothing. Hardcore people don't matter , sony fans are already crying as loud as they can about 2tflops and the power of the cell and how killzone is real time . Adding 20mhz to the gpu isn't going to change that for better or worse


As for why they wouldn't boost it , i've already explained many times , cooling , voltage draw , price , small or no performance gain in reality .

These are all reasons why they wont

Answer this.

IF 99% of every RSX core that can achieve 550MHz at a certain voltage can also reach 570 at that same voltage, WHY wouldn't Sony raise it to 570?

Let's assume RSX is a 100-watt part, purely for ease of example.

Then at 570, the RSX would become a ~103 watt part. I don't think that's going to make cooling or power draw more difficult, do you? If RSX were a 50 watt part at 550 MHz, then it'd be ~52 watts at 570.
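A minimal sketch of the arithmetic above, assuming power scales linearly with clock at a fixed voltage:

```python
def scaled_power(base_watts, base_mhz, new_mhz):
    # Linear-in-frequency scaling at constant voltage.
    return base_watts * new_mhz / base_mhz

print(round(scaled_power(100, 550, 570), 1))  # → 103.6
print(round(scaled_power(50, 550, 570), 1))   # → 51.8
```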

As for cpu , we most likely have different cores. My cpu has its own performance. Same with the rsx . It will have its own differences in power draw at different clock speeds and it will be brand new on a new process .

I'm sure you could lower your voltage too if you wanted - give it a shot at sub-2200 MHz levels.

There is no reason for sony to increase anything about the ps3 all it will do is add cost .

And I am telling you, not necessarily.
 
jvd said:
There is no reason for sony to increase anything about the ps3 all it will do is add cost .


Not saying it will happen, but if the speed bump doesn't increase costs for Sony, then they will probably do it. If RSX can handle a little more clockspeed without turning into a furnace, they can do it. If it's free, then why not!

Still, this is all speculation so we'll have to wait.
 
Economically: if it costs Sony $50 to put in the current-spec RAM now, scaling down to $40 in two years' time and $30 in four years; and it would cost $100 to put in better RAM now, scaling down to $50 in two years' time and $40 in four years; then the price/performance might be worth Sony considering.

Remember it's also imperative Sony make the PS3 as future-proof as possible if MS are going to be spending so much on launching a console in four years' time (something that jvd has yet to explain why they should, despite my having asked twice politely, though he'll happily ask of others the motives of company decision-making around speculations such as these...)
 
Shifty Geezer said:
Economically: if it costs Sony $50 to put in the current-spec RAM now, scaling down to $40 in two years' time and $30 in four years; and it would cost $100 to put in better RAM now, scaling down to $50 in two years' time and $40 in four years; then the price/performance might be worth Sony considering.

Multiply those numbers by 100 million units and I beg to disagree, I'd say it's hardly worth +1 billion dollars...
However, I don't think it would cost that much to upgrade the memory in the long run. It would have to be really close in cost to the current setup to make it feasible.
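Dr Evil's multiplication can be made concrete by weighting Shifty's hypothetical per-unit premiums with equally hypothetical shipment volumes per price period:

```python
# Per-unit RAM cost ($) in each period: launch, +2 years, +4 years.
current_ram = [50, 40, 30]
better_ram = [100, 50, 40]
shipments = [20e6, 40e6, 40e6]  # assumed units sold per period (100M total)

extra = sum(n * (b - c) for n, b, c in zip(shipments, better_ram, current_ram))
print(f"Total extra cost: ${extra / 1e9:.1f} billion")
# → Total extra cost: $1.8 billion
```

The exact total obviously depends on how sales are split across the price periods, but any plausible split lands in the billion-dollar range Dr Evil describes.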
 
Dr Evil said:
Multiply those numbers by 100 million units and I beg to disagree, I'd say it's hardly worth +1 billion dollars...
However, I don't think it would cost that much to upgrade the memory in the long run. It would have to be really close in cost to the current setup to make it feasible.

I think Shifty was just showing an example of the scaling over time - like you, I seriously doubt that the 1.1ns GDDR3 would be priced at that sort of differential relative to the 1400MHz solution even at this early stage. What were ML's estimates on the RAM again? Something pretty low - $25 per 256MB - which, if we were to extrapolate from it (and I myself am not 100% willing to, honestly), would probably indicate a very nominal BOM increase for the improved RAM, to be incurred only at the beginning of the console's life - say $12.50? That would represent an arbitrary 50% price premium on the GDDR3 going from 1400 to ~1800.

Again, I don't trust Merrill's numbers whatsoever anyway, so perhaps if there is someone here with an account at DRAMexchange or elsewhere who could provide us some more tangible differentials between the current pricing on 1.1ns RAM and slower options, we would have something more solid at least on that front.
 
I think the problem jvd is having is that he assumes any increase is a cost increase, which isn't necessarily the case for chips. Nominal frequencies/voltages allow for a certain range (as you said, I doubt there is any A64 around that can't do at least 2ghz, even those clocked at 1.8ghz) -- it may end up being a similar situation, where a 550mhz RSX was a fairly conservative guess and they can increase it to ~570 on the same voltage while getting the same yields (in this case, there is no reason to supply a lower speed product because there aren't multiple markets that need to be met, like in the A64's case).

Either way, I'll be happy -- it is about the games, after all =o
 
Moving to 1.1ns GDDR3 would jump RSX VRAM bandwidth from 22.4GB/s to 28.8GB/s (up 29%) right?

Given that consensus seems to be that the VRAM bandwidth is the greatest issue for the PS3, if the cost of better GDDR3 is marginal why would Sony not opt for it?

That's the only spec upgrade I can conceivably see happening.
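The figures above check out if one assumes the widely reported 128-bit GDDR3 bus for RSX, with 1.4ns parts at 1400 MHz effective and 1.1ns parts at 1800 MHz effective:

```python
def gddr3_bandwidth_gb_s(bus_bits, effective_mhz):
    # Bandwidth = bus width in bytes * effective transfers per second.
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

bw_old = gddr3_bandwidth_gb_s(128, 1400)  # 1.4ns GDDR3
bw_new = gddr3_bandwidth_gb_s(128, 1800)  # 1.1ns GDDR3
print(f"{bw_old:.1f} GB/s -> {bw_new:.1f} GB/s "
      f"(+{100 * (bw_new / bw_old - 1):.0f}%)")
# → 22.4 GB/s -> 28.8 GB/s (+29%)
```

Strictly the gain is 28.6%, which rounds to the 29% quoted above.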
 
xbdestroya said:
JVD, frankly I think I've addressed the power issue as well as anyone has on the subject thus far; certainly I've been the only one to really bring up the talk of voltage, and indeed the limits on what Sony might clock the chip at, in our other discussions on the topic. A process shrink from 110nm to 90nm, everything else equal, *should* indeed reduce heat and power - not to mention that low-k and SOI will reduce it a *lot*. You can't point to the x1800 and say 'you see, 90nm does nothing,' when the x1800 is a transistor and clockspeed monster.

RSX isn't an SOI part.
 
avaya said:
Moving to 1.1ns GDDR3 would jump RSX VRAM bandwidth from 22.4GB/s to 28.8GB/s (up 29%) right?

Given that consensus seems to be that the VRAM bandwidth is the greatest issue for the PS3, if the cost of better GDDR3 is marginal why would Sony not opt for it?

That's the only spec upgrade I can conceivably see happening.

That's one of those upgrades that will have a cost increase, whereas a bump in RSX's clock speed might not. Depending on the cost increase it might be worthwhile (or they could have been planning it all along -- who knows with Sony), but it also very well might not be (probably not, in fact).

The waiting game begi...err... continues -- on a side note I am interested to see the prices for the different GDDR3 speeds, just out of curiosity.
 
Vince said:
RSX isn't an SOI part.

Are you sure? I thought all the indications were that RSX would be both low-k and SOI, built on the same process as Cell itself. Isn't that what Sony's new Nagasaki lines are set up to handle? If you have any definitive evidence of this though, do present it, because then I have to make a change or two ASAP.
 
Dr Evil said:
Multiply those numbers by 100 million units and I beg to disagree, I'd say it's hardly worth +1 billion dollars...
$1 billion sounds like a lot, but set against the other costs it's relative. If Sony are already losing $200 per console, $10 is a great deal. Likewise, if they're already making $200 profit per unit, $10 isn't too much to worry about.

We can all see PS3 isn't a minimum-spec, cheapest-possible solution. I'm sure they could remove and downrate features from the current spec to save 20 bucks a unit, and make $2 billion extra over the console's life (assuming 100 million sales), but Sony haven't. The whole thing is a costs-to-benefits consideration, like any product, and the cheapest solution isn't always the one wanted. If Sony decide the improved performance is worth the extra cost per unit, for whatever reason, perhaps also factoring in long-term strategies, it might be something they consider doing.
 
I figure I'll throw in my two cents here...

I think that the RSX will most likely stay at 550 MHz, no matter what. If they need to beef up cooling and up the power to get that number, they will. If they can achieve that number at the rated specs, then super. If they can get 550 MHz and not have to expend as much power, and can keep cooling down, then they will happily decrease those two things for cost purposes. While Sony is competing with Microsoft, there isn't as direct a competition as between ATI and NVIDIA. Each console is unique, and the programming for them will reflect it. Also, who exactly is going to develop FRAPS for PS3/Xbox 360 comparisons? It really isn't in anyone's interest to do such a thing other than for pure intellectual reasons.

I don't think NVIDIA and Sony will have a hard time at all getting RSX to 550 MHz within their stated specs. While the GTX is not "proof" that NVIDIA can do this, I think it basically points out that NVIDIA's engineers have a proven track record of hitting aggressive clocks on complex ASICs. While a dog, the NV30 was able to hit 500 MHz on 130 nm FSG. Having a 302 million transistor part made on the "budget" 110 nm process running at 550 MHz (though they may be cherry-picked) speaks volumes when comparing this product to the original NV40, which had a really rough time making it to 425 MHz on IBM's 130 nm process. Better tools, more experience, and now they have the advantage of Low-K (though I don't think Sony's process utilizes SOI, I am not 100% sure on that).

Also, comparing the RSX at 550 MHz to the ATI R520 is really pointless. ATI made some interesting choices for that product that should not reflect on anyone else's product when it comes to using the same process. A better comparison would be the R4x0 series made on 110 nm vs. that made on 130 nm Low-K. The R430 can usually hit around 465 MHz maximum before it fails, while the nearly identical R480 design made on 130 nm Low-K can clock to the high 500s. If NVIDIA can hit 550 MHz with the "older" G70 on 110 nm, then it isn't much of a stretch to consider that a newly designed part that is more optimized for higher clockspeeds AND uses a 90 nm Low-K process should be able to clock as fast, yet use a lot less power and produce less heat.

Again, I doubt Sony and NV will raise the clock specification on the PS3, as there really isn't any marketable reason for them to do such. They will either hit the specs they gave out, or they will lower the voltage and power to the chip if they can get away with it (thereby lowering the cost of the product by including less cooling and a smaller power supply).
 
Josh, I still think that given the same voltage there'd be no reason for Sony not to raise the core clock - the effects on power and heat (and cost) of such a move would be nearly immaterial - but good post with good thoughts.
 
Even if you use the same voltage, you are still going to get more power draw to fuel the extra clocks. A good example of this is the Athlon 64 and Pentium 4: both use 1.4V, but the P4 draws more power and runs far hotter than the Athlon. Now, going from 550 MHz to 570 MHz is not extreme in any way, shape, or form... but what would be the point of such a small increase if it will minimally affect the power being consumed?

Don't get me wrong, your idea does have merit, but in a closed box solution like this I would think they would be more in favor of sticking to the original specs and if they get a part that is cooler and less power hungry than originally spec'd, then they will take the power and heat savings.
 
JoshMST said:
Even if you use the same voltage, you are still going to get more power draw to fuel the extra clocks. A good example of this is the Athlon 64 and Pentium 4: both use 1.4V, but the P4 draws more power and runs far hotter than the Athlon. Now, going from 550 MHz to 570 MHz is not extreme in any way, shape, or form... but what would be the point of such a small increase if it will minimally affect the power being consumed?

Marketing! Upgrades make people happy -- all of a sudden people think they are getting more from the x amount of money they were prepared to spend (little jimmy tells his friends that PS3 got upgraded -- who cares if little jimmy doesn't know that a 550 to 570 clock increase is only ~3.6% on a single part of the system). Even if it is a tiny upgrade... It also brings out more positive press.
 
JoshMST said:
Even if you use the same voltage, you are still going to get more power draw to fuel the extra clocks. A good example of this is the Athlon 64 and Pentium 4: both use 1.4V, but the P4 draws more power and runs far hotter than the Athlon. Now, going from 550 MHz to 570 MHz is not extreme in any way, shape, or form... but what would be the point of such a small increase if it will minimally affect the power being consumed?

Don't get me wrong, your idea does have merit, but in a closed box solution like this I would think they would be more in favor of sticking to the original specs and if they get a part that is cooler and less power hungry than originally spec'd, then they will take the power and heat savings.

The second part I agree with wholeheartedly, to the extent that I think if they could reach 540 at a certain voltage, but required a full bump to reach 550 comfortably, they might go with 540. I personally don't see the increase of a couple of tens of MHz as very material in wattage and heat terms assuming the same voltage - the increase there being linear - but I see where you're coming from. I guess one's opinion is partly shaped by how much one feels that Sony would benefit from a spec increase. I personally could see a certain positive 'halo' effect associated with it, but hey whatever - all different facets of the same diamond. :)
 