Could be more RSX info...

Shifty Geezer said:
I don't think the sorts of PC gamers who buy SLI'd top-end GPUs care much about high-end console performance. They're either hardcore gamers who'll lap up any gaming hardware, or PC enthusiasts who only care about PC tech. The fact that a console has better performance than a PC is neither here nor there when the top-end PC space has a 6 month timeframe. Those who hang out up there are used to seeing their current gear become inferior in the blink of an eye, and are surely either happy with that or willing to shell out on the next latest and greatest GPU configuration.

That comment was in line with my (unlikely) hoax conspiracy theory, that is, give out hoax specs to lull MS into a sense of security about launching early. I'm sure it wouldn't sit well with PC GPU consumers to think they're buying not the hottest announced hardware but a substantially inferior, outdated product.
 
Well, whatever the case, even under the conservative scenario of the minimum possible voltage for 4GHz operation (1.1v) being dropped to a comfortable core voltage for operation at 3.2GHz (1.0v), that voltage drop coupled with the frequency drop gives us wattage of roughly 66% of the original. So even in the 'worst-case' prediction here of ~85 watts on the original chip (though I'll say I think that 40 watt estimate is just way too high for an upper-bound for the SPEs), we'd still be at ~56 watts on the PS3 version of the chip. That's not peaches and cream per se, but it's not terrible either.

And though we don't have a schmoo for the PPE, the SPEs for their part should be operating at around 30C at that level; sub-furnace to be sure.

If STI's voltage drop was from 1.2v for 4GHz operation and/or to 0.9v for 3.2GHz operation, the power savings would be even more pronounced: ~45% of the original wattage for 1.2v --> 0.9v, ~55% of the original wattage for 1.2v --> 1.0v, or ~54% of the original going from 1.1v --> 0.9v.
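The percentages above follow from the standard dynamic-power relation, P ∝ V² · f (assuming switched capacitance stays constant across the process). A quick sketch of the arithmetic, using only the voltages and clocks quoted in this post:

```python
# Dynamic CMOS power scales roughly as P ~ V^2 * f (capacitance held constant).
def power_ratio(v_old, v_new, f_old, f_new):
    """Fraction of original dynamic power after a voltage/frequency drop."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# 4 GHz -> 3.2 GHz under the voltage drops discussed above:
print(round(power_ratio(1.1, 1.0, 4.0, 3.2), 2))  # ~0.66
print(round(power_ratio(1.2, 0.9, 4.0, 3.2), 2))  # ~0.45
print(round(power_ratio(1.2, 1.0, 4.0, 3.2), 2))  # ~0.56
print(round(power_ratio(1.1, 0.9, 4.0, 3.2), 2))  # ~0.54
```

Applied to the ~85 watt worst case: 0.66 × 85 ≈ 56 watts, matching the figure above. Note this ignores static leakage, which doesn't scale with frequency, so the real savings would be somewhat smaller.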
 
xbdestroya said:
Well, whatever the case, even under the conservative scenario of the minimum possible voltage for 4GHz operation (1.1v) being dropped to a comfortable core voltage for operation at 3.2GHz (1.0v), that voltage drop coupled with the frequency drop gives us wattage of roughly 66% of the original. So even in the 'worst-case' prediction here of ~85 watts on the original chip (though I'll say I think that 40 watt estimate is just way too high for an upper-bound for the SPEs), we'd still be at ~56 watts on the PS3 version of the chip. That's not peaches and cream per se, but it's not terrible either.

And though we don't have a schmoo for the PPE, the SPEs for their part should be operating at around 30C at that level; sub-furnace to be sure.

If STI's voltage drop was from 1.2v for 4GHz operation and/or to 0.9v for 3.2GHz operation, the power savings would be even more pronounced: ~45% of the original wattage for 1.2v --> 0.9v, ~55% of the original wattage for 1.2v --> 1.0v, or ~54% of the original going from 1.1v --> 0.9v.

Yeah, and it's been said plenty of times with regard to the XeCPU that Cell will run cooler; it seems to have been one of the primary design goals for the STI team.
 
aaronspink said:
Probably because it isn't... I'll believe it when I see it. ~25 watts for the PPE, 15-20 watts for the IO, another 20-40 watts for the SPEs.

David reaches slightly above 20W for IO (XDR interface + FLEX).

The 48W 4GHz CELL could be a special cherry picked bin, picked for low TDP. Since Sony only puts a 3.2GHz CELL in PS3 this doesn't seem to be the norm.

Cheers
 
Gubbi said:
David reaches slightly above 20W for IO (XDR interface + FLEX).

The 48W 4GHz CELL could be a special cherry picked bin, picked for low TDP. Since Sony only puts a 3.2GHz CELL in PS3 this doesn't seem to be the norm.

Cheers

Yeah ~21 watts for the I/O more or less rules out ~48watts as the norm for a 4GHz chip, but the voltage and frequency drop allowed to sustain 3.2GHz operation from 4GHz still gives a fair assurance that Cell within PS3 will be a cool-running chip relative to the XeCPU.

If RSX is able to stay within the range of Xenos in terms of power consumption and heat, I think Sony will have exceeded what a lot of people were expecting in terms of keeping the thermals of the console down.
 
Alas, with over a billion transistors in their quad-core RSX GPU, it's unlikely Sony will keep this aspect cool.

:p
 
Shifty Geezer said:
Alas, with over a billion transistors in their quad-core RSX GPU, it's unlikely Sony will keep this aspect cool.

:p
Oh please, give us a break. :rolleyes: It's common knowledge now that the RSX has seven cores including the shader processors and director core.
 
Rambus is scheduled to have a quarterly earnings call today. It'll probably be boring, but I thought I'd point this out in case anyone wants to listen in on it.

Rambus Inc. Announces Fourth Quarter 2005 Earnings Call
LOS ALTOS, Calif., Jan 12, 2006 (BUSINESS WIRE) -- Rambus Inc. (Nasdaq:RMBS), one of the world's premier technology licensing companies specializing in high-speed chip interfaces, will hold its quarterly conference call on January 19, 2006 at 2:00 p.m. Pacific Time to discuss financial results for the fourth quarter and 2005.

This call will be webcast and can be accessed via Rambus' IR web site at www.rambus.com. A replay will be available following conclusion of the call on Rambus' Investor Relations web site or for one week at the following numbers: (888) 203-1112 (domestic) or (719) 457-0820 (international) with ID# 4466070.

Shareholder questions may be submitted to Rambus in advance of the conference call for management to respond to commonly asked questions during the call. Please submit your questions before January 16, 2006 via our Investor Relations website home page at http://investor.rambus.com/.
 
xbdestroya said:
Well, whatever the case, even under the conservative scenario of the minimum possible voltage for 4GHz operation (1.1v) being dropped to a comfortable core voltage for operation at 3.2GHz (1.0v), that voltage drop coupled with the frequency drop gives us wattage of roughly 66% of the original. So even in the 'worst-case' prediction here of ~85 watts on the original chip (though I'll say I think that 40 watt estimate is just way too high for an upper-bound for the SPEs), we'd still be at ~56 watts on the PS3 version of the chip. That's not peaches and cream per se, but it's not terrible either.

My numbers were at 3.2 GHz. You can't trust ISSCC numbers too much.

Aaron Spink
speaking for myself inc.
 
aaronspink said:
My numbers were at 3.2 GHz. You can't trust ISSCC numbers too much.

Aaron Spink
speaking for myself inc.

Well, we'll see what the deal is soon enough I imagine. With Mercury Systems starting to ship evaluation designs, maybe some real-world TDP numbers will start to make themselves available.
 
Edit: ok well you changed your post Edge so I guess I'm not replying to anything in particular anymore. :)
 
During the Rambus conference call, the Sony deal was described as a 3-year project and they're finished with the bulk of the work, so Rambus is going to see a decrease in contract revenue starting with their 1st quarter results.

The guy from Rambus also clearly states they'll be receiving royalties from a bus and interfaces inside the PS3. From the stuff I've read about XDR2, Rambus has described it as an interface.
 
Brimstone said:
During the Rambus conference call, the Sony deal was described as a 3-year project and they're finished with the bulk of the work, so Rambus is going to see a decrease in contract revenue starting with their 1st quarter results.

The guy from Rambus also clearly states they'll be receiving royalties from a bus and interfaces inside the PS3. From the stuff I've read about XDR2, Rambus has described it as an interface.

Hell no. There's no way a XDR2 module will be inside the PS3! Is there?:???:
 
mckmas8808 said:
Hell no. There's no way a XDR2 module will be inside the PS3! Is there?:???:

I really don't think so. This press release of theirs from back last year indicates that 2007 should be around when we could expect XDR-2 to arrive on the scene. But then again, who knows - not sure if Brimstone heard anything to the contrary on the conference call.
 
The difference between XDR and XDR2 is microthreading. XDR can clock at 800 MHz, just like XDR2's initial clock speed. XDR2 has a slightly different physical design to allow microthreading to work.

XDR2 was only briefly mentioned a few times in the span of the entire call.

This speculation is just me guessing.
 
Did you mean 8 GHz effective rather than 800 MHz? (supposedly where they say XDR-2 will start) In addition to speed and micro-threading, though, XDR-2 makes signaling improvements via adaptive timing, transmit equalization, and DRSL.

I mean, it's just plain better, so if PS3 did use it, I think that would be great. I haven't heard of any companies ramping for it though. But if it's just the memory controller rather than the modules themselves (can that be?), maybe one of these recent Cell revisions takes advantage of it. Still, if Rambus was saying 2007 in their press release, early/mid-2006 would signify a big departure from that.
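For perspective on what those effective rates mean: peak bandwidth is just per-pin data rate times bus width. A rough sketch, where PS3's 3.2 GHz effective XDR rate over a 64-bit bus is the published configuration, and the 8 GHz XDR2 starting rate is the assumption from the discussion above:

```python
def peak_bandwidth_gb_s(effective_ghz, bus_bits):
    """Peak memory bandwidth in GB/s: per-pin rate (Gb/s) times bus width in bytes."""
    return effective_ghz * (bus_bits / 8)

# PS3's XDR: 3.2 GHz effective on a 64-bit bus
print(peak_bandwidth_gb_s(3.2, 64))  # 25.6 GB/s
# Hypothetical XDR2 at 8 GHz effective on the same bus width
print(peak_bandwidth_gb_s(8.0, 64))  # 64.0 GB/s
```

So even without micro-threading's efficiency gains, the raw rate jump alone would more than double peak bandwidth at the same pin count.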

Oh well whatever - cool information on the '3 year project' aspect of the Sony relationship either way. :)
 