Nvidia, Sony, Cell, and RamBus *PTSD Flashbacks*

  • Thread starter Deleted member 11852
DDR2 also came in different speeds and had worse latency than XDR RAM, and it wasn't cheap initially; it didn't drop in price until DDR3 started obsoleting it through its own initial pricing and market adoption.
I'm not arguing that performance would have been the same; I'm saying that, being the standard RAM, it was significantly cheaper.
The price of RAM has had notorious, unpredictable hiccups, but DDR2 was still cheaper than more boutique memory. As a side note, if newer and older memory are made on the same process, there's no reason for the slower, suckier memory to be that much cheaper; DDR2 kept existing for legacy devices, volume went down, so there were more reasons to keep the price high. The same reasoning applies to any RAM type: I suspect it will be a while before MSFT gets the DDR3 (a pretty fast type) in the XB1 at a significantly lower price than DDR4 on the same process, especially as the latter gets more common.
512MB of XDR RAM and 512MB of GDDR3 in a retail PS3 would have made far more logical sense... however, there are practical limits to selling such a console at a reasonable price when customers will take any of it for granted.
Looking at the PC, I would think that more main RAM is wiser; for 1GB of memory, I would think 768MB of main RAM and 256MB of VRAM would sound more in line. Won't happen anyway unless scientists make a really major breakthrough lol
 
I'm not arguing that performance would have been the same; I'm saying that, being the standard RAM, it was significantly cheaper.
And nobody is disputing this; what we're disputing is that this was an option for a 2006 launch. It wasn't. Let it go.
 
http://www.hotchips.org/wp-content/uploads/hc_archives/hc22/HC22.23.215-1-Jensen-XBox360.pdf

Whatever that means. But I mean, measured power consumption went down quite a lot.

I think power consumption for Xenos was far less than for the Xenon. I could swear I read at launch that the power draw of the GPU was something like 25 or 30 W but I can't find that now, so perhaps I imagined it. The heatsink for the GPU had much less area and much, much less airflow though so indirectly that backs up the idea that the GPU produced rather less heat than the CPU and accounted for rather less than 50% of the silicon power draw.

I think it's reasonable to suggest that the slim SoC pumped out more heat than the Falcon GPU, perhaps more even than the launch GPUs...?

Whatever, the heat put out by Xenon shouldn't have been a problem and PC cards handled several times the power with much greater reliability. My Falcon RRoDed with a GPU solder issue despite the fans almost never going more than a notch or two above minimum - so the chips definitely thought they were running cool. Felt cool inside when I was doing clamp experiments too. Damn thing, a little past the three year warranty. I still bought another though. I couldn't bear the thought of not being able to play Halo 4 when it arrived. :(
 
I could swear I read at launch that the power draw of the GPU was something like 25 or 30 W but I can't find that now, so perhaps I imagined it.
No, it was stated in a bunch of places (maybe repeated from the same source, I dunno) that Xenos consumed 30W. I personally think that's bullshit though, comparable PC GPUs drew a lot more.

Whatever, the heat put out by Xenon shouldn't have been a problem and PC cards handled several times the power with much greater reliability.
Yes, because they had properly engineered cooling solutions, with a sufficient amount of room for the heatsink and airflow dedicated to the task. :)
 
No, it was stated in a bunch of places (maybe repeated from the same source, I dunno) that Xenos consumed 30W. I personally think that's bullshit though, comparable PC GPUs drew a lot more.
The difference may be just the GPU silicon vs. an entire graphics card (GPU, RAM, micro-controller, cooling). Some GPU fans are pulling up to 5W.
 
The difference may be just the GPU silicon vs. an entire graphics card (GPU, RAM, micro-controller, cooling). Some GPU fans are pulling up to 5W.

Indeed. The 360 also had a separate video out chip, which was likely drawing a few watts (the fan and HS shroud drew air over this also). VRMs on a GPU can kick out a crazy amount of heat too...
 
You guys are good at deluding yourselves. :p Just look at heatsinks for PC cards of the era comparable to the 360. Way bigger than the ridiculous extruded dinky thing MS put in there.
 
You guys are good at deluding yourselves. :p Just look at heatsinks for PC cards of the era comparable to the 360. Way bigger than the ridiculous extruded dinky thing MS put in there.
Not at all. I think you're probably comparing apples to oranges.
 
I'm not arguing that performance would have been the same; I'm saying that, being the standard RAM, it was significantly cheaper.
The price of RAM has had notorious, unpredictable hiccups, but DDR2 was still cheaper than more boutique memory. As a side note, if newer and older memory are made on the same process, there's no reason for the slower, suckier memory to be that much cheaper; DDR2 kept existing for legacy devices, volume went down, so there were more reasons to keep the price high. The same reasoning applies to any RAM type: I suspect it will be a while before MSFT gets the DDR3 (a pretty fast type) in the XB1 at a significantly lower price than DDR4 on the same process, especially as the latter gets more common.

All I wanted to add was that console systems have planned engineering performance targets... it's really best to focus on what was essentially an effective pairing of the Cell BE and XDR RAM, which DDR2 was never going to match; it would just have created problems.

DDR2 wasn't really "cheap"... we don't have hard numbers on the pre-planned supply contracts, which can't anticipate price drops... and I would imagine, or assume, that once that paper is signed, it's for that price regardless of whether AMD/Intel had saturated the market with DDR2 and caused a price drop, like what happened when DDR3 arrived.

Like I mentioned, IBM made the PowerXCell 8i use DDR2 for a completely different market in 2008...

Let's say Sony had deliberately planned to launch the PS3 in 2008... chances are their engineering teams would still not have used DDR2, because it would still not have been as efficient...

Then contemporary die shrinks across the board, plus engineering ambition, would probably have targeted far more cutting-edge tech goals.

A 45nm Cell BE with all 8 SPUs @ 4.2GHz, p-states and double the cache would bring the Cell BE to cool-running, comparably efficient performance per watt.

XDR RAM at 1GB, or preferably 2GB if costs plus die shrinks made denser RAM possible, increasing volume and allowing far more breathing room for the Cell BE to address.

GDDR3 at 1GB, or preferably 2GB, for RSX's sake... but also to help solidify throughput.

GDDR4 might be nice but niche... however, my reasoning is that in 2008, 2GB of graphics RAM on PC cards was seen as too much because games and reviewers didn't have the knowledge or apps that exposed RAM usage the way game developers do (look at the Killzone 2/Uncharted 2 dev videos)... so the unfortunate conclusion from PC hardware/software reviews was that since they didn't see it being used, it wasn't necessary...

Game consoles handle RAM and streaming differently though...

Looking at the PC, I would think that more main RAM is wiser; for 1GB of memory, I would think 768MB of main RAM and 256MB of VRAM would sound more in line. Won't happen anyway unless scientists make a really major breakthrough lol

I based my opinion on a "what if" scenario where Sony instead green-lights a 512MB+512MB configuration, which honestly isn't that big a change, although it is something... you still have to fill up that RAM... the best solution is a 2007 tech-plan launch, or an optimal 2008 scenario that looks a bit bleak.

Die shrinks help overall performance per watt and clocks... stuff like a SATA II controller becomes possible (mainly useful in hindsight for SSD usage), and perhaps a PS4-like full or partial install of games... so that the disc drive isn't constantly spinning...

The Blu-ray drive could have been 4x or even 6x.

Then cap everything off with a Sony-fabbed, full-blown custom GT200 GPU at 55nm... if it were possible to fab it at 45nm, better still... and bam, into a Microsoft-dominated console market you'd have an Ultra PS3 coming in boasting 4.5 times the GPU performance, capability and throughput... or games that almost look like PS4 games.

I'm off a bit, because I still feel waiting for 2007 at 65nm + 55nm was the best possible theoretical way to jack up image quality, while 2008 is the cut-off... too late.

I think power consumption for Xenos was far less than for the Xenon. I could swear I read at launch that the power draw of the GPU was something like 25 or 30 W but I can't find that now, so perhaps I imagined it. The heatsink for the GPU had much less area and much, much less airflow though so indirectly that backs up the idea that the GPU produced rather less heat than the CPU and accounted for rather less than 50% of the silicon power draw.

I think it's reasonable to suggest that the slim SoC pumped out more heat than the Falcon GPU, perhaps more even than the launch GPUs...?

Whatever, the heat put out by Xenon shouldn't have been a problem and PC cards handled several times the power with much greater reliability. My Falcon RRoDed with a GPU solder issue despite the fans almost never going more than a notch or two above minimum - so the chips definitely thought they were running cool. Felt cool inside when I was doing clamp experiments too. Damn thing, a little past the three year warranty. I still bought another though. I couldn't bear the thought of not being able to play Halo 4 when it arrived. :(

PS3 launch units had a monster heatsink system...five or six copper heat pipes? Heat spreaders on GPU and CPU?

The Xbox 360 could have had that... it might have dramatically improved heat dissipation, resulting in cooler peak usage.

Power consumption... there were wall-socket draw tests done at launch back then... not sure where they are, but both had to be drawing over 300W, hence their PSUs, so there's no way in hell that the GPU would be consuming such a low figure. Based on comparable parts I would estimate 120W TDP or thereabouts for Xenos.

Xenon might be lower than 120W... probably 95W for launch units... that would help explain the Xbox 360 Slim's claimed 65% reduction in power draw.

You guys are good at deluding yourselves. :p Just look at heatsinks for PC cards of the era comparable to the 360. Way bigger than the ridiculous extruded dinky thing MS put in there.

Xenos was 240 million transistors, right? And the 278-million-transistor G71 (far fewer than the 302m G70) was likely consuming upwards of 135W as a PC card part... it still needed that hair-dryer heatsink system to draw heat away and out of the case.
 
PS3 launch units had a monster heatsink system...five or six copper heat pipes? Heat spreaders on GPU and CPU?

The Xbox 360 could have had that... it might have dramatically improved heat dissipation, resulting in cooler peak usage.

Power consumption... there were wall-socket draw tests done at launch back then... not sure where they are, but both had to be drawing over 300W, hence their PSUs, so there's no way in hell that the GPU would be consuming such a low figure. Based on comparable parts I would estimate 120W TDP or thereabouts for Xenos.

Xenon might be lower than 120W... probably 95W for launch units... that would help explain the Xbox 360 Slim's claimed 65% reduction in power draw.

The 360 was nothing like 300W; it was about 180W at the wall.

http://www.anandtech.com/show/3774/welcome-to-valhalla-inside-the-new-250gb-xbox-360-slim/3

Assume a power supply efficiency of about 85%, then take off a few watts apiece for fans, HDD, DVD, the video output chip (separate from Xenos), each of the eight (overheating) memory chips, VRMs, random board shit, and so on. Also bear in mind that Xenon had a vastly more expensive heatpipe cooler with several times the cooler surface area and several times the airflow of Xenos's: Xenon was clearly by far the bigger consumer of power. Actually, 30W for Xenos - just the chip - isn't so unreasonable. It certainly wasn't pulling vastly more than that - the numbers don't allow it.

It's simply not possible for Xenos to have been drawing even remotely close to 120W. That's outlandish and would have meant it was drawing several times more power than the massively better cooled CPU.

Xenos was smaller and lower clocked than high end PC parts of the time, and even if it wasn't you can't compare a full graphics card to a single chip.
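
For illustration, here's that budget as a quick back-of-envelope in Python. Every figure is an assumption (the wall draw is the ballpark number from the AnandTech link, the per-component draws are guesses), so treat it as a sketch rather than a measurement:

```python
# Back-of-envelope Xbox 360 launch power budget.
# All numbers are assumptions for illustration, not measurements.

wall_draw_w = 180        # assumed peak draw at the wall
psu_efficiency = 0.85    # assumed PSU efficiency

dc_budget_w = wall_draw_w * psu_efficiency  # power actually delivered to the board

# Assumed draws for everything that isn't the CPU or GPU silicon
other_loads_w = {
    "fans": 5,
    "HDD": 8,
    "DVD drive": 10,
    "video output chip": 3,
    "GDDR3 (8 chips)": 16,       # ~2 W per chip, assumed
    "VRM losses / board misc": 15,
}

cpu_plus_gpu_w = dc_budget_w - sum(other_loads_w.values())
gpu_guess_w = 30  # the oft-quoted figure for Xenos alone

print(f"DC budget:              ~{dc_budget_w:.0f} W")
print(f"Left for Xenon + Xenos: ~{cpu_plus_gpu_w:.0f} W")
print(f"If Xenos is ~{gpu_guess_w} W, Xenon gets ~{cpu_plus_gpu_w - gpu_guess_w:.0f} W")
```

On those guesses the CPU ends up with roughly twice the GPU's share, which fits the relative size of the two coolers. Change the assumptions and the split moves, but under this arithmetic 120W for Xenos alone simply doesn't fit inside the wall figure.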
 
I should have added a comparison based on reference figures, not absolute power draw...

Years ago I found a rare PC review site that had total power draw numbers, IIRC, for the older GeForce 7800 GTX, 7900 GTX and ATI Radeon X1800/X1900 XTs... (because of the constant references on this forum to "gimped" console GPUs and posters' claims that a G80 would have been realistically possible)

Admittedly those were total system power draw numbers... I must admit I read that linked article when it came out years ago... I just couldn't recall it at the time of posting.

Edit: I can't recall or find the URL of the site I mentioned that measured total power draw at the time... I found this one instead, which attempts to "estimate" the power draw of the graphics card alone.

http://forums.atomicmpc.com.au/index.php/topic/264-the-truth-about-graphics-power-requirements-v2/

The closest thing to the PS3's NV47-based RSX, IMHO, is the GeForce 7950 GT (G71 at 90nm), estimated at 61W.

The GeForce 7800 GTX and 7900 GTX (110nm/90nm) are at 81W and 84W respectively.

A GeForce 8800 GTX, aka G80 at 90nm, is estimated at 132W.

A GeForce 9800 GTX+, aka G92 at 55nm, is estimated at 80W.

A GeForce GTX 285 overclocked to 702MHz, aka GT200 at 55nm, is estimated at 161W.

An ATI Radeon HD 3870, aka RV670 at 55nm, is estimated at 123W, while the 3850, a downclocked single-slot card, is estimated at 84W with a 1GB buffer.

Take that with a grain of salt, however, as there are variations between vendors and possible binning affecting the numbers.

I wish there were much more comprehensive record-keeping of these numbers, but PC parts seem hard to track these days because the old stuff is often just forgotten, apparently.
 
I should have added a comparison based on reference figures, not absolute power draw...

Years ago I found a rare PC review site that had total power draw numbers, IIRC, for the older GeForce 7800 GTX, 7900 GTX and ATI Radeon X1800/X1900 XTs... (because of the constant references on this forum to "gimped" console GPUs and posters' claims that a G80 would have been realistically possible)

Admittedly those were total system power draw numbers... I must admit I read that linked article when it came out years ago... I just couldn't recall it at the time of posting.

Edit: I can't recall or find the URL of the site I mentioned that measured total power draw at the time... I found this one instead, which attempts to "estimate" the power draw of the graphics card alone.

http://forums.atomicmpc.com.au/index.php/topic/264-the-truth-about-graphics-power-requirements-v2/

The closest thing to the PS3's NV47-based RSX, IMHO, is the GeForce 7950 GT (G71 at 90nm), estimated at 61W.

The GeForce 7800 GTX and 7900 GTX (110nm/90nm) are at 81W and 84W respectively.

A GeForce 8800 GTX, aka G80 at 90nm, is estimated at 132W.

A GeForce 9800 GTX+, aka G92 at 55nm, is estimated at 80W.

A GeForce GTX 285 overclocked to 702MHz, aka GT200 at 55nm, is estimated at 161W.

An ATI Radeon HD 3870, aka RV670 at 55nm, is estimated at 123W, while the 3850, a downclocked single-slot card, is estimated at 84W with a 1GB buffer.

Take that with a grain of salt, however, as there are variations between vendors and possible binning affecting the numbers.

I wish there were much more comprehensive record-keeping of these numbers, but PC parts seem hard to track these days because the old stuff is often just forgotten, apparently.

It's true, the magical internet and Google's algorithms have limits to how well they preserve and make findable the interesting stuff from yesteryear.

Your post got me curious to look for the kind of power that older memory types consume, with an eye to guesstimating the draw of the memory chips in the original 360 (8 x 512Mbit GDDR3 @ 1400) and PS3 (XDR and GDDR3 @ 1300). That might explain in precise detail a chunk of the power used by these systems ... if I could find it.

Unfortunately, while I've been able to find some data sheets for GDDR3, I've found nothing with power per chip listed, and I've found very little at all for XDR. The 7900 and 7950 (estimated) values are interesting though, given clocks, bus widths, memory and so on...
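
For what it's worth, here's the shape of that guesstimate as a small Python sketch. The chip counts are the launch configurations as I understand them, and the per-chip wattages are pure placeholders rather than datasheet values, so the totals are only illustrative:

```python
# Rough memory-pool power guesstimate for the launch 360 and PS3.
# Per-chip wattages are placeholder assumptions, not datasheet figures.

def pool_power(chips: int, watts_per_chip: float) -> float:
    """Total draw of a memory pool, assuming a flat per-chip figure."""
    return chips * watts_per_chip

# Xbox 360: 8 x 512Mbit GDDR3 @ 1400, assume ~2 W per chip
xbox_gddr3 = pool_power(chips=8, watts_per_chip=2.0)

# PS3: 4 x 512Mbit XDR plus 4 x 512Mbit GDDR3 @ 1300, assumed per-chip figures
ps3_xdr = pool_power(chips=4, watts_per_chip=2.5)
ps3_gddr3 = pool_power(chips=4, watts_per_chip=2.0)

print(f"360 GDDR3 pool:  ~{xbox_gddr3:.0f} W")
print(f"PS3 XDR + GDDR3: ~{ps3_xdr + ps3_gddr3:.0f} W")
```

Swap in real datasheet numbers, if they ever surface, and this turns from a guess into an actual estimate.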
 
It's true, the magical internet and Google's algorithms have limits to how well they preserve and make findable the interesting stuff from yesteryear.

Your post got me curious to look for the kind of power that older memory types consume, with an eye to guesstimating the draw of the memory chips in the original 360 (8 x 512Mbit GDDR3 @ 1400) and PS3 (XDR and GDDR3 @ 1300). That might explain in precise detail a chunk of the power used by these systems ... if I could find it.

Unfortunately, while I've been able to find some data sheets for GDDR3, I've found nothing with power per chip listed, and I've found very little at all for XDR. The 7900 and 7950 (estimated) values are interesting though, given clocks, bus widths, memory and so on...

For what purpose?

To find out if XDR RAM consumes more wattage than DDR2?

Cheaper isn't always the better or even the more efficient answer, and cheaper isn't always "cheaper".

Memory performance targets have always had deeper engineering reasons behind them, discussed by the teams well in advance.

To find out if adding additional RAM would mean additional wattage consumption?

This may matter a little, but even doubling the RAM (the desirable option), in retrospective theory... has more to do with cost, competition and arbitrary decisions based on the limitations of the time.

The biggest problem for me is the point at which they made those decisions, after which it's too late... an alternate history would look dramatically different if double or quadruple the overall architectural power/efficiency had been chosen.
 
The PS3's problem wasn't Cell or XDR RAM. It was Blu-ray. The drives were expensive and late; if they had gone with DVD they could have launched the same year as MS, or, if the chips weren't ready, they could have released in 2006 but at a cheaper price point and most likely with 512 megs of RAM for the GPU.
 
For what purpose?

To find out if XDR RAM consumes more wattage than DDR2?

Cheaper isn't always the better or even the more efficient answer, and cheaper isn't always "cheaper".

Memory performance targets have always had deeper engineering reasons behind them, discussed by the teams well in advance.

To find out if adding additional RAM would mean additional wattage consumption?

This may matter a little, but even doubling the RAM (the desirable option), in retrospective theory... has more to do with cost, competition and arbitrary decisions based on the limitations of the time.

The biggest problem for me is the point at which they made those decisions, after which it's too late... an alternate history would look dramatically different if double or quadruple the overall architectural power/efficiency had been chosen.

I'd be interested to see how much power was taken by a 128-bit vs a 256-bit bus on the GDDR3 front. eDRAM undoubtedly saved power and engaged ludicrous speed, but I'd like to see if there were any indicators as to how much.

At ~180W peak, the 360 was just behind the PS3 at around 200W peak. I suspect that the power MS saved by going for a smaller main memory bus got thrown back into aggressive CPU and (particularly) GPU clocks.*

*Aggressive by console standards, not PC. And MS were hitting 3200/500 12 months earlier.
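
As a toy way of framing the 128-bit vs 256-bit question, here's a simple per-chip-plus-per-pin model in Python. The coefficients are assumptions and the 16-chip 256-bit layout is hypothetical, so it only shows the shape of the trade-off, not real numbers:

```python
# Toy model of GDDR3 memory-subsystem power vs bus width.
# Coefficients are assumptions for illustration only.

def mem_power(bus_bits: int, chips: int,
              core_w_per_chip: float = 1.5,        # assumed DRAM core power per chip
              io_w_per_pin: float = 0.02) -> float:  # assumed I/O power per data pin
    """Crude memory power: per-chip core draw plus per-data-pin I/O draw."""
    return chips * core_w_per_chip + bus_bits * io_w_per_pin

narrow = mem_power(bus_bits=128, chips=8)   # 360-style: 8 chips on a 128-bit bus
wide = mem_power(bus_bits=256, chips=16)    # hypothetical 256-bit, 16-chip board

print(f"128-bit / 8 chips:  ~{narrow:.1f} W")
print(f"256-bit / 16 chips: ~{wide:.1f} W  (+{wide - narrow:.1f} W)")
```

Under those made-up coefficients the wider bus costs on the order of 15W extra, which is the sort of margin that could instead go into CPU/GPU clocks, but the real figure depends entirely on numbers we don't have.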
 
The PS3's problem wasn't Cell or XDR RAM. It was Blu-ray. The drives were expensive and late; if they had gone with DVD they could have launched the same year as MS, or, if the chips weren't ready, they could have released in 2006 but at a cheaper price point and most likely with 512 megs of RAM for the GPU.
I mostly agree, although Cell was also a problem. And RSX. Unlike PS4, where the stars aligned into a good, all-round product, PS3's stars didn't align and a few points came together to produce a less-than-ideal result. Still, it muddled through. And plenty of us are grateful for PS3's BRD capability.
 
I mostly agree, although Cell was also a problem. And RSX. Unlike PS4, where the stars aligned into a good, all-round product, PS3's stars didn't align and a few points came together to produce a less-than-ideal result. Still, it muddled through. And plenty of us are grateful for PS3's BRD capability.

More specifically, development on Cell was a problem. On top of that, the horror stories that have come out about the early days of PS3 development both in terms of documentation and the actual coding environment are legendary. One of my favorites is Dave Lang's description of the near ritualistic process for updating the dev boxes due to their tendency to brick, which of course would stop development dead.

No, I don't believe that PS3 could have launched a year early without the BR drive. As it was that thing came in hot at all levels.

After typing that I took a minute to see if I could find the Bombcast on which those PS3 launch game development impressions were shared and I found it. Relevant discussion starts at 1:08:45.
 
The PS3's problem wasn't Cell or XDR RAM. It was Blu-ray. The drives were expensive and late; if they had gone with DVD they could have launched the same year as MS, or, if the chips weren't ready, they could have released in 2006 but at a cheaper price point and most likely with 512 megs of RAM for the GPU.

Release in 2006 with 512 megs?

It's a matter of timing... Microsoft rushed the generation jump too early, in 2005.

They rushed the tech: CPU/GPU/RAM/storage.

The best Sony could have done is plan for 2007 at 65nm and 55nm... 2007 hardware design and RAM targets... both companies should have done that.

Adding more RAM doesn't seem to be a solution, compared with the current gen's RAM abundance and the refreshed mid-gen hardware doubling up (and more) on their GPUs.

I'd be interested to see how much power was taken by a 128-bit vs a 256-bit bus on the GDDR3 front. eDRAM undoubtedly saved power and engaged ludicrous speed, but I'd like to see if there were any indicators as to how much.

At ~180W peak, the 360 was just behind the PS3 at around 200W peak. I suspect that the power MS saved by going for a smaller main memory bus got thrown back into aggressive CPU and (particularly) GPU clocks.*

*Aggressive by console standards, not PC. And MS were hitting 3200/500 12 months earlier.

But aren't the Blu-ray drive and the PS3 heatsink system a dramatic difference compared with the Xbox 360?

I've played with many 2005-07 360s... their heatsink system was nowhere near as complex as the 2006-08 PS3's, and the Blu-ray drive has to be consuming a bit... considering it played discs from three console generations in one machine.

I have some Xbox 360 magazine special where dev boxes are pictured... and prototype stuff... 90nm at the time was what yielded those clocks, cores, GPU and temps. Originally Microsoft was going to put in just 256MB, and it wasn't until developer feedback called for more that they went with 512MB.

A more powerful Xbox 360 with built-in HD DVD would have saved a lot of headaches... combined with my hindsight die shrinks... perhaps Microsoft would have avoided $2 billion in losses due to RRoD and piracy.

Both Blu-ray and HD DVD drives were relatively new, cutting-edge technology with asking prices to match... both consoles might have sat at $600 price points, but in theory consumers would have had fewer reasons not to upgrade out of price fears.

And piracy would have taken longer to take hold with bigger, newer storage media.
 