Nvidia, Sony, Cell, and RamBus *PTSD Flashbacks*

There's only so much I can say, and I'm not sure I should have written my previous post.

As for Sony & MS not working with NV, I heard it was more about costs than anything else (including fees for die shrinks and such) that didn't please either MS or Sony.
I think this is what happens with Nvidia on the console GPU front: Nvidia is not willing to provide a low-profit-margin custom GPU. Or maybe they are, just not with nearly as low a margin as AMD is willing to go. And I think AMD's aggression here is serving them well. Nvidia is taking much of the cake in desktop GPU sales too (maybe in laptops as well, I don't know the laptop market share as well), so AMD has to be aggressive on these fronts I think.

I am also considering going AMD for my next GPU, I just need to explore a few niggles before I make the jump, and of course I want to see the performance of their new cards :)

Did they even approach ATi?
I also wonder this a lot. I hope that they did. But for some reason I think the Sony pride of that era may have hindered that.... or Ken Kutaragi decided not to ask ATI knowing that they were working with Xbox maybe...
 
1 - *your
2 - My only perception is the same as everyone else's: the PS3 came out later and brought a substantially less advanced, less flexible and worse performing GPU than the X360.

Perception based on what?

Both consoles featured GPUs that would deliver "near parity" on multi-platform games as long as devs didn't take shortcuts with bad "port jobs".

What exactly is the comparable GPU to the Xbox 360's Xenos? Sure, it had some exclusive features, but it wasn't as if Xbox 360 games were all running at 60fps and 1080p.

Also, there were official statements that Sony was targeting the 65nm process node for the Cell BE, which means they weren't going to release a PS3 in 2005 or earlier to begin with.

Ironically it was Microsoft who jumped the gun and wanted to target 2005 as a launch year...in hindsight, if Microsoft had wanted a DX10 GPU they would have had to wait until 2007, where they would also have had a possible 65nm Xenon CPU mated to a 55nm RV670, plus larger eDRAM and more than 512MB of UMA RAM...they originally targeted 256MB of UMA RAM.

Also, despite what you believe about UMA versus split RAM...there wasn't really a huge difference in graphics so much as game design choices...and somehow Gran Turismo 6 has tessellation...that's not a checklist feature of RSX.

Regardless, the PS3 got an Nvidia GPU. The PS4 did not.


That's what most people say, yes.
But how late? The X360's development started and the console was released within less than 3 years. That was really fast. So exactly how late was Nvidia to the party in order to be excused for providing an old GPU architecture to Sony?

PS4 and Xbone chip contracts are different, and ironically enough they also target "near parity".

As for "old" architecture...again, what and where is the actual proof? It sounds like you read bullet-point negative reports on the PS3. Did you actually play the games on PS3 and Xbox 360, or do a long playthrough analysis of the graphics to mark one as dramatically unplayable?

Like I mentioned before...Sony's PS3 targets were for a 65nm Cell BE...even if they couldn't hit their target clocks, they were definitely targeting performance per watt, and they were forced to go 90nm or else hand Microsoft a two-year head start instead of one, which becomes a non-engineering decision at that point.

The PS3 evaluation prototype was based on an Nvidia NV40 GPU...those GPUs originally came out on a 130nm process, while G70 aka NV47 originally came out on a 110nm process...the 90nm target for RSX was a performance-per-watt decision.

The GeForce 7800 GTX came out in 2005 while the 7900 GTX was an early 2006 release.

The GeForce 8800 GTX aka G80 had higher power requirements and heat output that were not practical on a 90nm target even if Sony had stolen one and slapped it in...your only choice there is to wait for a 65nm or preferably 55nm G92 GPU to get launch temps and power consumption similar to RSX with more than double the GPU performance, which would have made Microsoft look extremely bad unless they shipped with a comparable RV670 derivative. (Because regardless of marketing hype...they would both still have targeted graphics parity on multi-platform games.)

If you take the time, what you should do is an analysis of Microsoft's motives. Halo 2 was a sales phenomenon back then in 2004... and Bungie was only contracted for ten years. ;)

XDR was a bad decision; ultimately IBM moved to DDR with a later revision of the Cell. The main PS3 failure, in hindsight, is not the GPU, nor the Cell, for the system as a whole; it is the memory amount and type. As every PC shows, you need a lot more RAM than VRAM (the gap is no longer as big now with really high resolutions and asset quality), yet the PS3 went with 2 tiny pools of fast memory.

1) IBM made and released the PowerXCell 8i, with an integrated DDR2 controller on a 65nm process, which shipped in 2008...it didn't exist before that and was aimed at a different market altogether. It also wouldn't have been practical for the PS3 even if the console had launched in 2008, because that CPU's die was actually bigger due to the DDR2 support.

The 2005 near-final PS3 prototype development system had 512MB XDR and 512MB GDDR3, along with 90nm engineering-sample versions of Cell and RSX (or G70, pre-FlexIO bus). Obviously, even dropping PS2 BC, choosing that RAM size would have added more manufacturing cost for 2006...

Are we honestly thinking realistically about the cost? Because Sony would have had to sell it at a loss for $800 at the least.

Imho there is no argument for attacking Nvidia wrt the PS3. As for the Xbox, MSFT signed a bad contract; actually, Nvidia was brave to face bullying from a much bigger company asking for a rebate that wasn't accounted for in the existing contracts.
If Nvidia had failed any of its contractual obligations there would have been lawsuits, and there have been lawsuits, for them and for plenty of companies that have nothing to do with gaming, some won, some lost. Now it is pretty clear where the rather specific "Nvidia is a bad business partner" line comes from, when there is no factual evidence of broken contractual obligations in the aforementioned case.
It is imho something other than a rational conversation and I won't engage further... but I agree with you ;)

1) I just provided you with a reason to debunk such an argument against Nvidia, and there was zero way Microsoft was gonna allow their Xenos to be duplicated for Sony, so stop.

2) You are saying the Xbox 1 was a bad contract? Microsoft signed it...they know, or knew, their long-term plans. NV2A was a bleeding-edge GPU in 2001 even with the GeForce 4 Ti having higher clocks.

Matter of fact, I used to have a GeForce 3 Ti 200 and a GeForce 3 (which has higher clocks than the Ti 200) and played Halo and KOTOR...same image quality as the Xbox 1 or better...hardly any noticeable frame drops.

As for "Nvidia being a bad business partner" for Microsoft? Again...Microsoft got what they wanted...look at their motives...they cancelled a bunch of Xbox 1 software that had been promised and funded in favor of gathering Halo gamers onto a newer, more powerful console promise.

As for Sony? Sony's own chip fabs in Japan were gonna fab the NV47-based RSX, and they did, plus handle customization and die shrinks...

At the end of the day, that generation's console promises depended on sales from early adopters and brand buyers who were under the console-war marketing hype and fear marketing of both companies, even on forum boards with lots of misinformation that still pervades to this day.


I think this is what happens with Nvidia on the console GPU front: Nvidia is not willing to provide a low-profit-margin custom GPU. Or maybe they are, just not with nearly as low a margin as AMD is willing to go. And I think AMD's aggression here is serving them well. Nvidia is taking much of the cake in desktop GPU sales too (maybe in laptops as well, I don't know the laptop market share as well), so AMD has to be aggressive on these fronts I think.

I am also considering going AMD for my next GPU, I just need to explore a few niggles before I make the jump, and of course I want to see the performance of their new cards :)

I also wonder this a lot. I hope that they did. But for some reason I think the Sony pride of that era may have hindered that.... or Ken Kutaragi decided not to ask ATI knowing that they were working with Xbox maybe...

I stated earlier that the contracts were different, and these topics were discussed on Beyond3D back then.

AMD offered a CPU/GPU combo to both companies...Nvidia can only make GPUs...so it became what it was...a performance-per-watt and package deal...

As for Sony approaching ATI for a GPU...Ken Kutaragi's pride? Wtf??!! Have we sunk so low as to make such baseless assumptions, which are rather degrading to console gaming?

How was Sony gonna benefit from approaching ATI, who were in bed with Microsoft, who had been in bed with Nvidia? If anything, Microsoft benefited far more by successfully steering Sony away from possible proprietary GPU designs and PC ports.

The Cell BE didn't stop game devs from eventually deciding to make PC versions of once-exclusive PS3 games like Valkyria Chronicles...and a few others.
 
Sounds like Sony should have waited it out with the PS3, but they couldn't because they had to respond to the 360 already being out on the market. Judging by the h/w failures in both original-model consoles, I guess both Sony and MS rushed to get them out on the market. I'm still baffled by the GPU heatsink on the original 360. Although I'd say the 360 was easily the more balanced architecture overall, even if less ambitious. Let's hope Sony don't pull another Cell any time soon :p
 
Sounds like Sony should have waited it out with the PS3, but they couldn't because they had to respond to the 360 already being out on the market. Judging by the h/w failures in both original-model consoles, I guess both Sony and MS rushed to get them out on the market.

Nope. It wasn't Sony/MS rushing, but the industry transition to lead-free solder that caused most of the initial failures. That defect didn't become apparent until much later.

For both of them, that was after multiple attempts at workarounds, such as the extreme clamping systems in the X360 heatsinks. Once the industry figured it out, MS no longer used the extreme clamping setups for their heatsinks and even produced models with smaller heatsinks and fans that no longer needed to spin as fast.

That said, I think Sony had some initial troubles with the Blu-ray lasers not lasting long enough or getting defocused.

I wouldn't expect such high failure rates ever again unless the industry goes through another major transition.
 
One of the reasons why RSX was so crap was probably that the deadline was not for a release at the end of 2006, but rather at the beginning of 2006 (or even the end of 2005).
 
That's odd. You'd think after a decade things would improve in that regard... guess not. I don't know much, if anything, about that area of technology.
 
They don't need to improve. Lead-free solder works as long as you know how to use it. Kinda like how a console is capable of achieving 'end-of-life' quality graphics from day 1, but the earliest titles are always fairly weak by comparison because people need to learn how to use the hardware best.
 
That's odd. You'd think after a decade things would improve in that regard... guess not. I don't know much, if anything, about that area of technology.

There are lots of solder chemistries, but my work has been on the same one for a very long time, as far as I know. At the start it was a big deal to transition to lead-free, but now it's just business as usual.
 
Nope. It wasn't Sony/MS rushing, but the industry transition to lead-free solder that caused most of the initial failures. That defect didn't become apparent until much later.

For both of them, that was after multiple attempts at workarounds, such as the extreme clamping systems in the X360 heatsinks. Once the industry figured it out, MS no longer used the extreme clamping setups for their heatsinks and even produced models with smaller heatsinks and fans that no longer needed to spin as fast.

That said, I think Sony had some initial troubles with the Blu-ray lasers not lasting long enough or getting defocused.

I wouldn't expect such high failure rates ever again unless the industry goes through another major transition.

We would have to analyze failure rates for PC-market GPUs to come to a better conclusion.

Didn't Nvidia have manufacturing problems with some of the G8X derivatives? Or was it the mobile G8X GPUs that were having major thermal/solder issues in laptops, which caused a recall and lawsuits?

Iirc most of those G8X derivatives were 90nm based...I believe there were some 80nm G8Xs (note: not the big G80 itself) that, despite being lower-end models, had thermal and solder problems.

I can't recall if ATI had similar problems, but on PC forums there were times when graphics cards that hadn't been overclocked by the user would die and require an RMA replacement. I don't have the numbers, but as GPUs got more power hungry and thermals rose, it took some die shrinks to mostly make those issues go away.

In ATI's case they were on 80nm before they introduced the Radeon HD 2900, which as a card ran at higher temps and drew wattage like a drain. It took the 55nm RV670 to bring that under control, even though back then they cut the transistor count from over 740 million to 666 million.

The decisions by Microsoft, followed by Sony, were made long before the early-2005 and May E3 2005 technical presentations.

It's hard to argue against the idea that a properly reassembled Xbox 360 (and even PS3) launch unit should dissipate heat well enough to probably prevent those RROD problems...but decisions were made...Microsoft could not foresee that not-so-great quality assurance would cost them hundreds of millions, or a billion, in customer service.

Yet the heatsink systems of the two launch units are dramatically different. Can we imagine if the Xbox 360 had had the PS3's heatsink system?

However, an analysis of high-end PC GPU temps actually shows them trending cooler from 90nm down to 55nm, until GT200 and RV770...those extra transistors exist for a reason.

One of the reasons why RSX was so crap was probably that the deadline was not for a release at the end of 2006, but rather at the beginning of 2006 (or even the end of 2005).

Only, that contradicts the May E3 2005 Sony hardware presentation, where they specifically stated their target was "Spring 2006", not 2005.

The 65nm target for Cell was known at least around 2003...maybe 2004.

The Sony evaluation prototype, which existed at least by 2004, had an NV40 aka GeForce 6800 at 130nm because 90nm didn't exist yet...those contracts with Nvidia were happening before 2004, or around 2003, or even earlier...but that's speculation, though you need to have had some talks and deals before building an NV40 evaluation prototype.

Again, the 2005 PS3 prototypes with "90nm engineering sample" Cell and NV47/G70 chips (made to evaluate their target numbers), which were near-final hardware (no FlexIO...they used PCI-E) with 512MB of XDR RAM and 512MB of GDDR3, were being used as developer hardware, right?

That Spring 2006 target was announced to have slipped into the 2006 holiday season either around the end of 2005 or in the months before E3 2006.

The GeForce 7800 GTX was released in the retail PC market in mid-2005...at 110nm and 430MHz...followed by a November GeForce 7800 GTX 512MB clocked at 550MHz...same process...it wasn't a high-yield product and was quickly replaced in early 2006 by the 90nm 7900 GTX at 650MHz.

Meanwhile Xenos was made by Chartered Foundry right? Nvidia's GPUs were made by TSMC right?

And Sony revealed at the same May 2005 E3 that RSX was gonna be fabbed at a Sony/Toshiba foundry...or just a Sony fab...still, it's all there.

If the GPUs were crap, it was mainly because of their target dates and compromises...had they both targeted 55nm in 2007 (which is when that process node existed), we can argue that an RV670-based Xenos would have gained a lot from having more than double the transistor count, 666 million instead of 270-ish million...add the same 10MB of eDRAM (or twice that) and you get more than double the performance while likely consuming the same wattage and running a bit cooler.

For Sony you'd need a 55nm G92-based RSX, and despite the different GPUs...multi-platform graphics parity would be similar.

The solder hasn't changed, but the processes have, and they've improved immensely.

Indeed...by 2007 ATI eliminated a lot of issues they had... in hindsight 55nm would have been a probable RROD killer...

This would mainly have resulted in double the performance/features we actually got... 720p 30fps would probably still have been the standard, at Crysis-killing image quality, but 720p 60fps and 1080p would also have been there.
 
I believe one of the problems with the 360 GPU in its original incarnation was that - in addition to the solder tech, and the initially insufficient cooling - the temperature probes in the GPU weren't located in the hottest parts and so the system under-reported temperatures, making cooling less responsive and allowing the system to keep running when it should have been protecting itself.
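
Just to illustrate the point with a toy model (my own made-up numbers, not anything from the actual 360 firmware): if the sensor consistently reads some amount below the true hotspot, both the fan curve and the shutdown threshold react to the wrong number.

```python
# Toy model of a fan controller and thermal cutoff driven by a sensor that
# sits away from the hotspot and reads a fixed offset low. Illustrative only.

HOTSPOT_LIMIT = 100.0   # temp (C) at which the silicon/solder is actually at risk
CUTOFF = 95.0           # temp at which the firmware would shut the console down
SENSOR_OFFSET = 12.0    # how far below the true hotspot the probe reads

def fan_duty(reported_temp):
    """Simple proportional fan curve: 30% duty at 60C, 100% at 90C."""
    return max(0.3, min(1.0, 0.3 + (reported_temp - 60.0) * (0.7 / 30.0)))

for hotspot in range(70, 111, 5):
    reported = hotspot - SENSOR_OFFSET
    duty = fan_duty(reported)
    print(f"hotspot {hotspot:3d}C  reported {reported:5.1f}C  "
          f"fan {duty * 100:3.0f}%  shutdown={reported >= CUTOFF}  "
          f"actually_at_risk={hotspot >= HOTSPOT_LIMIT}")
```

With those made-up numbers the hotspot can sit at 100C while the firmware only sees 88C, so the fan never reaches full speed and the 95C cutoff never trips.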

My favourite workaround in early 360s isn't actually the emergency extra heatsink, it's the step they tried before that. When your first workaround is to physically glue the GPU package to the motherboard to reduce package twisting (contributing to cracked solder) you know you got problems.

RRoD persisted into 55nm, though at greatly reduced rates. Technically slims can't RRoD as the error warnings were removed, but by all accounts they cracked all their problems with their SoC, which was designed to handle everything they could throw at it.
 
IBM made and released the PowerXCell 8i, with an integrated DDR2 controller on a 65nm process, which shipped in 2008...it didn't exist before that and was aimed at a different market altogether. It also wouldn't have been practical for the PS3 even if the console had launched in 2008, because that CPU's die was actually bigger due to the DDR2 support.
Indeed, as I said, it was a bad choice in the first place; DDR2 was available when the TBE was designed, it was actually the standard.
The 2005 near-final PS3 prototype development system had 512MB XDR and 512MB GDDR3, along with 90nm engineering-sample versions of Cell and RSX (or G70, pre-FlexIO bus). Obviously, even dropping PS2 BC, choosing that RAM size would have added more manufacturing cost for 2006... Are we honestly thinking realistically about the cost? Because Sony would have had to sell it at a loss for $800 at the least.
No, we are not thinking that they should have sold the system at 800€. The thing is, XDR DRAM was quite boutique; I don't think using 512 MB of DDR2 instead of 256 MB of XDR DRAM would have had any significant impact on the system price.
I'm just saying that, with the Cell building blocks as they were, there were multiple bad trade-offs made. One may disagree, but that is altogether different from thinking that the impact on the BOM of removing 256 MB of XDR DRAM and adding 512MB of DDR2 would have been +200€.
 
The PS3, like the PS2, is a great example of how not to design a console. A poop-load of power doesn't mean much when developers have to pat their head, rub their tummy, while jumping and shouting "YAHTZEE!" over and over to get decent performance. It only sorta worked for the PS2 because the PlayStation brand was the most successful one at the start of the sixth gen, as everyone wanted one and developers knew they had an audience to cater to. Obviously it didn't work so well again for the PS3.

I sorta think Ken Kutaragi was a hack. While obviously a gifted engineer, he obviously forgot to consider the development side of the equation for both the EE and Cell. You know his pride was probably shot when he and his team realized that going with a PS2 like setup, that is a massive CPU combined with vector engine + "2D" GPU (whether it was a graphics centric Cell or not) wasn't going to work out well for the PS3 in an era of fully equipped 3D GPUs with T&L, pixel and vertex shaders. It confounds me how Sony dismissed developer workflow, while MS fully embraced it and designed the absurdly more architecturally elegant Xbox 360. It becomes even more apparent when you remember that the 360 released a year earlier, featured a more advanced and more forward thinking GPU that forced developers on PS3 to spend countless hours moving graphics jobs to the Cell SPEs in order to get multiplatform parity.

It's good to see Sony learned their lesson when designing the PS4.
 
The PS4 is the most well-designed console of my lifetime end-to-end, when you consider both software and hardware. They pretty much did not miss on anything. Affordable, easy to use, reliable, easy to develop for, and it has very flexible hardware. It's really the model to strive for going forward.
 
I believe one of the problems with the 360 GPU in its original incarnation was that - in addition to the solder tech, and the initially insufficient cooling - the temperature probes in the GPU weren't located in the hottest parts and so the system under-reported temperatures, making cooling less responsive and allowing the system to keep running when it should have been protecting itself.

In one of the firmware updates back in 2006-2007, Microsoft added something that kept the heatsink fan running for a minute or two after you turned off the Xbox 360...I know this from being present with a couple of friends who had launch units that did not do that initially. They were three different units, plus the first black Xbox 360 Elite model.

That trick didn't work in the long run; all those units died, and another friend had four dead Xbox 360s in the closet...he kept buying new ones instead of using the recall program.

A far better analysis would have been testing a launch unit with re-applied quality thermal paste and no other modifications as that's the only way to know.

That said, even in early 2006...perhaps late 2005...there were overly enthusiastic fanatics who did custom water-cooling jobs on launch units...and although that's different, I do wonder how efficient and effective that was in preventing the BGA solder from cracking.

Remember, a properly seated heatsink and thermal paste are what allow heat to conduct away and be dispersed; with proper heat dispersal it shouldn't matter if we ran a 24/7, week-long marathon of the most stressful games as a benchmark stress test. If the cooling is effective it should be fine, which is probably how it seemed OK to the QA assessment of the time.
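
As a rough sketch of that thermal path, with made-up numbers (illustrative resistances, not measured Xbox 360 figures), a degraded paste layer alone moves the junction temperature a lot:

```python
# Back-of-the-envelope junction temperature from a simple thermal resistance
# stack: T_junction = T_ambient + P * (R_die_to_case + R_paste + R_heatsink).
# All values are assumptions for illustration, not measured Xbox 360 data.

def junction_temp(power_w, t_ambient_c, r_junction_case, r_paste, r_heatsink):
    return t_ambient_c + power_w * (r_junction_case + r_paste + r_heatsink)

P = 35.0      # assumed GPU power draw in watts
T_AMB = 40.0  # warm air inside the case, in C

good_paste = junction_temp(P, T_AMB, r_junction_case=0.3, r_paste=0.2, r_heatsink=0.8)
bad_paste  = junction_temp(P, T_AMB, r_junction_case=0.3, r_paste=1.0, r_heatsink=0.8)

print(f"well-seated paste : {good_paste:.1f} C")  # ~85.5 C
print(f"poor contact      : {bad_paste:.1f} C")   # ~113.5 C
```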

My favourite workaround in early 360s isn't actually the emergency extra heatsink, it's the step they tried before that. When your first workaround is to physically glue the GPU package to the motherboard to reduce package twisting (contributing to cracked solder) you know you got problems.

Which means that if the heat is dispersed, it shouldn't need the glue.

RRoD persisted into 55nm, though at greatly reduced rates. Technically slims can't RRoD as the error warnings were removed, but by all accounts they cracked all their problems with their SoC, which was designed to handle everything they could throw at it.

Only, a 55nm Xbox 360 Xenos GPU fabrication never existed.

Chip fabrication went 90nm, 80nm, 65nm, then 45nm or 40nm for the unified CPU+GPU package of the last Xbox 360s.

In the PC market, 55nm was a half-node stopgap decision made because, after the Radeon HD 2900 ran so hot and drew so much wattage, ATI needed to stay competitive and profitable on yields, which are a huge factor in wafer production relative to process node, especially when you're selling $300-$500 graphics cards and also have to keep special workstation versions for clients.

ATI mostly skipped 65nm production for high-end GPUs; Nvidia didn't, because they only transitioned to 55nm in April 2008...although if they had planned for it, there's no plausible reason why they couldn't have skipped straight to 55nm, but their PC decisions are a different market.

Xbox 360 slim models shouldn't crack any solder joints because they produce far less heat altogether, despite the CPU and GPU+eDRAM being part of a single close package...that's just a benefit of the later years' die shrink.

Indeed, as I said, it was a bad choice in the first place; DDR2 was available when the TBE was designed, it was actually the standard.

No, we are not thinking that they should have sold the system at 800€. The thing is, XDR DRAM was quite boutique; I don't think using 512 MB of DDR2 instead of 256 MB of XDR DRAM would have had any significant impact on the system price.
I'm just saying that, with the Cell building blocks as they were, there were multiple bad trade-offs made. One may disagree, but that is altogether different from thinking that the impact on the BOM of removing 256 MB of XDR DRAM and adding 512MB of DDR2 would have been +200€.

DDR2 also came in different speeds, had weaker latency compared to XDR RAM, and it wasn't cheap initially; it didn't drop in price until DDR3 started obsoleting it with its initial prices and market adoption.

512MB of XDR RAM and 512MB of GDDR3 in a retail PS3 would have made far more logical sense...however, there are practical limits to selling such a console at a reasonable price when customers take all of that extra hardware for granted.

The PS3, like the PS2, is a great example of how not to design a console. A poop-load of power doesn't mean much when developers have to pat their head, rub their tummy, while jumping and shouting "YAHTZEE!" over and over to get decent performance. It only sorta worked for the PS2 because the PlayStation brand was the most successful one at the start of the sixth gen, as everyone wanted one and developers knew they had an audience to cater to. Obviously it didn't work so well again for the PS3.

Let's see...the PS2 had a reputation for being complex to program for in 1999 and 2000. The PS2 engineering-sample prototype board was revealed in early 1999, implying that design decisions were finalized in late 1998...then there's the initial process node, which required a dedicated heatsink and fan each on the CPU and the graphics chipset...even back then Sony waited for a die shrink because of yields, thermals and wattage...which gives more credible reason to believe that 90nm was NOT the Cell target Sony's engineering team desired, whatever their thermal, wattage and yield targets were.

I sorta think Ken Kutaragi was a hack. While obviously a gifted engineer, he obviously forgot to consider the development side of the equation for both the EE and Cell. You know his pride was probably shot when he and his team realized that going with a PS2 like setup, that is a massive CPU combined with vector engine + "2D" GPU (whether it was a graphics centric Cell or not) wasn't going to work out well for the PS3 in an era of fully equipped 3D GPUs with T&L, pixel and vertex shaders. It confounds me how Sony dismissed developer workflow, while MS fully embraced it and designed the absurdly more architecturally elegant Xbox 360. It becomes even more apparent when you remember that the 360 released a year earlier, featured a more advanced and more forward thinking GPU that forced developers on PS3 to spend countless hours moving graphics jobs to the Cell SPEs in order to get multiplatform parity.

It's good to see Sony learned their lesson when designing the PS4.

"Hack" is quite a stretch do devalue and diminish and insult that one person who basically helped revolutionize game consoles into full 3d polygon graphics, sophisticated sound chips (starting with the Sony sound chip in SuperFamicom/SNES which is still highly relevant in retrospectives)

He did have a thick English accent though and perhaps that's how his words were used against him far more as destroying his contributions became such a trend in 2006 till now. That and the insult of using Kaz quoting 1994's "Riiiiidge Racer" from the title menu of that game when revealing the PSP version...by people who perhaps were trolling and such a meme became so negative and destructive.

Maybe if Ken Kutaragi's English accent wasn't so thick and his use of the language more fluent, he wouldn't have been used as a caricature by the misinformation brigade.

Also, it's best to look at how the PlayStation 2 wasn't really expected to be such a sales-record-breaking games console.

Regardless of the "hype misinformation", that DVD drive gave many customers a reason to buy a DVD player that also played PS1 and PS2 games...that's three hard-to-ignore features.

Comparing it to the Xbox 1 over who was first is ridiculous given when Microsoft finally announced and revealed their plans.

Sega's losses and leaky handling of their plans, leading up to dropping the ball with the Dreamcast, only helped rush the development of the PS2...by relation.

As the PlayStation 2 set sales records, called for increased production at the fabs and reached a certain number of units, plus the different strategies, all of a sudden the PS2 became known as easy to develop for, because most of the technical stuff was documented.

The PS3 may be seen in a negative light if you're stuck in 2006, but those technical hurdles were documented. Apparently there's still reason to believe 2006-era PS3 F.U.D. in 2011...oh wait, it's 2016...

Finally, each console was designed for a different era... the PS3 had some major disadvantages compared to the PS2, not just lukewarm sales. The PS4 and Xbone were planned for different reasons, which we all know now...it isn't fair to still believe some past decisions were bad decisions, as even Microsoft was limited by the physical constraints of not anticipating that reaching a launch in 2005 was gonna cost them billions in losses to write off...

PS3-Xbox 360 multi-platform parity was still reached...and it's no different from the current PS4-Xbone generation's issues.
 
In one of the firmware updates back in 2006-2007, Microsoft added something that kept the heatsink fan running for a minute or two after you turned off the Xbox 360...I know this from being present with a couple of friends who had launch units that did not do that initially. They were three different units, plus the first black Xbox 360 Elite model.

I wonder if having the fan on after the GPU stopped generating heat could paradoxically worsen things if the problem is solder bump failure due to thermal cycling.

If the GPU and its package had already climbed above the temperature where materials were softening and differences in thermal expansion posed a threat to reliability, allowing the cooling solution to run after heat production ceased would rapidly drop temps back down, potentially forcing uneven contraction in a shorter time period than it took to get to the problematic temps in the first place.

I suppose a fuller analysis of which material layers are stiffening/softening, or of the timing of the layers reaching various temperature thresholds, might explain why that might still have been preferable.
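
A back-of-the-envelope number for scale, using typical textbook CTE values rather than anything specific to the 360's package:

```python
# Rough CTE-mismatch displacement at the die edge per thermal cycle.
# Material and geometry values are generic assumptions, not Xbox 360 data.

CTE_SI = 2.6e-6          # silicon, per Kelvin
CTE_SUBSTRATE = 17e-6    # organic package substrate, per Kelvin
DIE_HALF_WIDTH_MM = 6.0  # assumed distance from die centre to the outermost bumps
DELTA_T = 60.0           # temperature swing in K (e.g. ~100 C hotspot down to ~40 C)

# Relative in-plane displacement the outermost solder bumps have to absorb:
mismatch_mm = (CTE_SUBSTRATE - CTE_SI) * DELTA_T * DIE_HALF_WIDTH_MM
print(f"~{mismatch_mm * 1000:.1f} microns of shear per cycle at the die edge")
# ~5.2 microns: tiny per cycle, but repeated fast swings fatigue the bumps.
```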
 
I'm not sure, but I believe that as long as the heatsink system was disassembled and then checked to be properly seated with fresh thermal paste, heat should dissipate well enough to negate any bending or warping.

It will still get hot, but the heat will be conducted out by the heatsink, and after that the running fan should just carry away the danger-level heat...

As long as it stays below some threshold...but there might be something to mainboard composition quality...and solder...but even in the PC graphics world we didn't get obsessed with temps until the GeForce FX 5800 series and the high-end ATI R4X0 series...or was it R5X0?
 
Aku... Simply re-pasting old 360s won't save them from RRoD; they might live longer, but the early models are fundamentally badly engineered, and they will die eventually, pretty much guaranteed. Meanwhile, you have chips today getting PLENTY hot (90+C sometimes) without bumps cracking in just a year or two. Maybe in time they will, but there seems to be no huge flood of this happening. People are still using 6-ish-year-old GPUs and CPUs (i.e. from the era after we mastered lead-free solder), and they're still soldiering on.
 
A far better analysis would have been testing a launch unit with re-applied quality thermal paste and no other modifications as that's the only way to know.

I know folks that did this. Didn't save 'em. I also know someone that changed their brand new 360 HS clamp to remove the X-clamp (there was a hypothesis that these caused RRoD for a while). Didn't save it either. Anecdotes for sure, but I think MS were facing a fundamental engineering issue.

Which means that if the heat is dispersed, it shouldn't need the glue.

The original HS for the GPU simply wasn't up to the job - it couldn't take heat away fast enough. Even with the emergency HS boost the 90nm 360s still ran hot.

There's always going to be changing heat within the package though, even with sufficient cooling, and so mechanical forces being exerted on the solder. Not understanding and engineering around these seems to be at the centre of the RRoD issues (certainly beyond 90nm).

Jasper went back to the original GPU heatsink as it didn't need the additional capacity of the boosted HS. RRoD continued on though, admittedly in much smaller numbers.

Only, a 55nm Xbox 360 Xenos GPU fabrication never existed.

Chip fabrication went 90nm, 80nm, 65nm, then 45nm or 40nm for the unified CPU+GPU package of the last Xbox 360s.

Yeah, Jasper was 65nm not 55nm. Don't forget the 32nm Xboxen!

Xbox 360 slim models shouldn't crack any solder joints because they produce far less heat altogether, despite the CPU and GPU+eDRAM being part of a single close package...that's just a benefit of the later years' die shrink.

I'd guess that the slim SoC generates far more heat than the Jasper or even Falcon GPUs. I don't think it's just temperature that helps the slim be so reliable, I think that by that time MS/IBM understood what they needed to understand about working with the solder, and had built the chip with a better understanding of thermal issues.

I wouldn't be surprised if the slim SoC actually tries to maintain a more constant temperature, and not necessarily the lowest temperature it can at any time.
 