Why are ancient CPUs used in spacecraft?

Even aircraft, missiles, radars, etc. use such old CPUs IIRC.

So what are the reasons?

Because they have been perfected? They are ~100% reliable over a long period of time.

Any other reasons?
 
Not necessarily perfected, just that our knowledge about their flaws is the most complete...

Also, they're less complicated than recent designs, so there's less chance of undiscovered errata hiding somewhere inside them.
 
1. They're fast enough to do the task at hand.
2. They need to be resilient against bit-flipping errors from cosmic radiation, which means much bigger transistors and different layouts compared to standard consumer-grade chips. Packaging also needs to be of a high-resistance variety (one software-level mitigation of the same problem is sketched below).
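Just to illustrate the bit-flip point: besides the hardware measures (bigger transistors, hardened packaging, ECC memory), one classic software-level mitigation is triple modular redundancy, where a critical value is kept in three copies and every read takes a majority vote. The C sketch below is a toy illustration of that idea only; it is not taken from any real flight software, and all the names are made up.

```c
/* Illustrative sketch: software triple modular redundancy (TMR).
 * Each critical value is stored three times; reads take a bitwise
 * majority vote so a single radiation-induced bit flip in one copy
 * cannot corrupt the result. Real rad-hard systems usually do this
 * in hardware; this just shows the principle. */
#include <stdint.h>
#include <stdio.h>

typedef struct {
    uint32_t copy[3];   /* three redundant copies of one value */
} tmr_u32;

static void tmr_write(tmr_u32 *v, uint32_t value)
{
    v->copy[0] = v->copy[1] = v->copy[2] = value;
}

/* Bitwise majority vote: each result bit is set if it is set in at
 * least two of the three copies. Also rewrites ("scrubs") the copies
 * so a flipped bit does not linger and combine with a later upset. */
static uint32_t tmr_read(tmr_u32 *v)
{
    uint32_t a = v->copy[0], b = v->copy[1], c = v->copy[2];
    uint32_t voted = (a & b) | (a & c) | (b & c);
    tmr_write(v, voted);
    return voted;
}

int main(void)
{
    tmr_u32 counter;
    tmr_write(&counter, 42);

    counter.copy[1] ^= (1u << 7);                     /* simulate a single-event upset */
    printf("voted value: %u\n", tmr_read(&counter));  /* still prints 42 */
    return 0;
}
```

In real rad-hard designs the voting is typically done in hardware and combined with periodic memory scrubbing; the software version is just the easiest way to see the idea.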
 
Long design cycles and product lifetimes play a role as well. For example, the Space Shuttle is really old, so it uses old CPUs.
 
They did upgrade it though... It actually uses DRAM now instead of magnetic core memory. :LOL:
 
Smart enough to win at chess? :p

They probably don't do chess anymore as they don't want to embarrass the human players. What better way to prove that Skynet is on the way for the various nutjobs than to produce a computer which is 'smarter' than a human?
 
Big transistors for resistance to bit-flipping due to radiation, as corduroygt said.

Also, the list of requirements is many miles long, so it's simply not cost-effective to try to get a recent processor on the "approved" list, as long as the old ones do well enough. Steering a rocket is a lot less like rocket science than steering a jet through "fly-by-wire".

Then again, SpaceX and their likes use much more modern equipment.
 
Steering a rocket is a lot less like rocket science than steering a jet through "fly-by-wire".
You can say that again. Heck, the Saturn V was controlled by an (at least partly) analog flight computer for chrissakes...not to mention the old V2, and probably a lot of other missile and rocket designs of the 40s to 60s era. Maybe part of the 70s also, I'm not sure. The Voyagers, Vikings and Pioneers had digital computers at least, but then again, you couldn't do what needed to be done on those probes with shitty old analog stuff.

Then again, fly-by-wire planes don't really need all that cutting-edge stuff; the JAS Gripen, for example, originally used 68020s in its flight computer, I believe, and that fighter jet is an aerodynamically unstable design. Without computer control, that airplane would crash or break up in the air, and yet it manages with rather mediocre hardware. It's all in the programming, I suppose: use the available resources wisely and you can do some great stuff. Piss it away like lazy PC programmers do today, however...
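For flavour only: the basic job of such a flight computer is a fixed-rate control loop, i.e. read sensors, apply a control law, command the actuators, repeat. The sketch below is a made-up fixed-point PD loop in C; it has nothing to do with the Gripen's actual software, and the gains, rates and I/O functions are hypothetical stubs. The point is just that this kind of loop needs very little CPU if it's written carefully.

```c
/* Purely illustrative sketch of a fixed-rate stabilization loop of the
 * kind a modest flight computer might run. Gains, rates and I/O are
 * invented. Everything is integer/fixed-point math, as you'd expect on
 * a 68020-era CPU without fast floating point. */
#include <stdint.h>

#define SCALE 1024              /* Q10 fixed-point scale factor */
#define KP    (3 * SCALE / 2)   /* proportional gain = 1.5 */
#define KD    (SCALE / 4)       /* derivative gain   = 0.25 */

/* Hypothetical hardware access, stubbed out for the sketch. */
static int32_t read_pitch_error(void)    { return 0; }
static int32_t read_pitch_rate(void)     { return 0; }
static void    set_elevator(int32_t cmd) { (void)cmd; }
static void    wait_for_next_tick(void)  { }          /* e.g. a 50 Hz timer */

int main(void)
{
    for (;;) {                               /* runs forever at a fixed rate */
        int32_t error = read_pitch_error();  /* desired minus measured pitch */
        int32_t rate  = read_pitch_rate();   /* pitch rate from the gyro */

        /* PD control law in fixed point: cmd = Kp*error - Kd*rate */
        int32_t cmd = (KP * error - KD * rate) / SCALE;

        /* Clamp to actuator limits before commanding the surface. */
        if (cmd >  1000) cmd =  1000;
        if (cmd < -1000) cmd = -1000;
        set_elevator(cmd);

        wait_for_next_tick();
    }
}
```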

We're so friggin' spoiled these days. Can't even run a friggin' web browser without a 1 GHz+ CPU, it's pathetic. Kids don't even know what a 68k chip is. It was before their time, they've got no sense of perspective. In their minds, texture-mapped, pixel-shaded 3D graphics have always existed.
 
I totally agree!

Especially all those "high-tech" browser apps and platforms, like ASP.NET: they require an unbelievable amount of overhead to get even the simplest thing done and are a big step backwards in programming. (A syntax error or an illegal object reference will crash your program and is very hard to debug, for example.) They may be many things, but they're only "state-of-the-art" because they happen to be the current, common platform. In all other respects they're totally backwards and stupid.

The common "solution": keep all the overhead, complexity and problems, but run a VM in the browser, so you can make "adequate" programs...

Well, it's what pays the bill for most programmers and IT companies nowadays.


Ok, I do know why they developed and turned out like they did, but I'm not calling it progress. Mostly a way to increase the requirements for bigger and faster hardware, and increase IT spending.
 
Ok, I do know why they developed and turned out like they did, but I'm not calling it progress. Mostly a way to increase the requirements for bigger and faster hardware, and increase IT spending.

To be frank with you, wouldn't efficient programming put you out of a job?
 
Maybe it would, but if so, then only in the short run... If there's one constant factor, it's that we've always been able to find a use for all the ops and flops thrown at us. I'm sure that wouldn't fundamentally change even if we stopped wasting computing resources with shitty programming practices.

Now, I don't know the specs of the microcontrollers used in stuff like flat-panel displays and so on, probably something hella slow by today's standards, but when it takes upwards of 10 seconds to power up from the moment the on button is pushed to the point where graphics show on the screen... well, something's very wrong there. My Sony TV's the same; my PS3 almost boots faster, for chrissakes, and that's not a fast device either by any stretch.
 
To be frank with you, wouldn't efficient programming put you out of a job?
Well, that's what people and companies say who see "finishing the software project" as a way of making themselves obsolete: make sure the customer keeps needing you and keeps paying your bill. At some point, the customer is fed up enough and looks for a better alternative.

I'd rather do projects on time and within budget: that way there's an endless supply of new projects for me to do, because I deliver the goods.
 