NASA's Mars Perseverance rover is powered by a processor from 1998

Shawn Knight

Editor's take: Conventional wisdom would suggest that when sending a robot to explore another planet, you’d want to outfit it with the latest and greatest technology available to maximize its potential. As it turns out, that wasn’t a top priority for NASA.

When the space agency's Mars Perseverance rover touched down on the surface of the Red Planet last month, it did so with a modified PowerPC 750 processor on board. If you recall, that's the same chip that shipped in Apple's iconic iMac G3 way back in 1998.

Why outfit the rover with a CPU that is over two decades old? Because it has to work, that’s why.

The processor inside the Perseverance rover is a RAD750, a radiation-hardened variant of the PowerPC 750 that is made by BAE Systems. It packs just 10.4 million transistors and operates at up to 200 MHz. Critically, it can withstand up to a million rads of radiation and operating temperatures between -55°C and 125°C.

Because Mars' atmosphere is far thinner than Earth's and the planet sits farther from the Sun, anything sent there is susceptible to damage from radiation and extreme temperatures.

The RAD750 is also a proven performer, having been used successfully in more than 150 spacecraft including the Kepler space telescope and the Curiosity rover.


 
Another thing: how much compute power do these things actually NEED? They have to move slowly for reliability reasons ANYWAY, and transmitting the kind of data they're going to collect doesn't take that much power. It already sent 1080p video back, so how much more power do we actually need? I'm sure the minds at NASA are great at optimizing code, unlike today's programs that just rely on raw horsepower. Why spend resources optimizing code when you can count on a 50-70% increase in hardware performance every year?

Optimizing code goes a LONG way but is expensive to do. In something like this, where money is essentially no object, the budget is there to optimize the code.
 
I mentioned this a few months back: reliability is everything, over maximum performance. Radiation-hardening processors and fully testing them for flight certification is also an expensive job.

The smaller you make the transistors, the more likely you'll get random flipped bits from radiation fields, or even stray cosmic rays out in space. Modern transistors on 7nm are a fraction of the size of the ones in a RAD750 CPU.
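Purely as a toy illustration (nothing like real flight code, and the stored value is made up): a single-event upset is just one toggled bit, but that's enough to silently change a number the computer relies on, which is why hardened designs lean on redundant copies and error-correcting codes.

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint32_t stored = 200;                   /* some value the computer depends on */
    uint32_t upset  = stored ^ (1u << 16);   /* one cosmic-ray strike, one flipped bit */
    printf("before: %u, after a single flipped bit: %u\n",
           (unsigned)stored, (unsigned)upset);
    return 0;
}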

As far back as the 1960s, astronauts reported seeing the odd flash in their vision now and then. While it could simply have been photons, it was later posited that these were cosmic rays passing right through the eye and stimulating the retina, the optic nerve pathways, or possibly even the brain directly. Something like that could play havoc with a CPU in a critical cycle.
 
Call me dumb, but what happened to the regular CPU shielding to prevent cold and radiation interference? Would it not be much better to use a modern CPU, just shielded properly?

Or was it +1bln on Apple's upgrade list?

Sometimes you just can't understand the dynamics of space and warfare engineering, and all you need to set things straight is to watch The Pentagon Wars movie once again.
 
Call me dumb, but what happened to the regular CPU shielding to prevent cold and radiation interference? Would it not be much better to use a modern CPU, just shielded properly?

Or was it +1bln on Apple's upgrade list?

Sometimes you just can't understand the dynamics of space and warfare engineering, and all you need to set things straight is to watch The Pentagon Wars movie once again.
Electron leakage is already a problem on modern processors; it would be stupid to put something like that on a machine where warranty service is not available. I've seen computers with CPUs made in the '80s still running, yet I'm now lucky if I get 10 years out of a modern CPU before instability becomes an issue. And, frankly, if they can optimize code to get 1080p video recording out of a CPU this old, what's the problem? If these CPUs are more reliable than modern ones and can be just as capable for the job required, why does it even matter? These rovers don't need to play Crysis.
 
Just as funny... up until a few years ago, GM continued to use the old Intel 8088 processor in most of their cars. Talk about product durability!
 
As a bonus, this should make it harder for Aliens to exploit hardware bugs to hack the rover.

As a downside, it most likely won't run Crysis. Can't have it all.
 
Optimizing code goes a LONG way but is expensive to do. In something like this, where money is essentially no object, the budget is there to optimize the code.
This is NASA; money is an object because they get their budget slashed every year. It's a time-honored tradition in the federal government.
 
Call me dumb, but what happened to the regular CPU shielding to prevent cold and radiation interference? Would it not be much better to use a modern CPU, just shielded properly?
If you cap a CPU with a metal shield, high energy particles will cause huge amounts of bremsstrahlung to cascade through the silicon. Radiation hardening involves making the chip layers as thin as possible, to let the particles pass straight through; modern CPUs have lots of relatively thick layers (especially the insulation layer), which not only makes them harder to thin down, but doing so would also affect the operational parameters of the chip.
 
If you cap a CPU with a metal shield, high energy particles will cause huge amounts of bremsstrahlung to cascade through the silicon. Radiation hardening involves making the chip layers as thin as possible, to let the particles pass straight through; modern CPUs have lots of relatively thick layers (especially the insulation layer), which not only makes them harder to thin down, but doing so would also affect the operational parameters of the chip.
Dude, that bremsstrahlung is way over my head :)
 
Bremsstrahlung is 'braking radiation' - electromagnetic waves emitted by a charged particle (such as a proton, electron, alpha, etc.) being rapidly decelerated. Space is full of high energy charged particles, so when they pass through metal, they slow down and emit a spray of X-rays. It's these rays that can affect the lattice structure of the silicon.
 
Dude, that bremsstrahlung is way over my head :)


When you actually try to stop something as energetic as a cosmic ray, it's not easily done (there is almost always leftover energy after the initial impact, much like you get multiple particle collisions from a self-sustaining nuclear reaction).

This is why most of the time, they recommend something bulky and dense for shielding, like water/Lead/concrete. To get an idea of how thick and expensive effective GCR shielding suits are, have a look at this:

https://en.wikipedia.org/wiki/StemRad

It is way easier to just design the CPU to be as thin as possible (to avoid impacts by being too small). You can also add different structures to your microprocessor to make it more resistant to errors (3-way voting, for one, allows handling permanent damage to each step in a processor's pipeline [unlikely], or easy rejection of a one-time error [way more likely]!)
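For anyone curious, here's a toy sketch in C of the 3-way voting idea - purely illustrative, with made-up values, and nothing like how it's actually implemented in silicon. Three copies of a value are kept, and a bitwise majority vote lets a single upset copy get out-voted:

#include <stdio.h>
#include <stdint.h>

/* Bitwise majority vote of three copies: a bit is set in the result
   only if it is set in at least two of the copies. */
static uint32_t vote3(uint32_t a, uint32_t b, uint32_t c)
{
    return (a & b) | (a & c) | (b & c);
}

int main(void)
{
    uint32_t value = 0xCAFEF00D;        /* made-up payload */

    uint32_t a = value;
    uint32_t b = value ^ (1u << 7);     /* simulated single-event upset in copy b */
    uint32_t c = value;

    uint32_t recovered = vote3(a, b, c);
    printf("corrupted copy: 0x%08X\n", (unsigned)b);
    printf("voted result:   0x%08X (%s)\n", (unsigned)recovered,
           recovered == value ? "upset out-voted" : "vote failed");
    return 0;
}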
 
Well imagine if they also had to worry about Intel exploits and not just the radiation.

Still, while the points people have made are excellent, I'm gobsmacked that we haven't moved the game on any further than a nearly quarter-century-old CPU. Perseverance might not need a lot of CPU power, but NASA would be using this level of CPU to run the space station. Surely a custom solution specifically designed for space applications, but still with greater capabilities, could have been built by now.
 
Well imagine if they also had to worry about Intel exploits and not just the radiation.

Still, while the points people have made are excellent, I'm gobsmacked that we haven't moved the game on any further than a nearly quarter-century-old CPU. Perseverance might not need a lot of CPU power, but NASA would be using this level of CPU to run the space station. Surely a custom solution specifically designed for space applications, but still with greater capabilities, could have been built by now.


This is that "custom solution" - you start with a known, proven design, add triple-redundant voting logic to every stage of the pipeline, and suddenly your tiny RISC processor gets beefy (it grows from 43 mm² to 130 mm²!)

Given that there are very few space projects requiring this insane level of general-purpose compute, the market is small; if they took the time to design a processor from scratch, the improvements wouldn't be enough to pay for the added expenses (validating things for space takes forever, and lots of cash).

If you are only selling a few hundred of these OVER THE MARKET LIFETIME, you need to optimize performance-per-dollar as much as humanly possible.

https://en.wikipedia.org/wiki/RAD750#Deployment

You can't rely on mass production to make these things cheaper (SpaceX might change this level of demand, but we are still decades away).
 
Just proves that you don't need a Threadripper or a Xeon for rocket science.

Actually, all the CPU power that we're seeing now is almost unnecessary. In today's world of chasing faster and more powerful iterations, we leave out optimization. Same with graphics for games: GPUs are so much faster, but the games themselves feel empty or "me-too". Instead of stopping to ponder how we can make the best use of current tech, they started thinking about how to make higher-resolution textures and how to make the next-gen card. The games themselves are left soulless.

What the software and hardware companies make us do - buy another CPU or GPU - is a plain rip-off. We can still do Microsoft Word on a lowly Pentium 1 with Office 95. But they just had to say no, the next version is better, you need the next version of Windows for it, and for the next Windows they recommend you get the next powerful CPU. It's like the software and hardware companies have a secret pact. Both want to stay in business, and this is how they do business - make people abandon their current systems and force them to upgrade. The same damn thing is happening in the mobile phone world.
 
Sweet, I've got a 233 MHz beige G3 sitting right beside me with an ATI Rage graphics chip built-in. It's a Quake RAVE beast from back in the day.
 
So... are these CPUs still actually being manufactured? Or is this used or new-old-stock?

At $285,000 each, I suspect they're being manufactured to order for NASA and other Earth-orbit satellite operators who need the same sort of protective properties in their onboard computers.
 
Just proves that you don't need a Threadripper or a Xeon for rocket science.
What the main computer in Perseverance has to handle arguably isn't rocket science, though. The data handling, and the decision-making it has to do based on that data, doesn't need to happen very rapidly - after all, its top speed is 0.09 mph.

For actual rocket science, more modern processors are used, but they're not Threadrippers either. Not because of programming efficiency, but down to the fact that the majority of rocket management is fast vector mathematics, and even a basic dual-core i3 is up to that task.
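To give a feel for what that vector math looks like - a toy sketch in C with made-up numbers, nowhere near real guidance software: the core of a guidance loop is mostly just integrating accelerations into velocities and positions, over and over.

#include <stdio.h>

typedef struct { double x, y, z; } vec3;

static vec3 add(vec3 a, vec3 b)     { return (vec3){ a.x + b.x, a.y + b.y, a.z + b.z }; }
static vec3 scale(vec3 v, double s) { return (vec3){ v.x * s, v.y * s, v.z * s }; }

int main(void)
{
    vec3 vel   = { 0, 0, 0 };
    vec3 pos   = { 0, 0, 0 };
    vec3 accel = { 0, 0, 29.4 };   /* made-up net upward acceleration, m/s^2 */
    double dt  = 0.01;             /* a 100 Hz guidance loop */

    for (int step = 0; step < 1000; ++step) {   /* simulate 10 seconds of flight */
        vel = add(vel, scale(accel, dt));       /* v += a * dt */
        pos = add(pos, scale(vel, dt));         /* p += v * dt */
    }
    printf("after 10 s: velocity %.1f m/s, altitude %.1f m\n", vel.z, pos.z);
    return 0;
}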
 
Shows how bad these overpaid tech CEOs really are; ex-AMD/Intel engineers could have easily produced something superior to that radiation-hardened POS with pocket money... instead they focus on their compensation packages and on making millions selling their stock at the ideal time.
 
As a bonus, this should make it harder for Aliens to exploit hardware bugs to hack the rover.

As a downside, it most likely won't run Crysis. Can't have it all.
I would be surprised if Aliens wanted to play Crysis, but hey, you never know. :laughing:
 
Shows how bad these overpaid tech CEOs really are; ex-AMD/Intel engineers could have easily produced something superior to that radiation-hardened POS with pocket money... instead they focus on their compensation packages and on making millions selling their stock at the ideal time.
Especially in Intel's case, so far anyway, they also focus on milking their customers' wallets with piddling performance improvements for vastly more money. ;)
 