In a nutshell: Unlike Hubble which orbits close enough to Earth for astronauts to reach if need be, the new James Webb Space Telescope is roughly a million miles from home. That makes it virtually unserviceable (at least in the near term), meaning NASA had to equip it with only the most reliable and hardened hardware before sending it off into space.
All data collected by Webb is stored locally on a 68GB solid-state drive, with three percent of that space reserved for engineering and telemetry data. Depending on the target it is observing, Webb can collect up to 57GB of science data per day.
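A quick back-of-the-envelope check shows how tight that budget is. The figures below come straight from the numbers above (68GB drive, three percent reserved, up to 57GB per day); the calculation itself is just illustrative arithmetic:

```python
# Rough storage-budget check using the article's figures.
TOTAL_GB = 68
RESERVED_FRACTION = 0.03          # engineering and telemetry data
MAX_SCIENCE_PER_DAY_GB = 57

science_capacity_gb = TOTAL_GB * (1 - RESERVED_FRACTION)
headroom_gb = science_capacity_gb - MAX_SCIENCE_PER_DAY_GB

print(f"Space for science data: {science_capacity_gb:.1f} GB")
print(f"Headroom after a worst-case day: {headroom_gb:.1f} GB")
```

In other words, a single busy observing day can nearly fill the drive, which is why daily downlinks matter.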
Webb needs to offload its data daily to avoid running out of room. It does this via a 25.9-GHz channel on the Ka-band at speeds up to 28 megabits per second. The observatory also uses two channels in the S-band: a 2.09-GHz uplink that receives future transmission and observation schedules from Earth at 16 kilobits per second, and a 2.27-GHz downlink that sends engineering data back home at 40 kilobits per second.
For comparison, Hubble can generate up to 2GB of data daily.
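Those rates imply a substantial chunk of transmission time each day. A rough estimate based on the figures above (57GB of science data, 28 Mb/s downlink), assuming decimal gigabytes and ignoring protocol overhead:

```python
# Estimated daily downlink time over the 28 Mb/s Ka-band channel.
DOWNLINK_MBPS = 28                # Ka-band downlink rate
DAILY_SCIENCE_GB = 57             # worst-case daily haul

bits = DAILY_SCIENCE_GB * 8 * 1000**3      # decimal GB -> bits
seconds = bits / (DOWNLINK_MBPS * 1000**2)
print(f"~{seconds / 3600:.1f} hours of transmission per day")
```

That works out to roughly four and a half hours of downlink time for a maximal day of observing.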
Webb communicates through the Deep Space Network (DSN), a global communications network with ground facilities in the US, Spain and Australia that supports various spacecraft missions. All of its communications channels employ Reed-Solomon error correction, which is also used in QR codes, Blu-rays and DVDs.
Webb will only wipe its SSD to free up space after it receives confirmation that its existing data has successfully made it to Earth.
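That delete-only-after-confirmation rule is essentially an acknowledged-transfer pattern. A toy sketch of the idea in Python (all names are hypothetical; this is not Webb's actual flight software):

```python
# Toy "wipe only after acknowledgment" policy: recorded data is
# retained until the ground explicitly confirms receipt.
class Recorder:
    def __init__(self):
        self.stored = {}          # observation id -> data
        self.next_id = 0

    def record(self, data):
        """Store an observation and return its id."""
        obs_id = self.next_id
        self.stored[obs_id] = data
        self.next_id += 1
        return obs_id

    def acknowledge(self, obs_id):
        """Ground confirmed receipt: only now is the space freed."""
        self.stored.pop(obs_id, None)

rec = Recorder()
i = rec.record(b"spectra")
assert i in rec.stored            # kept until confirmation arrives
rec.acknowledge(i)
assert i not in rec.stored        # freed only after the ack
```

The point of the pattern is that a dropped or corrupted transmission never costs data: until the acknowledgment comes back, the original copy stays on the drive.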
Webb's SSD shouldn't degrade too much over time, either. By the end of its planned 10-year mission, NASA expects it to still be able to hold around 60GB of data.