IBM to build low-power, exascale computer for largest-ever radio telescope

By Rick, Staff

For the next five years, IBM will be working with the Netherlands Institute for Radio Astronomy (ASTRON) in hopes of developing a low-power, exascale supercomputer. According to IBM, such a computer would be millions of times faster than today's high-end desktop PCs and possibly thousands of times faster than even the most recent supercomputers. The computer will be used to analyze data collected by the SKA (Square Kilometre Array), a cutting-edge radio telescope that will become the largest and most sensitive of its kind ever built. ASTRON plans to have the telescope ready by 2024.

Exascale refers to a computing device so incredibly fast that the number of floating-point operations per second it can perform is measured not in gigaflops, teraflops, or even petaflops, but in exaflops. Today, even the highest-end desktop CPUs clock in at around 20 gigaflops. Even when you consider the current masters of parallel computing, GPUs, FLOPS of the peta kind are still unheard of outside the realm of supercomputing.
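For a rough sense of scale, here is a back-of-the-envelope sketch (plain Python, using only the 20-gigaflop desktop figure quoted above and the standard SI prefixes; the one-exaflop target simply reflects the definition of "exascale"):

```python
# Rough comparison of FLOPS scales.
# The 20 GFLOPS desktop figure comes from the article;
# 1 exaflop is the definition of "exascale".

GIGA = 10**9
TERA = 10**12
PETA = 10**15
EXA  = 10**18

desktop_flops  = 20 * GIGA   # high-end desktop CPU, per the article
exascale_flops = 1 * EXA     # one exaflop

speedup = exascale_flops / desktop_flops
print(f"An exascale machine is roughly {speedup:,.0f}x a 20 GFLOPS desktop")
# -> An exascale machine is roughly 50,000,000x a 20 GFLOPS desktop
```

That ratio, tens of millions, is where the "millions of times faster than today's desktops" claim comes from.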

This low-power supercomputing cluster will be charged with collecting, storing and analyzing exabytes' worth of astronomical data on a daily basis. To put this into perspective, for each day it operates, the amount of data which will be collected is expected to exceed the sum of raw data transferred through the entire Internet, globally. In fact, under ideal conditions, it may be more than twice as much.
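To get a feel for what exabytes per day means as a sustained rate, here is a small sketch based on the 1-3 exabytes-per-day figure listed further down, assuming the decimal definition of 1 exabyte = 10^18 bytes:

```python
# Convert the quoted 1-3 exabytes/day into a sustained data rate.
EXABYTE = 10**18              # bytes (decimal definition, assumed here)
SECONDS_PER_DAY = 24 * 60 * 60

for eb_per_day in (1, 3):
    bytes_per_second = eb_per_day * EXABYTE / SECONDS_PER_DAY
    terabytes_per_second = bytes_per_second / 10**12
    print(f"{eb_per_day} EB/day ≈ {terabytes_per_second:,.1f} TB/s sustained")
# -> 1 EB/day ≈ 11.6 TB/s sustained
# -> 3 EB/day ≈ 34.7 TB/s sustained
```

In other words, the system has to keep up with tens of terabytes flowing through it every second, around the clock.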

To maintain a combination of high performance and low power, IBM will be investigating a number of its own experimental technologies, including 3D stacked circuits and novel optical transfer technologies.

A few interesting facts:

  • SKA will need to store up to 1,500 petabytes of data per year.
  • The computer will need to manipulate between 1 and 3 exabytes of data per day.
  • 64-bit architecture currently has an address limit of 18 exabytes (see the quick check after this list).
  • SKA will survey over 3,000 km of sky at a time, nearly the width of the continental U.S.
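A quick sanity check on two of those numbers, again treating 1 exabyte as 10^18 bytes:

```python
# Verify the 64-bit address-space figure and relate the yearly storage
# estimate to the daily processing volume quoted above.

address_space_bytes = 2**64                  # flat 64-bit byte-addressable space
print(f"2^64 bytes ≈ {address_space_bytes / 10**18:.1f} exabytes")
# -> 2^64 bytes ≈ 18.4 exabytes

stored_per_year_pb = 1500                    # petabytes stored per year (SKA estimate)
stored_per_day_pb = stored_per_year_pb / 365
print(f"≈ {stored_per_day_pb:.1f} PB stored per day")
# -> ≈ 4.1 PB stored per day

processed_per_day_pb = 1 * 1000              # 1 exabyte manipulated per day, in petabytes
print(f"Roughly {processed_per_day_pb / stored_per_day_pb:.0f}x more data is "
      "manipulated each day than is ultimately stored, per these estimates")
# -> Roughly 243x more data is manipulated each day than is ultimately stored
```

So the 18-exabyte figure is simply 2^64 bytes, and the gap between data manipulated and data stored implies that the vast majority of what the telescope produces will be reduced or discarded before it is archived.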


 
Didn't IBM just recently promise they would have the first quantum computer by about 2022, which would leave today's supercomputers in the dust?

From the article above it sounds like IBM might not have that much faith in what they claimed just before...
 
@VitalyT
Teraflop comes after gigaflop, followed by petaflop and exaflop
 
@VitalyT

It would be a long time between the first quantum computer and an enterprise-ready version. The technology they are developing for this will be ready for enterprise use in just five years, when they are expecting huge demand for it. Sounds like a good plan to me.
 
64-bit architecture has a limit of 18 exabytes, eh? Before we're even done transitioning to 64-bit, a purpose of 128-bit computing is already here...
 
"To put this into perspective, for each day it operates, the amount of data which will be collected is expected to exceed the sum of raw data transferred through the entire Internet, globally. In fact, under ideal conditions, it may be more than twice as much."


... holy crap.
 
Maybe IBM hopes this computer will finally put to rest whether we are alone in the universe... who am I kidding... they just want money lol.
 
gamerex: No, this is not even close to "a purpose of 128-bit computing is already here".
First of all, this tech is 12 years off; 12 years ago we got our first look at a 1 GHz processor, and 256 MB graphics cards were top-of-the-line tech. On top of that, the just-released Windows 2000 ran great with 128 MB of RAM.

This new supercomputer will "manipulate between 1 and 3 exabytes of data per day."
That does not mean it will have all that data in memory at any given time, but that it will process that much data over an entire day.
This system will not come close to the 18-exabyte address limit of the 64-bit architecture.
 
The data sets to be manipulated might well be larger than the daily amount of data collected. Data sets are usually a superset, or derived elements, built from a smaller sample; please do not confuse the two. Depending on the operations to be performed, the data sets could well exceed the 18-exabyte limit. I am aware of a facility doing gene-sequencing manipulations that are routinely in the multiple-petabyte range: hundreds of CPUs, terabytes of RAM, sequencing runs lasting weeks.
 