IBM to build low-power, exascale computer for largest-ever radio telescope

Rick

Posts: 4,512   +66
For the next five years, IBM will be working with the Netherlands Institute for Radio Astronomy (ASTRON) in hopes of developing a low-power, exascale supercomputer. According to IBM, such…

Read the whole story
 

VitalyT

Posts: 4,922   +3,709
TechSpot Elite
Didn't IBM just recently promise they would have the first quantum computer by about 2022, which would leave today's supercomputers in the dust?

From the article above, it sounds like IBM might not have that much faith in what they just claimed...
 

Guest

@VitalyT
Teraflop comes after gigaflop, followed by petaflop and exaflop
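
For the curious, here's that scale in plain numbers (a quick Python sketch; the prefix values are just standard SI, nothing from the article):

```python
# Standard SI scale for floating-point operations per second (FLOPS)
scales = {
    "gigaflop": 10**9,
    "teraflop": 10**12,
    "petaflop": 10**15,
    "exaflop":  10**18,
}

for name, ops in scales.items():
    print(f"1 {name} = {ops:.0e} FLOPS")
```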
 

Guest

@VitalyT

There would be a long time between the first quantum computer and an enterprise-ready version. The technology they are developing for this will be ready for enterprise use in just five years, when they are expecting huge demand for it. Sounds like a good plan to me.
 

gamerex

Posts: 150   +0
64-bit architecture has a limit of 18 exabytes, eh? Before we're even done transitioning to 64-bit, a purpose for 128-bit computing is already here...
 

DanUK

Posts: 220   +12
"To put this into perspective, for each day it operates, the amount of data which will be collected is expected to exceed the sum of raw data transferred through the entire Internet, globally. In fact, under ideal conditions, it may be more than twice as much."


... holy crap.
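
For a rough sense of what that claim means as a sustained rate, here's a back-of-the-envelope sketch (plain Python; the ~1 exabyte/day figure is the article's, the rest is arithmetic):

```python
# Back-of-the-envelope: sustained rate needed to move ~1 exabyte per day.
exabyte = 10**18                 # bytes, decimal
seconds_per_day = 86_400

rate = exabyte / seconds_per_day               # bytes per second
print(f"{rate / 10**12:.1f} TB/s sustained")   # ~11.6 TB/s
print(f"{rate * 8 / 10**12:.0f} Tbit/s")       # ~93 Tbit/s
```

That's tens of terabytes every second, around the clock.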
 

Tygerstrike

Posts: 827   +93
Maybe IBM hopes to put to rest whether we are alone in the universe with this computer... who am I kidding... they just want money lol.
 

Per Hansson

Posts: 1,967   +223
Staff member
gamerex: No, this is not even close to "a purpose for 128-bit computing is already here".
First of all, this tech is 12 years off; 12 years ago we got our first look at a 1GHz processor, and 256MB graphics cards were top-of-the-line tech.
On top of that, the just-released Windows 2000 ran great with 128MB of RAM.

This new supercomputer will "manipulate between 1 to 3 exabytes of data per day."
That does not mean it will have all that data in memory at any given time, but that it will process that much data over an entire day.
This system will not come close to the 18-exabyte address limit of the 64-bit architecture.
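
For anyone who wants to check the arithmetic, that 18-exabyte figure is just 2^64 bytes. A minimal sketch (assuming a flat, byte-addressable address space and decimal exabytes):

```python
# Sanity check on the 64-bit address-space figure.
address_limit = 2**64            # bytes addressable with 64-bit pointers
daily_data = 3 * 10**18          # upper end of the quoted "1 to 3 exabytes"

print(f"64-bit limit: {address_limit / 10**18:.1f} EB")    # ~18.4 EB
print(f"daily data:   {daily_data / 10**18:.1f} EB")       # 3.0 EB
print(f"headroom:     {address_limit / daily_data:.1f}x")  # ~6.1x
```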
 

dikbozo

Posts: 84   +6
The data sets to be manipulated might well be higher than the daily amount of data collected. Data sets are usually a super set or are derived elements that come from a smaller sample size. Please do not confuse the two. Data sets could well be higher than the 18 exabyte limit depending on the operations to be performed. I am aware of a facility doing gene sequencing manipulations that routinely are in the multiple petabyte range. Hundreds of CPUs, terabytes of RAM, sequencing runs lasting weeks.