TechSpot

IBM to build low-power, exascale computer for largest-ever radio telescope

By Rick
Apr 2, 2012
  1. For the next five years, IBM will be working with the Netherlands Institute for Radio Astronomy (ASTRON) in hopes of developing a low-power, exascale supercomputer. According to IBM, such…

     
  2. VitalyT

    VitalyT TS Evangelist Posts: 1,921   +565

    Didn't IBM just recently promise they would have the first quantum computer by about 2022, which would leave today's supercomputers in the dust?

    From the article above it sounds like IBM might not have that much faith in what they claimed just before...
     
  3. @VitalyT
    Teraflop comes after gigaflop, followed by petaflop and exaflop
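    A quick sketch of that ladder (each SI prefix is 1,000x the previous one):

    ```python
    # SI prefix ladder for FLOPS: giga -> tera -> peta -> exa,
    # each step is a factor of 1000 (i.e. 10^3).
    prefixes = ["giga", "tera", "peta", "exa"]
    for i, p in enumerate(prefixes):
        print(f"1 {p}flop/s = 10^{9 + 3 * i} FLOPS")
    ```

    So an exaflop machine would be a million times faster than today's teraflop-class desktops.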
     
  4. @VitalyT

    It would be a long time between the first quantum computer and an enterprise ready version. The technology they are developing for this will be ready for enterprise use in just 5 years when they are expecting a huge demand for it. Sounds like a good plan to me.
     
  5. gamerex

    gamerex TS Enthusiast Posts: 176

    64-bit architecture has a limit of 18 exabytes, eh? Before we're even done transitioning to 64-bit, a purpose of 128-bit computing is already here...
     
  6. Holyscrap

    Holyscrap TS Rookie Posts: 39   +12

    Yeah, build a computer 12 years before it will be used. That will end well...
     
  7. DanUK

    DanUK TS Enthusiast Posts: 192   +8

    "To put this into perspective, for each day it operates, the amount of data which will be collected is expected to exceed the sum of raw data transferred through the entire Internet, globally. In fact, under ideal conditions, it may be more than twice as much."


    ... holy crap.
     
  8. Tygerstrike

    Tygerstrike TS Enthusiast Posts: 827   +93

    Maybe IBM hopes this computer will finally put to rest whether we are alone in the universe... who am I kidding... they just want money lol.
     
  9. Per Hansson

    Per Hansson TS Server Guru Posts: 1,932   +126 Staff Member

    gamerex: No, this is not even close to "a purpose of 128-bit computing is already here".
    First of all, this tech is 12 years off. Twelve years ago we got our first look at a 1 GHz processor, and 256 MB graphics cards were top-of-the-line tech.
    On top of that, the just-released Windows 2000 ran great with 128 MB of RAM.

    This new super computer will "manipulate between 1 to 3 exabytes of data per day."
    That does not mean it will have all that data in memory at any given time, but that it will process that much data over an entire day.
    This system will not come close to the address limit of 18 exabytes that the 64-bit architecture has.
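
    As a quick back-of-the-envelope sketch of those two numbers (assuming decimal exabytes, 10^18 bytes, and a 24-hour day):

    ```python
    # 64-bit address space: 2**64 bytes, expressed in decimal exabytes.
    address_limit_eb = 2**64 / 10**18
    print(f"64-bit address limit: ~{address_limit_eb:.1f} EB")

    # Streaming 1-3 EB over a day is a sustained-throughput problem,
    # not an address-space problem: compute the required GB/s.
    seconds_per_day = 24 * 60 * 60
    for eb_per_day in (1, 3):
        gb_per_s = eb_per_day * 10**18 / seconds_per_day / 10**9
        print(f"{eb_per_day} EB/day needs ~{gb_per_s:,.0f} GB/s sustained")
    ```

    In other words, even the 3 EB/day figure stays well under the 64-bit ceiling at any instant; the hard part is moving tens of thousands of GB/s through the machine.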
     
  10. dikbozo

    dikbozo TS Enthusiast Posts: 106

    The data sets to be manipulated might well be larger than the daily amount of data collected. Data sets are usually supersets, or derived elements built up from a smaller sample. Please do not confuse the two. Depending on the operations to be performed, data sets could well exceed the 18-exabyte limit. I am aware of a facility doing gene-sequencing manipulations that are routinely in the multiple-petabyte range: hundreds of CPUs, terabytes of RAM, sequencing runs lasting weeks.
     

