It would be a long time between the first quantum computer and an enterprise-ready version. The technology they are developing for this is expected to be ready for enterprise use in just 5 years, when they anticipate huge demand for it. Sounds like a good plan to me.
"To put this into perspective, for each day it operates, the amount of data which will be collected is expected to exceed the sum of raw data transferred through the entire Internet, globally. In fact, under ideal conditions, it may be more than twice as much."
gamerex: No, this is not even close to "a purpose of 128-bit computing is already here"
First of all, this tech is 12 years off. 12 years ago we got our first look at a 1 GHz processor, and 256MB graphics cards were top-of-the-line tech.
On top of that, the just-released Windows 2000 ran great with 128MB of RAM.
This new super computer will "manipulate between 1 to 3 exabytes of data per day."
That does not mean it will have all that data in memory at any given time; it means it will process that much data over the course of an entire day.
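To make that distinction concrete, here is a minimal sketch (hypothetical function and chunk sizes, not anything from the actual system) of why a machine can process far more data per day than it can hold: it streams the data through in fixed-size chunks, so memory use stays constant no matter how much flows past.

```python
# A minimal streaming sketch: only one chunk is resident in memory at a
# time, so total throughput over a day is unrelated to RAM capacity.
def process_stream(chunks):
    """Consume an iterable of byte chunks, one chunk at a time."""
    total = 0
    for chunk in chunks:
        total += len(chunk)  # stand-in for real per-chunk processing
    return total

# Simulate arriving data as 1000 chunks of 1 KiB each.
simulated = (b"x" * 1024 for _ in range(1000))
print(process_stream(simulated))  # 1024000 bytes seen, ~1 KiB resident
```

The same pattern scales to any volume: the address-space limit constrains how much you can map at once, not how much you can push through over time.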
This system will not come close to the 64-bit architecture's address limit of roughly 18 exabytes (2^64 bytes).
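The arithmetic behind that claim is easy to check, using decimal exabytes (1 EB = 10^18 bytes):

```python
# 64-bit address space in decimal exabytes.
address_space_eb = 2**64 / 1e18       # ~18.4 EB addressable
print(round(address_space_eb, 1))     # 18.4

# The quoted worst case, 3 EB/day, as a sustained byte rate.
daily_rate_bps = 3e18 / 86400         # bytes per second over one day
print(round(daily_rate_bps / 1e12, 1))  # ~34.7 TB/s sustained
```

So even the high end of the quoted range moves only a few exabytes per day, well inside what a 64-bit address space could in principle address at once.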
The data sets to be manipulated might well be larger than the daily amount of data collected. A data set is usually a superset, or consists of derived elements built from a smaller sample, so please do not confuse the two. Depending on the operations to be performed, data sets could well exceed the 18-exabyte limit. I am aware of a facility doing gene-sequencing manipulations that are routinely in the multiple-petabyte range: hundreds of CPUs, terabytes of RAM, sequencing runs lasting weeks.