Memristors could make CPUs and RAM obsolete


Rick

Posts: 4,512   +66
Staff

Dubbed the "memory of the future," a new type of circuit element that combines a resistor with memory capabilities was recently shown off by HP and touted as a possible alternative to transistors in both computation and storage devices. Today, these functions are handled by separate chips, meaning data must be transferred between them, slowing down computation and wasting energy. Could this new tech mean the unification of memory and the CPU?


In simplest terms, transistors are the on/off mechanisms that allow CPUs to compute and memory to store data. CPUs are often made with tens or even hundreds of millions of transistors. As the performance and capacity of devices increase, so does the number of transistors. Memristors have the potential to fundamentally change circuitry by performing the same duties as the transistor, but more efficiently and in far less space.

They also provide some unique properties such as three-dimensional stacking, faster switching, retention of off/on states without power and a proven durability which exceeds hundreds of thousands of reads/writes.

Little more than a theory when it was first described in 1971, the memristor didn't become a physical reality until 2008, when it was demonstrated by an HP Labs research team led by Stan Williams. The idea may have been floating around for some time, but the processes and materials needed to physically engineer memristors had remained elusive. A detailed article on the subject can be found here, explaining some of the principles behind how memristors work. If the technology proves viable, it will no doubt become a hugely important breakthrough for our increasingly digital lives -- perhaps in as little as three years.


 
Sounds pretty cool! Especially applicable to mobile or low-power devices, where the CPU could be suspended completely and restored instantly (without having to reload any state information).
 
3 years, crap, can't wait that long.

Here I come, new i7 920 PC!!!
 
It will be interesting to see how this can affect both CPU and GPU development, and of course memory. It is exciting... but I have no idea what they are talking about. Sure, they might start stacking chip designs, but I imagine there might be a bit of a learning curve. Moreover, will it affect the nature of the processors? Will the x86 processor conform to the technology, or will a completely new processor need to be designed?
 
The memristor revolution will most likely occur in stages. The first logical move is a convergence of RAM and storage, which would be a huge change in computer architecture but also the easiest to implement. You would (in essence) have a lightning-fast SSD that is both your storage and your system memory -- a highly efficient configuration with fewer components and less interfacing required. Since it is persistent memory, this would also make near-instant wake-ups possible.

The harder (but even more efficient) architectural change will be a true melding of logic and memory, creating a processor that computes and stores simultaneously. That will take quite a bit of engineering, but it would usher in true "instant-on" computing. I really don't see the x86 architecture fitting into this type of system; it is based on too much legacy segregation of processes.
 
Thanks for the additional insight, Vrmithrax. This appears to be yet another cool technology, though it may take some time before we see it.
 
Is this for real? Looks like a copy of cheese doodles to me. I wonder if Frito Lay will sue HP.

I wonder if the consumer will be able to afford it without mortgaging his house, his son's house, his grandson's house... etc.
 
Guest said:
Is this for real? Looks like a copy of cheese doodles to me. I wonder if Frito Lay will sue HP.

I wonder if the consumer will be able to afford it without mortgaging his house, his son's house, his grandson's house... etc.

Or a few other things, like his kidneys and lungs, you mean? :D Hey, but technology has a habit of becoming cheaper as time goes on, so I wouldn't count on it, by the way.
 
Long time coming -- there had to be something better than this OLD tech just made smaller and faster.

Going back to the late 70s, on the personal computers of the day 16K of RAM was a lot!

Divide that number into your current PC's RAM... pretty sick.
 
Old news... I read about this like a year ago... they still haven't said anything new.
 
What the??? How is this going to replace a CPU, or even memory?

"retention of off/on states without power and a proven durability which exceeds hundreds of thousands of reads/writes".

At 3 GHz, a PC could burn through the 'durability' in about a millisecond! One loop counter, "do i = 1 to 1000000", and it's gone!
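Rough math, with one assumption of mine baked in (a worst case where the same memristor cell gets written on every single clock cycle):

# Back-of-envelope endurance check (assumes the same cell is rewritten
# on every clock cycle, with no wear leveling of any kind).
clock_hz = 3e9               # the 3 GHz clock mentioned above
endurance_writes = 500_000   # stand-in value for "hundreds of thousands" of writes
wear_out_seconds = endurance_writes / clock_hz
print(round(wear_out_seconds * 1e6), "microseconds")   # ~167 -- well under a millisecond

So even with generous numbers, the cell is toast in a fraction of a millisecond unless writes get spread around.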

Replace Compact Flash, perhaps.
ChrisH
 
Guest said:
What the??? How is this going to replace a CPU, or even memory?

"retention of off/on states without power and a proven durability which exceeds hundreds of thousands of reads/writes".

At 3 GHz, a PC could burn through the 'durability' in about a millisecond! One loop counter, "do i = 1 to 1000000", and it's gone!

Replace Compact Flash, perhaps.
ChrisH

Should probably keep in mind that the first commercial transistors had lifetimes measured in the 1000s of cycles as well... And were huge... If you applied the same comparative logic by only looking at the original specs of transistors, computers as we know them now would be impossible.

Improvements will come, it's still a very new implementation of a theory that's been around a while. Efficiencies and densities will get better as experimentation moves the concept forward. Just like in every other aspect of computing.
 
Badfinger said:
Long time coming -- there had to be something better than this OLD tech just made smaller and faster.

Going back to the late 70s, on the personal computers of the day 16K of RAM was a lot!

Divide that number into your current PC's RAM... pretty sick.

Totally agree... Maybe this is the missing link that will help them get the Stargate up and running.
 
You guys have to remember that HP is behind this. If you ever really worked on computers, you know that if one thing on a 2-3 year old HP dies, so do two or three other things. They're junk. If HP is leading the implementation of this new theory, then don't get your hopes up too high for it. Once another company gets hold of it, then maybe.
 
Guest said:
You guys have to remember that HP is behind this. If you ever really worked on computers, you know that if one thing on a 2-3 year old HP dies, so do two or three other things. They're junk. If HP is leading the implementation of this new theory, then don't get your hopes up too high for it. Once another company gets hold of it, then maybe.

And YOU should probably keep in mind that HP Labs is a research and development group within HP, not the PC and printer division -- just like IBM has a research department, as do most other big forward-thinking companies in the industry. HP Labs, like the other R&D divisions out there, has been responsible for developing the little enhancements and breakthroughs that power almost everything you use on a daily basis. And if you have ever printed a page in your life, you are most likely using some level of tech that HP helped foster.

True, I can't say their PCs are great. But they are pushing the envelope with the new memristor technology, and your spouting disdain for their hardware line has absolutely nothing to do with this research.
 
"CPUs are often made with tens or even hundreds of millions of transistors."
Statements like these undermine the credibility of the article. CPUs have already passed the 2 billion transistor mark.
 
The application of this technology strikes me as pretty simple: one type of RAM to rule them all. I don't know if it will be the fastest, or a replacement for the whole schmear (DDR5), but it certainly has to become the most important.

If everyone can get to see data as code and code as data -- which it is -- then it follows that software will become inherently self-modifying. Effectively, because you do not need to load anything to use it, applications will become more specialized and modular; packing in features will be pointless when it is easier to just access what you need right away. The filesystem becomes the global data structure on which all programs interoperate, with no need for other layers.

Programs could actually work together more seamlessly, because this kind of RAM obliterates the need to load or save things; an access to a file would MEAN an access to fast memory. You don't need to load or save data anymore (unless you're copying it to another area for backup), so suddenly there only needs to be ONE program that can invoke and manipulate modules, randomly and performantly. OSes will become ten times smaller without all those layers that try to streamline the separation between storage and "active" programs. We'll probably just call everything "objects" in the future.
 
It's called memory-mapped mass storage, and many servers with massive amounts of memory already implement it, with underlying layers of cache essentially acting as your so-called no-need-for-layers. With this technology in place, there would be little change required for any server that adopted it. What you said is nothing special; it has been theorized, and argued against the currently standard modular model of processor, memory and mass storage, since as far back as the 70s.
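For what it's worth, you can already play with that access model today using memory-mapped files. A minimal Python sketch (the file name and size here are made up):

import mmap

# Create a small file to stand in for "mass storage".
with open("example.dat", "wb") as f:
    f.write(b"\0" * 4096)

# Map the file into the process address space: reads and writes to the
# mapping ARE reads and writes to the file, with no separate load/save
# step -- the same access model described above, minus persistent-RAM speed.
with open("example.dat", "r+b") as f, mmap.mmap(f.fileno(), 0) as mem:
    mem[0:5] = b"hello"      # write straight "into storage"
    print(mem[0:5])          # read it back as if it were ordinary RAM

Today the OS still shuffles pages between the file and DRAM behind the scenes; with memristor-based memory that copying would simply disappear.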
 