Intel unveils Pohoiki Beach: a chip cluster that mimics the human brain

nanoguy

The big picture: One of the recurring themes in computer technology in recent years is the dramatic slowdown of Moore’s Law, and how big companies are scrambling to find creative ways around the limitations of silicon. Now Intel has revealed a neuromorphic system that is more than 1,000 times faster than traditional CPUs at specific tasks. It’s not going to run your games any faster, but it is a boon for self-driving cars and the Internet of Things.

Intel put on a big show earlier at DARPA's Electronics Resurgence Initiative 2019 summit in Detroit with a project called Pohoiki Beach -- essentially a cluster of 64 of the company's Loihi chips networked on a Nahuku board, capable of mimicking the self-learning abilities of eight million neurons. The system can process information much faster than a traditional CPU+GPU architecture, all while sipping 10,000 times less power than that setup.
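Loihi's efficiency comes from how spiking neurons work: rather than grinding through dense matrix math on a clock, each neuron integrates incoming signals and only fires when a threshold is crossed, so most of the chip sits idle most of the time. As a rough illustration -- this is a textbook leaky integrate-and-fire model, not Intel's actual programming interface -- a single spiking neuron can be sketched in a few lines of Python:

```python
# Toy leaky integrate-and-fire (LIF) neuron -- the style of spiking model
# that neuromorphic hardware like Loihi implements in silicon.
# Illustrative only; this is not Intel's NxSDK API.
def simulate_lif(input_current, v_thresh=1.0, leak=0.9):
    v = 0.0                # membrane potential
    spikes = []
    for i in input_current:
        v = leak * v + i   # decay old charge, then integrate new input
        if v >= v_thresh:  # threshold crossed: emit a spike and reset
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes

print(simulate_lif([0.3, 0.4, 0.5, 0.1, 0.9]))  # -> [0, 0, 1, 0, 0]
```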

The tech giant says its neuromorphic chips aren't a suitable replacement for traditional CPUs; instead, their potential lies in accelerating specialized workloads like constraint-satisfaction problems, graph searches, and sparse coding. Simply put, those algorithms are essential for things like autonomous cars, object-tracking cameras, prosthetics, and artificial skin, to name a few.
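To give a concrete sense of the problem class, here is a toy constraint-satisfaction problem (graph coloring) solved by ordinary backtracking on a CPU. It is purely illustrative of the kind of workload Intel says Loihi accelerates; it says nothing about how the chip itself solves it:

```python
# Toy constraint-satisfaction problem: color a graph so that no two
# adjacent nodes share a color. Solved by brute-force backtracking --
# the conventional CPU approach neuromorphic hardware aims to beat.
def color_graph(adj, n_colors):
    """adj: {node: set of neighbors}. Returns a coloring dict or None."""
    nodes = list(adj)
    colors = {}

    def backtrack(i):
        if i == len(nodes):
            return True
        node = nodes[i]
        for c in range(n_colors):
            # Constraint: no already-colored neighbor may share this color.
            if all(colors.get(nb) != c for nb in adj[node]):
                colors[node] = c
                if backtrack(i + 1):
                    return True
                del colors[node]  # undo and try the next color
        return False

    return colors if backtrack(0) else None

# A 4-node cycle is 2-colorable:
print(color_graph({0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}, 2))
```

Backtracking search like this scales poorly as problems grow, which is exactly the kind of workload where Intel claims its massively parallel spiking hardware pays off.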

Pohoiki Beach is now headed to over 60 research partners, some of which have already started adapting it to real-world problems. The chipmaker claims that even when the system was scaled up 50 times, it still drew five times less power. In another application, mapping and location tracking with Loihi chips was just as accurate as a CPU-run solution that's popular in the industry, but 100 times less power-hungry.

Intel has promised to scale Pohoiki Beach up to 100 million neurons later this year, paving the way for tiny supercomputers that accelerate AI and other complex tasks. The company is also looking at cramming as many Loihi chips as it can into a USB form-factor system codenamed Kapoho Bay, ideal for low-power applications.

If you've been following the discussion around computing technology, you may already know that exotic dreams like MESO devices and quantum computing are still far from becoming a reality. Intel has been hard at work trying to crack those problems, but its strategy remains centered around chiplets.


 
I'm not buying it. It all looks and sounds like another attempt at competing with nVidia's CUDA platform, which today is so powerful that the entire industry is moving to it.

Funny, just minutes ago I left a comment about TensorFlow, and here it is again as a closely related subject, because TensorFlow is the very tool that reveals the full potential of nVidia CUDA at solving AI tasks. Intel knows it, and so they are trying to grab a piece of the market by making very bold claims, as usual.

Anyhow, I don't think the market will be interested in anything this specific in the AI field. The field is fairly complex and requires a lot of time and learning, and doing all that just for some specialized AI tasks rather than generic ones doesn't sound like a good investment.

Today, nVidia CUDA + TensorFlow is a monster pairing for general-purpose AI, and very much worth studying if you're an IT person. Everything else is mostly speculation.
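If you want to see that pairing in action, a minimal TensorFlow 2.x sanity check (assuming the GPU build of TensorFlow and a CUDA-capable card) shows whether your ops actually dispatch to the CUDA device:

```python
import tensorflow as tf

# List the CUDA devices TensorFlow can see
# (an empty list means a CPU-only build or no GPU present).
print(tf.config.list_physical_devices('GPU'))

# Pin a matmul to the first GPU and confirm where it actually ran.
with tf.device('/GPU:0'):
    a = tf.random.normal((1024, 1024))
    b = tf.matmul(a, a)
print(b.device)  # e.g. '/job:localhost/replica:0/task:0/device:GPU:0'
```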
 
Is this bringing to mind Skynet or anything similar? AI is one thing when it helps solve problems humans are slow at, yet it still seems like playing with fire: one day it will burn you.
 
Obviously the "entire industry" is not going CUDA; there are many approaches to this young science. No one should want CUDA dominance, as that would crush innovation. Let many approaches compete and many smart people enter the field. All approaches, including CUDA, will be better for it.

Look at both Intel and Microsoft: they gave us crap when they had virtual monopolies. It wasn't until competition broke out that they got back in the game and made some advancements. Do you think that if MS had been competitive, they would have lost mobile? Of course not; they would have been nimble enough to see the trends and compete.

Maybe you are right that CUDA is best; I have no idea. But I do know competition will make them better.
 
The point being, Intel is playing the catch-up game it has been playing for some time now, pressured by the competition into becoming a jack-of-all-trades: fast at trying everything, only to end up good at nothing.
 
Except they're a lot bigger than all of their competitors, so if they actually DO get something right, they can wipe the floor with them...
 
The question is whether they are as powerful as a quantum physicist's brain or a politician's brain, because the latter can be emulated with two switches.
 