TechSpot is dedicated to computer enthusiasts and power users. Ask a question and give support. Join the community here.

Intel unveils Pohoiki Beach: a chip cluster that mimics the human brain

By nanoguy · 8 replies
Jul 16, 2019
  1. Intel put on a big show earlier at DARPA’s Electronics Resurgence Initiative 2019 summit in Detroit, with a project called Pohoiki Beach -- essentially a cluster of 64 of the company’s Loihi chips networked on Nahuku boards, capable of mimicking the self-learning abilities of eight million neurons. The system can process information much faster than a traditional CPU+GPU architecture, all while sipping 10,000 times less power than that setup.

    The tech giant says its neuromorphic chips aren’t a suitable replacement for traditional CPU architecture; instead, their potential lies in accelerating specialized applications like constraint-satisfaction problems, graph searches, and sparse coding. Simply put, those algorithms are essential for things like autonomous cars, object-tracking cameras, prosthetics, and artificial skin, to name a few.
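    To make the jargon concrete, here is a minimal sketch of what a constraint-satisfaction problem looks like -- the class of workload Intel says Loihi is suited to accelerate. This is a generic textbook example (map coloring solved by backtracking), not Intel's method; the function and variable names are invented for illustration.

    ```python
    def solve_csp(variables, domains, neighbors, assignment=None):
        """Backtracking search: assign each variable a value from its
        domain such that no two neighboring variables share a value."""
        if assignment is None:
            assignment = {}
        if len(assignment) == len(variables):
            return assignment  # every variable assigned consistently
        var = next(v for v in variables if v not in assignment)
        for value in domains[var]:
            # Constraint check: adjacent regions must get different colors.
            if all(assignment.get(n) != value for n in neighbors[var]):
                assignment[var] = value
                result = solve_csp(variables, domains, neighbors, assignment)
                if result is not None:
                    return result
                del assignment[var]  # backtrack and try the next value
        return None  # no consistent assignment exists

    # Three mutually adjacent regions need three distinct colors.
    regions = ["A", "B", "C"]
    colors = {r: ["red", "green", "blue"] for r in regions}
    adjacency = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B"]}
    print(solve_csp(regions, colors, adjacency))
    # → {'A': 'red', 'B': 'green', 'C': 'blue'}
    ```

    On a conventional CPU this search explores assignments sequentially; the pitch for neuromorphic hardware is that many constraints can be checked in parallel by spiking neurons at far lower power.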

    Pohoiki Beach is now headed to over 60 research partners, some of which have already started adapting it to real-world problems. The chipmaker claims that even when a workload was scaled up 50 times, the system still drew five times less power than conventional hardware running the same task. In another application, mapping and location tracking with Loihi chips was just as accurate as, but 100 times less power-hungry than, a CPU-run solution that’s popular in the industry.

    Intel has promised to scale Pohoiki Beach up to 100 million neurons later this year, paving the way for tiny supercomputers that accelerate AI and other complex tasks. The company is also looking at cramming as many Loihi chips as it can into a USB form factor system codenamed Kapoho Bay, ideal for low-power applications.

    If you’ve been following the discussion around computing technology, you may already know that exotic dreams like MESO devices and quantum computing are still far from becoming a reality. Intel’s been hard at work trying to crack those problems, but its strategy remains centered around chiplets.


  2. VitalyT Russ-Puss Posts: 4,537   +3,125

    I'm not buying it. It all looks and sounds like another attempt at competing with nVidia's CUDA platform, which today is so powerful that the entire industry is moving to it.

    Funny, just minutes ago I left a comment about TensorFlow, and here it is again as a closely related subject, because TensorFlow is the very tool that reveals the full potential of nVidia's CUDA at solving AI tasks. Intel knows it, so they're trying to get a piece of the market by making very bold claims, as usual.

    Anyhow, I don't think the market will be interested in anything this specific in the AI field, because the field is fairly complex and requires a lot of time and learning, and doing all that just for some specific AI tasks, rather than generic ones, doesn't sound like a good investment.

    Today, nVidia CUDA + TensorFlow is a powerhouse pairing for a general-purpose AI engine, which is very much worth studying if you're an IT person. Everything else is mostly speculation.
    Last edited: Jul 16, 2019
  3. grumblguts TS Enthusiast Posts: 97   +93

    They're all on/off switches, mate.
    CUDA cores are on/off switches; tensor cores are on/off switches.
    You get excited over nothing.
  4. VitalyT Russ-Puss Posts: 4,537   +3,125

    Every neuron in your brain is an On/Off switch, and wow, look, you can even type - how do you do that?
  5. Robertrogue TS Booster Posts: 72   +29

    Is this bringing to mind Skynet or anything similar? AI is one thing when it helps solve problems that humans are slow at, yet it still seems like playing with fire: one day it will burn you.
  6. GregonMaui TS Booster Posts: 150   +51

    Obviously the "entire industry" is not going CUDA; there are many approaches in this young science. No one should want CUDA dominance, as that would crush innovation. Let many approaches compete and many smart people enter the field. All approaches, including CUDA, will be better for it.

    Look at both Intel and Microsoft: they gave us crap when they had virtual monopolies. It wasn't until competition broke out that they got back in the game and made some advancements. Do you think that if MS had been competitive, they would have lost mobile? Of course not, they would have been nimble enough to see the trends and compete.

    Maybe you're right that CUDA is best, I have no idea. But I do know competition will make them better.
  7. VitalyT Russ-Puss Posts: 4,537   +3,125

    The point being, Intel is playing its catch-up game where it has been for some time now, pressured by the competition into becoming a jack-of-all-trades: quick to try everything, only to end up good at nothing.
  8. Squid Surprise TS Evangelist Posts: 2,568   +1,548

    Except they're a lot bigger than all of their competitors, so if they actually DO get something right, they can wipe the floor with them...
  9. Markoni35 TS Addict Posts: 257   +115

    The question is whether they are as powerful as a quantum physicist's brain or a politician's brain. Because the latter can be emulated with two switches.
