Human thought crawls at 10 bits per second, Caltech study finds

Skye Jacobs

What just happened? Scientists have discovered that our brains process thoughts much more slowly than previously believed. This surprising finding has its roots in our evolutionary history and sheds more light on why our minds work the way they do.

Researchers at the California Institute of Technology have unveiled a startling revelation about the human mind: our thoughts move at a mere 10 bits per second, a rate that pales in comparison to the staggering billion bits per second at which our sensory systems gather environmental data. This discovery, published in the journal Neuron, is challenging long-held assumptions about human cognition.

The research, conducted in the laboratory of Markus Meister, the Anne P. and Benjamin F. Biaggini Professor of Biological Sciences at Caltech, and spearheaded by graduate student Jieyu Zheng, applied information theory techniques to an extensive collection of scientific literature. By analyzing human behaviors such as reading, writing, video gaming, and Rubik's Cube solving, the team calculated the 10 bits per second figure – a rate that Meister describes as "extremely low."
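To make the unit concrete, here is a back-of-envelope sketch, illustrative only and not the paper's actual calculation: a behavioral information rate is symbols per second times entropy per symbol. The typing speed below is an assumed typical figure, and the entropy value is Shannon's classic estimate of roughly one bit per character of English text.

```python
# Illustrative estimate of a behavioral information rate:
# rate = symbols per second x entropy per symbol.

typing_chars_per_s = 120 * 5 / 60   # fast typist: 120 wpm, ~5 chars/word
entropy_per_char = 1.0              # Shannon's ~1 bit/char for English text

thought_rate = typing_chars_per_s * entropy_per_char
print(thought_rate)                 # 10.0 bits/s

# Compare with the article's ~50 Mbit/s Wi-Fi figure:
wifi_rate = 50e6
print(wifi_rate / thought_rate)     # 5000000.0 -> ~5 million times faster
```

The point is that the "10 bits" measures distinguishable choices made per second, not raw neural traffic.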

To put this in perspective, a typical Wi-Fi connection processes about 50 million bits per second, making our thought processes seem glacial by comparison. This stark contrast raises a paradox that Meister and his team are eager to explore further: "What is the brain doing to filter all of this information?"

The human brain contains over 85 billion neurons, with one-third dedicated to high-level thinking in the cortex. Individual neurons are capable of transmitting more than 10 bits per second, yet our overall thought process operates at a much slower rate. This discrepancy presents another conundrum for neuroscientists to unravel.

Furthermore, the study highlights a peculiar constraint of human cognition: our ability to process only one thought at a time, rather than multiple thoughts in parallel like our sensory systems. This sequential nature of thought is exemplified in activities such as chess, where players can only envision one possible sequence of moves at a time.

Zheng and Meister propose that this limitation may be rooted in our evolutionary history. They suggest that the earliest creatures with nervous systems primarily used their brains for navigation – moving towards food and away from predators. If our complex brains evolved from these simple systems, it would explain our tendency to follow only one "path" of thought at a time.

"Human thinking can be seen as a form of navigation through a space of abstract concepts," the researchers write.

This new quantification of human thought speed has far-reaching implications, potentially debunking some futuristic scenarios proposed by tech visionaries. For instance, the idea of creating direct interfaces between human brains and computers to accelerate communication may be less promising than previously thought, as our brains would likely still communicate at the same 10 bits per second rate.

The study also suggests that our cognitive speed is well-suited to our environment. "Our ancestors have chosen an ecological niche where the world is slow enough to make survival possible," Zheng and Meister note. "In fact, the 10 bits per second are needed only in worst-case situations, and most of the time our environment changes at a much more leisurely pace."

 
Hey, I’m a computer guy just as much as the rest of us, but human brains running at 10 bits? What utter nonsense. It seems like an absolute apples-to-oranges comparison to try to extrapolate our data rate or bit rate.

“Yeah, we’re disappointed, our current estimates say the human brain won’t be able to hit 5GHz.”
 
If 10 bits = 1.25 bytes = 1.25 letters, I just read this article a lot faster than 10 bps. Thinking the study is rubbish took no extra time.
Unless your thoughts don't use ANSI encoding for letters, which is kinda certain.

So yeah, it's extremely hard to measure in any shape or form.
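For what it's worth, the 1.25-letters-per-second arithmetic above assumes 8 bits per letter; the study's framing counts entropy instead, and predictable English text carries closer to 1 bit per character. A quick sketch of how much that encoding assumption changes the answer (the reading speed is an assumed typical figure, not from the study):

```python
reading_wpm = 250                   # assumed typical silent-reading speed
chars_per_s = reading_wpm * 5 / 60  # ~5 characters per word

ascii_rate = chars_per_s * 8        # naive 8 bits per letter
entropy_rate = chars_per_s * 1.0    # Shannon's ~1 bit/char for English

print(round(ascii_rate))            # 167 bits/s
print(round(entropy_rate))          # 21 bits/s
```

Under the entropy framing, even fast reading lands within a small factor of the study's 10 bits/s, not orders of magnitude above it.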
 
This measurement is very flawed. It fails to account for how vast the brain's neuron network is, and how much is happening in parallel.

We can visualize complex scenarios and highly detailed imagery, all in parallel, which is not possible at some 10-bit speed; it's 1,000 times faster than that.

Vision alone is comparable to handling and processing, in real time, at least 2x 8K@120Hz video streams, flawlessly. That's over 200 Gbit/s of data being processed in real time.
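That figure is easy to sanity-check. It holds for uncompressed 10-bit-per-channel color; with ordinary 24-bit color the same arithmetic comes out just under 200 Gbit/s:

```python
# Raw bitrate of two uncompressed 8K@120Hz streams,
# assuming 10 bits per color channel (30 bits/pixel):
width, height = 7680, 4320
bits_per_pixel = 30
fps, streams = 120, 2

raw = width * height * bits_per_pixel * fps * streams
print(raw / 1e9)   # 238.87872 -> ~239 Gbit/s
```

Note this is the raw sensory input rate, which the article itself puts near a billion bits per second; the 10 bits/s claim is about what survives into conscious thought, not what hits the retina.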
 
This measurement is very flawed. It fails to account for how vast the brain's neuron network is, and how much is happening in parallel.

We can visualize complex scenarios and highly detailed imagery, all in parallel, which is not possible at some 10-bit speed; it's 1,000 times faster than that.
Parallel for sure, and not just a meager 16 or 32 lanes; millions or billions. Just makes one wonder what a human brain must do to make a memory, which I suppose is required for us to have rational thought.
 
Well, considering this is from Caltech, it's probably a bunch of over-theorized, overfunded, time-wasting PhD-thesis bullsh*t.

Besides, we have five senses, and all can be processed simultaneously (to one degree or another). So, even if this is true, at least we have a 5-bit bus width to fall back on.

Which doesn't include the memory transfer speed at which we can access our "ROM".
 
This measurement is very flawed. It fails to account for how vast the brain's neuron network is, and how much is happening in parallel.

We can visualize complex scenarios and highly detailed imagery, all in parallel, which is not possible at some 10-bit speed; it's 1,000 times faster than that.

Yeah, this was my thought as well; maybe it was said in the paper but not the article. Our brains are more like HBM than L3 cache.
 
Hey, I’m a computer guy just as much as the rest of us, but human brains running at 10 bits? What utter nonsense. It seems like an absolute apples-to-oranges comparison to try to extrapolate our data rate or bit rate.

“Yeah, we’re disappointed, our current estimates say the human brain won’t be able to hit 5GHz.”

Another computer guy here, and I totally agree. Additionally, thoughts are impulses before they are words; thoughts are in multiple languages depending on where we're born, etc., so there's a common impulse layer before they are expressed in our chosen language. I bet that impulse layer isn't firing at only a few bits.
 
Well, considering this is from Caltech, it's probably a bunch of over-theorized, overfunded, time-wasting PhD-thesis bullsh*t.

Besides, we have five senses, and all can be processed simultaneously (to one degree or another). So, even if this is true, at least we have a 5-bit bus width to fall back on.

Which doesn't include the memory transfer speed at which we can access our "ROM".
Well, considering this is from Caltech, it's probably a bunch of over-theorized, overfunded, time-wasting PhD-thesis bullsh*t.

Besides, we have five senses, and all can be processed simultaneously (to one degree or another). So, even if this is true, at least we have a 5-bit bus width to fall back on.

Which doesn't include the memory transfer speed at which we can access our "ROM".
Well, considering this is from Caltech, it's probably a bunch of over-theorized, overfunded, time-wasting PhD-thesis bullsh*t.

Besides, we have five senses, and all can be processed simultaneously (to one degree or another). So, even if this is true, at least we have a 5-bit bus width to fall back on.

Which doesn't include the memory transfer speed at which we can access our "ROM".

Actually, we have 11 primary senses; the view of 5 senses has been updated in more recent times, along with 26 secondary senses. It just shows how much we still have to learn about the brain; it’s an incredible piece of bioengineering. I read somewhere that the most powerful supercomputers have the computational capacity of an ant's nervous system; estimates are the year 2050 before we have anything that can mimic the human brain, and even then it will only be capable of AGI.
 
It may have 10 bits... but the programming behind it? That took time and evolution. The paradox is that all this evolution was connected to all the space around it... Even LUCA, the tiniest aspect of our evolution, reacted to its environment and used it to evolve; used rain, they said.

Also, the brain has the ability to overclock a bit, by means of affirmations, goals, etc.

The thought is linear, but what lies beyond it is not. They say we evolved because we saw beyond limitations, after all.
 
@Just Some Dude I really wish you had left some sort of link to the 11/26 hypothesis of the human senses count.

This is from a doctor at Johns Hopkins, in which he puts the count at nine:


I'll go along with six, as adding the sense of balance is reasonable (which, BTW, cats and mountain goats share in overabundance). OTOH, I'm not entirely sure why an inbuilt function of the autonomic system is necessarily entitled to be categorized as a "sense". I rather think the "sense of balance" is more of a slang term than a biological explanation of its causation for existence.

However, when you add "sense of pain", I would argue that pain (to some degree, but not in all cases) is a component of touch. Again, arguably, it could be interpreted as an instinctive warning signal. Meanwhile, "sense of temperature" is also a component of touch; it's just that temperature is a general application of touch, while "touching something" with your finger (or for that matter your backside) is more localized. Although that doesn't hold true for all rumps.

Animals can feel pain, see, hear, smell, touch, and taste. Pretty much, we ascribe all animal behaviour to "instinct", but disavow any innate instinctive abilities in our species as "learned". In reality, our senses are simply traits acquired via the evolutionary process. This is a fascinating topic, though. I believe in years past, pondering how many instincts we have was all the rage. (Don't know exactly where I read that, but I can reasonably be excused for not remembering, as I'm pretty old, and my bit rate is likely well below 10 at this stage of my life.) :rolleyes:

Do you mind if I ask why you quoted my post three times? :confused:

In any case, I wish you and yours a happy holiday season.
 
This measurement is very flawed. It fails to account for how vast the brain's neuron network is, and how much is happening in parallel.

We can visualize complex scenarios and highly detailed imagery, all in parallel, which is not possible at some 10-bit speed; it's 1,000 times faster than that.

Vision alone is comparable to handling and processing, in real time, at least 2x 8K@120Hz video streams, flawlessly. That's over 200 Gbit/s of data being processed in real time.
Ah, now there we have a very interesting subject. The brain's ability to handle vision is very dependent on pattern recognition and learning. The raw capabilities of the brain handling vision are very poor, and it takes years for the brain to develop good vision. Different parts of the brain play different parts in processing an image, so there is a lot of learning and parallel processing going on. Not all of it is plain thought power.

In a way, I believe the results of this study. Many of the things we put down to thought and intelligence depend upon pattern recognition for speed of response. Try a few tests of your brain's ability to process serial bit streams: 25 words per minute of Morse code is just about the limit for a human being, and that is ultimately achieved by trained pattern recognition, not thought. I am starting to convince myself that 10 bits per second is quite good for a human brain.
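The Morse figure above actually lands right on the study's number. Taking 25 wpm with the standard five-character word, and treating each letter as an independent draw from a 26-letter alphabet (an upper bound, since real text is more predictable):

```python
import math

chars_per_s = 25 * 5 / 60        # 25 wpm x 5 chars/word = ~2.08 chars/s
bits_per_char = math.log2(26)    # ~4.70 bits if letters were equally likely

rate = chars_per_s * bits_per_char
print(round(rate, 1))            # 9.8 bits/s -- right at the study's figure
```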
 
Yeah, this was my thought as well; maybe it was said in the paper but not the article. Our brains are more like HBM than L3 cache.
Since this topic directly compares human memory processes to binary data transfer speeds, I feel somewhat compelled to point out that we don't use binary math to process information. Hence, a direct comparison isn't possible.

The only (somewhat logical) comparison I can come up with is that what we categorize as "muscle memory" (as in learning to play some sort of musical instrument) is comparable to us "constantly updating our firmware". Or maybe, "it's like riding a bicycle, you never forget". Again, a childhood "firmware update".
 
It may have 10 bits... but the programming behind it? That took time and evolution. The paradox is that all this evolution was connected to all the space around it... Even LUCA, the tiniest aspect of our evolution, reacted to its environment and used it to evolve; used rain, they said.

Also, the brain has the ability to overclock a bit, by means of affirmations, goals, etc.

The thought is linear, but what lies beyond it is not. They say we evolved because we saw beyond limitations, after all.
I would think that human emotions are responsible for the overclocking, if anything.
 
I suppose the subjects studied were indulging; sounds like something they would do to study something in California.
The valley kidz could probably use a new, more up-to-date euphemism for being stoned anyway.

SIC: "Dude, this last batch really has me underclocked, big time. I can feel every one of my 32 senses" (blows a smoke ring). Then turns to dude #2 and asks, "Want a click?"
 
This is, of course, almost entirely nonsense, and it should be obvious it's nonsense if the authors stepped away from the calculation and thought about whether it could possibly be accurate. Obviously they are measuring something, just not the thing they think they are measuring.
In particular, people can think multiple things at the same time; they just need to be reasonably different rather than too similar...
 
Well, I was able to complete and enjoy Elden Ring + Shadow of the Erdtree despite my human brain deficiencies, so I guess I'm fine :-D
 