WTF?! Not content with worrying everyone about job losses, killer robots, and total extinction, it seems AI also has the capacity to threaten our mental privacy. The warning comes from the United Nations, which says that "warp speed" advancements in neurotechnology, such as brain implants or scans, could allow AI to look into our private thoughts.

The United Nations Educational, Scientific and Cultural Organization (UNESCO) is so concerned about the prospect of human rights violations posed by neurotechnology that it has started developing a global "ethical framework" to address it, writes Agence France-Presse.

Neurotechnology is a field that looks to connect electronic devices to the nervous system. Probably the best-known company working in this area is Elon Musk's Neuralink, which previously showed a video of a nine-year-old monkey playing Pong using only its thoughts, via one of the company's brain implants. In May, the FDA gave Neuralink permission to begin human trials, having rejected the company's application in early 2022.

Although Musk once talked about Neuralink implants allowing users to stream music directly into their brains, its primary focus is to help treat fatal neurological diseases, paralysis, blindness, and more.

As with so many things these days, artificial intelligence is being used to boost neurotechnology, by processing and learning from swathes of data in ways that were never previously possible, said Mariagrazia Squicciarini, a UNESCO economist specializing in AI. "It's like putting neurotech on steroids," she told AFP.

Gabriela Ramos, UNESCO's assistant director general for social and human sciences, said the merging of neurotechnology and AI was "far-reaching and potentially harmful."

"We are on a path to a world in which algorithms will enable us to decode people's mental processes and directly manipulate the brain mechanisms underlying their intentions, emotions, and decisions," Ramos told a conference in Paris.

UNESCO isn't calling neurotechnology a bad thing; Squicciarini praised its ability to help people with severe neurological conditions. But UNESCO is warning that its "warp speed" advancements and AI integration necessitate ethical guidelines to protect human rights.

Neurotechnology has come a long way over the last two decades, but in some cases it can cause more harm than good. As Insider reports, UNESCO spoke to Hannah Galvin, a woman with epilepsy who had an invasive neurological device implanted beneath her skull, on the surface of her brain, to detect seizures and warn her in advance so she knew to lie down before an attack. But Galvin, who suffered more than 100 seizures a day, found the device going off constantly, leaving her feeling like she had "someone inside her head." She eventually had it removed.

"Neurotechnology could help solve many health issues, but it could also access and manipulate people's brains, and produce information about our identities, and our emotions. It could threaten our rights to human dignity, freedom of thought and privacy," said UNESCO director-general Audrey Azoulay in June.

Last week saw Musk, who has long called for tighter regulation on AI, launch xAI, a company that seeks to understand the "true nature of the universe."