Your brain needs a really good lawyer

 


Sigal Samuel 

If you take it for granted that nobody can listen in on your innermost thoughts, I regret to inform you that your brain may not be private much longer.

You may have heard that Elon Musk’s company Neuralink surgically implanted a brain chip in its first human. Dubbed “Telepathy,” the chip aims to read signals from a paralyzed patient’s brain and transmit them to a computer, enabling the patient to control the computer with thought alone. Because it’s used in a medical context, this kind of neurotech is subject to federal regulations.

But researchers are also creating noninvasive neurotech. Already, there are AI-powered brain decoders that can translate the unspoken thoughts swirling through our minds into text, without the need for surgery — although this tech is not yet on the market. In the meantime, you can buy lots of devices off Amazon right now that record your brain data (like the Muse headband, which uses EEG sensors to read patterns of activity in your brain, then cues you on how to improve your meditation). Since these aren’t marketed as medical devices, they’re not subject to federal regulations; companies can collect — and sell — your data.

With Meta developing a wristband that would read your brainwaves and Apple patenting a future version of AirPods that would scan your brain activity through your ears, we could soon live in a world where companies harvest our neural data just as 23andMe harvests our DNA data. These companies could conceivably build databases with tens of millions of brain scans, which can be used to find out if someone has a disease like epilepsy even when they don’t want that information disclosed — and could one day be used to identify individuals against their will.

Luckily, the brain is lawyering up. Neuroscientists, lawyers, and lawmakers have begun to team up to pass legislation that would protect our mental privacy.

In the US, the action is so far happening on the state level. The Colorado House passed legislation this month that would amend the state’s privacy law to include the privacy of neural data. It’s the first state to take that step. The bill had impressive bipartisan support, though it could still change before it’s enacted.

Minnesota may be next. The state doesn’t have a comprehensive privacy law to amend, but its legislature is considering a standalone bill that would protect mental privacy and slap penalties on companies that violate its prohibitions.

But preventing a company from harvesting brain data in one state or country is not that useful if it can just do that elsewhere. The holy grail would be federal — or even global — legislation. So, how do we protect mental privacy worldwide?

Your brain needs new rights

Rafael Yuste, a Columbia University neuroscientist, started to get freaked out by his own neurotech research a dozen years ago. At his lab, employing a method called optogenetics, he found that he could manipulate the visual perception of mice by using a laser to activate specific neurons in the visual cortex of the brain. When he made certain images artificially appear in their brains, the mice behaved as though the images were real. Yuste discovered he could run them like puppets.

He’d created the mouse version of the movie Inception. And mice are mammals, with brains similar to our own. How long, he wondered, until someone tries to do this to humans?

In 2017, Yuste gathered around 30 experts to meet at Columbia’s Morningside campus, where they spent days discussing the ethics of neurotech. As Yuste’s mouse experiments showed, it’s not just mental privacy that’s at stake; there’s also the risk of someone using neurotechnology to manipulate our minds. While some brain-computer interfaces only aim to “read” what’s happening in your brain, others also aim to “write” to the brain — that is, to directly change what your neurons are up to.

The group of experts, now known as the Morningside Group, published a Nature paper later that year making four policy recommendations, which Yuste later expanded to five. Think of them as new human rights for the age of neurotechnology:

Continue reading: Vox

Image by Gerd Altmann on Pixabay
