The Momentous Promise and Peril of Neural Implants
April 8, 2024
Three weeks ago, Neuralink, the neurotechnology company founded by Elon Musk, posted a livestream on X showing a quadriplegic man playing video games using only his mind. The video, while raw and unpolished, is nothing short of magical: performing what looks like Harry Potter-style wizardry, Noland Arbaugh says of his ability to play computer games after the brain-implant surgery, “It has already changed my life.”
But the undeniably awe-inspiring scene belies a deeply troubling implication of this technology. According to an article in the journal Frontiers in Systems Neuroscience, neural devices like Neuralink’s implant are designed to record or manipulate brain activity. As the article elaborates:
As we can retrieve ever more detailed and voluminous information about what is going on “inside” a patient, the issues of data integrity, data security, and privacy are gaining very high relevance for neurotechnology as well. Naturally, the read-out of brain activity and the corresponding data processing help the patient and alleviate the consequences of a disease or disability, thus restoring his or her quality of life to some degree. However, these data also become more “sensitive” the more precisely one is able to interpret the patients’ intentions and internal states.
The point of neural implants is to read your mind – and this is what makes them at once remarkable and dangerous. Brain implants like Neuralink’s will give blind patients the ability to see; but, without proper safeguards, they will also expose those patients’ brains to unprecedented access and potential exploitation by powerful corporations, leaving already vulnerable individuals even more so.
Fortunately, some U.S. state governments are taking action to protect neural data. On March 26, the Colorado Senate unanimously passed HB 24-1058, which would regulate neural and biological data as “sensitive” data under the Colorado Privacy Act. The bill now goes to the governor for signature. Similar efforts are underway in California and in several countries around the world. But, as our Center has written in the context of AI and 3D immersive technology, we need comprehensive federal privacy legislation in the U.S. and elsewhere to close the loopholes in existing local privacy laws and to protect everyone, regardless of where they happen to reside.