Your brain waves are up for sale – a new law wants to change that


The law targets brain technologies at the consumer level. Unlike sensitive patient data collected by medical devices in clinical settings, which is protected by federal health law, data from consumer neurotechnologies is largely unregulated, Genser said. This gap means companies can collect large amounts of highly sensitive brain data, sometimes retaining it for an unspecified number of years, and share or sell the information to third parties.

Supporters of the bill expressed concern that neural data could be used to decode a person's thoughts and feelings or to learn sensitive facts about a person's mental health, such as whether someone has epilepsy.

“We've never seen anything with this power — to identify and codify people, and to discriminate against people based on their brain waves and other neural information,” said Sean Pauzauskie, a board member of the Colorado Medical Society, who first brought the issue to Kipp's attention. Pauzauskie was recently hired by the Neurorights Foundation as its medical director.

The new law extends to biological and neural data the same protections that the Colorado Privacy Act grants to fingerprints, facial images and other sensitive biometric data.

Among other protections, consumers have the right to access, delete and correct their data, as well as to opt out of the sale or use of the data for targeted advertising. Companies, in turn, face strict regulations on how they treat this data and must disclose the type of data they collect and their plans for it.

“People should be able to control where that information goes, that personally identifiable and maybe even personally predictive information,” Baisley said.

Experts say the neurotech industry is poised to expand as big tech companies like Meta, Apple and Snapchat get involved.


“It's moving fast, but it's about to grow exponentially,” said Nita Farahany, a professor of law and philosophy at Duke University.

From 2019 to 2020, investment in neurotech companies grew by about 60% globally, and by 2021 it amounted to about $30 billion, according to a market analysis. The industry gained attention in January when Elon Musk announced on the social media platform X that a brain-computer interface made by Neuralink, one of his companies, had been implanted in a person for the first time. Musk has since said the patient had made a full recovery and could now control a computer mouse using only his thoughts and play chess online.

Although some of these technologies sound eerily dystopian, they have led to breakthrough treatments. In 2022, a completely paralyzed man was able to communicate through a computer simply by imagining his eyes moving. And last year, scientists translated the brain activity of a paralyzed woman into speech and facial expressions conveyed through an avatar on a computer screen.


“The things people can do with this technology are great,” Kipp said. “But we just think there should be some guardrails for people who don't intend to have their thoughts read and their biological data used.”

That's already happening, according to a 100-page report released Wednesday by the Neurorights Foundation. The report examined 30 consumer neurotechnology companies to see how their privacy policies and user agreements measured up against international privacy standards.

It found that only one company restricted access to a person's neural data in a meaningful way, and that nearly two-thirds could, under certain circumstances, share data with third parties. Two companies hinted that they were already selling this data.

“The need to protect neural data is not tomorrow's problem, it's today's problem,” said Genser, who was among the authors of the report.


Colorado's bill is the first of its kind to be signed into law in the United States, but Minnesota and California are pushing for similar legislation. On Tuesday, the California Senate Judiciary Committee unanimously approved a bill that defines neural data as “sensitive personal information.”

A number of countries, including Chile, Brazil, Spain, Mexico and Uruguay, have already enshrined brain data protections in their state or national constitutions or taken steps to do so.

“In the long term,” Genser said, “we would like to see global standards developed,” for example, by expanding existing international human rights treaties to protect neural data.

In the United States, proponents of Colorado's new law hope it will set a precedent for other states and even build momentum for federal legislation. But the law has limitations, experts noted, and may apply only to consumer neurotechnology companies that collect neural data specifically to identify a person, as the law's language specifies. Most of these companies collect neural data for other reasons, such as to infer what a person might be thinking or feeling, Farahany said.

“You're not going to worry about this Colorado bill if you're one of those companies right now, because none of them are using the data for identification,” she added.

But Genser said the Colorado Privacy Act protects any data that qualifies as personal. Since consumers must provide their names to purchase a product and agree to the company's privacy policies, neural data collected in those circumstances counts as personal data, he said.

“Given that consumer neural data was not protected at all under the Colorado Privacy Act,” Genser wrote in an email, “classifying it as sensitive personal information, with protections equivalent to those for biometric data, is a big step forward.”

In a parallel Colorado bill, the American Civil Liberties Union and other human rights organizations are pushing for stricter policies on the collection, retention, storage and use of all biometric data, whether or not it is used for identification. If that bill passes, its protections would also apply to neural data.

Big tech companies played a role in shaping the new law, arguing that it was too broad and risked restricting their ability to collect data not strictly related to brain activity.

TechNet, a policy network representing companies such as Apple, Meta and OpenAI, successfully pushed to focus the law on regulating brain data used to identify people. But the group was unable to remove language governing data generated by “an individual's body or bodily functions.”

“We felt that this could be very broad, covering a number of things that all of our members do,” said Ruthie Barko, TechNet's executive director for Colorado and the central United States.

This article originally appeared on The New York Times.

