On Wednesday, Colorado expanded the scope of its privacy law, initially designed to protect biometric data like fingerprints or face images, becoming the first state in the nation to also shield sensitive neural data.
That could stop companies from hoarding brain activity data without residents realizing the risks. The New York Times reported that neural data is increasingly being collected and sold nationwide. And after a market analysis showed that global investments in neurotechnology leapt by 60 percent from 2019 to 2020, reaching a value of $30 billion in 2021, Big Tech companies have significantly intensified plans to develop their own products to rake in potentially billions.
For instance, in 2023, Meta demoed a wristband with a neural interface used to control its smart glasses and unveiled an AI system that could be used to decode the mind. In January, Elon Musk announced that Neuralink had implanted its first brain chip in a human, allowing the recipient to control a device with their thoughts. And just last month, Apple Insider reported that “Apple is working on technology that could turn the Apple Vision Pro into a brainwave reader to improve mental health, assist with training and workouts, and help with mindfulness.”
Many technologies collect neural data for a variety of purposes, The Times reported. The tech has expanded from medical uses that led to breakthrough treatments to personal uses like monitoring brain activity to help people meditate or interpreting brain signals to try to help users find better matches on dating apps. But not every user understands exactly how their neural data may otherwise be used.
Colorado’s law requires tech companies to gain consent to collect neural data and to be more transparent about how such data is used. Additionally, it must be easy for people to access, delete, or correct any neural data gathered that could be used, either on its own or in combination with other personal data, “for identification purposes.”
Companies must also provide paths for users to opt out of the sale of their neural data or use of their data in targeted advertising. “Tracking a person’s brain activity in real time” could give Big Tech the ultimate tool for targeted ads by theoretically offering “a more reliable, more precise, and personalized representation of an ad’s effectiveness,” Undark reported.
Through neurotechnologies, companies “have access to the records of the users’ brain activity—the electrical signals underlying our thoughts, feelings, and intentions,” NYT reported, but until now, they’ve gone largely unregulated in the US.
In Colorado, Democratic State Representative Cathy Kipp pushed for the privacy law updates by introducing a bill after a member of the board of directors of the Colorado Medical Society, Sean Pauzauskie, told her about loopholes in state laws.
Pauzauskie has since become medical director of The Neurorights Foundation, a charitable organization dedicated to promoting ethical neurotechnology innovation while protecting human rights. The Times noted that advancements in neurotechnology have helped paralyzed patients communicate through computers, which are widely viewed as important medical breakthroughs that critically rely on technology monitoring brainwaves.
Kipp’s bill warned that neural data “can reveal intimate information about individuals, including health, mental states, emotions, and cognitive function” but “outside of medical settings” can “operate without regulation or data protection standards.”
“The things that people can do with this technology are great,” Kipp told NYT. “But we just think that there should be some guardrails in place for people who aren’t intending to have their thoughts read and their biological data used.”
Kipp told NYT that her concern in passing Colorado’s law was ensuring that nobody’s brain activity was being monitored without consent.
“I am excited that our bill to protect neurological and biological data has passed nearly unanimously,” Kipp told Ars. “Clearly, privacy of thought is a bipartisan issue.”
In a report, Neurorights has warned that companies seem to have lax stances when it comes to sharing neural data.
Neurorights surveyed privacy policies and user agreements of 30 consumer neurotechnology companies, finding that all but one company had access to neural data and two-thirds of companies were sharing neural data with third parties. Two companies implied they are selling data. Only one company restricted access to neural data, and four companies clearly stated they do not sell neural data.
Hurdles to federal brainwave data protection
Currently, similar legislation is advancing in California and has been introduced in Minnesota, but while Colorado’s bill passed nearly unanimously, there has been some notable opposition that could stop the country from embracing Colorado’s privacy standards.
Some opposition comes from academic researchers. According to a co-sponsor of Colorado’s bill, Republican State Representative Mark Baisley, private universities fiercely opposed the law because it potentially limited their “ability to train students who are using ‘the tools of the trade in neural diagnostics and research’ purely for research and teaching purposes,” NYT reported.
Baisley told Ars that Colorado’s dispute with private universities is potentially unique to the state. When drafting the law, public universities conducting research with neural data were exempted, due to a conflict where the state attorney general would be charged with both prosecuting and defending public universities if any claims over their use of neural data arose. Because private universities in the state conducting similar research did not receive the same exemption, they opposed the legislation.
Baisley told Ars that he intends to push a follow-up bill next year to remove the exemption from public universities and resolve the conflict with private universities.
Other opponents include tech companies. TechNet, which represents companies like Apple, OpenAI, and Meta, pushed for changes in a parallel Colorado bill. TechNet won a battle to update the bill text to include language “focusing the law on regulating brain data used to identify individuals,” NYT reported, but lost a battle to ditch “very broad” language relating to data generated by “an individual’s body or bodily functions,” which Colorado’s law now includes.
The ACLU raised concerns about limiting the law to only cover data that can be used to identify individuals, which Colorado’s law currently does, instead recommending policy that restricts all biometric data collection, retention, storage, and use. In Colorado, this limitation means that companies that don’t specifically collect brainwave data for identification purposes—but for other purposes such as decoding someone’s thoughts or feelings—won’t be impacted by the law.
But although it may not be a perfect privacy law, it’s still progress, Neurorights co-founder Jared Genser told NYT.
“Given that previously neural data from consumers wasn’t protected at all under the Colorado Privacy Act, to now have it labeled sensitive personal information with equivalent protections as biometric data is a major step forward,” Genser said.
Neurorights is hoping that Colorado’s law will inspire federal lawmakers to take similar action soon.
In a post on X, Neurorights celebrated the law’s passage, declaring Colorado “the first place in the world to legally define and protect neural data as sensitive.”
“Hopefully, we’ve begun some momentum that the world will take on,” Baisley told Ars.
This story was updated on April 18 to include comments from co-sponsors of Colorado’s law.