Self-Regulation Urged

Brain Data: A Burgeoning Legal Challenge for Privacy

Neurotechnology is the next big thing in privacy law, and our minds are the last vestige of privacy, Cooley lawyers said Wednesday during a webinar.

Advances in neurotechnology make it possible to read data from the brain, and AI can derive information about people from it, speakers noted. Devices can measure brain activity through technologies such as electroencephalography and structural magnetic resonance imaging.

New devices include those that can collect data from the central or peripheral nervous system. Brain-computer interfaces, electrodes implanted into different parts of the brain, can treat illnesses such as epilepsy and paralysis. Most of these devices are at the clinical trial stage and could be on the market in five years, speakers said.

In addition, research continues on interpreting brain data to "read" the gist of a person's thoughts, the attorneys noted. This could be used, for instance, to help stroke victims control their limbs with their minds, or to detect an epileptic seizure in advance. Business use cases include monitoring employees at work or steering customers toward products they like; consumer use cases include managing stress and anxiety; and there are government use cases in law enforcement and crime investigation, as well as military applications.

While reading people's minds has benefits, such as supporting mental health or boosting personal safety, it carries risks, including mental manipulation, discrimination, and exposure to cybersecurity incidents and data breaches, speakers said.

An increasing number of U.S. states and countries worldwide have laws addressing neurotechnology, panelists noted. Colorado and California, for example, recently amended their privacy laws to add neural or biological data to their definitions of sensitive personal information.

Chile was the first country to explicitly protect "neurorights," lawyers noted. In Europe, the European Data Protection Supervisor and Spanish Data Protection Authority issued a report on neurodata last year, concluding that it often constitutes a special category of data under the General Data Protection Regulation.

The EU AI Act bars certain uses of AI, such as emotion-recognition systems, restrictions that would likely apply in the neurodata context, lawyers said. General consumer data protection and privacy laws could also be relevant.

Speakers also noted momentum for extending neurotechnology laws to cover cognitive biometrics that aren't strictly neural data but can likewise be used to infer a person's mental state, such as heart rate or eye movement.

With neurotechnology laws in their infancy, the industry should recognize that it can self-regulate in the neurotechnology and cognitive areas, and that there's every reason to do so, speakers said. Among other things, they recommended that companies go beyond privacy policies to give consumers full transparency about what their products will do and how people's neural data will be used. People should also be able to access, correct and delete their brain data and have choices about secondary uses of the information.

Attorneys also recommended that organizations implement privacy and security by design; future-proof their data anonymization standards; and have a policy for handling law enforcement requests for neurodata. In addition, they said, brain data shouldn't be retained longer than needed, since holding it increases exposure to data breaches and cyberattacks.