Council of Europe Crafting Privacy Guidelines for Neuroscience
The Council of Europe has begun work on data protection and neurotechnology guidelines, which it expects to complete in around two years, sources told us. The interplay between neurotechnology and data protection is important given the growing use of the technology beyond the medical sphere, speakers said at a Data Protection Day conference in Brussels on Tuesday.
The basis for the CoE work is an expert report submitted to its Committee of Convention 108 on data protection in June 2024, a CoE spokesperson said in an email.
The report defines neurotechnology as "an umbrella term used to describe the spectrum of devices, tools, systems, and algorithms used to understand and/or influence, access, monitor, assess, emulate, simulate or modulate the structure, activity and function of the nervous systems of human beings and other animals."
"We all have a brain" and historically it was a black box, said Marcello Ienca, professor of ethics of AI and neuroscience, Munich Technical University School of Medicine and Health, during the Brussels conference. The human brain is the largest data repository on the planet. Accordingly, accessing its inner workings makes mountains of personal information available, said Ienca, a co-author of the CoE report.
Mind-reading isn't necessarily bad, especially for medical purposes, because we can't cure what we can't understand, Ienca added. But we must be very careful about the data gatekeepers and who gets to read brains, because there's a growing ecosystem of consumer-grade neurotechnology and Big Tech wants a share, he said: It's an "arms race to people's minds."
There's a great need for global standards for neurotechnology, said Alessandra Pierucci, chair of the CoE's Consultative Committee of Convention 108. Pierucci, of the Italian Data Protection Authority (Garante), told conference attendees that the CoE is a good place to explore the many issues neurotechnology raises, since the 46-member organization promotes democracy, human rights and the rule of law.
Neuroscience raises privacy challenges such as consent to the use of personal data, Pierucci noted. She expects a lively debate on that issue, acknowledging that consent can't legitimize actions that violate human rights. Neurotechnology regulation needs a risk-based approach similar to that of AI, she added.
The CoE guidelines won't be binding per se on the signatories of Convention 108, but they indicate the approach and tools governments should use to meet the level of protection the Convention calls for, the CoE spokesperson said.
On privacy in any area, the CoE spokesperson said, a distinction must be drawn between two categories of countries in Europe: EU nations, where the General Data Protection Regulation is directly applicable and enforceable, and non-EU CoE member nations, where privacy protections rest on national legislation in line with Convention 108.
There's no sector-specific privacy legislation for neurotechnology so far, which is why the CoE committee has taken the initiative to develop guidance, the spokesperson noted. In the meantime, general data protection rules apply and can be enforced by national data protection authorities.
The European Data Protection Supervisor is also monitoring developments in neurotechnology and privacy, an EDPS spokesperson said by email. A June 2024 EDPS TechDispatch report stressed the need for "strict adherence to data protection principles, including necessity, proportionality and data minimization, due to the intrusive nature of processing such data," the spokesperson wrote. Last July, the European Parliament's Panel for the Future of Science and Technology published a study on the protection of mental privacy in the area of neuroscience (societal, legal and ethical challenges).