What is 'Intention'?

EU AI Act's Biometric Data Provisions Lack Clarity, IAPP Panel Says

DUBLIN -- One of the least clear aspects of the EU AI Act is its provisions on the use of biometric data, panelists said Wednesday at the IAPP AI Governance Global Europe 2025 conference.


Older privacy laws defined biometric data narrowly, but the AI Act expanded the General Data Protection Regulation (GDPR) definition to encompass other use cases, said Bird & Bird attorney Nora Santalu. The problem is that no one knows how broad the definition is, and the AI Act and GDPR definitions appear misaligned, Santalu said.

Biometric systems prohibited by the AI Act include "obvious" ones, such as those inferring emotions in workplaces or schools, Santalu noted. Other uses not expressly prohibited, such as remote biometric identification and emotion recognition systems, are classed as high-risk.

In addition, there are gray areas. For example, it's unclear whether hairstyles qualify as biometric data, Santalu said.

One extremely hazy area involves emotion recognition systems (ERS), said Santalu. The AI Act defines an ERS as one that has the purpose of identifying or inferring the emotions or intentions of natural persons on the basis of biometric data. But, Santalu asked, what counts as an emotion, and what counts as an intention?

The Dutch Data Protection Authority has consulted on emotion recognition, noted Emilie Van den Hoven, the DPA's senior supervisor for AI & Algorithms. "It is very difficult to make sense of" what the notion of intention entails, and EU AI Act guidelines don't say much about it, she said.

Many respondents to the Dutch inquiry called for greater clarity in interpreting the law, said Van den Hoven. She noted that if an ERS doesn't use biometric data, it doesn't fall under the AI Act.

Respondents also said that ERS use is taking off quickly in education and customer service, Van den Hoven reported. They were concerned about it and stressed the need for more clarity, she added. Commenters also said ERS would be difficult to deploy as part of a multi-modal system because it's unclear how the systems could be separated. In addition, respondents worried about making sense of AI Act provisions in light of the GDPR.

The Dutch DPA is preparing its next AI risk report, said Van den Hoven. Anticipated this summer, it will address emotion recognition.

Asked whether it's too early to regulate uses of neurotechnology, Van den Hoven said no. Science and the law must be able to talk to each other, she said: There are already neurotechnology devices on the market, so privacy watchdogs must deal with them.

Asked why the use of intention in AI systems might be classed as high-risk in some circumstances rather than prohibited outright, Van den Hoven said it will be up to the European Court of Justice or enforcement authorities to interpret the notion.