
'AI Craze' Drives Questionable Payments for Biometric Data, EPIC Argues

The Brazilian National Data Protection Authority's (ANPD's) decision to bar Tools for Humanity (TFH) from paying people to have their irises scanned "speaks to a bigger problem" with AI, Justin Sherman, scholar in residence at the Electronic Privacy Information Center (EPIC), argued Thursday.


ANPD's analysis found that paying people in cryptocurrency could compromise the voluntariness of their consent. Consent for processing sensitive personal data, such as biometric data, must be free, informed, unequivocal and provided in a specific manner for specific purposes, the agency said. Payment for iris scans could unduly influence people's decisions, particularly when they're vulnerable and poor.

The TFH example points to larger issues, Sherman wrote. Hype about AI research and development is "driving companies to buy people's data where they otherwise weren't and, conversely, to sell people's (including customers' and users') data where they otherwise wouldn't."

Those practices are bad enough on their own, Sherman argued, but the "AI craze" driving them will rapidly increase data-sharing, data-selling and monetization threats to privacy, enabling repeated exploitation of people's data and causing pronounced harm to vulnerable populations.

Founded by Sam Altman and Alex Blania, TFH previously was called "Worldcoin," Sherman noted. In July 2023, French privacy regulator CNIL said the legality of its biometric data collection appeared questionable. In August 2023, the Kenyan government ordered it to stop collecting its citizens' iris scans, citing a lack of clarity about how the information was stored and uncertainty about the cryptocurrency payments, he said.

In its decision, the ANPD made clear its view that paying people for data doesn't just threaten their ability to truly consent to data collection and use; it can also push people to act against their underlying wish not to have their data collected, especially if they're financially struggling, Sherman said.

Consumers should be wary of promises of money or cryptocurrency for their data, Sherman said, and skeptical of claims that their data will be anonymized, especially when data such as iris scans can't be meaningfully anonymized. Policymakers should consider whether companies asking consumers to sell their data for AI training are obtaining full, affirmative, express consent, and whether such practices are illegal. Most importantly, he said, policymakers should be very concerned that the current AI market environment is only further incentivizing companies to make money from people's personal data.