Privacy Daily is a service of Warren Communications News.
Digital Fairness Act Coming

GDPR, Age Assurance Needed to Protect Kids Online, European Parliament Panel Told

Compliance with EU data protection principles is the first step toward ensuring that minors aren't exposed to harmful content online, European Data Protection Supervisor Wojciech Wiewiorowski said Monday. Age assurance or verification is also important, but it shouldn't entirely block children from accessing some content, speakers said at the European Parliament Internal Market and Consumer Protection Committee (IMCO) meeting.

Panelists' comments will feed into an IMCO own-initiative report on protecting minors online.

Protecting minors online is a priority for the EDPS and European Data Protection Board, said Wiewiorowski. The EDPB is preparing guidelines on processing the personal data of children under the General Data Protection Regulation (GDPR), which could emerge early next year, he added.

Meanwhile, the European Commission is working on a Digital Fairness Act (DFA) to address dark patterns and digital contracts following a review of EU consumer protection laws. Strong protection of minors online is a priority for the EC as well as for the European Parliament, said Maria-Myrto Kanellopoulou, head of the consumer law unit, EC Justice and Consumers Directorate.

EC research found that more than 80% of survey respondents experienced distress and other negative emotions and were confused by dark patterns, Kanellopoulou said. The DFA will try to clarify and make more concrete the EU's various digital consumer laws, give young consumers additional control over their online experiences and limit minors' exposure to practices such as the sale of virtual items, she said.

There must also be a focus on proper enforcement of consumer protection rules and the Digital Services Act, Kanellopoulou said. The DSA governs providers of intermediary services such as social media and online marketplaces, with additional obligations for very large online platforms and very large search engines, defined as those with at least 45 million monthly active users in the EU. Among other requirements, very large online platforms must analyze the systemic risks they create from the dissemination of illegal content or the harmful effects such content has on fundamental rights (see 2311100001).

Platforms will change their services, but progress in that area is being challenged by some tech companies and governments, and by the pressure for growth and jobs, said Leanda Barrington-Leach, executive director of 5Rights, an international nongovernmental organization that advocates for children's digital rights. She urged the EU to enforce the GDPR and other applicable laws, but not to shut kids out of the digital world through age-gating.

There must be a balance between minors' right to privacy and their access to content, said European Games Developer Federation Managing Director Jari-Pekka Kaleva. When needed, age assurance, age verification and GDPR-related data management should occur at the device level, he said, because that's the most privacy-friendly way to implement an age-verification process.