Privacy Activist Files GDPR Complaint Against ChatGPT in Norway
OpenAI's ChatGPT regularly "hallucinates" about people, providing false information without giving them a way to correct it, European privacy advocacy group Noyb charged Thursday in a complaint to the Norwegian Data Protection Authority.
When Norwegian user Arve Hjalmar Holmen asked ChatGPT whether it had information about him, the bot offered a "made-up horror story," Noyb said: It claimed Holmen was a convicted criminal who had murdered two of his children and attempted to murder his third son.
The fabricated story incorporated real details about Holmen's personal life, including the number and gender of his children and the name of his hometown, Noyb said. The bot also falsely said Holmen had been sentenced to 21 years in prison.
Noyb previously filed a complaint about ChatGPT hallucinations in 2024, in which it asked the company to rectify or erase the false information. "OpenAI simply argued it couldn't correct data. Instead, it can only 'block' data on certain prompts, but the false information still remains in the system," Noyb said.
Noyb added: "Given the mix of clearly identifiable personal data and fake information, this is without doubt in violation of the General Data Protection Regulation," which requires companies to ensure that the personal data they produce about people is accurate. The group asked the DPA to order OpenAI to delete the defamatory output and correct its model to eliminate inaccurate results, and to impose a fine on the company.