ChatGPT Health Raises Privacy Red Flags for CDT
A consumer privacy advocate raised concerns over a new OpenAI health and wellness tool, unveiled Monday.
OpenAI described its new ChatGPT Health as “a dedicated experience that securely brings your health information and ChatGPT’s intelligence together, to help you feel more informed, prepared, and confident navigating your health.”
“ChatGPT Health builds on the strong privacy, security, and data controls across ChatGPT with additional, layered protections designed specifically for health -- including purpose-built encryption and isolation to keep health conversations protected and compartmentalized,” OpenAI said. The company added that the AI app is “designed to support, not replace, medical care,” and isn’t “intended for diagnosis or treatment.”
“To keep your health information protected and secure, Health operates as a separate space with enhanced privacy to protect sensitive data,” the AI company said. “Conversations in Health are not used to train our foundation models.”
However, the Center for Democracy & Technology (CDT) pointed to possible risks.
“New AI health tools offer the promise of empowering patients and promoting better health outcomes, but health data is some of the most sensitive information people can share, and it must be protected,” said Andrew Crawford, CDT senior counsel of privacy and data.
"The U.S. doesn’t have a general-purpose privacy law, and HIPAA only protects data held by certain people like healthcare providers and insurance companies,” added Crawford: AI companies aren’t typically covered by the federal health law.
OpenAI’s announcement, he said, “means that a number of companies not bound by HIPAA’s privacy protections will be collecting, sharing, and using people’s health data. And since it’s up to each company to set the rules for how health data is collected, used, shared, and stored, inadequate data protections and policies can put sensitive health information in real danger.”
In addition, Crawford is wary of OpenAI’s assurances that data will be kept private. “While OpenAI says that it won’t use information shared with ChatGPT Health in other chats, AI companies are leaning hard into personalization as a value proposition,” he said. “Especially as OpenAI moves to explore advertising as a business model, it’s crucial that separation between this sort of health data and memories that ChatGPT captures from other conversations is airtight.”