FTC Assessing Chatbot COPPA Compliance in Orders to Tech Industry

The FTC will explore compliance with the Children’s Online Privacy Protection Act (COPPA) rule as part of its inquiry seeking information from tech companies about AI chatbot interactions with young users, the agency said Thursday.

The FTC issued orders to Alphabet, Meta, Instagram, OpenAI, Snap, X and Character Technologies. The orders were issued under the agency's Section 6(b) authority, which allows the FTC to seek documents and internal communications without a specific law enforcement purpose.

The agency said the inquiry will focus in part on COPPA compliance, including how companies monetize user engagement, process user inputs, develop and approve characters, measure negative impacts before and after launch, employ disclosures, monitor compliance and use or share personal information from users' conversations with chatbots.

House Commerce Committee Chairman Brett Guthrie, R-Ky., and ranking member Frank Pallone, D-N.J., issued a joint statement Thursday supporting the inquiry: "Artificial intelligence has unleashed exponential levels of innovation, yet we are alarmed by recent incidents concerning the use of AI chatbots by minors. While some AI chatbot providers are taking steps to address such horrific and disturbing instances, additional investigation is needed to ensure children and teens are not in danger when using these services."

The FTC should "consider the tools at its disposal to protect children from online harms," the committee leaders added. "We are also hopeful Congress will be able to build on this work with durable, bipartisan legislation to protect children online and empower parents."