States Group Demands Answers About Meta AI Exposing Minors to Sex Risks

A South Carolina-led coalition of 28 states sent a letter to Meta on Tuesday raising questions about its AI assistant, Meta AI, which allegedly exposes minors to sexual exploitation risks. Although Meta has said the AI assistant is safe and appropriate for all ages, the states argue that recently reported incidents prove otherwise.

“We are alarmed by the reports that Meta allows children on Facebook and Instagram to engage in sexually explicit role-play with AI and fails to adequately warn parents about that use,” South Carolina Attorney General Alan Wilson (R) said in a press release. “It’s also disturbing to see reporting that Meta’s AI assistant allows adult social media users to practice grooming children by engaging in sexual role-play with underage AI personas. If these claims are true, Meta is endangering children and giving predators a new digital tool to exploit them.”

Meta AI is implemented across Meta's platforms and allows users to interact with synthetic personas via text, voice and images, the letter said. Recent reporting has shown that these AI personas engage in sexual conversations with minor users.

"In one case, a Meta-created persona using the voice of John Cena described a sexual encounter with a user posing as a 14-year-old girl and acknowledged its illegality," said Wilson's press release. "User-created underage personas were also implicated in facilitating pedophilic scenarios with adult-identifying users."

The states gave Meta until June 10 to respond to all the questions in their letter, including: "Did Meta remove guardrails from Meta AI to allow sexual or romantic role-play with users?" and "Are any sexual or romantic role-play capacities of Meta AI available on Meta's platforms to users under the age of 18?"