NJ Sues Discord for Exposing Kids to Predators, Sexual Content; Discord Disputes Claims

New Jersey Attorney General Matthew Platkin announced Thursday that he is filing a lawsuit against messaging platform Discord over deceptive and unlawful business practices that exposed children on the platform to violent and sexual content, as well as to child predators.

“Discord markets itself as a safe space for children, despite being fully aware that the application’s misleading safety settings and lax oversight has made it a prime hunting ground for online predators seeking easy access to children,” said Platkin. “These deceptive claims regarding its safety settings have allowed Discord to attract a growing number of children to use its application, where they are at risk. We intend to put a stop to this unlawful conduct and hold Discord accountable for the harm it has caused our children.”

The complaint, filed in the Superior Court of New Jersey, Chancery Division, Essex County, follows a multi-year investigation by the AG's office. That investigation found that while Discord represented its platform as safe for children, citing policies prohibiting underage use of the app and direct-messaging features that purportedly scanned for and deleted explicit media from private messages, the reality was the opposite.

"News accounts and reports from prosecutors’ offices illustrate that despite the app’s promises of child safety, predators use the app to stalk, contact, and victimize children," including cases where adults were charged and convicted with using Discord to contact children, often posing as children themselves, and transmitting and soliciting explicit images through the app," according to the press release. "In many criminal cases involving sexual exploitation of children on Discord, the children were under the age of 13, despite Discord’s claim to enforce its policy prohibiting children under 13 from using the app."

The complaint specifically alleges the messaging platform violated the Children’s Online Privacy Protection Act (COPPA) and the New Jersey Consumer Fraud Act.

“Discord claims that safety is at the core of everything it does, but the truth is, the application is not safe for children," said Cari Fais, director of the Division of Consumer Affairs. "Discord’s deliberate misrepresentation of the application’s safety settings has harmed—and continues to harm—New Jersey’s children, and must stop. By filing this lawsuit, we’re sending a clear message that New Jersey will not allow businesses to grow their customer base through unlawful and deceptive practices, especially when those practices put children at grave risk.”

In an email to Privacy Daily, Discord said it is a communications platform, not a social media platform, and as such is not designed to maximize engagement; users remain in control of their experience. The company said Thursday's lawsuit focuses on safety initiatives undertaken years ago, which it has continuously invested in and improved since. Discord also said it offers a range of safety tools and resources for users, parents and educators, in addition to taking action against content and activity that violates its community guidelines.

“Discord is proud of our continuous efforts and investments in features and tools that help make Discord safer," the statement said. "Given our engagement with the Attorney General's office, we are surprised by the announcement that New Jersey has filed an action against Discord today. We dispute the claims in the lawsuit and look forward to defending the action in court.”