Regulator Unveils Age-Verification Guidance Under U.K. Online Safety Act
U.K. regulator Ofcom Thursday issued industry guidance detailing how apps and sites can implement effective age checks to keep children from encountering online porn and protect them from other harmful content. Pornography providers have until July to introduce age checks, it said. The office also published a statement on age assurance and children's access, and warned that its age assurance enforcement program is open for business.
The documents are part of Ofcom's implementation of the U.K. Online Safety Act (OSA). It was enacted in 2023, but many of its provisions became effective this month, according to an Ofcom compliance timeline.
The law imposes obligations on user-to-user platforms (including social media) and online search platforms, such as requiring them to have systems and processes that reduce the risk of their services being used in ways that could harm users, CMS attorneys Anna Soilleux-Mills and Laura Bilinski wrote. Ofcom and the Information Commissioner's Office (ICO) will monitor compliance.
The OSA will be implemented in phases, the attorneys noted. Services covered by the OSA must now begin assessing the risks of illegal harm on their platforms, and complete their assessments by March 16. If draft codes awaiting approval are adopted, providers must begin shielding users from illegal content and activity online March 17.
The act defines illegal content as "content that amounts to a relevant offense." It distinguishes priority offenses, such as those related to terrorism, stalking and intimate image abuse, from nonpriority offenses (existing offenses under U.K. law that aren't designated as priority).
"Ofcom has made clear that it will not hesitate to use its enforcement powers," including criminal prosecution, when it discovers noncompliance, the attorneys wrote. Under the OSA, Ofcom can impose a fine of up to $22 million (£18 million) or 10% of a company's qualifying worldwide revenue, whichever is greater. It can also ask courts for orders limiting access to noncompliant platforms in the U.K. The regulator has confirmed that its "preferred approach to enforcement is a collaborative one -- but businesses will need to act quickly in conducting adequate risk assessments."
The second phase of the OSA concerns child safety, pornography and the protection of women and girls. Thursday's decisions are part of this phase, and all companies in scope must now begin assessing children's access to their services. Ofcom expects to publish draft guidance on protecting women and girls in February.
In Phase 3, certain user-to-user and search services will be classified as Category 1, 2A or 2B services and will be subject to additional requirements. Final steps of OSA implementation are likely to take place early next year, the attorneys said.
In a joint statement last May, Ofcom and the ICO described how they plan to cooperate on regulating online services under the OSA where online safety and data protection intersect, an ICO spokesperson emailed Thursday. Since then, the spokesperson said, "We've been working closely with Ofcom, including keeping our joint statement under review."
The offices agreed to monitor "collaboration themes" of common interest. Themes identified so far include age assurance (age estimation and verification); proactive tech and relevant AI tools for activities such as content identification, user profiling and behavior identification technology; and default and geolocation settings for children. "It is envisaged these themes will evolve over time," the statement noted.
The ICO welcomed Ofcom's new guidance on age checks, adding that the two offices will "continue to work together, so companies are protecting people's information when designing and operating their online safety systems and processes."