UK’s Online Safety Act set to protect children from harmful internet content

Children online. (Photo: Thomas Park / Unsplash)

The way people in the UK navigate the internet is set to change significantly under the new Online Safety Act, whose child-protection duties took effect on July 25. The law requires online platforms to implement measures such as age checks to prevent children from accessing illegal and harmful material.

The law, enforced by Ofcom—the UK’s media regulator—aims to make the internet safer, particularly for children, by imposing strict duties on platforms to monitor and remove harmful content. Failure to comply can result in fines of up to £18 million or 10% of a company’s global revenue, whichever is greater, and potentially jail time for executives. In severe cases, Ofcom can seek a court order to block access to offending sites in the UK.

Key Protections Under the Children’s Codes

From July 25, platforms must prevent minors from viewing content related to suicide, self-harm, eating disorders, and pornography. Popular services, especially adult sites, will begin enforcing age verification for UK users.

The law also requires platforms to take a series of proactive steps to protect children online. These include filtering out misogynistic, violent, hateful, or abusive material, as well as combating online bullying and curbing dangerous viral challenges. To meet these standards, platforms must adjust their algorithms to ensure harmful content is excluded from children’s feeds.

They are also expected to implement reliable age verification systems, swiftly remove harmful material when identified, and provide appropriate support for affected children. In addition, each platform must designate a named individual responsible for children’s safety and conduct annual reviews of its risk management processes.

Additional Measures and New Offenses

The Act also requires firms to commit to removing illegal content, including child sexual abuse material, coercive behavior, extreme sexual violence, promotion of suicide or self-harm, illegal drug or weapon sales, and terrorism-related material.

New offenses introduced by the Act include cyber-flashing (sending unsolicited sexual images) and sharing AI-generated “deepfake” pornography.

Criticism and Calls for Stronger Protections

Some campaigners argue the rules do not go far enough. Ian Russell, chairman of the Molly Rose Foundation—which was established after his daughter’s suicide at age 14—criticized Ofcom’s codes for lacking ambition.

The Duke and Duchess of Sussex have also called for more robust social media protections, unveiling a memorial in New York dedicated to children harmed by the internet. Prince Harry told BBC Breakfast, “We want to make sure that things are changed so that no more kids are lost to social media.”

The NSPCC children’s charity highlights that encrypted private messaging apps remain a major risk, arguing the Act does not sufficiently address these platforms. Conversely, privacy advocates warn that digital age verification could infringe on users’ rights and lead to security breaches, errors, and censorship.

Children’s Internet Use and Parental Controls

Ofcom research shows children aged 8 to 17 spend between two and five hours online daily, with nearly all over-12s owning mobile phones and frequently using video platforms like YouTube and TikTok.

While about half of children believe online activity benefits their mental health, a Children’s Commissioner survey found many 13-year-olds encounter “hardcore, misogynistic” pornography and pervasive content about suicide, self-harm, and eating disorders. Violent content is often described as “unavoidable.”

The NSPCC encourages parents to engage actively with their children about online safety. Internet Matters, supported by UK internet companies, reports two-thirds of parents use parental controls, offering guides to manage children’s social media, video streaming, and gaming platforms like Roblox and Fortnite.

However, about 20% of children can disable these controls. Instagram’s “teen accounts” automatically enable privacy settings, though researchers have noted some protections can be bypassed.

Mobile phone and broadband providers often block explicit sites until users verify they are over 18. Both Android and Apple devices provide parental controls to limit app access, block explicit content, and restrict purchases. Gaming consoles also offer parental controls to ensure age-appropriate content and regulate in-game spending.
