The communications watchdog is prioritising the interests of tech companies over the safety of under-18s, according to the children’s commissioner for England.
Dame Rachel de Souza said she warned Ofcom last year that its proposals for protecting children under the Online Safety Act were too weak. New codes of practice issued by the watchdog on Thursday have ignored her concerns, she said.
“I made it very clear last year that its proposals were not strong enough to protect children from the multitude of harms they are exposed to online every day,” she added. “I am disappointed to see this code has not been significantly strengthened and seems to prioritise the business interests of technology companies over children’s safety.”
De Souza, whose government-appointed role promotes and protects the rights of children, said she had heard the views of more than a million young people and that the online world was one of their biggest concerns. The codes of practice would not allay those fears, she said. “If companies can’t make online spaces safe for children, then they shouldn’t be in them. Children should not be expected to police the online world themselves.”
Measures announced by Ofcom include:
- Requiring social media platforms to deploy “highly effective” age checks to identify under-18s.
- Ensuring algorithms filter out harmful material.
- Requiring all sites and apps to have procedures for taking down dangerous content quickly.
- Giving children a “straightforward” way to report content.
Last year de Souza published a response to an Ofcom consultation on protecting children from online harm in which she made a number of recommendations including regular consultations with children.
The Molly Rose Foundation, a charity established by the family of Molly Russell, the British teenager who took her own life after viewing harmful online content, also criticised the measures, which it said were “overly cautious”. The foundation said flaws in the codes included a lack of annual harm reduction targets.
Ofcom rejected de Souza’s criticism, saying harmful and dangerous online content would be reduced.
“We don’t recognise this characterisation of our rules, which will be transformational in shaping a safer life online for children in the UK,” said a spokesperson.
Melanie Dawes, Ofcom’s chief executive, said the measures were a “reset” and companies failing to act would face enforcement.
“They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content,” she added.
The technology secretary, Peter Kyle, is considering a social media curfew for children following TikTok’s introduction of a feature that encourages under-16s to switch off the app after 10pm.
Kyle told the Telegraph he was “watching very carefully” the impact of the curfew feature.
“These are things I am looking at,” he said. “I’m not going to act on something that will have a profound impact on every single child in the country without making sure that the evidence supports it – but I am investing in [researching] the evidence.”
Kyle said the Ofcom codes should be a “watershed moment” that turned the tide on “toxic experiences on these platforms”.
He added: “Growing up in the digital age should mean children can reap the immense benefits of the online world safely, but in recent years too many young people have been exposed to lawless, poisonous environments online which we know can lead to real and sometimes fatal consequences. This cannot continue.”
Under the children’s codes, online platforms will be required to suppress the spread of harmful content, such as violent, hateful or abusive material and online bullying. More seriously harmful content, including that relating to suicide, self-harm and eating disorders, will need to be kept off children’s feeds entirely, as will pornography.
If companies fail to comply with the requirement to protect children from harmful content, Ofcom can impose fines of up to £18m or 10% of global revenue, whichever is greater. In extreme cases, Ofcom can ask a court to prevent the site or app from being available in the UK.
Senior managers at tech companies will be criminally liable for repeated breaches of their duty of care to children and could face up to two years in jail if they ignore enforcement notices.