Michelle Donelan, the Digital Secretary, says enforcing age limits will be ‘front and centre’ of the new Online Safety Act.
Teenagers will have to prove they are over 13 to be on social media in the same way that supermarkets demand ID checks to buy alcohol, says the Digital Secretary.
In an exclusive interview with The Telegraph to mark this week’s royal assent of the Online Safety Act, Michelle Donelan said she expected social media platforms to model their age checks for children on Challenge 25 – where shops require ID to buy alcohol if they suspect you are younger.
It would mean any teenager appearing to be under 21 would have to provide “additional information” like age-verified ID to join or remain on a 13-plus social media site like Facebook, Instagram or Snapchat, said Ms Donelan.
Social media firms will be expected to deactivate any under-age children’s accounts or face “humongous” fines for failing in their statutory duty to enforce age limits, she said.
“The reason why so many under-aged children are online at the moment is because it’s a completely unregulated space. It’s like a wild west. If they find that some of their users are not at least 13 – and some platforms have a higher age requirement – then they will have to deactivate accounts,” said Ms Donelan.
Social media bosses could face jail
She also warned social media bosses they would face jail if there was a repeat of the Molly Russell scandal, said she expected social media firms to stamp out misogyny and pledged there would be no compromise on encryption where it put children’s safety at risk from paedophiles.
Enforcing age limits on social media will be “front and centre” of the new legislation, said Ms Donelan, with the companies expected to introduce “highly robust” age checks.
Ofcom, which will be responsible for policing the new regime, estimates as many as 60 per cent of eight to 11-year-olds have social media profiles, equivalent to some 1.6 million children across the UK, despite most sites having 13-plus age limits.
“If a platform doesn’t have a higher age rating than 13, then they could potentially try to assess whether the individual was 21, for example. So it’s about going above and beyond the requirement to ensure that you actually meet the requirement, which is the same as what we see in supermarkets and off licences,” she said.
“Because if they fail, and are weak in their response to this, they will face these humongous fines and worse, potentially.”
Ofcom to access algorithms
The Act gives Ofcom powers to jail tech bosses for up to two years if they persistently breach their legal duties, a measure demanded by campaigners including Ian Russell, the father of Molly, who took her life aged 13 after being bombarded with self-harm and suicide posts as a result of social media’s algorithms.
Ms Donelan said the law would enable the regulator Ofcom to get access to algorithms in any similar case. “It isn’t just always the content, it is the quantity of the content that is seen. That was very much the case in Molly Russell’s case. She was bombarded with absolutely horrific content because of the algorithms,” she said.
“That was why we were so determined to make sure the Act delivered in that sense. If a social media company doesn’t just evade the law, but they also evade the regulator, when it comes to child abuse and exploitation, they could face jail time. That means that we’re going to ensure there is compliance with this Act.”
Ms Donelan said women and girls faced a “disproportionate” number of threats and abuse online but believed that the “triple shield” that she introduced would counter illegal content and hold companies to account by forcing them, by law, to abide by their terms and conditions, most of which ban misogyny.
She said user empowerment – the third element of the triple shield – would enable people to require tech firms to filter out content to ensure they were “prevented from seeing outpourings of heinous material”.
She said the filter would also help protect free speech as it would enable social media users to “see what they want to”. The triple shield replaced proposals to regulate “legal but harmful” content which Ms Donelan admitted “did infringe on free speech by the creation of this quasi-legal category”.
Ms Donelan warned that companies would not be able to go ahead with encryption unless they could assure Ofcom they had measures in place to maintain the same level of protection of children from abusers or paedophiles seeking to groom them.
“The Government strongly believes that encryption is important for privacy, and maintaining privacy. But what we don’t believe and subscribe to is that it is a binary choice between protecting privacy or protecting children from online predators,” said Ms Donelan.
“That is why we have injected a safety mechanism whereby if there was a situation where a platform had encrypted or was about to encrypt, and there were real concerns and evidence that their platform was a hotbed of paedophilia, Ofcom can work with them to see if there are other mitigations they can put in place.
“And if they don’t prove successful and don’t prove robust enough, they could then ask them to invest in research in a technology that will protect encryption whilst also enabling the identification of these heinous individuals, and then potentially deploy it.
“That’s the right thing for any government to do, because I don’t think it would be morally right for the Government to turn a blind eye to online predators. We have a duty to protect our children and the future generation from such harms.”