Ian Russell is campaigning for recognition of the disparity between the protections in place for children in their offline lives and the dangers facing them online.
Child safety campaigner Ian Russell has warned that children should not have to pay the price for technology companies’ failure to make the necessary changes to protect them from harmful content.
He branded tech companies “a disgrace” for what he sees as a repeated failure to take action to protect children from damaging content.
“It’s not the kids’ fault that this content is there and it’s not kids that should have to pay the price for it,” Mr Russell told i.
“As the years tick past, the more that belief seems to be confirmed, sadly, that there is a reluctance for big platforms to change simply because they’re used to doing things a certain way.
“Maybe that’s not a surprise, but when it so profoundly and detrimentally affects young people’s lives as digital technology can – I don’t think it’s really ever intended to, but it does – and they’re powerful and rich and they don’t take sufficient steps quickly enough to deal with it, then I think it’s just a disgrace.”
Mr Russell, who founded suicide prevention charity the Molly Rose Foundation, is working with child online safety group 5Rights Foundation on a new campaign, Twisted Toys, to highlight the stark differences between what is not accepted offline and what is allowed to happen online.
The campaign includes parody videos featuring ‘Share Bear’, a surveillance camera-equipped teddy bear that collects children’s data, and the ‘Stalkie Talkie’, a walkie-talkie that connects children to random adults, to demonstrate how unacceptable online dangers would be in a physical toy.
Mr Russell recalled the “horrible dawning” he experienced in the weeks following Molly’s death about the dangers of the online world, compared with the protections in place in children’s offline lives.
“When we discovered what Molly had been seeing and liking and viewing online, despite being the youngest of three daughters, growing up in a house when we talked about e-safety and all those things that people do to protect their children, we were shocked and horrified,” he said.
“I don’t think we were naive enough to suspect the internet didn’t contain such horrors, but we didn’t realise they were so widely and easily available. We didn’t realise that the platforms were pushing them algorithmically to children, and even sending emails to connect Molly to other harmful content.
“We didn’t think that those companies could behave like that because it’s just so illegal and immoral, and yet there’s nothing to stop them. And they did, and in some cases, still do behave like that.”
Research conducted by 5Rights found that 90 per cent of 982 parents surveyed said they thought the internet could be harmful to children, while 80 per cent said they did not trust tech companies to protect young people online.
An additional 71 per cent said they thought the Government could be doing more to ensure child safety on the internet.
Technology and social media companies should adopt a mandatory safety-by-design approach when building or running anything that could affect a child, said Baroness Beeban Kidron, crossbench peer in the UK House of Lords and chair of the 5Rights Foundation.
“I think we are at a last resort,” she said. “This should have been the last resort a decade ago. This shouldn’t be happening to children and we must not allow or accept it.”
“The work of Baroness Kidron and her peers, including the development of the Age Appropriate Design Code, informs the expansive work we do every day to protect the safety and privacy of young people using our apps,” a Facebook spokesperson said.
“Thousands of parents work at Facebook, and we feel a collective responsibility to make sure that young people can enjoy all the benefits of our apps while protecting them from harm.”
While Instagram, which is owned by Facebook, made changes to its community guidelines after details of Molly’s death were made public in 2019, including banning graphic self-harm images and videos and adding ‘sensitivity screens’ to blur images, Mr Russell has previously said there is “still too much harmful content available on the platform”.
“Some of the platforms have made attempts to remove harmful content, but it’s still there and still too easy to find. I accept it’s a very difficult task, but I’m sure there must be something more that could be done,” he told i in March.