Youngsters need an independent online advocate who would act like a consumer watchdog to promote and protect their interests, five of Britain’s leading children’s charities have urged the Government.
They are backing a cross-party amendment to the Online Safety Bill led by Tory peer Baroness Newlove, the former victims’ commissioner and deputy speaker of the House of Lords.
The advocate would represent their interests to Ofcom, the regulator, which has the power to force companies to change practices that might be causing harm to children.
Ofcom will be able to fine companies up to 10 per cent of their global turnover and jail bosses for major failures in their statutory duty of care to protect children from online harms.
The advocate, funded by a levy on the tech firms rather than the taxpayer, would act as an “early warning system” as children and their parents would be able to alert Ofcom to new potentially dangerous trends or problems with social media platforms.
It would also provide a “counterbalance” to the power of the tech giants who have multi-million-pound budgets to pay for lobbyists to protect their commercial interests.
The charities supporting an advocate include the NSPCC, Barnardo’s, 5Rights – whose founder Baroness Kidron is a sponsor of the amendment – and the Molly Rose Foundation, set up by the family and friends of Molly Russell, 14, who took her own life after being bombarded with self-harm and suicide posts by tech firms’ algorithms.
They also include the Breck Foundation, set up by Lorin La’Fave, whose 14-year-old son was groomed and murdered by 18-year-old Lewis Daynes, whom he had met online.
In an exclusive online article for The Telegraph, Baroness Newlove said meeting Breck’s mother had helped convince her that more needed to be done to tackle the profit-driven algorithms of tech firms that led to children being overwhelmed by harmful content.
“Companies have unleashed poorly designed products that have contributed in no small part to online child abuse spiralling to record levels,” said Baroness Newlove.
“Testimony by parents like Lorin La’Fave has been crucial to get the online safety legislation into a place where it will deliver systemic protections for users. But I am convinced more needs to be done to meet the Government’s ambition and ensure children receive a higher standard of protection.”
Sir Peter Wanless, NSPCC chief executive, told The Telegraph: “Years of unregulated social media has been defined by companies overlooking the needs and experiences of children on their sites, resulting in record levels of abuse and harm with tragic and sometimes fatal consequences.
“The Online Safety Bill will bring in much-needed regulation and give children greater protection. But as in every regulated industry there will be pushback from companies, and the regulator would benefit from expert support.
“The amendment to create a statutory child safety advocate is crucial if the legislation is to deliver on the Government’s goal to make the UK the safest place for a child to be online. It will give young users a seat at the table, with access to children’s voices in real time helping to hardwire child protection into the decision-making of the most influential tech company executives.”
Children must know they have a champion on their side
By Baroness Newlove, Deputy Lords Speaker and former Victims’ Commissioner
“I am feeling sick with fear. I was talking with this guy online and trusted him. I sent him quite a lot of nude pictures of myself and now he is threatening to send them to my friends and family unless I send him more nudes or pay him.
“I reported it to Instagram, but they still haven’t got back. I don’t want to tell the police because my parents would then know what I did and would be so disappointed.”
These are the words of Holly*, who called Childline in despair when she was 14. She felt alone and fearful, with no one else to turn to for help.
Holly is one of thousands of children paying the cost of online child abuse that is becoming ‘normalised’ on social media for a whole generation.
Last week the NSPCC released figures showing that online child abuse image crimes were up by two-thirds. The grooming that produces these images is up by 80 per cent.
The Government’s Online Safety Bill rightly aims to make the UK the safest place in the world for a child to be online.
As the legislation has progressed, MPs and Lords like me have heard first-hand the harm that unregulated social media has done to children.
We have been told how in a rush to roll out new technology, companies have unleashed poorly designed products that have contributed in no small part to online child abuse spiralling to record levels.
While in my role as Victims’ Commissioner, I had the honour of meeting Lorin La’Fave, a mother who set up the Breck Foundation in memory of her 14-year-old son who was groomed and murdered by someone he met online.
We have heard from parents whose children have tragically died by suicide after being bombarded with suicide and self-harm content by algorithms built to deliver profit over safety.
This testimony has been crucial to get the legislation into a place where it will deliver systemic protections for users.
But I am convinced more needs to be done to meet the Government’s ambition and ensure children receive a higher standard of protection.
Families have been let down
That’s why today I am backing the NSPCC’s call for the Bill to create a child safety advocate that can stand up for the children and families who have been let down or left to fight alone.
No longer should young people like Holly or bereaved parents like Ian Russell, Ruth Moss, Judy and Andy Thomas, Lorin La’Fave and Stuart and Amanda Stephens have to battle against the might of big tech and navigate the complexities of Parliamentary procedures just to get their voices heard.
No longer should child victims and grieving families have to put their head above the parapet in order to be heard and treated with decency and respect.
The NSPCC has led calls for the creation of a statutory child safety advocate in the Online Safety Bill to represent the interests of young users, who are inherently vulnerable and feel increasingly powerless.
User advocacy bodies are a standard feature in most regulated markets, funded by an industry levy in accordance with the ‘polluter pays’ principle.
A child safety advocate must not be seen as a ‘nice to have’ extra but an essential component of building regulation that can deliver the systemic change our children so desperately need.
By focusing on company systems and processes rather than responding to individual complaints, the advocate would act systemically, providing expert advice to Ofcom so it can respond quickly and effectively to the harms children face.
This would help build a regulatory regime that addresses risk before children come to harm, creating an online world that is safer by design.
Early warning system
In a sector that is characterised by rapid technological change, the advocate would act as an early warning system to quickly spot emerging risks and support the regulator to take swift action.
By identifying dangerous sites and risky design features, the advocate can ensure safeguarding is fused into regulation rather than treated as an afterthought.
We know there will be well-funded efforts from Big Tech to lobby for a regulatory regime that works in their best interests, not the best interests of children.
A child safety advocate will therefore be crucial to take on the influencing muscle and resources of some of the largest companies in the world and enable the regulator to understand the online world as children experience it, not how tech lobbyists want to paint it.
This is crucial to protect children from damaging content, such as suicide and self-harm material as well as sexual abuse.
A few weeks ago Lords saw first-hand the incredibly distressing and alarming material that 14-year-old Molly Russell viewed before her tragic death.
In the safe environment of a Parliamentary meeting room, many were left physically shaken and visibly shocked.
Yet an already vulnerable 14-year-old girl was bombarded with this content on Instagram which ultimately contributed to her death.
During the inquest, the Meta representative said some of the content Molly saw was safe for children. The company’s response to the coroner’s Prevention of Future Deaths report has been labelled a PR move and “business as usual” by her father.
The disconnect between what Silicon Valley executives deem safe for our children and what families want, need, and deserve is staggering.
That is why we need to ensure that, unlike Molly and Holly, children know they have a champion on their side.
The amendment to the Online Safety Bill backed by Lord Knight, Baroness Kidron and me to create a child safety advocate will do that. I urge the Government to roll up their sleeves and back it.
Source: Children’s advocate needed in Online Safety Bill, charities urge (msn.com)