Facebook ‘less safe’ for children after ‘morally reprehensible’ decision

The National Crime Agency has warned parents about allowing children on Facebook after Meta’s decision to introduce encrypted messaging raised concerns for their safety.

The decision was described by one minister as “morally reprehensible” and the agency believes it will mean police are alerted to thousands fewer cases of child abuse each year.

It estimates that 92 per cent of the referrals it receives from Facebook and 85 per cent of reports from Instagram will no longer be passed on to police.

Last week Meta decided to encrypt its Messenger service, meaning the company can no longer see what its users share with each other.

The agency said the move would allow child abusers to groom children and paedophiles to share sexual images of children.

Chris Philp, the policing minister, told The Times: “Meta’s decision is grossly irresponsible and will prevent thousands of paedophiles from being arrested. As a parent, it is morally reprehensible Meta is putting profit before the safety of protecting children from predatory sexual abusers.

“It should reverse this terrible decision immediately. I agree with the NCA’s advice that children are not exposed to this level of risk.”

Damian Hinds, the schools minister, also urged Meta to “rethink its decision”. He told Times Radio it was “absolutely paramount” that law enforcement agencies could intercept those peddling and engaging in child abuse. He added: “It’s not about protecting people’s privacy. This is really a question about ability to intercept and to ultimately investigate, bring to justice people who are engaging in child abuse.”

Rob Jones, director general of operations for the NCA, said that it had lobbied the company not to make the move and believed the decision was based on “profitability”.

He said he would advise parents to “think very carefully” about allowing their children on the platform.

He said: “Over the years Meta have acted very responsibly in referring images to us, and that was because they could see what was happening on their platform. They have made the business decision to no longer see what’s happening on their platform by introducing encryption, and I am astounded that that has happened.

“Ever since they announced they were moving to end-to-end encryption, we’ve implored them not to do this, and this week they’ve flipped the switch and things have got a lot harder for us because of that. The net impact is that the platform is not as safe as it was for children because nobody knows what’s going on in there.

“Children are masquerading as adults because there’s no effective age verification, and paedophiles are masquerading as children to establish relationships and groom potential victims.”

He added that although users have to be 13 to sign up to the website, “if you run a platform like that you have got to accept that children will get on to it”.

Graeme Biggar, director general of the NCA, said: “We arrest 800 people a month (regarding child abuse) and safeguard 1,200 children a month. Meta’s contribution has been really important to that so we suspect that will go down.

“We are really disappointed by the decision that they have made, we are not convinced that the measures they are planning to put in place will provide anything like the protection of previous arrangements.

“Meta platforms are less safe for children than they were.”

The NSPCC has previously accused Mark Zuckerberg’s firm of “choosing to turn a blind eye to crimes against children” by moving ahead with the rollout.

Sir Peter Wanless, the chief executive, said the technology made it easier for abusers to exploit young victims and share images with other offenders.

“This flies in the face of the priority the public attaches to basic child safety online,” he said.

End-to-end encryption was a key sticking point as the government’s Online Safety Bill made its way through parliament earlier this year, as the bill proposed giving the regulator Ofcom the power to force platforms to scan messages for abusive or dangerous content.

WhatsApp and Signal were among the platforms that threatened to pull out of the UK if the measure was used.

A spokesman for Meta said: “Encryption helps keep people safe from hackers, fraudsters and criminals.

“We don’t think people want us reading their private messages so have developed robust safety measures to prevent, detect and combat abuse while maintaining online security.

“Our recently published report detailed these measures, such as restricting over-19s from messaging teens who don’t follow them and using technology to identify and take action against malicious behaviour.

“As we roll out end-to-end encryption, we expect to continue providing more reports to law enforcement than our peers due to our industry leading work on keeping people safe.”