Home Secretary Priti Patel is planning to deliver a keynote speech at a child protection charity’s event focused on exposing the perceived ills of end-to-end encryption and calling for stricter regulation of the technology. At the same time, a new report will say that technology companies need to do more to protect children online.
Patel will headline an April 19 roundtable organized by the National Society for the Prevention of Cruelty to Children (NSPCC), according to a draft invitation seen by WIRED. The event is set to be deeply critical of the technology, which makes it harder for investigators and technology companies to monitor communications between people and detect child grooming or illicit content, including terrorist material or child abuse imagery.
End-to-end encryption secures communications between the people involved in them: only the sender and receiver of a message can read its contents, and the platforms providing the technology cannot access them. The technology has increasingly become standard in recent years, with WhatsApp and Signal using end-to-end encryption by default to protect people’s privacy.
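The core idea can be illustrated with a toy sketch. This is not real cryptography and omits the key-exchange and authentication machinery that messaging apps actually use; it simply shows why a platform that relays only ciphertext, without the shared key held by the two endpoints, cannot read the message.

```python
import hashlib
import secrets

# Toy illustration of the end-to-end idea (NOT real cryptography).
# The two endpoints hold a shared secret key; the relaying platform
# only ever sees the ciphertext.

def keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic pseudo-random keystream from the key (toy only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext with the keystream."""
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XORing twice with the same keystream restores the plaintext

shared_key = secrets.token_bytes(32)   # known only to sender and receiver
message = b"meet at noon"

ciphertext = encrypt(shared_key, message)  # this is all the platform sees
restored = decrypt(shared_key, ciphertext)  # only a key-holder can do this
```

Scanning for harmful content on the server, as the report discusses, is impossible in this model precisely because the server never holds `shared_key`.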
The Home Office’s move comes as Facebook plans to roll out end-to-end encryption across all its messaging platforms—including Messenger and Instagram—which has sparked a fierce debate in the UK and elsewhere over the supposed risks the technology poses to children.
During the event, the NSPCC will unveil a report on end-to-end encryption by PA Consulting, a UK firm that has advised the Department for Digital, Culture, Media and Sport (DCMS) on the forthcoming Online Safety regulation. An early draft of the report, seen by WIRED, says that increased use of end-to-end encryption would protect adults’ privacy at the expense of children’s safety, and that any strategy adopted by technology companies to mitigate the effect of end-to-end encryption will “almost certainly be less effective than the current ability to scan for harmful content.”
The report also suggests that the government devise regulation “expressly targeting encryption” in order to prevent technology companies from “engineer[ing] away” their ability to police illegal communications. It recommends that the upcoming Online Safety Bill—which will impose a duty of care on online platforms—make it compulsory, rather than voluntary, for tech companies to share data about online child abuse.
The Online Safety Bill is expected to require companies whose services use end-to-end encryption to show how effectively they are tackling the spread of harmful content on their platforms—or risk fines from the communications regulator Ofcom, which will be in charge of enforcing the rules. As a last resort, Ofcom could demand that a company use automated systems to winnow out illegal content from its services.
The NSPCC says that this set-up does not go far enough in reining in encryption: in a statement released last week, the charity urged the digital secretary, Oliver Dowden, to strengthen the proposed regulation by preventing platforms from rolling out end-to-end encryption until they can demonstrate that children will be kept safe. Facebook currently tackles the circulation of child sex abuse content on WhatsApp by removing accounts displaying forbidden images in their profile pictures, or groups whose names suggest illegal activity. WhatsApp says it bans more than 300,000 accounts per month that it suspects of sharing child sexual abuse material.
“Ofcom will have to meet a series of tests before it could take action on a regulated platform,” says Andy Burrows, the NSPCC’s head of child safety online policy. “That is about being able to require evidence of serious and sustained abuse, which is going to be practically very difficult to do because end-to-end encryption will take away a significant amount of the reporting flow.”