The U.K. Online Safety Bill, which has been in development for years, is now ready to become law, and reactions are mixed.
Originally intended to make social media companies accountable, the bill's scope has grown over time. To be enforced by the regulator Ofcom, it requires companies, small as well as large, to remove illegal content and prevent children from seeing harmful material, and it now also covers offenses such as cyberflashing, animal cruelty and online fraud.
Technology Secretary Michelle Donelan said, "Our commonsense strategy will bring a better future to the British public by ensuring that what's prohibited offline remains illegal online." The plan, she added, puts the safety of children first and will help catch keyboard criminals before they can commit heinous crimes.
Companies that fail to comply could face fines of up to £18 million or 10% of their global annual revenue, whichever is greater—billions of pounds, in the case of the largest platforms.
The bill's final version, however, does little to ease privacy concerns: Ofcom has the authority to issue notices requiring companies to scan messages for illegal material.
Earlier this month, Lord Parkinson of Whitley Bay made a public statement that appeared to signal a slight retreat by the government.
"When deciding whether or not to issue a notice [to scan for CSAM], Ofcom will work with the service to identify reasonable, technically feasible solutions to address the child sexual exploitation and abuse risk, including drawing on evidence from a skilled person's report," he said.
Several groups have welcomed the new bill, from Which?, a consumer-protection advocacy group, to charities such as the National Society for the Prevention of Cruelty to Children (NSPCC).
NSPCC CEO Sir Peter Wanless said, “Technology companies now have an opportunity to design safety into their products.”
Encrypted messaging apps such as Signal and WhatsApp are unlikely to disappear from the U.K. anytime soon, thanks to the government's recent decision to soften the requirement that companies be able to break encryption.
Some rights groups are still not happy.
“While the UK government has admitted it’s not possible to safely scan all of our private messages, it has granted Ofcom the powers to force tech companies to do so in the future,” says Open Rights Group campaigns manager James Baker.
Such powers, he argues, would be more appropriate to an authoritarian regime than a democracy, and could harm whistleblowers and journalists, as well as domestic abuse survivors, children and parents trying to secure their online communications against predators and stalkers.
Meanwhile, the Electronic Frontier Foundation says that if regulators claim the right to require the creation of dangerous backdoors in encrypted services, "we expect encrypted messaging services to keep their promises and withdraw from the UK" rather than compromise their ability to protect their users.
The new requirement for scanning to be “technically feasible” allows Ofcom to kick the can of end-to-end encryption down the road—quite possibly, indefinitely.
But, says WhatsApp head Will Cathcart in a tweet, "The fact remains that scanning everyone's messages would destroy privacy as we know it. It was true both last year and today." WhatsApp, he added, will never break its encryption and remains vigilant against attempts to make it do so.