The UK government is facing a growing revolt from big-name messaging platforms against its revised Online Safety Bill.

Will Cathcart, Meta’s head of WhatsApp, described the Online Safety Bill during a visit to the UK as the most concerning piece of legislation currently being discussed in the Western world, and said that WhatsApp would refuse to remove end-to-end encryption if the bill became law.

WhatsApp’s intervention is the second time a messaging platform has signalled its opposition to the Online Safety Bill. Rival messaging app Signal recently said it could stop providing services in the UK if the bill required it to scan messages.

UK exit?

Meta’s WhatsApp, however, is much bigger than Signal.

Indeed, according to Ofcom, WhatsApp is the most popular messaging platform in the UK, used by more than seven in ten online adults.

Now the chat app’s boss has said that WhatsApp would refuse to comply with requirements in the Online Safety Bill that attempted to outlaw end-to-end encryption.

Cathcart made the comments during a UK visit in which he is meeting legislators to discuss the government’s flagship internet regulation, the Guardian newspaper reported.

“It’s a remarkable thing to think about. There isn’t a way to change it in just one part of the world,” Cathcart was quoted as saying. “Some countries have chosen to block it: that’s the reality of shipping a secure product. We’ve recently been blocked in Iran, for example. But we’ve never seen a liberal democracy do that.”

“The reality is, our users all around the world want security,” Cathcart added. “Ninety-eight per cent of our users are outside the UK. They do not want us to lower the security of the product, and just as a straightforward matter, it would be an odd choice for us to choose to lower the security of the product in a way that would affect those 98 percent of users.”

End-to-end encryption

WhatsApp switched on its end-to-end encryption back in 2016 and Cathcart made clear it would not be removed to suit the UK legislation.

“End-to-end” encryption is used in messaging services to prevent anyone but the recipients of a communication from being able to decrypt it, the Guardian noted. WhatsApp cannot read messages sent over its own service, and so cannot comply with law enforcement requests to hand over messages, or pleas to actively monitor communications for child protection or antiterror purposes.
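To make that property concrete, below is a minimal, hypothetical sketch using the open-source PyNaCl library. It is not WhatsApp’s actual implementation (which is built on the Signal protocol); it simply illustrates the point above: messages are encrypted with keys held only on the participants’ devices, so the relaying service sees nothing but ciphertext.

```python
# Minimal, illustrative sketch of end-to-end encryption using PyNaCl
# (pip install pynacl). This is NOT WhatsApp's actual protocol; it only
# demonstrates that the relaying server never holds the keys needed to
# read the message.
from nacl.public import PrivateKey, Box

# Each participant generates a key pair on their own device; the private
# keys never leave those devices.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts a message for Bob using her private key and Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"Meet at noon")

# The server only ever relays 'ciphertext'. Without Bob's (or Alice's)
# private key it cannot recover the plaintext; only Bob can decrypt it.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"Meet at noon"
```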

The UK government already has the power to demand the removal of encryption thanks to the 2016 Investigatory Powers Act, but WhatsApp has never received a legal demand to do so, Cathcart reportedly said. He described the Online Safety Bill as a concerning expansion of that power, because of the “grey area” in the legislation.

The Guardian noted that under the bill, the government or Ofcom could require WhatsApp to apply content moderation policies that would be impossible to comply with without removing end-to-end encryption.

If the company refused to do so, it could face fines of up to 4 percent of its parent company Meta’s annual turnover – unless it pulled out of the UK market entirely.

EC protections

Similar legislation in other jurisdictions, such as the EU’s Digital Markets Act, explicitly defends end-to-end encryption for messaging services, Cathcart reportedly said, and he called for similar language to be inserted into the UK bill before it is passed.

“It could make clear that privacy and security should be considered in the framework. It could explicitly say that end-to-end encryption should not be taken away,” said Cathcart. “There can be more procedural safeguards so that this can’t just happen independently as a decision.”

The Guardian noted that although WhatsApp is best known as a messaging app, the company also offers social networking-style features through its “communities” offering, which allows group chats of more than 1,000 users to be grouped together to mimic services such as Slack and Discord. Those, too, are end-to-end encrypted, but Cathcart argued that the chances of a large community causing trouble were slim.

“When you get into a group of that size, the ease of one person reporting it is very high, to the extent that if there’s actually something serious going on it is very easy for one person to report it, or easy if someone is investigating it for them to get access.”

The company also officially requires UK users to be older than 16, but Cathcart declined to advise parents whose children have an account on the service to delete it, saying “it’s important that parents make thoughtful choices”.

The UK government is expected to return the Online Safety Bill to Parliament in the summer.

Tom Jowitt

Tom Jowitt is a leading British tech freelancer and long-standing contributor to Silicon UK. He is also a bit of a Lord of the Rings nut...
