WhatsApp, Signal Oppose UK Move to Force Companies to Break End-to-End Encryption


WhatsApp and other messaging services have united to oppose provisions in Britain’s proposed internet safety legislation that would force tech companies to break end-to-end encryption in private messages.

Meta-owned WhatsApp, Signal and five other apps signed an open letter saying the law could give an “unelected official the power to weaken the privacy of billions of people around the world”.

Britain’s Online Safety Bill was originally designed to create one of the toughest regimes for regulating platforms such as Facebook, Instagram, TikTok, and YouTube.

The proposals were watered down in November, when a requirement to stop “legal but harmful content” was removed to protect free speech, and instead the focus was put on illegal content, particularly related to child safety.

The British government said the bill in “no way represented a ban on end-to-end encryption, nor would it require services to weaken encryption”.

But it wants regulator Ofcom to be able to make platforms use accredited technology, or try to develop new technology, to identify child sexual abuse content.

The letter signatories said this was incompatible with end-to-end encryption, which enables a message to be read only by the sender and recipient.

“The bill provides no explicit protection for encryption, and if implemented as written, could empower Ofcom to try to force the proactive scanning of private messages on end-to-end encrypted communication services – nullifying the purpose of end-to-end encryption as a result and compromising the privacy of all users,” they said.

The bill poses an “unprecedented threat to the privacy, safety and security of every UK citizen and the people with whom they communicate around the world, while emboldening hostile governments who may seek to draft copy-cat laws”, they said.

A British government spokesperson said: “We support strong encryption, but this cannot come at the cost of public safety.

“Tech companies have a moral duty to ensure they are not blinding themselves and law enforcement to the unprecedented levels of child sexual abuse on their platforms.”

© Thomson Reuters 2023

