UK Government Targeting WhatsApp Is Another Security Red Herring

By Amnesty tech expert Joe Westby. Follow Joe on Twitter @JoeWestby

Anyone who hoped that the debate about encryption had already been put to bed was, sadly, wrong. Today, UK Home Secretary Amber Rudd will meet with technology companies including Facebook and Google to discuss encrypted messaging services, with a view to “persuading” the companies to give the authorities access to encrypted communications.

Earlier this week, in the wake of the Westminster terrorist attack, Rudd became the latest state official to blame encrypted messaging services like WhatsApp for ostensibly facilitating terrorist attacks. Meanwhile, yesterday the EU promised to put forward tough new rules on encrypted messaging in June. We. have. been. here. before.

As with earlier attempts by the UK and other governments to crack down on encrypted messaging apps, Rudd’s proposal to force technology companies to give the police and intelligence agencies access to end-to-end encrypted messages is misguided, ineffective, and dangerous – and risks undermining the rights of us all.

There is little good that can come of Rudd’s attack on end-to-end encryption, the particular technology used by WhatsApp and several other messaging apps. With end-to-end encrypted communications, no one except the people in the conversation – not even the company providing the service – can read the content of messages.
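To make that concrete, the short Python sketch below illustrates the end-to-end principle using the open-source PyNaCl library. This is purely an illustrative assumption on our part – WhatsApp itself uses the Signal protocol, which adds key ratcheting, authentication and forward secrecy on top of this basic idea – but it shows why a provider that only relays ciphertext, and never holds the users’ private keys, cannot read the messages it carries.

# Minimal sketch of the end-to-end principle (illustrative only, using PyNaCl).
from nacl.public import PrivateKey, Box
from nacl.exceptions import CryptoError

# Each participant generates a key pair on their own device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts directly to Bob's public key.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"meet at 6pm")

# The service provider relays only the ciphertext; it never holds a private key.
# Bob decrypts with his private key and Alice's public key.
receiver_box = Box(bob_key, alice_key.public_key)
print(receiver_box.decrypt(ciphertext))  # b'meet at 6pm'

# Anyone else -- including the provider -- cannot decrypt without a key
# belonging to the conversation.
eavesdropper_key = PrivateKey.generate()
try:
    Box(eavesdropper_key, alice_key.public_key).decrypt(ciphertext)
except CryptoError:
    print("The provider or an eavesdropper cannot read the message")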

Although it sounds reasonable on the face of it to ask companies to surrender the communications of terrorists, in reality this is impossible without compromising the security and rights of everyone who uses these services.

End-to-end encryption is an effective way of preventing communications, including sensitive personal data, from falling into the wrong hands. There is consensus within the tech community that there is no way to put in place a system of special access (referred to as a ‘backdoor’) to encrypted messages that could only be used by the intended state authorities. A door is a door is a door. If a backdoor exists, you have to assume that others – be they criminals, hackers, or other governments – will also figure out how to access the information.

It is also now widely recognised that encryption is a vital enabler of human rights, in particular the rights to privacy and to freedom of expression and opinion. Encrypting our information helps to create a “zone of privacy” online within which we are free to express our beliefs and ideas without fear of interference. Activists around the world rely on encryption to protect themselves from persecution.

Measures to weaken encryption on popular commercial services would only serve to undermine the human rights and information security of all the ordinary people using them – and would still not stop people intending to commit criminal or terrorist acts from using end-to-end encryption.

The widespread availability of encryption tools across the world means it would be near impossible to prevent terrorist and criminal groups from using encryption in their communications. Only one of the nine apps reportedly identified as “safe” or “safest” by the armed group that calls itself the Islamic State is not open source, meaning that the majority are freely available online and would not be affected by regulation in any one jurisdiction.

Moreover, it is critical to understand the context in which end-to-end encryption is being ever more widely introduced by companies. We live in a golden age of mass surveillance. In December last year, the UK adopted the Investigatory Powers Act, one of the world’s most far-reaching pieces of electronic surveillance legislation. Not only did it give government agencies access to huge personal data sets, but it also allowed them to undertake surveillance and hacking on a mass scale.

Existing surveillance powers in the UK are already incredibly broad and, in and of themselves, contrary to human rights. Worse still, we know very little about how these powers are used. But what we do know should give us pause.

We know, for instance, that the UK government has spied on Amnesty International. We know they have spied on confidential lawyer-client communications. We know that the government acted unlawfully in its data sharing arrangements with the USA.

Human rights law does not prevent surveillance, provided it is carried out for a legitimate purpose, subject to adequate safeguards and oversight and, importantly, based on individualised reasonable suspicion of wrongdoing. In short, surveillance must be targeted. 

End-to-end encryption limits, but does not prevent, this sort of legitimate, targeted surveillance. The widespread adoption of end-to-end encrypted messaging apps does, however, make the kind of untargeted, and unlawful, mass surveillance programmes uncovered by Edward Snowden much more difficult.

Terrorist attacks that deliberately maim and kill bystanders at random are an attack on all of our human rights. Preventing these sorts of acts, and bringing perpetrators to justice, are challenges that demand a strong and coherent response from government, industry and civil society.

But we must resist the impulse to chase seemingly ‘easy’ solutions whose benefits are likely illusory and whose downsides are immense. Weakening encryption on WhatsApp and other services only weakens security for all of us, rather than enhancing it.

Read more about Encryption and Human Rights

Follow Amnesty’s Security and Human Rights work on Facebook
