The attack outside the United Kingdom’s parliament in London last Wednesday was over in just 82 seconds, but the fallout from the incident is still unfolding. On Sunday, a firestorm erupted when a leading British government minister, in a television interview, appeared to place some of the blame for the incident on WhatsApp, which lets smartphone users send and receive encrypted text messages that are difficult for police and spy agencies to monitor.
At about 14:40 on London’s Westminster Bridge, 52-year-old Khalid Masood drove his rented Hyundai Tucson at high speed into a crowd of people, killing three and injuring upwards of 40. He continued towards the nearby parliament, where he got out of his car and, wielding two knives, stabbed and killed a policeman. Seconds later, an armed officer on the scene shot Masood two or three times, and he fell to the ground, later dying from his injuries.
In the aftermath of the incident, it was reported that Masood – a British citizen born with the name Adrian Elms – had used WhatsApp minutes before launching his rampage. On Sunday, the British government’s home secretary, Amber Rudd, called WhatsApp’s encryption “completely unacceptable” when asked about Masood’s alleged use of the app.
“There should be no place for terrorists to hide,” she said. “We need to make sure that organizations like WhatsApp, and there are plenty of others like that, don’t provide a secret place for terrorists to communicate with each other.”
Rudd’s anti-encryption rhetoric was in line with a position the British government has taken for years – that no communication service should be impossible for the authorities to tap. Former prime minister David Cameron pushed this policy following the Charlie Hebdo killings in 2015. And one of the core principles behind a sweeping new surveillance law in the U.K. is that “there must be no guaranteed safe spaces online” for terrorists, criminals, and pedophiles to “communicate beyond the reach of the law.”
There are problems with using the Masood case as an example to drive this agenda, however. First, it has not been established whether Masood actually sent or received any encrypted messages on WhatsApp prior to launching his attack. There is evidence that he may have accessed WhatsApp shortly before he drove down Westminster Bridge, because journalists who obtained his phone number verified that his account was active on the service at that time. But this may have been to send a message to – or read a message from – his wife or a friend. There is no evidence yet to support any suggestion that Masood was operating under the direction of a group such as the Islamic State or al Qaeda.
In fact, police investigators have said so far that they have no indication that he was anything other than a “lone wolf,” who was radicalized after watching jihadist propaganda online.
Moreover, even if all of Masood’s WhatsApp chats were unencrypted – meaning they could have been easily intercepted by spy agencies and police – it is unlikely this would have prevented his murderous spree. U.K. prime minister Theresa May has acknowledged that he was not under investigation at the time of the attack, so his communications would not have been the focus of any eavesdropping.
If it does turn out that Masood was sending encrypted chats to terrorist handlers, these messages are not necessarily beyond the reach of investigators. After he was shot dead, police officers will have thoroughly searched Masood’s vehicle and his home, seizing his belongings. If the officers obtained his smartphone, they should be able to access it and recover the WhatsApp texts stored on the device, since end-to-end encryption protects messages in transit rather than copies saved on an unlocked handset, much as the FBI eventually broke into the San Bernardino attacker’s iPhone. (London’s Metropolitan Police did not respond to an inquiry Tuesday on whether it had obtained Masood’s phone.)
A spokesperson for the British government’s Home Office told The Intercept that it could not answer questions about Rudd’s WhatsApp comments. “The government supports encryption in cyber security,” the spokesperson said in a statement. “But it is irresponsible to give terrorists a way to plot online which cannot be intercepted by the police and intelligence agencies who are trying to protect the public from further attacks.”
It is difficult to assert support for “encryption in cyber security,” however, while also advocating that encryption be weakened for the purposes of surveillance. As security experts have pointed out, a surveillance “backdoor” cannot be built into WhatsApp or any other service solely to allow the British government to spy on terrorists and other serious criminals. The backdoor would create a gaping security hole, which could be exploited by hostile foreign intelligence agencies, hackers, criminal fraudsters, and a variety of other undesirables. And that should give the British government pause, especially given that officials at its highest levels also use WhatsApp to discuss sensitive issues.
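The underlying technical point can be illustrated with a deliberately simplified sketch (this is not WhatsApp’s actual Signal-protocol code, and the key names and message below are invented for the example): in an end-to-end design, each endpoint derives the shared secret from its own private key and the other side’s public key, so a server that merely relays ciphertext never holds anything capable of decrypting it.

```python
# Simplified illustration of end-to-end encryption (not WhatsApp's real protocol):
# each endpoint derives the same secret from its own private key and the peer's
# public key, so a server that only relays ciphertext cannot read the message.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF


def derive_shared_key(own_private, peer_public):
    """Turn a Diffie-Hellman shared secret into a 256-bit symmetric key."""
    shared_secret = own_private.exchange(peer_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"e2e-demo").derive(shared_secret)


# Each user generates a key pair on their own device; only public keys travel.
sender_private = X25519PrivateKey.generate()
recipient_private = X25519PrivateKey.generate()

sender_key = derive_shared_key(sender_private, recipient_private.public_key())
recipient_key = derive_shared_key(recipient_private, sender_private.public_key())
assert sender_key == recipient_key  # both endpoints agree; the relay holds neither private key

# The sender encrypts; anything the server stores or forwards is ciphertext.
nonce = os.urandom(12)
ciphertext = AESGCM(sender_key).encrypt(nonce, b"example message", None)

# Only the recipient, holding the derived key, can decrypt it.
print(AESGCM(recipient_key).decrypt(nonce, ciphertext, None))
```

The sketch shows why a surveillance exception cannot be narrowly targeted: giving investigators access would mean escrowing keys or adding a third recipient to every exchange, for every user, which is precisely the security hole experts warn about.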
Jim Killock, executive director of the London-based Open Rights Group, says he believes the government is “grandstanding” with its WhatsApp criticism. “They are trying to make a political point rather than making serious demands,” he says.
Killock points out that under the U.K.’s new surveillance law – the Investigatory Powers Act – the government can try to compel companies to weaken the encryption on their services. By issuing what is called a “technical capability notice,” the authorities can force companies to “provide any assistance” in the context of surveillance, which can include “obligations relating to the removal by a relevant operator of electronic protection applied by or on behalf of that operator to any communications or data.”
In practice, it would be difficult for the U.K. government to force companies domiciled outside of British territory to comply with such an order. But WhatsApp is owned by Facebook, which has a significant presence in the U.K. and is planning to open a large new office in London’s West End later this year. Killock says that if the government really wanted to crack down on the company’s encryption, it could perhaps try to strong-arm Facebook by threatening sanctions against its British assets. “If the government had serious demands it would have issued [technical capability] notices and would be trying to get WhatsApp to change the technology in secret,” he says.
For Facebook, the furor over WhatsApp will not feel entirely new; it bears a striking resemblance to another case in recent history. In 2013, British soldier Lee Rigby was savagely murdered on a London street in broad daylight by two Islamist extremists. In the aftermath of the killing, some British politicians attacked Facebook for not reporting messages one of the attackers had allegedly sent on the platform indicating his intention to murder a soldier. But a parliamentary investigation revealed a more complex picture. One of the killers had been closely monitored under five separate police and security service operations – and police had received a tip that the other extremist was affiliated with al Qaeda. Yet both men still slipped under the radar.
“The government has a pattern of doing this,” says Killock. “It’s never a social issue, or a government failing – it’s easier to just blame the technology companies.”
Top photo: Armed police officers secure the area near the Houses of Parliament in central London on March 23, 2017, the day after a terror attack.