Sunday 15 February 2015

The war against encryption and privacy


Jim Killock talks to cryptographer Ben Laurie about government plans to snoop on Internet communications when a court order is presented

Most people’s first reaction to the Charlie Hebdo massacre was sympathy and solidarity in the face of an assault on our liberty. But from the British government, the second reaction was a call for new surveillance powers. David Cameron appeared to declare war on encryption, asking if ‘we want to allow a means of communication between two people which, even in extremis with a signed warrant from the home secretary personally, that we cannot read?’ Cameron’s answer was ‘no, we must not. The first duty of any government is to keep our country and our people safe’.


Cameron’s demand seems initially reasonable. Why shouldn’t the Home Secretary get access to messages when there’s a warrant? I asked Ben Laurie, founding director of The Apache Software Foundation and a cryptographer who works on Internet security technologies, including SSL, for his thoughts on the apparent battle between government and security technologists.

Laurie says, ‘It’s a bit like asking: “Why shouldn’t the Home Office be able, with a warrant, to tell buildings not to fall on people?” It’s physics. Court orders are apparently going to magically enable a technology that means access would only take place when a court order is presented.’ But any such system would mean that ‘whatever you do, there would be a backdoor that anyone could use, which would be particularly dangerous when you’re trying to protect people, and the backdoor is being used by your enemies rather than your friends’.

So why are politicians demanding the impossible? ‘Politicians are the public too,’ says Laurie. ‘They don’t have any special ability to understand these things, and the people that brief them don’t have any motivation to tell them the downsides. I’m sure that GCHQ and the NSA are advising Cameron that it will work without problems.’

I put it to Laurie that Cameron might hope to use his ‘reasonable’ position of principle to push Internet platforms such as WhatsApp to stop providing ‘end-to-end’ encryption, where people can talk to each other but the service can’t listen to them. ‘The best approach,’ says Laurie, ‘would have been not to pull the stunts that Edward Snowden showed they were pulling. Does Cameron not realise the US companies now hate GCHQ?’
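
As a rough illustration of what ‘end-to-end’ means in practice, the toy Python sketch below has two parties derive a shared key and exchange a message through a relay that only ever handles ciphertext. The parameters here (a 32-bit prime and an XOR keystream) are deliberately insecure stand-ins chosen for readability, not anything WhatsApp or any real service uses.

```python
# Toy sketch of end-to-end encryption (NOT secure; illustration only).
# Alice and Bob derive a shared key with Diffie-Hellman, then exchange
# a message through a relay that only ever sees ciphertext.
import hashlib
import secrets

P = 0xFFFFFFFB  # toy 32-bit prime -- far too small for real use
G = 5           # toy generator

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(priv, other_pub):
    secret = pow(other_pub, priv, P)
    return hashlib.sha256(str(secret).encode()).digest()

def xor_stream(key, data):
    # Derive a keystream from the key; XOR is its own inverse, so the
    # same function encrypts and decrypts.
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

a_priv, a_pub = keypair()                # Alice
b_priv, b_pub = keypair()                # Bob
key = shared_key(a_priv, b_pub)          # Alice's copy of the key
assert key == shared_key(b_priv, a_pub)  # Bob derives the same key

ciphertext = xor_stream(key, b"meet at noon")
print(ciphertext.hex())                  # all the relay ever handles
print(xor_stream(key, ciphertext))       # only the endpoints can do this
```

The point of contention is in the last two lines: with the key held only at the endpoints, there is nothing the relay could hand over under a warrant except ciphertext it cannot read.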

Laurie thinks US tech companies are likely to resist pressure for general agreements to spy on users and hand over data. He observes that, for the most part, agencies want metadata – the who, when and how of your communications – which is often available to tech companies even when the content itself isn’t. Even so, companies have been doing much more to protect user privacy since the Snowden revelations, which in turn is annoying agencies that have become used to accessing the content and data of personal communications in bulk. So what’s driving this change?

‘It’s a combination of outspoken user desire and a commercial desire to capitalise on Snowden,’ says Laurie. ‘Snowden has shaken loose a lot of funding for privacy tech and public paranoia.’ These privacy-friendly changes go further than a few commercial applications. There’s renewed interest in creating new, secure technologies. He points to efforts to make the metadata from communications hard to gather.

‘The push is the same as for encryption in general; people don’t want to be spied on. In the wake of the Snowden revelations, people are realising that metadata is the only data that actually matters, and it’s being collected on a huge scale, which makes people angry. That said, it’s actually quite difficult to obscure metadata, so I’m not sure how successful that’s going to be, but that’s the way people seem to be pushing. If you look at the latest generation of messaging systems, such as Pond and BitMessage, they’re designed to try to make it impossible to know who is talking to whom.’
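
To see why metadata matters so much, consider a toy sketch (the records below are invented purely for illustration): a relay that cannot decrypt a single message body can still log who contacted whom, when, and how much, and reconstruct the social graph from those envelopes alone.

```python
# Sketch of why metadata matters: a relay that cannot read message
# bodies can still log who talked to whom, when, and how much.
# These records are invented purely for illustration.
from collections import Counter
from datetime import datetime

envelope_log = [
    # (sender, recipient, timestamp, ciphertext size in bytes)
    ("alice", "bob",    datetime(2015, 2, 1, 9, 3),   412),
    ("alice", "bob",    datetime(2015, 2, 1, 9, 5),   128),
    ("bob",   "clinic", datetime(2015, 2, 1, 9, 40),  2048),
    ("alice", "lawyer", datetime(2015, 2, 2, 18, 12), 980),
]

# Without decrypting a single byte, the relay can reconstruct the
# social graph and see patterns of contact.
contacts = Counter((sender, recipient) for sender, recipient, _, _ in envelope_log)
for (sender, recipient), count in contacts.items():
    print(f"{sender} -> {recipient}: {count} message(s)")
```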

Other systems concentrate on protecting your address books, and providing methods to let you know that your friends are online, without revealing this information to the security agencies. Are they commercial or non-commercial?

‘The projects that are trying to provide end-to-end anonymity, where I can talk to you but nobody knows I’m talking to you, are generally non-commercial at the moment. There’s a good reason for that, which is that they’re quite expensive in terms of resources, such as bandwidth.

‘To provide a mass commercial service, you’re going to have to pay for the resources, whereas free services tend to rely on the goodwill of participants who donate machine time or bandwidth.

‘So, mostly, these services are open source and free to use. I don’t know what their future is, but it’s a trend that’s growing. Use of Tor [the anonymous browsing system], for instance, is growing and growing.’ These systems come with downsides though. Is the ‘expense’ that Laurie highlights also a cost to the users who operate the software?

‘It’s not normally more expensive for the users; the service providers pay the cost. With Tor, you do some extra computation on the user side, but ultimately, you’re using essentially three times as much bandwidth, and it’s other people in the network who provide it. If it was a commercial service, it would presumably cost users more, though, because providers would have to charge more for it.’
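
The ‘three times as much bandwidth’ figure falls out of Tor’s basic design: traffic crosses three volunteer relays, and the client wraps its payload in one layer of encryption per relay so that each relay can peel off only its own layer. The Python sketch below is a toy of that layering idea, using XOR keystreams and made-up relay keys rather than Tor’s real protocol.

```python
# Toy sketch of onion layering (NOT Tor's real protocol). The client
# wraps its payload in one layer per relay; each relay peels exactly
# one layer. The full payload crosses all three relays, which is
# roughly where the "three times as much bandwidth" comes from.
import hashlib

def layer(key, data):
    # Toy symmetric step: XOR with a keystream derived from the key.
    # XOR is its own inverse, so the same call adds or removes a layer.
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

relay_keys = [b"entry-key", b"middle-key", b"exit-key"]  # hypothetical keys
payload = b"GET /example HTTP/1.1"

# The client wraps the payload once per relay, innermost layer last.
onion = payload
for key in reversed(relay_keys):
    onion = layer(key, onion)

# Each relay in turn peels its own layer; only the exit relay sees the
# plaintext, and no single relay knows both the sender and the content.
for hop, key in enumerate(relay_keys, start=1):
    onion = layer(key, onion)
    print(f"after hop {hop}: {len(onion)} bytes forwarded")

assert onion == payload
```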

Another technology Laurie says is being developed is private information retrieval. The idea is to let individuals query a database without the database learning what they are retrieving. This would allow you to interact with complicated technologies and services without the service getting to know everything about your requests.

‘The obvious way is letting you get the whole database,’ says Laurie, ‘but can you do better than that?’ He gives the example of homomorphic encryption, which ‘lets you get somebody else to perform a calculation on your behalf without knowing what you’re calculating, by doing an equivalent calculation.’
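
A small, well-known illustration of that idea is textbook RSA, which happens to be multiplicatively homomorphic: multiplying two ciphertexts gives a ciphertext of the product, so a server can do the multiplication without ever seeing the inputs. The sketch below uses tiny primes and no padding, purely to show the property.

```python
# Toy demonstration of the homomorphic idea using textbook RSA, which
# is multiplicatively homomorphic: E(a) * E(b) mod n decrypts to a * b.
# Tiny primes and no padding -- insecure by design, purely illustrative.
p, q = 61, 53
n = p * q                          # public modulus, 3233
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (Python 3.8+)

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

a, b = 7, 6
ca, cb = encrypt(a), encrypt(b)

# A server holding only the ciphertexts can multiply them without ever
# learning a or b ...
c_product = (ca * cb) % n

# ... and only the key holder decrypts the result of the computation.
assert decrypt(c_product) == a * b
print(decrypt(c_product))  # 42
```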

This is important because it enables privacy-friendly, centralised information tools. One of the basic objections to encryption of end users’ content, such as your photos or financial information, is that cloud-based services can’t interact with the encrypted material. However, private information retrieval systems would let you obscure what is being queried and who is asking, so you can effectively have your privacy cake and eat your cloud-based service.

Still, Laurie says that homomorphic encryption ‘across whole databases is insanely expensive and not really practical at the moment’. There are more efficient ways of serving the same goal, he says, ‘by asking two different servers two different things and combining the results to get the answer you want’. This system works as a privacy technology as long as the servers are trusted not to talk to each other to work out what you’re doing.
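
The two-server trick works roughly like this (a minimal sketch with invented data): the client sends each server a random-looking selection of records, the two selections differing only in the one position it actually wants; each server XORs together its selected records, and XORing the two answers leaves just the wanted record. Neither server on its own learns anything about the query.

```python
# Minimal two-server private information retrieval sketch, with
# invented data. Both servers hold the same database; the client asks
# each a different random-looking question and XORs the answers.
import secrets

database = [7, 1, 4, 9, 2, 8, 5, 3]   # identical copy on both servers
wanted = 5                            # the index the client really wants

def server_answer(db, query):
    # Each server XORs together the records its query selects.
    answer = 0
    for index, selected in enumerate(query):
        if selected:
            answer ^= db[index]
    return answer

# A uniformly random query for server one ...
query_one = [secrets.randbelow(2) for _ in database]
# ... and the same query with only the wanted position flipped for server two.
query_two = list(query_one)
query_two[wanted] ^= 1

# Each query on its own looks random, so neither server learns which
# record was wanted -- provided the two servers never compare notes.
answer_one = server_answer(database, query_one)
answer_two = server_answer(database, query_two)

assert answer_one ^ answer_two == database[wanted]
print(answer_one ^ answer_two)  # 8
```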

All technologies can have downsides and limitations, says Laurie. ‘Tor, in fact, is not that great against attackers who have a view of the whole network. If you wanted to make it great, you would have to slow down users to provide truly good anonymity.’ But the bigger picture, he says, includes institutional hypocrisy on the part of governments.

‘There’s a brilliant piece of doublethink going on,’ says Laurie. ‘The perceived need for privacy technologies is also perceived by governments. They would like people to not be using them, unless they’re people who they would like to be using them. The people they would like to be using them are, for example, the military and covert operations, but also dissidents in countries whose regimes they don’t approve of. So Obama funds Radio Free Asia, which funds the Open Tech Fund, which funds exactly the things Cameron wants to ban.’