
Are the Authorities Confusing a PRISM Problem with an Encryption Problem?

CNN has its own version of updated reporting from the Paris attack. It provides a completely predictable detail inexplicably not included in the weekend’s big NYT story: that the one phone with any content on it — as distinct from a pure burner — had Telegram loaded on it.

Several hours earlier, at 2:14 p.m., while they were still at the Alfortville hotel, the Bataclan attackers had downloaded the encryption messaging app Telegram onto their Samsung smart phone, according to police reports. No recovered content from the messaging app is mentioned in the French police documents, suggesting there were likely communications by the Bataclan attackers that will never be recovered.

As well as offering end-to-end encryption, the Telegram messaging app offers an option for users to “self-destruct” messages. At 4:39 p.m. on November 13, one of the attackers downloaded detailed floor plans of the Bataclan venue onto the Samsung phone and conducted online searches for the American rock band playing there that night, the Eagles of Death Metal.

I predicted as much in my post on that NYT story.

My suspicion is that, as had been reported, ISIS relied on Telegram rather than email, but used it in a fashion that made it less useful on burner phones (“secret” Telegram chats are device-specific, meaning you’d need a persistent phone number to use that function). But if these terrorists did use Telegram, they probably eluded authorities not because of encryption, but because it’s fairly easy to make such chats temporary (again, using the secret function). Without Telegram being part of PRISM, the NSA would have had to obtain the metadata for chats via other means, and by the time they IDed the phones of interest, there may have been no metadata left.

If ISIS’ use of Telegram (publicly acknowledged when Telegram shut down a bunch of ISIS channels in the wake of the attack) is what anonymous sources keep insisting is an encryption problem, then the problem is being misportrayed: it is less an encryption problem than a PRISM one.

True, Telegram does offer the option of end-to-end encryption for its messaging, and there are questions about its crypto (though thus far it hasn’t been broken publicly). It also offers users the ability to carry out secret chats and then destroy them, which may be where the concern about all the “scoured” “email” in the NYT piece comes from: the assumption that these terrorists used Telegram but deleted those messages.

But as the Grugq points out, it’s a noisy app in other ways that the NSA should be able to exploit.

Contact Theft

When registering an account with Telegram, the app helpfully uploads the entire Contacts database to Telegram’s servers (optional on iOS). This allows Telegram to build a huge social network map of all the users and how they know each other. It is extremely difficult to remain anonymous while using Telegram because the social network of everyone you communicate with is known to them (and whomever has pwned their servers).

Contact books are extremely valuable information. We know that the NSA went to great lengths to steal them from instant messenger services. On mobile the contact lists are even more important because they are very frequently linked to real world identities.

Voluminous Metadata

Anything using a mobile phone exposes a wide range of metadata. In addition to all the notification flows through Apple and Google’s messaging services, there is the IP traffic flows to/from those servers, and the data on the Telegram servers. If I were a gambling man, I’d bet those servers have been compromised by nation state intelligence services and all that data is being dumped regularly.

This metadata would expose who talked with who, at what time, where they were located (via IP address), how much was said, etc. There is a huge amount of information in those flows that would more than compensate for lacking access to the content (even if, big assumption, the crypto is solid).
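To make concrete what the Grugq is describing, here is a minimal sketch of how communication metadata alone (who messaged whom, when, and from what IP) can be assembled into a contact graph without ever touching message content. The records, phone numbers, and field layout below are hypothetical illustrations of the idea, not actual Telegram data or any real collection tooling.

```python
from collections import Counter, defaultdict

# Hypothetical metadata records: (sender, recipient, timestamp, sender_ip).
# No message content appears anywhere in these records.
metadata = [
    ("+33600000001", "+32470000002", "2015-11-13T14:02", "81.64.10.5"),
    ("+32470000002", "+33600000001", "2015-11-13T14:05", "91.86.22.9"),
    ("+33600000001", "+33600000003", "2015-11-13T16:41", "81.64.10.5"),
    ("+32470000002", "+33600000003", "2015-11-13T20:10", "91.86.22.9"),
]

edges = Counter()             # how often each pair of numbers communicates
seen_ips = defaultdict(set)   # rough location hints per number, via sender IP

for sender, recipient, timestamp, ip in metadata:
    edges[frozenset((sender, recipient))] += 1
    seen_ips[sender].add(ip)

# Identify the most-connected number in this (tiny) network.
numbers = {n for pair in edges for n in pair}
hub = max(numbers, key=lambda n: sum(c for pair, c in edges.items() if n in pair))

print("Most-connected number:", hub)
for pair, count in edges.most_common():
    print(sorted(pair), "exchanged", count, "messages")
for number, ips in sorted(seen_ips.items()):
    print(number, "seen at:", sorted(ips))
```

Nothing in that sketch requires breaking any encryption: the network map, the hub, and rough locations fall out of the flows themselves, which is the Grugq’s point about metadata more than compensating for inaccessible content.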

He spends particular time on Telegram’s Secret chat function (the one that allows a person to destroy a chat). But he doesn’t talk about how that might play into the extensive use of burners that we’ve seen from ISIS. Secret chats are device-specific (that is, they can be sent only to a numbered device, not an account). That would make the function very hard to integrate with disciplined burner use, because the whole point of burners is not to have persistent telephone numbers. How will a terrorist remember the new number he wants to associate with a Telegram secret chat? Write it on a piece of paper?

In other words, it seems you could use one approach (disciplined burners) or the other (full use of Telegram with persistent phones), the latter of which would provide its own kind of intelligence. It may well be that ISIS merges these two uses, but if so we shouldn’t expect to see Telegram on their true burner phones. Plus, assuming the bearer of the phone speaks that dialect the Belgians were struggling to translate, voice calls on burners would be just as useful as transient use of Telegram.

But that’s probably not the real problem for authorities. In fact, if known terrorists had been using, say, WhatsApp rather than Telegram for such encrypted chats, authorities might have had more information on their network than they do now. That’s because WhatsApp metadata would be available under PRISM, whereas to get Telegram data, non-German authorities are going to have to go steal it.

If that supposition is correct, it would suggest the US should drop all efforts to make Apple phones’ encryption weaker. So long as Apple has the presumed best security (notwithstanding the iMessage vulnerability just identified by researchers at Johns Hopkins), people from around the world will choose it, ensuring that the world’s best SIGINT agency has ready access to at least the metadata. If Telegram is perceived as being better — or even being close, given its location — people of all sorts will prefer that.

That won’t give you the content, in either case (even if you had the Moroccan translators you needed to translate, if that indeed remained a problem for authorities). But you’re better off having readily accessible metadata than losing it entirely.

To Clarify the Debate, Tim Cook Should Start Shopping for Land in Cork, Ireland

There’s so much blathering from National Security pundits and plain old pundits about FBI’s demand that Apple’s programmers write it a custom operating system that I think, to facilitate reasonable debate, Tim Cook should travel to Cork, Ireland (where Apple already has a presence) and start shopping for land for a new headquarters.

I say this not because my spouse and I are Irish (though the Irish spouse insists that Cork is the Irish equivalent of Texas), and not because I want Apple to take all its Silicon Valley jobs and move them to Ireland, and not because Apple has already been using Ireland as a tax haven, but because it would be the best way to get people who otherwise seem to misunderstand the current state of the world on encryption to better think it through.

FBI’s problem with Apple is that the company tries to offer its users around the globe the strongest possible security as a default option. Plenty of other platforms (like Android) offer less robust security. Plenty of other apps offer encryption. Some (like Signal) may even offer better security, but they rely on devices (Android phones and desktops) that themselves may be insecure. But the problem with Apple is that all its more recent phones are going to be harder (though not impossible, unless law enforcement fucks up when they first seize the phone, as they did here) to access by default.

Thus far, however, Apple still serves as a valuable law enforcement partner — something lots of the pundits have ignored. Before the All Writs Act order on February 16, Apple had turned over metadata covering the entire period Farook used the phone (he apparently was using the phone into November), as well as the content that was backed up to iCloud until October 19. Presumably, Apple turned over the same things for the victims Farook killed: up to 14 iPhones full of communications, including communications with Farook, set to auto-backup as Farook’s phone originally had been. Apple can and surely does turn over all the same things when an iPhone user in Paris or Beijing or Beirut sparks the interest of the NSA.

If Apple were to move its headquarters and servers to Cork (perhaps with some redundant servers in, say, Brazil), its data would be far less accessible to both US law enforcement and intelligence. And contrary to what you might think from those attacking Apple’s alleged non-compliance here, that would result in significantly less intelligence (or evidence) than both are getting now.

That’s because by offering the best encryption product in the world while relying on US-based servers, Apple ensures that at least the metadata — not to mention any content backed up to iCloud (which, in Farook’s case, included content through October plus that from his colleagues) — is readily available. If Apple were to move to Cork, any backed-up content would be far harder to get, and NSA would have to steal Internet packets to get iMessage metadata (admittedly, that’s probably pretty easy to do from Ireland, given its proximity to GCHQ’s gaping maw, but it does require some work).

The counterexample is the way the terrorists behind the Paris attack used Telegram. Because that’s a non-US messaging system, data from it, including metadata, was not easily available (though as I understand it, its encryption would be fairly trivial for NSA to overcome). Thus, terrorists were able to use an inferior product and obtain more obscurity (until Telegram, under pressure, shut down a bunch of ISIS channels) than they would have gotten had they used the superior iPhone, precisely because Apple’s servers are in the US. If US national security officials force multinational companies to choose between quality of product and US location, one or two may choose to offshore. Alternatively, the foreign products may eventually come to rival what Apple is currently offering.

Right now, US officials are guaranteed that if intelligence and criminal targets use the best product in the world, they’ll have evidence readily available. Even ignoring all the economic reasons to want Apple to stay in the US (or better yet, to actually pay its fair share of taxes in the US!), that could change if Apple were to decide it could no longer legally offer a secure product while remaining in the US.

Brennan Was Probably Talking about the Telegram PRISM Gap as Much as Encryption

I noted the other day that at a pre-scheduled appearance Monday, Josh Rogin cued John Brennan to explain how the Paris attack happened without warning. In my opinion, the comment has been badly misreported as solely an indictment of Edward Snowden (though it is that) and of encryption. I’ve put the entire exchange below, but the key passage was this:

And as I mentioned, there are a lot of technological capabilities that are available right now that make it exceptionally difficult, both technically as well as legally, for intelligence and security services to have the insight they need to uncover it. And I do think this is a time for particularly Europe, as well as here in the United States, for us to take a look and see whether or not there have been some inadvertent or intentional gaps that have been created in the ability of intelligence and security services to protect the people that they are asked to serve. And in the past several years because of a number of unauthorized disclosures and a lot of handwringing over the government’s role in the effort to try to uncover these terrorists, there have been some policy and legal and other actions that are taken that make our ability collectively internationally to find these terrorists much more challenging. And I do hope that this is going to be a wake-up call, particularly in areas of Europe where I think there has been a misrepresentation of what the intelligence security services are doing by some quarters that are designed to undercut those capabilities.

Brennan talks about technology that makes it difficult, technically and legally, to uncover plots. Encryption is a technical problem — one the NSA has proven its ability to overcome — that might be called a legal one only if you ignore that the NSA can overcome encryption even without a legal requirement that companies provide back doors. But I agree this passage speaks to encryption, if not other issues as well.

In the next sentence, though, he talks about inadvertent or intentional gaps created “particularly in Europe.” He talks about plural unauthorized disclosures — as I noted, Josh Rogin’s own disclosure that the US had broken AQAP’s online conferencing technique may have been more directly damaging than most of Snowden’s leaks — and “handwringing.” Those have led to “policy and legal and other actions” that have made it harder to find terrorists. In the next sentence, Brennan again emphasizes that “particularly in areas of Europe,” there needs to be a “wake-up call” because “there has been a misrepresentation” of what the spooks are doing, which he suggests was deliberately “designed to undercut those capabilities.”

So in the paragraph where he speaks of these problems, he twice emphasizes that Europe in particular needs to adjust its approach.

Last I checked, Europe didn’t pass the USA Freedom Act (which would not, in any way, have restricted review of Parisian targeters). Some countries in Europe are more vigorously considering limits on encryption, but those limits would be just as ineffective as trying to eliminate the code that’s already out there.

What Europe has done, however, is make it harder for our PRISM providers to share data back and forth between Europe and the US (and with providers considering moving servers to Europe, it will raise new questions about the applicability of PRISM to that data). And Europe (not just Europe, but definitely including Europe) has created a market need for US tech companies to distance themselves from the government.

And in the case of Germany, politicians have been investigating how much its BND has done for the NSA, and especially which German people and companies were impermissibly targeted as part of the relationship. I noted that Brennan raised similar issues just days after the BND investigation turned scandalous in March, and recent revelations have put new pressure on the BND.

With that in mind in particular, consider what one of the more responsible reports on Brennan’s speech, that of Shane Harris, focused on: terrorists’ use of the Berlin-headquartered social messaging app Telegram. If terrorists were using WhatsApp (which a lot of the fearmongering focused on), the metadata, at least, would be available via Facebook. But since Telegram is not a US company, it cannot be compelled under Section 702 of FISA, and that surely creates just the kind of gap Brennan was talking about.

Since Brennan’s speech, Telegram has started deleting the special channels set up by ISIS to communicate.

I’m sure Brennan is complaining about encryption, and if he can get Congress to force domestic back doors, he will (though ISIS reportedly shies away from Apple products, so forcing Apple to give up its encrypted iMessage won’t help track down ISIS). But his speech seemed focused much more intently on the ways in which, in the aftermath of the Snowden leaks, Europeans have opportunistically localized data and, in the process, made that data far less accessible to the NSA. Brennan, as I made clear in March, definitely would prefer the Europeans rely on Americans for their SIGINT (and in the process agree to some inappropriate spying in their home countries), and the gap created by terrorists’ reliance on Telegram is one way to exert pressure on that point.
