
Why Did Apple “Object” to All Pending All Writs Orders on December 9?

As I noted the other day, a document unsealed last week revealed that DOJ has been asking for similar orders in other jurisdictions: two in Cincinnati, four in Chicago, two in Manhattan, one in Northern California (covering three phones), another in Brooklyn (covering two phones), one in San Diego, and one in Boston.

According to Apple, it objected to at least five of these orders (covering eight phones) all on the same day: December 9 (note that FBI applied for two AWAs on October 8, the day on which Comey suggested the Administration didn’t need legislation, the other being the Brooklyn docket in which this list was produced).


The government disputes this timeline.

In its letter, Apple stated that it had “objected” to some of the orders. That is misleading. Apple did not file objections to any of the orders, seek an opportunity to be heard from the court, or otherwise seek judicial relief. The orders therefore remain in force and are not currently subject to litigation.

Whatever objection Apple made was — according to the government, anyway — made outside of the legal process.

But Apple maintains that it objected to everything already in the system on one day, December 9.

Why December 9? Why object — in whatever form they did object — all on the same day, effectively closing off cooperation under AWAs in all circumstances?

There are two possibilities I can think of, though they are both just guesses. The first is that Apple got an order, probably in an unrelated case or circumstance, in a surveillance context that raised the stakes of any cooperation on individual phones in a criminal context. I’ll review this at more length in a later post, but for now, recall that on a number of occasions, the FISA Court has taken notice of something magistrates or other Title III courts have done. For location data, FISC has adopted the standard of the highest common denominator, meaning it has adopted the warrant standard for location even though not all states or federal districts have done so. So the decisions that James Orenstein in Brooklyn and Sheri Pym in Riverside make may limit what FISC can do. It’s possible that Apple got a FISA request that raised the stakes on the magistrate requests we know about. By objecting across the board — and thereby objecting to requests pertaining to iOS 8 phones — Apple raised the odds that a magistrate ruling might help them out at FISA. And if there’s one lawyer in the country who probably knows that, it’s Apple lawyer Marc Zwillinger.

Aside from the obvious reasons to wonder whether Apple got some kind of FISA request, in his interview with ABC the other day, Tim Cook described “other parts of government” asking for more and more cases (though that might refer to state and city governments asking, rather than FBI in a FISA context).

The software key — and of course, with other parts of the government asking for more and more cases and more and more cases, that software would stay living. And it would be turning the crank.

The other possibility is that by December 9, Apple had figured out that — a full day after Apple had started to help FBI access information related to the San Bernardino investigation, on December 6 — FBI took a step (changing Farook’s iCloud password) that would make it a lot harder to access the content on the phone without Apple’s help. Indeed, I’m particularly interested in what advice Apple gave the FBI in the November 16 case (involving two iOS 8 phones), given that it’s possible Apple was successfully recommending FBI pursue alternatives in that case which FBI then foreclosed in the San Bernardino case. In other words, it’s possible Apple recognized by December 9 that FBI was going to use the event of a terrorist attack to force Apple to back door its products, after which Apple started making a stronger legal stand than they might otherwise have done pursuant to secret discussions.

That action — FBI asking San Bernardino to change the password — is something Tim Cook mentioned several times in his interview with ABC the other night, at length here:

We gave significant advice to them, as a matter of fact one of the things that we suggested was “take the phone to a network that it would be familiar with, which is generally the home. Plug it in. Power it on. Leave it overnight–so that it would back-up, so that you’d have a current back-up. … You can think of it as making a picture of almost everything on the phone, not everything, but almost everything.

Did they do that?

Unfortunately, in the days, the early days of the investigation, an FBI–FBI directed the county to reset the iCloud password. When that is done, the phone will no longer back up to the Cloud. And so I wish they would have contacted us earlier so that that would not have been the case.

How crucial was that missed opportunity?

Assuming the cloud backup was still on — and there’s no reason to believe that it wasn’t — then it is very crucial.

And it’s something they harped on in their motion yesterday.

Unfortunately, the FBI, without consulting Apple or reviewing its public guidance regarding iOS, changed the iCloud password associated with one of the attacker’s accounts, foreclosing the possibility of the phone initiating an automatic iCloud back-up of its data to a known Wi-Fi network, see Hanna Decl. Ex. X [Apple Inc., iCloud: Back up your iOS device to iCloud], which could have obviated the need to unlock the phone and thus for the extraordinary order the government now seeks.21 Had the FBI consulted Apple first, this litigation may not have been necessary.
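
To make the mechanics concrete, here is a minimal sketch of the auto-backup logic at issue, written in Python as a rough model of Apple’s documented preconditions (plugged in, on a known Wi-Fi network, screen locked, iCloud Backup enabled, valid cached credentials) rather than anything from Apple’s actual code; the names are hypothetical. The point it illustrates is the one Cook and the brief both make: a remote password reset invalidates the credentials the phone has cached, and entering the new password requires unlocking the phone, which is precisely what nobody could do.

    # Rough model of the documented iCloud auto-backup preconditions.
    # Illustrative only; not Apple's implementation, and the field names are made up.
    from dataclasses import dataclass

    @dataclass
    class Device:
        plugged_in: bool
        on_known_wifi: bool
        screen_locked: bool
        icloud_backup_enabled: bool
        credentials_valid: bool  # a remote iCloud password reset invalidates these

    def will_auto_backup(d: Device) -> bool:
        """True if the phone would back itself up overnight, per Apple's guidance."""
        return (d.plugged_in and d.on_known_wifi and d.screen_locked
                and d.icloud_backup_enabled and d.credentials_valid)

    # What Apple suggested on December 5: take the phone to a familiar network,
    # plug it in, power it on, and leave it overnight.
    phone = Device(plugged_in=True, on_known_wifi=True, screen_locked=True,
                   icloud_backup_enabled=True, credentials_valid=True)
    print(will_auto_backup(phone))  # True: a fresh backup Apple could then produce

    # After the December 6 password reset, the cached credentials stop working,
    # and fixing that requires signing in again on the locked phone.
    phone.credentials_valid = False
    print(will_auto_backup(phone))  # False: no more automatic backups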

Plus, consider the oddness around this iCloud information. FBI would have gotten the most recent backup (dating to October 19) directly off Farook’s iCloud account on December 6.

But 47 days later, on January 22, they obtained a warrant for that same information. While they might get earlier backups, they would have received substantially the same information they had accessed directly back in December, all as they were prepping to go after Apple to back door their product. It’s not clear why they would do this, especially since there’s little likelihood of this information being submitted at trial (and therefore requiring a parallel-constructed, certified Apple copy for evidentiary purposes).

There’s one last detail of note. Cook also suggested in that interview that things would have worked out differently — Apple might not have made the big principled stand they are making — if FBI had never gone public.

I can’t talk about the tactics of the FBI, they’ve chosen to do what they’ve done, they’ve chosen to do this out in public, for whatever reasons that they have. What we think at this point, given it is out in the public, is that we need to stand tall and stand tall on principle. Our job is to protect our customers.

Again, that suggests they might have taken a different tack with all the other AWA orders if they only could have done it quietly (which also suggests FBI is taking this approach to make it easier for other jurisdictions to get Apple content). But why would they have decided on December 9 that this thing was going to go public?

Update: This language, from the Motion to Compel, may explain why they both accessed the iCloud and obtained a warrant.

The FBI has been able to obtain several iCloud backups for the SUBJECT DEVICE, and executed a warrant to obtain all saved iCloud data associated with the SUBJECT DEVICE. Evidence in the iCloud account indicates that Farook was in communication with victims who were later killed during the shootings perpetrated by Farook on December 2, 2015, and toll records show that Farook communicated with Malik using the SUBJECT DEVICE. (17)

This passage suggests it obtained both “iCloud backups” and “all saved iCloud data,” which are actually the same thing (but would describe the two different ways the FBI obtained this information). Then, without noting a source, it says that “evidence in the iCloud account” shows Farook was communicating with his victims and “toll records” show he communicated with Malik. Remember too that the FBI got subscriber information from a bunch of accounts using (vaguely defined) “legal process,” which could include things like USA Freedom Act.

The “evidence in the iCloud account” would presumably be iMessages or FaceTime. But the “toll records” could be too, given that Apple would have those (and could have turned them over in the earlier “legal process” step). That is, FBI may have done this to obscure what it can get at each stage (and, possibly, what kinds of other “legal process” it now serves on Apple).


October 8: Comey testifies that the government is not seeking legislation; FBI submits requests for two All Writs Act orders, one in Brooklyn, one in Manhattan; in the former case, Magistrate Judge James Orenstein invites Apple’s response

October 30: FBI obtains another AWA in Manhattan

November 16: FBI obtains another AWA in Brooklyn pertaining to two phones, both running iOS 8.

November 18: FBI obtains AWA in Chicago

December 2: Syed Rezwan Farook and his wife killed 14 of Farook’s colleagues at a holiday party

December 3: FBI seizes Farook’s iPhone from Lexus sitting in their garage

December 4: FBI obtains AWA in Northern California covering 3 phones, one running iOS 8 or higher

December 5, 2:46 AM: FBI first asks Apple for help, beginning a period during which three Apple staffers provided 24/7 assistance to the investigation; FBI initially submits “legal process” for information regarding customer or subscriber name for three names and nine specific accounts; Apple responds same day

December 6: FBI works with San Bernardino county to reset iCloud password for Farook’s account; FBI submits warrant to Apple for account information, emails, and messages pertaining to three accounts; Apple responds same day

December 9: Apple “objects” to the pending AWA orders

December 10: Intelligence Community briefs Intelligence Committee members and does not affirmatively indicate any encryption is thwarting investigation

December 16: FBI submits “legal process” for customer or subscriber information regarding one name and seven specific accounts; Apple responds same day

January 22: FBI submits warrant for iCloud data pertaining to Farook’s work phone

January 29: FBI obtains extension on warrant for content for phone

February 14: US Attorney contacts Stephen Larson asking him to file brief representing victims in support of AWA request

February 16: After first alerting the press it will happen, FBI obtains AWA for Farook’s phone and only then informs Apple


Why Isn’t Jim Comey Crusading against This Tool Used to Hide Terrorist Secrets?

Several times over the course of Jim Comey’s crusade against strong encryption, I have noted that, if Comey wants to eliminate the tools “bad guys” use to commit crimes, you might as well eliminate the corporation. After all, the corporate structure helped a bunch of banksters do trillions of dollars of damage to the US economy and effectively steal the homes from millions with near-impunity.

It’d be crazy to eliminate the corporation because it’s a tool “bad guys” sometimes use, but that’s the kind of crazy we see in the encryption debate.

Yesterday, Ron Wyden pointed to a more narrow example of the way “bad guys” abuse corporate structures to — among other things — commit terrorism: the shell corporation.

In a letter to Treasury Secretary Jack Lew, he laid out several cases where American shell companies had been used to launder money for crime — including terrorism, broadly defined.


He then asked for answers about several issues. Summarizing:

  • The White House plan for IRS registration of beneficial ownership information on corporations probably won’t work. Does Treasury have a better plan? Would the Senate and House proposals to have states or Treasury create such a registry provide the ability to track who really owns a corporation?
  • FinCEN has proposed a rule that would not only be easily evaded, but might weaken the existing FATCA standard. Has anyone reviewed this?
  • Does FinCEN actually think its rule would identify the natural persons behind shell companies?
  • Would requiring financial institutions to report balances held by foreigners help information sharing?

They’re good questions but point, generally, to something more telling. We’re not doing what we need to do to prevent our own financial system from being used as a tool for terrorism. Unlike encryption, shell companies don’t have many real benefits to society. Worse, it sounds like Treasury is making the problem worse, not better.

Of course, the really powerful crooks have reasons to want to retain the status quo. And so FBI Director Jim Comey has launched no crusade about this much more obvious tool of crime.


FBI Waited 50 Days before Asking for Syed Rezwan Farook’s iCloud Data

Apple’s motion to vacate the All Writs Act order requiring it to help FBI brute force Syed Rezwan Farook’s iPhone is a stupendous document worthy of the legal superstars who wrote it. To my mind, however, the most damning piece comes not from the lawyers who wrote the brief, but from a declaration by another lawyer: Lisa Olle, Apple’s Manager of Global Privacy and Law, found in the last three pages of the filing.

Olle provides an interesting timeline of the FBI’s requests to Apple, some of which I’ll return to. The most damning details, however, are these.

First, the FBI contacted Apple in the middle of the night on December 5.

That means FBI first contacted Apple the day before FBI (according to their own statement) asked San Bernardino County to reset Farook’s iCloud password — a move that, as Apple notes in the filing, foreclosed the backup that might have made the AWA demand on Apple unnecessary.

Unfortunately, the FBI, without consulting Apple or reviewing its public guidance regarding iOS, changed the iCloud password associated with one of the attacker’s accounts, foreclosing the possibility of the phone initiating an automatic iCloud back-up of its data to a known Wi-Fi network, see Hanna Decl. Ex. X [Apple Inc., iCloud: Back up your iOS device to iCloud], which could have obviated the need to unlock the phone and thus for the extraordinary order the government now seeks.21 Had the FBI consulted Apple first, this litigation may not have been necessary.

In other words, Apple was fully engaged in this case, and yet FBI still didn’t ask their advice before taking action that eliminated the easiest solution to get this information.

And then they waited, and waited, and waited.


FBI waited 50 days from the time they seized the phone on December 3 until they asked Apple for the iCloud information on January 22 (they had to renew the warrant on the phone itself on January 29).

50 days.
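
For those counting along, the math is easy to check. Here’s a minimal sketch in Python using only the dates already in the record (the December 3 seizure, the December 6 direct iCloud access noted in the earlier post, and the January 22 request):

    from datetime import date

    seized = date(2015, 12, 3)           # FBI seizes Farook's iPhone
    icloud_accessed = date(2015, 12, 6)  # FBI pulls the most recent backup directly from iCloud
    icloud_warrant = date(2016, 1, 22)   # FBI finally asks Apple for the iCloud data

    print((icloud_warrant - seized).days)           # 50
    print((icloud_warrant - icloud_accessed).days)  # 47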

And yet the FBI wants us to believe they think this phone will have important information about the attack.


Working Thread, Apple Response

Apple’s response to the phone back door order is here.

(1) Apple doesn’t say it, but some people at Apple — probably including people who’d have access to this key (because they’d be involved in using it, which would require clearance) — had to have been affected by the OPM hack.


(2) Remember as you read it that Ted Olson lost his wife on 9/11.


(3) Several members of Congress — including ranking HPSCI member Adam Schiff — asked questions in hearings about this today.


(4) Apple hoists Comey on the same petard that James Orenstein did.


(8) More hoisting on petarding, in this case over DOJ generally and Comey specifically choosing not to seek legislation to modify CALEA.


(11) Apple beats up FBI for fucking up.

Unfortunately, the FBI, without consulting Apple or reviewing its public guidance regarding iOS, changed the iCloud password associated with one of the attacker’s accounts, foreclosing the possibility of the phone initiating an automatic iCloud back-up of its data to a known Wi-Fi network, see Hanna Decl. Ex. X [Apple Inc., iCloud: Back up your iOS device to iCloud], which could have obviated the need to unlock the phone and thus for the extraordinary order the government now seeks.21 Had the FBI consulted Apple first, this litigation may not have been necessary.

(11) This is awesome, especially coming as it does from Ted Olson, who Comey asked to serve as witness for a key White House meeting after the Stellar Wind hospital confrontation.


(12) This is the kind of information NSA would treat as classified, for similar reasons.

Although it is difficult to estimate, because it has never been done before, the design, creation, validation, and deployment of the software likely would necessitate six to ten Apple engineers and employees dedicating a very substantial portion of their time for a minimum of two weeks, and likely as many as four weeks. Neuenschwander Decl. ¶ 22. Members of the team would include engineers from Apple’s core operating system group, a quality assurance engineer, a project manager, and either a document writer or a tool writer.

(16) I’ll have to double check, but I think some of this language quotes Orenstein directly.

Congress knows how to impose a duty on third parties to facilitate the government’s decryption of devices. Similarly, it knows exactly how to place limits on what the government can require of telecommunications carriers and also on manufacturers of telephone equipment and handsets. And in CALEA, Congress decided not to require electronic communication service providers, like Apple, to do what the government seeks here. Contrary to the government’s contention that CALEA is inapplicable to this dispute, Congress declared via CALEA that the government cannot dictate to providers of electronic communications services or manufacturers of telecommunications equipment any specific equipment design or software configuration.

(16) This discussion of what Apple is has ramifications for USA Freedom Act, which the House report said only applied to “phone companies” (though the bill says ECSPs).


(18) Loving Apple wielding Youngstown against FBI.

Nor does Congress lose “its exclusive constitutional authority to make laws necessary and proper to carry out the powers vested by the Constitution” in times of crisis (whether real or imagined). Youngstown Sheet & Tube Co. v. Sawyer, 343 U.S. 579, 588–89 (1952). Because a “decision to rearrange or rewrite [a] statute falls within the legislative, not the judicial prerogative[,]” the All Writs Act cannot possibly be deemed to grant to the courts the extraordinary power the government seeks. Xi v. INS, 298 F.3d 832, 839 (9th Cir. 2002).

(20) Reading this passage on how simple pen register rulings shouldn’t apply to far more intrusive surveillance, I’m reminded that Olson left DOJ in 2004 before (or about the same time as) Jim Comey et al applied PRTT to conduct a metadata dragnet of Americans.

In New York Telephone Co., the district court compelled the company to install a simple pen register device (designed to record dialed numbers) on two telephones where there was “probable cause to believe that the [c]ompany’s facilities were being employed to facilitate a criminal enterprise on a continuing basis.” 434 U.S. at 174. The Supreme Court held that the order was a proper writ under the Act, because it was consistent with Congress’s intent to compel third parties to assist the government in the use of surveillance devices, and it satisfied a three-part test imposed by the Court.

(22) This is one thing that particularly pissed me off about the application of New York Telephone to this case: there’s no ongoing use of Apple’s phone.

This case is nothing like Hall and Videotapes, where the government sought assistance effectuating an arrest warrant to halt ongoing criminal activity, since any criminal activity linked to the phone at issue here ended more than two months ago when the terrorists were killed.

(24) I think this is meant to be a polite way of calling DOJ’s claims fucking stupid (Jonathan Zdziarski has written about how any use of this back door in a criminal case would require testimony about its forensics).

Use of the software in criminal prosecutions only exacerbates the risk of disclosure, given that criminal defendants will likely challenge its reliability. See Fed. R. Evid. 702 (listing requirements of expert testimony, including that “testimony [be] the product of reliable principles and methods” and “the expert has reliably applied the principles and methods to the facts of the case,” all of which a defendant is entitled to challenge); see also United States v. Budziak, 697 F.3d 1105, 1111–13 (9th Cir. 2012) (vacating order denying discovery of FBI software); State v. Underdahl, 767 N.W.2d 677, 684–86 (Minn. 2009) (upholding order compelling discovery of breathalyzer source code). The government’s suggestion that Apple can destroy the software has clearly not been thought through, given that it would jeopardize criminal cases. See United States v. Cooper, 983 F.2d 928, 931–32 (9th Cir. 1993) (government’s bad-faith failure to preserve laboratory equipment seized from defendants violated due process, and appropriate remedy was dismissal of indictment, rather than suppression of evidence). [my emphasis]

(25) “If you outlaw encryption the only people with encryption will be outlaws.”

And in the meantime, nimble and technologically savvy criminals will continue to use other encryption technologies, while the law-abiding public endures these threats to their security and personal liberties—an especially perverse form of unilateral disarmament in the war on terror and crime.

(26) The parade of horribles that a government might be able to coerce is unsurprisingly well-chosen.

For example, under the same legal theories advocated by the government here, the government could argue that it should be permitted to force citizens to do all manner of things “necessary” to assist it in enforcing the laws, like compelling a pharmaceutical company against its will to produce drugs needed to carry out a lethal injection in furtherance of a lawfully issued death warrant,25 or requiring a journalist to plant a false story in order to help lure out a fugitive, or forcing a software company to insert malicious code in its autoupdate process that makes it easier for the government to conduct court-ordered surveillance. Indeed, under the government’s formulation, any party whose assistance is deemed “necessary” by the government falls within the ambit of the All Writs Act and can be compelled to do anything the government needs to effectuate a lawful court order. While these sweeping powers might be nice to have from the government’s perspective, they simply are not authorized by law and would violate the Constitution.

(30) “Say, why can’t NSA do this for you?”

Moreover, the government has not made any showing that it sought or received technical assistance from other federal agencies with expertise in digital forensics, which assistance might obviate the need to conscript Apple to create the back door it now seeks.

(33) Love the way Apple points out what I and others have noted: this phone doesn’t contain valuable information, and if it does, Apple probably couldn’t get at it.

Apple does not question the government’s legitimate and worthy interest in investigating and prosecuting terrorists, but here the government has produced nothing more than speculation that this iPhone might contain potentially relevant information.26 Hanna Decl. Ex. H [Comey, Follow This Lead] (“Maybe the phone holds the clue to finding more terrorists. Maybe it doesn’t.”). It is well known that terrorists and other criminals use highly sophisticated encryption techniques and readily available software applications, making it likely that any information on the phone lies behind several other layers of non-Apple encryption. See Hanna Decl. Ex. E [Coker, Tech Savvy] (noting that the Islamic State has issued to its members a ranking of the 33 most secure communications applications, and “has urged its followers to make use of [one app’s] capability to host encrypted group chats”).

26 If the government did have any leads on additional suspects, it is inconceivable that it would have filed pleadings on the public record, blogged, and issued press releases discussing the details of the situation, thereby thwarting its own efforts to apprehend the criminals. See Douglas Oil Co. of Cal. v. Petrol Stops Nw., 441 U.S. 211, 218-19 (1979) (“We consistently have recognized that the proper functioning of our grand jury system depends upon the secrecy of grand jury proceedings. . . . [I]f preindictment proceedings were made public, many prospective witnesses would be hesitant to come forward voluntarily, knowing that those against whom they testify would be aware of that testimony. . . . There also would be the risk that those about to be indicted would flee, or would try to influence individual grand jurors to vote against indictment.”).

(35) After 35 pages of thoroughgoing beating, Apple makes nice.

Apple has great respect for the professionals at the Department of Justice and FBI, and it believes their intentions are good.

(PDF 56) Really looking forward to DOJ’s response to the repeated examples of this point, which is likely to be, “no need to create logs because there will never be a trial because the guy is dead.” Which, of course, will make it clear this phone won’t be really useful.

Moreover, even if Apple were able to truly destroy the actual operating system and the underlying code (which I believe to be an unrealistic proposition), it would presumably need to maintain the records and logs of the processes it used to create, validate, and deploy GovtOS in case Apple’s methods ever need to be defended, for example in court. The government, or anyone else, could use such records and logs as a roadmap to recreate Apple’s methodology, even if the operating system and underlying code no longer exist.

(PDF 62) This is really damning. FBI had contacted Apple before they changed the iCloud password.

(PDF 62) Wow. They did not ask for the iCloud data on the phone until January 22, 50 days after seizing the phone and 7 days before the warrant expired.



What Claims Did the Intelligence Community Make about the Paris Attack to Get the White House to Change on Encryption?

I’m going to do a series of posts laying out the timeline behind the Administration’s changed approach to encryption. In this post, I’d like to make a point about when the National Security Council adopted a “decision memo” more aggressively seeking to bypass encryption. Bloomberg reported on the memo last week, in the wake of the FBI’s demand that Apple help it brute force Syed Rezwan Farook’s work phone.

But note the date: The meeting at which the memo was adopted was convened “around Thanksgiving.”

Silicon Valley celebrated last fall when the White House revealed it would not seek legislation forcing technology makers to install “backdoors” in their software — secret listening posts where investigators could pierce the veil of secrecy on users’ encrypted data, from text messages to video chats. But while the companies may have thought that was the final word, in fact the government was working on a Plan B.

In a secret meeting convened by the White House around Thanksgiving, senior national security officials ordered agencies across the U.S. government to find ways to counter encryption software and gain access to the most heavily protected user data on the most secure consumer devices, including Apple Inc.’s iPhone, the marquee product of one of America’s most valuable companies, according to two people familiar with the decision.

The approach was formalized in a confidential National Security Council “decision memo,” tasking government agencies with developing encryption workarounds, estimating additional budgets and identifying laws that may need to be changed to counter what FBI Director James Comey calls the “going dark” problem: investigators being unable to access the contents of encrypted data stored on mobile devices or traveling across the Internet. Details of the memo reveal that, in private, the government was honing a sharper edge to its relationship with Silicon Valley alongside more public signs of rapprochement. [my emphasis]

That is, the meeting was convened in the wake of the November 13 ISIS attack on Paris.

We know that last August, Bob Litt had recommended keeping options open until such time as a terrorist attack presented the opportunity to revisit the issue and demand that companies back door encryption.

Privately, law enforcement officials have acknowledged that prospects for congressional action this year are remote. Although “the legislative environment is very hostile today,” the intelligence community’s top lawyer, Robert S. Litt, said to colleagues in an August e-mail, which was obtained by The Post, “it could turn in the event of a terrorist attack or criminal event where strong encryption can be shown to have hindered law enforcement.”

There is value, he said, in “keeping our options open for such a situation.”

Litt was commenting on a draft paper prepared by National Security Council staff members in July, which also was obtained by The Post, that analyzed several options. They included explicitly rejecting a legislative mandate, deferring legislation and remaining undecided while discussions continue.

It appears that is precisely what happened — that the intelligence community, in the wake of a big attack on Paris, went to the White House and convinced them to change their approach.

So I want to know what claims the intelligence community made about the use of encryption in the attack that convinced the White House to change its approach. Because there is nothing in the public record that indicates encryption was important at all.

It is true that a lot of ISIS associates were using Telegram; shortly after the attack Telegram shut down a bunch of channels they were using. But reportedly Telegram’s encryption would be easy for the NSA to break. The difficulty with Telegram — which the IC should consider seriously before they make Apple back door its products — is that its offshore location probably made it harder for our counterterrorism analysts to get the metadata.

It is also true that an ISIS recruit whom French authorities had interrogated during the summer (and who warned them very specifically about attacks on sporting events and concerts) had been given an encryption key on a thumb drive.

But it’s also true the phone recovered after the attack — which the attackers used to communicate during the attack — was not encrypted. It’s true, too, that French and Belgian authorities knew just about every known participant in the attack, especially the ringleader. From reports, it sounds like operational security — the use of a series of burner phones — was more critical to his ability to move unnoticed through Europe. There are also reports that the authorities had a difficult time translating the dialect of (probably) Berber the attackers used.

From what we know, though, encryption is not the reason authorities failed to prevent the French attack. And a lot of other tools that are designed to identify potential attacks — like the metadata dragnet — failed.

I hate to be cynical (though comments like Litt’s — plus the way the IC used a bogus terrorist threat in 2004 to get the torture and Internet dragnet programs reauthorized — invite such cynicism). But it sure looks like the IC failed to prevent the November attack, and immediately used their own (human, unavoidable) failure to demand a new approach to encryption.

Update: In testimony before the House Judiciary Committee today, Microsoft General Counsel Brad Smith repeated a claim MSFT witnesses have made before: they provided Parisian law enforcement email from the Paris attackers within 45 minutes. That implies, of course, that the data was accessible under PRISM and not encrypted.


Reuters Asks Even Stupider Questions about Apple-FBI Fight than Pew

In my post on Pew’s polling on whether Apple should have to write a custom version of its operating system so FBI can brute force the third phone, I gave Pew credit for several aspects of its question, but suggested the result might be different if Pew had reminded people that the FBI has already solved the San Bernardino attack.

Imagine if Pew called 1000 people and asked, “would you support requiring Apple to make iPhones less secure so the FBI could get information on a crime the FBI has already solved?”

As I said, at least Pew’s question was fair.

Not so Reuters’ questions on the same topic. After asking a bunch of questions to which three-quarters said they would not be willing to give up their own privacy to ward against terrorism or hacking, Reuters then asked this question:

Apple is opposing a court order to unlock a smart phone that was used by one of the shooters in the San Bernardino attack. Apple is concerned that if it helps the FBI this time, it will be forced to help the government in future cases that may not be linked to national security, opening the door for hackers and potential future

Do you agree or disagree with Apple’s decision to oppose the court order?

While Reuters explains why Apple opposes the order — because it will be [in fact, already has been] asked to help break into more phones that have nothing to do with terrorism, creating vulnerabilities for hackers — the wording of the question could easily be understood to imply that Syed Rezwan Farook’s phone “was used [] in the San Bernardino attack.” It’s not clear Farook even used the phone after November 30, two days before his attack. And to the extent Farook and his wife used phones during the attack — as implied by the question — they are believed to be the phones they tried unsuccessfully to destroy.

Yet, even with this problematically framed question, 46% of respondents (in an online poll, which likely skews towards those with greater tech facility) supported Apple’s actions.

There’s a problem, too, with the only question for which a plurality supported the FBI’s snooping, a graph of which Reuters highlighted in its story.

The government should be able to look at data on Americans’ phones in order to protect against terror threats.

There are cases where investigators find information on a smart phone that helps prevent follow-on attacks (it happened in Paris with a phone that was not encrypted). Border searches (which I admittedly believe to be one of the real reasons FBI objects to default encryption), too, might prevent terror attacks. But more often, we’re talking about investigating crimes deemed to be terrorism after the fact (or, far, far more often, solving drug crimes).

Nothing the FBI could do with the data on Farook’s work phone will prevent the deaths of the 14 people he already killed. There are other kinds of surveillance far better suited to doing that.


This Apple Fight Is (Partly) about Solving Car Accidents

I am going to spend my day laying out what a cynical man FBI Director Jim Comey is — from setting up a victims’ brief against Apple even before the government served Apple here, to this transparently bogus garbage post at Lawfare.

But first I wanted to reemphasize a detail I’ve noted before. On February 9, at a time when FBI already knew how it was going to go after Apple, Jim Comey said this in a hearing before the Senate Intelligence Committee:

I’d say this problem we call going dark, which as Director Clapper mentioned, is the growing use of encryption, both to lock devices when they sit there and to cover communications as they move over fiber optic cables is actually overwhelmingly affecting law enforcement. Because it affects cops and prosecutors and sheriffs and detectives trying to make murder cases, car accident cases, kidnapping cases, drug cases. It has an impact on our national security work, but overwhelmingly this is a problem that local law enforcement sees.

Even before he served Apple here, Comey made it clear this was about law enforcement, not terrorism cases, his cynical invocation of the San Bernardino victims notwithstanding.

And not just law enforcement: “car accidents.”

Since it got its All Writs Act order, FBI has said this Apple request is a one-off request, just for this terrorism case they already know the perpetrators of. But at a time when it already knew it was going to get an AWA order, Jim Comey was more frank. This is about car accidents. Car accidents, murder, kidnapping, and drugs (the last All Writs Act request was about drugs, in a case where they had enough evidence to get the guy to plead guilty anyway, in case there were any doubts they would demand AWAs going forward).

Car accidents.


District Attorneys Use Spying as Cover To Demand a Law Enforcement Back Door

In response to a question Senate Intelligence Committee Chair Richard Burr posed during his committee’s Global Threat hearing yesterday, Jim Comey admitted that “going dark” is “overwhelmingly … a problem that local law enforcement sees” as they try to prosecute even things as mundane as a car accident.

Burr: Can you, for the American people, set a percentage of how much of that is terrorism and how much of that fear is law enforcement and prosecutions that take place in every town in America every day?

Comey: Yeah I’d say this problem we call going dark, which as Director Clapper mentioned, is the growing use of encryption, both to lock devices when they sit there and to cover communications as they move over fiber optic cables is actually overwhelmingly affecting law enforcement. Because it affects cops and prosecutors and sheriffs and detectives trying to make murder cases, car accident cases, kidnapping cases, drug cases. It has an impact on our national security work, but overwhelmingly this is a problem that local law enforcement sees.

Much later in the hearing Burr — whose committee oversees the intelligence function of FBI but not its law enforcement function, which is overseen by the Senate Judiciary Committee — returned to the issue of encryption. Indeed, he seemed to back Comey’s point — that local law enforcement is facing a bigger problem with encryption than intelligence agencies — by describing District Attorneys from big cities and small towns complaining to him about encryption.

I’ve had more District Attorneys come to me than I have the individuals at this table. The District Attorneys have come to me because they’re beginning to get to a situation where they can’t prosecute cases. This is town by town, city by city, county by county, and state by state. And it ranges from Cy Vance in New York to a rural town of 2,000 in North Carolina.

Of course, the needs and concerns of these District Attorneys are the Senate Judiciary Committee’s job to oversee, not Burr’s. But he managed to make it his issue by calling those local law enforcement officials “those who complete the complement of our intelligence community” in promising to take up the issue (though he did make clear he was not speaking for the committee in his determination on the issue).

One of the responsibilities of this committee is to make sure that those of you at at the table and those that comp — complete the complement of our intelligence community have the tools through how we authorize that you need. [sic]

Burr raised ISIS wannabes, and earlier in the hearing Comey revealed the FBI still hadn’t been able to crack one of a number of phones owned by the perpetrators of the San Bernardino attack. And it is important for the FBI to understand whether the San Bernardino attack was directed by people in Saudi Arabia or Pakistan whom Tashfeen Malik associated with before coming to this country planning to engage in Jihad.

But only an hour before, Jim Comey had gotten done explaining that the real urgency here is to investigate drug cases and car accident cases, not that terrorist attack.

The balance between security, intelligence collection, and law enforcement is going to look different if you’re weighing drug investigations against the personal privacy of millions than if you’re discussing terrorist communications, largely behind closed doors.

Yet Richard Burr is not above pretending this is about terrorism when it’s really about local law enforcement.


More Evidence Secret “Tweaks” To Section 702 Coming

Way at the end of yesterday’s Senate Intelligence Committee Global Threats hearing, Tom Cotton asked his second leading question permitting an intelligence agency head to ask for surveillance, this time asking Admiral Mike Rogers whether he still wanted Section 702 (the first invited Jim Comey to ask for access to Electronic Communication Transactional Records with National Security Letters, as Chuck Grassley had asked before; Comey was just as disingenuous in his response as the last time he asked).

Curiously, Cotton offered Rogers the opportunity to ask for Section 702 to be passed unchanged. Cotton noted that in 2012, James Clapper had asked for a straight reauthorization of Section 702.

Do you believe that Congress should pass a straight reauthorization of Section 702?

But Rogers (as he often does) didn’t answer that question. Instead, he simply asserted that he needed it.

I do believe we need to continue 702.

At this point, SSCI Chair Richard Burr piped up and noted the committee would soon start the preparation process for passing Section 702, “from the standpoint of the education that we need to do in educating and having Admiral Rogers bring us up to speed on the usefulness and any tweaks that may have to be made.”

This seems to parallel what happened in the House Judiciary Committee, where it is clear some discussion about the certification process occurred (see this post and this post).

Note this discussion comes in the wake of a description of some of the changes made in last year’s certification in this year’s PCLOB status report. That report notes that last year’s certification process approved the following changes:

  • NSA added a requirement to explain a foreign intelligence justification in targeting decisions, without fully implementing a recommendation to adopt criteria “for determining the expected foreign intelligence value of a particular target.” NSA is also integrating review of written justifications into its auditing process.
  • FBI minimization procedures were revised to reflect how often non-national security investigators could search 702-collected data, and to add new limits on how 702 data could be used.
  • NSA and CIA write justifications for conducting back door searches on US person data collected under Section 702, except for CIA’s still largely oversight-free searches on 702-collected metadata.
  • NSA and CIA twice (in January and May) provided FISC with a random sampling of their tasking and US person searches, which the court deemed satisfactory in its certification approval.
  • The government submitted a “Summary of Notable Section 702 Requirements” covering the rules governing the program, though this summary was neither comprehensive nor integrated into the FISC’s reauthorization.

As the status report implicitly notes, the government has released minimization procedures for all four agencies using Section 702 (in addition to NSA, CIA, and FBI, NCTC has minimization procedures), but it did so by releasing the now-outdated 2014 minimization procedures as the 2015 ones were being authorized. At some point, I expect we’ll see DEA minimization procedures, given that the shutdown of its own dragnet would lead it to rely more on NSA ones, but that’s just a wildarseguess.


What Secrets Are the Spooks Telling HJC about Section 702?

There’s a paper that has been making waves, claiming it has found a formula to debunk conspiracies based on the likelihood that, if they were real, they would have already been leaked. Never mind that people have already found fault with the math; the study has another glaring flaw. It treats the PRISM program — and not, say, the phone dragnet — as one of its “true” unknown conspiracies.

PRISM — one part of the surveillance program authorized by Section 702 of the FISA Amendments Act — was remarkable in that it was legislated in public. There are certainly parts of Section 702 that were not widely known, such as the details about the “upstream” collection from telecom switches, but even that got explained to us back in 2006 by Mark Klein. There are even details of how the PRISM collection worked — its reliance on network mapping, the full list of participants. There are details that were exposed, such as that the government was doing back door searches on content collected under it, but even those were logical guesses based on the public record of the legislative debates.

Which is why it is so remarkable that — as I noted here and here — House Judiciary Committee Chair Bob Goodlatte has scheduled a classified hearing to cover the program that has been the subject of open hearings going back to at least 2008.

The hearing is taking place as we speak with the following witnesses.

  • Mr. Robert S. Litt
    General Counsel
    Office of the Director of National Intelligence
  • Mr. Jon Darby
    Deputy Director for Analysis and Production, Signals Intelligence Directorate
    National Security Agency
  • Mr. Stuart J. Evans
    Deputy Assistant Attorney General for Intelligence, National Security Division
    U.S. Department of Justice
  • Mr. Michael B. Steinbach
    Assistant Director for Counterterrorism
    Federal Bureau of Investigation

This suggests there is either something about the program we don’t already know, or that the government is asking for changes to the program that would extend beyond the basic concept of spying on foreigners in the US using US provider help.

I guess we’re stuck wildarseguessing what those big new secrets are, given the Intelligence Community’s newfound secrecy about this program.

Some observations about the witnesses. First, between Litt and Evans, these are the lawyers that would oversee the yearly certification applications to FISC. That suggests the government may, in fact, be asking for new authorities or new interpretations of authorities.

Darby would be in charge of the technical side of this program. Since PRISM as it currently exists is so (technologically) simple, that suggests the new secrets may involve a new application of what the government will request from providers. This might be an expansion of upstream, possibly to bring it closer to XKeyscore deployment overseas, possibly to better exploit Tor. Remember, too, that under USA Freedom Act, Congress authorized the use of data collected improperly, provided that it adheres to the new minimization procedures imposed by the FISC. This was almost certainly another upstream collection, which means there’s likely to be some exotic new upstream application that has caused the government some problems of late.

Note that the sole FBI witness oversees counterterrorism, not cybersecurity. That’s interesting because it would support my suspicions that the government is achieving its cybersecurity collection via other means now. But it also suggests that any new programs may be under the counterterrorism function. Remember, the NatSec bosses, including Jim Comey, just went to Silicon Valley to ask for help applying algorithms to identify terrorism content. Remember, too, that such applications would have been useless to prevent the San Bernardino attack if they were focused on the public social media content. So it may be that NSA and FBI want to apply algorithms identifying radicalizers to private content.

Finally, and critically, remember the Apple debate. In a public court case, Apple and the FBI are fighting over whether Apple can be required to decrypt its customers’ smart device communications. The government has argued this is within the legal notion of “assistance to law enforcement.” Apple disagrees. I think it quite possible that the FBI would try to ask for decryption help to be included under the definition of “assistance” under Section 702. Significantly, these witnesses are generally those (including Bob Litt and FBI counterterrorism) who would champion such an interpretation.
