“Too much transparency defeats the very purpose of democracy”
In truly bizarre testimony he will deliver to the House Intelligence Committee next week, Paul Rosenzweig argues that “too much transparency defeats the very purpose of democracy.” He does so, however, in a piece arguing that the government needs what amounts to almost full transparency on all its citizens.
The first section of Rosenzweig’s analysis talks about the power of big data. It doesn’t provide any actual evidence that big data works, mind you. On the contrary, he points to one failure of big data.
When we speak of the new form of “dataveillance,” we are not speaking of the comparatively simple matching algorithms that cross check when a person’s name is submitted for review—when, for example, they apply for a job. Even that exercise is a challenge for any government, as the failure to list Abdulmutallab in advance of the 2009 Christmas bombing attempt demonstrates.[11] The process contains uncertainties of data accuracy and fidelity, analysis and registration, transmission and propagation, and review, correction, and revision. Yet, even with those complexities, the process uses relatively simple technology—the implementation is what poses a challenge.
By contrast, other systems of data analysis are far more technologically sophisticated. They are, in the end, an attempt to sift through large quantities of personal information to identify subjects when their identities are not already known. In the commercial context, these individuals are called “potential customers.” In the cyber conflict context, they might be called “Anonymous” or “Russian patriotic hackers.” In the terrorism context, they are often called “clean skins” because there is no known derogatory information connected to their names or identities. In this latter context, the individuals are dangerous because nothing is known of their predilections. For precisely this reason, this form of data analysis is sometimes called “knowledge discovery,” as the intention is to discover something previously unknown about an individual. [my emphasis]
Nevertheless, having provided no evidence that big data works, he concludes that “There can be little doubt that data analysis of this sort can prove to be of great value.”
The reference to Abdulmutallab is curious. At the beginning of his testimony he repeats the reference.
In considering this new capability we can’t have it both ways. We can’t with one breath condemn government access to vast quantities of data about individuals, as a return of “Big Brother”[4] and at the same time criticize the government for its failure to “connect the dots” (as we did, for example, during the Christmas 2009 bomb plot attempted by Umar Farouk Abdulmutallab).
This formulation — and the example of Abdulmutallab even more so — is utterly crazy. Having big data is not the same thing as analyzing it correctly. Criticism that the Intelligence Community failed to connect the dots — with the UndieBomb attack, but even with 9/11 — assumes they had the dots but failed to analyze them or act on that analysis (as the IC did fail, in both cases). Indeed, having big data may actually be an impediment to analyzing it, because it drowns you. And while Rosenzweig suggests the only big data failure with Abdulmutallab involved not placing him on a watch list, that’s false. The NSA had wiretaps on Anwar al-Awlaki which, according to the government, collected information tying Abdulmutallab to an attack.
Yet they didn’t respond to it.
And you know what? We measly citizens don’t know why they didn’t respond to it — though we do know that the FBI agents who were analyzing the Awlaki data were … you guessed it! Overwhelmed.
Before anyone involved in government claims that big data helps — rather than hinders — they should have to explain why a full-time tap on Anwar al-Awlaki didn’t find the guy who was texting him about a terrorist attack. Particularly in the absence of any other compelling evidence that big data works (and the Administration’s claims of 54 “terrorist events stopped” barely make a case for Section 702 collection and certainly don’t justify Section 215), the logical conclusion is that it in fact does the opposite.
Having made the unsubstantiated claim that giving the government full transparency on citizens and others provides a benefit, Rosenzweig then dismisses any privacy concerns by redefining privacy itself.
Part of that involves claiming — reports of the collection of address books notwithstanding — that so long as we don’t get identified it doesn’t matter.
The anonymity that one has in respect of these transactions is not terribly different from “real-world anonymity.” Consider, as an example, the act of driving a car. It is done in public, but one is generally not subject to routine identification and scrutiny.
He then proposes we not limit what can be seen, but instead ensure that nothing unjustified can happen to you based on the discovery of something about you.
In other words, the veil of anonymity previously protected by our “practical obscurity” that is now so readily pierced by technology must be protected by rules that limit when the piercing may happen as a means of protecting privacy and preventing governmental abuse. To put it more precisely, the key to this conception of privacy is that privacy’s principal virtue is a limitation on consequence. If there are no unjustified consequences (i.e., consequences that are the product of abuse or error or the application of an unwise policy) then, under this vision, there is no effect on a cognizable liberty/privacy interest. In other words, if nobody is there to hear the tree, or identify the actor, it really does not make a sound.
If nothing bad in real life happens to you because of this transparency the government should have on citizens, Rosenzweig argues, nothing has happened.
For the moment, I’ll just bracket the many examples where stuff happens in secret — being put on a no fly list, having your neighbor recruited as an informant using data the NSA found, having your computer invaded based on the equation of Anonymous with “hacker” — that still have effects. On those, no one can now assess whether something bad has happened unjustly, because no one will ever see it. And I’ll bracket all the things everyone has ever written about how living in a Panopticon changes behavior and with it community.
Here’s how Rosenzweig justifies setting up a Panopticon (one he fancies to be anonymous, but it isn’t, really) while denying citizens the same right to see; here’s how he supports his “too much transparency” comment.
Finally, let me close this statement of principles by noting that none of this is to diminish the significance of the transparency and oversight, generally. Transparency is a fundamental and vital aspect of democracy. Those who advance transparency concerns often, rightly, have recourse to the wisdom of James Madison, who observed that democracy without information is “but prologue to a farce or a tragedy.”[13]
Yet Madison understood that transparency was not a supreme value that trumped all other concerns. He also participated in the U.S. Constitutional Convention of 1787, the secrecy of whose proceedings was the key to its success. While governments may hide behind closed doors, U.S. democracy was also born behind them. It is not enough, then, to reflexively call for more transparency in all circumstances. The right amount is debatable, even for those, like Madison, who understand its utility.
What we need is to develop an heuristic for assessing the proper balance between opacity and transparency. To do so we must ask, why do we seek transparency in the first instance? Not for its own sake. Without need, transparency is little more than voyeurism. Rather, its ground is oversight–it enables us to limit and review the exercise of authority.
Man, that series of sentences … “without need, transparency is little more than voyeurism” … “why do we seek transparency in the first instance? Not for its own sake” … is pretty ironic in testimony defending the NSA’s collection of records of every phone-based relationship in the US, its access to 75% of the Internet traffic in the US, and its tapping of 35 world leaders just because it could.
But first, Madison.
Because Madison participated in a series of secret meetings whose results — and eventually whose details — were subsequently made public to the entire world, Rosenzweig suggests Madison might support a system in which citizens never get to learn how close to all of their data the government collects, or how it uses it.
Then he argues the only purpose of transparency — the thing separating it from voyeurism — is “oversight,” which he describes as “limit[ing] and review[ing] the exercise of authority.”
If he thought this through, he might realize that even if that’s the only legitimate purpose for transparency, it’d still require some oversight over the Executive and the Legislature that, in his delegated model of oversight simply would not and could not (and does not) exist. One thing we’re learning about the dragnet, for example, is that a good deal of collection on US persons goes on under Executive Order 12333 that gets almost no Congressional review at all. And that’s just the most concrete way we’re learning how inadequate the oversight practiced by the Intelligence Committees is.
But that’s not the only purpose of transparency.
One other purpose of transparency — arguably, the purpose of democracy — is to bring some rationality to choosing the best policies. The idea is that if you debate policies and only then decide on them, you end up with more effective policies overall. It doesn’t always work out that way, but the premise, in any case, is that policies subjected to debate end up being smarter than policies thought up in secret.
It’s about having the most effective government.
So in addition to making sure no one breaks the law (Rosenzweig seems unconcerned that NSA has been caught breaking the law and violating court orders repeatedly), transparency — democracy — is supposed to raise the chances of us following better policies.
I presume Rosenzweig figures the debate that goes on within the NSA and within the National Security Council is adequate to the task of picking the best policies (and the Constitution certainly envisions the Executive having a great deal of that debate take place internally, though surely not on programs as monumental as this).
But here’s the thing: on the public evidence — whether it be missing the Abdulmutallab texts on an attack, the thin claims of 54 terrorist events, or Keith Alexander’s reports that the US has been plundered like a colony via cyberattacks under his watch — it’s actually not clear this approach is all that effective. In fact, there’s at least reason to believe some parts of the approach in place are ineffective.
That’s why we need more transparency. Not to be voyeurs on a bunch of analysts at NSA (really?). But to see if there’s a better way to do this.
Ultimately, though, Rosenzweig defeats himself. He’s right that we need to find “the proper balance between opacity and transparency” (though he might step back and reconsider what the “very purpose of democracy” is before he chooses that balance). But it is utterly illogical to suggest the balance be set for almost complete transparency when the government looks at citizens — records of all their phone-based relationships and access to 75% of the Internet data — but then argue that delegated transparency (but with almost no transparency on the delegated part) is adequate for citizens looking back at their government.
Related: Homeland Security Czar Lisa Monaco endorses the idea that just because we can collect it doesn’t mean we should. Michael Hayden learns surveillance isn’t actually all that fun. And Keith Alexander says we should get rid of journalism.
Bizarre indeed.
Though one does get the strong impression that Mr Rosenzweig is talking his own book: “Paul Rosenzweig is the founder of Red Branch Consulting PLLC, a homeland security consulting company…”
He does make a point with which I am in full agreement (as should you be, I think):
the veil of anonymity previously protected by our “practical obscurity” that is now so readily pierced by technology must be protected by rules that limit when the piercing may happen as a means of protecting privacy and preventing governmental abuse…
Only to follow it with a complete non-sequitur:
To put it more precisely, the key to this conception of privacy is that privacy’s principal virtue is a limitation on consequence.
Surely the key to ‘this conception of privacy’ is transparency in government (and its contractors), rules which strictly circumscribe the extent of snooping, and serious consequences for those that breach such rules.
You missed this other minor classic from the testimony, which explains where he’s coming from:
Fourth, our current system of intelligence oversight generally works.
!!!
It appears that Mr Rosenzweig has become one of Daniel Ellsberg’s morons.
http://www.motherjones.com/kevin-drum/2010/02/daniel-ellsberg-limitations-knowledge
…It’s from Ellsberg’s book Secrets, and the setting is a meeting with Henry Kissinger in late 1968 when he was advising him about the Vietnam War:
“Henry, there’s something I would like to tell you, for what it’s worth, something I wish I had been told years ago. You’ve been a consultant for a long time, and you’ve dealt a great deal with top secret information. But you’re about to receive a whole slew of special clearances, maybe fifteen or twenty of them, that are higher than top secret…
…”You will deal with a person who doesn’t have those clearances only from the point of view of what you want him to believe and what impression you want him to go away with, since you’ll have to lie carefully to him about what you know. In effect, you will have to manipulate him. You’ll give up trying to assess what he has to say. The danger is, you’ll become something like a moron. You’ll become incapable of learning from most people in the world, no matter how much experience they may have in their particular areas that may be much greater than yours…“
Rosenzweig works for Skeletor and the Heritage Foundation! Of course.
Did Rosie forget about Undie #2, which was a False Flag Op, and would be continued time and time again? At least that is what the official leaks claimed, to justify the Undie #2 attack on an airliner.
Being a member of the Secret Government, Rosie is not allowed to discuss forbidden history. That includes Patrick Kennedy whose testimony seems to be vanishing from history.
Bin Laden was probably set up by Ollie North or Graham Fuller, Iran-Contra veterans. Most of the Al Qaeda “terror attacks” are likely to be False Flag Ops by the Secret Government.
quote”In the terrorism context, they are often called “clean skins” because there is no known derogatory information connected to their names or identities. In this latter context, the individuals are dangerous because nothing is known of their predilections.”unquote
Clean Skins. unhunh. Nothing is known of their predilections…right.
BWAHAHAHAHAHAHAHAHAHAHA…HOHOHOHOHOHHOHOH…HEHEHEHEHEHEHEHE..HAHAHAHAHA!
This schmuck redefines paranoid. In other words, 99.9999999% of the human beings on this planet are …ahem..dangerous.
No wonder we’re ALL considered..potential terrorists now. They don’t know what our ..ahem..”predilections” are. Ya know..I’d submit that if the NSA knew how many people have predilections to revolt should the USG commit another Great Moment in Waco2 Stupidity..the FEMA camps would be filling exponentially.
The Homeland Security Model—from Rosenzweig’s Red Branch Consulting website:
—“In short, few in the United States know as much about the homeland security paradigm as we do at Red Branch Law & Consulting.
We know how the system works and we know the players within the system.
We know what foreign concerns are and we understand how the rest of the world is slowly migrating towards a homeland security model similar, at least to some degree, on that developed in the United States. There are many opportunities in the changing world and many pitfalls.
We can guide corporations and foreign governments as they navigate these treacherous waters.”—
http://www.redbranchconsulting.com/
Big data works. That’s not saying that big data works for every purpose it’s used. Stupid people can work with big data and get stupid results. Stupid or evil people can misinterpret big data-derived results.
I went to a talk at MIT last year on “Algorithmic Criminology” by Richard Berk of U Penn. He’s a criminologist working for state agencies using big data techniques to predict the risk presented by, for example, prisoners at intake into their incarceration or who are being released on parole. Quantitative techniques have been in use for a while in these fields, but his work has shifted to using big data techniques and he says – and has the peer reviewed papers to back him up – that those techniques are now doing better than older methods. “We’re not Minority Report yet, but we’re getting there.” Actual quote from a sober-minded academic.
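For anyone curious what that kind of system looks like in practice, here is a minimal, hypothetical sketch of a “risk at intake” forecast. The synthetic data, the invented features, and the use of scikit-learn’s random forest (one of the methods Berk’s published work describes, though the comment doesn’t specify his exact setup) are all illustrative assumptions, not his actual model:

```python
# Toy sketch of an "algorithmic criminology" style risk forecast.
# Everything here is hypothetical: the features, the synthetic data, and the
# choice of a random-forest classifier are assumptions for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical intake features: current age, prior offenses, age at first arrest.
age = rng.integers(18, 70, n)
priors = rng.poisson(2.0, n)
age_first = rng.integers(12, 40, n)
X = np.column_stack([age, priors, age_first])

# Synthetic outcome: re-offense within two years, loosely tied to the features.
logit = -1.5 + 0.35 * priors - 0.03 * (age - 18) - 0.05 * (age_first - 12)
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# The output is a probability of re-offense, i.e. a risk score, not a prediction
# of any specific act -- which is exactly where the policy argument starts.
scores = model.predict_proba(X_te)[:, 1]
print("AUC on held-out synthetic data:", round(roc_auc_score(y_te, scores), 3))
```

Note the design point: with a large N and a well-defined, frequently observed outcome (re-offense), this kind of model can be checked against reality and improved, which is very different from the terrorism case discussed in the post.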
The Committee member morons in DC who don’t even understand, or bother to learn [Google or Bing congress people, fer crissakes], how websites are designed or computers programmed should have to spell, define, and use the word “heuristic” in a sentence before asking any question on transparency (also, too that word).
I wonder if Julia Louis Dreyfus thinks Veep is more reality than parody.
@Frank33:
Holy mother of hidden history. And here I thought nothing could surprise me anymore. Ha. The cesspool of USG/ruling class debauchery is bottomless. Having spent the last 30 yrs trying to unravel my propaganda indoctrination since birth, I’m finally starting to realize Revelations is alive and on track. What is surprising though is the depth to which the ruling class have provided living proof of its truth. Thankfully though, they will not escape the fate that awaits this little planet in the universe. I only hope a few will face LIVING hell before they are judged for eternity.
I want to distinguish between what Rosenzweig says and what you make of it. I support almost nothing about what he says, but you have to see that there is a sense in which you almost contradict yourself, and it’s important.
Much of the data the NSA uses and Rosenzweig refers to is, arguably, *already* government data. In this class we could include tax records, medical records, the personnel files and communications of all government employees (including relatively nonpolitical ones like teachers, professors, bus and subway drivers, civil clerks, and many others), census data, voting records, and much else. The important thing to recognize is that this *is*, on almost any reasonable construal, information “about” the government, but it’s also information about people. That’s in part because our government is “of the people, by the people, and for the people.”
The contradiction in what you are saying is that you clearly do NOT want this data to be released *within* the government, let alone outside of government. You have objected, rightly and repeatedly, to the NSA viewing people as necessarily transparent. The solution to that is not to say “transparency = democracy,” as you directly do here. It is to recognize that democracy requires a great deal of privacy as well as transparency. I think people who have been so vigilant about the intelligence agency excess need to be more sensitive to the fact that “full transparency” actually tends exactly toward the total surveillance to which you and I rightly object. It is only recently that anyone has ever even suggested that democracy should err on the side of transparency, or in other words, against privacy. Privacy should actually be the default. And the line between “government” and “the people” is not only one that needs to be assessed on a case-by-case basis; it is a line the Founders purposely said does not really exist. It doesn’t, it shouldn’t, and the bad formulations of ideologues like Rosenzweig should not sway us into going much too far in the opposite direction.
@Saul Tannenbaum: “We’re not Minority Report yet, but we’re getting there.”
quote”“We’re not Minority Report yet, but we’re getting there.”
Actual quote from a sober-minded academic. ” unquote
Sober minded. right. Ummm..in what parallel universe? Sounds like he’s a certifiable psychopath to me. I mean, who in their right mind would create things that reduce human beings to bits in an algorithm? Oh..DOH! MIT “academics”…of course. sheeeeeezushfuckingchrist. If only they had a soul.
Quants on Wall Street essentially crashed the global economy (it’s still happening in slow motion) and destroyed millions of lives, all because they thought they had financial “big data” all figured out.
That testimony is surreal. I don’t even have words for my reaction to it.
And nobody should forget just how much money is wrapped up in big data and the privatized intelligence community and how much of it is moving there right now because of the major shift in military strategy, the plans to build fewer planes, ships, tanks, etc. The great vacuum, the great collector of the holy data is threatened. What happens if Congress reins it in? Some huge percentage of the work is privatized. They could continue if they wanted to, as long as the risk of investigation and prosecution was very low and there are ways of handling that. We are at such a critical crossroads right now.
rosensweig?
“too much transparency defeats the very purpose of democracy”
has this self-deluding son-of-a-bitch never read george orwell?
does he really believe a bastard use of james madison will cloak his rightwing political drivel?
behind his whole piece is the unexpressed thesis “give up, citizens. you can not stop high-technology spying.”
let me suggest an antidote:
“no commercial or government entity may collect information of any kind on any american citizen without the written consent of that citizen, specifying precisely what kind of information will be collected. furthermore, no service or commercial product may be withheld if an individual refuses to sign a release to collect her personal data.”
in other words, big data is no longer free. let’s see what happens then :))
@Saul Tannenbaum: Oh, big data CAN work, no doubt about it. It often, though not always, works for marketing, and when studying something for which you have an N of millions, it surely can work.
But for neither terrorism nor hacking does it appear they yet understand the problem well enough to have valid algorithms. And there are a lot of cases — the most visible being the no fly list — that suggest their algos are badly flawed.
Problem is you can’t correct the algos if you never check the results, and right now the secrecy of the program is ensuring that they never check the results.
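One way to see why checking the results matters so much: when the target is as rare as terrorism, even an implausibly accurate classifier drowns its hits in false positives, and you only discover that if you look at the outcomes. The numbers below are purely hypothetical; this is just base-rate arithmetic:

```python
# Base-rate arithmetic with purely hypothetical numbers: a very accurate
# classifier applied to a rare-event problem still produces mostly false positives.
population = 300_000_000       # hypothetical screening population
true_targets = 1_000           # hypothetical number of actual bad actors
sensitivity = 0.99             # hypothetical: flags 99% of real targets
false_positive_rate = 0.01     # hypothetical: flags 1% of everyone else

hits = sensitivity * true_targets
false_alarms = false_positive_rate * (population - true_targets)
precision = hits / (hits + false_alarms)

print(f"true hits:    {hits:,.0f}")
print(f"false alarms: {false_alarms:,.0f}")
print(f"precision:    {precision:.5f}")   # ~0.0003: roughly 3,000 false alarms per hit
```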
Having spent 40 years in the computer business and the last 20 building analytical systems for major corporations I am confident that it is impossible to build any significantly predictive system. And will continue to be impossible for any foreseeable future; my prediction :-)
I say this for two reasons; 1/ historical data has no relationship to the future, 2/ it is at best a statistical indicator and will never identify specific events.
So, to point 1, having Big Data is irrelevant. It simply provides more of the same trend information that permits educated extrapolation of historical events. It may permit fine tuning the analysis but will not reliably help predict the future because it is not cognizant of low-probability but high-impact events. And it is these events that have the greatest impact on the future.
Nassim Nicholas Taleb covers this in his excellent book, The Black Swan. See here for background http://en.wikipedia.org/wiki/Black_swan_theory
As to point 2, the idea that spying on and collecting ALL details of the lives and activities of every citizen is going to reduce terrorism is fanciful. The best it can do is expose correlations between cause and effect, permitting a good analyst to quantify the number of events of a certain type that may occur but not the specific events themselves.
For example, an analyst can reliably predict that with every drone kill in Pakistan the chance of a retaliatory terrorist event is increased by some percentage with some estimated probability. But there is no chance we will know what that terrorist event is, when it will occur or who will be involved.
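To put that in statistical terms: a simple rate model can tell you roughly how many events to expect, while saying nothing about which event, when, or by whom. A toy sketch, assuming a Poisson-style count model and entirely made-up parameters:

```python
# Toy illustration with entirely made-up numbers: a historical rate supports an
# estimate of how many retaliatory events to expect, not which specific event
# will occur, when, or by whom.
import numpy as np

baseline_rate = 2.0        # hypothetical: expected events per year absent strikes
uplift_per_strike = 0.05   # hypothetical: fractional increase per drone strike
strikes = 40               # hypothetical strike count for the year

expected = baseline_rate * (1 + uplift_per_strike * strikes)
print(f"expected events this year: {expected:.1f}")
print(f"P(at least one event): {1 - np.exp(-expected):.3f}")  # Poisson, P(N >= 1)

# Simulated "futures" share the same rate yet differ in count and timing:
# the aggregate is roughly predictable, the specifics are not.
rng = np.random.default_rng(1)
print("five simulated years of event counts:", rng.poisson(expected, 5))
```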
Which is why I have said in previous posts, the whole terrorist-act-prevention industry is a huge insurance scam. There are billion$ being spent on this insurance and it has NO chance of success because it does not address the cause.
There is no way to reliably predict future events.
The NSA can likely predict there will be another whistleblower, but who that will be or when is unpredictable. And while they may look at historical trends and see that they have a predictable level of whistleblowers in any period, they have no way to know when the next low-probability but high-impact whistleblower like Snowden will occur or who it will be.
@Nigel:
quote:“Paul Rosenzweig is the founder of Red Branch Consulting PLLC, a homeland security consulting company…”unquote
BWHAHAHAHAHAHAHAHAHAHAHA! Get a load of that picture of him on his site.
Bizarre indeed. With a tie like that, it’s a wonder every member on the committee wasn’t rolling in the aisles in gut splitting laughter. All he needs is a red ball on his nose and this schmuck would make Barnum history.
@Greg Bean (@GregLBean):
quote:”There is no way to reliably predict future events.”
Dang. And here I was convinced my stomach was kidding.
@Greg Bean (@GregLBean):
thanks for this comment. it is intuitively appealing to me.
the emperor simply cannot be allowed to get away with selling the nation swamp water as predictive analysis that will protect us.
Five years ago I heard about “Data-Driven policing” which required expensive computer equipment, consulting services, and a video camera on every block. Five years before that it was “Profiling” which required extensive guidance from outside ‘experts’ and intensive surveillance to “get inside the criminal mind.” Before that the FBI was pushing the clipper chip and the idea that all communications must be sent through their custom contracted hardware. Now it is “Big Data.”
Every five years the buzzword changes. Every single day though the push is the same, less privacy for the public, less transparency for the government, and more money for the consultants.
The only thing that makes this schmuck stand out is his incoherence.
@Greg Bean (@GregLBean): Quite true.
And to add to it, it shouldn’t take extensive data to predict that killing civilians with drones will make them want to kill us.
I don’t know why anyone would be surprised by the substance and slant of his proposed testimony: look who employs him!
The real question (and travesty) is that Congress calls people like him to testify, thereby framing the conversation to the advantage of the Status Quo.
@Nigel: The Ellsberg quote is one of my favorites! Can’t hear the word “intelligence” without thinking of it. I quoted it myself before, here: http://www.emptywheel.net/2011/12/30/we-request-to-inform-you-that-you-inform-us-we-killed-another-drone-target/#comment-330641 – and included more of it, before and after the part that you used. And put it in the company of quotes from Vaclav Havel and Julian Assange. Maybe it’s a big mishmash but it all hangs together for me. I doubt that Rosenzweig would get it though.
@Frank33: So glad you mentioned that! I was thinking we were never going to get to the reality (maybe) of the Undiebomber and the databases and the clues beforehand that went… well, here’s the State Department’s Patrick Kennedy testifying to Congress, youtube and transcript:
Maybe Kennedy told Cardin in private who of “our partners in the intelligence or law enforcement communities” told the State Department to let the Undiebomber on the plane, which would make Rosenzweig… happy? Hopefully Cardin or someone with oversight power in Congress asked Kennedy to tell him in private. (Do we know?) (Does Rosenzweig think it matters?)
@Frank33: Your link goes to a very long and researched article – I haven’t the time to read it all now but appreciate the link and effort. One thing I want to take exception to, though, and I think this goes to Saul Tannenbaum’s Minority Report comment @5 too. At your link, you say:
As if Abdulmutallab’s identity as a terrorist is definite / total / forever / unquestionable. But the kids he went to school with called him “the Pope” because of his concern for others less fortunate, selflessness and prayers: http://www.emptywheel.net/2012/02/03/when-was-the-last-time-a-pope-was-shit-faced-drunk-on-the-streets-of-paris/#comment-333416
Minority Report-wise, if we are so all-seeing and wise, why weren’t we looking for a way to find the pre-saint in Umar instead of the pre-criminal? Why do we write scripts for terrorists and then go casting for marks to play the roles? In the class photo in this article ( http://www.nydailynews.com/news/national/teacher-umar-farouk-abdulmutallab-flight-253-terrorist-openly-supported-taliban-article-1.432414 ) Umar is the one pointing to the sign that says “STOP KILLING KIDS.” Does the all-seeing, wise and unquestionable IC ever consider that we STOP KILLING KIDS? Or is that out of their compartment?
@blubert: I’m trying to wrap my head around this. Reminds me of Yoko Ono’s conceptual architecture project from the ’60s:
(I think – the page is cropped: https://www.youtube.com/watch?v=oLTp11aWn-o @00:29)
(Love the conceptual pricing: “priced according to contractor’s arrangements and cost of property”)
Must think.
@orionATL:
“Surrender Dorothy”
witches
@thatvisionthing:
But that would have required transparency and debate and voices that could make a difference. Democracy.
Rosenzweig:
Doesn’t catch what spellcheck misses.
What could possibly go wrong.
@C:
C, this quote is in the youtube info, not the clip. But I’m thinking of Chaplin all the time these days.
@Greg Bean (@GregLBean): I think your comment gets the closest to what I want to say. Simply, and forgive me but I’m quoting Pat Boone from a Moonlighting episode: “If you want a friend, be a friend.” But the IC and its puppet masters want, nay need, enemies. There’s the problem. It’s fundamental to them and antithetical to the purpose of democracy.
@thatvisionthing: That whole let-the-Undiebomber-on-the-plane scenario sounds like Keith Alexander:
How’s that working?