Edward Snowden Wins ‘Debate’ With NSA Lawyer

Former NSA systems administrator Edward Snowden called in from Moscow via video chat for a conversation with Berkman Fellow Bruce Schneier during a symposium on “Privacy in a Networked World” at Harvard SEAS. (Photo by Scott Eisen, courtesy of Harvard IACS.)

At a public event last month, Edward Snowden argued that the NSA has developed a “culture of impunity,” that its people “are not villains, but they think they can do anything because it is for a just cause.” John DeLong, an NSA Director, responded that “the idea that NSA activities were unauthorized is wrong, it’s wrong in a magnificent way.”

The two came as close as possible to a live debate at the “Privacy in a Networked World” symposium at Harvard University’s Institute for Applied Computational Science, where Snowden had a wide-ranging discussion via video link with security expert Bruce Schneier. John DeLong (Harvard Law, former Director of Compliance at the NSA, current Director of the NSA’s Commercial Solutions Center) immediately followed with his own talk about privacy, insisting he did not want to turn it into a “point-by-point, Oxford-style debate.”

Indeed, the specific details about this surveillance program or that particular law are far less important than the answer to a single question: In defending your country, do you do the right thing or do you do things right?  Do you do what’s morally right or do you just follow the rules (or make sure it looks like you follow the rules)?

I think Snowden won the “debate” hands-down because I much prefer his view of the good people of the NSA. Snowden sees them as thinking humans, aware of the values of the country they are trying to defend, and capable of making difficult decisions and weighing all the ramifications of their actions. They went wrong because they got carried away by their mission. DeLong, in contrast, views NSA employees (himself included) as no different from the machines they work with, capable only of unquestioningly following rules handed down to them by others, rules that can (and should) be codified into machine language. They haven’t done anything wrong, because they have never broken the law.

As you can see in the video below, most of the Snowden-Schneier discussion revolved around the familiar themes we’ve seen since the 2013 revelations: The solution to governments intruding into our lives is technology or the encryption of all communications; mass surveillance has exploded because it’s cheap and easy but it has never stopped a terrorist attack; the technical advantage the NSA used to have over the bad guys has been all but eliminated; the NSA has shifted its focus, and a much larger proportion of its effort now goes to offense, not defense.

[youtube https://www.youtube.com/watch?v=7Ui3tLbzIgQ&rel=0]


But about 35 minutes into the conversation, Schneier brought up what I think is the crux of the matter by pointing out the distinction between “Are we following the rules?” and “Are these the right rules?” He suggested that “the way you get this greater oversight is these discussions of what makes sense, what is moral in our society, what is proper.”

Unfortunately, Snowden—in this case—didn’t take the bait and responded by pointing out the financial cost of the NSA’s actions in terms of the damage to the business of American high-tech corporations. Showing the photo of the NSA tapping into a Cisco router, he said: “This has a real cost, not just legally, not just morally, not just ethically, but financially.”

In his Wired interview with James Bamford, however, Snowden was quite eloquent about the moral and ethical costs (referencing Arendt’s Eichmann in Jerusalem: A Report on the Banality of Evil) and how a “culture of impunity” has developed at the NSA and other parts of our government:

“It’s like the boiling frog… You get exposed to a little bit of evil, a little bit of rule-breaking, a little bit of dishonesty, a little bit of deceptiveness, a little bit of disservice to the public interest, and you can brush it off, you can come to justify it. But if you do that, it creates a slippery slope that just increases over time, and by the time you’ve been in 15 years, 20 years, 25 years, you’ve seen it all and it doesn’t shock you… [Clapper] saw deceiving the American people as what he does, as his job, as something completely ordinary. And he was right that he wouldn’t be punished for it, because he was revealed as having lied under oath and he didn’t even get a slap on the wrist for it. It says a lot about the system and a lot about our leaders.”

Eichmann said in his defense that he was just following orders. Rudolf Hoess, the commander of Auschwitz, where more than a million people were murdered, said in his autobiography: “The reasons behind the extermination program seemed to me right. I did not reflect on it at the time: I had been given an order, and I had to carry it out.”

In his passionate defense of the NSA, John DeLong brought up the testimony of Professor Geoffrey Stone who served on the President’s Review Group that assessed the actions of the NSA after Snowden’s revelations, quoting from Stone’s Huffington Post blog: “Not only did I find that the NSA had helped to thwart numerous terrorist plots against the United States and its allies in the years since 9/11, but I also found that it is an organization that operates with a high degree of integrity and a deep commitment to the rule of law. The Review Group found no evidence that the NSA had knowingly or intentionally engaged in unlawful or unauthorized activity.”

But there is more (and more to the point) in the same post:

This is not to say that the NSA should have had all of the authorities it was given. The Review Group found that many of the programs undertaken by the NSA were highly problematic and much in need of reform. But the responsibility for directing the NSA to carry out those programs rests not with the NSA, but with the Executive Branch, the Congress, and the Foreign Intelligence Surveillance Court, which authorized those programs — sometimes without sufficient attention to the dangers they posed to privacy and civil liberties. The NSA did its job — it implemented the authorities it was given.

Of course, “I was only following orders” is not always an excuse. But in no instance was the NSA implementing a program that was so clearly illegal or unconstitutional that it would have been justified in refusing to perform the functions assigned to it by Congress, the President, and the Judiciary. Although the Review Group found that many of those programs need serious re-examination and reform, none of them was so clearly unlawful that it would have been appropriate for the NSA to refuse to fulfill its responsibilities.

So following orders “is not an excuse” and the NSA followed orders that were “highly problematic” but the responsibility lies elsewhere because the NSA people are simple automatons?  And if “many of those programs need serious re-examination and reform,” why does Professor Stone, instead of thanking Snowden for being the catalyst for this much needed re-examination, think that he “deserves punishment”?

Similarly, DeLong said that, with regard to Snowden, “we need to let the wheels of justice turn in his case.” And he talked about the President’s directive of January 2014 and updates on this presidential policy next month, all of course coming after Snowden’s revelations. “When the NSA says we are going to be more transparent in this place, you need to hold us more accountable,” DeLong exhorted the audience. Would he talk today about the need to move transparency “up a level” if not for Snowden?

DeLong, who served as Director of Compliance at the NSA from 2009 to late 2014, realized in that job that “to operate consistently under the rule of law, rules must be translated into technical requirements.” He went on to say that if you are the policy or legal guy writing these rules, “try to imagine that you are writing a recipe that will be cooked to scale, by many cooks, some cooks being humans, some cooks being machines.”

DeLong (who is also a Certified Compliance and Ethics Professional) is twice removed (two hops?) from the ethics of the decisions (e.g., hack into the Cisco router or not?). His first defensive wall is murky laws, no matter that they are subject to conflicting interpretations or were written for an analog world. His second moat is code. “If you think of the rules as essentially system and functional requirements,” he said, “what look like legalese, words on the page, might start to look like object-oriented code, complete with if-thens, go-tos, some spaghetti code now and then, some cut and paste, some patterns, some anti-patterns.”
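DeLong’s “rules as code” framing can be made concrete. Here is a minimal sketch of what a legal rule “compiled” into if-thens might look like; the rule, field names, and thresholds below are entirely invented for illustration and describe no actual NSA policy:

```python
# Hypothetical illustration of a legal rule translated into if-then checks.
# Every rule, field, and threshold here is invented for illustration only.

def query_authorized(query: dict) -> tuple[bool, str]:
    """Return (allowed, reason) for a hypothetical data query."""
    # Rule 1: a query must cite a documented authorization.
    if not query.get("authorization_id"):
        return False, "no authorization on record"
    # Rule 2: only queries with a stated foreign-intelligence purpose.
    if query.get("purpose") != "foreign_intelligence":
        return False, "purpose outside authorized scope"
    # Rule 3: limit contact-chaining "hops" from the original selector.
    if query.get("hops", 0) > 2:
        return False, "exceeds permitted contact-chaining depth"
    return True, "ok"

print(query_authorized(
    {"authorization_id": "A-1", "purpose": "foreign_intelligence", "hops": 3}
))
```

Note what such code cannot capture: whether the rule itself is the right rule. The function answers only “Are we following the rules?”, which is precisely the distinction Schneier raised.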

I’m not making this up. DeLong also asserted that “the law is really the math and science of human interaction” and that “perhaps quantum entanglement is the physics of sharing human ideas.”  He is keen on “discussions,” on “lawyers talking to engineers,” on pulling “together people from different parts of society on equal grounds.” He thinks that protecting privacy and civil liberties today is “more art than science,” but that “the science of privacy,” allowing “society to work with data,” is going to be seen as “one of the great engineering, math, science, legal, policy feats of our time.” And he knows how to achieve it: “To understand each other’s language and to translate difficult law and policy decisions into repeatable and understandable computations, takes some time, patience, and lots of conversations. So I would sternly urge everyone to join the effort. The self-reinforcing circles that might in the short-term make us more comfortable, in the long term don’t really advance–moving us forward–in the art and science of privacy.”

Did DeLong refer to the membership of the presidential Review Group mentioned above when he talked about “self-reinforcing circles”? Are “computations” really the solution to what are essentially moral dilemmas?

We have here a serious and probably sincere person defending the actions of the NSA in this way because Big Talk and Big Data have taken over. Big Talk is what Orwell called “Newspeak,” the language invented by the totalitarian state in 1984. Have you noticed that a certain “Duckspeak”—“just because we can do something, doesn’t mean we should do it”—has been repeated again and again by government officials since 2013?

They do it—collecting all the data they can collect—regardless of whether they should or shouldn’t (see the Wall Street Journal article on “U.S. Spies on Millions of Cars”), not only because, as Snowden said, it’s cheap and easy. They do it because they believe in the ideology of Big Data, a Google invention which has been adopted and amplified by many other data-collecting entities. “That building a giant haystack is the way to go, and that you don’t need even to know what needle you are looking for, it will simply ‘emerge’ from the data, is certainly what the NSA learned from big data advocates,” I wrote when the first Snowden revelations came out in June 2013. Now Mattathias Schwartz writes in “The Whole Haystack” (reviewing the one and only case, by NSA’s admission, where bulk phone-records collection stopped a terrorist activity—sending a few thousand dollars to Somalia): “By flooding the system with false positives, big-data approaches to counterterrorism might actually make it harder to identify real terrorists before they act.”
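Schwartz’s point is the classic base-rate problem: when real threats are vanishingly rare, even an accurate detector buries them in false alarms. A quick sketch with made-up numbers (all figures below are illustrative, not actual NSA statistics):

```python
# Base-rate arithmetic for a "whole haystack" approach; all numbers invented.
population = 300_000_000      # people whose records are swept up
true_threats = 100            # actual needles in the haystack
sensitivity = 0.99            # detector flags 99% of real threats
false_positive_rate = 0.001   # and wrongly flags 0.1% of innocent people

flagged_threats = true_threats * sensitivity                            # ~99
flagged_innocents = (population - true_threats) * false_positive_rate   # ~300,000

precision = flagged_threats / (flagged_threats + flagged_innocents)
print(f"Share of flags that are real threats: {precision:.4%}")  # well under 0.1%
```

Under these assumptions, roughly 3,000 innocent people get flagged for every real threat, which is the sense in which a bigger haystack can make the needles harder, not easier, to find.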

It’s not efficient, it’s not effective, and it may even do more harm than good. But above all, the ideology of big data turns us into machines, making decisions based on code, substituting rules and laws for ethics and values.

The lure of big data and the temptation to collect it all is everywhere you turn. At the symposium, I overheard a Harvard Computer Science professor compliment an undergraduate student on her wise choice of concentration, bio-informatics, telling her that a large pharmaceutical company told him “they have so much data, there are probably 3 or 4 drugs hidden there.” This blind belief in the automated discovery powers of lots and lots of data is fine when all that is at stake is a corporation’s profits. It is, however, a critical endurance test for the government of the people, by the people, for the people.

[Originally published on Forbes.com]