Cypherpunks: Freedom and the Future of the Internet

JULIAN: Under the PATRIOT Act?

  JACOB: No. This is the Stored Communications Act, essentially. The Wall Street Journal is saying that each of these services claims that the government wanted the metadata, and the government asserted it has the right to do this without a warrant. There’s an ongoing legal case about the government’s right to keep its tactics secret, not only from the public, but from court records. I read the Wall Street Journal and found out like everyone else.

  JULIAN: So Google sucked up to the US government in its Grand Jury investigation into WikiLeaks when the government subpoenaed your records—not a conventional subpoena, but this special sort of intelligence subpoena. But the news came out earlier in 2011 that Twitter had been served a number of subpoenas, from the same Grand Jury, but Twitter fought to be able to notify the people whose accounts were subpoenaed—for the gag order to be lifted. I don’t have a Twitter account, so I didn’t get one, but my name and Bradley Manning’s name were on all the subpoenas as the information that was being searched for. Jake, you had a Twitter account so Twitter received a subpoena in relation to you. Google also received a subpoena, but didn’t fight to make it public.59

  JACOB: Allegedly. That’s what I read in the Wall Street Journal. I might not even be allowed to reference it except in connection with the Wall Street Journal.

  JULIAN: Is it because these orders also have a gag component? That has been found to be unconstitutional, hasn’t it?

  JACOB: Maybe not. For the Twitter case it is public that we lost the motion for a stay where we said that disclosing this data to the government would do irreparable harm as they can never forget this data once they receive it. They said, “Yeah well, your stay is denied, Twitter must disclose this data.” We’re in the process of appeal, specifically about the secrecy of docketing—and I can’t talk about that—but as it stands right now, the court said that on the internet you have no expectation of privacy when you willingly reveal information to a third party, and, by the way, everyone on the internet is a third party.

  JULIAN: Even if the organization like Facebook or Twitter says that it will keep the information private.

  JACOB: For sure. And this is the blurring of the state and corporation. This is actually probably the most important thing to consider here, that the NSA and Google have a partnership in cyber-security for US national defense reasons.

  ANDY: Whatever cyber-security means in this context. That’s a wide term.

  JACOB: They are trying to exempt everything from the Freedom of Information Act and to keep it secret. Then the US government also asserts it has the right to send an administrative subpoena, which has a lower bar than a search warrant, where the third party is gagged from telling you about it, and you have no right to fight because it is the third party that is directly involved, and the third party has no constitutional grounds to protect your data either.

  JULIAN: The third party being Twitter or Facebook or your ISP.

  JACOB: Or anyone. They said it was a one-to-one map with banking privacy and with dialing a telephone. You willingly disclose the number to the phone company by using it. You knew that, right? By using the telephone you obviously are saying, “I have no expectation of privacy,” when typing those numbers. There is even less explicit connection to the machine. People don’t understand how the internet works—they don’t understand telephone networks either—but courts have consistently ruled that this is the case, and in our Twitter case so far, which unfortunately I can’t really talk about because I don’t actually live in a free country, they assert essentially the same thing.60

  It’s absolute madness to imagine that we give up all of our personal data to these companies, and then the companies have essentially become privatized secret police. And—in the case of Facebook—we even have democratized surveillance. Instead of paying people off the way the Stasi did in East Germany, we reward them as a culture—they get laid now. They report on their friends and then, “Hey, so and so got engaged;” “Oh, so and so broke up;” “Oh, I know who to call now.”

  ANDY: There were people who were able to pressure Facebook to hand out all the data stored about them under European Data Protection law, and the smallest amount of data was 350 MB, the biggest one was around 800 MB.61 The interesting thing is that the database structure of Facebook has been disclosed by this act. Every time you log in, the IP number and everything gets stored, every click you make, every time, also the amount of time you stay on a page, so they can assume whether you like it or not, and so on. But this disclosed that the key identifier of the database structure was the word “target.” They don’t call these people “subscribers” or “users” or whatever, they call them “targets,” to which you could say, “Ok, that’s a marketing term.”

  JULIAN: But it was internally private.

  ANDY: Yes, but in a military sense it could also be a target, or it could be a target in an intelligence sense. So it is just a matter of the circumstances in which the data is being used.

  JULIAN: OK. That’s what’s so scary about it.

  ANDY: I think that is very helpful. We used to say with Facebook that the user is not actually the customer. The Facebook user is actually the product, and the real customer is the advertising companies. That’s the least paranoid, most harmless explanation of what’s going on there.

  But the problem is you can hardly blame a company for complying with the laws of the country. It’s called normal, and it’s called criminal if companies don’t comply with the laws of the country. So it’s a little bit of a hard thing to say, “Hey, they’re complying with the law.” What kind of accusation is that?

  JACOB: No, there is something I have to dispute about that. If you build a system that logs everything about a person, and you know that you live in a country with laws that will force you to give that up to the government, then maybe you shouldn’t build that kind of system. And this is the difference between a privacy-by-policy and a privacy-by-design approach to creating secure systems. When you’re trying to target people and you know that you live in a country that explicitly targets people, then if Facebook puts its servers in Gaddafi’s Libya or puts them in Assad’s Syria that would be absolutely negligent. And yet none of the National Security Letters that went out, I think last year or two years ago, were for terrorism. Like, 250,000 of them were used for everything else, but not terrorism.62 So knowing that’s reality, these companies have some serious ethical liability that stems from the fact that they’re building these systems and they’ve made the economic choice to basically sell their users out. And this isn’t even a technical thing. This isn’t about technology at all, it’s about economics. They have decided that it is more important to collaborate with the state and to sell out their users and to violate their privacy and to be a part of the system of control—to be paid back for being a part of the surveillance culture, to be part of the culture of control—than to be resistant to it, and so they become a part of it. They’re complicit and liable.

  ANDY: Ethical liability is not exactly a major selling point right now, huh?

  FIGHTING TOTAL SURVEILLANCE WITH THE LAWS OF PHYSICS

  JÉRÉMIE: A question that may arise at this stage is what is the solution, either for an individual user or for society as a whole? There are technical solutions—decentralized services, everybody hosting their own data, encrypted data, everybody trusting providers close to them that help them with encrypted data services, and so on. And there are the policy options that we have discussed. I’m not sure that at this stage in time we can answer the question of whether one of the two approaches is the best. I think we have to develop the two approaches in parallel. We need to have free software that everybody can understand, everybody can modify, and everybody can scrutinize in order to be sure of what it is doing. I think free software is one of the bases for a free online society, in order to have the potential to always control the machine and not let the machine control you. We need to have strong cryptography to be sure that when you want your data to be read only by yourself, nobody else can read it. We need communication tools like Tor, or like the Cryptophone, to be able to communicate only with the people you want to communicate with. But the power of the state and the power of some companies may always exceed the power of the geeks we are, and our ability to build and spread those technologies. We may also need, while we are building those technologies, laws and tools that will be in the hands of citizens, to be able to control what is being done with technology—if not always in real time—and to be able to sanction those that use technology in an unethical way and in a way that violates citizens’ privacy.

  JULIAN: I want to look at what I see as a difference between a US cypherpunk perspective and the European perspective. The US Second Amendment is the right to bear arms. Just recently I was watching some footage that a friend shot in the US on the right to bear arms, and above a firearms store was a sign saying, “Democracy, Locked and Loaded.” That’s the way that you ensure that you don’t have totalitarian regimes—people are armed and if they are pissed off enough then they simply take their arms and they retake control by force. Whether that argument is still valid now is actually an interesting question because of the change in the types of arms that has occurred over the past thirty years. We can look back to this declaration that code-making—providing secret cryptographic codes that the government couldn’t spy on—was in fact a munition. We fought this big war in the 1990s to try and make cryptography available to everyone, which we largely won.63

  JACOB: In the West.

  JULIAN: In the West we largely won and it is in every browser, although perhaps it is now being back-doored and subverted in different kinds of ways.64 The notion is that you cannot trust a government to implement the policies that it says it is implementing, and so we must provide the underlying tools, cryptographic tools that we control, as a sort of use of force, in that if the ciphers are good no matter how hard it tries a government cannot break into your communications directly.

  JACOB: The force of nearly all modern authority is derived from violence or the threat of violence. One must acknowledge with cryptography no amount of violence will ever solve a math problem.

  JULIAN: Exactly.

  JACOB: This is the important key. It doesn’t mean you can’t be tortured, it doesn’t mean that they can’t try to bug your house or subvert it in some way, but it means that if they find an encrypted message it doesn’t matter if they have the force of the authority behind everything that they do, they cannot solve that math problem. This, though, is the thing that is totally non-obvious to people that are non-technical, and it has to be driven home. If we could solve all of those math problems, it would be a different story and, of course, the government would be able to solve those math problems if anyone could.

  JULIAN: But it just happens to be a fact about reality, such as that you can build atomic bombs, that there are math problems that you can create that even the strongest state cannot break. I think that was tremendously appealing to Californian libertarians and others who believed in this sort of “democracy locked and loaded” idea, because here was a very intellectual way of doing it—of a couple of individuals with cryptography standing up to the full might of the strongest power in the world.

  So there is a property of the universe that is on the side of privacy, because some encryption algorithms are impossible for any government to break, ever. There are others that we know are extremely hard for even the NSA to break. We know that because they recommend those algorithms be used by US military contractors for the protection of top secret US military communications, and if there was some kind of back-door in them soon enough the Russians or the Chinese would find it, with severe consequences for whoever made the decision to recommend an insecure cipher. So the ciphers are fairly good now, we’re pretty confident in them. Unfortunately you can’t be confident at all in the machine that you’re running them on, so that’s a problem. But that doesn’t lead to bulk interception; it leads to the targeting of particular people’s computers. Unless you’re a security expert it’s very hard to actually secure a computer. But cryptography can solve the bulk interception problem, and it’s the bulk interception problem which is a threat to global civilization. Individual targeting is not the threat.
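  [One plausible, concrete reading of the ciphers described above is AES-256 in an authenticated mode such as GCM, which published NSA guidance has recommended for protecting classified data. A minimal sketch, assuming Python and the third-party cryptography package; the message, the key handling and the nonce choice are purely illustrative:

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Generate a random 256-bit key. Keeping the key secret is the whole game:
    # without it, recovering the plaintext means breaking AES-256 itself.
    key = AESGCM.generate_key(bit_length=256)
    aesgcm = AESGCM(key)

    # Use a fresh 96-bit nonce for every message encrypted under the same key.
    nonce = os.urandom(12)
    ciphertext = aesgcm.encrypt(nonce, b"meet at the usual place", None)

    # Only someone holding the key can decrypt and verify the message.
    assert aesgcm.decrypt(nonce, ciphertext, None) == b"meet at the usual place"

  Note that this protects only the message contents; it does nothing for the machine the key sits on, or for the metadata about who talked to whom.]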

  Nevertheless, I have a view that we are dealing with really tremendously big economic and political forces, as Jérémie said, and the likely outcome is that the natural efficiencies of surveillance technologies compared to the number of human beings will mean that slowly we will end up in a global totalitarian surveillance society—by totalitarian I mean a total surveillance—and that perhaps there will just be the last free living people, those who understand how to use this cryptography to defend against this complete, total surveillance, and some people who are completely off-grid, neo-Luddites that have gone into the cave, or traditional tribes-people who have none of the efficiencies of a modern economy and so their ability to act is very small. Of course anyone can stay off the internet, but then it’s hard for them to have any influence. They select themselves out of being influential by doing that. It’s the same with mobile phones; you can choose not to have a mobile phone but you reduce your influence. It’s not a way forward.

  JÉRÉMIE: If you look at it from a market perspective, I’m convinced that there is a market in privacy that has been mostly left unexplored, so maybe there will be an economic drive for companies to develop tools that will give users the individual ability to control their data and communication. Maybe this is one way that we can solve that problem. I’m not sure it can work alone, but this may happen and we may not know it yet.

  JULIAN: Cryptography is going to be everywhere. It is being deployed by major organizations everywhere, edging towards networked city states. If you think about communication paths on the internet—fast transnational money flows, transnational organizations, inter-connections between sub-parts of organizations—all those communication flows go over untrusted communications channels. It is like an organism with no skin. You have organizations and states blurring into each other—each network of world influence competing for advantage—and their communications flows are exposed to opportunists, state competitors and so on. So new networks are being built up on top of the internet, virtual private networks, and their privacy comes from cryptography. That is an industrial power base that is stopping cryptography from being banned.

  If you look at the Blackberry phone for example, it has a built-in encryption system for use within the Blackberry network. Research In Motion, the Canadian company that runs it, can decrypt the traffic of regular users and it has data centers in Canada and the UK, at least, and so the Anglo-American intelligence sharing alliance can get at the world’s Blackberry to Blackberry communications. But big companies are using it in more secure ways. Western governments were fine with this until it spread beyond corporations and to individuals, and then we saw exactly the same hostile political reactions as we saw in Mubarak’s Egypt.65

  I think that the only effective defense against the coming surveillance dystopia is one where you take steps yourself to safeguard your privacy, because there’s no incentive for self-restraint by the people that have the capacity to intercept everything. A historical analogy could be how people learned that they should wash their hands. That required the germ theory of disease to be established and then popularized, and for paranoia to be instilled about the spread of disease via invisible stuff on your hands that you can’t see, just as you can’t see mass interception. Once there was enough understanding, soap manufacturers produced products that people consumed to relieve their fear. It’s necessary to instill fear in people so they understand the problem before they will create enough demand to solve the problem.

  There is a problem on the opposite side of the equation as well, which is that programs that claim to be secure, that claim to have cryptography in them, are often frauds, because cryptography is complex, and the fraud can be hidden in complexity.66
  So people will have to think about it. The only question is in which one of the two ways will they think about it? They will either think, “I need to be careful about what I say, I need to conform,” the whole time, in every interaction. Or they will think “I need to master little components of this technology and install things that protect me so I’m able to express my thoughts freely and communicate freely with my friends and people I care about.” If people don’t take that second step then we’ll have a universal political correctness, because even when people are communicating with their closest friends they will be self-censors and will remove themselves as political actors from the world.

  THE INTERNET AND POLITICS

  JÉRÉMIE: It is interesting to see the power of the hackers—“hackers” in the primary sense of the term, not criminals. A hacker is a technology enthusiast, somebody who likes to understand how technology works, not to be trapped into technology but to make it work better. I suppose that when you were five or seven you had a screwdriver and tried to open devices to understand what they were like inside. This is what being a hacker is, and hackers built the internet for many reasons, including because it was fun, and they have developed it and have given the internet to everybody else. Companies like Google and Facebook saw the opportunity to then build business models based on capturing users’ personal data. But still we see a form of power in the hands of hackers. My primary interest these days is that we see these hackers gaining power, even in the political arenas. In the US there has been this SOPA (Stop Online Piracy Act) and PIPA (Protect IP Act) legislation—violent copyright legislation that basically gives Hollywood the power to order any internet company to restrict access and to censor the internet.67

  JULIAN: And banking blockades like the one WikiLeaks is suffering from.68

  JÉRÉMIE: Exactly. What happened to WikiLeaks from the banking companies was becoming the standard method to fight the evil copyright pirates that killed Hollywood and so on. And we witnessed this tremendous uproar from civil society on the internet—and not only in the US, it couldn’t have worked if it was only US citizens who rose up against SOPA and PIPA. It was people all around the world that participated, and hackers were at the core of it and were providing tools to the others to help participate in the public debate.