Daniel Patrick Moynihan said that everyone is entitled to their own opinion, but not their own facts. That was fifty years ago, however, and the Moynihan rule is now more honored in the breach than in the observance. This is particularly true of the current debate over encryption, where assertions have achieved (or deserve) mythical status. Here are a few:

Encryption is good for human rights. True only if you live in an imaginary police state. In a real police state, using encryption is a way for activists to self-identify to the security services. In these countries, the security services control the networks, and if they see traffic they can’t read, they will take action against the user. If you are lucky, they will only bug your house. They’re more likely to detain you for questioning and impound the device.

Code is speech. And therefore efforts to control code violate our First Amendment rights. When was the last time you saw someone give a speech in a programming language? Does this apply to code for developing chemical weapons or atomic bombs? Should people have free access to those as well (help yourself, ISIS)? "Code is speech" is a tired retread from the 1990s.

China and Russia are waiting to see what the U.S. does. Parochial nonsense. Large, politically fragile regimes do not wait for the United States when deciding what they believe is needed to protect themselves. Russian law gives the FSB complete control over all communications and requires that service providers give it plaintext on request. This reference note provides more details on Russian surveillance. China's monitoring, while not as all-embracing as Russia's, also gives its security services untrammeled access to communications, and China has a series of deals with Apple and other IT companies that make their products "State Security friendly," an unavoidable condition of doing business in the world's largest market.

The encryption debate is about backdoors. A backdoor is a program element that lets another party read, or more easily decrypt, encrypted communications. The best backdoors are covert: neither the producer nor the users know they are there. Backdoors are a bad idea, since there is a very good chance that someone else will find them. But if someone tells you that the current dispute over encryption is about backdoors, they are either misinformed or trying to manipulate their audience. The real issue is under what circumstances access to the plaintext of an encrypted message is possible, and access does not require a backdoor. Some privacy advocates believe that under no circumstances should access to a person's communications be possible. That is not what the Constitution says; it prohibits "unreasonable searches," not all searches. End-to-end encryption, where only the sender and recipient can read the message, makes even reasonable searches much harder to perform.
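A minimal sketch may make that distinction concrete. The snippet below (Python, using the third-party cryptography package; the key names are illustrative, not any real service's design) contrasts a provider-held key, which permits lawful access with no backdoor at all, with an end-to-end key the provider never possesses:

```python
from cryptography.fernet import Fernet

# Provider-held encryption (common for cloud mail and storage): the service
# encrypts your data with a key it controls...
provider_key = Fernet.generate_key()
stored = Fernet(provider_key).encrypt(b"meet at the usual place")

# ...so it can decrypt and hand over plaintext in response to a warrant.
# No hidden flaw in the cryptography is involved.
plaintext_for_warrant = Fernet(provider_key).decrypt(stored)
assert plaintext_for_warrant == b"meet at the usual place"

# End-to-end encryption: the key exists only on the endpoints, so the
# provider relays ciphertext it cannot read, warrant or not.
endpoint_key = Fernet.generate_key()  # never leaves sender/recipient devices
relayed = Fernet(endpoint_key).encrypt(b"meet at the usual place")
# Fernet(provider_key).decrypt(relayed)  # would raise InvalidToken
```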

Encryption helps cybersecurity. The correct answer here is "it depends." Apple's phone encryption would not have stopped the OPM hack, the F-35 hack, or any of the big intelligence coups of the last decade. It would not have prevented Stuxnet. The encryption program at the center of the FBI/Apple debate protects people from surveillance. Surveillance is the real issue, and the encryption in question makes access hard for law enforcement; it does little against cyber spying. We would get a clearer picture if we distinguished between stored data, the target of cybercrime and espionage, and communications data, the messages or data in transit between sender and recipient. For stored data, strong encryption would provide real benefits against crime and espionage; if anything, the case is for mandating its use there.

Only end-to-end encryption is safe. The most secure message is one that no one can read, but this can be self-defeating. End-to-end encryption makes it hard for anyone but the sender and recipient to recover readable plaintext from an encrypted message or data (absent difficult measures). Unrecoverable encryption increases the risk of data loss, which limits its appeal to companies and consumers: you do not want your bank account or family pictures to disappear. Companies will not want their employees to use end-to-end encryption because it increases liability. They want encryption products where the plaintext can be recovered if a password is lost or an employee quits, and in some cases their regulators require it. The apparent paradox is that encryption can be strong (i.e., difficult to break) while still allowing recovery of the plaintext, as the sketch below illustrates. Using encryption that does not allow anyone but the sender and recipient to recover the plaintext would greatly reduce the ability of law enforcement agencies to monitor communications. This does not come without risk of increased crime and terrorism, but it might be attractive to those worried about government surveillance. How much the risk increases is a subject for debate, as is whether the cost to security is outweighed by the benefit to privacy.
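One common way to square strong encryption with recoverability is key wrapping with an escrow copy. The sketch below (Python, third-party cryptography package; the names recipient_key and recovery_key are hypothetical, not a real product's API) encrypts a message once with a fresh data key, then wraps that data key for both the recipient and a corporate recovery holder, so the cipher stays strong while a lost password does not mean lost data:

```python
from cryptography.fernet import Fernet

recipient_key = Fernet.generate_key()   # held by the recipient
recovery_key = Fernet.generate_key()    # held by the company or escrow agent

# Encrypt the message once with a fresh per-message data key...
data_key = Fernet.generate_key()
ciphertext = Fernet(data_key).encrypt(b"Q3 payroll figures attached.")

# ...then wrap that data key for each party allowed to recover plaintext.
wrapped_for_recipient = Fernet(recipient_key).encrypt(data_key)
wrapped_for_recovery = Fernet(recovery_key).encrypt(data_key)

# Normal path: the recipient unwraps the data key and reads the message.
dk = Fernet(recipient_key).decrypt(wrapped_for_recipient)
assert Fernet(dk).decrypt(ciphertext) == b"Q3 payroll figures attached."

# Recovery path: if the recipient's key is lost, the escrow copy still works.
dk = Fernet(recovery_key).decrypt(wrapped_for_recovery)
assert Fernet(dk).decrypt(ciphertext) == b"Q3 payroll figures attached."
```

The encryption itself is no weaker in this design; what changes is how many parties hold a wrapped copy of the key, which is precisely the policy question.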

Law enforcement is "Going Dark." Catchy phrase, but premature. Only a small number of cases run into problems because of encryption, and end-to-end encryption is still not widely used. How many people use encryption, and who they are, will determine the risk. Gauging how much of a problem encryption might create requires a complex mapping of authorities and government capabilities: the NSA can support the FBI on counterterrorism, but it cannot support the FBI on child pornography, drugs, organized crime, or, perhaps, some domestic terrorism cases, and people who engage in these activities are strongly attracted to end-to-end encryption. What the FBI and other law enforcement agencies worry about is that if big IT companies offer end-to-end encryption as a default service, or if it is available as an easily downloadable app, the number of users will grow and the number of cases stymied by encryption will grow with it.

Encryption can be made unbreakable. Encryption can make it very difficult to decrypt data, but anything humans make, humans can defeat. In the past, design and implementation flaws made it possible to recover plaintext even from strong encryption products, as the sketch below illustrates. The concern now is that when a few big service providers (like Apple or Google) offer strong encryption as a service, they will avoid these errors, make encryption easy to obtain, and lead to its widespread use. Even then, it would be possible to recover plaintext given physical access to the device (say, by putting a camera in the ceiling to record you typing your password, or by implanting spyware), but such techniques usually do not provide mass coverage and are more expensive and riskier than the remote monitoring that unencrypted traffic allows.
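For a concrete sense of what an implementation flaw looks like, the sketch below (Python, third-party cryptography package; the messages are invented) reuses a nonce with AES-GCM, a classic mistake. The cipher itself is never broken, yet an attacker who knows one message recovers the other without ever touching the key:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)  # BUG: reused below; a nonce must be unique per message

pt1 = b"ATTACK AT DAWN ON THE EAST RIDGE"
pt2 = b"RETREAT TO BASE AND AWAIT ORDERS"  # same length, for simplicity

aead = AESGCM(key)
ct1 = aead.encrypt(nonce, pt1, None)  # returns ciphertext body || 16-byte tag
ct2 = aead.encrypt(nonce, pt2, None)

# GCM encrypts with a keystream; reusing the nonce reuses the keystream, so
# XORing the two ciphertext bodies cancels it out: ct1 ^ ct2 == pt1 ^ pt2.
body1, body2 = ct1[:-16], ct2[:-16]
xor = bytes(a ^ b for a, b in zip(body1, body2))

# An attacker who knows (or guesses) pt1 recovers pt2 with no key at all.
recovered = bytes(a ^ b for a, b in zip(xor, pt1))
assert recovered == pt2
print(recovered.decode())  # -> RETREAT TO BASE AND AWAIT ORDERS
```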

Keeping the FBI out will protect your privacy. Sadly, no. If you are a person of interest, leading foreign intelligence agencies probably already have your personal data. Even if you aren’t of interest (and most people are not), there is a good chance that hackers have stolen personal data, which is now for sale in cybercrime black markets. Making it harder for foreign spies and criminals to collect data is a good thing, and requires greater use of encryption, but not necessarily end-to-end encryption. Even without crime and espionage, you have no real privacy, since most websites and major internet service providers collect and market your data. Data aggregators collect, store and sell personal information on most Americans going back years—back to the grade school you attended and what you’ve done, earned and spent since then. This is not necessarily bad, but your life really is an open book. The Apple debate is about stopping government communications surveillance, not about privacy—and if you are Chinese, not even that.

James Andrew Lewis is a senior fellow and director of the Strategic Technologies Program at the Center for Strategic and International Studies in Washington, D.C.

Commentary is produced by the Center for Strategic and International Studies (CSIS), a private, tax-exempt institution focusing on international public policy issues. Its research is nonpartisan and nonproprietary. CSIS does not take specific policy positions. Accordingly, all views, positions, and conclusions expressed in this publication should be understood to be solely those of the author(s).

© 2016 by the Center for Strategic and International Studies. All rights reserved.
