Export of cryptography from the United States
Based on Wikipedia: Export of cryptography from the United States
When Math Became a Weapon
In the 1990s, the United States government treated a piece of software the same way it treated a missile. Phil Zimmermann, a software developer from Boulder, Colorado, wrote a program called Pretty Good Privacy—PGP for short—that let ordinary people encrypt their emails. When someone uploaded it to the internet in 1991, federal investigators opened a criminal investigation into Zimmermann. The alleged crime? Exporting munitions without a license.
The munition in question was math. Specifically, the mathematical algorithms that scramble information so thoroughly that even the most powerful computers cannot unscramble it without the right key. To the American government, this math was as dangerous as a tank.
This is the strange story of how the United States spent decades trying to control the export of cryptography—the science of secret codes—and how that effort collided with the rise of personal computers, the internet, and the basic human desire for privacy.
The Cold War Origins
The roots of American crypto export controls stretch back to World War II. The war had proven, beyond any doubt, that breaking enemy codes could determine the outcome of battles and even entire campaigns. The Allied effort to crack German Enigma machines and Japanese naval codes had shortened the war by years, according to some historians. When the Cold War began, American policymakers were determined to keep their cryptographic advantages.
The United States and its Western allies created an elaborate system called the Coordinating Committee for Multilateral Export Controls—mercifully abbreviated to CoCom. This organization existed to prevent Western technology from reaching the Soviet bloc. Two categories of technology fell under its purview: weapons of war, which everyone agreed should be restricted, and something called "dual-use technology," which had both military and civilian applications.
Cryptography fell squarely into that second category. A cipher that protects military communications works exactly the same way as a cipher that protects bank transactions. The math does not care who is using it.
On November 17, 1954, the United States formally added encryption technology to its Munitions List. This meant that exporting cryptographic equipment—or even the knowledge of how to build it—required approval from the State Department. The same bureaucracy that licensed the sale of fighter jets now controlled the sale of code machines.
For years, this arrangement worked reasonably well. The market for serious cryptography was almost entirely military. Banks used simple codes for wire transfers. Businesses rarely needed to protect their communications from sophisticated attackers. The idea that ordinary citizens might want or need strong encryption seemed far-fetched.
The Problem with Progress
Then computers changed everything.
By the 1960s, financial institutions were moving enormous sums of money electronically. A wire transfer from New York to London could move millions of dollars in seconds. Banks desperately needed ways to protect these transactions from criminals and fraudsters. They needed encryption—strong encryption—and they needed it for purely commercial purposes.
The U.S. government recognized this problem and, in 1975, published a proposed Data Encryption Standard, or DES, which was formally adopted as a federal standard in 1977. This was a government-approved encryption algorithm that banks and businesses could use. It was strong enough for commercial purposes but, importantly, used a 56-bit key, a size the National Security Agency believed it could still crack by brute force if necessary.
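For a concrete sense of what that standard looked like in practice, here is a minimal sketch of DES in use, assuming the third-party pycryptodome package (any DES implementation would do; ECB mode is shown only because it is the simplest, not because it is a safe way to encrypt real data):

```python
# Minimal DES sketch, assuming pycryptodome (pip install pycryptodome).
import os
from Crypto.Cipher import DES

key = os.urandom(8)  # the key is 8 bytes, but DES ignores one parity bit
                     # per byte, leaving 56 effective key bits

cipher = DES.new(key, DES.MODE_ECB)
ciphertext = cipher.encrypt(b"WIRE$$$$")  # DES encrypts 64-bit (8-byte) blocks

decrypted = DES.new(key, DES.MODE_ECB).decrypt(ciphertext)
assert decrypted == b"WIRE$$$$"
```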
DES created a paradox. The government was simultaneously promoting strong commercial encryption at home while restricting its export abroad. American companies found themselves in an awkward position. They could build products with robust security features, but they could not sell those products overseas without jumping through bureaucratic hoops.
Large corporations like IBM had the resources to navigate this maze. They employed lawyers and lobbyists who understood the export licensing process. They could file applications, wait for approvals, and negotiate with regulators on a case-by-case basis.
Smaller companies and individual developers had no such luxury. And as personal computers proliferated in the 1980s, more and more of these smaller players wanted to build encryption into their products.
Phil Zimmermann's Rebellion
Phil Zimmermann was not a corporation. He was a peace activist and programmer who believed that ordinary people deserved the same cryptographic protections that governments enjoyed. In the late 1980s, he began working on PGP, a program that would let anyone encrypt their files and emails using military-grade algorithms.
Zimmermann finished PGP in 1991. Almost immediately, someone—not Zimmermann himself, though this distinction would become legally important—uploaded the software to the internet. Within days, copies had spread around the world. The encryption genie was out of the bottle.
The federal government was not amused. Customs officials opened a criminal investigation into whether Zimmermann had illegally exported munitions. For three years, he lived under the threat of prosecution. The case became a cause célèbre among civil liberties advocates, who argued that the government was trying to criminalize the distribution of mathematical knowledge.
The investigation was eventually dropped in 1996, but by then the fundamental question had been raised: Could the government really control the spread of information in the internet age?
The Browser Wars and Broken Encryption
While the Zimmermann investigation dragged on, something remarkable was happening in the commercial world. The internet was becoming a place where people wanted to buy things. And buying things online required a way to transmit credit card numbers securely.
A company called Netscape developed a technology called Secure Sockets Layer—SSL—that encrypted communications between web browsers and servers. When you saw a little padlock icon in your browser, SSL was protecting your connection. This technology became the foundation of internet commerce.
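SSL's modern descendant, TLS, still works the same way. A minimal sketch using Python's standard ssl module shows a client negotiating an encrypted channel the way a browser does (example.org is just an illustrative host; any HTTPS server works):

```python
import socket
import ssl

# Open an encrypted channel the way a browser does for an https:// URL.
context = ssl.create_default_context()  # verifies the server's certificate

with socket.create_connection(("example.org", 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname="example.org") as tls:
        print(tls.version())  # e.g. "TLSv1.3"
        print(tls.cipher())   # negotiated cipher suite, protocol, and key bits
```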
But Netscape faced the same export restrictions that had bedeviled the cryptography industry for decades. Strong encryption was a munition. Exporting it required a license. So Netscape did something that seems almost absurd in retrospect: they created two versions of their browser.
The "U.S. edition" used full-strength encryption. It could generate keys of 1024 bits or more for the public-key portion and 128 bits for the symmetric portion. Breaking such encryption would require computational resources that simply did not exist.
The "International edition" was deliberately weakened. It used 512-bit public keys and 40-bit symmetric keys. This was not a minor reduction. The difference between 128-bit and 40-bit encryption is not threefold—it is astronomical. A 40-bit key can be broken by a single modern computer in a matter of days. Even in the 1990s, it provided only the illusion of security.
Here is the absurd part: most Americans ended up using the weak international version anyway. The process of obtaining the strong domestic version was complicated enough that many users simply downloaded whatever was easiest. They had no idea they were protecting their credit card numbers with encryption that any determined attacker could crack.
The same problem afflicted Lotus Notes, a popular business communication platform. Its international versions disclosed part of each encryption key to the NSA, reducing the effective strength against the U.S. government to 40 bits and leaving corporate communications open to interception.
A Government at War with Itself
By the mid-1990s, the American government had developed what one observer called a "split personality" on encryption. Different agencies had completely contradictory goals.
The National Security Agency wanted weak encryption abroad so it could continue intercepting foreign communications. The NSA had spent decades building its capabilities to break codes, and it viewed strong commercial cryptography as a threat to national security.
The Commerce Department wanted American companies to succeed in global markets. Export restrictions put U.S. technology firms at a disadvantage. Foreign competitors, unburdened by American regulations, could offer products with strong encryption. American companies were losing sales.
The State Department was caught in the middle, theoretically responsible for enforcing export controls while also trying to promote American business interests abroad.
Law enforcement agencies had their own concerns. The FBI worried that unbreakable encryption would create a "going dark" problem—criminals and terrorists would be able to communicate in ways that police could never intercept, even with a warrant.
These tensions produced some remarkable compromises. In 1992, the NSA and the Software Publishers Association struck a deal. Encryption using 40-bit keys—the weak stuff that could be broken relatively easily—could be exported without much hassle. Stronger encryption still required extensive review.
The government also proposed something called "key escrow" or "key recovery." Under this scheme, encryption would be strong, but the government would hold a backup copy of everyone's keys. If law enforcement needed to read someone's communications, it could retrieve the key from escrow. The most visible implementation was the Clipper chip, announced in 1993, and critics savaged the whole approach as a backdoor that would inevitably be abused or stolen.
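In code, the key-recovery idea amounts to encrypting the same session key to two parties. Here is a deliberately simplified sketch, assuming the third-party cryptography package; the actual Clipper design used tamper-resistant hardware and a key split between two government agencies, not RSA:

```python
# Simplified key-escrow sketch, assuming "cryptography" (pip install cryptography).
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

recipient = rsa.generate_private_key(public_exponent=65537, key_size=2048)
escrow_agent = rsa.generate_private_key(public_exponent=65537, key_size=2048)

session_key = os.urandom(16)  # the symmetric key that encrypts the message

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The sender wraps the same session key twice: once for the recipient and
# once for the escrow agent, so the key can be recovered without the
# recipient's cooperation.
for_recipient = recipient.public_key().encrypt(session_key, oaep)
for_escrow = escrow_agent.public_key().encrypt(session_key, oaep)

assert escrow_agent.decrypt(for_escrow, oaep) == session_key
```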
The Dam Breaks
The pressure for change came from multiple directions at once.
Civil liberties lawyers filed lawsuits arguing that computer code was a form of speech protected by the First Amendment. Peter Junger, a law professor, and Daniel Bernstein, a graduate student, both challenged the export regulations in court. Their argument was elegant: if the government cannot prevent you from publishing a book explaining how an encryption algorithm works, how can it prevent you from publishing a computer program that implements the same algorithm? Code is just another language.
Meanwhile, strong encryption software was proliferating outside the United States regardless of American laws. PGP had already escaped. Other programs followed. The restrictions were becoming increasingly pointless—they hurt American companies while doing little to keep encryption out of foreign hands.
In 1996, President Bill Clinton signed Executive Order 13026, which moved commercial encryption from the Munitions List to the Commerce Control List. This was a significant shift. The State Department, with its focus on weapons, would no longer control encryption exports. The Commerce Department, with its focus on trade, would take over.
The executive order also declared that software would not be treated as "technology" under export regulations. This subtle distinction had major implications. It meant that encryption software might be regulated less strictly than encryption hardware or the underlying technical knowledge.
The liberalization continued over the following years. In 1999, the regulations were changed to allow export of 56-bit encryption and 1024-bit RSA without any backdoors. In 2000, the rules were simplified further, particularly for commercial and open-source software, and key-length limits were dropped entirely for products that passed a technical classification review.
Where Things Stand Today
The crypto wars of the 1990s ended in a qualified victory for the advocates of strong encryption. Today, Americans can freely use and distribute encryption software. The apps on your phone that protect your messages, the protocols that secure your banking transactions, the technology that keeps your medical records private—all of this exists because the export restrictions were eventually relaxed.
But the restrictions did not disappear entirely. They evolved.
Non-military cryptography exports are now controlled by the Commerce Department's Bureau of Industry and Security. Most commercial encryption products can be exported freely to most countries. However, restrictions remain for exports to countries the U.S. considers hostile or supportive of terrorism—a list that has included Cuba, Iran, North Korea, Sudan, and Syria.
Military encryption equipment still requires export licenses. So does specialized cryptographic hardware that has been hardened against electromagnetic eavesdropping—so-called TEMPEST-approved electronics. Custom cryptographic software and even cryptographic consulting services remain controlled.
The regulations are administered through a system of classification numbers and country groups. Each product gets an Export Control Classification Number that determines what rules apply. Countries are sorted into groups with labels like B (relatively open), D:1 (more restricted, including China and Russia), and E:1 (the most restricted, reserved for countries designated as supporting terrorism). The Commerce Country Chart tells exporters whether they need a license for each combination of product classification and destination country.
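In spirit, the chart works like a lookup table. Here is a toy sketch whose entries are invented for illustration and do not reflect the real chart or its many license exceptions:

```python
# Toy sketch of the classification-and-country-group lookup. All entries
# below are illustrative, not actual regulation.
COUNTRY_GROUP = {"Netherlands": "B", "China": "D:1", "North Korea": "E:1"}

# Hypothetical rule table for a single classification number,
# such as 5D002 (encryption software):
LICENSE_REQUIRED = {"B": False, "D:1": True, "E:1": True}

def needs_license(destination: str) -> bool:
    """Return True if this toy rule table demands an export license."""
    return LICENSE_REQUIRED[COUNTRY_GROUP[destination]]

print(needs_license("China"))  # True under this illustrative table
```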
The Wassenaar Arrangement, an international agreement among 42 countries, coordinates these export controls globally. The United States is not alone in restricting cryptography exports; its allies have agreed to similar rules. The most recent updates to American regulations, published in March 2021, implemented decisions made at the 2019 Wassenaar Plenary.
The Deeper Questions
The story of American crypto export controls raises questions that remain unresolved.
Can any government control the spread of mathematical knowledge in an interconnected world? The algorithms that power modern encryption are published in academic papers and implemented in open-source software. They exist in the minds of researchers around the globe. Once an idea is out, it cannot be recalled.
How should democratic societies balance security and privacy? The same encryption that protects your bank account can protect a terrorist's communications. The same tools that preserve journalist-source confidentiality can hide criminal conspiracies. There are no easy answers, only tradeoffs.
What happens when technology outpaces regulation? The export control regime was designed for a world of physical goods—tanks, missiles, radar systems. Applying it to software that could be transmitted instantly across borders was always awkward. The internet made the awkwardness obvious.
The crypto wars continue in new forms. Law enforcement agencies still argue for backdoors in encrypted communications. Technology companies still resist. Privacy advocates still push for stronger protections. The fundamental tension between the government's desire to monitor and the citizen's desire for privacy has not been resolved.
What has changed is the baseline expectation. Strong encryption is now normal. Billions of people use it every day without thinking about it. The idea that Americans might go to prison for distributing mathematical software seems almost quaint.
But it was not always so. And the fact that it is not so today is the result of lawsuits and lobbying, technological change and political pressure, and the efforts of people like Phil Zimmermann who insisted that privacy was worth fighting for—even when the government called his software a weapon.