Cryptography is an inherently difficult topic to discuss because information about cryptography often seems to be in code. To reveal truth about concealment, one must decipher strange language at every turn.
This phenomenon is manifest in the debate over the Clinton administration's recent technological proposal, a digital encryption device called the Clipper Chip. To comprehend the Clipper imbroglio you must decode ciphers of all kinds: the elliptical murmurs of government, the arcane vitriol of computer science. It's tempting to ignore the whole issue rather than storm its intellectual battlements. The war of the Clipper chip is currently raging over a battlefield so narrow with special interest, so muddy with jargon, that the vast majority of the citizens it will affect have been excluded from combat. Few people realize that the debate actually pertains to everyone.
Cryptography involves the translation of uncoded data (plaintext) into coded form (ciphertext) using a specific step-by-step procedure, or algorithm, controlled by a value called the encryption key. The ciphertext is then transmitted to some other party, who employs a decryption key to change the ciphertext back into plaintext. In traditional methods of cryptography, the encryption and decryption keys are the same; to decode a message, one merely reverses the operation used to encode it. This is called a single-key system. Below is an extremely simplistic example of such a system:
- I want to send this top-secret message (the plaintext) to a friend: "Hello."
- I use a pre-determined key to transform the message into ciphertext (for instance, I might add 1 to the ordinal value of each letter, yielding "IFMMP").
- I send the message. If it is intercepted, it will be unreadable unless the third party can somehow deduce my encryption key through code-cracking techniques. The key will also prevent a third party from authoring a message in my name.
- The ciphertext arrives at its intended destination. Because we are using a single-key system, my friend can use the encryption key as a decryption key by reversing its operation. (In our example, he or she subtracts 1 from the ordinal value of each letter, yielding "Hello.")
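The four steps above can be sketched in a few lines of Python. This is a minimal illustration of the single-key idea, not a real cipher: the same key value drives both operations, and decryption is simply encryption with the key negated.

```python
# Toy single-key (shift) cipher mirroring the "Hello" -> "IFMMP" example.
# Illustrative only -- a cipher this simple is trivially breakable.

def shift_encrypt(plaintext, key):
    """Shift each letter forward by `key` positions, wrapping Z back to A."""
    result = []
    for ch in plaintext.upper():
        if ch.isalpha():
            result.append(chr((ord(ch) - ord('A') + key) % 26 + ord('A')))
        else:
            result.append(ch)  # leave spaces and punctuation untouched
    return ''.join(result)

def shift_decrypt(ciphertext, key):
    """Decryption merely reverses the operation: the same key, negated."""
    return shift_encrypt(ciphertext, -key)

print(shift_encrypt("Hello", 1))   # IFMMP
print(shift_decrypt("IFMMP", 1))   # HELLO
```

Note that sender and receiver must already share the value `1` before any message is sent, which is exactly the weakness discussed below.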
While the algorithms of today's single-key systems are, of course, highly complex, my example illustrates their salient feature: identical encryption and decryption keys. This attractive symmetry conceals a serious drawback. Before communicating, users of a single-key system must agree on a designated key--a process that, in itself, requires communication. This technological Catch-22 is a security risk: sensitive data must be transmitted to establish a safe system for transmission of sensitive data. Robert Sedgewick elaborates: "The whole system depends on some separate prior method of communication between the sender and the receiver to agree on the key parameters... any security system is only as reliable as the people that have the key."
Single-key systems dominated research in computer science until 1976, when computer scientists Whitfield Diffie and Martin Hellman introduced a technique known as public-key cryptography. In the Diffie-Hellman system, every user has two keys. The first key, known only by the user, is called the private key. It is used to decrypt data that has been encrypted with the user's other key, the public key. Although the keys are mathematically related, it is virtually impossible to compute the private key from knowledge of the public key. If I want to send a message to my friend using this system, I look up his or her public key in a directory (like a phone book) and use it to encrypt my message. My friend decrypts the coded transmission using the corresponding private key (Rivest 470-471). Thus, the Diffie-Hellman system provides an elegant solution to the symmetrical-key designation problem--my friend and I never need to establish a secret key between us.
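The mathematics behind this can be sketched with Diffie and Hellman's original key-agreement scheme. The numbers below are tiny, hypothetical values chosen for readability; real systems use primes hundreds of digits long, which is what makes recovering a private key from a public one "virtually impossible."

```python
# Toy Diffie-Hellman key agreement. The modulus and generator here are
# deliberately miniature; do not mistake this for a secure implementation.

p, g = 23, 5  # publicly known prime modulus and generator (toy-sized)

alice_private = 6                        # known only to Alice
alice_public = pow(g, alice_private, p)  # safe to publish in a "directory"

bob_private = 15                         # known only to Bob
bob_public = pow(g, bob_private, p)

# Each party combines their own private key with the other's public key.
# Both arrive at the same shared secret, which was never transmitted.
alice_shared = pow(bob_public, alice_private, p)
bob_shared = pow(alice_public, bob_private, p)

print(alice_shared == bob_shared)  # True
```

An eavesdropper who sees `alice_public` and `bob_public` must solve a discrete-logarithm problem to recover either private key--easy at this toy scale, but computationally infeasible at real key sizes.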
Despite its advantages, public-key cryptography has failed to displace single-key systems. The same year that public-key technology was introduced, the highly secretive National Security Agency--a nebulous government agency normally charged with eavesdropping on the telecommunications of other countries--created an alternative, IBM-designed single-key standard called the Data Encryption Standard (DES). Skeptics initially suspected that the NSA had deliberately offered a system that was weakened to make code-breaking easier, but these doubts have gradually subsided because the DES algorithm has withstood over 15 years of rigorous testing by the private sector. DES is now considered an excellent source of digital security, and is routinely used by many private institutions to protect sensitive data today.
This situation remained static until April 1993, when the government suddenly proposed the replacement of the current standard with an NSA-designed, public-key cryptography algorithm. The new standard was formally adopted by the government on February 4 (Clipper 1). It is to be implemented in a tiny piece of hardware, the Clipper Chip.
When installed in computers and telephones, the Clipper Chip would employ Diffie-Hellman technology to protect the security of information transmitted over telephone lines. Both voice and data transmission would be protected by slightly different versions of the Clipper system. (The "Clipper Chip" for computers is called Capstone, but the entire scheme still bears the moniker "Clipper.") As with DES, the government's new standard is voluntary; citizens would not be forced to encrypt using the chip. It all seems quite innocuous--just a technological update from benign forces in government--except for one small, sinister caveat: a "key-escrow back door."
The Clipper Chip has been designed to include a mechanism for the government to wiretap its transmissions. The keys for every chip will be divided into databases kept in separate escrow institutions (the Treasury Department and the National Institute of Standards and Technology, or NIST). Government agencies can be granted the two keys to a particular connection when they are "lawfully authorized." Therefore, the privacy guaranteed by the Clipper Chip is far from absolute. Whitfield Diffie, the co-creator of public-key technology, explains: "The effect is very much like that of the little keyhole in the back of the combination locks used on the lockers of schoolchildren. The children open the locks with the combinations, which is supposed to keep the other children out, but the teachers can always look in the lockers by using the key."
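The two-institution escrow arrangement can be sketched in code. How Clipper actually divides its keys is classified; the XOR split below is a standard, openly published technique that yields the same property the proposal claims: either escrow agent's share alone is indistinguishable from random noise, and only the two together reconstruct the key.

```python
import secrets

# Illustrative two-party key escrow (an assumption about the mechanism,
# not the classified Clipper design itself).

def escrow_split(chip_key: bytes):
    """Split a key into two shares; neither share alone reveals anything."""
    share1 = secrets.token_bytes(len(chip_key))              # random pad
    share2 = bytes(a ^ b for a, b in zip(chip_key, share1))  # key XOR pad
    return share1, share2  # deposited with two separate institutions

def escrow_combine(share1: bytes, share2: bytes) -> bytes:
    """Only with both shares -- i.e., both agencies -- is the key recovered."""
    return bytes(a ^ b for a, b in zip(share1, share2))

chip_key = secrets.token_bytes(10)
s1, s2 = escrow_split(chip_key)
print(escrow_combine(s1, s2) == chip_key)  # True
```

The security argument, then, rests entirely on the two institutions never cooperating improperly--a point the essay returns to below.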
From its introduction in April 1993 until now, the Clipper Chip has been (in the words of the New York Times) "breathtakingly unpopular." The proposal has engendered a coalition of concerned computer scientists and members of industry who predict dire effects from its implementation. Vocal opponents of the Clipper Chip are not extremists; they include representatives from companies such as IBM, Microsoft, Apple, and MCI. As adoption of the Clipper Chip became more imminent, the efforts against it increased in vigor and organization. A January open letter to President Clinton from 43 such protesters reads in part: "We believe that if this proposal and the associated standards go forward, even on a voluntary basis, privacy protection will be diminished, innovation will be slowed, government accountability will be lessened, and the openness necessary to ensure the successful development of the nation's communications infrastructure will be threatened."
These are not minor criticisms. But although the Clinton administration had invited feedback from such groups, the Clipper Chip was ultimately adopted with virtually no concessions to its critics.
With the support of a few allies in academia, government insists that its peculiar version of public-key technology is necessary. Vice President Al Gore defended the proposal during a February news conference: "Encryption is a law-and-order issue, since it can be used by criminals to thwart wiretaps and avoid detection. Our policy is designed to provide better encryption to individuals and businesses while ensuring that the needs of law enforcement and national security are met."
For the Clipper Chip to succeed politically, the government must assure us that it will never misuse the enormous power it has granted itself through key-escrow cryptography. Should we believe this guarantee?
Actually, the Clipper system has been engineered to prevent abuse--a court order will be necessary for wiretapping, and circumvention of this procedure would require corruption of both escrow institutions. More importantly, by designating specific escrow institutions and a seemingly detailed due process for obtaining secret keys, the government has attempted to exhibit public accountability and respect for the privacy of its citizens.
But as even cursory analysis reveals, this stance is ludicrous. At every stage of the Clipper Chip's life--development, proposal, debate, adoption--the government has operated with an almost total disregard for any interest but its own.
First, the Clipper algorithm is classified. This shatters the earlier precedent set by DES, when scientists were allowed to privately investigate the effectiveness of the new cryptography algorithm. ("Algorithm" is not to be confused with "key." Publishing an algorithm reveals how it works without divulging the actual values that will make it work.) The Clipper Chip even has a disturbingly James Bond-style twist: it is tamper-resistant and will self-destruct if opened. Unfortunately, much of the government's self-proclaimed "public accountability" self-destructs along with its cryptographic toy. In the case of DES, private-sector testing led to widespread acceptance of a technology that was initially met with distrust. The Clipper Chip, conversely, will never have the opportunity to demonstrate its merits to anyone but its creator, the NSA.
This brings us to another point: Why is an organization charged with the monitoring of communication establishing an encryption standard? Relying on the NSA to establish privacy, according to John Perry Barlow, "is like asking a peeping tom to install your window blinds." In our discussions of the Clipper Chip, we must remember that "government" in this case is largely represented by an agency whose very budget is unknown. When has the NSA ever been publicly accountable for anything at all? If we uncovered an abuse of Clipper technology, would we even know whom to fault?
Disillusioned, we return to the government's second argument. The largest segment of the Clipper Chip's rhetorical armor is the specter of rampant crime that might accompany uncontrolled private encryption. FBI agent James K. Kallstrom asks, "Are we going to have the tool of electronic surveillance, or are we going to let criminals use the national information infrastructure unfettered?" This is not an entirely unfounded perspective; even the Clipper Chip's most vocal protesters concede that crime is a viable concern. Barlow, co-founder of the Electronic Frontier Foundation and privacy activist, writes: "I can see what the Guardian Class is worried about... It is imaginable that, with the widespread use of digital cash and encrypted monetary exchange on the Global Net, economies the size of America's could appear as nothing but oceans of alphabet soup. Money laundering would no longer be necessary. The payment of taxes might become more or less voluntary. A lot of weird things might happen after that... I can imagine bogeymen whose traffic I'd want visible to authority." At this stage of the government's argument, we veer dangerously close to an either-or fallacy--a false choice between the Clipper Chip and anarchy. Unfortunately, the Clipper Chip is not likely to have any measurable impact on crime, and may actually encourage lawbreaking.
Remember, encryption using Clipper technology is voluntary. Citizens can still employ any form of cryptography they wish. Ray Kammer, one of the Clipper Chip's designers, admits, "It's obvious that anyone who uses Clipper for the conduct of organized crime is dumb."
But aren't the practitioners of organized crime among the smartest criminals? Is anyone stupid enough to use the Clipper Chip to organize crime going to succeed anyway? Perhaps Kammer should have made an additional statement: It's obvious that anyone who uses Clipper for the prevention of organized crime is dumb.
It's bad enough to imagine that the Clipper Chip will be ineffective against the type of crime it purports to fight. However, this assessment might be overly optimistic. The Chip could actually lead to crime; the two escrow institutions are an obvious target for terrorists, hostile foreign governments, or anyone interested in breaching the security of the United States. This includes our country's own citizens--surely, any secret encryption algorithm (especially one drenched in controversy) extends an implicit challenge to the nation's legions of highly capable computer hackers.
As we ponder the disaster that would result if the escrow institutions were compromised, we might also attempt to ascertain the strength of the real and imagined enemies that create the "need" for the Clipper Chip--drug smugglers, terrorists, foreign armies, etc. John Perry Barlow examines the dangers to America: "... it seems that America's greatest health risks derive from drugs that are legal, a position the statistics overwhelmingly support. And then there's terrorism, to which we lost a total of two Americans in 1992; even with the World Trade Center bombing, only six in 1993... And the last time we got into a shooting match with another nation, we beat them by a kill ratio of about 2,300 to 1."
They don't seem like particularly formidable threats, but government is prepared to cripple much of America's business for their sake. This economic result of the Clipper Chip is alarmingly easy to forecast and even simpler to delineate.
First, government--hoping for a nearly universal implementation of the Chip--will intentionally subvert market forces with its massive buying power. The Clinton administration will "strongly encourage Federal agencies to require that the chip be placed in the equipment they buy." Once the Clipper Chip is the de facto government standard, all technology companies will be forced to install it in merchandise they sell to the government. The promulgation of this device--a device that is bad business incarnate, a device designed to perform its task poorly--will spread to the companies that deal with these companies, and so forth. Eventually, Clipper will occupy a significant majority of the encryption market. At some critical point, American businesses will make a decision. Since it is inefficient to produce two models of a product--one including the Clipper Chip, one without--and their largest customer requires the Chip, non-Clipper products will be phased out.
Although this scenario represents an unwarranted, selective manipulation of a supposedly free market by the government, its impact on domestic business might be slight.
It's when you try to export Clipper that you run into trouble. It's a bit of a tough sell, don't you think? The Chip is an expensive additional cost of any device that uses it. Moreover, this extra cost provides, if anything, a negative feature--the illusion of security. Not that this will have to be explained to foreign customers; as the Clipper attempts to sail overseas, it will be preceded by destructive waves of bad press. Products with the Clipper Chip won't sell--they can't sell--and billions of dollars will be lost. Good-bye, privacy. Hello, trade deficit.
It's all quite simple: the Clinton administration is willing to negate the right of Americans to a private conversation and stunt the progress of our most rapidly expanding industries, all for a stillborn technological baby. Unless, of course, they make it mandatory. The government could outlaw all forms of encryption but the Clipper Chip. Oh, brother. Big Brother.
It's not a paranoid idea--merely the logical destination of the course plotted by the Clipper's crew. Computer Professionals for Social Responsibility, a group by no means given to radical positions, forecasts a similar chain of events: "The administration maintains that Clipper will be a 'voluntary' standard outside of the government, but many industry observers question the reality of this claim. The government exerts enormous pressure in the marketplace, and it is unlikely that alternative means of encryption will remain viable. Further, the possibility of Clipper becoming mandatory at some time in the future is quite real given the underlying rationale for the system."
If a mandatory key-escrow system is ever established, we will all have to accept fear as the constant companion of telecommunication. If other, more secure types of encryption become illegal, the government could arrest you for transmitting unreadable messages. The means of secret communication, not just the communication itself, will become the target of law enforcement. It will be as if the government had outlawed whispering. It won't matter what I'm actually telling you--if I don't say it loudly enough, I'm a criminal.
Thus, no mere snapshot of the situation--no matter how candid or detailed--should resolve your opinion of the Clipper Chip. Even if the present proposal sounds fine to you, remember that we currently occupy just one point on an extremely disturbing curve. Acceptance of the Clipper Chip--a relatively small invasion of privacy--will almost certainly lead to further erosion of Constitutional guarantees in the future. This is not the reactionary kind of argument you might hear from the National Rifle Association ("First assault rifles, then all of our guns..."); my reasoning is perfectly logical. The Clipper Chip serves no useful function to anyone while it is voluntary, and inconveniences the government by stirring up controversy; therefore, we can anticipate a maneuver by those in power. Since the Clinton administration is interested in controlling cryptography (an interest demonstrated by Clipper's very proposal), it seems unlikely that the government will retreat in the direction of privacy. We have eliminated the possibilities of our opponent's entrenchment or retreat, and only one option remains: attack. This is why everyone who, after some education on the topic, finds the Clipper Chip a repugnant and scary idea--in other words, almost everyone--must unite against the idea while it is still somewhat vulnerable. It is not too late for the Clinton administration to reconsider its new policy. But time grows short; already, your future security has been demolished to clear the way for the electronic pavement of a twisted Information Highway. Construction begins soon.
Kenneth Ulrich is a free-lance writer interested in the ethics of emerging technologies. He lives in College Station, Texas.