the efforts of cryptanalysts. For cryptanalysts, the computer improves
the efficiency of analyzing encrypted messages and of building systems
to undermine cryptography, thus making it easier to exploit any flaw
in the cryptographers' creations.
Cryptosystems before the twentieth century required tedious man-
ual processing of messages, using code books to match what was written
to what was to be communicated, or perhaps a great deal of scratch pa-
per to perform the necessary text substitution and transposition. The
process of encrypting and decrypting messages essentially consisted of
taking a handwritten message, looking up the correct corresponding
symbol on a chart, and writing the symbol on the paper that would
actually be delivered to the recipient, who would in turn look at the
chart and convert the ciphertext back to the plaintext by hand, one
letter at a time.
Later systems like Enigma, though more convenient than the “old
way,” were still cumbersome and slow. (Early Enigma promotional mate-
rial boasted that the machine could process 300 characters per minute.)
Though the internal mechanics were much more complicated, the user
of the Enigma might liken its operation to a typewriter where the keys
are randomly reassigned. The sender would type the letter according
to the keys written on the keyboard, knowing that when an A is struck,
a V, for example, will be written. The recipient will then need to know
the keyboard layout used by the sender in order to recognize that the
V in the message was created by striking the A key, and write “A” on a
scratch pad. Working letter by letter, the sender's message becomes vis-
ible. Enigma handled this substitution work automatically, sparing
operators the need for scratch paper.
Now, with computers, recipients can often click a few buttons and
have huge amounts of ciphertext almost instantly turned back
into the sender's original message.
Perhaps no one understood the challenge and opportunity that emerged
in the post-war era better than the researchers at IBM. In the 1950s
and 1960s, with its systems designed to handle the heaviest information
processing needs of both corporations and government agencies, IBM
had to give serious consideration to the handling of sensitive data.
One of the earliest applications for computers was in the handling of
government information, some of which was protected by law. Security
was just as much a requirement for early computer systems as the
ability to store and to process information accurately.
The trend to establish standards for data security in automated
information systems became an important issue for IBM and its cus-
tomers. The possibility of computerized records being abused was not
lost on Americans, who were fascinated with computers and technol-
ogy, but also worried about the implications of their use in society. One
of the key figures in helping IBM realize a workable, powerful security
scheme was a German émigré by the name of Horst Feistel. Feistel had
arrived in the United States decades earlier, in 1934. Despite his inter-
est in cryptography, he stayed out of the field during World War II
so as not to draw the suspicion of the American authorities.
After the war, Feistel found employment at the U.S. Air Force Cam-
bridge Research Center, where he worked on identification friend-or-foe
(IFF) systems. IFF systems were (and still are) used on the battlefield to
avoid “friendly fire” incidents, where forces attack allied units instead
of the enemy. Radar systems with IFF capability, for example, report
not only the position of units in range, but whether they are friendly
or hostile, thanks to the use of cryptography.
In the middle of the twentieth century, the highly secretive U.S. Na-
tional Security Agency (NSA) had a virtual monopoly on cryptographic
research and was trying hard to maintain it. Feistel's Air Force project
was canceled; though details are shrouded in military secrecy, NSA is
generally credited with ensuring its hasty demise.
Feistel attempted to continue his work at Mitre Corporation in the
1960s, but again ran afoul of NSA's plans. Dependent on Department
of Defense contracts, Mitre had little choice but to ask Feistel to direct
his energies elsewhere, presumably also at NSA's behest.
Determined to apply his hard-earned expertise in cryptography,
Feistel joined IBM before 1970, where he was finally free to continue
his work, and headed up a research project known as Lucifer. The goal
of Lucifer was to develop cryptographic systems for use in commercial
products that would address the growing need for data security. IBM
consequently was able to offer clients a means of protecting data stored
in its computers.
Commercial users of computers were finally seeing the need to
protect electronic information in their care, and an explosion began
in the commercial availability of cryptographic products. In the late
1960s, fewer than five companies were offering cryptographic products,
but by the early 1970s, more than 150 companies were active in the
marketplace, and more than fifty of them were from outside of the
United States.
During this time, Feistel published an article in Scientific American,
describing cryptography and how it relates to protecting private infor-
mation in computers. Although much of the article focused on cipher
machines of the sort that were used in World War II, it also contained
descriptions of mechanisms by which computer software could encrypt
information. Those methods, known as Feistel Networks, are the basis of
many cryptosystems today.
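The construction Feistel described can be sketched in a few lines. The round function and keys below are invented stand-ins chosen purely for illustration (they are not Lucifer's, or any real cipher's); the point of the sketch is structural: running the same loop with the round keys reversed decrypts whatever it encrypted, no matter what round function is plugged in.

```python
# Minimal Feistel network sketch. The block is split into left and right
# halves; each round mixes the right half (via a round function F) into
# the left half, then swaps. Decryption is the identical loop with the
# round keys in reverse order -- F itself never needs to be invertible.

def feistel(block: bytes, round_keys, F):
    half = len(block) // 2
    left, right = block[:half], block[half:]
    for k in round_keys:
        left, right = right, bytes(l ^ f for l, f in zip(left, F(right, k)))
    return right + left  # undo the final swap by concatenating in this order

def toy_F(half: bytes, key: int) -> bytes:
    # A deliberately weak round function, for demonstration only.
    return bytes(((b * 31) + key) & 0xFF for b in half)

keys = [0x1F, 0x2E, 0x3D, 0x4C]
plaintext = b"SECRETS!"
ciphertext = feistel(plaintext, keys, toy_F)
recovered = feistel(ciphertext, list(reversed(keys)), toy_F)
assert recovered == plaintext
```

The same symmetry is what made the design attractive for hardware: one circuit serves for both encryption and decryption.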
Because the government kept their cryptographic technology un-
der lock and key, commercial cryptographers could only guess at what
their counterparts within government research facilities like NSA had
achieved. These commercial cryptographers began with the fragments
that could be assembled from historical literature and began to lay the
foundation for open (i.e., not secret) cryptologic research.
At this time, though, very little was understood about how well
various cryptographic techniques could withstand analysis. For exam-
ple, one might believe that an encrypted message would be twice as
resistant to analysis if encrypted twice. Only after years of research did
cryptographers come to realize that for many kinds of ciphers, dou-
ble encryption is no stronger than single encryption. Many questions
played into a system's strength. How strong would a rotor-based sys-
tem be if it used four rotors instead of three? How strong is strong
enough? How strong is a rotor-based machine system by comparison
with an encryption system implemented entirely in software?
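The point about double encryption can be made concrete with the simplest possible cipher. The toy scheme below (a one-byte XOR cipher, invented here purely for illustration) is an extreme case: encrypting twice with two keys is exactly equivalent to encrypting once with a third key, so the second pass adds no strength at all. Real ciphers are rarely this degenerate, but meet-in-the-middle attacks later showed a related weakness in double encryption for many block ciphers.

```python
# Toy cipher: XOR every byte with a one-byte key. XOR ciphers compose
# into another XOR cipher, so double encryption collapses to single
# encryption under the combined key k1 ^ k2.

def xor_encrypt(data: bytes, key: int) -> bytes:
    return bytes(b ^ key for b in data)

msg = b"attack at dawn"
k1, k2 = 0x5A, 0xC3

double = xor_encrypt(xor_encrypt(msg, k1), k2)
single = xor_encrypt(msg, k1 ^ k2)
assert double == single  # two passes are no stronger than one
```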
In the early 1970s, no one outside of government cryptology knew
the answers to questions like these, and it would be years before suf-
ficient work in the field would be done to find answers. Thus, the
availability of cryptographic products was of little help; people simply
didn't know how good any of it was, and making meaningful compar-
isons was impossible. Even worse, no two vendors could agree on a
system, requiring that both sender and receiver use the same equip-
ment. It would be like buying a Ford only to discover that the nearest
gas station sold only fuel that worked in Chrysler cars.
Knowing that information needed to be protected, computer system
managers had little choice but to buy something and hope for the best.
Data Encryption Standard
In the United States, the National Bureau of Standards (NBS) began
undertaking an effort aimed at protecting communications data. As
part of the Department of Commerce, NBS had an interest in ensuring
that both its own systems and those of the commercial entities with
which it dealt were adequately protecting the information under their
care.
The NBS effort included the establishment of a single standard for
data encryption, which would allow products to be tested and certified
for compliance. The establishment of a single standard would solve
three major problems in the chaotic encryption marketplace. First,
products compliant with the standard would have to meet security spec-
ifications established by experts in cryptography; individual amateurish
efforts at merely obfuscating information would not pass muster. Sec-
ond, compliant products from different vendors would be able to work
with one another, allowing senders and recipients to buy from the ven-
dors of their choosing. And third, the tremendous costs incurred by
vendors in the creation of cryptographic systems could be reduced,
since they would be able to focus on making the systems convenient to
use, rather than spending huge amounts of money on development of
the cryptographic underpinnings.
Requirements for the standard cryptographic algorithm, the defi-
nition of the series of steps needed to turn plaintext into ciphertext and
back again, were published in the Federal Register. Among the require-
ments were a high level of security, complete and open specification,
flexibility to support many different kinds of applications, efficiency,
and exportability to the global marketplace.
NBS received many responses, though it ultimately determined that
none of the algorithms submitted satisfied all of these requirements.
Despite this apparent setback, NBS did not consider the effort to be
a complete loss, since it demonstrated that there was substantial in-
terest in cryptography outside of military circles. The large number of
responses, in and of itself, was taken as a firm and positive step in the
right direction.
NBS published a second request in the Federal Register on August 27,
1974. Once again, several serious submissions were made. Some were
too specialized for the purposes NBS envisioned. Others were ineffec-
tive. One, however, showed great potential.
IBM's Lucifer project had an algorithm, simply named “Lucifer,”
that was already in the latter stages of its development. IBM submitted
a variation of the algorithm, one with a 112-bit key, to NBS.
Before the significance of the 112-bit key can be fully appreciated, it
is important to note that modern computers are binary. That is, they
store and process data in bits, the binary units Claude E. Shannon
described in 1948. Anything with two settings can be used to represent
bits. Consider a light bulb. It has two settings and two settings only:
on and off.
All data in binary computers are represented in terms of bits, which
are represented as 0 or 1. Absolutely everything, to be stored into a
computer, must ultimately be represented with these two, and only
these two, digits.
The easiest way to grasp the security of algorithms like IBM's Lu-
cifer is to imagine a simple bicycle tumbler lock. Usually, such locks
are made up of four or five tumblers, each with ten positions, labeled 0
through 9. In digital computers, however, a cryptosystem with a 112-
bit key is like having a lock with 112 tumblers, each with two settings,
0 and 1.
IBM's algorithm therefore had a total of 2¹¹² possible settings,
only one of which was the “key” to the system, the equivalent of the
setting of a bicycle lock which would allow its opening. Seeing that
number written out, 5,192,296,858,534,827,628,530,496,329,220,096,
shows why scientists prefer to use exponents when talking about large
numbers. The difference is even more pronounced (pardon the pun)
when you hear the numbers spoken. “One hundred twelve bits” is much
easier to say than “five decillion one hundred ninety-two nonillion two
hundred ninety-six octillion eight hundred fifty-eight septillion five hun-
dred thirty-four sextillion eight hundred twenty-seven quintillion six
hundred twenty-eight quadrillion five hundred thirty trillion four hun-
dred ninety-six billion three hundred twenty-nine million two hundred
twenty thousand ninety-six.” Such a vast number of possible solutions
made the Lucifer algorithm a powerful means to protect information,
satisfying two important NBS criteria at once: high security and secu-
rity coming from the key.
NBS saw IBMā™s submission as promising, but it had a serious
problem: the algorithm was covered by some IBM patents, which ruled
out interoperability. IBM agreed to work out rights for the patents,
such that even competitors would have the ability to produce systems
that implemented the algorithm without the need to pay IBM licens-
ing fees. Once this legal obstacle was removed, NBS went to work on
evaluation of the system itself.
Lacking a staff with its own cryptographic expertise, NBS turned to
the greatest source of cryptographic expertise known to exist, in other
words, NSA, for help in evaluating the strength of the Lucifer algo-
rithm. After careful analysis, NSA proposed two significant changes.
The first was a change in the algorithm's S-boxes. S-boxes are the
parts of the algorithm that control how the data are permuted as they
move from step to step along the process of being converted from the
readable message to the encrypted result (or vice versa), much like the
rotors of Enigma.
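At bottom, an S-box is a lookup table. The 4-bit table below is invented for this sketch (it is not one of Lucifer's or DES's actual S-boxes); it simply shows how a substitution step replaces each small group of bits with another, and how an inverse table undoes the substitution:

```python
# A hypothetical 4-bit S-box: each value 0..15 maps to another value
# 0..15. Substitution is a table lookup; decryption uses the inverse table.

SBOX = [0x6, 0x4, 0xC, 0x5, 0x0, 0x7, 0x2, 0xE,
        0x1, 0xF, 0x3, 0xD, 0x8, 0xA, 0x9, 0xB]

# Build the inverse table: if SBOX maps i to v, INV_SBOX maps v back to i.
INV_SBOX = [0] * 16
for i, v in enumerate(SBOX):
    INV_SBOX[v] = i

def substitute(nibbles, table):
    return [table[n] for n in nibbles]

data = [0x0, 0x3, 0xA, 0xF]
subbed = substitute(data, SBOX)           # [0x6, 0x5, 0x3, 0xB]
assert substitute(subbed, INV_SBOX) == data
```

The security of a design like Lucifer depends heavily on which table is chosen, which is why NSA's proposed change to the S-boxes mattered.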
The second, and more drastic, was the reduction of key length from
112 to 56 bits. This recommendation came as a result of debate inside
of NSA. While the code-making part of NSA wanted to produce a
standard that was strong and could protect U.S. interests, the code-
breaking part of NSA was concerned that if the standard were too
strong, it could be used by foreign governments to undermine NSAā™s
foreign signal intelligence efforts. Ultimately, 56 bits was the key size
that won out as those two concerns were balanced.7
The difference in key size is significant. Because we're talking about
“tumblers” that are binary here, we're working with a base of 2. That
means that each digit added to the key doubles the key strength, that
is, the number of possible settings, only one of which is the key to
unlocking the encrypted message. Consider Table 1.
Power   Conventional Notation
2⁵⁶     72,057,594,037,927,936
2¹¹²    5,192,296,858,534,827,628,530,496,329,220,096
2¹²⁸    340,282,366,920,938,463,463,374,607,431,768,211,456
Table 1. Powers of Two
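The growth in Table 1 is easy to reproduce. This short sketch computes the key-space sizes for the three key lengths and confirms that each additional bit doubles the number of possible keys:

```python
# An n-bit key selects one setting out of 2**n, so each extra bit
# doubles the size of the key space.

for bits in (56, 112, 128):
    print(f"2^{bits} = {2**bits:,}")

assert 2 ** 57 == 2 * 2 ** 56        # one more bit doubles the key space
assert 2 ** 112 == (2 ** 56) ** 2    # a 112-bit space is the 56-bit one squared
```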
The key of IBM's original cipher would be not just double or triple