
plementation and use of flexible cryptography easier, while frustrating
the efforts of cryptanalysts. For cryptanalysts, the computer improves
efficiency in analyzing encrypted messages and in building systems
to undermine cryptography, thus making it easier to exploit any flaw
in the cryptographers' creations.
Cryptosystems before the twentieth century required tedious man-
ual processing of messages, using code books to match what was written
to what was to be communicated, or perhaps a great deal of scratch pa-
per to perform the necessary text substitution and transposition. The
process of encrypting and decrypting messages essentially consisted of
taking a handwritten message, looking up the correct corresponding
symbol on a chart, and writing the symbol on the paper that would
actually be delivered to the recipient, who would in turn look at the
chart and convert the ciphertext back to the plaintext by hand, one
letter at a time.
Later systems like Enigma, though more convenient than the "old
way," were still cumbersome and slow. (Early Enigma promotional material
boasted that the machine could process 300 characters per minute.)

Though the internal mechanics were much more complicated, the user
of the Enigma might liken its operation to a typewriter where the keys
are randomly reassigned. The sender would type the letter according
to the keys written on the keyboard, knowing that when an A is struck,
a V, for example, will be written. The recipient will then need to know
the keyboard layout used by the sender in order to recognize that the
V in the message was created by striking the A key, and write "A" on a
scratch pad. Working letter by letter, the sender's message becomes
visible. Enigma handled this substitution work automatically, preventing
operators from needing scratch paper.
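
The fixed-keyboard analogy amounts to a simple substitution cipher. As a rough Python sketch of that idea (the substitution table below is hypothetical, and unlike a real Enigma, whose mapping changed with every keystroke, it never changes), the letter-for-letter lookup works like this:

    import string

    PLAIN = string.ascii_uppercase
    # Hypothetical fixed substitution table; note that A maps to V, echoing
    # the example above. Real Enigma rotors changed this mapping on every
    # key press.
    CIPHER = "VZBGXWDKQHJMEYPRSTOUAFICLN"

    ENCRYPT_TABLE = str.maketrans(PLAIN, CIPHER)
    DECRYPT_TABLE = str.maketrans(CIPHER, PLAIN)

    def encrypt(message: str) -> str:
        """Substitute each letter according to the fixed table."""
        return message.upper().translate(ENCRYPT_TABLE)

    def decrypt(ciphertext: str) -> str:
        """Reverse the substitution, letter by letter, as a recipient would."""
        return ciphertext.translate(DECRYPT_TABLE)

    print(encrypt("ATTACK"))            # VUUVBJ
    print(decrypt(encrypt("ATTACK")))   # ATTACK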
Now, with computers, recipients can often click a few buttons and
have huge amounts of enciphered information almost instantly turned
into the sender's original message.




Perhaps no one understood the challenge and opportunity that emerged
in the post-war era better than the researchers at IBM. In the 1950s
and 1960s, with its systems designed to handle the heaviest information
processing needs of both corporations and government agencies, IBM
had to give serious consideration to the handling of sensitive data.
One of the earliest applications for computers was in the handling of
government information, some of which was protected by law. Security
was just as much a requirement for early computer systems as the
ability to store and to process information accurately.
The trend to establish standards for data security in automated
information systems became an important issue for IBM and its cus-
tomers. The possibility of computerized records being abused was not
lost on Americans, who were fascinated with computers and technol-
ogy, but also worried about the implications of their use in society. One
of the key figures in helping IBM realize a workable, powerful security
scheme was a German émigré by the name of Horst Feistel. Feistel had
arrived in the United States decades earlier, in 1934. Despite his interest
in cryptography, he avoided working in the field during World War II
so as not to arouse the suspicion of the American authorities.
After the war, Feistel found employment at the U.S. Air Force Cambridge
Research Center, where he worked on identification friend-or-foe (IFF)
systems. IFF systems were (and still are) used on the battlefield to
avoid "friendly fire" incidents, where forces attack allied units instead
of the enemy. Radar systems with IFF capability, for example, report
not only the position of units in range, but whether they are friendly
or hostile, thanks to the use of cryptography.
In the middle of the twentieth century, the highly secretive U.S. National
Security Agency (NSA) had a virtual monopoly on cryptographic
research and was trying hard to maintain it. Feistel's Air Force project
was canceled; though details are shrouded in military secrecy, NSA is
generally credited with ensuring its hasty demise.
Feistel attempted to continue his work at Mitre Corporation in the
1960s, but again ran afoul of NSA's plans. Dependent on Department
of Defense contracts, Mitre had little choice but to ask Feistel to direct
his energies elsewhere, presumably also at NSA's behest.
Determined to apply his hard-earned expertise in cryptography,
Feistel joined IBM before 1970, where he was finally free to continue
his work, and headed up a research project known as Lucifer. The goal
of Lucifer was to develop cryptographic systems for use in commercial
products that would address the growing need for data security. IBM
consequently was able to offer clients a means of protecting data stored
in its computers.
Commercial users of computers were finally seeing the need to
protect electronic information in their care, and an explosion began
in the commercial availability of cryptographic products. In the late
1960s, fewer than five companies were offering cryptographic products,
but by the early 1970s, more than 150 companies were active in the
marketplace, and more than fifty of them were from outside of the U.S.
During this time, Feistel published an article in Scientific American,
describing cryptography and how it relates to protecting private information
in computers. Although much of the article focused on cipher
machines of the sort that were used in World War II, it also contained
some descriptions of mechanisms by which computer software could encrypt
information. Those methods, known as Feistel Networks, are the basis of
many cryptosystems today.
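
A Feistel Network splits each block of data into two halves and repeatedly mixes them with a keyed round function; running the rounds with the keys in reverse order undoes the encryption, even when the round function itself cannot be inverted. The Python sketch below uses a deliberately toy round function and made-up round keys (it is not Lucifer's or DES's round function), purely to show that structure:

    def toy_round_function(half: int, round_key: int) -> int:
        """Deliberately simple keyed mixing step, not Lucifer's or DES's."""
        return ((half * 31) ^ round_key) & 0xFFFF   # stay within 16 bits

    def feistel_encrypt(block: int, round_keys) -> int:
        """Encrypt a 32-bit block as two 16-bit halves."""
        left, right = (block >> 16) & 0xFFFF, block & 0xFFFF
        for k in round_keys:
            left, right = right, left ^ toy_round_function(right, k)
        return (left << 16) | right

    def feistel_decrypt(block: int, round_keys) -> int:
        """Decrypt by running the same structure with the keys reversed."""
        left, right = (block >> 16) & 0xFFFF, block & 0xFFFF
        for k in reversed(round_keys):
            right, left = left, right ^ toy_round_function(left, k)
        return (left << 16) | right

    round_keys = [0x1A2B, 0x3C4D, 0x5E6F, 0x7081]   # made-up round keys
    ciphertext = feistel_encrypt(0xDEADBEEF, round_keys)
    assert feistel_decrypt(ciphertext, round_keys) == 0xDEADBEEF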
Because the government kept its cryptographic technology under
lock and key, commercial cryptographers could only guess at what
their counterparts within government research facilities like NSA had
achieved. These commercial cryptographers began with the fragments
that could be assembled from historical literature and began to lay the
foundation for open (i.e., not secret) cryptologic research.
At this time, though, very little was understood about how well
various cryptographic techniques could withstand analysis. For exam-
ple, one might believe that an encrypted message would be twice as
resistant to analysis if encrypted twice. Only after years of research did
cryptographers come to realize that for many kinds of ciphers, dou-
ble encryption is no stronger than single encryption. Many questions
played into a system's strength. How strong would a rotor-based
system be if it used four rotors instead of three? How strong is strong
enough? How strong is a rotor-based machine by comparison
with an encryption system implemented entirely in software?
In the early 1970s, no one outside of government cryptology knew
the answers to questions like these, and it would be years before
sufficient work in the field would be done to find answers. Thus, the
availability of cryptographic products was of little help: people simply
didn't know how good any of it was, and making meaningful comparisons
was impossible. Even worse, no two vendors could agree on a
system, requiring that both sender and receiver use the same equipment.
It would be like buying a Ford only to discover that the nearest
gas station sold only fuel that worked in Chrysler cars.
Knowing that information needed to be protected, computer system
managers had little choice but to buy something and hope for the best.
3
Data Encryption Standard




In the United States, the National Bureau of Standards (NBS) began
undertaking an effort aimed at protecting communications data. As
part of the Department of Commerce, NBS had an interest in ensuring
that both its own systems and those of the commercial entities with
which it dealt were adequately protecting the information under their
stewardship.
The NBS effort included the establishment of a single standard for
data encryption, which would allow products to be tested and certified
for compliance. The establishment of a single standard would solve
three major problems in the chaotic encryption marketplace. First,
products compliant with the standard would have to meet security
specifications established by experts in cryptography; individual amateurish
efforts at merely obfuscating information would not pass muster. Second,
compliant products from different vendors would be able to work
with one another, allowing senders and recipients to buy from the ven-
dors of their choosing. And third, the tremendous costs incurred by
vendors in the creation of cryptographic systems could be reduced,
since they would be able to focus on making the systems convenient to
use, rather than spending huge amounts of money on development of
the cryptographic underpinnings.
Requirements for the standard cryptographic algorithm (the definition
of the series of steps needed to turn plaintext into ciphertext and
back again) were published in the Federal Register. Among the requirements
were a high level of security, complete and open specification,
flexibility to support many different kinds of applications, efficiency,
and exportability to the global marketplace.




NBS received many responses, though it ultimately determined that
none of the algorithms submitted satisfied all of these requirements.
Despite this apparent setback, NBS did not consider the effort to be
a complete loss since it demonstrated that there was a substantial
interest in cryptography outside of military circles. The large number of
responses, in and of itself, was taken as a firm and positive step in the
right direction.




NBS published a second request in the Federal Register on August 27,
1974. Once again, several serious submissions were made. Some were
too specialized for the purposes NBS envisioned. Others were ineffective.
One, however, showed great potential.
IBM's Lucifer project had an algorithm, simply named "Lucifer,"
that was already in the latter stages of its development. IBM submitted
a variation of the algorithm, one with a 112-bit key, to NBS.
Before the significance of the 112-bit key can be fully appreciated, it
is important to note that modern computers are binary. That is, they
store and process data in bits, the binary units Claude E. Shannon
described in 1948. Anything with two settings can be used to represent
bits. Consider a light bulb. It has two settings and two settings only:
on and off.
All data in binary computers are represented in terms of bits, which
are represented as 0 or 1. Absolutely everything, to be stored into a
computer, must ultimately be represented with these two, and only
these two, digits.
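
For example, the letter "A" in the common ASCII encoding is stored as the number 65, which a binary computer holds as the eight bits 01000001. A few lines of Python show both views of the same value:

    value = ord("A")             # the letter "A" is stored as the number 65
    print(value)                 # 65
    print(format(value, "08b"))  # 01000001 -- the same value, written as bits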
The easiest way to grasp the security of algorithms like IBM's Lucifer
is to imagine a simple bicycle tumbler lock. Usually, such locks
are made up of four or five tumblers, each with ten positions, labeled 0
through 9. In digital computers, however, a cryptosystem with a 112-bit
key is like having a lock with 112 tumblers, each with two settings,
0 and 1.
IBM's algorithm therefore had a total of 2^112 possible settings,
only one of which was the "key" to the system, the equivalent of the
setting of a bicycle lock which would allow its opening. Seeing that
number written out (5,192,296,858,534,827,628,530,496,329,220,096)
shows why scientists prefer to use exponents when talking about large
numbers. The difference is even more pronounced (pardon the pun)
when you hear the numbers spoken. "One hundred twelve bit" is much
easier to say than "five decillion one hundred ninety-two nonillion two
hundred ninety-six octillion eight hundred fifty-eight septillion five hundred
thirty-four sextillion eight hundred twenty-seven quintillion six
hundred twenty-eight quadrillion five hundred thirty trillion four hundred
ninety-six billion three hundred twenty-nine million two hundred
twenty thousand ninety-six." Such a vast number of possible solutions
made the Lucifer algorithm a powerful means to protect information,
satisfying two important NBS criteria at once: high security and security
coming from the key.
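
Any language with arbitrary-precision integers can confirm the figure quoted above; a few lines of Python also make the case for exponent notation:

    key_space = 2 ** 112        # possible settings of a 112-bit key
    print(f"{key_space:,}")     # 5,192,296,858,534,827,628,530,496,329,220,096
    print(len(str(key_space)))  # 34 digits, versus the few characters of 2^112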
NBS saw IBM's submission as promising, but it had a serious
problem: the algorithm was covered by some IBM patents, which ruled
out interoperability. IBM agreed to work out rights for the patents,
such that even competitors would have the ability to produce systems
that implemented the algorithm without the need to pay IBM licens-
ing fees. Once this legal obstacle was removed, NBS went to work on
evaluation of the system itself.
Lacking a staff with its own cryptographic expertise, NBS turned to
the greatest source of cryptographic expertise known to exist (in other
words, NSA) for help in evaluating the strength of the Lucifer
algorithm. After careful analysis, NSA proposed two significant changes.
The first was a change in the algorithm's S-boxes. S-boxes are the
part of the algorithm that controls how the data are permuted as they
move from step to step along the process of being converted from the
readable message to the encrypted result (or vice versa), much like the
rotors of Enigma.
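
In practice an S-box is a fixed lookup table: a few bits of the data select an entry, and that entry's bits take their place. The real DES S-boxes map six input bits to four output bits; the 4-bit table below is made up purely to show the mechanics of the lookup and is not one of DES's actual tables:

    # A made-up 4-bit S-box: the input value (0-15) selects the output value.
    TOY_SBOX = [0x7, 0xC, 0x1, 0xF, 0x4, 0xA, 0xD, 0x2,
                0x0, 0x9, 0xE, 0x6, 0xB, 0x3, 0x8, 0x5]

    def substitute(nibble: int) -> int:
        """Replace a 4-bit value with the table entry it selects."""
        return TOY_SBOX[nibble & 0xF]

    print(substitute(0b0110))   # input 6 (0110) -> output 13 (1101)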
The second, and more drastic, was the reduction of key length from
112 to 56 bits. This recommendation came as a result of debate inside
of NSA. While the code-making part of NSA wanted to produce a
standard that was strong and could protect U.S. interests, the code-
breaking part of NSA was concerned that if the standard were too
strong, it could be used by foreign governments to undermine NSA's
foreign signal intelligence efforts. Ultimately, 56 bits was the key size
that won out as those two concerns were balanced.7
The difference in key size is significant. Because these "tumblers"
are binary, we are working with a base of 2. That means that each digit
added to the key doubles the key strength; that is, it doubles the number
of possible settings, only one of which is the key to unlocking the
encrypted message. Consider Table 1.

Power    Conventional Notation
2^1      2
2^2      4
2^3      8
2^4      16
2^5      32
2^9      512
2^56     72,057,594,037,927,936
2^112    5,192,296,858,534,827,628,530,496,329,220,096
2^128    340,282,366,920,938,463,463,374,607,431,768,211,456

Table 1. Powers of Two
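
A short calculation drives home what the table shows: each added bit doubles the number of possible keys, so the move from 56 bits to 112 bits multiplies the key space by 2^56, not by two.

    keys_56 = 2 ** 56
    keys_112 = 2 ** 112
    print(2 ** 57 // keys_56)    # 2 -- one extra bit doubles the key space
    print(keys_112 // keys_56)   # 72057594037927936, a 2^56-fold increase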




The key of IBM's original cipher would be not just double or triple
