Exhaustive key search and multiple encryption
A fixed-size key defines an upper bound on the security of a block cipher, due to exhaustive key search. While this attack requires either known plaintext or plaintext containing redundancy, it has widespread applicability since cipher operations are generally designed to be computationally efficient.
A design technique which complicates exhaustive key search is to make the task of changing cipher keys computationally expensive, while allowing encryption with a fixed key to remain relatively efficient. Examples of ciphers with this property include the block cipher Khufu and the stream cipher SEAL.
Fact (exhaustive key search) For an n-bit block cipher with a k-bit key, given a small number (e.g., ⌈(k + 4)/n⌉) of plaintext-ciphertext pairs encrypted under key K, K can be recovered by exhaustive key search in an expected time on the order of 2^(k-1) operations. Justification: progress through the entire key space, decrypting a fixed ciphertext C with each trial key, and discarding those keys which do not yield the known plaintext P. The target key is among the undiscarded keys. The number of false alarms expected (non-target keys which map C to P) depends on the relative size of k and n, and follows from unicity distance arguments; additional (P', C') pairs suffice to discard false alarms. One expects to find the correct key after searching half the key space.
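The search in the Fact above can be sketched concretely. The 8-bit block cipher below (add-and-rotate rounds) is an invented toy for illustration, not a real cipher; with k = n = 8, the full key space of 2^8 keys is searched and false alarms are discarded with a second pair.

```python
# Toy 8-bit block cipher with an 8-bit key (invented illustration only).
def rotl8(x, r):
    return ((x << r) | (x >> (8 - r))) & 0xFF

def enc(k, p):
    x = p
    for r in range(4):
        x = (x + (k ^ r)) & 0xFF   # round-key addition
        x = rotl8(x, 3)            # diffusion via rotation
    return x

def dec(k, c):
    x = c
    for r in reversed(range(4)):
        x = rotl8(x, 5)            # inverse of a left-rotate by 3
        x = (x - (k ^ r)) & 0xFF   # inverse round-key addition
    return x

def exhaustive_search(pairs):
    """Return all keys consistent with the given (P, C) pairs."""
    return [k for k in range(256)
            if all(enc(k, p) == c for p, c in pairs)]

secret = 0x4B
pairs = [(p, enc(secret, p)) for p in (0x00, 0x37)]  # two known pairs
candidates = exhaustive_search(pairs)
assert secret in candidates
```

The second known pair plays the role of the additional (P', C') pairs in the justification: it filters out almost all false alarms that a single pair would leave.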
Example (exhaustive DES key search) For DES, k = 56, n = 64, and the expected requirement is 2^55 decryptions and a single plaintext-ciphertext pair. If the underlying plaintext is known to contain redundancy, then ciphertext-only exhaustive key search is possible with a relatively small number of ciphertexts.
Example (ciphertext-only DES key search) Suppose DES is used to encrypt 64-bit blocks of 8 ASCII characters each, with one bit per character serving as an even parity bit. Trial decryption with an incorrect key K yields all 8 parity bits correct with probability 2^(-8), and correct parity for t different blocks (each encrypted by K) with probability 2^(-8t). If this is used as a filter over all 2^56 keys, the expected number of unfiltered incorrect keys is 2^56/2^(8t). For most practical purposes, t = 10 suffices.
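The filtering arithmetic in this example can be checked directly; the loop below simply evaluates 2^56/2^(8t) for a few values of t (plain arithmetic, no DES involved):

```python
# Expected number of incorrect keys surviving the parity filter:
# each of the 2^56 wrong keys passes t blocks' parity checks with
# probability 2^(-8t), so about 2^56 / 2^(8t) survive on average.
for t in (1, 5, 7, 10):
    surviving = 2**56 / 2**(8 * t)
    print(f"t = {t:2d}: ~{surviving:.3g} false keys survive")
```

At t = 7 the expected count drops to 1, and at t = 10 it is about 2^(-24), which is why t = 10 suffices for most practical purposes.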
1. Cascades of ciphers and multiple encryption
If a block cipher is susceptible to exhaustive key search (due to inadequate key length), encipherment of the same message block more than once may increase security. Various such techniques for multiple encryptions of n-bit messages are considered here. Once defined, they may be extended to messages exceeding one block by using standard modes of operation, with E denoting multiple rather than single encryption.
A cascade cipher is the concatenation of L ≥ 2 block ciphers (called stages), each with independent keys. Plaintext is input to the first stage; the output of stage i is input to stage i + 1; and the output of stage L is the cascade's ciphertext output. In the simplest case, all stages in a cascade cipher have k-bit keys, and the stage inputs and outputs are all n-bit quantities. The stage ciphers may differ (general cascade of ciphers), or all be identical (cascade of identical ciphers).
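The stage chaining just described is simple function composition. The sketch below wires L stages together; the byte-level stage ciphers are invented placeholders, chosen only so the chaining is visible:

```python
from functools import reduce

def cascade(stages, keys, p):
    """Apply stage i's cipher under keys[i]; the output of stage i
    is the input to stage i + 1 (stages and keys are parallel lists)."""
    return reduce(lambda x, sk: sk[0](sk[1], x), zip(stages, keys), p)

# Toy stage ciphers on 8-bit blocks (placeholders, not real ciphers).
add = lambda k, x: (x + k) & 0xFF
xor = lambda k, x: x ^ k

# A two-stage general cascade: modular add, then XOR.
assert cascade([add, xor], [0x21, 0x5A], 0x10) == ((0x10 + 0x21) & 0xFF) ^ 0x5A
```

Passing the same cipher L times gives a cascade of identical ciphers; passing distinct callables gives a general cascade.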
Multiple encryption is similar to a cascade of L identical ciphers, but the stage keys need not be independent, and the stage ciphers may be either a block cipher E or its corresponding decryption function D = E^(-1).
Two important cases of multiple encryption are double and triple encryption.
Double encryption is defined as E(x) = E_K2(E_K1(x)), where E_K denotes a block cipher E with key K.
Independent stage keys K1 and K2 are typically used in double encryption. In triple encryption, dependent stage keys are often used to save on key management and storage costs; a common choice is E-D-E triple encryption, E(x) = E_K3(D_K2(E_K1(x))). E-D-E triple encryption with K1 = K2 = K3 is backwards compatible with (i.e., equivalent to) single encryption.
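Using the same kind of invented toy 8-bit cipher as before (not DES), double encryption and E-D-E triple encryption compose as follows; the final assertion checks the backward-compatibility property stated above.

```python
# Toy 8-bit block cipher (invented illustration only).
def rotl8(x, r):
    return ((x << r) | (x >> (8 - r))) & 0xFF

def E(k, p):
    x = p
    for r in range(4):
        x = rotl8((x + (k ^ r)) & 0xFF, 3)
    return x

def D(k, c):
    x = c
    for r in reversed(range(4)):
        x = (rotl8(x, 5) - (k ^ r)) & 0xFF
    return x

def double_enc(k1, k2, p):
    return E(k2, E(k1, p))            # E(x) = E_K2(E_K1(x))

def triple_ede(k1, k2, k3, p):
    return E(k3, D(k2, E(k1, p)))     # E-D-E construction

# With K1 = K2 = K3 = K, the middle D undoes the first E, so
# E-D-E collapses to single encryption under K.
k, p = 0x5A, 0x12
assert triple_ede(k, k, k, p) == E(k, p)
```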
2. Meet-in-the-middle attacks on multiple encryption
A naive exhaustive key search attack on double encryption tries all 2^(2k) key pairs. The meet-in-the-middle attack reduces the time from 2^(2k) to on the order of 2^k, at the cost of substantial space.
Fact For a block cipher with a k-bit key, a known-plaintext meet-in-the-middle attack defeats double encryption using on the order of 2^k operations and 2^k storage.
Justification (basic meet-in-the-middle): given a (P, C) pair, compute M_i = E_i(P) under all 2^k possible key values K1 = i; store all pairs (M_i, i), sorted or indexed on M_i (e.g., using conventional hashing). Decipher C under all 2^k possible values K2 = j, and for each pair (M_j, j) where M_j = D_j(C), check for hits M_j = M_i against entries M_i in the first table. (This can be done by creating a second sorted table, or simply by checking each M_j entry as it is generated.) Each hit identifies a candidate solution key pair (i, j), since E_i(P) = M = D_j(C). Using a second known-plaintext pair (P', C'), discard candidate key pairs which do not map P' to C'. A concept analogous to unicity distance for ciphertext-only attack can be defined for known-plaintext key search, based on the following strategy: select a key; check if it is consistent with a given set (history) of plaintext-ciphertext pairs; if so, label the key a hit. A hit that is not the target key is a false key hit.
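A minimal sketch of this justification, using an invented 8-bit toy cipher so that each stage key space is only 2^8 (the real attack has the same structure, with 2^k-entry tables):

```python
# Toy 8-bit block cipher (invented illustration only).
def rotl8(x, r):
    return ((x << r) | (x >> (8 - r))) & 0xFF

def E(k, p):
    x = p
    for r in range(4):
        x = rotl8((x + (k ^ r)) & 0xFF, 3)
    return x

def D(k, c):
    x = c
    for r in reversed(range(4)):
        x = (rotl8(x, 5) - (k ^ r)) & 0xFF
    return x

def mitm(pairs):
    """Recover candidate (K1, K2) key pairs from known (P, C) pairs."""
    p0, c0 = pairs[0]
    # Forward table: store (M_i, i) for M_i = E_i(P), indexed on M_i.
    table = {}
    for i in range(256):
        table.setdefault(E(i, p0), []).append(i)
    # Backward pass: each M_j = D_j(C) hitting the table yields (i, j).
    cands = [(i, j) for j in range(256) for i in table.get(D(j, c0), [])]
    # Discard false alarms using the remaining (P', C') pairs.
    return [(i, j) for i, j in cands
            if all(E(j, E(i, p)) == c for p, c in pairs[1:])]

k1, k2 = 0x3C, 0xA7
pairs = [(p, E(k2, E(k1, p))) for p in (0x00, 0x55, 0xF0)]
assert (k1, k2) in mitm(pairs)
```

Note the cost profile matches the Fact: two passes of 2^8 cipher operations and a 2^8-entry table, rather than the 2^16 work of trying every key pair.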
The number of plaintext-ciphertext pairs required to uniquely determine a key under a known-plaintext key search is the known-plaintext unicity distance. This is the smallest integer t such that a history of length t makes false key hits improbable.
The (known-plaintext) unicity distance of a cascade of L random ciphers can be estimated: fewer than one false key hit is expected when t > Lk/n.
For an L-stage cascade of random block ciphers with n-bit blocks and k-bit keys, the expected number of false key hits for a history of length t is about 2^(Lk-tn). Here a random block cipher is defined as follows: given n and k, of the (2^n)! possible permutations on 2^n elements, choose 2^k randomly and with equal probability, and associate these with the 2^k keys.
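The estimate 2^(Lk-tn) and the bound t > Lk/n can be evaluated numerically; the DES-like parameters k = 56, n = 64 below are used for illustration only:

```python
# Expected false key hits for an L-stage cascade with history length t
# is about 2^(Lk - tn); fewer than one is expected once t > Lk/n.
def false_hits(L, k, n, t):
    return 2.0 ** (L * k - t * n)

k, n = 56, 64
for L in (1, 2, 3):
    t = L * k // n + 1   # smallest integer t with t > Lk/n (Lk/n not integral here)
    print(f"L={L}: t={t}, expected false hits ~ {false_hits(L, k, n, t):.3g}")
```

For L = 2 (double encryption with these parameters), t = 2 already drives the expectation down to 2^(-16), consistent with the use of a second known-plaintext pair in the meet-in-the-middle justification.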
3. Multiple-encryption modes of operation
In contrast to the single modes of operation, multiple modes are variants of multiple encryption constructed by concatenating selected single modes. For example, the combination of three single-mode CBC operations provides triple-inner-CBC; an alternative is triple-outer-CBC, the composite operation of triple encryption with one outer ciphertext feedback after the sequential application of three single-ECB operations. With replicated hardware, multiple modes such as triple-inner-CBC may be pipelined, allowing performance comparable to single encryption and offering an advantage over triple-outer-CBC. Unfortunately, these multiple modes are often less secure.
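The two constructions can be contrasted in a few lines. The single-byte "block cipher" below (plain XOR with the key) is purely illustrative, chosen only so the chaining structure is visible:

```python
def E(k, x):
    return x ^ k                       # toy 1-byte "block cipher" (illustration)

def cbc_encrypt(enc_fn, key, iv, blocks):
    out, prev = [], iv
    for b in blocks:                   # CBC feedback: XOR in previous ciphertext
        prev = enc_fn(key, b ^ prev)
        out.append(prev)
    return out

def triple_inner_cbc(keys, ivs, blocks):
    # three complete single-mode CBC passes, each with its own IV
    for k, iv in zip(keys, ivs):
        blocks = cbc_encrypt(E, k, iv, blocks)
    return blocks

def triple_outer_cbc(keys, iv, blocks):
    # one outer CBC feedback around triple-ECB (E-E-E here)
    triple = lambda _unused, x: E(keys[2], E(keys[1], E(keys[0], x)))
    return cbc_encrypt(triple, None, iv, blocks)

inner = triple_inner_cbc([1, 2, 4], [0, 0, 0], [0x10, 0x20])
outer = triple_outer_cbc([1, 2, 4], 0, [0x10, 0x20])
assert inner != outer   # the chaining structures differ even in this toy case
```

In triple-inner-CBC the feedback wraps each stage separately (which is what permits pipelining); in triple-outer-CBC a single feedback wraps the whole triple encryption, treated as a black box.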
Security of triple-inner-CBC. Many multiple modes of operation are weaker than the corresponding multiple-ECB mode (i.e., multiple encryption operating as a black box with only outer feedbacks), and in some cases multiple modes (e.g., ECB-CBC-CBC) are not significantly stronger than single encryption. In particular, under some attacks triple-inner-CBC is significantly weaker than triple-outer-CBC; against other attacks based on the block size, it appears stronger.
4. Cascade ciphers
Counter-intuitively, it is possible to devise examples whereby cascading of ciphers actually reduces security. However, the following result holds under a wide variety of attack models and meaningful definitions of "breaking".
A cascade of n (independently keyed) ciphers is at least as difficult to break as the first component cipher. Corollary: for stage ciphers which commute (e.g., additive stream ciphers), a cascade is at least as strong as the strongest component cipher. Neither result applies to product ciphers consisting of component ciphers which may have dependent keys (e.g., two-key triple-encryption); indeed, keying dependencies across stages may compromise security entirely, as illustrated by a two-stage cascade wherein the components are two binary additive stream ciphers using an identical keystream: in this case, the cascade output is the original plaintext.
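The identical-keystream pitfall is easy to demonstrate: with binary additive (XOR) stream ciphers, a second stage under the same keystream undoes the first. The keystream below is an arbitrary toy sequence:

```python
def xor_stream(data, keystream):
    """Binary additive stream cipher: XOR data with the keystream."""
    return bytes(d ^ k for d, k in zip(data, keystream))

keystream = bytes((17 * i + 3) % 256 for i in range(16))  # toy keystream
plaintext = b"attack at dawn!!"

stage1 = xor_stream(plaintext, keystream)   # first cascade stage
stage2 = xor_stream(stage1, keystream)      # second stage, same keystream
assert stage2 == plaintext                  # the cascade outputs the plaintext
```

With independent keystreams the two stages would compose to XOR with their sum, and the corollary above would apply instead.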
This suggests the following practical design strategy: cascade a set of keystream generators, each of which relies on one or more different design principles. It is not clear, however, whether this is preferable to a single large keystream generator which relies on a single principle. The cascade may turn out to be less secure for a fixed set of parameters (number of key bits, block size), since ciphers built piecewise may often be attacked piecewise.