Security II: Cryptography
Markus Kuhn, Computer Laboratory
Lent 2012 – Part II
http://www.cl.cam.ac.uk/teaching/1213/SecurityII/

Related textbooks

Jonathan Katz, Yehuda Lindell: Introduction to Modern Cryptography. Chapman & Hall/CRC, 2008
Christof Paar, Jan Pelzl: Understanding Cryptography. Springer, 2010
  http://www.springerlink.com/content/978-3-642-04100-6/
  http://www.crypto-textbook.com/
Douglas Stinson: Cryptography – Theory and Practice. 3rd ed., CRC Press, 2005
Menezes, van Oorschot, Vanstone: Handbook of Applied Cryptography. CRC Press, 1996
  http://www.cacr.math.uwaterloo.ca/hac/

Private-key (symmetric) encryption

A private-key encryption scheme is a tuple of probabilistic polynomial-time algorithms (Gen, Enc, Dec) and sets K, M, C such that

  the key generation algorithm Gen receives a security parameter ℓ and outputs a key K ← Gen(1^ℓ), with K ∈ K, key length |K| ≥ ℓ;
  the encryption algorithm Enc maps a key K and a plaintext message M ∈ M = {0,1}^m to a ciphertext message C ← Enc_K(M);
  the decryption algorithm Dec maps a key K and a ciphertext C ∈ C = {0,1}^n (n ≥ m) to a plaintext message M := Dec_K(C);
  for all ℓ, K ← Gen(1^ℓ), and M ∈ {0,1}^m: Dec_K(Enc_K(M)) = M.

Notes:
A "probabilistic algorithm" can toss coins (uniformly distributed, independent). Notation: ← assigns the output of a probabilistic algorithm, := that of a deterministic algorithm.
A "polynomial-time algorithm" has constants a, b, c such that the runtime is always less than a·ℓ^b + c if the input is ℓ bits long (think Turing machine).
Technicality: we supply the security parameter ℓ to Gen here in unary encoding (as a sequence of ℓ "1" bits: 1^ℓ), merely to remain compatible with the notion of "input size" from computational complexity theory. In practice, Gen usually simply picks ℓ random bits K ∈_R {0,1}^ℓ.

When is an encryption scheme "secure"?

If no adversary can ...
  ... find out the key K?
  ... find the plaintext message M?
  ... determine any character/bit of M?
  ... determine any information about M from C?
  ... compute any function of the plaintext M from ciphertext C?  ⇒ "semantic security"

Note: we explicitly do not worry here about the adversary being able to infer something about the length m of the plaintext message M by looking at the length n of the ciphertext C. Therefore, we consider for the following security definitions only messages of fixed length m. Variable-length messages can always be extended to a fixed length by padding, but this can be expensive. It will depend on the specific application whether the benefits of fixed-length padding outweigh the added transmission cost.
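The (Gen, Enc, Dec) syntax above maps directly onto code. The following minimal Python sketch is an illustration only (all names and parameter choices are mine, not from the slides): it instantiates Enc and Dec with a plain XOR against a key of the same length as the message, essentially the one-time pad recalled further below, purely to make the correctness condition Dec_K(Enc_K(M)) = M executable.

```python
import secrets

def Gen(l):
    # Key generation: pick an l-bit key uniformly at random, K in {0,1}^l
    return secrets.token_bytes(l // 8)

def Enc(K, M):
    # Placeholder encryption for |M| = |K|: bit-wise XOR
    # (essentially the one-time pad recalled on a later slide)
    return bytes(k ^ m for k, m in zip(K, M))

def Dec(K, C):
    # Deterministic decryption: XOR with the same key recovers M
    return bytes(k ^ c for k, c in zip(K, C))

# Correctness condition: Dec_K(Enc_K(M)) = M for every key and fixed-length message
l = 128                           # security parameter (bits), illustrative value
K = Gen(l)
M = secrets.token_bytes(l // 8)   # a random fixed-length test message, m = l
assert Dec(K, Enc(K, M)) == M
```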

What capabilities may the adversary have?

  unlimited / polynomial / realistic (≪ 2^80 steps) computation time?
  only access to ciphertext C?
  access to some plaintext/ciphertext pairs (M, C) with C ← Enc_K(M)?
  how many applications of K can be observed?
  ability to trick the user of Enc_K into encrypting some plaintext of the adversary's choice and return the result? ("oracle access" to Enc)
  ability to trick the user of Dec_K into decrypting some ciphertext of the adversary's choice and return the result? ("oracle access" to Dec)
  ability to modify or replace C en route? (not limited to eavesdropping)

Wanted: Clear definitions of what security of an encryption scheme means, to guide both designers and users of schemes, and allow proofs.

Recall: perfect secrecy, one-time pad

Definition: An encryption scheme (Gen, Enc, Dec) over a message space M is perfectly secret if for every probability distribution over M, every message M ∈ M, and every ciphertext C ∈ C with P(C) > 0 we have

  P(M | C) = P(M).

In this case, even an eavesdropper with unlimited computational power cannot learn anything about M by looking at C that they didn't know in advance about M ⇒ eavesdropping C has no benefit.

Shannon's theorem: Let (Gen, Enc, Dec) be an encryption scheme over a message space M with |M| = |K| = |C|. It is perfectly secret if and only if
  1  Gen chooses every K with equal probability 1/|K|;
  2  for every M ∈ M and every C ∈ C, there exists a unique key K ∈ K such that C = Enc_K(M).

The one-time pad scheme implements this:
  Gen: K ∈_R {0,1}^m  (m uniform, independent coin tosses)
  Enc: C := K ⊕ M  (bit-wise XOR)
  Dec: M := K ⊕ C

Security definitions for encryption schemes

We define security via the rules of a game played between two players:
  a challenger, who uses an encryption scheme Π = (Gen, Enc, Dec);
  an adversary A, who tries to demonstrate a weakness in Π.

Most of these games follow a simple pattern:
  1  the challenger uniformly randomly picks a secret bit b ∈_R {0,1};
  2  A interacts with the challenger according to the rules of the game;
  3  at the end, A has to output a bit b′.

The outcome of such a game X_{A,Π}(ℓ) is 1 if b = b′, otherwise X_{A,Π}(ℓ) = 0.

An encryption scheme Π is considered "X secure" if for all probabilistic polynomial-time (PPT) adversaries A there exists a "negligible" function negl such that

  P(X_{A,Π}(ℓ) = 1) < 1/2 + negl(ℓ)

A function negl(ℓ) is "negligible" if it converges faster to zero than any polynomial over ℓ does, as ℓ → ∞.
In practice, we want negl to drop below a small number (e.g., 2^−80) for modest key lengths ℓ (e.g., log₁₀ ℓ ≈ 2…3).

Indistinguishability in the presence of an eavesdropper

Private-key encryption scheme Π = (Gen, Enc, Dec), M = {0,1}^m, security parameter ℓ.

Experiment/game PrivK^eav_{A,Π}(ℓ): the challenger and the adversary A are both given input 1^ℓ; the challenger picks b ∈_R {0,1} and K ← Gen(1^ℓ); A sends M_0, M_1; the challenger replies with C ← Enc_K(M_b); A outputs b′.

Setup:
  1  The challenger generates a bit b ∈_R {0,1} and a key K ← Gen(1^ℓ).
  2  The adversary A is given input 1^ℓ.

Rules for the interaction:
  1  The adversary A outputs a pair of messages: M_0, M_1 ∈ {0,1}^m.
  2  The challenger computes C ← Enc_K(M_b) and returns C to A.

Finally, A outputs b′. If b′ = b then A has succeeded ⇒ PrivK^eav_{A,Π}(ℓ) = 1.
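The PrivK^eav experiment can be written down directly as a procedure parameterised by the scheme and the adversary. The sketch below is one possible rendering, assuming the adversary is modelled as two callables (one returning (M_0, M_1), one returning the guess b′) and reusing the toy Gen and Enc from the earlier sketch; all function names are illustrative. A trivial adversary that ignores C and guesses at random succeeds with probability exactly 1/2.

```python
import secrets

def privk_eav(Gen, Enc, l, choose_messages, guess):
    # Experiment PrivK^eav_{A,Pi}(l): the challenger picks b and K, the adversary
    # supplies M_0 and M_1, sees C = Enc_K(M_b), and outputs a guess b'.
    b = secrets.randbits(1)
    K = Gen(l)
    M0, M1 = choose_messages(l)          # adversary's chosen plaintexts, |M0| = |M1| = m
    C = Enc(K, M1 if b else M0)
    b_prime = guess(C)
    return 1 if b_prime == b else 0      # 1 means the adversary succeeded

# A trivial adversary that ignores C and guesses b' at random:
def choose_messages(l):
    m = l // 8
    return bytes(m), b"\xff" * m         # e.g. the all-zero and all-one messages

def guess(C):
    return secrets.randbits(1)

# Empirical estimate of P(PrivK^eav = 1); for the guessing adversary this is ~ 0.5
wins = sum(privk_eav(Gen, Enc, l=128, choose_messages=choose_messages, guess=guess)
           for _ in range(10_000))
print(wins / 10_000)
```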

Indistinguishability in the presence of an eavesdropper

Definition: A private-key encryption scheme Π has indistinguishable encryption in the presence of an eavesdropper if for all probabilistic, polynomial-time adversaries A there exists a negligible function negl such that

  P(PrivK^eav_{A,Π}(ℓ) = 1) ≤ 1/2 + negl(ℓ)

In other words: as we increase the security parameter ℓ, we quickly reach the point where no eavesdropper can do significantly better than just randomly guessing b.

The above definition is equivalent to demanding

  Adv PrivK^eav_{A,Π}(ℓ) = |P(b = 1 and b′ = 1) − P(b = 0 and b′ = 1)| ≤ negl(ℓ)

The "advantage" Adv that A can achieve is a measure of A's ability to behave differently depending on the value of b.

Pseudo-random generator

G: {0,1}^n → {0,1}^{e(n)}, where e(·) is a polynomial (expansion factor)

Definition: G is a pseudo-random generator if both
  1  e(n) > n for all n (expansion);
  2  for all probabilistic, polynomial-time distinguishers D there exists a negligible function negl such that

       |P(D(r) = 1) − P(D(G(s)) = 1)| ≤ negl(n)

     where both r ∈_R {0,1}^{e(n)} and the seed s ∈_R {0,1}^n are chosen at random, and the probabilities are taken over all coin tosses used by D and for picking r and s.

A brute-force distinguisher D would enumerate all 2^n possible outputs of G, and return 1 if the input is one of them. It would achieve P(D(G(s)) = 1) = 1 and P(D(r) = 1) = 2^n / 2^{e(n)}, the difference of which converges to 1, which is not negligible. But a brute-force distinguisher has an exponential run-time O(2^n), and is therefore excluded.

We do not know how to prove that a given algorithm is a pseudo-random generator, but there are many algorithms that are widely believed to be. Some constructions are pseudo-random generators if another well-studied problem is not solvable in polynomial time.

Encrypting using a pseudo-random generator

We define the following fixed-length private-key encryption scheme Π_PRG = (Gen, Enc, Dec):

Let G be a pseudo-random generator with expansion factor e(·), K = {0,1}^ℓ, M = C = {0,1}^{e(ℓ)}.
  Gen: on input 1^ℓ choose K ∈_R {0,1}^ℓ randomly
  Enc: C := G(K) ⊕ M
  Dec: M := G(K) ⊕ C

Such constructions are known as "stream ciphers".

We can prove that Π_PRG has "indistinguishable encryption in the presence of an eavesdropper" assuming that G is a pseudo-random generator: if we had a polynomial-time adversary A that can succeed with non-negligible advantage against Π_PRG, we could turn that, using a polynomial-time algorithm, into a polynomial-time distinguisher for G, which would violate the assumption.

Security proof for a stream cipher

Claim: Π_PRG has indistinguishability in the presence of an eavesdropper if G is a pseudo-random generator.

Proof (outline): If Π_PRG did not have indistinguishability in the presence of an eavesdropper, there would be an adversary A for which

  ε(ℓ) := P(PrivK^eav_{A,Π_PRG}(ℓ) = 1) − 1/2

is not negligible. Use that A to construct a distinguisher D for G:
  receive input W ∈ {0,1}^{e(ℓ)}
  pick b ∈_R {0,1}
  run A(1^ℓ) and receive from it M_0, M_1 ∈ {0,1}^{e(ℓ)}
  return C := W ⊕ M_b to A
  receive b′ from A
  return 1 if b′ = b, otherwise return 0

Now, what is |P(D(r) = 1) − P(D(G(K)) = 1)|?
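A sketch of the Π_PRG construction in Python is shown below. Since no function is known to be a provably secure pseudo-random generator, SHAKE-256 (hashlib.shake_256, an extendable-output hash) merely stands in for G here; this substitution, and all names, are assumptions made only for illustration.

```python
import hashlib
import secrets

def G(seed, out_len):
    # Stand-in for the pseudo-random generator G with expansion factor e(.):
    # expands the seed into out_len pseudo-random-looking bytes.
    # SHAKE-256 is used purely as an illustrative placeholder, not because it
    # is a proven pseudo-random generator.
    return hashlib.shake_256(seed).digest(out_len)

def Gen(l):
    # Gen: on input 1^l choose K uniformly at random from {0,1}^l
    return secrets.token_bytes(l // 8)

def Enc(K, M):
    # Enc: C := G(K) XOR M, for a fixed-length message M in {0,1}^{e(l)}
    keystream = G(K, len(M))
    return bytes(g ^ m for g, m in zip(keystream, M))

def Dec(K, C):
    # Dec: M := G(K) XOR C  (same keystream; XOR is its own inverse)
    keystream = G(K, len(C))
    return bytes(g ^ c for g, c in zip(keystream, C))

K = Gen(128)
M = b"a fixed-length demo message"
assert Dec(K, Enc(K, M)) == M
```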

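The reduction in the proof outline can likewise be sketched: given an adversary A against Π_PRG (modelled as the same two callables as before), the distinguisher D simulates one round of the PrivK^eav game, but uses its input W in place of the keystream G(K). Names are again illustrative.

```python
import secrets

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def D(W, choose_messages, guess, l):
    # Distinguisher for G, built from an adversary A = (choose_messages, guess)
    # against Pi_PRG. The input W is either a truly random string r or G(s).
    b = secrets.randbits(1)                  # pick b in {0,1}
    M0, M1 = choose_messages(l)              # run A(1^l), receive M_0, M_1
    C = xor(W, M1 if b else M0)              # return C := W XOR M_b to A
    b_prime = guess(C)                       # receive b' from A
    return 1 if b_prime == b else 0          # return 1 iff b' = b

# If W = G(s), A sees exactly a Pi_PRG ciphertext, so D outputs 1 with probability
# 1/2 + eps(l). If W = r is truly random, C is a one-time-pad ciphertext, so A
# succeeds with probability exactly 1/2. The difference eps(l) is D's advantage,
# which would not be negligible -- contradicting the assumption that G is a
# pseudo-random generator.
```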