SLIDE 1
iLab Modern cryptography for communications security a fast rush Benjamin Hof hof@in.tum.de Lehrstuhl für Netzarchitekturen und Netzdienste Fakultät für Informatik Technische Universität München Cryptography 16ss 1 / 38 Outline
SLIDE 2
SLIDE 3
Outline
Cryptography Secret-key setting Hash functions Using cryptography
3 / 38
SLIDE 4
Scope
Focus on:
◮ modern cryptography ◮ methods used in communications security
Based on: Introduction to modern cryptography, Katz and Lindell, 2nd edition, 2015.
4 / 38
SLIDE 5
What we are concerned with
Alice Bob “Let’s meet up at 9!”
5 / 38
SLIDE 6
What we are concerned with
Alice Bob “Let’s meet up at 9!” BfV
Roens/Wikipedia. CC-by-sa 2.0
5 / 38
SLIDE 7
What we are concerned with
Alice Bob Eve “Let’s meet up at 9!” passive attack: eavesdropping We want to provide confidentiality!
5 / 38
SLIDE 8
What we are concerned with
Alice Bob Mallory “You can trust Trent!” active attack: message modification We want to provide message authentication!
5 / 38
SLIDE 9
Limitations
◮ cryptography is typically bypassed, not broken ◮ not applied correctly ◮ not implemented correctly ◮ subverted
communication
◮ existence ◮ time ◮ extent ◮ partners 6 / 38
SLIDE 10
Kerckhoffs’ principle
Security should only depend on secrecy of the key, not the secrecy of the system.
◮ key easier to keep secret ◮ change ◮ compatibility
No security by obscurity.
◮ scrutiny ◮ standards ◮ reverse engineering 7 / 38
SLIDE 11
Another principle as a side note
The system should be easy to use.
◮ Kerckhoffs actually postulated 6 principles ◮ this one got somewhat forgotten ◮ considered uncontroversial by Kerckhoffs ◮ starting to be rediscovered in the design of secure applications and libraries
Example
Signal, NaCl
8 / 38
SLIDE 12
Modern cryptography
relies on
◮ formal definitions ◮ precisely defined assumptions ◮ mathematical proofs
Reductionist security arguments, the proofs, require assumptions to be formulated explicitly.
9 / 38
SLIDE 13
Uniform distribution
P : U → [0, 1]
∑_{x ∈ U} P(x) = 1
∀x ∈ U : P(x) = 1/|U|
10 / 38
SLIDE 14
Randomness
◮ required to do any cryptography at all ◮ somewhat difficult to get in a computer (deterministic!) ◮ required to be cryptographically secure: indistinguishable from truly random ◮ not provided by the default random number generators of programming languages
Example
used to generate keys or other information unknown to any other party
11 / 38
SLIDE 15
Collecting unpredictable bits
◮ physical phenomena
◮ time between emission of particles during radioactive decay ◮ thermal noise from a semiconductor diode or resistor
◮ software-based
◮ elapsed time between keystrokes or mouse movement ◮ packet interarrival times
◮ attacker must not be able to guess/influence the collected
values
1. collect pool of high-entropy data
2. process into sequence of nearly independent and unbiased bits
12 / 38
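In application code one normally consumes the output of this two-step pipeline through the operating system rather than collecting entropy oneself. A minimal Python sketch, using the standard library's interface to the OS CSPRNG (the slides do not prescribe an API; `secrets` is one common choice):

```python
import secrets

# Draw 16 bytes (128 bits) from the OS CSPRNG, which collects
# high-entropy events (interrupt timings etc.) and post-processes
# them into nearly independent, unbiased bits.
key = secrets.token_bytes(16)
assert len(key) == 16

# Two independent draws collide only with negligible probability.
assert secrets.token_bytes(16) != key
```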
SLIDE 16
Pseudo-random generator
G : {0, 1}^s → {0, 1}^n, n ≫ s
13 / 38
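The formula says G deterministically stretches a short seed of s bits into n ≫ s pseudorandom bits. As an illustration only (the slides do not name a construction), an extendable-output hash such as SHAKE-256 can play the role of G:

```python
import hashlib

def G(seed: bytes, n: int) -> bytes:
    """Stretch a short seed into n pseudorandom bytes, n >> len(seed).
    SHAKE-256 is an arbitrary choice of extendable-output function."""
    return hashlib.shake_256(seed).digest(n)

out = G(b"short-seed", 1024)      # 1024 bytes from a 10-byte seed
assert len(out) == 1024
# Deterministic: same seed, same output -- G is a function, not a TRNG,
# so the seed itself must come from a true randomness source.
assert G(b"short-seed", 1024) == out
```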
SLIDE 17
A definition of security
A scheme is secure if any probabilistic polynomial-time adversary succeeds in breaking the scheme with at most negligible probability.
Negligible
For every polynomial p and all sufficiently large values of n: f(n) < 1/p(n), e.g., f(n) = 1/2^n
Church-Turing Hypothesis
We believe polynomial time models all computers.
14 / 38
SLIDE 18
Our goals
Secret-key (symmetric)
◮ confidentiality ◮ authenticity
(as in: message integrity)
public-key (asymmetric)
◮ confidentiality ◮ authenticity ◮ key exchange
Something providing confidentiality generally makes no statement whatsoever about authenticity.
15 / 38
SLIDE 19
Outline
Cryptography Secret-key setting Hash functions Using cryptography
16 / 38
SLIDE 20
Secret-key encryption scheme
1. k ← Gen(1^n), security parameter 1^n
2. c ← Enc_k(m), m ∈ {0, 1}^*
3. m := Dec_k(c)
◮ provide confidentiality ◮ definition of security: chosen-plaintext attack (CPA)
Cryptography uses theoretical attack games to analyze and formalize security. C: challenger, A: adversary; ← means non-deterministic assignment, := means deterministic assignment
17 / 38
SLIDE 21
The eavesdropping experiment
C (input 1^n): k ← Gen(1^n)
SLIDE 22
The eavesdropping experiment
C (input 1^n): k ← Gen(1^n)
A → C: m0, m1
C: b ← {0, 1}, c ← Enc_k(m_b)
C → A: c
A: output b′
◮ A succeeds iff b = b′ 18 / 38
SLIDE 23
Discussion of the eavesdropping experiment
◮ |m0| = |m1| ◮ probabilistic polynomial time algorithms ◮ success probability should be 0.5 + negligible ◮ if so, Enc has indistinguishable encryptions in the presence of an eavesdropper
19 / 38
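The discussion can be made concrete by simulating the experiment. This sketch plays the game against the one-time pad, where the best any adversary can do is guess, so the measured success rate sits near 0.5 (the cipher and message choices are illustrative, not from the slides):

```python
import secrets

def enc_otp(k: bytes, m: bytes) -> bytes:
    # One-time pad: XOR message with an equally long random key.
    return bytes(a ^ b for a, b in zip(k, m))

def experiment() -> bool:
    """One run of the eavesdropping game against the one-time pad."""
    m0, m1 = b"attack at dawn!!", b"retreat at once!"   # |m0| == |m1|
    k = secrets.token_bytes(len(m0))       # k <- Gen
    b = secrets.randbelow(2)               # challenger's hidden coin
    c = enc_otp(k, [m0, m1][b])            # c <- Enc_k(m_b)
    b_guess = secrets.randbelow(2)         # c is useless here: A guesses
    return b == b_guess                    # A succeeds iff b == b'

wins = sum(experiment() for _ in range(10_000))
# Success rate hovers around 1/2: no advantage over random guessing.
assert abs(wins / 10_000 - 0.5) < 0.05
```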
SLIDE 24
Pseudorandom permutation
F : {0, 1}^* × {0, 1}^* → {0, 1}^*
◮ F_k(x) and F_k^{-1}(y) efficiently computable ◮ F_k indistinguishable from a uniform permutation ◮ adversary may have access to F^{-1}
We can assume that all inputs and the output have the same length.
20 / 38
SLIDE 25
A block cipher
Example
◮ fixed key length and block length ◮ chop m into 128 bit blocks
[figure: 128 bit blocks of m are fed, with key k, into AES to produce c] Does this function survive the eavesdropping experiment?
21 / 38
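The answer to the slide's question is no: encrypting blocks independently is deterministic, so equal plaintext blocks yield equal ciphertext blocks, and an adversary wins the experiment with certainty. A toy demonstration, with a keyed SHAKE standing in for the block cipher (an assumption, since the standard library has no AES; the stand-in is not invertible, but the distinguisher only needs determinism):

```python
import hashlib, secrets

BLOCK = 16

def ecb_toy(k: bytes, m: bytes) -> bytes:
    """Toy deterministic ECB-style 'block cipher' (stand-in, not real AES):
    each 16-byte block is mapped independently under the same key."""
    return b"".join(
        hashlib.shake_256(k + m[i:i + BLOCK]).digest(BLOCK)
        for i in range(0, len(m), BLOCK)
    )

k = secrets.token_bytes(16)
m0 = b"A" * BLOCK * 2               # two identical blocks
m1 = b"A" * BLOCK + b"B" * BLOCK    # two distinct blocks
b = secrets.randbelow(2)
c = ecb_toy(k, [m0, m1][b])
# Distinguisher: equal plaintext blocks give equal ciphertext blocks.
b_guess = 0 if c[:BLOCK] == c[BLOCK:] else 1
assert b_guess == b   # A wins every run: the scheme is not secure
```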
SLIDE 26
Chosen-plaintext attack
C (input 1^n): k ← Gen(1^n)
22 / 38
SLIDE 27
Chosen-plaintext attack
C (input 1^n): k ← Gen(1^n); oracle: A queries m, receives c ← Enc_k(m), . . .
22 / 38
SLIDE 28
Chosen-plaintext attack
C (input 1^n): k ← Gen(1^n); oracle: A queries m, receives c ← Enc_k(m), . . .
A → C: m0, m1
C: b ← {0, 1}
C → A: Enc_k(m_b)
22 / 38
SLIDE 29
Chosen-plaintext attack
C (input 1^n): k ← Gen(1^n); oracle: A queries m, receives c ← Enc_k(m), . . .
A → C: m0, m1
C: b ← {0, 1}
C → A: Enc_k(m_b)
C (cont'd): A may keep querying the oracle: m, receives c ← Enc_k(m), . . .
A: output bit b′
22 / 38
SLIDE 31
Discussion of CPA
◮ if so, Enc is secure under chosen-plaintext attack ◮ again, messages must have the same length ◮ the key may be used for multiple messages ◮ encryption must be non-deterministic (e. g. random initialization vector) or stateful ◮ block cipher requires a mode of operation: counter (CTR), output-feedback (OFB), . . .
23 / 38
SLIDE 32
Example constructions: counter mode
Example
◮ randomised AES counter mode (AES-CTR$) ◮ choose nonce r ← {0, 1}^128, key k ← {0, 1}^128 ◮ great if you have dedicated circuits for AES, else vulnerable to timing attacks
[figure: keystream blocks AES_k(r), AES_k(r + 1), . . . are XORed with m0, m1, . . . to give c0, c1, . . .]
complete ciphertext c := (r, c0, c1, . . .)
24 / 38
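A sketch of randomised counter mode following the slide: keystream block i is F_k(r + i), XORed with message block m_i, and the nonce r travels with the ciphertext. SHAKE-256 stands in for AES here (an assumption; Python's standard library has no AES):

```python
import hashlib, secrets

def _keystream(k: bytes, r: bytes, nbytes: int) -> bytes:
    # Block i of the keystream is F_k(r + i); keyed SHAKE-256 stands in
    # for AES as the pseudorandom function F.
    r_int = int.from_bytes(r, "big")
    blocks = (nbytes + 15) // 16
    ks = b"".join(
        hashlib.shake_256(k + ((r_int + i) % 2**128).to_bytes(16, "big")).digest(16)
        for i in range(blocks)
    )
    return ks[:nbytes]

def ctr_encrypt(k: bytes, m: bytes) -> tuple[bytes, bytes]:
    r = secrets.token_bytes(16)          # fresh nonce for every message
    c = bytes(a ^ b for a, b in zip(m, _keystream(k, r, len(m))))
    return r, c                          # complete ciphertext is (r, c)

def ctr_decrypt(k: bytes, r: bytes, c: bytes) -> bytes:
    # Decryption regenerates the same keystream and XORs again.
    return bytes(a ^ b for a, b in zip(c, _keystream(k, r, len(c))))

k = secrets.token_bytes(16)
r, c = ctr_encrypt(k, b"let's meet up at 9!")
assert ctr_decrypt(k, r, c) == b"let's meet up at 9!"
# Same message, fresh nonce: ciphertexts differ (non-deterministic).
assert ctr_encrypt(k, b"let's meet up at 9!")[1] != c
```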
SLIDE 33
Example constructions: stream ciphers
Example
A modern stream cipher, fast in software: [figure: ChaCha maps a 256 bit key, 96 bit nonce and 32 bit initial counter to a keystream, which is XORed with the plaintext to give the ciphertext]
25 / 38
SLIDE 34
Message authentication code (MAC)
1. k ← Gen(1^n), security parameter 1^n
2. t ← Mac_k(m), m ∈ {0, 1}^*
3. b := Vrfy_k(m, t)
b = 1 means valid, b = 0 invalid
◮ transmit m, t ◮ tag t is a short authenticator ◮ message authenticity ⇔ integrity ◮ detect tampering ◮ no protection against replay ◮ “existentially unforgeable” ◮ security definition: adaptive chosen-message attack 26 / 38
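The (Gen, Mac, Vrfy) triple can be instantiated directly with HMAC from the standard library (HMAC-SHA-256 is my choice here; the slides discuss HMAC later):

```python
import hmac, hashlib, secrets

def gen() -> bytes:                        # k <- Gen(1^n)
    return secrets.token_bytes(32)

def mac(k: bytes, m: bytes) -> bytes:      # t <- Mac_k(m)
    return hmac.new(k, m, hashlib.sha256).digest()

def vrfy(k: bytes, m: bytes, t: bytes) -> bool:   # b := Vrfy_k(m, t)
    # Canonical verification: recompute the tag and compare.
    return hmac.compare_digest(mac(k, m), t)

k = gen()
m = b"You can trust Trent!"
t = mac(k, m)
assert vrfy(k, m, t)              # valid (m, t) accepted
assert not vrfy(k, m + b"!", t)   # tampering with m is detected
```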
SLIDE 35
Adaptive chosen-message attack
C (input 1^n): k ← Gen(1^n); oracle: A queries m, receives t ← Mac_k(m), . . .
A: output m′, t′
◮ let Q be the set of all queried messages m
◮ A succeeds iff Vrfy_k(m′, t′) = 1 and m′ ∉ Q
27 / 38
SLIDE 36
Used in practice
Example
◮ HMAC based on hash functions ◮ CMAC based on cipher block chaining mode (CBC) ◮ authenticated encryption modes 28 / 38
SLIDE 37
Example: side-channel attack
How does tag verification work and how to implement tag comparison correctly?
29 / 38
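One common answer: verification recomputes the tag over the received message and compares, but never with an early-exit comparison, whose running time leaks how many leading tag bytes were correct. A sketch contrasting the two (illustrative code, not from the slides):

```python
import hmac

def insecure_verify(tag: bytes, guess: bytes) -> bool:
    # Early-exit loop: runtime reveals the length of the correct prefix,
    # letting an attacker forge a tag byte by byte via timing.
    for a, b in zip(tag, guess):
        if a != b:
            return False
    return len(tag) == len(guess)

def secure_verify(tag: bytes, guess: bytes) -> bool:
    # Constant-time: examines every byte regardless of mismatches.
    return hmac.compare_digest(tag, guess)

tag = bytes.fromhex("deadbeefdeadbeefdeadbeefdeadbeef")
assert secure_verify(tag, tag)
assert not secure_verify(tag, b"\x00" * 16)
```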
SLIDE 38
Recap: secret-key cryptography
◮ attacker power: probabilistic polynomial time ◮ confidentiality defined as IND-CPA: encryption, e. g. AES-CTR$ ◮ message authentication defined as existential unforgeability under adaptive chosen-message attack: message authentication codes, e. g. HMAC-SHA2
◮ authenticated encryption modes 30 / 38
SLIDE 39
Combining confidentiality and authentication
◮ encrypt-then-authenticate is generally secure:
c ← Enc_k1(m), t ← Mac_k2(c), transmit: c, t
◮ authenticated encryption is also a good choice:
e. g. offset codebook (OCB), Galois/counter mode (GCM)
c, t ← AEADenc_k(ad, m)
m := AEADdec_k(ad, c, t), or verification failure
31 / 38
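The encrypt-then-authenticate composition can be sketched as follows, with two independent keys and verification before decryption. The stream encryption is a SHAKE-based stand-in (an assumption, since the standard library has no AES); the MAC is stdlib HMAC:

```python
import hashlib, hmac, secrets

def enc(k1: bytes, m: bytes) -> bytes:
    # Stand-in stream encryption: nonce-keyed SHAKE keystream.
    r = secrets.token_bytes(16)
    ks = hashlib.shake_256(k1 + r).digest(len(m))
    return r + bytes(a ^ b for a, b in zip(m, ks))

def dec(k1: bytes, c: bytes) -> bytes:
    r, body = c[:16], c[16:]
    ks = hashlib.shake_256(k1 + r).digest(len(body))
    return bytes(a ^ b for a, b in zip(body, ks))

def etm_encrypt(k1: bytes, k2: bytes, m: bytes) -> tuple[bytes, bytes]:
    c = enc(k1, m)                                  # encrypt first ...
    t = hmac.new(k2, c, hashlib.sha256).digest()    # ... then MAC the ciphertext
    return c, t

def etm_decrypt(k1: bytes, k2: bytes, c: bytes, t: bytes) -> bytes:
    # Verify the tag before decrypting; reject on failure.
    if not hmac.compare_digest(hmac.new(k2, c, hashlib.sha256).digest(), t):
        raise ValueError("verification failure")
    return dec(k1, c)

k1, k2 = secrets.token_bytes(32), secrets.token_bytes(32)   # independent keys
c, t = etm_encrypt(k1, k2, b"Let's meet up at 9!")
assert etm_decrypt(k1, k2, c, t) == b"Let's meet up at 9!"
```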
SLIDE 40
Outline
Cryptography Secret-key setting Hash functions Using cryptography
32 / 38
SLIDE 41
Cryptographic hash functions
secret-key
◮ encryption ◮ message authentication codes
hash functions
public-key
. . .
33 / 38
SLIDE 42
Hash functions
◮ variable length input ◮ fixed length output
provide:
1. pre-image resistance
given H(x) with a randomly chosen x, cannot find x′ s. t. H(x′) = H(x): “H is one-way”
2. second pre-image resistance
given x, cannot find x′ ≠ x s. t. H(x′) = H(x)
3. collision resistance
cannot find x ≠ x′ s. t. H(x) = H(x′)
34 / 38
[figure: variable-length input → H(·) → fixed-length output]
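The two defining properties, variable-length input and fixed-length output, are easy to observe (SHA-256 chosen for illustration):

```python
import hashlib

# Variable-length input, fixed-length output.
for msg in (b"", b"a", b"a" * 10_000):
    assert len(hashlib.sha256(msg).digest()) == 32   # always 256 bits

# A tiny input change produces an unrelated-looking digest; finding any
# second input with the same digest should be computationally infeasible.
d1 = hashlib.sha256(b"Let's meet up at 9!").hexdigest()
d2 = hashlib.sha256(b"Let's meet up at 8!").hexdigest()
assert d1 != d2
```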
SLIDE 43
Example: constructing MACs from hash functions
HMAC is a popular MAC:
◮ ipad is the byte 0x36 repeated, opad is 0x5C repeated
tag := H((k ⊕ opad) ∥ H((k ⊕ ipad) ∥ m))
◮ use SHA2-512, truncate tag to 256 bits
Designed for Merkle-Damgård functions: these allow computing the extension H(k ∥ m ∥ tail) from H(k ∥ m) alone, so the naive tag H(k ∥ m) would be forgeable; HMAC’s nesting prevents this.
35 / 38
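The nested formula can be spelled out and checked against the standard library's HMAC (SHA-256 variant for brevity; note the key is first padded to the hash's block size, a detail the slide omits):

```python
import hashlib, hmac

def my_hmac_sha256(k: bytes, m: bytes) -> bytes:
    """HMAC spelled out: H((k ^ opad) || H((k ^ ipad) || m))."""
    block = 64                              # SHA-256 block size in bytes
    if len(k) > block:
        k = hashlib.sha256(k).digest()      # overlong keys are hashed first
    k = k.ljust(block, b"\x00")             # then zero-padded to block size
    ipad = bytes(b ^ 0x36 for b in k)       # inner pad: 0x36 repeated
    opad = bytes(b ^ 0x5C for b in k)       # outer pad: 0x5C repeated
    inner = hashlib.sha256(ipad + m).digest()
    return hashlib.sha256(opad + inner).digest()

k, m = b"secret key", b"Let's meet up at 9!"
# Matches the standard library's implementation byte for byte.
assert my_hmac_sha256(k, m) == hmac.new(k, m, hashlib.sha256).digest()
```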
SLIDE 44
Outline
Cryptography Secret-key setting Hash functions Using cryptography
36 / 38
SLIDE 45
What to use
◮ key sizes from ENISA reports ◮ algorithms from competitions (also see ENISA) ◮ nothing to do with passwords ◮ “side channels” paramount consideration 37 / 38
SLIDE 46
Words of caution
limits
◮ crypto will not solve your problem ◮ only a small part of a secure system ◮ don’t implement it yourself
difficult to solve problems
◮ trust / key distribution
◮ revocation