iLab: Modern cryptography for communications security, a fast rush (PowerPoint presentation)



SLIDE 1

iLab

Modern cryptography for communications security: a fast rush
Benjamin Hof <hof@in.tum.de>

Lehrstuhl für Netzarchitekturen und Netzdienste Fakultät für Informatik Technische Universität München

Cryptography – 16ss

SLIDE 2

Outline

Cryptography
Secret-key setting
Hash functions
Using cryptography
SLIDE 3

Outline

Cryptography
Secret-key setting
Hash functions
Using cryptography
SLIDE 4

Scope

Focus on:

◮ modern cryptography
◮ methods used in communications security

Based on: Introduction to modern cryptography, Katz and Lindell, 2nd edition, 2015.

SLIDE 5

What we are concerned with

Alice → Bob: “Let’s meet up at 9!”

SLIDE 6

What we are concerned with

Alice → Bob: “Let’s meet up at 9!”, observed by the BfV

Roens/Wikipedia. CC-by-sa 2.0

SLIDE 7

What we are concerned with

Alice → Bob: “Let’s meet up at 9!”, with Eve listening

passive attack: eavesdropping
We want to provide confidentiality!

SLIDE 8

What we are concerned with

Alice → Bob: “You can trust Trent!”, modified in transit by Mallory

active attack: message modification
We want to provide message authentication!

SLIDE 9

Limitations

◮ cryptography is typically bypassed, not broken
  ◮ not applied correctly
  ◮ not implemented correctly
  ◮ subverted

Cryptography does not hide the metadata of communication:

◮ existence
◮ time
◮ extent
◮ partners

SLIDE 10

Kerckhoffs’ principle

Security should depend only on the secrecy of the key, not on the secrecy of the system.

◮ key easier to keep secret
◮ change
◮ compatibility

No security by obscurity.

◮ scrutiny
◮ standards
◮ reverse engineering
SLIDE 11

Another principle as a side note

The system should be easy to use.

◮ Kerckhoffs actually postulated six principles
◮ this one got somewhat forgotten
◮ considered uncontroversial by Kerckhoffs
◮ starting to be rediscovered in the design of secure applications and libraries

Example

Signal, NaCl
SLIDE 12

Modern cryptography

relies on

◮ formal definitions
◮ precisely defined assumptions
◮ mathematical proofs

Reductionist security arguments (the proofs) require formulating assumptions explicitly.

SLIDE 13

Uniform distribution

P : U → [0, 1]

Σ_{x∈U} P(x) = 1

∀x ∈ U : P(x) = 1/|U|
SLIDE 14

Randomness

◮ required to do any cryptography at all
◮ somewhat difficult to get in a computer (deterministic!)
◮ required to be cryptographically secure: indistinguishable from truly random
◮ not provided by the default generators of programming languages

Example

used to generate keys or other information unknown to any other parties
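In Python, for example, the default `random` module is a Mersenne Twister and is predictable; the `secrets` module draws from the operating system's CSPRNG. A minimal sketch of key generation:

```python
import secrets

# Draw key material from the OS CSPRNG (e.g. /dev/urandom).
# random.random() (Mersenne Twister) is predictable and must not be used here.
key = secrets.token_bytes(16)   # 16 bytes = a 128-bit key
nonce = secrets.token_hex(12)   # 96 bits of randomness as a 24-char hex string

print(len(key), len(nonce))
```

Every run yields fresh, unpredictable values; there is no seed to manage or leak.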

SLIDE 15

Collecting unpredictable bits

◮ physical phenomena
  ◮ time between emission of particles during radioactive decay
  ◮ thermal noise from a semiconductor diode or resistor
◮ software-based
  ◮ elapsed time between keystrokes or mouse movement
  ◮ packet interarrival times

◮ attacker must not be able to guess/influence the collected values

1. collect pool of high-entropy data
2. process into sequence of nearly independent and unbiased bits
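Step 2 can be illustrated with the classic von Neumann extractor (real systems typically hash the entropy pool instead; this is only the textbook idea): read the input bits in pairs, output 0 for the pair (0,1), output 1 for (1,0), and discard equal pairs.

```python
def von_neumann_extract(bits):
    """Turn biased but independent input bits into unbiased output bits.

    For input bias p, the pairs (0,1) and (1,0) both occur with
    probability p*(1-p), so the emitted bits are unbiased; the equal
    pairs (0,0) and (1,1) carry the bias and are thrown away.
    """
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:          # keep only unequal pairs
            out.append(a)   # (0,1) -> 0, (1,0) -> 1
    return out

print(von_neumann_extract([0, 1, 1, 0, 1, 1, 0, 0, 0, 1]))  # [0, 1, 0]
```

The price is throughput: on average only p(1−p) output bits per input pair.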

SLIDE 16

Pseudo-random generator

G : {0,1}^s → {0,1}^n, with n ≫ s
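The shape of G (stretch a short seed into a much longer pseudo-random string, deterministically) can be sketched by hashing seed‖counter. This is an illustration of the interface only, not a vetted PRG construction; a real design would use a standardized primitive such as a stream cipher or HKDF.

```python
import hashlib

def prg(seed: bytes, n: int) -> bytes:
    """Stretch a short seed into n pseudo-random bytes (pedagogical sketch).

    Deterministic: the same seed always yields the same output, which is
    exactly why the seed itself must come from true randomness.
    """
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

stream = prg(b"short seed", 100)
print(len(stream))  # 100
```

Note the expansion: a few bytes of seed produce arbitrarily many output bytes.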

SLIDE 17

A definition of security

A scheme is secure, if any probabilistic polynomial time adversary succeeds in breaking the scheme with at most negligible probability.

Negligible

For every polynomial p and all sufficiently large values of n: f(n) < 1/p(n), e.g. f(n) = 2^(−n)

Church-Turing Hypothesis

We believe polynomial time models all computers.

SLIDE 18

Our goals

Secret-key (symmetric)

◮ confidentiality
◮ authenticity (as in: message integrity)

Public-key (asymmetric)

◮ confidentiality
◮ authenticity
◮ key exchange

Something providing confidentiality generally makes no statement whatsoever about authenticity.

SLIDE 19

Outline

Cryptography
Secret-key setting
Hash functions
Using cryptography
SLIDE 20

Secret-key encryption scheme

1. k ← Gen(1^n), with security parameter 1^n
2. c ← Enc_k(m), m ∈ {0,1}*
3. m := Dec_k(c)

◮ provides confidentiality
◮ definition of security: chosen-plaintext attack (CPA)

Cryptography uses theoretical attack games to analyze and formalize security: C denotes the challenger, A the adversary; ← means non-deterministic assignment, := means deterministic assignment.

SLIDE 21

The eavesdropping experiment

1. C: k ← Gen(1^n); A receives input 1^n
2. A sends two messages m0, m1 to C
3. C: b ← {0,1}, c ← Enc_k(m_b); c is sent to A
4. A outputs a bit b′

◮ A succeeds iff b = b′
SLIDE 23

Discussion of the eavesdropping experiment

◮ |m0| = |m1|
◮ probabilistic polynomial time algorithms
◮ success probability should be 0.5 + negligible
◮ if so, Enc has indistinguishable encryptions in the presence of an eavesdropper

SLIDE 24

Pseudorandom permutation

F : {0,1}* × {0,1}* → {0,1}*

◮ F_k(x) and F_k^(−1)(y) efficiently computable
◮ F_k indistinguishable from a uniform permutation
◮ adversary may have access to F^(−1)

We can assume that all inputs and the output have the same length.

SLIDE 25

A block cipher

Example

◮ fixed key length and block length
◮ chop m into 128-bit blocks
◮ encrypt each block separately: c_i := AES_k(m_i)

Does this function survive the eavesdropping experiment?
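It does not: blockwise deterministic encryption (ECB mode) maps equal plaintext blocks to equal ciphertext blocks, which an eavesdropper can see. A toy demonstration, where a truncated hash stands in for AES_k (it is not invertible, but like ECB it is deterministic, which is all the leak needs):

```python
import hashlib

def toy_block_enc(key: bytes, block: bytes) -> bytes:
    """Deterministic stand-in for AES_k on a 16-byte block (not a real cipher)."""
    return hashlib.sha256(key + block).digest()[:16]

def ecb_like(key: bytes, m: bytes) -> bytes:
    """Encrypt each 16-byte block independently, ECB style."""
    assert len(m) % 16 == 0
    return b"".join(toy_block_enc(key, m[i:i+16]) for i in range(0, len(m), 16))

key = b"k" * 16
m0 = b"A" * 16 + b"A" * 16   # two equal blocks
m1 = b"A" * 16 + b"B" * 16   # two distinct blocks

c0 = ecb_like(key, m0)
# Equal plaintext blocks yield equal ciphertext blocks: the eavesdropper
# can distinguish an encryption of m0 from an encryption of m1.
print(c0[:16] == c0[16:])  # True
```

Picking m0 and m1 as above lets the adversary win the experiment with certainty, far more than 0.5 + negligible.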

SLIDE 26

Chosen-plaintext attack

1. C: k ← Gen(1^n); A receives input 1^n
2. A may repeatedly query messages m and receive c ← Enc_k(m)
3. A sends two messages m0, m1 to C
4. C: b ← {0,1}; c ← Enc_k(m_b) is sent to A
5. A may continue to query encryptions of chosen messages
6. A outputs a bit b′

◮ A succeeds iff b = b′
SLIDE 31

Discussion of CPA

◮ Enc is secure under chosen-plaintext attack
◮ again, messages must have the same length
◮ the key can be used for multiple messages
◮ encryption must be non-deterministic (e. g. random initialization vector) or stateful
◮ a block cipher requires a mode of operation: counter (CTR), output-feedback (OFB), . . .
SLIDE 32

Example constructions: counter mode

Example

◮ randomised AES counter mode (AES-CTR$)
◮ choose nonce r ← {0,1}^128, key k ← {0,1}^128
◮ great if you have dedicated circuits for AES, else vulnerable to timing attacks

c_0 := m_0 ⊕ AES_k(r), c_1 := m_1 ⊕ AES_k(r + 1), · · ·

complete ciphertext c := (r, c_0, c_1, · · · )
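The structure above can be sketched in a few lines. A hash-based stand-in plays the role of the PRF AES_k (an illustration of the mode, not a real AES-CTR$ implementation); the random starting counter r is what makes encryption non-deterministic:

```python
import hashlib, secrets

def prf(key: bytes, x: int) -> bytes:
    """Stand-in PRF F_k; a real implementation would use AES."""
    return hashlib.sha256(key + x.to_bytes(16, "big")).digest()[:16]

def ctr_encrypt(key: bytes, m: bytes):
    r = secrets.randbits(64)                      # random starting counter
    blocks = [m[i:i+16] for i in range(0, len(m), 16)]
    # c_i = m_i XOR F_k(r + i); zip truncates the keystream on a short last block
    cs = [bytes(a ^ b for a, b in zip(block, prf(key, r + i)))
          for i, block in enumerate(blocks)]
    return r, b"".join(cs)                        # complete ciphertext (r, c_0, c_1, ...)

def ctr_decrypt(key: bytes, r: int, c: bytes) -> bytes:
    blocks = [c[i:i+16] for i in range(0, len(c), 16)]
    return b"".join(bytes(a ^ b for a, b in zip(block, prf(key, r + i)))
                    for i, block in enumerate(blocks))

key = secrets.token_bytes(16)
r, c = ctr_encrypt(key, b"attack at dawn!!")
print(ctr_decrypt(key, r, c))  # b'attack at dawn!!'
```

Decryption is the same keystream XOR; encrypting the same message twice yields different ciphertexts because r is fresh each time.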

SLIDE 33

Example constructions: stream ciphers

Example

A modern stream cipher, fast in software: ChaCha derives a keystream from a 256-bit key, a 96-bit nonce, and a 32-bit initial counter; ciphertext = plaintext ⊕ keystream.

SLIDE 34

Message authentication code (MAC)

1. k ← Gen(1^n), with security parameter 1^n
2. t ← Mac_k(m), m ∈ {0,1}*
3. b := Vrfy_k(m, t)

b = 1 means valid, b = 0 invalid

◮ transmit m, t
◮ tag t is a short authenticator
◮ message authenticity ⇔ integrity
◮ detect tampering
◮ no protection against replay
◮ “existentially unforgeable”
◮ security definition: adaptive chosen-message attack

SLIDE 35

Adaptive chosen-message attack

1. C: k ← Gen(1^n); A receives input 1^n
2. A may repeatedly query messages m and receive t ← Mac_k(m)
3. A outputs a pair m′, t′

◮ let Q be the set of all queried messages m
◮ A succeeds iff Vrfy_k(m′, t′) = 1 and m′ ∉ Q

SLIDE 36

Used in practice

Example

◮ HMAC, based on hash functions
◮ CMAC, based on cipher block chaining mode (CBC)
◮ authenticated encryption modes

SLIDE 37

Example: side-channel attack

How does tag verification work, and how does one implement tag comparison correctly?
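The pitfall: a naive byte-by-byte comparison returns at the first mismatch, so its running time reveals how many leading bytes of a forged tag are correct, letting an attacker guess the tag one byte at a time. The fix is a comparison whose running time does not depend on where the tags differ:

```python
import hmac

def insecure_equal(a: bytes, b: bytes) -> bool:
    """Early-exit comparison: timing leaks the position of the first mismatch."""
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False      # returns earlier the sooner the tags diverge
    return True

def secure_equal(a: bytes, b: bytes) -> bool:
    """Constant-time comparison: examines all bytes regardless of mismatches."""
    return hmac.compare_digest(a, b)

print(secure_equal(b"abc", b"abc"), secure_equal(b"abc", b"abd"))  # True False
```

Both functions compute the same boolean; only the timing behaviour differs, which is exactly the side channel at issue.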

SLIDE 38

Recap: secret-key cryptography

◮ attacker power: probabilistic polynomial time
◮ confidentiality defined as IND-CPA: encryption, e. g. AES-CTR$
◮ message authentication defined as existentially unforgeable under adaptive chosen-message attack: message authentication codes, e. g. HMAC-SHA2
◮ authenticated encryption modes

SLIDE 39

Combining confidentiality and authentication

◮ encrypt-then-authenticate is generally secure:
  c ← Enc_k1(m), t ← Mac_k2(c); transmit c, t
◮ authenticated encryption is also a good choice, e. g. offset codebook (OCB), Galois/counter mode (GCM):
  c, t ← AEADenc_k(ad, m)
  m := AEADdec_k(ad, c, t), or verification failure
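Encrypt-then-authenticate can be sketched end to end with the toy pieces above: a hash-keystream cipher stands in for Enc (illustration only, not a real scheme), HMAC plays Mac, and crucially the two operations use independent keys k1 and k2, with the tag verified before any decryption work:

```python
import hmac, hashlib, secrets

def keystream(k1: bytes, r: bytes, n: int) -> bytes:
    """Hash-derived keystream; stand-in for a real cipher such as AES-CTR$."""
    out, i = b"", 0
    while len(out) < n:
        out += hashlib.sha256(k1 + r + i.to_bytes(4, "big")).digest()
        i += 1
    return out[:n]

def encrypt_then_mac(k1: bytes, k2: bytes, m: bytes):
    r = secrets.token_bytes(16)
    c = r + bytes(a ^ b for a, b in zip(m, keystream(k1, r, len(m))))
    t = hmac.new(k2, c, hashlib.sha256).digest()   # MAC over the ciphertext
    return c, t

def verify_then_decrypt(k1: bytes, k2: bytes, c: bytes, t: bytes) -> bytes:
    # Verify the tag FIRST; reject before touching the ciphertext body.
    if not hmac.compare_digest(hmac.new(k2, c, hashlib.sha256).digest(), t):
        raise ValueError("verification failure")
    r, body = c[:16], c[16:]
    return bytes(a ^ b for a, b in zip(body, keystream(k1, r, len(body))))

k1, k2 = secrets.token_bytes(32), secrets.token_bytes(32)
c, t = encrypt_then_mac(k1, k2, b"meet at 9")
print(verify_then_decrypt(k1, k2, c, t))  # b'meet at 9'
```

MACing the ciphertext (not the plaintext) is what makes this composition generically secure; any tampering with c is rejected before decryption.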

SLIDE 40

Outline

Cryptography
Secret-key setting
Hash functions
Using cryptography
SLIDE 41

Cryptographic hash functions

◮ secret-key: encryption, message authentication codes
◮ hash functions
◮ public-key: . . .
SLIDE 42

Hash functions

◮ variable length input, fixed length output

provide:

1. pre-image resistance
   given H(x) for a randomly chosen x, cannot find x′ s. t. H(x′) = H(x); “H is one-way”
2. second pre-image resistance
   given x, cannot find x′ ≠ x s. t. H(x′) = H(x)
3. collision resistance
   cannot find x ≠ x′ s. t. H(x) = H(x′)
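The fixed output length and the practical face of these properties are easy to observe with SHA-256 (our choice of example function): any input size yields a 32-byte digest, and changing a single character of the input produces a completely different digest (the avalanche effect).

```python
import hashlib

# Fixed-length output for variable-length input:
for msg in (b"", b"x", b"x" * 10_000):
    print(len(hashlib.sha256(msg).digest()))   # 32 each time

# A one-character change yields an unrelated digest, so finding a
# second pre-image by small perturbations is hopeless in practice.
d1 = hashlib.sha256(b"Let's meet up at 9!").hexdigest()
d2 = hashlib.sha256(b"Let's meet up at 8!").hexdigest()
print(d1 != d2)  # True
```

Note this only illustrates the properties; it does not prove them. Collision resistance in particular is an assumption about the function, broken for older designs such as MD5 and SHA-1.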

SLIDE 43

Example: constructing MACs from hash functions

HMAC is a popular MAC:

◮ ipad is the byte 0x36, opad is 0x5C (each repeated to the hash block length)

tag := H((k ⊕ opad) ‖ H((k ⊕ ipad) ‖ m))

◮ e. g. use SHA2-512 and truncate the tag to 256 bits

The nested construction is needed with Merkle–Damgård hash functions, since they allow computing the extension H(k ‖ m ‖ tail) from H(k ‖ m).
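The formula above can be written out directly and checked against the standard library's `hmac` module (block size 64 bytes for SHA-256, which we use here for brevity instead of the slide's SHA2-512):

```python
import hashlib, hmac

def hmac_sha256(key: bytes, msg: bytes) -> bytes:
    """HMAC from the formula tag = H((k XOR opad) || H((k XOR ipad) || m))."""
    block = 64                                    # SHA-256 block size in bytes
    if len(key) > block:                          # long keys are hashed first
        key = hashlib.sha256(key).digest()
    key = key.ljust(block, b"\x00")               # then zero-padded to the block size
    ipad = bytes(b ^ 0x36 for b in key)           # inner pad
    opad = bytes(b ^ 0x5C for b in key)           # outer pad
    inner = hashlib.sha256(ipad + msg).digest()
    return hashlib.sha256(opad + inner).digest()

k, m = b"secret key", b"Let's meet up at 9!"
print(hmac_sha256(k, m) == hmac.new(k, m, hashlib.sha256).digest())  # True
```

The outer hash is what defeats length extension: an attacker who extends the inner input cannot produce the corresponding outer digest without k.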

SLIDE 44

Outline

Cryptography
Secret-key setting
Hash functions
Using cryptography
SLIDE 45

What to use

◮ key sizes: follow ENISA reports
◮ algorithms: take them from open competitions (also see ENISA)
◮ none of this applies to passwords
◮ “side channels” are a paramount consideration

SLIDE 46

Words of caution

limits

◮ crypto will not solve your problem
◮ only a small part of a secure system
◮ don’t implement it yourself

difficult problems

◮ trust / key distribution
◮ revocation
◮ ease of use

many requirements remain

◮ replay
◮ timing attacks
◮ endpoint security