SLIDE 1

All iLabs and P2PSS

Modern cryptography for communications security part 1 Benjamin Hof hof@in.tum.de

Lehrstuhl für Netzarchitekturen und Netzdienste Fakultät für Informatik Technische Universität München

Cryptography – 17ss

1 / 34

SLIDE 2

Outline

◮ Cryptography
◮ Symmetric setting

SLIDE 3

Outline

◮ Cryptography
◮ Symmetric setting

SLIDE 4

Scope

Focus on:

◮ modern cryptography
◮ methods used in communications security

Based on: Introduction to modern cryptography, Katz and Lindell, 2nd edition, 2015.

SLIDE 5

Communication

by Melissa Elliott https://twitter.com/0xabad1dea/status/400676797874208768

SLIDE 6

What we are concerned with

Alice Bob “Let’s meet up at 9!”

SLIDE 7

What we are concerned with

Alice Bob “Let’s meet up at 9!” BfV

Roens/Wikipedia. CC-by-sa 2.0

SLIDE 8

What we are concerned with

Alice → Bob: “Let’s meet up at 9!”, overheard by Eve

passive attack: eavesdropping

We want to provide confidentiality!

SLIDE 9

What we are concerned with

Alice → Bob: “This will not be on the exam!”, injected by Mallory

active attack: message modification or forgery

We want to provide message authentication!

SLIDE 10

Limitations

◮ cryptography is typically bypassed, not broken
  ◮ not applied correctly
  ◮ not implemented correctly
  ◮ subverted

No protection of information about the communication.

◮ existence
◮ time
◮ extent
◮ partners

SLIDE 11

Kerckhoffs’ principle

Security should only depend on secrecy of the key, not the secrecy of the system.

◮ key easier to keep secret
◮ change
◮ compatibility

No security by obscurity.

◮ scrutiny
◮ standards
◮ reverse engineering

SLIDE 12

Another principle as a side note

The system should be easy to use.

◮ Kerckhoffs actually postulated 6 principles
◮ this one got somewhat forgotten
◮ considered uncontroversial by Kerckhoffs
◮ starting to be rediscovered in the design of secure applications and libraries

Example

Signal, NaCl

SLIDE 13

What should secure encryption guarantee?

It should be impossible for the attacker to

SLIDE 14

What should secure encryption guarantee?

It should be impossible for the attacker to

◮ recover the key.
◮ recover the entire plaintext from the ciphertext.
◮ recover any character of the plaintext from the ciphertext.

SLIDE 15

What should secure encryption guarantee?

It should be impossible for the attacker to

◮ recover the key.
◮ recover the entire plaintext from the ciphertext.
◮ recover any character of the plaintext from the ciphertext.

Regardless of any information an attacker already has, a ciphertext should leak no additional information about the underlying plaintext.

SLIDE 16

Modern cryptography

relies on

◮ formal definitions
◮ precisely defined assumptions
◮ mathematical proofs

Reductionist security arguments (the proofs) require assumptions to be formulated explicitly.

SLIDE 17

A definition of security

A scheme is secure if every probabilistic polynomial-time adversary succeeds in breaking it with at most negligible probability.

Negligible

For every polynomial p and all sufficiently large values of n: f(n) < 1/p(n), e. g., f(n) = 1/2^n.

Church-Turing Hypothesis

We believe polynomial time models all computers.
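A quick numeric sanity check of the definition (a sketch, not part of the slides): the candidate negligible function f(n) = 1/2^n eventually drops below 1/p(n) for any fixed polynomial p, illustrated here with p(n) = n^10.

```python
# Negligible vs. polynomial: f(n) = 2^-n falls below 1/p(n) for all
# sufficiently large n, shown for the polynomial p(n) = n^10.
def f(n):
    return 2.0 ** -n          # candidate negligible function

def inv_p(n):
    return 1.0 / n ** 10      # 1/p(n) for p(n) = n^10

# For small n the polynomial bound is still smaller ...
assert f(10) >= inv_p(10)
# ... but from some point on, f(n) is (and stays) below 1/p(n).
assert all(f(n) < inv_p(n) for n in range(60, 200))
```

The crossover point depends on the polynomial, but for every fixed p it exists, which is exactly what the definition demands.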

SLIDE 18

Our goals

symmetric (secret-key)

◮ confidentiality
◮ authenticity

(as in: message integrity)

asymmetric (public-key)

◮ confidentiality
◮ authenticity
◮ key exchange

Something providing confidentiality generally makes no statement whatsoever about authenticity.

SLIDE 19

Motivation

What does a perfectly encrypted message look like?

SLIDE 20

Uniform distribution

P : U → [0, 1]

∑x∈U P(x) = 1

∀x ∈ U : P(x) = 1/|U|
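These two conditions can be checked empirically; the sketch below (an illustration, not from the slides) samples from the uniform distribution over a small U using Python's cryptographic `secrets` module and verifies that every outcome occurs with relative frequency close to 1/|U|.

```python
import secrets
from collections import Counter

# Sample uniformly from U = {0, ..., 7}: each outcome should occur
# with relative frequency close to 1/|U| = 0.125.
N = 80_000
counts = Counter(secrets.randbelow(8) for _ in range(N))

assert set(counts) == set(range(8))   # every element of U occurs
assert all(abs(counts[x] / N - 1 / 8) < 0.02 for x in range(8))
```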

SLIDE 21

Randomness

◮ required to do any cryptography at all
◮ somewhat difficult to get in a computer (deterministic!)
◮ required to be cryptographically secure: indistinguishable from truly random
◮ not provided in programming languages

Example

used to generate keys or other information unknown to any other party

SLIDE 22

Collecting unpredictable bits

◮ physical phenomena
  ◮ time between emission of particles during radioactive decay
  ◮ thermal noise from a semiconductor diode or resistor
◮ software-based
  ◮ elapsed time between keystrokes or mouse movement
  ◮ packet interarrival times
◮ attacker must not be able to guess/influence the collected values

1. collect pool of high-entropy data
2. process into a sequence of nearly independent and unbiased bits
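Step 2 can be sketched with the classic von Neumann extractor (an illustration, not an algorithm named on the slide), which turns a biased bit stream into unbiased bits, assuming the raw bits are independent.

```python
import random

def von_neumann_extract(bits):
    # Process pairs of raw bits: emit 0 for (0,1), 1 for (1,0),
    # discard (0,0) and (1,1). Output bits are unbiased provided
    # the input bits are independent.
    out = []
    for b0, b1 in zip(bits[::2], bits[1::2]):
        if b0 != b1:
            out.append(b0)
    return out

# Demo on a heavily biased (but independent) source with P(1) = 0.8.
rng = random.Random(0)
biased = [1 if rng.random() < 0.8 else 0 for _ in range(200_000)]
unbiased = von_neumann_extract(biased)
assert abs(sum(unbiased) / len(unbiased) - 0.5) < 0.02
```

The price is throughput: with bias p, only 2p(1 − p) of the input pairs yield an output bit.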

SLIDE 23

Pseudo-random generator

G : {0, 1}s → {0, 1}n, n ≫ s
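One way to sketch such a G in code is with an extendable-output function; this illustration assumes SHAKE-256 behaves as a pseudo-random expander, which is not a construction named on the slide.

```python
import hashlib, secrets

def G(seed, n_bytes):
    # Expand a short seed into a much longer pseudo-random string
    # via the SHAKE-256 extendable-output function.
    return hashlib.shake_256(seed).digest(n_bytes)

seed = secrets.token_bytes(16)        # s = 128 bits
out = G(seed, 1024)                   # n = 8192 bits, n >> s
assert len(out) == 1024
assert G(seed, 1024) == out           # deterministic given the seed
assert G(secrets.token_bytes(16), 1024) != out   # other seeds differ
```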

SLIDE 24

Outline

◮ Cryptography
◮ Symmetric setting

SLIDE 25

Symmetric encryption scheme

1. k ← Gen(1n), security parameter 1n
2. c ← Enck(m), m ∈ {0, 1}∗
3. m := Deck(c)

◮ provides confidentiality
◮ definition of security: chosen-plaintext attack (CPA)

Cryptography uses theoretical attack games to analyze and formalize security. Notation: C is the challenger, A the adversary; ← denotes non-deterministic assignment, := deterministic assignment.

SLIDE 26

The eavesdropping experiment

◮ C: k ← Gen(1n); A gets input 1n

SLIDE 27

The eavesdropping experiment

◮ C: k ← Gen(1n); A gets input 1n
◮ A sends m0, m1
◮ C picks b ← {0, 1} and returns c ← Enck(mb)
◮ A outputs b′
◮ A succeeds, iff b = b′
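The experiment can be simulated (a sketch, not from the slides): with a one-time pad as Enc, the ciphertext leaks nothing about b, so no adversary beats 1/2; the trivial adversary below, which ignores c and guesses, already achieves that bound.

```python
import secrets

def enc(k, m):
    # One-time pad: Enc_k(m) = k XOR m.
    return bytes(a ^ b for a, b in zip(k, m))

def experiment():
    m0, m1 = b"meet at nine!!!", b"meet at noon!!!"   # |m0| = |m1|
    k = secrets.token_bytes(len(m0))                  # k <- Gen(1^n)
    b = secrets.randbelow(2)                          # b <- {0, 1}
    c = enc(k, (m0, m1)[b])                           # c <- Enc_k(m_b)
    guess = secrets.randbelow(2)   # A ignores c and guesses b'
    return guess == b              # success iff b = b'

wins = sum(experiment() for _ in range(20_000))
assert abs(wins / 20_000 - 0.5) < 0.02   # no better than a coin flip
```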

SLIDE 28

Discussion of the eavesdropping experiment

◮ |m0| = |m1|
◮ probabilistic polynomial time algorithms
◮ success probability should be 0.5 + negligible
◮ if so, Enc has indistinguishable encryptions in the presence of an eavesdropper

SLIDE 29

Pseudorandom permutation

F : {0, 1}∗ × {0, 1}∗ → {0, 1}∗

◮ Fk(x) and Fk−1(y) efficiently computable
◮ Fk indistinguishable from a uniform permutation
◮ the adversary may have access to Fk−1

We can assume that all inputs and the output have the same length.

SLIDE 30

A block cipher

Example

◮ fixed key length and block length
◮ chop m into 128 bit blocks

[figure: a 128 bit message block m and key k enter AES, producing ciphertext block c]

Does this function survive the eavesdropping experiment?
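The answer is no: per-block deterministic encryption lets the adversary compare ciphertext blocks. A sketch (not from the slides; truncated HMAC-SHA-256 stands in for the raw AES block call, since only determinism matters here):

```python
import hmac, hashlib, secrets

def det_block_enc(k, block):
    # Deterministic stand-in for a raw 128-bit block-cipher call
    # (HMAC-SHA-256 truncated to 16 bytes).
    return hmac.new(k, block, hashlib.sha256).digest()[:16]

def enc(k, m):
    # "Chop m into 128 bit blocks", encrypt each block independently.
    return b"".join(det_block_enc(k, m[i:i + 16])
                    for i in range(0, len(m), 16))

def adversary_wins():
    k = secrets.token_bytes(16)
    m0 = b"A" * 16 + b"A" * 16           # two equal blocks
    m1 = b"A" * 16 + b"B" * 16           # two distinct blocks
    b = secrets.randbelow(2)
    c = enc(k, (m0, m1)[b])
    guess = 0 if c[:16] == c[16:] else 1  # equal ciphertext blocks => m0
    return guess == b

assert all(adversary_wins() for _ in range(100))  # A wins every time
```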

SLIDE 31

Chosen-plaintext attack

◮ C: k ← Gen(1n); A gets input 1n

SLIDE 32

Chosen-plaintext attack

◮ C: k ← Gen(1n); A gets input 1n
◮ A repeatedly queries m, receiving c ← Enck(m)

SLIDE 33

Chosen-plaintext attack

◮ C: k ← Gen(1n); A gets input 1n
◮ A repeatedly queries m, receiving c ← Enck(m)
◮ A sends m0, m1; C picks b ← {0, 1} and returns c ← Enck(mb)

SLIDE 34

Chosen-plaintext attack

◮ C: k ← Gen(1n); A gets input 1n
◮ A repeatedly queries m, receiving c ← Enck(m)
◮ A sends m0, m1; C picks b ← {0, 1} and returns c ← Enck(mb)
◮ A may continue querying m, receiving c ← Enck(m)
◮ A outputs bit b′

SLIDE 36

Discussion of CPA

◮ Enc is secure under chosen-plaintext attack
◮ again, messages must have the same length
◮ multiple-use key
◮ non-deterministic (e. g. random initialization vector) or stateful
◮ block cipher requires a mode of operation, e. g. counter (CTR), output feedback (OFB), . . .

SLIDE 37

Example constructions: counter mode

Example

◮ randomised AES counter mode (AES-CTR$)
◮ choose nonce r ← {0, 1}128, key k ← {0, 1}128
◮ great if you have dedicated circuits for AES, else vulnerable to timing attacks

ci := AESk(r + i) ⊕ mi, i.e., c0 = AESk(r) ⊕ m0, c1 = AESk(r + 1) ⊕ m1, · · ·

complete ciphertext c := (r, c0, c1, · · · )
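The randomised counter construction can be sketched as follows (an illustration, not the slide's exact scheme: HMAC-SHA-256 stands in for AES as the keyed function, since Python has no built-in AES):

```python
import hmac, hashlib, secrets

BLOCK = 32  # HMAC-SHA-256 output size, standing in for AES's 16 bytes

def keystream(k, r, length):
    # F_k(r), F_k(r+1), ... concatenated, truncated to `length` bytes.
    blocks = (length + BLOCK - 1) // BLOCK
    out = b"".join(
        hmac.new(k, ((r + i) % 2**128).to_bytes(16, "big"),
                 hashlib.sha256).digest()
        for i in range(blocks)
    )
    return out[:length]

def enc(k, m):
    r = secrets.randbelow(2**128)        # fresh nonce r <- {0,1}^128
    c = bytes(a ^ b for a, b in zip(m, keystream(k, r, len(m))))
    return r, c                          # complete ciphertext (r, c)

def dec(k, r, c):
    # Decryption regenerates the same keystream from (k, r).
    return bytes(a ^ b for a, b in zip(c, keystream(k, r, len(c))))

k = secrets.token_bytes(16)
r, c = enc(k, b"attack at dawn")
assert dec(k, r, c) == b"attack at dawn"
r2, c2 = enc(k, b"attack at dawn")
assert (r2, c2) != (r, c)    # randomised: same plaintext, fresh nonce
```

The fresh nonce per message is what makes the scheme non-deterministic, as the previous slide requires.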

SLIDE 38

Example constructions: stream ciphers

Example

A modern stream cipher, fast in software: ChaCha expands a 128 bit key, a 96 bit nonce, and a 32 bit initial counter into a keystream, which is XORed with the plaintext to produce the ciphertext.

SLIDE 39

Message authentication code (MAC)

1. k ← Gen(1n), security parameter 1n
2. t ← Mack(m), m ∈ {0, 1}∗
3. b := Vrfyk(m, t)

b = 1 means valid, b = 0 invalid

◮ transmit m, t
◮ tag t is a short authenticator
◮ message authenticity ⇔ integrity
◮ detect tampering
◮ no protection against replay
◮ “existentially unforgeable”
◮ security definition: adaptive chosen-message attack
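The Gen/Mac/Vrfy interface can be sketched with HMAC-SHA-256 (one of the practical MACs named two slides later; the function names here are just for illustration):

```python
import hmac, hashlib, secrets

def gen(n_bytes=32):
    return secrets.token_bytes(n_bytes)             # k <- Gen(1^n)

def mac(k, m):
    return hmac.new(k, m, hashlib.sha256).digest()  # t <- Mac_k(m)

def vrfy(k, m, t):
    # b := Vrfy_k(m, t): recompute the tag, compare in constant time.
    return hmac.compare_digest(mac(k, m), t)

k = gen()
m = b"This will not be on the exam!"
t = mac(k, m)                     # transmit m, t
assert vrfy(k, m, t)              # b = 1: valid
assert not vrfy(k, m + b"?", t)   # tampering detected: b = 0
```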

SLIDE 40

Adaptive chosen-message attack

◮ C: k ← Gen(1n); A gets input 1n
◮ A repeatedly queries m, receiving t ← Mack(m)
◮ A outputs m′, t′
◮ let Q be the set of all queries m
◮ A succeeds, iff Vrfyk(m′, t′) = 1 and m′ ∉ Q

SLIDE 41

Used in practice

Example

◮ HMAC based on hash functions
◮ CMAC based on the cipher block chaining mode (CBC)
◮ authenticated encryption modes

SLIDE 42

Example: side-channel attack

How does tag verification work and how to implement tag comparison correctly?
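One correct answer: recompute the tag and compare in constant time. A sketch (not from the slides) contrasting an early-exit comparison, whose running time leaks where the first mismatch occurs, with Python's constant-time `hmac.compare_digest`:

```python
import hmac

def leaky_equal(a, b):
    # Early-exit comparison: returns at the first mismatching byte,
    # so the running time leaks how long a matching prefix the
    # attacker has guessed -- enabling byte-by-byte tag forgery.
    for x, y in zip(a, b):
        if x != y:
            return False
    return len(a) == len(b)

def safe_equal(a, b):
    # Constant-time comparison from the standard library.
    return hmac.compare_digest(a, b)

t = bytes(range(16))
assert leaky_equal(t, t) and safe_equal(t, t)
assert not leaky_equal(t, bytes(16)) and not safe_equal(t, bytes(16))
```

Both functions return the same booleans; the difference is only in timing behaviour, which is exactly what the side channel exploits.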

SLIDE 43

Recap: secret-key cryptography

◮ attacker power: probabilistic polynomial time
◮ confidentiality defined as IND-CPA: encryption, e. g. AES-CTR$
◮ message authentication defined as existentially unforgeable under adaptive chosen-message attack: message authentication codes, e. g. HMAC-SHA2
◮ authenticated encryption modes

SLIDE 44

Combining confidentiality and authentication

◮ encrypt-then-authenticate is generally secure:
  c ← Enck1(m), t ← Mack2(c); transmit: c, t
◮ authenticated encryption is also a good choice, e. g. offset codebook (OCB) or Galois/Counter Mode (GCM):
  c, t ← AEADenck(ad, m)
  m := AEADdeck(ad, c, t), or verification failure

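Encrypt-then-authenticate can be sketched compactly with two independent keys (an illustration, not the slide's construction: HMAC-SHA-256 stands in both for the cipher's keystream function and for Mac):

```python
import hmac, hashlib, secrets

def ks(k, r, n):
    # Keystream from a counter over an HMAC-based PRF (AES stand-in).
    out = b"".join(hmac.new(k, (r + i).to_bytes(16, "big"),
                            hashlib.sha256).digest()
                   for i in range(n // 32 + 1))
    return out[:n]

def enc_then_mac(k1, k2, m):
    r = secrets.randbelow(2**64)                       # fresh nonce
    body = bytes(a ^ b for a, b in zip(m, ks(k1, r, len(m))))
    c = (r, body)                                      # c <- Enc_k1(m)
    t = hmac.new(k2, r.to_bytes(16, "big") + body,
                 hashlib.sha256).digest()              # t <- Mac_k2(c)
    return c, t                                        # transmit: c, t

def dec(k1, k2, c, t):
    r, body = c
    expect = hmac.new(k2, r.to_bytes(16, "big") + body,
                      hashlib.sha256).digest()
    if not hmac.compare_digest(expect, t):
        raise ValueError("verification failure")  # reject, never decrypt
    return bytes(a ^ b for a, b in zip(body, ks(k1, r, len(body))))

k1, k2 = secrets.token_bytes(32), secrets.token_bytes(32)
c, t = enc_then_mac(k1, k2, b"meet up at 9")
assert dec(k1, k2, c, t) == b"meet up at 9"
```

Verifying the tag over the whole ciphertext before decrypting is the point of the composition: tampered ciphertexts are rejected without ever touching the decryption key.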