iLab: Modern cryptography for communications security, Benjamin Hof



slide-1
SLIDE 1

iLab

Modern cryptography for communications security Benjamin Hof hof@in.tum.de

Lehrstuhl für Netzarchitekturen und Netzdienste Fakultät für Informatik Technische Universität München

Cryptography – 15ws

1 / 72

slide-2
SLIDE 2

Outline

Cryptography Private-key setting Public-key setting Meta

2 / 72

slide-3
SLIDE 3

Outline

Cryptography Private-key setting Public-key setting Meta

3 / 72

slide-4
SLIDE 4

Scope

Focus on:

◮ modern cryptography ◮ methods used in communications security

Based on: Introduction to modern cryptography, Katz and Lindell, 2nd edition, 2015.

4 / 72

slide-5
SLIDE 5

Communication

by Melissa Elliott https://twitter.com/0xabad1dea/status/400676797874208768

5 / 72

slide-6
SLIDE 6

What we are concerned with

Alice Bob “Let’s meet up at 9!”

6 / 72

slide-7
SLIDE 7

What we are concerned with

Alice Bob “Let’s meet up at 9!” BfV

Roens/Wikipedia. CC-by-sa 2.0

6 / 72

slide-8
SLIDE 8

What we are concerned with

Alice → Bob: “Let’s meet up at 9!”, with Eve listening
passive attack: eavesdropping
We want to provide confidentiality!

6 / 72

slide-9
SLIDE 9

What we are concerned with

Alice → Bob, with Mallory in the middle: “You can trust Trent!”
active attack: message modification
We want to provide message authentication!

6 / 72

slide-10
SLIDE 10

Limitations

◮ cryptography is typically bypassed, not broken
◮ not applied correctly
◮ not implemented correctly
◮ subverted

communication
◮ existence
◮ time
◮ extent
◮ partners

7 / 72

slide-11
SLIDE 11

Kerckhoffs’ principle

Security should depend only on the secrecy of the key, not on the secrecy of the system.

◮ key easier to keep secret ◮ change ◮ compatibility

No security by obscurity.

◮ scrutiny ◮ standards ◮ reverse engineering 8 / 72

slide-12
SLIDE 12

Another principle as a side note

The system should be easy to use.

◮ Kerckhoffs actually postulated 6 principles
◮ this one got somewhat forgotten
◮ starting to be rediscovered in the design of secure applications and libraries

Example

Signal, NaCl

9 / 72

slide-13
SLIDE 13

Modern cryptography

relies on

◮ formal definitions ◮ precisely defined assumptions ◮ mathematical proofs

Reductionist security arguments, the “proofs”, require assumptions to be formulated explicitly.

10 / 72

slide-14
SLIDE 14

Uniform distribution

P : U → [0, 1]

∑_{x∈U} P(x) = 1

∀x ∈ U : P(x) = 1/|U|

11 / 72

slide-15
SLIDE 15

Randomness

◮ required to do any cryptography at all
◮ somewhat difficult to get in a computer (deterministic!)
◮ required to be cryptographically secure: indistinguishable from truly random
◮ not provided by default in programming languages

Example

used to generate keys or other information unknown to any other parties

12 / 72
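The slide's point can be illustrated in Python (an illustration, not part of the lecture): the default `random` module is a fast but predictable PRNG, while the stdlib `secrets` module exposes the operating system's CSPRNG and is the right tool for keys and nonces.

```python
import random
import secrets

# random is a Mersenne Twister: fast, but predictable once its
# internal state is known -- never use it for keys.
insecure = random.getrandbits(128)

# secrets draws from the operating system's CSPRNG (os.urandom),
# suitable for generating keys and nonces.
key = secrets.token_bytes(16)       # 128-bit key
nonce = secrets.token_bytes(12)     # 96-bit nonce, e.g. for ChaCha20
hexkey = secrets.token_hex(16)      # same entropy, hex-encoded

print(len(key), len(nonce), len(hexkey))
```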

slide-16
SLIDE 16

Collecting unpredictable bits

  • 1. collect pool of high-entropy data
  • 2. process into sequence of nearly independent and unbiased bits

◮ physical phenomena

◮ time between emission of particles during radioactive decay ◮ thermal noise from a semiconductor diode or resistor

◮ software-based

◮ elapsed time between keystrokes or mouse movement ◮ packet interarrival times

◮ attacker must not be able to guess/influence the collected

values

13 / 72

slide-17
SLIDE 17

Pseudo-random generator

G : {0, 1}^s → {0, 1}^n, n ≫ s

14 / 72

slide-18
SLIDE 18

A definition of security

A scheme is secure if any probabilistic polynomial-time adversary succeeds in breaking the scheme with at most negligible probability.

Negligible

For every polynomial p and for all sufficiently large values of n: f(n) < 1/p(n)

e.g., f(n) = 1/2^n

Church-Turing Hypothesis

We believe polynomial time models all computers.

15 / 72
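A quick numeric sketch (my illustration, not from the slides) of the definition: f(n) = 2^−n eventually drops below 1/p(n) for any fixed polynomial p, here p(n) = n^10.

```python
# A negligible function is eventually smaller than any inverse polynomial.
def negligible(n):
    return 2.0 ** -n

def inv_poly(n):
    return 1.0 / n ** 10

# For small n the inverse polynomial is smaller...
assert negligible(10) > inv_poly(10)
# ...but from some point on, 2^-n is smaller for every larger n.
crossover = next(n for n in range(2, 200) if negligible(n) < inv_poly(n))
print(crossover)
```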

slide-19
SLIDE 19

Our goals

private-key (symmetric)

◮ confidentiality ◮ authenticity

(as in: message integrity)

public-key (asymmetric)

◮ confidentiality ◮ authenticity ◮ key exchange

Something providing confidentiality generally makes no statement whatsoever about authenticity.

16 / 72

slide-20
SLIDE 20

Outline

Cryptography Private-key setting Public-key setting Meta

17 / 72

slide-21
SLIDE 21

Private-key encryption scheme

1. k ← Gen(1^n), security parameter 1^n
2. c ← Enc_k(m), m ∈ {0, 1}*
3. m := Dec_k(c)

◮ provides confidentiality
◮ definition of security: chosen-plaintext attack (CPA)

Cryptography uses theoretical attack games to analyze and formalize security. C: challenger, A: adversary; ← means non-deterministic, := means deterministic.

18 / 72


slide-23
SLIDE 23

The eavesdropping experiment

C: k ← Gen(1^n); A gets input 1^n
A sends m_0, m_1
C: b ← {0, 1}, c ← Enc_k(m_b); sends c
A outputs b′

◮ A succeeds iff b = b′

19 / 72

slide-24
SLIDE 24

Discussion of the eavesdropping experiment

◮ |m_0| = |m_1|
◮ probabilistic polynomial time algorithms
◮ success probability should be 0.5 + negligible
◮ if so, Enc has indistinguishable encryptions in the presence of an eavesdropper

20 / 72

slide-25
SLIDE 25

Pseudorandom permutation

F : {0, 1}∗ × {0, 1}∗ → {0, 1}∗

◮ F_k(x) and F_k^{-1}(y) efficiently computable
◮ F_k indistinguishable from a uniform permutation
◮ adversary may have access to F^{-1}

We can assume that all inputs and the output have the same length.

21 / 72

slide-26
SLIDE 26

A block cipher

Example

◮ fixed key length and block length
◮ chop m into 128 bit blocks

m → AES (key k) → c, 128 bit

Does this function survive the eavesdropping experiment?

22 / 72

slide-27
SLIDE 27

Chosen-plaintext attack

C: k ← Gen(1^n); A gets input 1^n
A may query messages m and receive c ← Enc_k(m) (repeatable)
A sends m_0, m_1
C: b ← {0, 1}; sends Enc_k(m_b)
A may continue querying: m, receive c ← Enc_k(m)
A outputs bit b′

23 / 72

slide-32
SLIDE 32

Discussion of CPA

◮ Enc is secure under chosen-plaintext attack
◮ again, messages must have same length
◮ multiple-use key
◮ non-deterministic (e. g. random initialization vector) or stateful
◮ block cipher requires a mode of operation: counter (CTR), output feedback (OFB), . . .

24 / 72

slide-33
SLIDE 33

Example constructions: counter mode

Example

◮ randomised AES counter mode (AES-CTR$)
◮ choose nonce r ← {0, 1}^128, key k ← {0, 1}^128
◮ great if you have dedicated circuits for AES, else vulnerable to timing attacks

c_0 := AES_k(r) ⊕ m_0, c_1 := AES_k(r + 1) ⊕ m_1, · · ·

complete ciphertext c := (r, c_0, c_1, · · · )

25 / 72
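The CTR$ structure can be sketched in Python. Note the block cipher is a stand-in: to keep the sketch stdlib-only, AES is replaced by a PRF built from HMAC-SHA256, so this shows the mode's shape (c_i = F_k(r + i) ⊕ m_i), not a production cipher.

```python
import hmac
import hashlib
import secrets

def prf(key: bytes, counter_block: bytes) -> bytes:
    """Stand-in for AES_k: HMAC-SHA256 truncated to 16 bytes.
    Illustrates the CTR structure only; real systems use AES."""
    return hmac.new(key, counter_block, hashlib.sha256).digest()[:16]

def ctr_encrypt(key: bytes, msg: bytes) -> tuple[bytes, bytes]:
    r = int.from_bytes(secrets.token_bytes(16), "big")  # random nonce r
    blocks = [msg[i:i + 16] for i in range(0, len(msg), 16)]
    out = b""
    for i, m_i in enumerate(blocks):
        ks = prf(key, ((r + i) % 2**128).to_bytes(16, "big"))
        out += bytes(a ^ b for a, b in zip(m_i, ks))  # c_i = F_k(r+i) XOR m_i
    return r.to_bytes(16, "big"), out

def ctr_decrypt(key: bytes, nonce: bytes, ct: bytes) -> bytes:
    # CTR is symmetric: regenerate the keystream and XOR again.
    r = int.from_bytes(nonce, "big")
    blocks = [ct[i:i + 16] for i in range(0, len(ct), 16)]
    out = b""
    for i, c_i in enumerate(blocks):
        ks = prf(key, ((r + i) % 2**128).to_bytes(16, "big"))
        out += bytes(a ^ b for a, b in zip(c_i, ks))
    return out

key = secrets.token_bytes(16)
nonce, ct = ctr_encrypt(key, b"attack at dawn, bring coffee")
assert ctr_decrypt(key, nonce, ct) == b"attack at dawn, bring coffee"
```

The fresh random nonce is what makes the scheme non-deterministic: encrypting the same message twice yields different ciphertexts, as CPA security requires.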

slide-34
SLIDE 34

Example constructions: stream ciphers

Example

A modern stream cipher, fast in software: ChaCha takes a 256 bit key, a 96 bit nonce and a 32 bit initial counter, and produces a keystream; ciphertext = plaintext ⊕ keystream.

26 / 72

slide-35
SLIDE 35

Message authentication code

1. k ← Gen(1^n), security parameter 1^n
2. t ← Mac_k(m), m ∈ {0, 1}*
3. b := Vrfy_k(m, t)

b = 1 means valid, b = 0 invalid

◮ transmit m, t
◮ tag t is a short authenticator
◮ message authenticity ⇔ integrity
◮ detect tampering
◮ no protection against replay
◮ “existentially unforgeable”
◮ security definition: adaptive chosen-message attack

27 / 72

slide-36
SLIDE 36

Adaptive chosen-message attack

C: k ← Gen(1^n); A gets input 1^n
A may query messages m and receive t ← Mac_k(m) (repeatable)
A outputs (m′, t′)

◮ let Q be the set of all queried m
◮ A succeeds iff Vrfy_k(m′, t′) = 1 and m′ ∉ Q

28 / 72

slide-37
SLIDE 37

Used in practice

Example

◮ HMAC based on hash functions ◮ CMAC based on CBC mode ◮ authenticated encryption modes 29 / 72

slide-38
SLIDE 38

Side-channel attacks

How does tag verification work and how to implement tag comparison correctly?

30 / 72
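One common answer, sketched in Python: verification recomputes the tag and compares it in constant time. The stdlib's `hmac.compare_digest` runs in time independent of where the inputs differ; a plain `==` returns early at the first mismatching byte, letting an attacker recover a valid tag byte by byte via timing.

```python
import hmac
import hashlib

key = b"k" * 32
msg = b"some message"
tag = hmac.new(key, msg, hashlib.sha256).digest()

def vrfy(key: bytes, msg: bytes, tag: bytes) -> bool:
    # Vrfy_k(m, t): recompute Mac_k(m) and compare against t.
    expected = hmac.new(key, msg, hashlib.sha256).digest()
    # Constant-time comparison: does not leak the first mismatch position.
    return hmac.compare_digest(expected, tag)

assert vrfy(key, msg, tag)
assert not vrfy(key, b"tampered", tag)
```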

slide-39
SLIDE 39

Cryptographic hash functions

private-key

◮ encryption ◮ message

authentication codes

◮ hash functions

public-key

. . .

31 / 72

slide-40
SLIDE 40

Hash functions

◮ variable length input ◮ fixed length output

provide:

1. pre-image resistance: given H(x) with a randomly chosen x, cannot find x′ s. t. H(x′) = H(x); “H is one-way”
2. second pre-image resistance: given x, cannot find x′ ≠ x s. t. H(x′) = H(x)
3. collision resistance: cannot find x ≠ x′ s. t. H(x) = H(x′)

32 / 72

(diagram: variable-length input → H(·) → fixed-length output)


slide-42
SLIDE 42

Birthday problem

question one

◮ number of people in a room required
◮ s. t. P[same birthday as you] ≥ 0.5:

1 − (364/365)^n ≥ 0.5, so n ≥ 253 people necessary. ⇒ second pre-image

question two

◮ number of people in a room required
◮ s. t. P[at least two people with same birthday] ≥ 0.5:

n ≈ const · √365 ≈ 23. ⇒ collision

33 / 72

slide-43
SLIDE 43

Birthday problem (cont’d)

◮ collision resistance is the strongest property
◮ implies pre-image resistance and second pre-image resistance
◮ usually broken first: MD5, SHA1
◮ hash function with output size of 128 bit: ≤ 2^128 possible outputs
◮ finding collisions: √(2^128) = 2^64
◮ minimum output size: 256 bit

34 / 72
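The birthday bound can be demonstrated on a toy hash (my illustration: SHA-256 truncated to 16 bits so that collisions are cheap to find). A collision shows up after roughly √(2^16) = 2^8 attempts, far fewer than the 2^16 needed for a second pre-image.

```python
import hashlib

def h16(x: bytes) -> bytes:
    """Toy hash: SHA-256 truncated to 16 bits, so collisions are easy."""
    return hashlib.sha256(x).digest()[:2]

# Birthday search: hash distinct inputs until two collide.
seen = {}
collision = None
for i in range(100000):  # pigeonhole guarantees a hit within 2^16 + 1 inputs
    x = str(i).encode()
    d = h16(x)
    if d in seen:
        collision = (seen[d], x)
        break
    seen[d] = x

x1, x2 = collision
assert x1 != x2 and h16(x1) == h16(x2)
print(f"collision after {len(seen) + 1} hashes")
```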

slide-44
SLIDE 44

HMAC

A popular MAC:

◮ ipad is 0x36, opad is 0x5C

tag := H((k ⊕ opad) ‖ H((k ⊕ ipad) ‖ m))

◮ use SHA2-512, truncate tag to 256 bits

Used with Merkle-Damgård functions, since they allow computing the extension H(k ‖ m ‖ tail) from H(k ‖ m) alone; the nested construction defends against this length extension.

35 / 72
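In practice one uses a library rather than implementing this by hand; Python's stdlib `hmac` module implements exactly this nested scheme. The manual recomputation below, with the RFC 2104 constants ipad = 0x36 and opad = 0x5C and SHA-512's 128-byte block size, is for illustration only (it assumes the key is no longer than the block size, so it is zero-padded).

```python
import hmac
import hashlib

key = b"\x0b" * 32
msg = b"the quick brown fox"

# HMAC-SHA2-512, tag truncated to 256 bits as suggested on the slide.
tag = hmac.new(key, msg, hashlib.sha512).digest()[:32]

# The nested structure H((k XOR opad) || H((k XOR ipad) || m)) by hand.
# SHA-512 has a 128-byte block; a short key is zero-padded to that size.
block = key.ljust(128, b"\x00")
ipad = bytes(b ^ 0x36 for b in block)
opad = bytes(b ^ 0x5C for b in block)
inner = hashlib.sha512(ipad + msg).digest()
manual = hashlib.sha512(opad + inner).digest()[:32]
assert manual == tag
```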

slide-45
SLIDE 45

Combining confidentiality and authentication

◮ encrypt-then-authenticate:

c ← Enck1(m), t ← Mack2(c) transmit: c, t This is generally secure.

◮ authenticated encryption

Also a good choice.

e. g. offset codebook (OCB), Galois/counter mode (GCM)

36 / 72

slide-46
SLIDE 46

Recap: private-key cryptography

◮ attacker power: probabilistic polynomial time ◮ confidentiality defined as IND-CPA:

encryption, e. g. AES-CTR$

◮ message authentication defined as existentially unforgeable

under adaptive chosen-message attack: message authentication codes, e. g. HMAC-SHA2

◮ authenticated encryption modes 37 / 72

slide-47
SLIDE 47

Outline

Cryptography Private-key setting Public-key setting Meta

38 / 72

slide-48
SLIDE 48

The idea

We no longer have one shared key, but each participant has a key pair:

◮ a private key we give to nobody else ◮ a public key to be published, e. g. on a keyserver 39 / 72

slide-49
SLIDE 49

Public-key cryptography

◮ based on mathematical problems believed to be hard
◮ proofs often only in the weaker random oracle model
◮ only authenticated channels needed for key exchange, not private ones
◮ fewer keys required
◮ orders of magnitude slower

Problems believed to be hard

◮ RSA assumption based on integer factorization ◮ discrete logarithm and Diffie-Hellman assumption

◮ elliptic curves ◮ El Gamal encryption ◮ Digital Signature Standard/Algorithm

40 / 72

slide-50
SLIDE 50

Public-key cryptography

private-key

◮ encryption ◮ message

authentication codes

◮ hash functions

public-key

◮ encryption ◮ signatures ◮ key exchange 41 / 72

slide-51
SLIDE 51

Uses

◮ encryption

◮ encrypt with public key of key owner ◮ decrypt with private key

◮ signatures

◮ sign with private key ◮ verify with public key of key owner ◮ authentication with non-repudiation

◮ key exchange

◮ protect past sessions against key compromise

42 / 72

slide-52
SLIDE 52

Uses

◮ encryption

◮ encrypt with public key of key owner ◮ decrypt with private key

◮ signatures

◮ sign with private key ◮ verify with public key of key owner ◮ authentication with non-repudiation

◮ key exchange

◮ protect past sessions against key compromise

Encryption and signing have nothing to do with each other.

42 / 72

slide-53
SLIDE 53

Public-key encryption scheme

1. (pk, sk) ← Gen(1^n), security parameter 1^n
2. c ← Enc_pk(m)
3. m := Dec_sk(c)

We may need to map the plaintext onto the message space.

43 / 72

slide-54
SLIDE 54

RSA primitive

Textbook RSA

1. (N, p, q) ← GenModulus(1^n)
2. φ(N) := (p − 1)(q − 1)
3. find e: gcd(e, φ(N)) = 1
4. d := [e^{−1} mod φ(N)]

public key pk = (N, e), private key sk = (N, d)

Operations:

1. public key operation on a value y ∈ Z*_N: z := [y^e mod N], denoted z := RSA_pk(y)
2. private key operation on a value z ∈ Z*_N: y := [z^d mod N], denoted y := RSA_sk(z)

44 / 72
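The key generation and both operations can be traced with textbook-sized numbers, here the classic toy pair p = 61, q = 53 (real moduli are thousands of bits, and unpadded textbook RSA is not secure).

```python
# Toy textbook RSA with tiny primes -- illustration only.
p, q = 61, 53
N = p * q                      # modulus, 3233
phi = (p - 1) * (q - 1)        # phi(N) = 3120

e = 17                         # gcd(17, 3120) = 1
d = pow(e, -1, phi)            # d := e^-1 mod phi(N)

pk = (N, e)                    # public key
sk = (N, d)                    # private key

y = 65                         # a value in Z*_N
z = pow(y, e, N)               # public-key operation  z := RSA_pk(y)
assert pow(z, d, N) == y       # private-key operation recovers y

# Determinism is why textbook RSA fails CPA: equal inputs, equal outputs.
assert pow(y, e, N) == z
```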

slide-55
SLIDE 55

RSA assumption

steps

1. choose uniform x ∈ Z*_N
2. A is given N, e, and [x^e mod N]

assumption

Infeasible to recover x.

45 / 72

slide-56
SLIDE 56

Chosen-plaintext attack

C: (pk, sk) ← Gen(1^n); A receives pk
A may query m and receive c ← Enc_pk(m) (with pk, A can also encrypt on its own)
A sends m_0, m_1
C: b ← {0, 1}; sends Enc_pk(m_b)
A may continue querying
A outputs bit b′

46 / 72

slide-57
SLIDE 57

Security of RSA

◮ textbook RSA is deterministic → must be insecure against CPA
⇒ textbook RSA is not secure
◮ can be used to build secure encryption functions with an appropriate encoding scheme

We want a construction with proof:

◮ use the RSA function
◮ breaking the construction implies ability to factor large numbers
◮ “breaks RSA assumption”
◮ factoring believed to be difficult (assumption!)
◮ secure at least against CPA

armoring (“padding”) schemes needed

◮ attacks exist, but often used: PKCS #1 v1.5
◮ better security: PKCS #1 v2.1/v2.2 (OAEP)

47 / 72

slide-58
SLIDE 58

Chosen-ciphertext attack

C: (pk, sk) ← Gen(1^n); A receives pk
A may query ciphertexts c and receive m := Dec_sk(c) (repeatable)
A sends m_0, m_1
C: b ← {0, 1}; sends Enc_pk(m_b)
A may continue querying Dec_sk(·)
A outputs bit b′

Adversary may not request decryption of Enc_pk(m_b) itself.

48 / 72

slide-63
SLIDE 63

Optimal asymmetric encryption padding

r ← {0, 1}^{k_0}
m̂_0 := (m ‖ 0^{k_1}) ⊕ G(r)
m̂_1 := r ⊕ H(m̂_0)
m̂ := m̂_0 ‖ m̂_1
c := RSA_pk(m̂), recall: c := [m̂^e mod N]

49 / 72

slide-64
SLIDE 64

Discussion

A proof exists with

assumptions:

◮ G, H hash functions with random oracle property ◮ RSA assumption: RSA is one-way

result:

⇒ RSA-OAEP secure against CCA

◮ negligible probability 50 / 72

slide-65
SLIDE 65

Signature scheme

1. (pk, sk) ← Gen(1^n)
2. σ ← Sign_sk(m)
3. b := Vrfy_pk(m, σ)

b = 1 means valid, b = 0 invalid

51 / 72

slide-66
SLIDE 66

Signatures

◮ (often) slower than MACs ◮ non-repudiation ◮ verify OS packages

RSA signatures

◮ RSA not a secure signature function ◮ PKCS #1 v1.5 ◮ use RSASSA-PSS 52 / 72

slide-67
SLIDE 67

Adaptive chosen-message attack

C: (pk, sk) ← Gen(1^n); A receives pk
A may query messages m and receive σ ← Sign_sk(m) (repeatable)
A outputs (m′, σ′)

◮ let Q be the set of all queried m
◮ A succeeds iff Vrfy_pk(m′, σ′) = 1 and m′ ∉ Q

53 / 72

slide-68
SLIDE 68

Goal

◮ signature function using RSA
◮ breaking the signature function implies breaking the RSA assumption
◮ proof

54 / 72

slide-69
SLIDE 69

RSASSA-PSS

(diagram: m is hashed with SHA2; pad1, the hash and a salt are hashed again with SHA2; an MGF mask is XORed over the data block containing pad2 and the salt; masked data block ‖ hash ‖ 0xBC is passed to RSA_sk(·))

55 / 72

slide-70
SLIDE 70

Overview: signatures using RSA

sign with sk on m → σ; transmit m, σ; verify with pk on m′, σ → valid/invalid

Sign_sk(m):
  em ← PSS(m)  // encoding
  σ := RSA_sk(em)

Vrfy_pk(m′, σ):
  em := RSA_pk(σ)
  salt := recover-PSS-salt(em)
  em′ := PSS(m′, salt)
  check em′ =? em

56 / 72

slide-71
SLIDE 71

Discussion

A proof exists with

assumptions:

◮ random oracle model ◮ RSA assumption: RSA is one-way

result:

⇒ RSA-PSS existentially unforgeable under adaptive chosen-message attack

◮ negligible probability 57 / 72


slide-73
SLIDE 73

Combining signatures and encryption

Goal: S sends message m to R, assuring:

◮ secrecy
◮ message came from S

encrypt-then-authenticate

◮ S, c, Sign_{sk_S}(c)
◮ attacker A executes CCA: A, c, Sign_{sk_A}(c), so A passes off S’s ciphertext as its own ⇒ successful attack

58 / 72


slide-75
SLIDE 75

Signcryption cont’d

authenticate-then-encrypt

◮ σ ← Sign_{sk_S}(m)
◮ S, Enc_{ek_R}(m ‖ σ)
◮ malicious R re-encrypts for R′: S, Enc_{ek_{R′}}(m ‖ σ) ⇒ successful attack

solution for AtE

◮ compute σ ← Sign_{sk_S}(m ‖ R)

59 / 72

slide-76
SLIDE 76

Perfect forward security

Assume

◮ long-term (identity) keys ◮ session keys (for protecting one connection)

Idea

◮ attacker captures private-key encrypted traffic ◮ later: an endpoint is compromised → keys are compromised

We want: security of past connections should not be broken.

Perfect forward security

protection of past sessions against:

◮ compromise of session key ◮ compromise of long-term key 60 / 72

slide-77
SLIDE 77

Decisional Diffie-Hellman assumption

Alice and Bob perform a Diffie-Hellman exchange (DH_a, DH_b); both compute s; the transcript is stored.

C: b ← {0, 1}; if b = 0: ŝ := s, else: ŝ ← random
C sends ŝ and the transcript to A
A outputs b′

61 / 72

slide-78
SLIDE 78

Textbook Diffie-Hellman key exchange

◮ p prime
◮ generator g (primitive root of Z*_p): {g^0, g^1, g^2, . . . } = {1, 2, . . . , p − 1}

Alice: a ← Z_p, X := g^a mod p, sends (p, g, X)
Bob: b ← Z_p, Y := g^b mod p, sends Y
Alice: s := Y^a mod p; Bob: s := X^b mod p; both: k := KDF(s)

◮ Y^a = g^{ba} = g^{ab} = X^b mod p
◮ insecure for certain weak values

62 / 72
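The exchange can be traced with toy parameters (p = 23 with generator g = 5; an illustration only, real deployments use 2048-bit groups or elliptic curves).

```python
import secrets

# Toy Diffie-Hellman over a tiny prime. p = 23, generator g = 5.
p, g = 23, 5

a = secrets.randbelow(p - 2) + 1   # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1   # Bob's secret exponent

X = pow(g, a, p)                   # Alice sends X = g^a mod p
Y = pow(g, b, p)                   # Bob sends   Y = g^b mod p

s_alice = pow(Y, a, p)             # Y^a = g^(ab) mod p
s_bob = pow(X, b, p)               # X^b = g^(ab) mod p
assert s_alice == s_bob            # both sides derive the same secret
```

An eavesdropper sees p, g, X, Y; recovering s from these is the (computational) Diffie-Hellman problem.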

slide-79
SLIDE 79

Elliptic curve Diffie-Hellman key exchange: X25519

◮ p = 2^255 − 19
◮ curve E over F_p
◮ E : y^2 = x^3 + 486662x^2 + x

Alice: a ← {0, 1}^255, A := aG, sends A
Bob: b ← {0, 1}^255, B := bG, sends B
Alice: s := aB; Bob: s := bA; both: k := KDF(s_x)

(Other ECDH cryptosystems will need additional verification steps.)

63 / 72

slide-80
SLIDE 80

Perfect forward security

◮ generate new DH key for each connection ◮ wipe old shared keys

Compromise of long term keys in combination with eavesdropping does not break security of past connections anymore!

64 / 72

slide-81
SLIDE 81

Hybrid approach

Public-key cryptography

◮ valuable properties ◮ slow

Hybrid encryption

◮ protect shared key with public-key cryptography ◮ protect bulk traffic with private-key cryptography

Example

k ← {0, 1}n w ← Encpk(k) c0 ← Enck(msg0) c1 ← Enck(msg1) transmit: w, c0, c1

65 / 72

slide-82
SLIDE 82

Combining private-key and public-key methods in protocols

  • e. g.:

handshake

◮ Diffie-Hellman key exchange ◮ signatures for entity authentication ◮ key derivation ◮ . . .

transport

◮ private-key authenticated encryption ◮ replay protection 66 / 72

slide-83
SLIDE 83

Key size equivalents

            private-key   hash output   RSA     DLOG    EC
near term   128           256           3072    3072    256
long term   256           512           15360   15360   512

ENISA report, Nov. 2014

openssl on my E5-1630, ops/s (very unscientific):

◮ 175 sig RSA4096
◮ 1773 sig RSA2048
◮ 10990 vrfy ECDSAp256

67 / 72

slide-84
SLIDE 84

Considerations

◮ different keys for different purposes
◮ algorithms from competitions: eSTREAM, PHC, AES, SHA, CAESAR
  ◮ e. g. Salsa20, AES
◮ key sizes: ENISA, ECRYPT2, Suite B, keylength.com
  ◮ e. g. ECRYPT2: RSA keys ≥ 3248 bit
◮ keys based on passwords: Argon2, scrypt, bcrypt, PBKDF2

In networking, timing is not “just a side channel”. Demand constant-time implementations.

68 / 72

slide-85
SLIDE 85

What has to go right

algorithms, protocol design, implementation, library API design, deployment & correct usage: this spans cryptographic security as well as software security and side channels

inspired by Matthew D. Green, Pascal Junod

69 / 72

slide-86
SLIDE 86

Words of caution

limits

◮ crypto will not solve your problem ◮ only a small part of a secure system ◮ don’t implement yourself

difficult to solve problems

◮ trust / key distribution

◮ revocation

◮ ease of use

many requirements remaining

◮ replay ◮ timing attack ◮ endpoint security 70 / 72

slide-87
SLIDE 87

Outline

Cryptography Private-key setting Public-key setting Meta

71 / 72

slide-88
SLIDE 88

Oral attestations

◮ 2015-12-04 Fri ◮ 2015-12-07 Mon ◮ 2015-12-08 Tue ◮ 2015-12-11 Fri

Registration in Moodle starts tonight at 20:00 and ends this Sunday.

72 / 72