

SLIDE 1

Can Charlie distinguish Alice and Bob?

Automated verification of equivalence properties Steve Kremer

joint work with: Myrto Arapinis, David Baelde, Rohit Chadha, Vincent Cheval, Ştefan Ciobâcă, Véronique Cortier, Stéphanie Delaune, Ivan Gazeau, Itsaka Rakotonirina, Mark Ryan

29th IEEE Computer Security Foundations Symposium

SLIDE 2

Cryptographic protocols everywhere!

◮ Distributed programs that
  ◮ use crypto primitives (encryption, digital signatures, . . . )
  ◮ to ensure security properties (confidentiality, authentication, anonymity, . . . )

SLIDE 3

Symbolic models for protocol verification

Main ingredients of symbolic models

◮ messages = terms, e.g. enc(pair(s1, s2), k)

◮ perfect cryptography (equational theories)

dec(enc(x, y), y) = x
fst(pair(x, y)) = x
snd(pair(x, y)) = y

◮ the network is the attacker:
  ◮ messages can be eavesdropped
  ◮ messages can be intercepted
  ◮ messages can be injected

Dolev, Yao: On the Security of Public Key Protocols. FOCS’81
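As an illustrative sketch (not from the talk), the "messages = terms" view with perfect cryptography can be made concrete by a tiny bottom-up rewriter over terms-as-tuples; the three equations above are oriented left to right as rewrite rules, and for these subterm rules a single bottom-up pass already yields the normal form.

```python
# Messages as terms: ("enc", m, k) stands for enc(m, k); names are plain strings.
# Rewrite rules (oriented equations): dec(enc(x, y), y) -> x,
# fst(pair(x, y)) -> x, snd(pair(x, y)) -> y.

def rewrite(t):
    """Normalise a term bottom-up; for these subterm rules one pass suffices."""
    if not isinstance(t, tuple):
        return t                      # a name or constant is already normal
    head, *args = t
    args = [rewrite(a) for a in args]
    if head == "dec" and isinstance(args[0], tuple) \
            and args[0][0] == "enc" and args[0][2] == args[1]:
        return args[0][1]             # dec(enc(x, y), y) -> x
    if head == "fst" and isinstance(args[0], tuple) and args[0][0] == "pair":
        return args[0][1]             # fst(pair(x, y)) -> x
    if head == "snd" and isinstance(args[0], tuple) and args[0][0] == "pair":
        return args[0][2]             # snd(pair(x, y)) -> y
    return (head, *args)

msg = ("enc", ("pair", "s1", "s2"), "k")       # enc(pair(s1, s2), k)
print(rewrite(("fst", ("dec", msg, "k"))))     # s1
```

Note that decryption with the wrong key simply does not reduce, which is exactly the perfect-cryptography assumption.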

SLIDE 4

Cryptographic protocols are tricky!

Bhargavan et al.: FREAK, Logjam, SLOTH, . . .
Cremers et al., S&P’16
Arapinis et al., CCS’12
Cortier & Smyth, CSF’11
Steel et al., CSF’08, CCS’10

SLIDE 9

Modelling the protocol

Protocols are modelled in a process calculus, e.g. the applied pi calculus:

P ::=   in(c, x).P                    input
      | out(c, t).P                   output
      | if t1 = t2 then P else Q      conditional
      | P | Q                         parallel
      | !P                            replication
      | new n.P                       restriction

Specificities:

◮ messages are terms (not just names as in the pi calculus)
◮ equality in conditionals is interpreted modulo an equational theory

SLIDE 11

Reasoning about attacker knowledge

Terms output by a process are organised in a frame:

φ = new n̄. {t1/x1, . . . , tn/xn}

Deducibility: φ ⊢R t if R is a public term and Rφ =E t

Example
ϕ = new n1, n2, k1, k2. {enc(n1,k1)/x1, enc(n2,k2)/x2, k1/x3}
ϕ ⊢dec(x1,x3) n1        ϕ ⊬ n2        ϕ ⊢1 1

Static equivalence: φ1 ∼s φ2 if for all public terms R, R′: Rφ1 = R′φ1 ⇔ Rφ2 = R′φ2

Examples
new k. {enc(0,k)/x1} ∼s new k. {enc(1,k)/x1}
new n1, n2. {n1/x1, n2/x2} ≁s new n1, n2. {n1/x1, n1/x2}    (check x1 =? x2)
{enc(n,k)/x1, k/x2} ≁s {enc(0,k)/x1, k/x2}    (check dec(x1, x2) =? 0)
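To make the last example concrete, here is a small sketch (function names invented for illustration) that evaluates a recipe in each frame and shows that the test dec(x1, x2) =? 0 distinguishes them:

```python
# Frames map frame variables to the terms the attacker has seen.
# A recipe is a public term over the frame variables x1, x2 and public constants.

def norm(t):
    # the single rewrite rule needed here: dec(enc(x, y), y) -> x
    if not isinstance(t, tuple):
        return t
    head, *args = t
    args = [norm(a) for a in args]
    if head == "dec" and isinstance(args[0], tuple) \
            and args[0][0] == "enc" and args[0][2] == args[1]:
        return args[0][1]
    return (head, *args)

def apply_recipe(recipe, frame):
    if recipe in frame:               # a frame variable: look up the term
        return frame[recipe]
    if not isinstance(recipe, tuple):
        return recipe                 # a public constant
    head, *args = recipe
    return norm((head, *[apply_recipe(a, frame) for a in args]))

phi1 = {"x1": ("enc", "n", "k"), "x2": "k"}   # {enc(n,k)/x1, k/x2}
phi2 = {"x1": ("enc", "0", "k"), "x2": "k"}   # {enc(0,k)/x1, k/x2}

test = ("dec", "x1", "x2")
print(apply_recipe(test, phi1))   # n
print(apply_recipe(test, phi2))   # 0
# The equality dec(x1, x2) = 0 holds only in phi2: not statically equivalent.
```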

SLIDE 16

From authentication to privacy

Many good tools: AVISPA, Casper, Maude-NPA, ProVerif, Scyther, Tamarin, . . .

Good at verifying trace properties (predicates on system behavior), e.g.,

◮ (weak) secrecy of a key
◮ authentication (correspondence properties): if B ended a session with A (and parameters p) then A must have started a session with B (and parameters p′)

Not all properties can be expressed as a predicate on a single trace; hence the recent interest in indistinguishability properties.

SLIDE 18

Indistinguishability as a process equivalence

Naturally modelled using equivalences from process calculi.

Testing equivalence (P ≈ Q): for all processes A,
A | P ⇓ c if, and only if, A | Q ⇓ c
where P ⇓ c means that P can send a message on the channel c.

SLIDE 19

A tour of the (equivalence) zoo

◮ testing equiv. and obs. equiv.
  Abadi, Gordon. A Calculus for Cryptographic Protocols: The Spi Calculus. CCS’97, Inf. & Comp.’99
◮ labelled bisim.
  Abadi, Fournet. Mobile values, new names, and secure communication. POPL’01
◮ diff equiv. (too fine-grained for several properties)
  Blanchet et al.: Automated Verification of Selected Equivalences for Security Protocols. LICS’05
◮ symbolic bisim.
  Delaune et al. Symbolic bisimulation for the applied pi calculus. JCS’10
  Liu, Lin. A complete symbolic bisimulation for full applied pi calculus. TCS’12
◮ trace equiv. (for a bounded number of sessions, i.e. no replication; decided for a class of determinate processes)
  Cheval et al.: Deciding equivalence-based properties using constraint solving. TCS’13

SLIDE 28

A few security properties

“Strong” secrecy (non-interference):
in(c, x1).in(c, x2).P{x1/s} ≈ in(c, x1).in(c, x2).P{x2/s}

Real-or-random secrecy:
P.out(c, s) ≈ P.new r.out(c, r)

Simulation-based security (I is an ideal functionality):
∃S. P ≈ S[I]

Anonymity:
P{a/id} ≈ P{b/id}

Vote privacy, Unlinkability

SLIDE 33

How to model vote privacy?

How can we model “the attacker does not learn my vote (0 or 1)”?

◮ The attacker cannot learn the value of my vote
  . . . but the attacker knows the values 0 and 1
◮ The attacker cannot distinguish A votes and B votes: VA(v) ≈ VB(v)
  . . . but identities are revealed
◮ The attacker cannot distinguish A votes 0 and A votes 1: VA(0) ≈ VA(1)
  . . . but the election outcome is revealed
◮ The attacker cannot distinguish the situation where two honest voters swap votes:
  VA(0) | VB(1) ≈ VA(1) | VB(0)

Kremer, Ryan: Analysis of an E-Voting Protocol in the Applied Pi Calculus. ESOP’05

SLIDE 41

How to verify vote privacy?

Definitions of privacy and stronger variants (receipt-freeness and coercion-resistance) in terms of process equivalences.

Our first case study: the FOO protocol, based on blind signatures.

◮ ProVerif was the only tool able to check equivalence properties
◮ Diff-equivalence checked by ProVerif is too fine-grained
◮ Needed to do hand proofs

Motivation for an alternate tool (see Ben Smyth’s talk in the next session).

Kremer, Ryan: Analysis of an E-Voting Protocol in the Applied Pi Calculus. ESOP’05
Delaune et al.: Coercion-Resistance and Receipt-Freeness in E-Voting. CSFW’06

SLIDE 45

AKiSs: our goals and approach

Decision procedure for trace equivalence:
◮ many equational theories
◮ practical implementation

Protocols modelled as first-order Horn clauses (bounded number of sessions, i.e., no replication).

Resolution-based procedure for trace equivalence for convergent equational theories that have the finite variant property.

Chadha et al.: Automated Verification of Equivalence Properties of Cryptographic Protocols. ESOP’12, TOCL’16

SLIDE 46

AKiSs: overview

Protocol specification: a process calculus (no replication, no else branches) with rewrite rules, and a query P ⊲⊳ Q.

Pipeline: translation into first-order Horn clauses → saturation of the Horn clauses (resolution-based procedure) → check P ⊲⊳ Q → Yes / No + witness.

SLIDE 47

Modelling protocols in Horn clauses: an example

R = {dec(enc(x, y), y) → x}
T = in(c, x).if dec(x, k) = a then out(c, s)

Clauses for the protocol:

r_in(c,x) ⇐ k(X, x)
r_in(c,x),test ⇐ k(X, x), dec(x, k) =R a
r_in(c,x),test,out(c) ⇐ k(X, x), dec(x, k) =R a
k_in(c,x),test,out(c)(w1, s) ⇐ k(X, x), dec(x, k) =R a

Get rid of equalities by equational unification, mguR(dec(x, k) =R a) : x → enc(a, k), giving:

r_in(c,enc(a,k)),test ⇐ k(X, enc(a, k))
r_in(c,enc(a,k)),test,out(c) ⇐ k(X, enc(a, k))
k_in(c,enc(a,k)),test,out(c)(w1, s) ⇐ k(X, enc(a, k))

Clauses for the attacker:

k(enc(X, Y), enc(x, y)) ⇐ k(X, x), k(Y, y)
k(dec(X, Y), dec(x, y)) ⇐ k(X, x), k(Y, y)
k(dec(X, Y), z) ⇐ k(X, enc(z, y)), k(Y, y)

Rewrite systems with the finite variant property: precompute all possible normal forms to get rid of equational reasoning.
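The equational-unification step can be mimicked with ordinary syntactic unification (an illustrative sketch with invented helper names; the occurs check is omitted): to solve dec(x, k) =R a, unify against the precomputed variant dec(enc(z, y), y) → z instead of reasoning modulo R.

```python
# Variables are strings starting with "?"; everything else is a name or a
# function application written as a tuple ("f", arg1, ...).

def walk(t, subst):
    """Apply a substitution exhaustively to a term."""
    while isinstance(t, str) and t in subst:
        t = subst[t]
    if isinstance(t, tuple):
        return (t[0],) + tuple(walk(a, subst) for a in t[1:])
    return t

def unify(eqs):
    """Syntactic unification (occurs check omitted for brevity)."""
    subst, eqs = {}, list(eqs)
    while eqs:
        s, t = eqs.pop()
        s, t = walk(s, subst), walk(t, subst)
        if s == t:
            continue
        if isinstance(s, str) and s.startswith("?"):
            subst[s] = t
        elif isinstance(t, str) and t.startswith("?"):
            subst[t] = s
        elif isinstance(s, tuple) and isinstance(t, tuple) \
                and s[0] == t[0] and len(s) == len(t):
            eqs.extend(zip(s[1:], t[1:]))
        else:
            return None               # clash: not unifiable
    return subst

# mguR(dec(?x, k) =R a) via the variant dec(enc(?z, ?y), ?y) -> ?z:
sigma = unify([(("dec", "?x", "k"), ("dec", ("enc", "?z", "?y"), "?y")),
               ("?z", "a")])
print(walk("?x", sigma))   # ('enc', 'a', 'k')
```

This is exactly the substitution x → enc(a, k) used on the slide.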

SLIDE 53

Saturating clauses

A clause is solved if it is of the form H ⇐ k_w1(X1, x1), . . . , k_wn(Xn, xn).

Resolution:
if H ⇐ k_uv(X, t), B1, . . . , Bn ∈ K and k_w(R, t′) ⇐ Bn+1, . . . , Bm ∈ K_solved, with t not a variable and σ = mgu(k_u(X, t), k_w(R, t′)),
then K := K ∪ {(H ⇐ B1, . . . , Bm)σ}

Identity:
if k_u(R, t) ⇐ B1, . . . , Bn ∈ K_solved and k_u′v′(R′, t′) ⇐ Bn+1, . . . , Bm ∈ K_solved, with σ = mgu(k_u(_, t), k_u′(_, t′)),
then K := K ∪ {(i_u′v′(R, R′) ⇐ B1, . . . , Bm)σ}

Iterated until a fixpoint is reached.

SLIDE 54

Properties of the saturated set of clauses

At the end of saturation we have a finite set of solved clauses that represents:

◮ all reachable traces of the protocol
◮ all messages deducible by the adversary
◮ all identities among adversary recipes

SLIDE 55

Trace equivalences

Trace equivalence P ⊑t Q:
if (P, ∅) =tr⇒ (P′, ϕ) then ∃Q′, ϕ′. (Q, ∅) =tr⇒ (Q′, ϕ′) ∧ ϕ ∼s ϕ′

Fine-grained trace equivalence P ⊑ft Q:
∀ interleaving T of P. ∃ interleaving T′ of Q. T ≈t T′

Coarse trace equivalence P ⊑ct Q:
if (P, ∅) =tr⇒ (P′, ϕ) ∧ (r = s)ϕ then ∃Q′, ϕ′. (Q, ∅) =tr⇒ (Q′, ϕ′) ∧ (r = s)ϕ′

The three relations coincide for determinate processes, where P is determinate if whenever (P, ∅) =tr⇒ (T, ϕ) and (P, ∅) =tr⇒ (T′, ϕ′) then ϕ ∼s ϕ′.

In each case, P ≈ Q iff P ⊑ Q ∧ Q ⊑ P.

SLIDE 61

AKiSs: checking equivalences

AKiSs can be used to
◮ under-approximate trace equivalence: prove ≈ft
◮ over-approximate trace equivalence: prove ≈ct
◮ prove trace equivalence for determinate processes

Correctness: any convergent rewrite system that has the finite variant property; no else branches.
Termination: guaranteed for any subterm convergent rewrite system (ℓ → r with r either a subterm of ℓ or ground); terminates in practice on other examples as well.

First automated proof of the FOO e-voting protocol.

SLIDE 62

The Helios e-voting protocol (MixNet version)

[Figure: each voter Vi sends idi, aenc(pkE, ri, vi) to the bulletin board BB over an authenticated channel; MIX and the Tally then output the votes v1, . . . , vn in mixed order.]

where pkE is the election public key and MIX a verifiable mixnet.

Privacy: Helios(v1, v2) ≈t? Helios(v2, v1)

Replay attack!
Cortier, Smyth: Attacking and Fixing Helios: An Analysis of Ballot Secrecy. CSF’11

Fix: either use weeding, or a zkp that the voter knows the encryption randomness.
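The replay attack can be illustrated with a toy tally (all names and the encryption stand-in are invented for this sketch): Charlie resubmits Alice's ciphertext as his own ballot, and the mixed outcome then leaks Alice's vote by majority.

```python
import random

def aenc(pk, r, v):
    # stand-in for asymmetric encryption: just a tagged tuple
    return ("ct", pk, r, v)

def tally(board):
    # the mixnet "decrypts" the ballots and outputs the votes in mixed order
    votes = [v for (_id, (_tag, _pk, _r, v)) in board]
    random.shuffle(votes)
    return sorted(votes)              # sorted only to make the outcome readable

pkE = "pkE"
alice_ballot = aenc(pkE, "r1", 0)
board = [("alice",   alice_ballot),
         ("bob",     aenc(pkE, "r2", 1)),
         ("charlie", alice_ballot)]   # Charlie replays Alice's ciphertext!

outcome = tally(board)                # [0, 0, 1]
alice_vote = max(set(outcome), key=outcome.count)
print(alice_vote)                     # 0: the majority vote must be Alice's
```

Weeding (rejecting duplicate ciphertexts on the board) or a proof of knowledge of the encryption randomness blocks exactly this copying.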

SLIDE 69

Everlasting privacy

Does verifiability decrease vote privacy? Publishing encrypted votes on the bulletin board may be a threat to vote privacy.

◮ Future technology and scientific advances may break encryptions
◮ How long must a vote remain private? 1 year? 10 years? 100 years? 10^10 years?
◮ Impossible to predict the necessary key length with certainty: typical recommendations cover less than 10 years (cf. www.keylength.com)

Everlasting privacy: guarantee privacy even if the crypto is broken.

SLIDE 71

Modelling everlasting privacy

◮ Information available in the future: everlasting channels
◮ Define future attacker capabilities (crypto assumption broken) as an equational theory E+
  Example: break(aenc(pk(x), y, z)) → z
◮ Check in two phases:
  1. check trace equivalence with E
  2. check static equivalence with E+ on the future information

Implemented in AKiSs and ProVerif.

Achieving everlasting privacy:
◮ Do not publish the encryption on the BB, but only a perfectly hiding commitment
◮ Replace identities by anonymous credentials (Belenios)

Arapinis et al.: Practical Everlasting Privacy. POST’13
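The two-phase idea can be illustrated on a toy bulletin board (a sketch with invented names; the break rule is simplified to break(aenc(pk, r, v)) → v): the two published ballots are indistinguishable under the ordinary theory E, but the future theory E+ separates them.

```python
def norm(t, future=False):
    """Normalise a term; future=True enables the rule break(aenc(pk, r, v)) -> v."""
    if not isinstance(t, tuple):
        return t
    head, *args = t
    args = [norm(a, future) for a in args]
    if future and head == "break" and isinstance(args[0], tuple) \
            and args[0][0] == "aenc":
        return args[0][3]             # the future attacker recovers the plaintext
    return (head, *args)

def run(recipe, frame, future=False):
    if recipe in frame:               # frame variable
        return norm(frame[recipe], future)
    if not isinstance(recipe, tuple):
        return recipe                 # public constant
    head, *args = recipe
    return norm((head, *[run(a, frame, future) for a in args]), future)

phi0 = {"x1": ("aenc", "pkE", "r", "0")}   # published ballot for vote 0
phi1 = {"x1": ("aenc", "pkE", "r", "1")}   # published ballot for vote 1

test = ("break", "x1")
# Today (theory E): break does not reduce, the test fails on both frames.
assert (run(test, phi0) == "0") == (run(test, phi1) == "0")
# In the future (theory E+): the same test separates the two frames.
assert (run(test, phi0, future=True) == "0") != (run(test, phi1, future=True) == "0")
```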

SLIDE 73

How to model unlinkability

Unlinkability [ISO/IEC 15408]: Ensuring that a user may make multiple uses of a service or resource without others being able to link these uses together.

Applications: e-Passport, mobile phones, RFID tags, . . .

Can be modelled as an equivalence property:
2 sessions of the same device ≈ 2 sessions of different devices

Arapinis et al. Analysing Unlinkability and Anonymity Using the Applied Pi Calculus. CSF’10
Brusò et al. Formal Verification of Privacy for RFID Systems. CSF’10

SLIDE 75

Authentication protocol of an RFID tag (KCL)

Reader (k, id)                          Tag (k, id)
  new r1        --- r1 --->
                                        new r2
                <--- id ⊕ r2, h(r1, k) ⊕ r2 ---
  checks: (id ⊕ r2) ⊕ (h(r1, k) ⊕ r2) ⊕ id =? h(r1, k)

Is unlinkability satisfied?
tag(id, k) | tag(id, k) ≈? tag(id, k) | tag(id′, k′)

SLIDE 76

Linkability attack

The attacker replays the same challenge r1 to two tag sessions.

Session 1 (tag k, id): new r2; reply id ⊕ r2, h(r1, k) ⊕ r2
Session 2 (tag k′, id′): new r2′; reply id′ ⊕ r2′, h(r1, k′) ⊕ r2′

The attacker then checks

(id ⊕ r2) ⊕ (h(r1, k) ⊕ r2) =? (id′ ⊕ r2′) ⊕ (h(r1, k′) ⊕ r2′)

i.e., since the session randomness cancels, id ⊕ h(r1, k) =? id′ ⊕ h(r1, k′), which holds exactly when both sessions come from the same tag.
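The attack is easy to replay numerically (a sketch with invented parameter sizes; h is modelled as SHA-256 truncated to 64 bits): XORing the two components of a response cancels r2 and leaves id ⊕ h(r1, k), which is constant across sessions once the attacker fixes r1.

```python
import hashlib
import secrets

def h(r1, k):
    # keyed hash modelled as truncated SHA-256 (an arbitrary stand-in)
    data = r1.to_bytes(8, "big") + k.to_bytes(8, "big")
    return int.from_bytes(hashlib.sha256(data).digest()[:8], "big")

def tag_response(id_, k, r1):
    r2 = secrets.randbits(64)                 # fresh session randomness
    return (id_ ^ r2, h(r1, k) ^ r2)

def linked(resp_a, resp_b):
    # (id ^ r2) ^ (h(r1, k) ^ r2) = id ^ h(r1, k): independent of r2
    return (resp_a[0] ^ resp_a[1]) == (resp_b[0] ^ resp_b[1])

r1 = secrets.randbits(64)                     # the attacker replays this challenge
id1, k1 = secrets.randbits(64), secrets.randbits(64)
id2, k2 = secrets.randbits(64), secrets.randbits(64)

assert linked(tag_response(id1, k1, r1), tag_response(id1, k1, r1))      # same tag
assert not linked(tag_response(id1, k1, r1), tag_response(id2, k2, r1))  # different tags (w.h.p.)
```

Note that no algebraic abstraction of ⊕ can expose this: the cancellation r2 ⊕ r2 = 0 is exactly what the attack exploits.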

SLIDE 77

Automated analysis of KCL?

Which tool to choose?

◮ None provides support for ⊕
◮ Abstracting away from the algebraic properties, we miss the linkability attack

Motivated an extension of AKiSs with ⊕ (joint work with Baelde, Delaune and Gazeau):

◮ perform Horn clause resolution modulo AC
◮ new strategy: forbid some resolutions to avoid non-termination (major changes in the completeness proof)
◮ successfully tested, among others, on 5 RFID protocols

SLIDE 81

Overview of tools

Unbounded number of sessions (no termination guarantees):

                 ProVerif              Tamarin                  Maude-NPA
equivalence      diff (+ extensions)   diff                     diff
protocol model   applied pi            MSR (state, else, ...)   strands (no else)
eq. theories     finite variant (?)    subterm conv. + DH       finite variant + algebraic prop.

Bounded number of sessions:

                 SPEC                  APTE                     AKiSs
equivalence      symb. bisimulations   trace equiv              ∼ trace equiv
protocol model   spi (no else)         applied pi               applied pi (no else)
eq. theories     fixed                 fixed                    finite variant + xor

No Swiss-army knife for equivalence properties.

SLIDE 84

Theory and practice of equivalence properties

Extensions of AKiSs:
◮ else branches, needed e.g. for analysing unlinkability for the e-Passport
◮ more algebraic properties, e.g., DH exponentiation à la Tamarin

Merge APTE and AKiSs (joint work with Cheval):
◮ decide trace equivalence
◮ general processes (else branches, not necessarily determinate)
◮ many equational theories

SLIDE 86

Theory and practice of equivalence properties (2)

Decidability and complexity (joint work with Cheval and Rakotonirina):

e.g. for subterm convergent equational theories, obs. equivalence is coNP-complete for determinate processes, but coNEXP-hard otherwise

Interesting insights on how to make tools efficient (see Itsaka’s 5-minute talk)

SLIDE 87

Automated Security Proofs of Cryptographic Protocols

◮ Theory and practice for equivalence properties
◮ Models for and analysis of secure elements (TPM, SGX, . . . )
◮ Multi-factor authentication
◮ E-voting on untrusted clients

Join us: open PhD and post-doc positions