Verifying Security Protocols and their Implementations Information - - PowerPoint PPT Presentation



SLIDE 1

Verifying Security Protocols and their Implementations

Information Security and Cryptography Reading Group Presenter: Cătălin Hriţcu

SLIDE 2

References

 Verified Interoperable Implementations of Security Protocols. Karthikeyan Bhargavan, Cédric Fournet, Andrew D. Gordon, Stephen Tse, CSFW 2006 (and the accompanying technical report with proofs)

 Most of the slides are from:

 Protecting Alice from Malice: Protocols, Process Calculi, Proved Programs. Andy Gordon, Marktoberdorf, 2006

 Vérification de Protocoles Cryptographiques et de leurs Implémentations [Verification of Cryptographic Protocols and their Implementations]. Cédric Fournet, Colloquium INRIA, 2007

 Language-based Security. Matteo Maffei, Lecture, 2007

 Many thanks to Andy Gordon and Cédric Fournet (Microsoft Research), and Matteo Maffei

2

SLIDE 3

Outline

 Verifying Security Protocols

 What are security protocols?
 What are the properties we want to verify?
 Why is it so difficult?
 Formal methods

 The Dolev-Yao model
 Modeling protocols using process calculi
 Specifying and analyzing security properties

 Verifying Implementations of Security Protocols

3

SLIDE 4

Verifying Security Protocols

4

SLIDE 5

Security Protocols

 Modern applications heavily rely on secure

communication over an untrusted network

 Malicious entities can read, block,

modify, (re)transmit messages

5

SLIDE 6

Security Properties

 The exact notion of security depends on the application
 In this talk we focus on two simple properties:

 Secrecy

 Confidential information not disclosed to third-parties

 Authentication

 The recipient of a message should be able to verify its

freshness and its originator

6

SLIDE 7

A Simple eBanking Protocol

7

SLIDE 8

A Simple eBanking Protocol

 Bad Pete intercepts the message and modifies it in order to get $2000

7

SLIDE 9

A Simple eBanking Protocol

 Bad Pete intercepts the message and modifies it in order to get $2000

7

SLIDE 10

Cryptography

8

SLIDE 11

Cryptography

 Unfortunately, cryptography is not enough!
 An attacker can break the security of the protocol

 by simply intercepting, duplicating, sending back the

messages in transit on the network, by interleaving simultaneous protocol sessions, etc

 no need to break the encryption scheme

 In the following, we assume that cryptography is a

fully reliable black box

8

SLIDE 12

Reflection Attack

9

SLIDE 13

Reflection Attack

 The symmetric nature of the encryption key k does

not allow Mickey to verify whether the encrypted message has been generated by Minnie or himself

9

SLIDE 14

Replay attack

10

SLIDE 15

Replay attack

 Minnie has no way to verify the freshness of the

message she receives

10

SLIDE 16

Fixed Protocol

 Possible solution is to use a nonce

 Randomly generated number used only once

11

SLIDE 17

Fixed Protocol

 Possible solution is to use a nonce

 Randomly generated number used only once

11

 This protocol is secure: it guarantees

 the secrecy and authenticity of the message

 ISO two-pass unilateral authentication protocol
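The role of the nonce can be shown with a small symbolic sketch (Python; idealized black-box crypto, and all helper names are illustrative, not the paper's F# code): Minnie's fresh challenge binds Mickey's reply to a single session, so a replayed reply is rejected.

```python
import itertools

_fresh = itertools.count()

def nonce():
    """A fresh, unguessable symbolic value (used only once)."""
    return ("nonce", next(_fresh))

def enc(plaintext, key):
    """Idealized black-box symmetric encryption."""
    return ("enc", plaintext, key)

def dec(ciphertext, key):
    tag, plaintext, k = ciphertext
    assert tag == "enc" and k == key   # decryption requires the right key
    return plaintext

def mickey_reply(challenge, key):
    # {(Mickey, msg), n} under the shared key k
    return enc((("mickey", "msg"), challenge), key)

def minnie_check(ciphertext, expected_nonce, key):
    (sender, _msg), n = dec(ciphertext, key)
    return sender == "mickey" and n == expected_nonce

k = ("key", "shared")
n1 = nonce()
reply = mickey_reply(n1, k)
assert minnie_check(reply, n1, k)        # honest run: accepted

n2 = nonce()                             # a later session issues a new challenge
assert not minnie_check(reply, n2, k)    # replaying the old reply is rejected
```

Because the nonce in the reply must match the challenge of the current session, both the reflection and the replay attacks of the previous slides fail against this exchange.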

SLIDE 18

Protocols Are Hard to Get Right

 Historically, one keeps finding simple attacks

against protocols

 even carefully-written, widely-deployed protocols
 even a long time after design and deployment

 New protocols appear regularly, and the same

mistakes are made again and again

 Attacks on Web Services security
 Recent MITM attack on public-key Kerberos (2005)

 What’s so difficult about security protocols?

 concurrency + distribution + cryptography
 little control over the runtime environment
 hard to test against active attackers

12

SLIDE 19

Needham-Schroeder Protocol

 This protocol was proposed in 1978
 The aim is to guarantee the secrecy and

authenticity of the two nonces (later used for generating a symmetric session-key)

13

SLIDE 20

Man-in-the-middle Attack

 Found by Lowe in 1995

14

SLIDE 21

Lowe’s Fix

 Insert Mickey's identifier in the second message
 Rules out the man-in-the-middle attack

15
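Lowe's attack can be replayed in a toy symbolic model (Python; enc/dec are idealized public-key operations and all names are illustrative). Charlie (C) sits between Alice (A) and Bob (B):

```python
def enc(m, agent):
    """Idealized encryption under `agent`'s public key."""
    return ("enc", m, agent)

def dec(c, agent):
    """Only `agent` holds the matching private key."""
    tag, m, a = c
    assert tag == "enc" and a == agent
    return m

# 1. Alice innocently starts a session with Charlie, the attacker.
na = "Na"
msg1 = enc((na, "A"), "C")

# Charlie decrypts with his own key and re-encrypts for Bob,
# impersonating Alice.
msg1_forged = enc(dec(msg1, "C"), "B")

# 2. Bob believes he is talking to Alice and replies with his nonce.
xna, claimed_initiator = dec(msg1_forged, "B")
assert claimed_initiator == "A"
nb = "Nb"
msg2 = enc((xna, nb), "A")     # no responder identity in the 1978 protocol!

# Charlie forwards msg2 unchanged. It matches Alice's session with
# Charlie, so she completes the handshake with Charlie's public key.
ya, yb = dec(msg2, "A")
assert ya == na
msg3 = enc(yb, "C")

# 3. Charlie now learns Bob's "secret" nonce Nb.
assert dec(msg3, "C") == nb
```

With Lowe's fix, message 2 carries the responder's identity, {B, Na, Nb} under Alice's public key: Alice would see Bob's identifier where she expects Charlie's and abort the run.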

SLIDE 22

Informal Methods

 Principles codified in articles and textbooks since the mid-90s:

 Abadi and Needham, Prudent engineering practice for

cryptographic protocols, 1994

 Anderson and Needham, Programming Satan’s Computer, 1995

 For instance, Lowe’s fix of the Needham-Schroeder protocol

makes explicit that the second message is sent by Mickey

 These checklists are useful, yet hard for the inexperienced to

understand and to apply

 No guarantee that they will make your protocol secure

16

Principle 1 Every message should say what it means: the interpretation of the message should depend only on its content. It should be possible to write down a straightforward English sentence describing the content — though if there is a suitable formalism available that is good too.

SLIDE 23

Formal Methods

17

SLIDE 24

Dolev-Yao Attacker Model

 Widely-used abstraction

 Because of automatic tool support

 The attacker can:

 Engage in any number of protocol sessions with any

number of honest principals (unbounded)

 Read, block, modify, replay any message sent on the

network (the attacker is the network)

 Split composite messages, recompose arbitrarily
 Encrypt messages in arbitrary ways
 Decrypt encrypted messages with the appropriate key
 No bound on the size of the messages, or number of

fresh nonces and keys

18
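The closure of the attacker's knowledge under these rules can be sketched as a small derivability check (Python; the tuple term representation and function names are illustrative):

```python
def analyze(knowledge):
    """Close under decomposition: split pairs, decrypt with known keys."""
    k = set(knowledge)
    changed = True
    while changed:
        changed = False
        for m in list(k):
            parts = set()
            if isinstance(m, tuple) and m[0] == "pair":
                parts = {m[1], m[2]}                  # split composite messages
            elif isinstance(m, tuple) and m[0] == "enc" and m[2] in k:
                parts = {m[1]}                        # decrypt with a known key
            if not parts <= k:
                k |= parts
                changed = True
    return k

def derivable(target, knowledge):
    """Close under composition: can the attacker build `target`?"""
    k = analyze(knowledge)
    if target in k:
        return True
    if isinstance(target, tuple) and target[0] in ("pair", "enc"):
        # recompose / encrypt in arbitrary ways from derivable parts
        return all(derivable(t, k) for t in target[1:])
    return False

net = {("enc", "secret", "kab")}           # traffic observed on the network
assert not derivable("secret", net)        # key unknown: secrecy holds
assert derivable("secret", net | {"kab"})  # key compromise breaks it
```

Decomposition terminates because it only produces subterms; composition is checked recursively, so there is no bound on the size of the messages the attacker can build, matching the last bullet.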

SLIDE 25

Dolev-Yao Attacker Model

 Strong assumptions

 Perfect cryptography, the attacker cannot:

 Decrypt a message without knowing the encryption key
 Guess or brute-force keys, nonces or even passwords
 Obtain partial information (e.g. half the bits of a message)

 Message length is only partially observable
 No collisions: {M}K = {M’}K’ implies M = M’ and K = K’
 Non-malleability: from {M}K one cannot construct {M’}K

 In cryptography, the attacker is a probabilistic polynomial-time (PPT) algorithm that tries to break security

with non-negligible probability (computational model)

 Justifying Dolev-Yao style symbolic models via

computational models is sometimes possible

19

SLIDE 26

20

Protocols as Processes

 Security protocols in the Dolev-Yao model can be

rephrased as processes in process calculi

 E.g. the spi calculus, the applied pi calculus, etc.

 Their security properties can be rigorously formalized
 There are many applicable formalisms for analyzing

these properties automatically

 Type (and effect) systems, type inference
 Abstract interpretation

 E.g. abstract processes to Horn clauses (Prolog rules) then

use resolution

 Model checking (bounded or symbolic)

SLIDE 27

Applied Pi-calculus (ProVerif)

21

 constructors: enc, pair
 destructors: dec, left, right
 value: enc(pair(pair(id, m), n), k)

x, y, z             variable
a, b                name
f                   constructor (uncurried) function (curried)
M, N ::=            value
  x                   variable
  a                   name
  f(M1,...,Mn)        constructor application
e ::=               expression
  g                   destructor function
P, Q, R ::=         process

SLIDE 28

Processes

22

P, Q, R ::=                                  process
  in(M,x); P                                   input of x from M (x bound in P)
  out(M,N); P                                  output of N on M
  new a; P                                     make new name a (a bound in P)
  !P                                           replication of P
  P | Q                                        parallel composition
  0                                            inactivity
  event M                                      event M
  let x1,...,xn suchthat M = N in P else Q     match (x1,...,xn bound in P)
  let x = g(M1,...,Mn) in P else Q             destructor application
∆ ::=                                        declaration

SLIDE 29

Declarations and Scripts

23

let x = g(M1,...,Mn) in P else Q   destructor application
∆ ::=                              declaration
  free a                             name a
  data f/n                           data constructor
  private fun f/n                    private constructor
  reduc g(M1,...,Mn) = M             destructor
  private reduc g(M1,...,Mn) = M     private destructor
∆s ::= ∆1.···∆n.                   set of declarations (n ≥ 0)
Σ ::= ∆s process P                 script

 Example

fun enc/2.
fun pair/2.
reduc dec(enc(x,y),y) = x.
reduc left(pair(x,y)) = x.
reduc right(pair(x,y)) = y.
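The constructor/destructor discipline of this example can be mimicked in Python (an illustrative sketch, not ProVerif's actual semantics): terms are nested tuples and each reduc becomes a partial function that fails on non-matching terms.

```python
def enc(x, y): return ("enc", x, y)      # constructors just build terms
def pair(x, y): return ("pair", x, y)

def dec(m, key):                         # reduc dec(enc(x,y),y) = x
    if m[0] == "enc" and m[2] == key:
        return m[1]
    raise ValueError("destructor fails: wrong key or not a ciphertext")

def left(p):                             # reduc left(pair(x,y)) = x
    if p[0] == "pair":
        return p[1]
    raise ValueError("not a pair")

def right(p):                            # reduc right(pair(x,y)) = y
    if p[0] == "pair":
        return p[2]
    raise ValueError("not a pair")

# The slide's example value: enc(pair(pair(id, m), n), k)
value = enc(pair(pair("id", "m"), "n"), "k")
assert left(left(dec(value, "k"))) == "id"
assert right(dec(value, "k")) == "n"
```

Destructors are partial: applying dec with the wrong key raises, which corresponds to the "else" branch of destructor application in the process calculus.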

SLIDE 30

Structural Equivalence

24

P ≡ P
Q ≡ P ⇒ P ≡ Q
P ≡ Q, Q ≡ R ⇒ P ≡ R
P | 0 ≡ P
P | Q ≡ Q | P
(P | Q) | R ≡ P | (Q | R)
!P ≡ P | !P
a ∉ fn(P) ⇒ new a;(P | Q) ≡ P | new a;Q
new a;new b;P ≡ new b;new a;P
new a;0 ≡ 0
P ≡ P′ ⇒ new a;P ≡ new a;P′
P ≡ P′ ⇒ P | R ≡ P′ | R

SLIDE 31

Internal Reduction

25

P ≡ Q, Q → Q′, Q′ ≡ P′ ⇒ P → P′
P → P′ ⇒ P | Q → P′ | Q
P → P′ ⇒ new a;P → new a;P′
in(M,x);P | out(M,N);Q → P{N/x} | Q

let x1,...,xn suchthat M = N in P else Q →
    Pσ         if M = Nσ and dom(σ) = {x1,...,xn}
    Q          otherwise

let x = g(M′1,...,M′n) in P else Q →
    P{Mσ/x}    if M′i = Miσ for all i ∈ 1..n
    Q          otherwise
  where (g(M1,...,Mn) = M) is declared in ∆s

SLIDE 32

Modeling our protocol

26

free ch.
fun enc/2.
data pair/2.
data mickey/0.
data minnie/0.
private fun begin/1.
private fun end/1.
reduc dec(enc(x,y),y) = x.
reduc left(pair(x,y)) = x.
reduc right(pair(x,y)) = y.

let Mickey =
  new m;
  in(ch, xn);
  event begin(mickey, minnie, xn);
  out(ch, enc(pair(pair(mickey, m), xn), k)).

let Minnie =
  new n;
  out(ch, n);
  in(ch, x);
  let y = dec(x, k) in
  let xm suchthat y = pair(pair(mickey, xm), n) in
  event end(mickey, minnie, n).

process new k; Mickey | Minnie
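As a sanity check, the honest run of this script can be hand-executed in a symbolic Python sketch (helper names are illustrative), logging events so that the correspondence "every end has a matching begin" can be checked on the trace:

```python
def enc(x, y): return ("enc", x, y)
def pair(x, y): return ("pair", x, y)
def dec(m, key):
    assert m[0] == "enc" and m[2] == key
    return m[1]

events, net = [], []
k = "k"                                   # process new k (private key)

# Minnie: new n; out(ch, n)
n = "n"
net.append(n)

# Mickey: in(ch, xn); event begin(...); out(ch, enc(pair(pair(mickey,m),xn),k))
xn = net.pop()
events.append(("begin", "mickey", "minnie", xn))
net.append(enc(pair(pair("mickey", "m"), xn), k))

# Minnie: in(ch, x); let y = dec(x,k); match y = pair(pair(mickey,xm),n); event end(...)
x = net.pop()
y = dec(x, k)
assert y[0] == "pair" and y[2] == n and y[1][1] == "mickey"
events.append(("end", "mickey", "minnie", n))

# The query ev:end(x,y,z) ==> ev:begin(x,y,z) holds on this honest trace:
for e in events:
    if e[0] == "end":
        assert ("begin",) + e[1:] in events
```

ProVerif of course checks this for all traces in the presence of an arbitrary attacker, not just the single honest run simulated here.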

SLIDE 33

Correspondence Properties

27

Query Satisfaction and Safety: P |= ev:E ⇒ ev:B1 ∨ ··· ∨ ev:Bn if and only if whenever P ≡ new as;(event Eσ | P′), we have P′ ≡ event Biσ | P′′ for some i ∈ 1..n and some P′′. A process P is safe for q if and only if, for all reductions P →∗ ≡ P′, we have P′ |= q.

Opponent Processes and Robust Safety: A ∆s-opponent is a process O with no events such that ∆s process O is well formed and O contains no constructor or destructor declared private in ∆s. A script ∆s process P is robustly safe for q if and only if, for all ∆s-opponents O, P | O is safe for q.

SLIDE 34

37

Target verification tool: ProVerif

 ProVerif: automated cryptographic protocol verifier developed by Bruno Blanchet (ENS)

 Input: protocol scripts written in applied pi calculus

 Concurrent processes + parametric cryptography

 What it can prove:

 Secrecy, authenticity (correspondence properties)  Equivalences (e.g. protection of weak secrets)

 How it works (flavor):

 Internal representation based on Horn clauses  Resolution-based algorithm, with clever selection rules  Attack reconstruction

SLIDE 35

Queries in ProVerif

 query attacker: k.

Starting query not attacker:k[]
RESULT not attacker:k[] is true.

 query ev: end(x,y,z) ==> ev: begin(x,y,z).

Starting query ev:end(x_15,y_16,z_17) ==> ev:begin(x_15,y_16,z_17)
goal reachable: begin:begin(mickey(),minnie(),n[]) -> end:end(mickey(),minnie(),n[])
RESULT ev:end(x_15,y_16,z_17) ==> ev:begin(x_15,y_16,z_17) is true.

29

SLIDE 36

2007: mission accomplished

 Authentication and secrecy properties for crypto protocols have been formalized and thoroughly studied

 After intense effort on symbolic reasoning, several techniques and tools are available for automatically proving these properties

 e.g. Athena, TAPS, ProVerif, FDR, AVISPA, etc

 We can now automatically verify most security properties for detailed models of crypto protocols

 e.g. IPSEC, Kerberos, Web Services, Infocard

 Ongoing work

 Fancy protocols and properties  Relation between Dolev-Yao abstractions and concrete crypto

9

SLIDE 37

2007: mission accomplished?

 Best practice: apply formal methods and tools throughout the protocol design & review process

 Not so easy

 Specifying a protocol is a lot of work  Most practitioners don't understand formal models

 Protocols go wrong because...

 they are logically flawed, or  they are used wrongly, or  they are wrongly implemented

 Two troublesome questions
1. How to relate crypto protocols to application security?
2. How to relate formal models to executable code?

10

SLIDE 38

2007: mission accomplished?

 Best practice: apply formal methods and tools throughout the protocol design & review process

 Not so easy

 Specifying a protocol is a lot of work  Most practitioners don't understand formal models

 Protocols go wrong because...

 they are logically flawed, or  they are used wrongly, or  they are wrongly implemented

 Two troublesome questions
1. How to relate crypto protocols to application security?
2. How to relate formal models to executable code?

10

 Q: How to relate formal models to executable code?

SLIDE 39

Verifying Implementations of Security Protocols

32

SLIDE 40

The Trouble with Models

 Protocol designers are typically reluctant to write

formal models; specs are written in natural language

 Specs are always refined by implementation

experience, so absolute correctness is not a goal

 Timely agreement is more important
 So specs continue to be partial and ambiguous

 In spite of successful research on model verification,

new specs exhibit the same old mistakes, again and again (e.g. the attack on Kerberos 5 from 2005)

 The Ugly Reality: implementation code is the closest

we get to a formal description of most protocols

33

SLIDE 41

The Trouble with Models (2)

 Formal models are short and abstract

 They ignore large functional parts of implementations
 Their formulation is driven by verification techniques
 It is easy to write models that are safe but

dysfunctional (testing & debugging is difficult)

 Specs, models, and implementations drift apart…

 Even informal synchronization involves painful code

reviews

 How to keep track of implementation changes?

34

SLIDE 42

From Code to Model

 Automatically extract models from interoperable

protocol implementations

 Analyze the model automatically using existing tools
 Reference implementations, not (yet) production code

 Largely avoids potential mismatches between model

and implementation

 Executable code is more detailed than models

 Some functional aspects can be ignored for security
 Model extraction can safely erase those aspects

 Executable code has better tool support

 Type checkers, compilers, debuggers, libraries,

other verification tools

35

SLIDE 43

36

One Source, Three Tasks

[Architecture diagram: source code with modules and strong interfaces (my protocol, my code, other libraries, application, authz) is used three ways: compiled with concrete crypto and net libraries on the platform (CLR) for interoperability (via SOAP) with some other implementation (WSE); run with symbolic crypto libraries for symbolic testing & debugging; and translated by fs2pv to ProVerif for symbolic verification.]

SLIDE 44

37

1. Symbolic testing and debugging

[Diagram: my protocol and my code, coded in C#, F#..., run with other libraries (application, authz) over symbolic crypto, crypto, and net libraries on the platform (CLR). We use idealized “black-box” cryptographic primitives; safety relies on typing.]
SLIDE 45

37

1. Symbolic testing and debugging

[Diagram: the symbolic setup of the previous slide, plus an attacker (test). We can code any given potential attack as a program; we model attackers as arbitrary code with access to selected libraries. We use idealized “black-box” cryptographic primitives; safety relies on typing.]

SLIDE 46

38

2. Formal Verification

[Diagram: my protocol, my code, and other libraries, together with the symbolic crypto library, are translated to pi calculus by fs2pv and checked with ProVerif against an unknown attacker. Formal verification considers ALL such attackers, modeled as arbitrary code with access to selected libraries. Result: Pass (OK for all attackers), or No, plus a potential attack trace. We only support a subset of F#.]
SLIDE 47

39

3. Concrete testing & interop

[Diagram: the same code, coded in C#, F#..., runs with concrete crypto, crypto, and net libraries on the platform (CLR), interoperating (via SOAP) with some other implementation (WSE). We test that our code produces and consumes the same messages as another implementation; we only change our implementation of cryptography.]

SLIDE 48

39

3. Concrete testing & interop

[Diagram: the concrete setup of the previous slide, plus an attacker (test): we can also run attacks to test other implementations. We only change our implementation of cryptography.]

SLIDE 49

40

Source language: F#

 F#, a variant of ML for the .NET runtime

http://research.microsoft.com/fsharp
Experimental language for research and prototyping

 Formally, a clean strongly-typed semantics

 We support first-order functional programming
 We rely on abstract interfaces
 We use algebraic data types and pattern-matching

for symbolic cryptography and for XML applications

SLIDE 50

41

Password-Based Authentication

 A simple, one-message authentication protocol
 Two roles
 client (A) sends some text, along with a MAC
 server (B) checks authenticity of the text
 the MAC is keyed using a nonce and a shared password
 the password is protected from guessing attacks

by encrypting the nonce with the server’s public key
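The message structure can be sketched symbolically (Python toy model; pkenc, mac, and the constants are illustrative assumptions, not the paper's F# code):

```python
def pkenc(m, agent): return ("pkenc", m, agent)

def pkdec(c, agent):
    tag, m, a = c
    assert tag == "pkenc" and a == agent   # only the server B can open it
    return m

def mac(key, text):
    return ("mac", key, text)              # idealized, deterministic MAC

PWD = ("pwd", "A", "B")                    # the shared password

def client(text):
    n = ("nonce", 1)                       # fresh per run in the real protocol
    return (text, pkenc(n, "B"), mac((n, PWD), text))

def server(msg):
    text, sealed, tag = msg
    n = pkdec(sealed, "B")                 # recover the nonce
    assert tag == mac((n, PWD), text)      # check the MAC keyed by (nonce, password)
    return text

assert server(client("pay Bob 100")) == "pay Bob 100"

# Tampering with the text invalidates the MAC: the attacker cannot
# recompute it without both the nonce and the password.
text, sealed, tag = client("pay Bob 100")
try:
    server(("pay Eve 100", sealed, tag))
    tampered_accepted = True
except AssertionError:
    tampered_accepted = False
assert not tampered_accepted
```

Because the nonce travels sealed under B's public key, an eavesdropper cannot mount an offline guessing attack on the password: testing a candidate password against the MAC requires the nonce.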

SLIDE 51

42

Making and Checking Messages

SLIDE 52

43

Coding Client and Server roles

SLIDE 53

44

One Source, Three Tasks

 Using concrete libraries, our client and server can interact on

top of a TCP socket (and could interoperate with other implementations)

 Using symbolic libraries, we can see through cryptography
 Using symbolic libraries, fs2pv generates a ProVerif model for

verification, direct from source code

SLIDE 54

Two Implementations of Crypto

SLIDE 55

Formalizing a Subset of F#

SLIDE 56

47

A First-Order Functional Language

x, y, z                            variable
a, b                               name
f                                  constructor (uncurried)
ℓ                                  function (curried)
true, false, tuplen, Ss            primitive constructors
name, send, recv, log, failwith    primitive functions

M, N ::=                           value
  x                                  variable
  a                                  name
  f(M1,...,Mn)                       constructor application

e ::=                              expression
  M                                  value
  ℓ M1 ... Mn                        function application
  fork(fun () → e)                   fork a parallel thread
  match M with (| Mi → ei)i∈1..n     pattern match
  let x = e1 in e2                   sequential evaluation

d ::=                              declaration
  type s = (| fi of si1∗...∗simi)i∈1..n   datatype declaration
  let x = e                          value declaration
  let ℓ x1 ... xn = e   (n > 0)      function declaration

S ::= d1 ··· dn                    system: list of declarations

SLIDE 57

48

A Security Goal

SLIDE 58

49

Safety Against an Attacker

SLIDE 59

Mapping F# to a Verifiable Model

SLIDE 60

51

How to compile a function?

 Our compiler specifically targets symbolic verification
 We select a translation for each function
 Complete inlining (anticipating resolution)
 ProVerif reductions (eliminated by ProVerif)
 ProVerif predicate declarations (logic programming)
 ProVerif processes (most general, also most expensive)
 We follow Milner’s classic “functions as processes”
 Each call takes two channel-based communication steps
 We use private or public channels depending on the interface
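The two communication steps can be imitated with threads and blocking queues, a minimal sketch of Milner's translation (Python; channel and function names are illustrative assumptions):

```python
import queue
import threading

mac_chan = queue.Queue()                  # the channel assigned to function `mac`

def mac_process():
    """The translated function: a replicated process serving every call."""
    while True:
        arg, ret = mac_chan.get()         # step 1: receive (argument, return channel)
        ret.put(("mac", arg))             # step 2: send the result back

threading.Thread(target=mac_process, daemon=True).start()

def call_mac(arg):
    """The translation of a call site: fresh return channel per call."""
    ret = queue.Queue()
    mac_chan.put((arg, ret))
    return ret.get(timeout=5)

assert call_mac("hello") == ("mac", "hello")
```

Making mac_chan private or public corresponds to whether the attacker may call the function directly through the module's interface.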

SLIDE 61

52

How to compile a function?

SLIDE 62

53

Interlude: Functions as Processes

SLIDE 63

54

Interlude: Functions as Processes

 Example
 Other than this translation of functions, everything else is trivial

SLIDE 64

55

Soundness of the translation

SLIDE 65

56

Experimental results

 We coded and verified a series of protocols

 An implementation of a nested RPC protocol (Otway-Rees)
 A library for Web Services Security
 A range of web services protocols, checking interoperability

 We experimented with a range of security properties

 Secrecy  Authentication  Correlation properties

 We coded libraries enabling various realistic attacker models

 The attacker creates new principals, triggers their roles, controls the generation of cryptographic material, and can ask for password and key compromise

SLIDE 66

57

Some Verification Results

SLIDE 67

58

Limits of our model

 As usual, formal security guarantees hold

only within the boundaries of the model

 We keep model and implementation in sync
 We automatically deal with very precise models
 We can precisely “program” the attacker model

 We verify our own implementations, not legacy code
 We trust the F# compiler and the .NET runtime

 Certification is possible, but a separate problem

 We trust our symbolic model of cryptography

 Partial computational soundness results may apply
 Further verification tools may use a concrete model

SLIDE 68

59

Summary (part 2)

 We verify reference implementations of security protocols  Our implementations run with both concrete and symbolic

cryptographic libraries.

 Concrete implementation for production and interop testing
 Symbolic implementation for debugging and verification

 We develop our approach for protocols written in F#,

running on the CLR, verified by ProVerif.

 We show its correctness for a range of security properties,

against realistic classes of adversaries

 We validate our approach on WS protocols

SLIDE 69

Open Problems

 Verifying production code in a real language

 How to verify C code for SSH or SSL or Kerberos?

 Some difficulties: discovery of loop invariants, alias

analysis, discovery of heap invariants, etc.

60

SLIDE 70

Related Work

 From models to code (usually Java)

 Strand spaces: Perrig, Song, Phan (2001), Lukell et al. (2003)
 CAPSL: Muller and Millen (2001)
 Spi calculus: Lashari (2002), Pozza, Sisto, Durante (2004)

 Newest version of spi2java supports interoperability
 Implementation of SSH generated: Sisto, Pironti (2007)

 Giambagi and Dam (2003) show conformance between

models and their implementation in terms of information flow

 Goubault-Larrecq and Parennes (2005) analyze the secrecy of C code by first performing an alias analysis to generate Horn clauses, which they feed to a FOL theorem prover (SPASS)

 Applied to an implementation of the Needham-Schroeder-Lowe protocol

61