Formal Analysis of Electronic Voting Systems (Mark Ryan, University of Birmingham)



SLIDE 1

Formal Analysis of Electronic Voting Systems

Mark Ryan University of Birmingham joint work with Ben Smyth Steve Kremer Imperial College 21 April 2010

SLIDE 2

Outline

1. Potential & current situation
2. Desired properties
3. Example
4. Modelling systems
5. Election verifiability
6. Incoercibility
7. Conclusions

SLIDE 3

Electronic voting: potential

Electronic voting potentially offers:

Efficiency
- higher voter participation
- greater accuracy
- lower costs

Better security
- vote-privacy, even in the presence of corrupt election authorities
- voter verification, i.e. the ability of voters and observers to check the declared outcome against the votes cast

Governments the world over have been trialling e-voting, e.g. the USA, UK, Canada, Brazil, the Netherlands and Estonia. It can also be useful for smaller-scale elections (student guilds, shareholder voting, trade union ballots, local government).

SLIDE 4

Current situation

The potential benefits have turned out to be hard to realise. In the UK, the May 2007 elections included 5 local authorities that piloted a range of electronic voting machines. The Electoral Commission's report concluded that the implementation and security risk was significant and unacceptable, and recommended that no further e-voting take place until a sufficiently secure and transparent system is available.

In the USA: the Diebold controversy has run since 2003, when code was leaked on the internet. The Kohno/Stubblefield/Rubin/Wallach analysis concluded that the Diebold system fell far below even the most minimal security standards: voters without insider privileges could cast unlimited votes without being detected.

SLIDE 5

Current situation in USA, continued

In 2007, the Secretary of State for California commissioned a "top-to-bottom" review by computer-science academics of the four machines certified for use in the state. The result is a catalogue of vulnerabilities, including: appalling software-engineering practices, such as hardcoding crypto keys in source code and bypassing OS protection mechanisms; susceptibility of voting machines to viruses that propagate from machine to machine, and that could maliciously cause votes to be recorded incorrectly or miscounted; and "weakness-in-depth": architecturally unsound systems in which, even as known flaws are fixed, new ones are discovered. In response to these reports, she decertified all four types of voting machine for regular use in California on 3 August 2007.

SLIDE 6

Current situation in Estonia

Estonia is a tiny former Soviet republic (pop. 1.4M), nicknamed "e-Stonia" because of its tech-savvy character.

- The Oct. 2005 local election allowed voters to cast ballots on the internet. 9,317 electronic votes were cast out of 496,336 votes in total (1.9% participated online). Officials hailed the experiment a success, saying there were no reports of hacking or flaws. The system was based on Linux.
- Voters need a special ID smartcard, a $24 device that reads the card, and a computer with internet access. About 80% of Estonian voters have the cards anyway; they have also been used since 2002 for online banking and tax records.
- Feb. 2007 general election: 30,275 voters used internet voting.
SLIDE 7

Internet voting and coercion resistance

The possibility of coercion (e.g. by family members) seems very hard to avoid for internet voting. In Estonia, the threat is somewhat mitigated: the election system allows multiple online votes to be cast by the same person during the days of advance voting, with each vote cancelling the previous one, and it gives priority to paper ballots; a paper ballot cancels any previous online ballot by the same person.
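The cancellation rule can be sketched as a tally function. This is an illustrative toy only: the `Ballot` type and its field names are invented for the example, not Estonia's actual data model.

```python
from dataclasses import dataclass

@dataclass
class Ballot:
    voter: str
    choice: str
    seq: int            # position in the advance-voting stream
    paper: bool = False

def effective_votes(ballots):
    """Apply the cancellation rule: for each voter, a paper ballot beats
    any online ballot; among ballots of the same kind, the latest counts."""
    chosen = {}
    for b in ballots:
        cur = chosen.get(b.voter)
        if cur is None:
            chosen[b.voter] = b
        elif b.paper and not cur.paper:
            chosen[b.voter] = b          # paper overrides online
        elif b.paper == cur.paper and b.seq > cur.seq:
            chosen[b.voter] = b          # later vote cancels earlier
    return {v: b.choice for v, b in chosen.items()}

ballots = [
    Ballot("anne", "X", 1),
    Ballot("anne", "Y", 2),              # coerced vote, later re-cast...
    Ballot("anne", "X", 3),              # ...which cancels it
    Ballot("bob",  "Y", 1),
    Ballot("bob",  "X", 0, paper=True),  # paper ballot wins regardless
]
print(effective_votes(ballots))          # {'anne': 'X', 'bob': 'X'}
```

The point of the rule is visible in `anne`'s ballots: a vote cast under coercion can later be silently overridden, so a receipt of the coerced vote proves nothing.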

SLIDE 8

Where are we? (Outline recap; next: Desired properties)

SLIDE 11

Desired properties

Verifiability
- Outcome of election is verifiable by voters and observers
- You don't need to trust election software

Incoercibility
- Your vote is private
  - even if you try to cooperate with a coercer
  - even if the coercer is the election authorities

Usability
- Vote & go
- Verify any time
SLIDE 12

Examples, compared on three properties (Verifiable / Usable / Incoercible):
- raising hands
- website voting using Tor
- ?

SLIDE 13

How could it be secure?

SLIDE 14

Security by trusted client software

(Diagram.) One component is trusted by the user, but does not need to be trusted by the authorities or by other voters; another is not trusted by the user, and doesn't need to be trusted by anyone.

SLIDE 15

Where are we? (Outline recap; next: Example)

SLIDE 16

Election of president at University of Louvain

The election
- Helios 2.0
- 25,000 potential voters; 5,000 registered, 4,000 voted
- Educated, but not technical
- 30% of voters checked their vote
- No valid complaints

Verifiability
- Anyone can write code to verify the election
- Sample Python code provided

No coercion resistance
- Only recommended for low-coercion environments
- Re-votes are allowed, but don't help w.r.t. an "insider" coercer

[Adida/de Marneffe/Pereira/Quisquater 09]

SLIDE 17

OPEN-AUDIT OF THE RESULTS OF THE RECTOR ELECTION 2009

The voting system used for this election provides universally verifiable elections. This means that:
1. a voter can verify that her ballot is cast as intended (her ballot reflects her own opinion),
2. a voter can verify that her ballot is included unmodified in the collection of ballots to be used at tally time,
3. anyone can verify that the election result is consistent with that collection of ballots.

(UCL, Audit des résultats de l'élection)

SLIDE 18

Helios 2.0

- User prepares a ballot (encrypted vote) on her computer, together with ZKPs. Cut-and-choose auditability provides assurance of correctness; there is not much guarantee of privacy on the client side.
- Ballots are checked and homomorphically combined into a single encrypted outcome.
- The outcome is decrypted by a threshold of talliers, with a proof of correct decryption.
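The cut-and-choose ("cast or audit") idea can be sketched as follows. This is a toy: the encryption is a hash-based stand-in, and the class and function names are invented for illustration, not Helios's actual API.

```python
import hashlib
import secrets

def penc(pk, rand, vote):
    # toy stand-in for probabilistic public-key encryption
    return hashlib.sha256(f"{pk}|{rand}|{vote}".encode()).hexdigest()

class Device:
    """A possibly-dishonest voting client."""
    def __init__(self, pk, cheat=False):
        self.pk, self.cheat = pk, cheat
    def prepare(self, vote):
        self.rand = secrets.token_hex(8)
        recorded = "mallory" if self.cheat else vote
        self.ballot = penc(self.pk, self.rand, recorded)
        return self.ballot
    def open(self):
        # answering an audit challenge reveals the randomness used
        return self.rand

def audit(pk, device, vote):
    """Cut and choose: challenge the device and recompute the ciphertext.
    A device that encrypted something other than `vote` is caught."""
    ballot = device.prepare(vote)
    rand = device.open()
    return penc(pk, rand, vote) == ballot

print(audit("pk", Device("pk"), "alice"))              # True
print(audit("pk", Device("pk", cheat=True), "alice"))  # False
```

An audited ballot is discarded and a fresh one prepared, so the device never knows in advance whether a given encryption will be cast or opened.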


SLIDE 20

Where are we? (Outline recap; next: Modelling systems)

SLIDE 21

Applied pi calculus and ProVerif

The applied pi calculus is a language for describing concurrent processes and their interactions:
- developed explicitly for modelling security protocols;
- similar to the spi calculus, but with more general cryptography.

ProVerif is a leading software tool for automated reasoning: it takes applied pi processes and reasons about observational equivalence, correspondence assertions and secrecy.

History of the applied pi calculus and ProVerif
- 1970s: Milner's Calculus of Communicating Systems (CCS)
- 1989: Milner et al. extend CCS to the pi calculus
- 1999: Abadi & Gordon introduce the spi calculus, a variant of pi
- 2001: Abadi & Fournet generalise spi to the applied pi calculus
- 2000s: Blanchet develops ProVerif to enable automated reasoning about applied pi calculus processes

SLIDE 22

Applied pi calculus: Grammar

Terms

  L, M, N, T, U, V ::=
    a, b, c, k, m, n, s, t, r, …    name
    x, y, z                         variable
    g(M1, …, Ml)                    function

Equational theory

Suppose we have defined a nullary function ok, a unary function pk, binary functions enc, dec, senc, sdec, sign, and a ternary function checksign:

  sdec(x, senc(x, y)) = y
  dec(x, enc(pk(x), y)) = y
  checksign(pk(x), y, sign(x, y)) = ok
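The equational theory above can be mirrored as symbolic rewriting on terms. A minimal sketch in Python, treating terms as nested tuples; the tuple representation is an assumption of this example, not part of the calculus.

```python
# Terms are nested tuples: ("senc", k, m), ("pk", k), etc.
def senc(k, m): return ("senc", k, m)

def sdec(k, t):
    # equation: sdec(x, senc(x, y)) = y
    if t[0] == "senc" and t[1] == k:
        return t[2]
    return ("sdec", k, t)          # stuck term: no equation applies

def pk(k): return ("pk", k)
def enc(p, m): return ("enc", p, m)

def dec(k, t):
    # equation: dec(x, enc(pk(x), y)) = y
    if t[0] == "enc" and t[1] == pk(k):
        return t[2]
    return ("dec", k, t)

def sign(k, m): return ("sign", k, m)

def checksign(p, m, s):
    # equation: checksign(pk(x), y, sign(x, y)) = ok
    if s[0] == "sign" and p == pk(s[1]) and m == s[2]:
        return "ok"
    return ("checksign", p, m, s)

m = ("name", "vote")
assert sdec("k", senc("k", m)) == m
assert dec("sk", enc(pk("sk"), m)) == m
assert checksign(pk("sk"), m, sign("sk", m)) == "ok"
assert dec("wrong", enc(pk("sk"), m)) != m   # wrong key: term stays stuck
```

Note that decryption with the wrong key does not fail with an error; it simply yields a term to which no equation applies, exactly as in the symbolic model.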
SLIDE 23

Applied pi calculus: Grammar

Processes

  P, Q, R ::=
    0                         null process
    P | Q                     parallel composition
    !P                        replication
    νn.P                      name restriction
    u(x).P                    message input
    u⟨M⟩.P                    message output
    if M = N then P else Q    conditional

Extended processes

  A, B, C ::=
    P                         plain process
    A | B                     parallel composition
    νn.A                      name restriction
    νx.A                      variable restriction
    {M/x}                     active substitution

Example: νk.( c⟨senc(k, a)⟩. c⟨senc(k, b)⟩ | {h(k)/x} )

SLIDE 24

Modelling Helios 2.0: equational theory

  dec(xsk, penc(pk(xsk), xrand, xtext)) = xtext

  dec(decKey(xsk, ciph), ciph) = xplain
    where ciph = penc(pk(xsk), xrand, xplain)

  penc(xpk, yrand, ytext) ∗ penc(xpk, zrand, ztext) = penc(xpk, yrand ◦ zrand, ytext + ztext)

  checkBallotPf(xpk, ballot, ballotPf(xpk, xrand, s, ballot)) = true
    where ballot = penc(xpk, xrand, s)

  checkDecKeyPf(pk(xsk), ciph, dk, decKeyPf(xsk, ciph, dk)) = true
    where ciph = penc(pk(xsk), xrand, xplain) and dk = decKey(xsk, ciph)
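The homomorphic equation (ciphertexts multiply, randomness composes, plaintexts add) is realised in Helios by exponential ElGamal. A toy sketch with small parameters and a brute-forced final discrete log; this is for illustration, not production cryptography.

```python
import random

p = 2**61 - 1          # a Mersenne prime; toy group Z_p*, base g
g = 3

def keygen():
    sk = random.randrange(2, p - 1)
    return sk, pow(g, sk, p)

def penc(pub, r, m):
    # exponential ElGamal: encrypts g^m, so ciphertexts multiply to add votes
    return (pow(g, r, p), pow(pub, r, p) * pow(g, m, p) % p)

def cmul(c1, c2):
    # the homomorphic "∗" of the slide: componentwise multiplication
    return (c1[0] * c2[0] % p, c1[1] * c2[1] % p)

def dec(sk, c, max_m=100):
    a, b = c
    gm = b * pow(a, p - 1 - sk, p) % p   # b / a^sk = g^m
    for m in range(max_m + 1):           # brute-force the (small) tally
        if pow(g, m, p) == gm:
            return m
    raise ValueError("plaintext out of range")

sk, pub = keygen()
votes = [1, 0, 1, 1, 0]                  # five voters, "yes" = 1
ballots = [penc(pub, random.randrange(2, p - 1), v) for v in votes]
tally = ballots[0]
for b in ballots[1:]:
    tally = cmul(tally, b)
print(dec(sk, tally))                    # 3: only the sum is ever decrypted
```

Individual ballots are never decrypted; the talliers open only the product, which is why the combination step preserves vote-privacy.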

SLIDE 25

Modelling trusted & untrusted components

Depending on the property to be analysed, some components are required to be trusted and some not.

  If it is…                    Then it is…
  required to be trusted       coded as a process
  auditable                    coded as a process
  not required to be trusted   not coded as a process; the attacker can do what it wants

SLIDE 26

Modelling Helios 2.0: processes

Definition. A voting process specification is a tuple ⟨V, A⟩ where V is a plain process without replication and A is a closed evaluation context such that fv(V) = {v} and rv(V) = ∅. Given a voting process specification ⟨V, A⟩, an integer n ∈ ℕ, and names s1, …, sn, we can build the voting process VPn(s1, …, sn) = A[V1 | ··· | Vn] where Vi = V{si/v}. Intuitively, VPn(s1, …, sn) models the protocol with n voters casting votes for candidates s1, …, sn.

Definition. The voting process specification ⟨Vhelios, Ahelios⟩ is defined where

  Vhelios ≜ d(xpid). d⟨v⟩. d(xballot). d(xballotpf). c⟨(w, xballot, xballotpf)⟩
  Ahelios[_] ≜ νsk, d. ( c⟨pk(sk)⟩ | (!νpid. d⟨pid⟩) | (!B) | T | _ )
  B ≜ νm. d(xvote). d⟨penc(pk(sk), m, xvote)⟩. d⟨ballotPf(pk(sk), m, xvote, penc(pk(sk), m, xvote))⟩
  T ≜ c(xtally). c⟨(decKey(sk, xtally), decKeyPf(sk, xtally, decKey(sk, xtally)))⟩

SLIDE 27

Where are we? (Outline recap; next: Election verifiability)

SLIDE 28

Election verifiability

Individual verifiability: a voter can check that her own vote is included in the ballot collection.
Universal verifiability: anyone can check that the declared outcome corresponds to the ballot collection.
Eligibility verifiability: anyone can check that only eligible votes are included in the ballot collection.

Remark: verifiability ≠ correctness.

SLIDE 29

Individual and universal verifiability

Individual test. We require a test ΦIV(my vote, my data, bb entry) that a voter can apply after the election. The test succeeds iff the bulletin-board entry corresponds to the voter's vote and data.

Universal test. We require a test ΦUV(decl outcome, bb entries, pf) that an observer can apply after the election. The test succeeds iff the declared outcome is correct w.r.t. the bb entries and the proof.

SLIDE 30

Individual and universal verifiability

Tests ΦIV(my vote, my data, bb entry) and ΦUV(decl outcome, bb entries, pf).

Acceptability conditions for ΦIV and ΦUV

Soundness: for all bulletin boards σ,

  ∀i, j. ΦIV(vi, ri, y) ∧ ΦIV(vj, rj, y) ⇒ i = j                  (1)
  ΦUV(ṽ, ỹ, p) ∧ ΦUV(ṽ′, ỹ, p) ⇒ ṽ ≃ ṽ′                          (2)
  ⋀_{1≤i≤n} ΦIV(vi, ri, yi) ∧ ΦUV(ṽ′, ỹ, p) ⇒ ṽ ≃ ṽ′             (3)

Effectiveness: there exists a bulletin board σ s.t.

  ⋀_{1≤i≤n} ΦIV_i(vi, ri, yi)σ ∧ ΦUV(ṽ, ỹ, p)                    (4)
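A toy instantiation can make the conditions concrete. Here ΦIV and ΦUV are deliberately trivial (a bulletin-board entry is just the pair (vote, receipt), with no privacy at all); the point is only to exhibit conditions (1) and (3) holding.

```python
from collections import Counter

def phi_IV(v, r, y):
    # individual test: my bb entry is exactly my (vote, receipt) pair
    return y == (v, r)

def phi_UV(declared, entries, pf):
    # universal test: declared outcome is a permutation of the cast votes
    # (the "proof" pf is unused in this trivial instantiation)
    return Counter(declared) == Counter(v for v, _ in entries)

# Condition (1): two successful individual tests on the same entry y
# force the same (vote, receipt) pair, so one entry serves one voter.
y = ("yes", "r1")
assert phi_IV("yes", "r1", y) and not phi_IV("no", "r1", y)

# Condition (3): if every voter's individual test passes and the
# universal test passes, the declared outcome is a permutation of
# the votes actually cast.
votes, receipts = ["yes", "no", "yes"], ["r1", "r2", "r3"]
entries = list(zip(votes, receipts))
assert all(phi_IV(v, r, y) for v, r, y in zip(votes, receipts, entries))
assert phi_UV(["no", "yes", "yes"], entries, None)
assert not phi_UV(["no", "no", "yes"], entries, None)
```

A real scheme must achieve the same conditions while keeping each vi hidden inside yi, which is where the cryptography of the following slides comes in.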

SLIDE 31

Helios 2.0: Verifiability

The untrusted server is assumed to publish the election data. When the protocol is executed as expected, the resulting frame should have a substitution σ such that

  xpkσ = pk(sk)
  yiσ = (pidi, penc(pk(sk), mi, vi), ballotPf(pk(sk), mi, vi, penc(pk(sk), mi, vi)))
  ztallyσ = (π2(y1) ∗ ··· ∗ π2(yn))σ
  zdecKeyσ = decKey(sk, ztally)σ
  zdecKeyPfσ = decKeyPf(sk, ztally, zdecKey)σ

The tests are:

  ΦIV ≜ y =E (rpid, rballot, rballotpf)
  ΦUV ≜ ztally =E π2(y1) ∗ ··· ∗ π2(yn)
       ∧ ⋀_{i=1}^n checkBallotPf(xpk, π2(yi), π3(yi)) =E true
       ∧ checkDecKeyPf(xpk, ztally, zdecKey, zdecKeyPf) =E true
       ∧ v1 + ··· + vn =E dec(zdecKey, ztally)

SLIDE 33

Individual and universal verifiability

The individual test ΦIV and universal test ΦUV are as on SLIDE 29; we add a third test.

Eligibility test. We require a test ΦEV(pub creds, bb entries, pf) that an observer can apply after the election. The test succeeds iff each of the votes in bb entries was created by an owner of a pub cred.

SLIDE 34

Individual, universal and eligibility verifiability

Tests ΦIV(v, w, r, y), ΦUV(ṽ, ỹ, p), ΦEV(w̃, ỹ, p).

Acceptability conditions for ΦIV, ΦUV and ΦEV: (1), (2), (3), (4), and

  ΦEV(w̃, ỹ, p) ∧ ΦEV(w̃′, ỹ, p) ⇒ w̃ ≃ w̃′                        (5)
  ⋀_{1≤i≤n} ΦIV(vi, ri, yi) ∧ ΦEV(w̃′, ỹ, p) ⇒ w̃ ≃ w̃′           (6)

SLIDE 35

JCJ-Civitas

Voter prepares encrypted ballot on a trusted computer, and submits it.

Similar to Helios 2.0, except that cut-and-choose auditability of ballot isn’t necessary.

Ballots are submitted to a re-encryption mixnet that is able to prove it correctly mixed the ballots. Ballots are threshold decrypted with proof of correct decryption, and counted.

SLIDE 36

JCJ-Civitas: Eligibility verifiability

Voters construct a secret credential through interaction with multiple registrars. A public part of the credential is published on the electoral register, for public scrutiny. Voters submit their vote with a differently randomised public part of the credential. This means that any observer can verify that all the votes cast are cast by eligible voters.
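The mechanism rests on re-encryption plus a plaintext-equivalence test (PET). A toy ElGamal sketch follows; for simplicity the PET here decrypts the quotient of the two ciphertexts directly, whereas Civitas actually performs it as a blinded, distributed protocol among the talliers.

```python
import random

p, g = 2**61 - 1, 3                      # toy prime and base

def enc(pub, r, m):
    return (pow(g, r, p), pow(pub, r, p) * m % p)

def renc(pub, r, c):
    # re-encrypt: fold in fresh randomness; plaintext is unchanged
    return (c[0] * pow(g, r, p) % p, c[1] * pow(pub, r, p) % p)

def dec(sk, c):
    return c[1] * pow(c[0], p - 1 - sk, p) % p

def pet(sk, c1, c2):
    # plaintext-equivalence test: decrypt c1/c2 and compare with 1
    a = c1[0] * pow(c2[0], p - 2, p) % p
    b = c1[1] * pow(c2[1], p - 2, p) % p
    return dec(sk, (a, b)) == 1

sk = random.randrange(2, p - 1)
pub = pow(g, sk, p)

cred = 123456789                          # the voter's secret credential
on_register = enc(pub, random.randrange(2, p - 1), cred)
with_ballot = renc(pub, random.randrange(2, p - 1), on_register)

assert with_ballot != on_register         # unlinkable by inspection...
assert pet(sk, on_register, with_ballot)  # ...but provably the same credential
fake = enc(pub, random.randrange(2, p - 1), 987)
assert not pet(sk, on_register, fake)     # ineligible ballots are caught
```

Because the ballot carries a fresh randomisation of the credential, an observer can check eligibility via the PET without ever linking a ballot to a named register entry.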

SLIDE 37

JCJ-Civitas

(Diagram.) A voter with credential d submits the ballot ({v}^m_{pkT}, {d}^{m′}, zkp). The talliers then: remove malformed ballots; remove duplicates; remove ineligible ballots; decrypt the results. The electoral register contains entries of the form ({d}^{m″}_{pkR}, Anne Jones).

SLIDE 38

JCJ-Civitas: equational theory

  checkBallot(ballotPf(xpk, xrand, xtext, x′pk, x′rand, x′text),
              penc(xpk, xrand, xtext), penc(x′pk, x′rand, x′text)) = true

  pet(petPf(xsk, ciph, ciph′), ciph, ciph′) = true
    where ciph ≜ penc(pk(xsk), xrand, xtext) and ciph′ ≜ penc(pk(xsk), x′rand, xtext)

  renc(yrand, penc(pk(xsk), xrand, xtext)) = penc(pk(xsk), f(xrand, yrand), xtext)

  For each permutation χ on {1, …, n}:
  checkMix(pfMix(xciph,1, …, xciph,n, ciph1, …, ciphn, zrand,1, …, zrand,n),
           xciph,1, …, xciph,n, ciph1, …, ciphn) = true
    where ciphi ≜ renc(zrand,i, xciph,χ(i))
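The checkMix equation can be illustrated with a toy re-encryption mix that outputs its permutation and randomness as an explicit witness; a real mix proves the same statement in zero knowledge instead of revealing the witness.

```python
import random

p, g = 2**61 - 1, 3                      # toy prime and base

def enc(pub, r, m):
    return (pow(g, r, p), pow(pub, r, p) * m % p)

def renc(pub, r, c):
    return (c[0] * pow(g, r, p) % p, c[1] * pow(pub, r, p) % p)

def dec(sk, c):
    return c[1] * pow(c[0], p - 1 - sk, p) % p

def mix(pub, ciphs):
    """Re-encryption mix: permute and re-randomise. Returns the output
    plus (perm, rands): the witness a real mix would prove in ZK."""
    n = len(ciphs)
    perm = random.sample(range(n), n)
    rands = [random.randrange(2, p - 1) for _ in range(n)]
    out = [renc(pub, rands[i], ciphs[perm[i]]) for i in range(n)]
    return out, (perm, rands)

def check_mix(pub, ins, outs, witness):
    # the checkMix equation of the slide, with an explicit witness
    perm, rands = witness
    return all(outs[i] == renc(pub, rands[i], ins[perm[i]])
               for i in range(len(ins)))

sk = random.randrange(2, p - 1)
pub = pow(g, sk, p)
votes = [11, 22, 33]
ins = [enc(pub, random.randrange(2, p - 1), v) for v in votes]
outs, wit = mix(pub, ins)

assert check_mix(pub, ins, outs, wit)                      # mix is correct
assert sorted(dec(sk, c) for c in outs) == sorted(votes)   # votes preserved
```

The output ciphertexts decrypt to the same multiset of plaintexts as the input, but the fresh randomness breaks any link between input and output positions.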

SLIDE 39

JCJ-Civitas: model

Definition. The voting process specification ⟨Ajcj, Vjcj⟩ is defined where:

  Ajcj ≜ νa, sskR. ( !R | {pk(skR)/xpkR, pk(sskR)/xspkR, pk(skT)/xpkT} | _ )
  Vjcj ≜ νm, m′. a(xcred).
           let ciph = penc(xpkT, m, v) in
           let ciph′ = penc(xpkR, m′, π1(xcred)) in
           let zkp = ballotPf(xpkT, m, v, xpkR, m′, π1(xcred)) in
           c⟨(ciph, ciph′, zkp)⟩
  R ≜ νd, m″. let sig = sign(sskR, penc(xpkR, m″, d)) in a⟨(d, sig)⟩. c⟨sig⟩

SLIDE 40

JCJ-Civitas: verifiability

When the protocol is executed as expected, the resulting frame has a substitution σ such that:

  wiσ = sign(sskR, c″i)
  xpkRσ = pk(skR),  xspkRσ = pk(sskR),  xpkTσ = pk(skT)
  yiσ = (ci, c′i, ballotPf(pk(skT), mi, si, pk(skR), m′i, di))
  zbal,iσ = (renc(m̂i, cχ(i)), renc(m̂′i, c′χ(i)))
  zpfMixPairσ = pfMixPair((c1, c′1), …, (cn, c′n),
                  (renc(m̂1, cχ(1)), renc(m̂′1, c′χ(1))), …, (renc(m̂n, cχ(n)), renc(m̂′n, c′χ(n))),
                  (m̂1, m̂′1), …, (m̂n, m̂′n))
  zdecKey,iσ = decKey(skT, renc(m̂i, cχ(i)))
  zdecPf,iσ = decKeyPf(skT, renc(m̂i, cχ(i)), decKey(skT, renc(m̂i, cχ(i))))
  zcred,iσ = renc(m̂″i, c″χ′(i))
  ẑcred,iσ = renc(m̂″χ(χ′⁻¹(i)), c″χ(i))
  zcredMixPfσ = pfMix(c″1, …, c″n, renc(m̂″1, c″χ′(1)), …, renc(m̂″n, c″χ′(n)), m̂″1, …, m̂″n)
  zpetPf,iσ = petPf(skR, renc(m̂′i, c′χ(i)), renc(m̂″χ(χ′⁻¹(i)), c″χ(i)))

where ci ≜ penc(pk(skT), m, si), c′i ≜ penc(pk(skR), m′, di), c″i ≜ penc(pk(skR), m″, di), and χ, χ′ are permutations on {1, …, n}.

SLIDE 41

JCJ-Civitas: verifiability

  ΦIV ≜ y =E ( penc(xpkT, rm, v), penc(xpkR, rm′, π1(rcred)),
               ballotPf(xpkT, rm, v, xpkR, rm′, π1(rcred)) )
       ∧ w = π2(rcred)

  ΦUV ≜ checkMixPair(zpfMixPair, (π1(y1), π2(y1)), …, (π1(yn), π2(yn)), zbal,1, …, zbal,n) =E true
       ∧ ⋀_{i=1}^n dec(zdecKey,i, π1(zbal,i)) =E vi
       ∧ ⋀_{i=1}^n checkDecKeyPf(xpkT, π1(zbal,i), zdecKey,i, zdecPf,i) =E true

  ΦEV ≜ ⋀_{i=1}^n checkBallot(π3(yi), π1(yi), π2(yi))
       ∧ checkMixPair(zpfMixPair, (π1(y1), π2(y1)), …, (π1(yn), π2(yn)), zbal,1, …, zbal,n) =E true
       ∧ ⋀_{i=1}^n pet(zpetPf,i, π2(zbal,i), ẑcred,i) =E true
       ∧ (zcred,1, …, zcred,n) ≃ (ẑcred,1, …, ẑcred,n)
       ∧ checkMix(zcredMixPf, getmsg(w1), …, getmsg(wn), zcred,1, …, zcred,n) =E true
       ∧ ⋀_{i=1}^n checksign(xspkR, wi)

SLIDE 42

Where are we? (Outline recap; next: Incoercibility)

SLIDE 43

Formalisation of vote-privacy

Classically modelled as an observational equivalence between two slightly different processes P1 and P2. But:
- changing the identity does not work, as identities are revealed;
- changing the vote does not work, as the votes are revealed at the end;
↪ so consider two honest voters and swap their votes.

Definition (Privacy). A voting protocol respects privacy if

  S[VA{a/v} | VB{b/v}] ≈ℓ S[VA{b/v} | VB{a/v}].
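The intuition behind swapping two voters' votes, rather than changing one vote, can be seen in a toy model of what an attacker observes at the end of an election.

```python
from collections import Counter

def observable_outcome(assignment):
    """What the attacker sees at the end of a (toy) election: the tally
    of votes, but not who cast which vote."""
    return Counter(assignment.values())

run1 = observable_outcome({"A": "a", "B": "b"})   # A votes a, B votes b
run2 = observable_outcome({"A": "b", "B": "a"})   # votes swapped
assert run1 == run2      # identical observations: the swap is invisible

# Changing a single honest voter's vote IS observable in the tally,
# which is why the definition swaps two voters instead:
assert observable_outcome({"A": "a", "B": "a"}) != run1
```

The formal definition asks for the same indistinguishability, but against an active attacker and over whole process executions, not just final tallies; that is what the labelled bisimilarity ≈ℓ captures.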

SLIDE 44

Receipt-freeness: leaking secrets to the coercer

To model receipt-freeness we need to specify that a coerced voter cooperates with the coercer by leaking secrets on a channel ch. Recall the process grammar

  P ::= 0 | P | P | νn.P | u(x).P | u⟨M⟩.P | if M = N then P else P | !P | …

P^ch is defined in terms of P:

  0^ch = 0
  (P | Q)^ch = P^ch | Q^ch
  (νn.P)^ch = νn. ch⟨n⟩. P^ch
  (u(x).P)^ch = u(x). ch⟨x⟩. P^ch
  (u⟨M⟩.P)^ch = u⟨M⟩. P^ch
  …

We denote by P\out(chc,·) the process νchc.(P | !chc(x)).

Lemma: (P^ch)\out(chc,·) ≈ℓ P
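The P^ch transformation is purely syntactic, so it can be sketched as a recursive function over a small process AST; the tuple encoding of processes is an assumption of this example.

```python
# Processes as tuples: ("nil",), ("par", P, Q), ("new", n, P),
# ("in", u, x, P), ("out", u, M, P).
def leak(P, ch):
    """The P^ch transformation of the slide: the coerced voter forwards
    every fresh name and every received message to the coercer on ch."""
    tag = P[0]
    if tag == "nil":
        return P
    if tag == "par":
        return ("par", leak(P[1], ch), leak(P[2], ch))
    if tag == "new":                 # (νn.P)^ch = νn. ch⟨n⟩. P^ch
        return ("new", P[1], ("out", ch, P[1], leak(P[2], ch)))
    if tag == "in":                  # (u(x).P)^ch = u(x). ch⟨x⟩. P^ch
        return ("in", P[1], P[2], ("out", ch, P[2], leak(P[3], ch)))
    if tag == "out":                 # (u⟨M⟩.P)^ch = u⟨M⟩. P^ch
        return ("out", P[1], P[2], leak(P[3], ch))
    raise ValueError(tag)

V = ("new", "r", ("out", "c", "ballot", ("nil",)))
print(leak(V, "chc"))
# ('new', 'r', ('out', 'chc', 'r', ('out', 'c', 'ballot', ('nil',))))
```

Note that outputs are not forwarded: the coercer already sees public outputs, so only fresh names and inputs need to be leaked.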

SLIDE 45

Receipt-freeness: definition

Intuition. There exists a process V′ which votes a, leaks (possibly fake) secrets to the coercer, and makes the coercer believe she voted c.

Definition (Receipt-freeness). A voting protocol is receipt-free if there exists a process V′ satisfying

  V′\out(chc,·) ≈ℓ VA{a/v}
  S[VA{c/v}^chc | VB{a/v}] ≈ℓ S[V′ | VB{c/v}]

Case study: the Lee et al. protocol. We prove receipt-freeness by exhibiting V′, showing that V′\out(chc,·) ≈ℓ VA{a/v}, and showing that S[VA{c/v}^chc | VB{a/v}] ≈ℓ S[V′ | VB{c/v}].

SLIDE 46

Coercion resistance: talking with the coercer

Like receipt-freeness, but the voter interacts with the coercer during the protocol (instead of just supplying data at the end). The voting booth makes coercion resistance possible.

Interactively communicating with the coercer: P^{c1,c2} is defined in terms of P:

  0^{c1,c2} = 0
  (P | Q)^{c1,c2} = P^{c1,c2} | Q^{c1,c2}
  (νn.P)^{c1,c2} = νn. c1⟨n⟩. P^{c1,c2}
  (u(x).P)^{c1,c2} = u(x). c1⟨x⟩. P^{c1,c2}
  (u⟨M⟩.P)^{c1,c2} = c2(x). u⟨x⟩. P^{c1,c2}
  (!P)^{c1,c2} = !P^{c1,c2}
  (if M = N then P else Q)^{c1,c2} = c2(x). if x = true then P^{c1,c2} else Q^{c1,c2}

SLIDE 47

Coercion resistance: definition

Definition (Coercion resistance). VP is coercion-resistant if there exists a process V′ such that for any C = νc1.νc2.(_ | P) satisfying ñ ∩ fn(C) = ∅ and

  S[C[VA{?/v}^{c1,c2}] | VB{a/v}] ≈ℓ S[VA{c/v}^{chc} | VB{a/v}]

we have

  C[V′]\out(chc,·) ≈ℓ VA{a/v}
  S[C[VA{?/v}^{c1,c2}] | VB{a/v}] ≈ℓ S[C[V′] | VB{c/v}].

Intuitively, C together with the environment represents the coercer. The definition says there is a strategy V′ for the voter such that if the coercer is trying to force A to vote c, then A can run V′, which results in a vote for a but satisfies the coercer. This doesn't take account of fault attacks (cf. Küsters/Truderung).

SLIDE 48

Privacy properties

Proposition. Let VP be a voting protocol. Then:

  VP is coercion-resistant
    ⇓
  VP is receipt-free
    ⇓
  VP respects privacy

SLIDE 49

Where are we? (Outline recap; next: Conclusions)

SLIDE 50

Conclusions

Electronic voting: ongoing issues
- securability
- usability
- adoptability
- comparison with digital cash

Process calculus analysis
- powerful and appropriate
- abstraction level: separation of concerns, but may miss attacks

Trustworthy Voting Systems
- EPSRC project 2009-2013
- Birmingham, Surrey, Newcastle/Luxembourg
- Opt2Vote, Electoral Reform Services, Ministry of Justice
