Helper data schemes for privacy-preserving biometrics, Boris Škorić



slide-1
SLIDE 1

Boris Škorić

TU Eindhoven

van der Meulen seminar Leuven, December 2011

Helper data schemes for privacy-preserving biometrics

1

slide-2
SLIDE 2

Outline

  • “Security with noisy data”
  • biometrics & privacy
  • Physical Unclonable Functions (PUFs)
  • anti-counterfeiting
  • Secure Sketches & Fuzzy Extractors
  • basics
  • Leftover Hash Lemma
  • some easy constructions

2

slide-3
SLIDE 3

Example A: biometric authentication

  • Biometrics are not really secret
  • easy to obtain
  • don’t entrust important secrets to biometric key
  • ... but have to be treated confidentially
  • privacy legislation
  • large databases, insider attacks

3

slide-4
SLIDE 4

Example A: biometric authentication

  • Biometrics are not really secret
  • easy to obtain
  • don’t entrust important secrets to biometric key
  • ... but have to be treated confidentially
  • privacy legislation
  • large databases, insider attacks
  • Solution
  • treat biom. authentication same as Unix passwords
  • store hash of {biometric + salt}
  • Problem
  • noisy measurements
  • hash has no noise tolerance

[figure: bitstrings 0010110101... and 1110111001...]

3
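The noise problem can be shown in a few lines of Python (a toy sketch; the bitstrings and salt are made up): hashing {biometric + salt} Unix-password style gives no noise tolerance, since flipping a single bit of the measurement changes the digest completely.

```python
import hashlib

def stored_value(biometric: bytes, salt: bytes) -> str:
    # Unix-password style: store only the hash of {biometric + salt}
    return hashlib.sha256(salt + biometric).hexdigest()

salt = b"per-user-salt"
enrolled = b"0010110101"   # enrollment measurement (toy bitstring)
noisy    = b"0010110111"   # fresh measurement of the same finger, one bit flipped

h_db = stored_value(enrolled, salt)
h_new = stored_value(noisy, salt)

# The hash has no noise tolerance: the two digests are unrelated
print(h_db == h_new)   # False
```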

slide-5
SLIDE 5

PUFs

  • Relatively new security primitive (Pappu 2001)
  • Physical Unclonable Function
  • complex piece of material
  • challenge-response pairs (CRPs)
  • difficult to characterize (“opaque”)
  • difficult to clone physically
  • difficult to emulate (“mathematical unclonability”)
  • Various applications
  • authentication token
  • anti-counterfeiting
  • secure key storage
  • software to hardware binding
  • tamper evidence

4

slide-6
SLIDE 6

  • Silicon PUF [Gassend et al. 2002]
  • Coating PUF (TiN/TiO2) [Posch 1998; Tuyls et al. 2006]
  • SRAM PUF [Guajardo et al. 2007; Su et al. 2007]
  • FPGA ‘butterfly’ PUF [Kumar et al. 2008]
  • Optical PUF [Pappu 2001]

5

slide-7
SLIDE 7

Anti-counterfeiting

Traditional approach:

  • add authenticity mark to product
  • hard to forge
  • all marks are identical

6

slide-8
SLIDE 8

Anti-counterfeiting

Traditional approach:

  • add authenticity mark to product
  • hard to forge
  • all marks are identical } Er, ... WTF?

6

slide-9
SLIDE 9

Anti-counterfeiting

Traditional approach:

  • add authenticity mark to product
  • hard to forge
  • all marks are identical } Er, ... WTF?

Imagine your company needs a security label ...

  • how do you know what you are buying?
  • nobody discloses technology details
  • there is no “AES” for anti-counterfeiting
  • perfect market for snake oil
  • by the way, many of the suppliers are Chinese

6

slide-10
SLIDE 10

From a Chinese company webpage:

“... is a high-tech company which is professional in laser security. (...) We can supply our clients with the comprehensive products in this field, such as: hologram label design, hologram master shooting, related professional equipments and related hologram materials. (...) And our clients are quite satisfied with our products and services. We are sure that we can meet your demands as well. Welcome to visit our company.Any more requirements, please connect us.”

7

slide-11
SLIDE 11

  • Unique marks
  • uncontrollable process
  • even manufacturer cannot clone
  • digitally signed by Enrollment Authority.
  • Two-step verification
  • check signature of Authority
  • check the mark.
  • Forgery needs either
  • physical cloning
  • or fake signature.
  • Allows open approach
  • no longer security-by-obscurity.

Certificate

Example B: anti-counterfeiting with bare PUFs

[Bauder, Simmons < 1991]

8

slide-12
SLIDE 12

  • Unique marks
  • uncontrollable process
  • even manufacturer cannot clone
  • digitally signed by Enrollment Authority.
  • Two-step verification
  • check signature of Authority
  • check the mark.
  • Forgery needs either
  • physical cloning
  • or fake signature.
  • Allows open approach
  • no longer security-by-obscurity.

Certificate

Example B: anti-counterfeiting with bare PUFs

[Bauder, Simmons < 1991]

Manufacturer afraid to reveal product properties

  • just like biometric privacy
  • store hash of {mark + salt} → problem with noise

8

slide-13
SLIDE 13

Example C: remote authentication with bare PUF

  • PUF serves as huge repository of keys
  • Problem: noisy measurements !

Alice has {ci, Si}; Bob has the PUF; Eve has occasional access to the PUF.

  • Alice picks a random unused index i and sends the challenge ci
  • check that ci is not a replay; Bob measures the PUF response S’
  • never use ci again
  • authenticated channel, with Si as the MAC key

9
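A minimal sketch of this challenge-response flow (all names hypothetical; the PUF is simulated by a keyed function, and the noise that makes this hard in practice is omitted):

```python
import hmac, hashlib, secrets

# Toy PUF: a keyed function stands in for the physical device; 'device_secret'
# models the unclonable physical randomness (name hypothetical).
device_secret = secrets.token_bytes(16)

def puf_response(challenge: bytes) -> bytes:
    return hmac.new(device_secret, challenge, hashlib.sha256).digest()

# Enrollment: Alice measures and stores challenge-response pairs {ci, Si}
crp_table = {}
for _ in range(4):
    c = secrets.token_bytes(8)
    crp_table[c] = puf_response(c)

# Authentication: pick a fresh challenge, never reuse it afterwards
challenge, expected = crp_table.popitem()        # removed: never used again
measured = puf_response(challenge)               # Bob measures the PUF

# The response serves as MAC key on the authenticated channel
message = b"transfer 10 EUR"
tag = hmac.new(measured, message, hashlib.sha256).digest()
ok = hmac.compare_digest(tag, hmac.new(expected, message, hashlib.sha256).digest())
print(ok)   # True (noise-free toy; a real PUF response first needs a fuzzy extractor)
```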

slide-14
SLIDE 14

Example D: Read-proof key storage

Device secrets stored during off state

  • attacker has full access
  • assumption: digital NV memory is insecure

Derive encryption key from Silicon/Coating PUF

  • only when needed
  • non-digital, hard to read from outside
  • tampering destroys key
  • Physically Obfuscated Key (POK)

[diagram: EK[Device secrets] in insecure NV memory; crypto processor and POK sensor as integrated components; the POK yields the key K, with noise]

10

slide-15
SLIDE 15

Secure Sketches & Fuzzy Extractors

11

slide-16
SLIDE 16

A special kind of noise correction

Juels, Wattenberg 1999; Dodis, Reyzin, Smith 2003; Linnartz, Tuyls 2003

Redundancy data

  • required for error correction
  • created at enrollment
  • assumed public! (stored e.g. in DB)
  • must not leak

Secure Sketch: W ← SS(X) (W = helper data); X̂ ← Rec(X’, W)

  • Prob[X̂ ≠ X] is low
  • I(W; X) is small

Fuzzy Extractor: (S, W) ← Gen(X); S’ ← Rep(X’, W)

  • Prob[S’ ≠ S] is low
  • I(W; S) is small
  • don’t care about I(W; X)

12

slide-17
SLIDE 17

Which technique to use, SS or FE?

Criteria: does the application need privacy of X? does it need a uniform secret?

  • authentication by password → One-Way Function
  • biometric authentication → Secure Sketch + OWF
  • anti-counterfeiting with a bare PUF → Secure Sketch + OWF
  • PUF authentication w/o MACs (bare PUF) → Secure Sketch + OWF
  • PUF authentication with MACs → Fuzzy Extractor
  • POK → Fuzzy Extractor

13

slide-18
SLIDE 18

Privacy-preserving biometric database

ID | helper data | salt | hash
1  | W1          | c1   | H1 = h(c1||X1)
...| ...         | ...  | ...
n  | Wn          | cn   | Hn = h(cn||Xn)

Enrollment: Wi ← SS(Xi); Hi = h(ci||Xi)
Authentication: X̂i ← Rec(X’i, Wi); compare h(ci||X̂i) with the stored Hi → yes/no

14
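The table and flow above can be sketched end to end (a toy: 6-bit “biometrics” and a 3-bit repetition code stand in for real features and a real ECC; the sketch used here is the code-offset construction that appears later in the deck):

```python
import hashlib, secrets

REP = 3   # toy linear ECC: 3-bit repetition code, corrects 1 error per block

def ss(x):
    """Code-offset sketch: W = C xor X for a random codeword C."""
    c = [b for b in (secrets.randbelow(2) for _ in range(len(x) // REP))
           for _ in range(REP)]
    return [ci ^ xi for ci, xi in zip(c, x)]

def rec(x_noisy, w):
    """Majority-decode X' xor W back to a codeword, then xor W off again."""
    cn = [xi ^ wi for xi, wi in zip(x_noisy, w)]
    c = []
    for i in range(0, len(w), REP):
        bit = 1 if sum(cn[i:i + REP]) > REP // 2 else 0
        c += [bit] * REP
    return [ci ^ wi for ci, wi in zip(c, w)]

def enroll(x, user_id):
    """One database row: ID, helper data W, salt c, hash H = h(c || X)."""
    w, salt = ss(x), secrets.token_bytes(8)
    h = hashlib.sha256(salt + bytes(x)).hexdigest()
    return {"ID": user_id, "W": w, "salt": salt, "hash": h}

def authenticate(x_noisy, row):
    x_hat = rec(x_noisy, row["W"])
    return hashlib.sha256(row["salt"] + bytes(x_hat)).hexdigest() == row["hash"]

x = [1, 0, 1, 1, 0, 1]                  # enrollment measurement (toy, 6 bits)
row = enroll(x, 1)
x_noisy = x.copy(); x_noisy[0] ^= 1     # one bit of measurement noise
print(authenticate(x_noisy, row))       # True
```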

slide-19
SLIDE 19

Helper Data; some intuition

[diagrams: Secure Sketch x → SS → w, (x’, w) → Rec → x̂; Fuzzy Extractor x → Gen → (s, w), (x’, w) → Rep → s’; the FE key is built on an SS]

Noisy measurement X = stable part + noisy part; the helper data W corresponds to the noisy part.

15

slide-20
SLIDE 20

Helper Data; some intuition

[diagram: x → SS → w; (x’, w) → Rec → x̂]

Noisy measurement X = stable part + noisy part; the helper data W corresponds to the noisy part.

Helper Data reveals the noisy part of X.

  • SS: How much privacy is lost?

Not so bad:

  • W is subject to noise anyway
  • many people may have same W
  • FE: How much of the key is leaked?

Zero if W, S derived from independent parts of X

[diagram: x → Gen → (s, w); (x’, w) → Rep → s’]

15

slide-21
SLIDE 21

Bluff your way in Secure Sketches

Enrollment phase:

Discrete, non-uniform, noisy X.

w ← SS(x)

16

slide-22
SLIDE 22

Bluff your way in Secure Sketches

Enrollment phase:

w ← SS(x)

[figure: the point x and its helper data w in the tiling]

Discrete, non-uniform, noisy X.

16

slide-23
SLIDE 23

Reconstruction phase:

x̂ = Rec(x’, w)

17

slide-24
SLIDE 24

Reconstruction phase:

x̂ = Rec(x’, w)

17


slide-26
SLIDE 26

Secure Sketch: privacy of X

How much does W leak?

  • position of X in a tile

How bad is that?

  • generally not so bad
  • “least significant bits”
  • subject to noise anyway
  • attacker must guess which tile

18

slide-27
SLIDE 27

Secure Sketch: privacy of X

How much does W leak?

  • position of X in a tile

How bad is that?

  • generally not so bad
  • “least significant bits”
  • subject to noise anyway
  • attacker must guess which tile

Can you do without helper data?

  • only if all enrollments occur first
  • but then the tiling itself leaks

18
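The tiling can be made concrete for a single continuous feature (a toy sketch; Q is an assumed tile width): the helper datum is the offset from x to the nearest tile centre, so W reveals exactly the position of X inside its tile and nothing more.

```python
Q = 1.0   # tile width (assumed parameter)

def ss(x: float) -> float:
    """Helper datum: the offset from x to the nearest tile centre."""
    return Q * round(x / Q) - x

def rec(x_noisy: float, w: float) -> float:
    """Shift by w, snap to the nearest tile centre, shift back."""
    return Q * round((x_noisy + w) / Q) - w

x = 3.37
w = ss(x)                        # reveals only x's position inside its tile
x_hat = rec(x + 0.2, w)          # any noise below Q/2 is corrected exactly
print(abs(x_hat - x) < 1e-9)     # True
```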

slide-28
SLIDE 28

Fuzzy Extractor: generic construction from SS

Dodis, Reyzin, Smith 2003

Gen: w ← SS(x); pick a random seed r; s = UHF_r(x)
Rep: x̂ ← Rec(x’, w); s’ = UHF_r(x̂)

(UHF = universal hash function; w and r are public)

19
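The generic construction can be sketched with the code-offset method as the SS and a Carter-Wegman style hash as the UHF (toy parameters, all names my own; truncating mod 2^ℓ makes the family almost universal rather than exactly universal):

```python
import secrets

P = (1 << 61) - 1   # Mersenne prime for the hash field (assumed parameter)
ELL = 16            # extracted key length in bits (assumed parameter)
REP = 3             # repetition code of the underlying secure sketch

def ss(x):
    """Code-offset secure sketch: w = c xor x for a random codeword c."""
    c = [b for b in (secrets.randbelow(2) for _ in range(len(x) // REP))
           for _ in range(REP)]
    return [ci ^ xi for ci, xi in zip(c, x)]

def rec(x_noisy, w):
    """Majority-decode x' xor w back to a codeword, then xor w off again."""
    cn = [xi ^ wi for xi, wi in zip(x_noisy, w)]
    c = []
    for i in range(0, len(w), REP):
        bit = 1 if sum(cn[i:i + REP]) > REP // 2 else 0
        c += [bit] * REP
    return [ci ^ wi for ci, wi in zip(c, w)]

def uhf(bits, a, b):
    """(a*x + b) mod P, truncated to ELL bits: an almost universal family."""
    x = int("".join(map(str, bits)), 2)
    return ((a * x + b) % P) % (1 << ELL)

def gen(x):
    a, b = secrets.randbelow(P - 1) + 1, secrets.randbelow(P)
    return uhf(x, a, b), (ss(x), a, b)      # secret S and public W = (sketch, r)

def rep(x_noisy, public):
    w, a, b = public
    return uhf(rec(x_noisy, w), a, b)       # S' = UHF_r(Rec(X', W))

x = [1, 0, 1, 1, 0, 1, 0, 0, 1]             # enrollment measurement
s, public = gen(x)
x_noisy = x.copy(); x_noisy[4] ^= 1          # one bit of noise
print(rep(x_noisy, public) == s)             # True
```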

slide-29
SLIDE 29

Almost Universal Hash Functions

Φ: X×R→T is called η-almost universal if, for fixed x ≠ x’ [Carter, Wegman 1979]:

    Prob_R[Φ_R(x) = Φ_R(x’)] ≤ η

Called universal for η = 1/|T|.

Leftover hash lemma: if F: X×R→{0,1}^ℓ is 2^(−ℓ)·(1+δ)-almost universal, then the distance of F(X,R) from uniformity, given Y and R, satisfies

    ∆( F(X,R) Y R ; U_ℓ Y R ) ≤ (1/2)·√( δ + 2^(ℓ − H2(X|Y)) )

20
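The definition is easy to check exhaustively for a small Carter-Wegman family (toy parameters p = 101, m = 8, my choice): over all seeds (a, b), the collision probability for any fixed pair of distinct inputs stays at or below 1/|T|.

```python
p, m = 101, 8        # prime field size and range size |T| = m (toy parameters)
x1, x2 = 5, 42       # any fixed pair of distinct inputs

def phi(x, a, b):
    """Carter-Wegman family: ((a*x + b) mod p) mod m."""
    return ((a * x + b) % p) % m

# Exhaust the seed space R = {(a, b) : 1 <= a < p, 0 <= b < p}
seeds = [(a, b) for a in range(1, p) for b in range(p)]
collisions = sum(phi(x1, a, b) == phi(x2, a, b) for a, b in seeds)
prob = collisions / len(seeds)

# The family is universal: collision probability at most 1/|T|
print(prob <= 1 / m)   # True
```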

slide-30
SLIDE 30

Extractable randomness

Invert the leftover hash lemma:

    ℓ_ext^ε(X|Y) ≥ H2(X|Y) + 2 − 2 log(1/ε)

(penalty due to the uniformity requirement)

Rather bad:

  • H2 ≤ Shannon entropy
  • penalty term depends on the target ε, not on the uniformity improvement

21
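Plugging in illustrative numbers (my own, not from the slide): with H2(X|Y) = 100 bits and a target distance ε = 2^(−20), the bound guarantees only 62 extractable bits, i.e. 40 bits are lost to the penalty term.

```python
import math

h2 = 100            # assumed Renyi entropy H2(X|Y), in bits
eps = 2 ** -20      # assumed target distance from uniformity

ell = h2 + 2 - 2 * math.log2(1 / eps)   # l_ext >= H2 + 2 - 2 log(1/eps)
print(ell)   # 62.0
```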

slide-31
SLIDE 31

Basic examples: Fuzzy Extractors

22

slide-32
SLIDE 32

Code offset method

  • X = binary string
  • Use a linear ECC

Enrollment

  • random key s
  • encode s to a codeword c_s
  • w = c_s ⊕ x (for binary strings, “−” and “+” are both XOR)

Reconstruction

  • s’ = decode(x’ ⊕ w)

23
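A runnable toy of the code-offset method, with a 3-bit repetition code as the linear ECC (parameters my own):

```python
import secrets

N, REP = 5, 3   # 5 key bits, each protected by a length-3 repetition code

def encode(s):                        # s -> codeword c_s
    return [bit for bit in s for _ in range(REP)]

def decode(bits):                     # majority vote per block
    return [1 if sum(bits[i:i + REP]) > REP // 2 else 0
            for i in range(0, len(bits), REP)]

def enroll(x):
    s = [secrets.randbelow(2) for _ in range(N)]    # random key s
    w = [ci ^ xi for ci, xi in zip(encode(s), x)]   # w = c_s xor x
    return s, w

def reconstruct(x_noisy, w):
    return decode([xi ^ wi for xi, wi in zip(x_noisy, w)])  # decode(x' xor w)

x = [secrets.randbelow(2) for _ in range(N * REP)]  # enrollment measurement
s, w = enroll(x)
x_noisy = x.copy()
x_noisy[0] ^= 1; x_noisy[7] ^= 1      # one error in two different blocks
print(reconstruct(x_noisy, w) == s)   # True
```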

slide-33
SLIDE 33

Fuzzy Extractor purely from UHFs

MAC key helper data

BŠ, Tuyls 2008

24

slide-34
SLIDE 34

Fuzzy Extractor purely from UHFs

[diagram: the MAC key and the helper data are both produced by universal hash functions; additionally a MAC on w protects the helper data]

BŠ, Tuyls 2008

24

slide-35
SLIDE 35
Fuzzy Extractor from partitions

  • first partition: equiprobable keys
  • second partition: helper data, with equiprobable subpartitions
  • hence S | W=w is uniform

Verbitskiy, Tuyls, Obi, Schoenmakers, BŠ 2008

25

slide-36
SLIDE 36

“Robustness” of the helper data

Protecting the helper data (no PKI)

Boyen 2005

Random oracle model

  • enrollment stores w and h = hash(x, w)
  • an attacker may replace them by w’, h’
  • reconstruction computes x̂ and accepts only if h’ == hash(x̂, w’)

26

slide-37
SLIDE 37

“robustness”

  • f helper data

Protecting the helper data (no PKI)

Boyen 2005

Random oracle model

  • enrollment stores w and h = hash(x, w)
  • an attacker may replace them by w’, h’
  • reconstruction computes x̂ and accepts only if h’ == hash(x̂, w’)

Standard model

  • s = s1 || s2; enrollment stores w and m = MAC(s1, w)
  • an attacker may replace them by w’, m’
  • reconstruction computes s’ = s’1 || s’2, checks m’ == MAC(s’1, w’), and uses s2 as the secret
  • needs a key-manipulation-secure MAC (“KMS-MAC”) [Cramer et al. 2008]

26
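The standard-model flow can be sketched with stdlib pieces (heavily simplified, all assumptions mine: HMAC-SHA256 stands in for the KMS-MAC, the extractor is a noise-free hash, and the helper data is a placeholder):

```python
import hmac, hashlib

def extract(measurement: bytes) -> bytes:
    # Toy, noise-free 'Gen/Rep core': derives s = s1 || s2 from the measurement.
    return hashlib.sha256(b"extract" + measurement).digest()

def gen(x: bytes):
    s = extract(x)
    s1, s2 = s[:16], s[16:]
    w = b"public-helper-data"                      # placeholder helper data
    m = hmac.new(s1, w, hashlib.sha256).digest()   # m = MAC(s1, w)
    return s2, (w, m)                              # secret s2; public (w, m)

def rep(x_prime: bytes, public):
    w, m = public
    s = extract(x_prime)
    s1, s2 = s[:16], s[16:]
    if not hmac.compare_digest(m, hmac.new(s1, w, hashlib.sha256).digest()):
        raise ValueError("helper data was modified")
    return s2                                      # use s2 as the secret

x = b"PUF-measurement"
s2, public = gen(x)
assert rep(x, public) == s2                        # honest run succeeds
try:
    rep(x, (b"evil-helper-data", public[1]))       # attacker substitutes w'
    ok = False
except ValueError:
    ok = True
print(ok)   # True: tampering with the helper data is detected
```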

slide-38
SLIDE 38

Topics I did not mention

  • noisy identification; search in a fuzzy database

  • cross-correlating users in multiple DBs
  • privacy vs. key rate tradeoffs
  • properties of actual biometric data
  • noise patterns
  • tricks with error-correcting codes
  • ...

27

slide-39
SLIDE 39

Summary (sort of)

Main messages

  • Error correction is inevitable in some security scenarios
  • privacy-preserving biometric DB
  • anti-counterfeiting with PUFs
  • key extraction from physical sources
  • Well established primitives
  • Secure Sketch
  • Fuzzy Extractor
  • Universal Hash Functions
  • it can be very simple !

28