
Finding Yourself Is The Key - University of Haifa - Biometric Key Derivation - PowerPoint PPT Presentation



  1. Finding Yourself Is The Key - Biometric Key Derivation that Keeps Your Privacy. Orr Dunkelman, University of Haifa. Joint work with Mahmood Sharif and Margarita Osadchy

  2. Overview
  ❖ Motivation
  ❖ Background:
    • The Fuzziness Problem
    • Cryptographic Constructions
    • Previous Work
    • Requirements
  ❖ Our System:
    • Feature Extraction
    • Binarization
    • Full System
  ❖ Experiments
  ❖ Conclusions

  3. Motivation
  ❖ Key derivation: generating a secret key from information possessed by the user
  ❖ Passwords, the most widely used means of key derivation, are problematic:
    1. Forgettable
    2. Easily observed (shoulder-surfing)
    3. Low entropy
    4. Carried over between systems

  4. Motivation
  ❖ Suggestion: use biometric data for key generation
  ❖ Problems:
    1. It is hard or impossible to replace a biometric template if it gets compromised
    2. Privacy of the users

  5. Overview
  ❖ Motivation
  ❖ Background:
    • The Fuzziness Problem
    • Cryptographic Constructions
    • Previous Work
    • Requirements
  ❖ Our System:
    • Feature Extraction
    • Binarization
    • Full System
  ❖ Experiments
  ❖ Conclusions

  6. Biometric Key Derivation
  [Diagram: a biometric sample x is mapped to a cryptographic key K]

  7. The Fuzziness Problem
  ❖ Two images of the same face are rarely identical (due to lighting, pose, and expression changes)
  ❖ Yet we want to consistently derive the same key for the user every time
  ❖ The fuzziness in the samples is handled by:
    1. Feature extraction
    2. Error-correcting codes and helper data
  [Image pair, taken one after the other: 81,689 pixels differ; only 3,061 pixels have identical values!]

  8. The 3-Step Process: Feature Extraction → Binarization → Error Correction (ECC)
  • Feature extraction reduces changes due to viewing conditions and small distortions
  • Binarization converts to a binary representation and removes most of the noise
  • Error correction removes the remaining noise

  9. Feature Extraction
  • User-specific features: Eigenfaces (PCA), Fisherfaces (FLD). A training step produces user-specific parameters, stored for feature extraction.
  • Generic features: histograms of low-level features (e.g., LBPs, SIFT), filters (e.g., Gabor features). No training and no user-specific information required.

  10. Feature Extraction - Previous Work
  ❖ [FYJ10] used Fisherfaces, so the public data looks like the users:
  ❖ Very discriminative (better recognition)
  ❖ But compromises privacy - cannot be used!

  11. Feature Extraction - Generic Features?
  ❖ Yes, but they require caution.
  ❖ In [KSVAZ05], high-order dependencies between different channels of the Gabor transform
  ❖ ➜ correlations between the bits of the suggested representation

  12. Binarization
  ❖ Essential for using the cryptographic constructions
  ❖ Some claim non-invertibility [TGN06], but biometric features can be approximated
  ❖ Methods: sign of projection, or quantization
  ❖ Quantization is more accurate, but requires storing additional private information.

  13. Cryptographic Noise-Tolerant Constructions
  ❖ Fuzzy Commitment [JW99]:
    • Enrollment: pick a random string s ← {0,1}^ℓ, encode it with an ECC, and XOR the codeword with the binary representation of the biometrics to obtain the helper data
    • Key generation: XOR the helper data with the binary representation of the fresh biometrics, then decode to recover s
  ❖ Other constructions: Fuzzy Vault [JS06], Fuzzy Extractors [DORS08]
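A minimal sketch of the fuzzy commitment idea, using a toy 3x repetition code in place of a real ECC (the bit vectors and function names here are illustrative, not from the slides):

```python
# Toy fuzzy commitment: a repetition-3 code stands in for the real ECC.

def encode(s):
    """Repeat every key bit 3 times (toy ECC)."""
    return [b for b in s for _ in range(3)]

def decode(c):
    """Majority vote over each 3-bit block."""
    return [1 if sum(c[i:i + 3]) >= 2 else 0 for i in range(0, len(c), 3)]

def xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

# Enrollment: random secret s; helper data = Encode(s) XOR biometric bits.
s = [1, 0, 1, 1]
biometric = [0, 1, 1, 0, 0, 1, 1, 1, 0, 0, 1, 0]
helper = xor(encode(s), biometric)

# Key generation: a fresh, slightly noisy sample of the same user.
noisy = list(biometric)
noisy[4] ^= 1                      # one flipped bit
recovered = decode(xor(helper, noisy))
print(recovered == s)              # True: the ECC absorbs the noise
```

Note that the helper data alone reveals neither s nor the biometric directly, which is exactly the property the privacy requirements later formalize.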

  14. Previous Work - Problems
  1. Short keys
  2. Non-uniformly distributed binary strings as input to the fuzzy commitment scheme
  3. Dependency between bits of the biometric samples
  4. Auxiliary data leaks personal information
  5. No privacy protection when the adversary gets hold of the cryptographic key (a.k.a. strong biometric privacy)

  15. Security Requirements
  1. Consistency: identify a person as himself (low FRR)
  2. Discrimination: an impostor cannot impersonate an enrolled user (low FAR)
  [BKR08]:
  3. Weak Biometric Privacy (REQ-WBP): computationally infeasible to learn the biometric information given the helper data
  4. Strong Biometric Privacy (REQ-SBP): computationally infeasible to learn the biometric information given the helper data and the key
  5. Key Randomness (REQ-KR): given access to the helper data, the key should be computationally indistinguishable from random

  16. Overview
  ❖ Motivation
  ❖ Background:
    1. The Fuzziness Problem
    2. Cryptographic Constructions
    3. Previous Work
    4. Requirements
  ❖ Our System:
    1. Feature Extraction
    2. Binarization
    3. Full System
  ❖ Experiments
  ❖ Conclusions

  17. Feature Extraction - 1. Landmark Localization and Alignment
  ❖ Face landmark localization [ZR12] and an affine transformation to a canonical pose
  ❖ An essential step, due to the inability to perform alignment between the enrolled and a newly presented template

  18. Feature Extraction - 2. Feature Extraction
  ❖ Local Binary Pattern (LBP) descriptors are computed from 21 regions defined on the face
  ❖ The same is done with Scale-Invariant Feature Transform (SIFT) descriptors
  ❖ Histograms of Oriented Gradients (HOGs) are computed on the whole face
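As a reminder of what the basic LBP operator computes, here is a minimal sketch for a single pixel of a 3x3 patch (the neighbor ordering and the tiny example patch are illustrative choices, not taken from the slides; the per-region descriptors are histograms of such codes):

```python
def lbp_code(patch):
    """Basic 3x3 LBP: threshold the 8 neighbors against the center
    pixel and pack the resulting bits into one byte."""
    c = patch[1][1]
    # Neighbors in a fixed clockwise order starting at the top-left.
    coords = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    bits = [1 if patch[r][col] >= c else 0 for r, col in coords]
    return sum(b << i for i, b in enumerate(bits))

patch = [[6, 5, 2],
         [7, 6, 1],
         [9, 8, 7]]
print(lbp_code(patch))  # 241
```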

  19. Feature Extraction - 3. Dimension Reduction and Whitening
  • Dimension reduction and concatenation of the feature vectors
  • Removing correlations between the features
  • Rescaling to the [0,1] interval

  20.-25. Binarization by Projection
  h(x) = (1/2)(sign(W^T x) + 1)
  [Animation over six slides: each hyperplane W_i splits the feature space; a sample x on the positive side gets bit h_i(x) = 1, on the negative side h_i(x) = 0. A fresh sample x' of the same user lands near x, so h_i(x') = h_i(x) for every hyperplane that does not pass between them.]
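The sign-of-projection rule above can be sketched in a few lines; the 2x2 projection matrix W and the sample vectors are made-up values for illustration:

```python
def binarize(W, x):
    """h_i(x) = (sign(w_i . x) + 1) / 2: one bit per hyperplane (row of W)."""
    return [1 if sum(wi * xi for wi, xi in zip(row, x)) >= 0 else 0
            for row in W]

W = [[1.0, 0.0],    # two toy hyperplanes (rows of W)
     [0.0, 1.0]]
x  = [0.5, -0.2]    # enrollment sample
x2 = [0.6, -0.1]    # fresh, slightly perturbed sample of the same user
print(binarize(W, x), binarize(W, x2))  # [1, 0] [1, 0] -- same bits
```

Because neither hyperplane passes between x and x2, both samples map to the same binary string, which is exactly the consistency property the scheme needs.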

  26.-27. Embedding in d-dimensional space
  [Figure: hyperplanes W_i and W_j embed each sample as a string of -1/+1 sides]

  28. Binarization Alg.
  ❖ Requirements from the binary representation:
    1. Consistency and discrimination
    2. No correlations between the bits
    3. High min-entropy
  ❖ We find a discriminative projection space W by generalizing an algorithm from [WKC10] (originally for solving the approximate nearest neighbor problem)
  ❖ The aim is to find hyperplanes such that a pair of samples receives the same bit if it belongs to the same user, and different bits otherwise

  29. Removing Dependencies between Bits
  Dimension reduction and concatenation of the feature vectors → X

  30. Removing Dependencies between Bits
  Removing correlations between the features: Z = A^T X, followed by rescaling to the [0,1] interval

  31. Removing Dependencies between Bits
  Projection onto orthogonal hyperplanes W: h(z) = (1/2)(sign(W^T z) + 1) → mutually independent bits
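To illustrate why projecting whitened (isotropic) data onto orthogonal hyperplanes yields nearly independent bits, here is a small simulation; the directions, sample count, and seed are arbitrary illustrative choices:

```python
import random

random.seed(0)
w_a = (0.6, 0.8)    # two orthogonal unit directions
w_b = (-0.8, 0.6)
w_c = (0.8, 0.6)    # NOT orthogonal to w_a, shown for contrast

def bit(w, x):
    """Sign-of-projection bit for direction w."""
    return 1 if w[0] * x[0] + w[1] * x[1] >= 0 else 0

def corr(u, v):
    """Empirical Pearson correlation of two bit sequences."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v)) / n
    su = sum((a - mu) ** 2 for a in u) / n
    sv = sum((b - mv) ** 2 for b in v) / n
    return cov / (su * sv) ** 0.5

# Whitened features behave like isotropic Gaussian samples.
samples = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(5000)]
a = [bit(w_a, x) for x in samples]
b = [bit(w_b, x) for x in samples]
c = [bit(w_c, x) for x in samples]
print(abs(corr(a, b)))  # near 0: orthogonal hyperplanes -> independent bits
print(corr(a, c))       # large: nearly parallel hyperplanes -> correlated bits
```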

  32. Full System
  ❖ Enrollment: feature extraction → binarization; pick a random s ← {0,1}^ℓ, encode it, and XOR the codeword with the binary representation to produce the helper data
  ❖ Key generation: feature extraction → binarization; XOR with the helper data, decode to recover s, and hash it to obtain the key
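The final "decode and hash" step can be sketched as follows; the slides do not name a hash function, so SHA-256 is assumed here purely for illustration:

```python
import hashlib

def derive_key(s_bits):
    """Hash the decoded secret s into the final key (SHA-256 assumed)."""
    s_bytes = bytes(int("".join(map(str, s_bits[i:i + 8])), 2)
                    for i in range(0, len(s_bits), 8))
    return hashlib.sha256(s_bytes).hexdigest()

s = [1, 0, 1, 1, 0, 0, 1, 0]   # toy decoded secret (one byte)
print(derive_key(s))           # 256-bit key as 64 hex characters
```

Hashing also helps toward the key-randomness requirement: even if the decoded s is not perfectly uniform, the output looks uniform to a computationally bounded adversary.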

  33. Transfer Learning of the Embedding
  • Learning W is done only once, using subjects different from the users of the key derivation system.
  • How is it done? Instead of learning user-specific questions ("Is this Alice?", "Is this Bob?"), we learn the generic question "Are these two samples the same person or different?" - a more generic question that can be learned for the whole population.

  34. Overview
  ❖ Motivation
  ❖ Background:
    • The Fuzziness Problem
    • Cryptographic Constructions
    • Previous Work
    • Requirements
  ❖ Our System:
    • Feature Extraction
    • Binarization
    • Full System
  ❖ Experiments
  ❖ Conclusions

  35. Experiments - Constructing the Embedding
  • Performed only once
  • Subjects are different from those used in testing

  View     Number of Subjects   Images per Subject   Number of Hyperplanes
  Frontal  949                  3-4                  800
  Profile  1117                 1-8                  800

  36. Experiments - Evaluation
  ❖ Data: 2 frontal images and 2 profile images of 100 different subjects (not in the training set)
  ❖ Recognition tests: a 5-round cross-validation protocol measured TPR vs. FPR while increasing the threshold (ROC curves)
  ❖ Key generation tests: 100 genuine authentication attempts and 99 × 100 impostor authentication attempts

  37. Results - Recognition
  [Figure: ROC curves]

  38. Results - Key Generation
  ❖ There is a trade-off between the number of errors the error-correcting code can handle and the length of the produced key
  ❖ The Hamming bound gives the relation 2^k ≤ 2^n / Σ_{i=0}^{t} C(n, i), where:
    - n: the code length (= 1600 in our case)
    - t: the maximal number of corrected errors
    - k: the length of the encoded message (the produced key, in our case)
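The Hamming bound is easy to evaluate numerically. A small sketch, checked against two classic perfect codes that meet the bound exactly (the n = 1600 figure is the slide's code length; the actual key lengths achieved depend on the concrete code construction, not on this bound alone):

```python
from math import comb, log2, floor

def hamming_bound_k(n, t):
    """Largest k with 2^k <= 2^n / sum_{i<=t} C(n, i)."""
    ball = sum(comb(n, i) for i in range(t + 1))  # Hamming ball volume
    return floor(n - log2(ball))

print(hamming_bound_k(7, 1))     # 4  -- the [7,4] Hamming code meets it
print(hamming_bound_k(23, 3))    # 12 -- the binary Golay code meets it
print(hamming_bound_k(1600, 80))  # upper bound on k for the slide's n, t
```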

  39. Results - Key Generation
  For FAR = 0:

  k ≤    t    FRR (our method)   FRR (random projection)
  595    80   0.30               0.32
  609    70   0.16               0.23
  624    60   0.12               0.19

  40. Error-Correcting Code: Reed-Solomon Followed by Concatenation (PUFKY)
  • Let X be the biometrics, split into 5-bit symbols over GF(2^5)
  • Reed-Solomon over GF(2^5): 15 symbols are encoded into 31 symbols
  • With a bit-error probability of 0.3, the probability of error in a 5-bit symbol is 1 − 0.7^5 ≈ 0.83
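The symbol-error figure follows directly from the bit-error rate: a 5-bit symbol is correct only if all five of its bits are. A quick check, using the slide's 0.3 bit-error probability:

```python
p_bit = 0.3                      # per-bit error probability (slide's value)
bits_per_symbol = 5              # symbols over GF(2^5)

# A symbol survives only if every one of its bits does.
p_symbol_ok = (1 - p_bit) ** bits_per_symbol
p_symbol_err = 1 - p_symbol_ok
print(round(p_symbol_err, 3))    # 0.832
```

This is why the naive symbol-level Reed-Solomon code struggles here, motivating the concatenated construction: an inner binary code first drives the bit-error rate down before the outer Reed-Solomon code is applied.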
