  1. How to Generate Keys from Facial Images and Keep Your Privacy at the Same Time. Margarita Osadchy, University of Haifa. Joint work with Mahmood Sharif and Orr Dunkelman.

  2. Motivation ❖ Key derivation: generating a secret key from information possessed by the user ❖ Passwords are the most widely used means of key derivation, but…

  3. Motivation ❖ Passwords are: 1. Forgettable

  4. Motivation ❖ Passwords are: 1. Forgettable 2. Easily observable (shoulder surfing)

  5. Motivation ❖ Passwords are: 1. Forgettable 2. Easily observable (shoulder surfing) 3. Low entropy

  6. Motivation ❖ Passwords are: 1. Forgettable 2. Easily observable (shoulder surfing) 3. Low entropy 4. Carried over between systems

  7. Motivation ❖ Suggestion: use biometric data for key generation ❖ Problems: 1. It is hard or impossible to replace a biometric template once it is compromised 2. The privacy of the users is at risk

  8. Privacy of Biometric Data [figure: a key K is derived from a biometric sample x]

  9. The Fuzziness Problem ❖ Two images of the same face are rarely identical (due to lighting, pose, and expression changes) • Taken one after the other • 81,689 pixels differ; only 3,061 pixels have identical values!

  10. The Fuzziness Problem ❖ Two images of the same face are rarely identical (due to lighting, pose, and expression changes) ❖ Yet we want to consistently create the same key for the user every time ❖ The fuzziness in the samples is handled by: 1. Feature extraction 2. Use of error-correction codes and helper data

  11. 3-Step Noise Reduction Process: 1. Feature extraction reduces changes due to viewing conditions and small distortions 2. Binarization converts to a binary representation and removes most of the noise 3. Error correction (ECC) removes the remaining noise

  12. Feature Extraction ❖ User-specific features: Eigenfaces (PCA), Fisherfaces (FLD); a training step produces user-specific parameters that are stored for feature extraction ❖ Generic features: histograms of low-level features (e.g., LBPs, SIFT) and filters (Gabor features, etc.); no training and no user-specific information is required

  13. Feature Extraction: Previous Work ❖ [FYJ10] used Fisherfaces, so the public helper data looks like the users ❖ Very discriminative (better recognition) ❖ But it compromises privacy, so it can't be used!

  14. Use Generic Features? ❖ Yes, but they require caution ❖ [KSVAZ05] found high-order dependencies between different channels of the Gabor transform ❖ ➜ correlations between the bits of the suggested representation

  15. Binarization ❖ Essential for using the cryptographic constructions ❖ Some claim it provides non-invertibility [TGN06], but biometric features can be approximated ❖ Done by: sign of projection, or quantization; quantization is more accurate, but requires storing additional private information

  16. Cryptographic Noise-Tolerant Constructions ❖ Fuzzy Commitment [JW99]: Enrollment: draw a random key k ← {0,1}^𝑙 and store the helper data s = Encode(k) ⊕ x, where x is the binary representation of the biometrics. Key generation: from a fresh binary representation x′, compute Decode(s ⊕ x′) to recover k ❖ Other constructions: Fuzzy Vault [JS06], Fuzzy Extractors [DORS08]
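To make the construction concrete, here is a minimal fuzzy-commitment sketch in Python. The [7,4] Hamming code, the toy lengths, and the function names are illustrative assumptions; the system in this talk uses a Reed-Solomon-based code, discussed in the later slides.

```python
# Minimal fuzzy-commitment sketch [JW99]. The [7,4] Hamming code is a
# stand-in ECC that corrects 1 flipped bit per 7-bit block.
import numpy as np

G = np.array([[1,0,0,0,0,1,1],     # generator matrix (systematic)
              [0,1,0,0,1,0,1],
              [0,0,1,0,1,1,0],
              [0,0,0,1,1,1,1]], dtype=np.uint8)
H = np.array([[0,1,1,1,1,0,0],     # parity-check matrix
              [1,0,1,1,0,1,0],
              [1,1,0,1,0,0,1]], dtype=np.uint8)

def encode(k):                      # k: bits, length a multiple of 4
    return (k.reshape(-1, 4) @ G % 2).ravel()

def decode(c):                      # corrects up to 1 error per 7-bit block
    blocks = c.reshape(-1, 7).copy()
    for b in blocks:
        s = H @ b % 2               # syndrome = column of H at the error position
        if s.any():
            b[int(np.where((H.T == s).all(axis=1))[0][0])] ^= 1
    return blocks[:, :4].ravel()    # systematic code: message = first 4 bits

def commit(x, k):                   # enrollment: helper data s = Encode(k) XOR x
    return encode(k) ^ x

def open_commitment(s, x_new):      # key generation: Decode(s XOR x')
    return decode(s ^ x_new)

rng = np.random.default_rng(0)
k = rng.integers(0, 2, 8, dtype=np.uint8)      # secret (use a CSPRNG in practice)
x = rng.integers(0, 2, 14, dtype=np.uint8)     # binary biometric template
s = commit(x, k)
x_new = x.copy(); x_new[3] ^= 1                # a fresh, slightly noisy sample
assert (open_commitment(s, x_new) == k).all()  # the same key is recovered
```

Decoding succeeds as long as each 7-bit block of the fresh sample differs from the enrolled one in at most one bit; real biometric noise requires a far stronger code.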

  17. Previous Work: Problems 1. Short keys 2. Non-uniformly distributed binary strings used as input to the fuzzy commitment scheme 3. Dependencies between bits of the biometric samples 4. Auxiliary data leaks personal information 5. No protection when the adversary gets hold of the cryptographic key

  18. Security Requirements 1. Consistency: identify a person as himself (low FRR) 2. Discrimination: an impostor cannot impersonate an enrolled user (low FAR) From [BKR08]: 3. Weak Biometric Privacy (REQ-WBP): it is computationally infeasible to learn the biometric information given the helper data 4. Strong Biometric Privacy (REQ-SBP): it is computationally infeasible to learn the biometric information given the helper data and the key 5. Key Randomness (REQ-KR): given access to the helper data, the key is computationally indistinguishable from random

  19. Feature Extraction 1. Landmark Localization and Alignment ❖ Face landmark localization [ZR12] followed by an affine transformation to a canonical pose ❖ An essential step, since alignment cannot be performed between the enrolled template and a newly presented one
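As a hedged illustration of this alignment step, a three-point affine warp with OpenCV might look as follows; the canonical coordinates, the 128×128 crop size, and the use of exactly three landmarks are assumptions, and the landmark detection itself ([ZR12]) is out of scope.

```python
# Affine alignment to a canonical pose from three facial landmarks.
# The canonical coordinates and output size are assumed values.
import cv2
import numpy as np

CANONICAL = np.float32([[42, 52], [86, 52], [64, 82]])  # left eye, right eye, nose tip

def align_face(image, landmarks):
    """landmarks: float32 array of shape (3, 2) from a landmark detector such as [ZR12]."""
    M = cv2.getAffineTransform(np.float32(landmarks), CANONICAL)
    return cv2.warpAffine(image, M, (128, 128))
```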

  20. Feature Extraction 2. Feature Extraction ❖ Local Binary Pattern (LBP) descriptors are computed from 21 regions defined on the face ❖ The same is done with Scale-Invariant Feature Transform (SIFT) descriptors ❖ Histograms of Oriented Gradients (HoGs) are computed over the whole face
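A sketch of the LBP part of this step; the 3×7 region grid and the LBP parameters (P = 8 neighbors, radius 1, uniform patterns) are assumptions, since the talk only specifies that 21 face regions are used.

```python
# Uniform LBP codes pooled into per-region histograms over a grid of face
# regions. The grid and LBP parameters are illustrative.
import numpy as np
from skimage.feature import local_binary_pattern

P, R = 8, 1
N_BINS = P + 2          # the "uniform" method yields P + 2 distinct codes

def lbp_histograms(face, rows=3, cols=7):
    """face: 2D grayscale array of an aligned face crop."""
    codes = local_binary_pattern(face, P, R, method="uniform")
    h, w = codes.shape
    feats = []
    for i in range(rows):
        for j in range(cols):
            region = codes[i*h//rows:(i+1)*h//rows, j*w//cols:(j+1)*w//cols]
            hist, _ = np.histogram(region, bins=N_BINS, range=(0, N_BINS), density=True)
            feats.append(hist)
    return np.concatenate(feats)    # rows*cols regions, N_BINS bins each
```

The SIFT and HoG descriptors would be concatenated alongside these histograms to form the full feature vector.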

  21. Binarization by Projection: h(x) = ½ (sign(Wᵀx) + 1)

  22. Binarization by Projection [figure: a hyperplane Wᵢ splits the space into a +1 side and a −1 side; a sample x on the +1 side gets hᵢ(x) = 1]

  23. Binarization by Projection [figure: a sample on the −1 side of Wᵢ gets hᵢ(x) = 0]

  24. Binarization by Projection [figure: a fresh sample x′ of the same user arrives; does h(x′) equal h(x)? Here hᵢ(x) = 0]

  25. Binarization by Projection [figure: x′ falls on the same side of Wᵢ as x, so hᵢ(x′) = hᵢ(x) = 0]

  26. Binarization by Projection [figure: both x and x′ lie on the +1 side, so hᵢ(x′) = hᵢ(x) = 1]
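In code, the projection rule is a single comparison per hyperplane. The following sketch uses random hyperplanes and toy dimensions purely for illustration (the talk learns W); it also shows why two nearby samples mostly agree on their bits.

```python
# Binarization by projection: h(x) = (sign(W^T x) + 1) / 2, one bit per
# hyperplane. W is random here; the system in the talk learns W instead.
import numpy as np

rng = np.random.default_rng(0)
d, n_bits = 128, 16
W = rng.standard_normal((d, n_bits))          # one hyperplane per column

def binarize(x, W):
    return (W.T @ x >= 0).astype(np.uint8)    # sign(0) is sent to bit 1

x = rng.standard_normal(d)                    # enrolled sample
x_new = x + 0.05 * rng.standard_normal(d)     # fresh, slightly perturbed sample
flips = (binarize(x, W) != binarize(x_new, W)).sum()
print(flips)   # only bits whose hyperplane passes close to x can flip
```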

  27. Embedding in d-dimensional space [figure: a hyperplane Wᵢ with a −1 side and a +1 side contributes one bit of the embedding]

  28. Embedding in d-dimensional space [figure: a second hyperplane Wⱼ contributes another bit]

  29. Binarization Alg. ❖ Requirements from the binary representation: 1. Consistency and discrimination 2. No correlations between the bits 3. High min-entropy ❖ We find a discriminative projection space W by generalizing an algorithm from [WKC10] (originally for the approximate-nearest-neighbor problem) ❖ A pair of samples (x, x′) is labeled positive if it belongs to the same user and negative otherwise ❖ The aim is to find hyperplanes W such that h(x) = h(x′) if the pair belongs to the same user, and h(x) ≠ h(x′) otherwise

  30. Removing Dependencies between Bits ❖ Dimension reduction and concatenation of the feature vectors into X

  31. Removing Dependencies between Bits ❖ Dimension reduction and concatenation of the feature vectors into X ❖ Removing correlations between the features: Z = AᵀX ❖ Rescaling to the [0,1] interval

  32. Removing Dependencies between Bits ❖ Dimension reduction and concatenation of the feature vectors into X ❖ Removing correlations between the features: Z = AᵀX ❖ Rescaling to the [0,1] interval ❖ Projection onto orthogonal hyperplanes W, h(z) = ½ (sign(Wᵀz) + 1) ➜ mutually independent bits
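A hedged sketch of this pipeline, assuming PCA whitening as the decorrelating transform A and a random orthogonal basis for the final projection; the talk learns a discriminative W and does not spell out how A is computed.

```python
# Decorrelate concatenated feature vectors (Z = A^T X), rescale to [0, 1],
# then project onto orthogonal hyperplanes. PCA is an assumed choice for A.
import numpy as np

def decorrelate_and_rescale(X, out_dim):
    """X: (n_samples, n_features) matrix of concatenated feature vectors."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    A = Vt[:out_dim].T                        # top principal directions
    Z = Xc @ A                                # decorrelated features
    zmin, zmax = Z.min(axis=0), Z.max(axis=0)
    return (Z - zmin) / (zmax - zmin + 1e-12), A   # rescaled to [0, 1]

def orthogonal_hyperplanes(dim, n_planes, seed=0):
    # QR factorization yields mutually orthogonal projection directions;
    # random here, whereas the talk learns and orthogonalizes W.
    rng = np.random.default_rng(seed)
    Q, _ = np.linalg.qr(rng.standard_normal((dim, n_planes)))
    return Q          # columns are orthonormal; requires n_planes <= dim
```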

  33. Full System ❖ Enrollment: feature extraction ➜ binarization ➜ draw k ← {0,1}^𝑙 and store s = Encode(k) ⊕ x ❖ Key generation: feature extraction ➜ binarization ➜ decode s ⊕ x′ and hash the result
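Putting the pieces together, a compact and heavily simplified sketch of the flow: the binary template is taken as given, a q-fold repetition code stands in for the real ECC, and the final key is the hash of the decoded string, as on the slide.

```python
# End-to-end flow with toy parameters: helper data s = Encode(k) XOR x,
# key = hash of the decoded string. A repetition code stands in for the ECC.
import hashlib
import numpy as np

K_LEN, Q = 32, 5

def enroll(x):
    k = np.random.randint(0, 2, K_LEN).astype(np.uint8)   # use a CSPRNG in practice
    s = np.repeat(k, Q) ^ x                                # helper data
    return s, hashlib.sha256(k.tobytes()).hexdigest()

def generate_key(s, x_new):
    votes = (s ^ x_new).reshape(K_LEN, Q).sum(axis=1)
    k = (votes > Q // 2).astype(np.uint8)                  # decode (majority) ...
    return hashlib.sha256(k.tobytes()).hexdigest()         # ... and hash

x = np.random.randint(0, 2, K_LEN * Q).astype(np.uint8)   # enrolled binary template
x_new = x.copy(); x_new[::Q] ^= 1                          # one flipped bit per group
s, key = enroll(x)
assert generate_key(s, x_new) == key                       # same user, same key
```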

  34. Transfer Learning of the Embedding • Learning W is done only once, using subjects different from the users of the key-derivation system • How is it done? Instead of learning "Is this Alice?", "Is this Bob?", …, we learn "Same or different?", a more generic question that can be learned for the whole population

  35. Experiments: Constructing the Embedding • Performed only once • Subjects are different from those used in testing

      View      Number of Subjects   Images per Subject   Number of Hyperplanes
      Frontal   949                  3-4                  800
      Profile   1117                 1-8                  800

  36. Experiments: Evaluation ❖ Data: 2 frontal images and 2 profile images of 100 subjects (not in the training set) ❖ Recognition tests: a 5-round cross-validation framework, measuring TPR vs. FPR while increasing the threshold (ROC curves) ❖ Key-generation tests: 100 genuine authentication attempts and 99 × 100 impostor authentication attempts

  37. Results: Recognition [figure: ROC curves]

  38. Results: Key Generation ❖ There is a trade-off between the number of errors the error-correction code can handle and the length of the produced key ❖ The Hamming bound gives the relation C(n,0) + C(n,1) + … + C(n,t) ≤ 2^(n−k), where: - n is the code length (1600 in our case) - t is the maximal number of corrected errors - k is the length of the encoded message (the produced key, in our case)

  39. Results: Key Generation (for FAR = 0)

      k ≤    t    FRR (our method)   FRR (random projection)
      595    80   0.30               0.32
      609    70   0.16               0.23
      624    60   0.12               0.19

  40. Error Correction Code [figure: the binary string is split into 5-bit symbols, several of which are corrupted] ❖ Reed-Solomon codes over GF(2⁵), of length 15 or 31 symbols ❖ With a bit-error probability of 0.3, the probability of an error in a 5-bit symbol is 1 − 0.7⁵ ≈ 0.83

  41. Possible Solution [figure: the bits feeding each 5-bit symbol are arranged so that a symbol error is caused by a single bit error] ❖ Probability of error in a symbol: 0.3 ❖ Probability of error in a bit: 0.3

  42. Possible Solution ❖ Encoding: the codeword ECC(K) (31 symbols of 5 bits) is repeated q times [figure: q copies of ECC(K) combined with the biometric bits Xs₁, …, Xs₅]

  43. Possible Solution ❖ Decoding: the value of each bit is taken as the majority over its q received values; the resulting word C is then decoded to recover K [figure: noisy copies X′s₁, …, X′s₅ are combined by majority vote into C, and decode(C) = K]
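A small sketch of the repeat-and-vote idea with illustrative parameters (q = 5, a 155-bit codeword): majority voting shrinks the per-bit error rate from 0.3 to about 0.16 before the Reed-Solomon decoder runs.

```python
# Repeat the codeword q times (slide 42) and recover each bit by majority
# vote (slide 43). With per-bit error 0.3 and q = 5, the residual bit-error
# rate is P[Bin(5, 0.3) >= 3] ~ 0.16; the RS code then corrects the rest.
import numpy as np

q, n = 5, 155                        # 31 symbols x 5 bits = 155 codeword bits
rng = np.random.default_rng(0)

c = rng.integers(0, 2, n, dtype=np.uint8)            # codeword ECC(K)
tx = np.tile(c, q)                                   # q repetitions
noise = (rng.random(tx.size) < 0.3).astype(np.uint8) # 30% bit errors
votes = (tx ^ noise).reshape(q, n).sum(axis=0)
recovered = (votes > q // 2).astype(np.uint8)        # majority per bit
print((recovered != c).mean())                       # ~0.16 residual error rate
```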

  44. Problem in Security [figure: an adversary guesses 15 bits of the string feeding part of ECC(K)]

  45. Problem in Security [figure: the guessed 15 bits recur across the repeated structure, determining the corresponding symbols of every copy of ECC(K)] ❖ Probability of error in a symbol: 0.3 ❖ Probability of error in a bit: 0.3

  46. Problem in Security ❖ Since the codeword is highly redundant, a few correctly guessed symbols determine all of ECC(K): guessing 15 bits reveals ECC(K), so the construction provides only 15-bit security
