SLIDE 1

On the Resilience of Biometric Authentication Systems against Random Inputs

Benjamin Zhao, Hassan Asghar, Dali Kaafar

slide-2
SLIDE 2

Biometric Authentication: Overview

2 | University of New South Wales, Macquarie University, Data61 CSIRO | Benjamin Zhao

Metrics

  • FRR (False Reject Rate)
  • FPR (False Positive Rate)
  • EER (Equal Error Rate)
  • ROC curve
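These error rates can be computed directly from accept/reject decisions. A minimal illustrative sketch (the decision lists are hypothetical):

```python
def frr_fpr(genuine_decisions, impostor_decisions):
    """Compute the two basic error rates from accept (True) /
    reject (False) decisions."""
    # FRR: fraction of genuine attempts the system wrongly rejects.
    frr = genuine_decisions.count(False) / len(genuine_decisions)
    # FPR: fraction of impostor attempts the system wrongly accepts.
    fpr = impostor_decisions.count(True) / len(impostor_decisions)
    return frr, fpr

# Hypothetical decisions: 4 genuine attempts (1 rejected),
# 4 impostor attempts (1 accepted).
frr, fpr = frr_fpr([True, True, True, False], [False, False, True, False])
# frr == 0.25 and fpr == 0.25. The EER is the common error rate at the
# operating point where FRR equals FPR, and the ROC curve traces the
# FRR/FPR trade-off as the decision threshold varies.
```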

Pipeline diagram: during registration, the user's records are used to train The Model; during authentication, a single record in question is classified, and The Model outputs a Yes/No authentication decision.

Sensors: speech, gait, fingerprint, face, touch.

Feature Extraction

  • Engineered
  • Embeddings
SLIDE 3

Biometrics as an API: the Attack Surface

Architecture diagram: user input from the UI and sensors (user/OS space) passes as raw input to the feature extractor; the raw input and the resulting feature vector (engineered features or embeddings) each cross an API boundary into the secure enclave or cloud, where The Model returns the Yes/No authentication decision.


What if an attacker had access to these APIs?

SLIDE 4
  • Perception: the FPR is indicative of this attacker's success.
  • True only if attacker inputs follow the same distribution as genuine biometric data.
  • If the API is exposed, an attacker has more freedom.
  • In particular, an attacker can submit random inputs.

What is the security of the biometric system against these Random Inputs?

What is the success of an attacker?

Assumptions: length of input, value bounds, user identifier.


SLIDE 5
  • A notion of Acceptance Region (AR): the positively classified region of the feature space.
  • Formally and experimentally show the AR is larger than the positive user's data region.
  • Show that a Random Input Attacker with black-box feature API access succeeds more often than the EER suggests.
  • Show that a Random Input Attacker with raw-input API access (before feature extraction) also succeeds more often than the EER suggests.
  • Demonstrate the attack on four real-world biometric schemes and four ML algorithms.
  • Propose a mitigation against attackers with either raw or feature API access.
  • Release our code in our repo: https://imathatguy.github.io/Acceptance-Region/

Contributions



SLIDE 7

Outline

  • What is the Random Input Attacker?
  • How we evaluate a Random Input Attacker's success.
  • Are Random Input Attackers successful on real-world datasets?
  • Factors that may affect the success of the Random Input Attacker.
  • Evaluation of these factors on synthetic data.
  • A proposed defence mechanism.
  • Code available in our repo: https://imathatguy.github.io/Acceptance-Region/


SLIDE 8

Random Input Attacker

  • How easily can a Random Input Attacker find an accepting sample?
  • The region of feature space where samples are labelled positive is the Acceptance Region (AR).
  • The measure of this region is exactly the success probability of an attacker submitting uniformly random inputs.


Diagram: the attacker submits uniformly sampled random inputs from [0, 1]^n to The Model. Attacker assumptions: length of input, input bounds, user identifier.
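The Acceptance Region can be estimated by Monte Carlo: sample uniform vectors in [0, 1]^n and count how many the model accepts. A sketch with a toy distance-threshold model standing in for a real biometric classifier (the template and threshold values are hypothetical):

```python
import random

def acceptance_region_estimate(accepts, n_features, n_trials=10000, seed=0):
    """Monte Carlo estimate of the Acceptance Region (AR): the fraction
    of uniformly random feature vectors in [0, 1]^n the model accepts.
    This fraction equals the success probability of a Random Input
    Attacker who submits uniformly sampled vectors."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        x = [rng.random() for _ in range(n_features)]
        if accepts(x):
            hits += 1
    return hits / n_trials

# Toy stand-in for "The Model": accept any input within a fixed L2
# distance of the enrolled template (template/threshold are hypothetical).
TEMPLATE = [0.5, 0.5, 0.5]
THRESHOLD = 0.4

def toy_model(x):
    dist = sum((a - b) ** 2 for a, b in zip(x, TEMPLATE)) ** 0.5
    return dist <= THRESHOLD

ar = acceptance_region_estimate(toy_model, n_features=3)
# For this toy model the AR is the volume of a radius-0.4 ball inside
# the unit cube, roughly 0.27.
```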

SLIDE 9

Evaluation Methodology


Pipeline diagram: user inputs (Gait, Touch, Face, Voice, and Synthetic datasets) pass through the feature extractor; the training features fit The Model (Linear SVM, Radial SVM, Random Forest, or DNN) as a balanced two-class problem, and the testing features yield the EER. The attacker submits uniformly sampled random inputs: find the system parameter at the EER, then evaluate the AR at that parameter.
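The evaluation loop above can be sketched as follows, with hypothetical Gaussian score distributions standing in for real matcher scores: sweep thresholds to find the EER operating point, then evaluate the attacker's acceptance rate at that threshold:

```python
import random

def eer_threshold(genuine, impostor):
    """Sweep candidate thresholds over the observed scores and return
    the one where FRR (genuine rejected) is closest to FPR (impostor
    accepted): the EER operating point."""
    best_t, best_gap = None, float("inf")
    for t in sorted(set(genuine + impostor)):
        frr = sum(s < t for s in genuine) / len(genuine)
        fpr = sum(s >= t for s in impostor) / len(impostor)
        if abs(frr - fpr) < best_gap:
            best_t, best_gap = t, abs(frr - fpr)
    return best_t

rng = random.Random(1)
# Hypothetical match-score distributions for genuine and impostor trials.
genuine = [rng.gauss(0.8, 0.1) for _ in range(500)]
impostor = [rng.gauss(0.4, 0.1) for _ in range(500)]
t = eer_threshold(genuine, impostor)

# Evaluate the AR at the EER operating point: the fraction of scores
# from uniformly random probes (again hypothetical) that are accepted.
random_probe_scores = [rng.gauss(0.5, 0.2) for _ in range(500)]
ar = sum(s >= t for s in random_probe_scores) / len(random_probe_scores)
```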

SLIDE 10

Real-world Data Evaluation


Figure: Face dataset, Random Forest classifier; FRR and FPR vs. system parameter (threshold). EER = 0.03 where FRR = FPR.

SLIDE 11

Real-world Data Evaluation


In many cases the AR exceeds the EER, hiding a vulnerability the EER does not reveal.

Figure: Face dataset, Random Forest classifier, AR vs. system parameter (threshold); at the EER operating point (FRR = FPR = 0.03) the AR is 0.78.

SLIDE 12

Real-world Data Evaluation


In many cases the AR exceeds the EER, hiding a vulnerability the EER does not reveal. At another threshold the AR drops to zero; problem solved?

Figure: Face dataset, Random Forest classifier, AR vs. system parameter (threshold).

SLIDE 13

Real-world Data Evaluation


A flat AR response is observed in many configurations: simply adjusting system parameters is ineffective at mitigating the Random Input Attacker.

Figure: Face dataset, Linear SVM classifier, AR vs. system parameter (threshold).

SLIDE 14

Real-world Data Evaluation - Individuals

The relationship between a user's AR and EER is not always guaranteed.


Figures: per-user comparison of AR and EER for the Face dataset (Random Forest and DNN classifiers) and the Touch dataset (all classifiers); the reference line marks AR == EER. Further details in the paper.

SLIDE 15

✓ The Random Input Attacker leverages an exposed feature-vector input API to submit randomly sampled inputs.
✓ The Acceptance Region is an approximate measure of the Random Input Attacker's success.
✓ The Random Input Attacker's success is comparable to the EER in user averages.
✓ An individual's EER is not a reliable indicator of the Random Input Attacker's success.

  • Outline factors that may affect the Success of the Random Input Attacker.
  • Evaluation of factors on Synthetic Data.
  • Propose a defence mechanism.

Recap – Real World Data


SLIDE 16

Factors Affecting the Acceptance Region

  • Both the positive and negative examples are expected to be highly concentrated.
  • It is desirable for a model to bound its decision boundary tightly around this region.
  • However, model-based classifiers do not penalize empty space.
  • Variability of the positive class.
  • Variability of the negative class.


SLIDE 17

Synthetic Data Evaluation – Positive User Variance

A user with high feature variance will be more susceptible to the Random Input Attacker.


Figure: Synthetic data, DNN classifier. The system-wide success of the Random Input Attacker may not capture the large vulnerability of a few users.

SLIDE 18

Synthetic Data Evaluation – Negative User Variance


A user's vulnerability to the Random Input Attacker can be decreased by increasing the variance of the negative class alone. Figure: Synthetic data, DNN classifier.

SLIDE 19

✓ A user with high feature variance will be more susceptible to the Random Input Attacker.
✓ A user's vulnerability to the Random Input Attacker can be decreased by increasing the variance of the negative class alone.

  • Propose a defence mechanism.


Recap – Synthetic Data

SLIDE 20

Proposed Defence Mechanism

  • If we can increase the variance of the negative class, we can reduce the success of the Random Input Attacker.
  • We can increase negative-class variation with noise.
  • Conveniently, beta-distributed noise samples values distant from the user's values.
  • Staying far from the user's values minimizes the impact on the existing EER.
  • The data manipulation is algorithm-independent.
  • Train the user model with noise vectors sampled from beta distributions defined from the user's training samples.

Figure: histogram of feature values, with user samples and beta-distributed noise concentrated away from them.
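A simplified sketch of the idea, assuming features already scaled to [0, 1]. The paper derives the beta parameters from the user's training samples; here fixed skew parameters (2, 8) are chosen purely for illustration, and the user vectors are hypothetical:

```python
import random

def beta_noise(user_samples, n_noise, seed=0):
    """Draw negative-class noise vectors in [0, 1], per feature, from
    beta distributions skewed away from the user's mean feature value."""
    rng = random.Random(seed)
    n_features = len(user_samples[0])
    means = [sum(row[j] for row in user_samples) / len(user_samples)
             for j in range(n_features)]
    noise = []
    for _ in range(n_noise):
        vec = []
        for m in means:
            # If the user sits high on a feature, skew the noise low
            # (alpha < beta concentrates mass near 0), and vice versa.
            a, b = (2.0, 8.0) if m > 0.5 else (8.0, 2.0)
            vec.append(rng.betavariate(a, b))
        noise.append(vec)
    return noise

# Hypothetical user feature vectors, already scaled to [0, 1].
user = [[0.8, 0.2, 0.7], [0.9, 0.3, 0.6]]
extra_negatives = beta_noise(user, n_noise=4)
```

These noise vectors are then added to the negative class at training time, independent of the classification algorithm.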


SLIDE 21

Proposed Defence Mechanism – Beta Noise

Random Forests: Before Defence | After Defence

         EER    AR   |  EER    AR
Gait     0.09   0.03 |  0.09   0.00
Touch    0.21   0.23 |  0.21   0.00
Face     0.03   0.78 |  0.03   0.00
Voice    0.04   0.01 |  0.04   0.00

DNN: Before Defence | After Defence

         EER    AR   |  EER    AR
Gait     0.215  0.20 |  0.170  0.00
Touch    0.325  0.30 |  0.375  0.00
Face     0.095  0.10 |  0.065  0.04
Voice    0.115  0.08 |  0.090  0.02

After the defence, the AR is substantially reduced below the EER.


  • Maintain a balanced dataset: 1/3 positive, 1/3 negative, 1/3 beta noise.
SLIDE 22
  • Proposal of the Random Input Attacker.
  • The Random Input Attacker's probability of success is comparable to the EER.
  • Tuning system parameters does not necessarily mitigate the Random Input Attacker.
  • The EER is not a consistent indicator of the Random Input Attacker's success.
  • Class variance is tied to the Random Input Attacker's success.
  • The Random Input Attacker can be mitigated with beta-distributed noise at training time.

Conclusions


SLIDE 23
  • Formal treatment of the Random Input Attacker and the Acceptance Region.
  • Success of the raw-input API Random Attacker.
  • More biometric modalities and ML algorithms.
  • More factors affecting the Acceptance Region: distance-based classifiers; number of users.
  • Defending against raw-input random attacks: beta noise alone is not completely sufficient; additional protections are proposed.

More in the Paper


SLIDE 24
  • Is the Random Input Attacker equally effective against one-class and multi-class approaches to authentication?
  • The effects of an unbalanced dataset on the Random Input Attacker's success.
  • Is the vulnerability measured by the Acceptance Region prevalent in other ML applications?

What else?


SLIDE 25

Question(s)?

For details and further info: Benjamin Zhao (benjamin.zhao@unsw.edu.au); https://imathatguy.github.io/Acceptance-Region/

Thank You

More details in Paper + Repo

