Engineering Privacy in Public
James Alexander and Jonathan Smith
SLIDE 1

James Alexander and Jonathan Smith University of Pennsylvania

Engineering Privacy in Public

SLIDE 2
  • Project Goal: A generalized, experimentally validated privacy metric
  • First experiment: Defeating face recognition
  • Experiments with more biometrics to follow

Introduction

SLIDE 3

Talk Overview

  • Project Goals
  • Face Recognition: Methodology and Evaluation
  • Disguise Slide Show
  • Analysis
  • Future Work
SLIDE 4
  • Though details differ wildly, the goal of all PETs is the same: to help the user not be identified
  • Advantages of a common framework:
    • User can tell where they get the most “bang for the buck”
    • Easier to evaluate the combination of several PETs in the presence of multimode surveillance

Value of PET Generality

SLIDE 5

To develop a “benefit” metric for evaluation of privacy-enhancing technologies

  • Propose candidate metrics and evaluate against empirically measured PET performance

Project Goal

SLIDE 6
  • Suitable for cost/benefit analysis regardless of how cost is quantified
  • Explainable to a lay person
  • Places reliable bounds on how well an adversary can do, even without precise knowledge of the adversary’s methods

General Properties

SLIDE 7

Adversary knows some predicate holds of a particular individual

  • He builds a probability distribution of this predicate over the set of all individuals
  • Job of a PET is to make sure the correct individual does not stand out in the distribution

Modeling Privacy
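The distribution view suggests a natural candidate metric: the Shannon entropy of the adversary’s distribution over individuals, which an effective PET should push toward the uniform maximum. A minimal sketch (the distributions below are invented for illustration):

```python
import math

def shannon_entropy(dist):
    """Entropy (bits) of a probability distribution over individuals."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Without a PET the adversary is nearly certain who satisfies the predicate.
no_pet = {"alice": 0.97, "bob": 0.02, "carol": 0.01}

# An effective PET flattens the distribution so no one stands out.
with_pet = {"alice": 0.35, "bob": 0.33, "carol": 0.32}

print(shannon_entropy(no_pet))   # low entropy: the user stands out
print(shannon_entropy(with_pet)) # close to log2(3): near-maximal uncertainty
```

The higher the entropy, the less the correct individual stands out, which is exactly the property the slide asks a PET to provide.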

SLIDE 8

[Diagram: identity → noisy channel]

SLIDE 9

[Diagram: identity → face disguise and obstructions → camera]

SLIDE 10

[Diagram: identity → user network interface → mix network → adversary network interface(s)]

SLIDE 11

[Diagram: identity → loyalty card (munged identifying info + card swapping) → grocer customer database]

SLIDE 12
  • We want to predict entropy in the adversary’s model; we can’t measure it directly, but perhaps we can place bounds on it
  • Theory of non-cooperating communicators is not well explored
  • What are the limits of a communication channel employing a sabotaged encoding?
  • What if noise sources are not random?

Challenges
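One way the first bullet above can be made concrete: since H(identity | observation) = H(identity) − I(identity; observation), any upper bound on the mutual information the surveillance channel can carry is a lower bound on the adversary’s remaining entropy. A toy sketch (the two-identity channel and flip probabilities are invented, not from the talk):

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint distribution {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

def channel(flip):
    """Two equally likely identities; the observation is wrong w.p. `flip`."""
    return {("a", "a"): 0.5 * (1 - flip), ("a", "b"): 0.5 * flip,
            ("b", "b"): 0.5 * (1 - flip), ("b", "a"): 0.5 * flip}

# A disguise that raises the flip probability carries less information,
# leaving the adversary with more residual entropy about the identity.
print(mutual_information(channel(0.1)))  # near-clean channel: high I
print(mutual_information(channel(0.4)))  # sabotaged channel: low I
```

At flip = 0.5 the channel carries zero information, the ideal outcome for the PET user.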

SLIDE 13

Methodology

  • Tested face recognition system: an eigenfaces system used in the FERET evaluation
  • 3816 FERET images used as distractors
  • New pictures added to match FERET specs
  • Facial occlusion images from AR database give statistical behavior of two particular disguises
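The eigenfaces technique named above can be sketched in a few lines of NumPy. This is the generic method, not the specific FERET-evaluation system, and the random “images” are stand-ins for real gallery data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for gallery images: each row is a flattened face.
gallery = rng.normal(size=(20, 64))   # 20 "faces" of 64 "pixels" each
labels = np.arange(20)

# Eigenfaces: principal components of the mean-centered gallery.
mean_face = gallery.mean(axis=0)
_, _, vt = np.linalg.svd(gallery - mean_face, full_matrices=False)
eigenfaces = vt[:10]                  # keep the top 10 components

def project(img):
    """Coordinates of an image in the eigenface subspace."""
    return eigenfaces @ (img - mean_face)

gallery_coords = np.array([project(g) for g in gallery])

def identify(probe, n=5):
    """Rank gallery identities by distance to the probe in face space."""
    d = np.linalg.norm(gallery_coords - project(probe), axis=1)
    return labels[np.argsort(d)[:n]]

# An exact gallery image ranks its own identity first; a disguise that
# changes the projected coordinates pushes the true identity down the list.
print(identify(gallery[7]))
```

The candidate list returned by `identify` is exactly the kind of ranked output that the score function later in the talk consumes.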

SLIDE 14

Sample Baselines

SLIDE 15

AR Sample

SLIDE 16

Adversary Model

  • Can obtain high-quality frontal probe images
  • Might have more than one gallery image of you
  • System output will consist of up to N candidate matches, presented to an operator for confirmation
  • Face recognition system will be deployed on a large scale
  • Do not know whether a minimum likelihood cutoff is used
SLIDE 17

Score Function

score(x) = ( Σ_{i=1..N} w_x(i) ) / ( Σ_{i=1..N} i )

w_x(i) = N − i + 1 if the candidate in the i-th position is really x (i.e. a match), and 0 otherwise
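The score function is straightforward to compute; a sketch, taking the system’s ranked candidate list as input:

```python
def score(candidates, x):
    """Normalized rank-weighted score of identity x in a candidate list.

    candidates: the recognizer's ranked list of up to N identities.
    w_x(i) = N - i + 1 when position i (1-based) holds the true identity x,
    and 0 otherwise; the denominator is 1 + 2 + ... + N.
    """
    n = len(candidates)
    num = sum(n - i + 1 for i, c in enumerate(candidates, start=1) if c == x)
    return num / sum(range(1, n + 1))

print(score(["alice", "bob", "carol"], "alice"))  # top rank: 3/6 = 0.5
print(score(["bob", "carol", "alice"], "alice"))  # bottom rank: 1/6
print(score(["bob", "carol", "dave"], "alice"))   # not in the list: 0.0
```

A perfect disguise drives the score to 0; an undisguised subject with a single gallery image scores N / (1 + … + N).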
SLIDE 18

Effective Disguises

SLIDE 19
SLIDE 20
SLIDE 21

Image group   Accuracy (%)   Mean score
baseline      99.6           0.6947
sunglasses    15.0           0.0344
scarf         58.7           0.2323
overall       45.8           0.2136

AR performance

SLIDE 22
SLIDE 23
SLIDE 24
  • Problem: The score function doesn’t allow performance comparison among disguises that all score zero
  • Solution: Morphs!

A minor difficulty
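Morphing can be read as a parameter sweep: blend the undisguised baseline toward the disguised probe and record where the recognizer’s score first hits zero. The talk’s morphs presumably warp facial geometry too; this plain cross-dissolve with a stand-in score function only shows how morphing turns an all-zero comparison into a graded one:

```python
import numpy as np

def morph(baseline, disguised, alpha):
    """Cross-dissolve: alpha=0 is the baseline, alpha=1 the full disguise."""
    return (1 - alpha) * baseline + alpha * disguised

def breaking_point(baseline, disguised, score_fn, steps=20):
    """Smallest alpha at which the recognizer's score drops to zero.

    Two disguises that both score zero at full strength can now be
    compared: the one that zeroes the score at smaller alpha is stronger.
    """
    for k in range(steps + 1):
        alpha = k / steps
        if score_fn(morph(baseline, disguised, alpha)) == 0:
            return alpha
    return None

# Stand-in recognizer: scores zero once the image strays too far from
# the gallery image (invented threshold, purely for the demonstration).
baseline = np.zeros(64)
toy_score = lambda img: 0 if np.abs(img - baseline).mean() > 0.5 else 1

weak_disguise = np.full(64, 0.8)    # small pixel change
strong_disguise = np.full(64, 2.0)  # large pixel change
print(breaking_point(baseline, weak_disguise, toy_score))
print(breaking_point(baseline, strong_disguise, toy_score))
```

The stronger disguise defeats the toy recognizer at a smaller blend fraction, which is the graded signal the raw score could not provide.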

SLIDE 25
SLIDE 26
SLIDE 27
SLIDE 28

Ineffective Disguises

SLIDE 29
SLIDE 30
SLIDE 31
SLIDE 32
SLIDE 33
SLIDE 34
SLIDE 35
SLIDE 36
  • The system is attempting to match facial features and their positions to the closest matches in its training data
  • To fool it, we need to obscure or remove existing features, or provide decoy features for it to find
  • Features are composed of contrasts in the photographic data

What’s going on?

SLIDE 37
SLIDE 38
SLIDE 39
SLIDE 40
SLIDE 41

Grid Model

SLIDE 42

A Grid in the Noisy Channel

[Diagram: identity → grid in the noisy channel]

SLIDE 43

Experiments in progress to determine:

  • The critical size separating features from non-features, i.e. the right size of grid boxes
  • The weights representing the differing importance of each grid position to system performance

Refining the Grid
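A sketch of how a refined grid could turn into a disguise metric, consistent with the AR numbers (sunglasses hurt the system far more than the scarf): weight each grid box by importance, then score a disguise by the weighted area it leaves visible. Box size and weights are exactly the unknowns the experiments are meant to pin down, so the values below are invented:

```python
import numpy as np

def grid_coverage_score(occlusion_mask, weights, box=8):
    """Weighted fraction of the face left visible to the recognizer.

    occlusion_mask: 2-D bool array, True where the disguise hides the face.
    weights: importance of each grid box, shape (rows//box, cols//box).
    """
    rows, cols = occlusion_mask.shape
    boxes = occlusion_mask.reshape(rows // box, box, cols // box, box)
    hidden = boxes.mean(axis=(1, 3))          # fraction hidden per grid box
    return float((weights * (1 - hidden)).sum() / weights.sum())

weights = np.ones((8, 8))
weights[2:4, :] = 5.0            # suppose the eye-region boxes matter most

sunglasses = np.zeros((64, 64), dtype=bool)
sunglasses[16:32, :] = True      # hides the eye rows
scarf = np.zeros((64, 64), dtype=bool)
scarf[48:64, :] = True           # hides the mouth rows

print(grid_coverage_score(sunglasses, weights))  # lower: high-weight boxes hit
print(grid_coverage_score(scarf, weights))       # higher: low-weight boxes hit
```

With eye boxes weighted heavily, the sunglasses mask leaves less weighted area visible than the scarf, mirroring the observed accuracy gap.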

SLIDE 44

An anomaly

SLIDE 45

Performance Tradeoffs

[Plot: similarity threshold from 200 to 700 on the x-axis against accuracy, average score, and false negatives, values 0.2 to 1.0]

SLIDE 46
  • Elaborate the grid model further
  • Test disguises on more subjects
  • Replicate with a face recognition system with a very different underlying model, e.g. FaceIt
  • Extend framework to more biometrics, and beyond

Future Work
SLIDE 47

Questions?