Estimating and Mitigating Gender Bias in Deep Image Representations - PowerPoint PPT Presentation



SLIDE 1

Balanced Datasets Are Not Enough: Estimating and Mitigating Gender Bias in Deep Image Representations

Tianlu Wang-Gender Bias in Deep Image Representation

Tianlu Wang, University of Virginia

SLIDE 2

Gender Bias in Visual Recognition Systems

Deep Neural Network

SLIDE 3

Gender Bias in Visual Recognition Systems

Trained Deep Neural Network

Prediction: tie

SLIDE 4

Quantifying Bias

Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification

SLIDE 5

Quantifying Bias

Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification

SLIDE 6

Quantifying Bias

Men Also Like Shopping: Reducing Gender Bias Amplification using Corpus-level Constraints
SLIDE 7

Quantifying Bias

Men Also Like Shopping: Reducing Gender Bias Amplification using Corpus-level Constraints
SLIDE 8

Quantifying Bias

Women Also Snowboard: Overcoming Bias in Captioning Models

SLIDE 9

Quantifying Bias

Trained Deep Neural Network: Is this prediction biased?

SLIDE 10

Quantifying Bias: Model Leakage

Gender Classifier: man / woman

SLIDE 11

Quantifying Bias: Model Leakage

Gender Classifier (training) → Gender Classifier (testing)

acc > 50%: the representation leaks gender; acc == 50%: random guess, no leakage.

SLIDE 12

Quantifying Bias: Model Leakage

Gender Classifier: man / woman

Model Leakage: gender prediction accuracy of a classifier trained on the model's predictions.
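As a sketch of how model leakage could be measured, assume each image is represented by the model's vector of prediction scores plus a binary gender label. The nearest-centroid attacker below is an illustrative stand-in for the classifier used in practice, and all names are hypothetical:

```python
def leakage(features, genders):
    """Leakage: accuracy of a gender classifier fit on `features`.

    features: list of score vectors (one per image)
    genders:  list of 0/1 gender labels
    50% accuracy means the features carry no gender signal.
    (A real evaluation would fit on a train split and report
    accuracy on a held-out split.)
    """
    # Fit a trivial attacker: one centroid per gender.
    def centroid(g):
        rows = [f for f, y in zip(features, genders) if y == g]
        return [sum(c) / len(rows) for c in zip(*rows)]

    c0, c1 = centroid(0), centroid(1)

    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    correct = sum(
        (dist(f, c1) < dist(f, c0)) == bool(y)
        for f, y in zip(features, genders)
    )
    return correct / len(features)

# Toy data: the "tie" score is higher for images labeled "man",
# so the predictions leak gender.
feats = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]]
gender = [0, 0, 1, 1]
print(leakage(feats, gender))  # → 1.0 on this toy data
```

With identical score vectors for everyone, the same function returns 0.5, i.e. no leakage.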

SLIDE 13

Object & Action Recognition Models

COCO Object Recognition

  • 22k images (16k man & 6k woman)
  • 80 objects (kite, ski, handbag, tie…)
  • Recognition performance (F1): 53.75%
  • model leakage: 70.46%

imSitu Action Recognition

  • 24k images (14k man & 10k woman)
  • 211 activities (cooking, shooting, lifting…)
  • Recognition performance (F1): 40.11%
  • model leakage: 76.93%

SLIDE 14

Quantifying Bias: Dataset Leakage

Gender Classifier: man / woman

Dataset Leakage: gender prediction accuracy of a classifier trained on the ground-truth annotations.

Does the model inherit 100% dataset leakage?
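Dataset leakage applies the same attacker to the ground-truth annotations instead of the model's outputs. A minimal sketch of encoding an image's annotated objects as a multi-hot vector; the object vocabulary and helper name here are hypothetical:

```python
# Hypothetical object vocabulary (COCO has 80 such categories).
OBJECTS = ["handbag", "fork", "tie", "ski", "kite"]

def to_multi_hot(annotated_objects):
    """Encode an image's ground-truth objects as a multi-hot vector.
    Dataset leakage trains the gender classifier on these vectors,
    exactly as model leakage trains it on prediction scores."""
    return [1.0 if obj in annotated_objects else 0.0 for obj in OBJECTS]

print(to_multi_hot({"tie", "ski"}))  # → [0.0, 0.0, 1.0, 1.0, 0.0]
```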

SLIDE 15

Performance Matters!

  • Ground Truth Labels: F1 score = 100%, Dataset Leakage = 67.72%
  • Predictions: F1 score = 53.75%, Model Leakage = 70.46%
  • Random Guess: F1 score ≈ 0, NO LEAKAGE!

SLIDE 16

Quantifying Bias: Adjusted Dataset Leakage

Random Perturbation: perturb the ground-truth labels so that they match the model's performance.

Adjusted Dataset Leakage: gender prediction accuracy of a classifier trained on perturbed annotations.
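A minimal sketch of the random perturbation step, assuming binary labels. For simplicity it matches raw agreement (accuracy) with the originals rather than F1, and all names are illustrative:

```python
import random

def perturb_labels(labels, target_acc, seed=0):
    """Randomly flip binary labels so the perturbed copy agrees with the
    originals only `target_acc` of the time, mimicking annotations that
    are "as wrong" as a model of that accuracy."""
    rng = random.Random(seed)
    flipped = list(labels)
    n_flip = round(len(labels) * (1 - target_acc))
    for i in rng.sample(range(len(labels)), n_flip):
        flipped[i] = 1 - flipped[i]
    return flipped

labels = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
noisy = perturb_labels(labels, target_acc=0.8)
agree = sum(a == b for a, b in zip(labels, noisy)) / len(labels)
print(agree)  # → 0.8
```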

SLIDE 17

Quantifying Bias: Adjusted Dataset Leakage

SLIDE 18

Quantifying Bias: Bias Amplification

Δ = Model Leakage − Adjusted Dataset Leakage; Δ > 0 means Bias Amplification!

[Bar charts: Model Leakage vs. Adjusted Dataset Leakage. Bias amplification Δ: 9.93 for COCO Object Recognition, 20.47 for imSitu Action Recognition.]

SLIDE 19

Eliminating Bias

[Scatter plot: Bias Amplification in COCO vs. F1 Score (%)]

SLIDE 20

Eliminating Bias: Adding Noise

[Diagram: Convolutional Neural Network (ResNet-50) → Fully-connected Layer + Logistic Regressors → object scores: Handbag, Fork, Vase, Spoon, …, Knife, Car, Oven]

SLIDE 21

Eliminating Bias: Adding Noise

[Scatter plot: Bias Amplification in COCO vs. F1 Score (%); points: original, randomization]

SLIDE 22

Eliminating Bias: Balanced Datasets

  • Original (man 71%, woman 29%): F1 score 53.75%, model leakage 70.46%
  • Balanced 3 (man 68%, woman 32%): F1 score 52.60%, model leakage 67.78%
  • Balanced 2 (man 57%, woman 43%): F1 score 51.95%, model leakage 64.45%
  • Balanced 1 (man 50%, woman 50%): F1 score 42.89%, model leakage 63.22%

Fewer images, lower performance, lower model leakage.
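The balanced datasets above are built by subsampling the majority gender; a minimal sketch, assuming a list of (image_id, gender) pairs (the function and variable names are illustrative):

```python
import random

def balance_by_gender(images, seed=0):
    """Subsample the majority gender so man/woman counts match.
    `images` is a list of (image_id, gender) pairs.
    Balancing discards data, which is why F1 drops on the
    more aggressively balanced sets."""
    rng = random.Random(seed)
    men = [i for i in images if i[1] == "man"]
    women = [i for i in images if i[1] == "woman"]
    k = min(len(men), len(women))
    return rng.sample(men, k) + rng.sample(women, k)

toy = [(i, "man") for i in range(6)] + [(i, "woman") for i in range(2)]
balanced = balance_by_gender(toy)
print(len(balanced))  # → 4 (2 man + 2 woman)
```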

SLIDE 23

Eliminating Bias: Balanced Datasets

[Scatter plot: Bias Amplification in COCO vs. F1 Score (%); points: original, randomization, balanced 1, balanced 2, balanced 3]

Balancing the co-occurrence of gender and target labels does not reduce bias amplification.

SLIDE 24

Eliminating Bias: Balanced Datasets

Gender Bias in Coreference Resolution: Evaluation and Debiasing Methods

SLIDE 25

Eliminating Bias: Balanced Datasets

Balancing the co-occurrence of gender and target labels does not reduce bias amplification.

SLIDE 26

Eliminating Bias: Using Extra Annotations

[Example images for each masking strategy: original, blackout-face, blur-segm, blackout-segm, blackout-box]
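The masking strategies above can be sketched with plain pixel arrays; the blackout-box variant below zeroes a bounding box (blackout-segm and blur-segm would use the segmentation mask instead, blackout-face only the face region), and all names are illustrative:

```python
def blackout_box(image, box):
    """blackout-box: zero out the person bounding box (x0, y0, x1, y1).
    `image` is a list of rows of pixel values; a new image is returned."""
    x0, y0, x1, y1 = box
    return [
        [0 if (x0 <= x < x1 and y0 <= y < y1) else px
         for x, px in enumerate(row)]
        for y, row in enumerate(image)
    ]

img = [[1] * 4 for _ in range(4)]
masked = blackout_box(img, (1, 1, 3, 3))
print(sum(map(sum, masked)))  # → 12 (4 of the 16 pixels zeroed)
```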

SLIDE 27

Eliminating Bias: Using Extra Annotations

[Scatter plot: Bias Amplification in COCO vs. F1 Score (%); points: original, randomization, balanced 1–3, blackout-face, blur-segm, blackout-segm, blackout-box]

SLIDE 28

Eliminating Bias: Adversarial Training

[Diagram: Convolutional Neural Network (ResNet-50) → Fully-connected Layer + Logistic Regressors → object scores; a Gender Classifier (man / woman) branch is attached through a Gradient Reversal layer]

SLIDE 29

Eliminating Bias: Adversarial Training

[Diagram: Convolutional Neural Network (ResNet-50) → Fully-connected Layer + Logistic Regressors → object scores; a Gender Classifier (man / woman) branch is attached through a Gradient Reversal layer]

  • optimize the classifier
  • maintain object features
  • destroy gender features

A good or a bad gender classifier?
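The gradient reversal trick can be sketched as a layer that is the identity on the forward pass and negates (and scales) gradients on the backward pass. A minimal plain-Python illustration rather than an autograd-framework implementation; names are illustrative:

```python
class GradientReversal:
    """Identity forward; multiplies gradients by -lam backward.
    The gender head is trained to predict gender, while the reversed
    gradient pushes the shared features to destroy the gender signal."""

    def __init__(self, lam=1.0):
        self.lam = lam

    def forward(self, x):
        return x  # features pass through unchanged

    def backward(self, grad_from_gender_head):
        # Reverse (and scale) the gradient before it reaches the backbone.
        return [-self.lam * g for g in grad_from_gender_head]

grl = GradientReversal(lam=1.0)
print(grl.forward([0.2, 0.7]))   # → [0.2, 0.7]
print(grl.backward([0.5, -0.1])) # → [-0.5, 0.1]
```

In a framework like PyTorch this would be a custom autograd function; the class above only shows the sign flip that makes the shared features adversarial to the gender classifier.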

SLIDE 30

Eliminating Bias: Adversarial Training

[Scatter plot: Bias Amplification in COCO vs. F1 Score (%); points: original, randomization, balanced 1–3, blackout/blur variants, adv@image, adv@conv4, adv@conv5]

SLIDE 31

Visualization of Adversarial Training

[Diagram: Convolutional Neural Network (ResNet-50) → Fully-connected Layer + Logistic Regressors → object scores, with the adversarial Gender Classifier branch (Gradient Reversal) and an added Mask Prediction branch used for visualization]

SLIDE 32

Adversarial Training: Removing Face Area

SLIDE 33

Adversarial Training: Removing Face and Skin

SLIDE 34

Adversarial Training: Removing Entire Person

SLIDE 35

Adversarial Training: Removing Contextual Cues

SLIDE 36