

Bias and Fairness in AI/ML Models
Swati Gupta, Assistant Professor, School of Industrial and Systems Engineering, Georgia Institute of Technology
October 25, 2018 | Digital Data Flows Master Class: Emerging Technologies


  1. Bias and Fairness in AI/ML models. Swati Gupta, Assistant Professor, School of Industrial and Systems Engineering, Georgia Institute of Technology. October 25, 2018. Digital Data Flows Master Class: Emerging Technologies.

  2. Machine Learning Pipeline: Data → Machine Learning/AI → Data-driven decisions. What is the effect of these decisions on human well-being? [Image sources: Google Images, iStock.]

  3. What is Bias/Fairness? “Bias. When scientific or technological decisions are based on a narrow set of systemic, structural or social concepts and norms, the resulting technology can privilege certain groups and harm others.” [Nature comment]

  5.–8. [Image-only slides; no recoverable text.]

  9. Outline of the talk: • Bias in the data, models and variables • Fairness Metrics (Statistical measures, Equity measures) • Trolley Problem of Choice

  10. Predictive Policing [Lum, Isaac, 2016]: “application of analytical techniques to identify likely targets for police intervention and prevent crime or solve past crimes by making statistical predictions.” [Figure: heat map of drug arrests made.]

  11. ML finds patterns in data.

  13. PredPol: predictions based on crime type, time, and location [Kristian Lum, William Isaac, 2016]. [Figure: heat map of drug arrests made.]
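
The Lum and Isaac critique is essentially a feedback loop: if patrols are sent wherever past arrests were recorded, new arrests can only accumulate in those same places. The following toy simulation is not PredPol's actual model; the district names, rates, and starting counts are invented. It only illustrates how an initial recording disparity grows even when the underlying crime rates are identical.

```python
import random

random.seed(0)

# Two districts with identical true crime rates, but a historical disparity
# in recorded arrests. All numbers are illustrative.
true_crime_rate = {"district_A": 0.10, "district_B": 0.10}
arrests = {"district_A": 5, "district_B": 1}

for day in range(50):
    # Send the single available patrol to the district with the most recorded arrests.
    patrolled = max(arrests, key=arrests.get)
    # New arrests can only be recorded where the patrol actually goes.
    if random.random() < true_crime_rate[patrolled]:
        arrests[patrolled] += 1

print(arrests)  # the initial gap widens, despite equal underlying rates
```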

  14. Not just about collection: we live in a biased society, so it’s inevitable that data collected about that society will be biased (inherent bias, test data, feedback, proxies…).

  16. Not just about collection (continued): “We also found that setting the gender to female resulted in getting fewer instances of an ad related to high paying jobs than setting it to male.”

  17. Not just about collection (continued). Proxies: predicting crime using data on arrests, not on incidence of crime. We do not want such biases to propagate into systems that make life-changing decisions.
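
To make the proxy point concrete, the sketch below (with invented areas and rates) compares true incidence with recorded arrests when enforcement intensity differs between two areas; a label built from arrest counts then reflects the enforcement gap, not a crime gap.

```python
# Arrests as a proxy for crime: identical true incidence, different enforcement.
# All rates are hypothetical.
incidence_rate = {"area_1": 0.10, "area_2": 0.10}   # true offence rate per person
enforcement = {"area_1": 0.60, "area_2": 0.15}      # probability an offence is recorded as an arrest

population = 10_000
for area in incidence_rate:
    true_offences = incidence_rate[area] * population
    recorded_arrests = true_offences * enforcement[area]
    print(f"{area}: true offences = {true_offences:.0f}, recorded arrests = {recorded_arrests:.0f}")
# area_1 appears four times "riskier" in the arrest data even though incidence is identical.
```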

  18. Outline of the talk: • Bias in the data, models and variables • Fairness Metrics (Statistical measures, Equity measures) • Trolley Problem of Choice

  19. Machine Learning Pipeline: Data → Machine Learning/AI → Data-driven decisions. What is the effect of these decisions on human well-being? [Image sources: Google Images, iStock.]

  20. Classification: binary (YES/NO) decisions such as hired for a job or not, will re-offend or not (prison), given a loan or not.

  21. Statistical Definitions of Fairness: hired for a job or not, will re-offend or not (prison), given a loan or not. Is it fair to achieve the highest accuracy in classification? [ProPublica, May 2016]

  22. Statistical Definitions of Fairness (continued): COMPAS Risk Score [ProPublica, May 2016].

  24. Statistical Definitions of Fairness (continued): Or is it fair to balance false positives across the groups? [ProPublica, May 2016]

  26. Statistical Definitions of Fairness (continued): Or false negatives? [Wikipedia]

  27. Statistical Definitions of Fairness: in fact, different stakeholders might have different points of view.
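
A small numerical illustration of why these criteria can conflict: the confusion-matrix counts below are synthetic (they are not the COMPAS data) and are chosen so that both groups look almost the same on accuracy while differing sharply on false positive and false negative rates, the kind of gap ProPublica reported.

```python
# Synthetic per-group confusion-matrix counts: tp, fp, fn, tn.
counts = {
    "group_A": {"tp": 40, "fp": 30, "fn": 10, "tn": 120},
    "group_B": {"tp": 45, "fp": 10, "fn": 25, "tn": 120},
}

for group, c in counts.items():
    accuracy = (c["tp"] + c["tn"]) / sum(c.values())
    fpr = c["fp"] / (c["fp"] + c["tn"])   # false positive rate
    fnr = c["fn"] / (c["fn"] + c["tp"])   # false negative rate
    print(f"{group}: accuracy={accuracy:.2f}, FPR={fpr:.2f}, FNR={fnr:.2f}")
# group_A: accuracy=0.80, FPR=0.20, FNR=0.20
# group_B: accuracy=0.82, FPR=0.08, FNR=0.36
```

Equalizing one of these rates across groups typically moves the others apart, which is one reason different stakeholders can each point to a metric that favors their view.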

  28. Equity Metrics of Fairness: what about general decisions (how much loan to give? where to place an emergency room? where to schedule deliveries?). Is it fair to minimize the total distance travelled by any group? [Figure: distance, Group A, Group B.]

  31. Equity Metrics of Fairness (continued): Is it fair to minimize the average distance travelled by any group (per person)?

  32. Equity Metrics of Fairness (continued): Group by race? Income? Insurance? Is it fair to minimize the average distance travelled by any group (per person)?

  33. Equity Metrics of Fairness [Marsh and Schilling, 1993]. [Image slide.]
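
A minimal sketch of the distinction these slides draw, using made-up group sizes and positions on a line: minimizing the worst group's total distance and minimizing the worst group's per-person average distance can select different facility locations.

```python
# Hypothetical populations: 9 people of group_A at location 0, 3 people of
# group_B at location 10; candidate facility sites are the integers 0..10.
group_positions = {
    "group_A": [0] * 9,
    "group_B": [10] * 3,
}

def worst_group_total(site):
    return max(sum(abs(p - site) for p in pos) for pos in group_positions.values())

def worst_group_average(site):
    return max(sum(abs(p - site) for p in pos) / len(pos) for pos in group_positions.values())

candidates = range(0, 11)
print("minimize worst group total distance:  ", min(candidates, key=worst_group_total))    # site 2
print("minimize worst group average distance:", min(candidates, key=worst_group_average))  # site 5
```

The total-distance criterion is pulled toward the larger group, while the per-person criterion lands midway; which one counts as "fair" also depends on how the groups are defined (by race, income, or insurance, as an earlier slide asks).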

  34. Outline of the talk: • Bias in the data, models and variables • Fairness Metrics (Statistical measures, Equity measures) • Trolley Problem of Choice
