Department of Computer Science CSCI 5622: Machine Learning, Chenhao Tan



SLIDE 1

Department of Computer Science CSCI 5622: Machine Learning Chenhao Tan Lecture 23: Machine learning and society Slides adapted from Chris Ketelsen

SLIDE 2

Learning objectives

  • Learn about the connection between our society and machine learning
  • Make sure that you think about ethics when applying machine learning
  • Fill in FCQ

SLIDE 3


Now you understand the magic behind machine learning!

SLIDE 4

SLIDE 5

Now you understand the magic behind machine learning!


Machine learning in real life

SLIDE 6

Machine learning is commonly used in our society.

SLIDE 7

Brainstorm

Where did you see machine learning today?

SLIDE 8

Machine learning and our society

  • Machine learning is increasingly connected with our society
  • Authorizing credit
  • Sentencing guidelines
  • Suggesting medical treatment

And many more!

SLIDE 9

Ethical issues in Machine Learning

SLIDE 10


https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

SLIDE 11

Case study: Decision making


https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

SLIDE 12

Case study: Decision making

  • Crime prediction; similar situations arise when someone is getting a loan
  • Harms of allocation

https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

SLIDE 13

Case study: Decision making

  • Crime prediction; similar situations arise when someone is getting a loan
  • Harms of allocation
  • What are potential reasons in the machine learning pipeline?

https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
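The ProPublica analysis hinged on comparing error rates of a risk score across demographic groups: among defendants who did not reoffend, what fraction did the model flag as high risk? A minimal sketch with invented toy data (not the real COMPAS records):

```python
# Toy sketch of the group-wise error-rate comparison behind the ProPublica
# analysis. The data below is invented for illustration, not real COMPAS data.

def false_positive_rate(predictions, outcomes):
    """Among people who did NOT reoffend (outcome 0), the fraction
    the model flagged as high risk (prediction 1)."""
    flagged = [p for p, y in zip(predictions, outcomes) if y == 0]
    return sum(flagged) / len(flagged)

# predictions: 1 = scored high risk; outcomes: 1 = actually reoffended
group_a_preds, group_a_true = [1, 1, 0, 1, 0, 0], [1, 0, 0, 1, 0, 0]
group_b_preds, group_b_true = [1, 0, 0, 0, 0, 1], [1, 0, 0, 0, 1, 1]

fpr_a = false_positive_rate(group_a_preds, group_a_true)  # 0.25
fpr_b = false_positive_rate(group_b_preds, group_b_true)  # 0.0
print(fpr_a, fpr_b)
```

A model can look accurate overall while its false positives concentrate in one group; that gap, not overall accuracy, was the core of the critique.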

SLIDE 14

Case study: Word embeddings

SLIDE 15

Case study: Word embeddings

Figure credit: Lior Shkiller

SLIDE 16

Case study: Word embeddings


Bolukbasi et al. 2016, Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings
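The "man is to computer programmer as woman is to homemaker" finding comes from plain vector arithmetic on word embeddings. A hedged sketch with hand-made 3-d vectors (real word2vec vectors are 300-d; all words and numbers below are illustrative, chosen so dimension 0 mimics the gender direction the paper found):

```python
import numpy as np

# Hand-made toy "embeddings"; dimension 0 plays the role of the gender
# direction that Bolukbasi et al. observed in real word2vec vectors.
E = {
    "man":        np.array([ 1.0, 0.2, 0.1]),
    "woman":      np.array([-1.0, 0.2, 0.1]),
    "programmer": np.array([ 0.9, 0.8, 0.3]),
    "homemaker":  np.array([-0.9, 0.8, 0.3]),
    "doctor":     np.array([ 0.0, 0.8, 0.5]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def analogy(a, b, c):
    """Return the word closest to vec(b) - vec(a) + vec(c)."""
    target = E[b] - E[a] + E[c]
    candidates = [w for w in E if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(E[w], target))

# man : programmer :: woman : ?
print(analogy("man", "programmer", "woman"))  # "homemaker" in these toy vectors
```

The bias is not programmed in anywhere; it is absorbed from co-occurrence statistics in the training text.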

SLIDE 17

Case study: Word embeddings

SVD to the rescue


Bolukbasi et al. 2016, Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings
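The idea behind "SVD to the rescue": estimate a gender direction (the paper takes the leading singular/principal direction of many definitional difference vectors such as he-she), then subtract each gender-neutral word's component along it. A toy sketch of that neutralize step, using a single pair and invented vectors:

```python
import numpy as np

# Toy sketch of the "neutralize" step in Bolukbasi et al.'s hard debiasing.
# In the paper the gender subspace comes from an SVD/PCA over many
# definitional pairs; with one pair it reduces to a single difference vector.
he, she = np.array([1.0, 0.3, 0.2]), np.array([-1.0, 0.3, 0.2])
g = he - she
g = g / np.linalg.norm(g)                      # unit-length gender direction

programmer = np.array([0.9, 0.8, 0.3])         # toy occupation vector, leans "male"
debiased = programmer - (programmer @ g) * g   # remove the gender component

print(programmer @ g, debiased @ g)            # bias before vs. after (after is ~0)
```

After neutralizing, the occupation vector is orthogonal to the gender direction, so gendered analogies along that axis no longer resolve to it.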

SLIDE 18


@mmantyla

SLIDE 19

Case study: Recommender Systems

Filter bubble [Pariser, 2011]

  • Main idea: Personalized search determines what information you see and what information you don’t see
  • Google, Facebook, Netflix

SLIDE 20

Case study: Recommender Systems

  • Blue feed vs Red feed


http://graphics.wsj.com/blue-feed-red-feed/

SLIDE 21

SLIDE 22

Case study: Recommender Systems

Filter bubble [Pariser, 2011]

  • Main idea: Personalized search determines what information you see and what information you don’t see
  • Google, Facebook, Netflix
  • Debate: a good thing or a bad thing?

SLIDE 23

Case study: Recommender Systems

  • Bakshy, Messing, Adamic, “Exposure to ideologically diverse news and opinion on Facebook”

SLIDE 24

Case study: Recommender Systems

Bakshy, Messing, Adamic, “Exposure to ideologically diverse news and opinion on Facebook”

SLIDE 25

Case study: Recommender Systems

  • Latanya Sweeney 2013, “Discrimination in Online Ad Delivery”

SLIDE 26

SLIDE 27

Case study: Recommender Systems

  • Latanya Sweeney 2013, “Discrimination in Online Ad Delivery”

SLIDE 28


Credit: @math_rachel, Kate Crawford

SLIDE 29

Representation harms

  • Denigration
  • Stereotype
  • Recognition
  • Under-representation


Credit: Solon Barocas, Kate Crawford, Aaron Shapiro, Hanna Wallach

SLIDE 30

Case study: Physical Systems

Self-driving cars

SLIDE 31

Case study: Physical Systems

Self-driving cars

  • In the event of an inevitable crash leading to likely loss of life, what should the car do?
  • Example: the crash will result in the death of either
  • the driver and several passengers, or
  • several pedestrians
  • Debate: How do we choose?

SLIDE 32

At least think about the following questions

  • What questions can/should we ask?
  • What data is OK to use?
  • What is the boundary between private data and public data?
  • If something is public, should you use the data?
  • Example: Enron email corpus
  • Who owns your data?
  • What anonymization should be done, and is the data *really* anonymized?

SLIDE 33

Plug

  • Thinking more about
  • Why we develop machine learning systems
  • How we develop machine learning systems for humans

Human-centered Machine Learning in Spring 18 (CS 7000)

SLIDE 34


Thanks! Zhenguo Chen, Sean Harrison, Tyler Scott. Most importantly, all of you!

SLIDE 35


FCQ time! https://colorado.campuslabs.com/courseeval/