

SLIDE 1

The ethics of electronic privacy

Diana Acosta-Navas

PhD Candidate, Harvard Philosophy Department Adjunct Lecturer, Harvard Kennedy School of Government diana_acosta_navas@hks.harvard.edu

SLIDE 2

For Today

  • Case 1: Cambridge Analytica
  • Case 2: Facebook’s suicide prevention program
  • Case 3: Personal Kit: Safe Paths

SLIDE 3

For Today

  • Descriptive conception of privacy
  • The moral value of privacy
  • Privacy as an interest
  • Privacy as a right
  • Extenuating circumstances
  • Informed consent
SLIDE 4

This is Your Digital Life App

Case 1

SLIDE 5

www.theguardian.com

SLIDE 6

Facebook’s Role

  • By 2015 Facebook was aware of the large-scale harvesting of data
  • Failed to alert users: “It was not a breach”

SLIDE 7

Facebook’s Role

  • In 2016, Facebook sent a legal letter to Cambridge Analytica requesting that the data be deleted
  • Official certification amounted to “ticking a box on a form and posting it back”
  • The letter went unanswered for weeks
  • Facebook did not follow up or verify that the data had in fact been deleted
  • Users were only notified that their data had been used after Wylie spoke to The Observer and the information was public

SLIDE 8

Mark Zuckerberg

“This was a major breach of trust”

SLIDE 9

Was this use of data morally appropriate?

Poll

SLIDE 10
SLIDE 11

Was this use of data morally appropriate?

Debrief

SLIDE 12

Informational Privacy

SLIDE 13

Private Information

Heuristic to identify private information:

  • Information you wouldn’t want posted on the front page of the Harvard Crimson.
  • Information you would not feel comfortable sharing if you knew state agents were tapping your phone.
  • A context- and culture-specific notion.

SLIDE 14

Informational Privacy: Descriptive Conception

  • The measure of control that you have over others’ access to your personal information.
  • Though some degree of control is desirable, a line needs to be drawn to determine which cases are problematic.

SLIDE 15

Electronic Privacy

  • Informational privacy as it applies to our electronic activities.
  • Examples: web browsing, text messaging, emailing, etc.
  • Through our electronic activities, others can access information about non-electronic activities, such as our location, trajectories, consumption habits, etc.

SLIDE 16

Informational Privacy: Normative Dimensions

  • When are privacy claims justified?
  • Where do we draw the line between acceptable decreases of privacy and morally problematic ones?
  • When does a decrease in privacy constitute a “violation” or “invasion” thereof?

SLIDE 17

Facebook’s Suicide Prevention Program

Case 2

SLIDE 18

Facebook’s Suicide Prevention Program

  • Screening user data to identify at-risk individuals
  • Collaboration with first responders to provide support
  • Concerns about the transparency of the algorithm
  • Concerns about lack of user consent
  • Concerns about how data are handled

SLIDE 19

Is this use of data morally appropriate?

Poll

SLIDE 20
SLIDE 21

Could we have different intuitions about these cases?

Debrief

SLIDE 22

Could we have different intuitions about these cases?

Debrief

SLIDE 23

Why is privacy valuable?

SLIDE 24

Privacy’s Moral Value

Privacy protects us from:

  • Identity theft
  • Inappropriate migration of information between various spheres of our lives
  • Control and abuses by powerful actors

SLIDE 25

Privacy’s Moral Value

Privacy as a necessary condition for:

  • Well-being and development, creativity, mental health
  • Independence: formulating life-plans, moral and political judgments
  • Diversity of personal choices and actions
  • Shaping relationships

SLIDE 26

Privacy’s Moral Value

Privacy and personal autonomy:

Access to personal information can enhance the range of influence that powerful actors have on individuals.

  • Manipulation
  • Blocking relevant information
  • Withholding opportunities
  • Affecting decisions

Privacy is a form of autonomy: it constitutes self determination with respect to information about oneself.

SLIDE 27

The Moral Value of Privacy

  • Instrumental value: good because it leads to other good things. Example: money
  • Intrinsic value: good in and of itself. Example: friends

SLIDE 28

Some vocabulary

  • Interests: things that make a difference to how well a person’s life goes
  • They can be promoted or harmed

Examples: whether a person is happy, healthy, safe, and able to pursue their goals effectively.

SLIDE 29

Privacy as an Interest

  • Sometimes it conflicts with other interests
  • Trade-offs
SLIDE 30

Some vocabulary

  • Moral rights: entitlements to a certain kind of treatment
  • If you are entitled to a certain kind of treatment, others are morally obligated to treat you in that way.
  • If you are morally obligated to do something and you fail to do it, you have done something morally wrong, and may deserve blame or punishment from others.

Example rights: the right not to be killed or harmed, the right not to be discriminated against, the right to autonomy, the right to democratic governance.

SLIDE 31

Privacy as a Right

  • Heuristically, a right is an interest so important that we think it should be protected by an explicit and serious social rule.
  • Rights can’t be overridden by other considerations (except in extenuating circumstances).

SLIDE 32

Additional Complications

SLIDE 33

Privacy as an Interest

Problem: Who decides?

  • Presumably, individuals

Problem: What do individuals prefer?

  • Expressed preferences?
  • Behavior?
  • What to do in the face of conflicting evidence?

SLIDE 34

Privacy as a Right

Problem:

  • When is this right overridden?
  • What counts as extenuating circumstances?
  • Can the right be waived?

SLIDE 35

Personal Kit: Safe Paths

Case 3

SLIDE 36

Extenuating Circumstances

SLIDE 37

Extenuating Circumstances

“To protect our collective right to health in the current pandemic situation, we need to balance our individual rights with collective responsibilities” Kathryn Sikkink

SLIDE 38

Some Data-Driven Measures

  • Hong Kong: Tracking the location of recent international travelers through WhatsApp to enforce quarantine periods.
  • Taiwan: Tracking quarantined people’s phones using data from cell phone masts.
  • South Korea: Corona 100m informs individuals within 100m of diagnosed carriers. Broadcasting text messages containing personal information about individual carriers. Tracking carriers and alerting authorities when they stray off their quarantine location.
  • China: Health Check app: self-reported data about places visited and symptoms generate an identifying QR code displayed in green, orange, or red to signal free movement, a 7-day quarantine, or a 14-day quarantine, respectively.
  • Singapore: A publicly available map with detailed information about each case of COVID-19. TraceTogether records prolonged encounters between people who stand less than two meters away from each other (phones connect via Bluetooth). When one party tests positive, the information is sent to contact tracers, who decrypt it and inform the other party.
  • Israel: Shin Bet was authorized to track and access cell phones of confirmed carriers. Hamagen notifies users when they’ve been in proximity of carriers.

SLIDE 39

Data-driven contact-tracing

  • A tailored approach:
  • Alternative to lockdowns and broad social distancing policies
  • Alternative to large-scale testing and symptom-based detection
  • Voluntary isolation of prospective positive cases
  • Targeted testing: resource allocation and optimization
  • Most useful for detecting the first signs of an outbreak and as a resurgence prevention mechanism

SLIDE 40

Personal Kit: Safe Paths

  • Uses the phone’s location tracking to alert individuals at risk of contagion.
  • Users save their location information on their own phones.
  • If diagnosed positive, they can upload their information to health authorities’ systems.
  • Other users whose paths have crossed with theirs are then notified of potential risks of exposure.
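The path-crossing check at the heart of an app like Safe Paths can be sketched in a few lines: two users are considered exposed when any pair of their stored location samples was close in both space and time. This is a minimal illustrative sketch, not the app’s actual implementation; the names (`Ping`, `paths_crossed`) and the 10-meter / 30-minute thresholds are assumptions made for the example.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Ping:
    """One location sample stored locally on the user's own phone."""
    timestamp: float  # seconds since epoch
    lat: float
    lon: float

def haversine_m(a: Ping, b: Ping) -> float:
    """Great-circle distance between two pings, in meters."""
    dlat = radians(b.lat - a.lat)
    dlon = radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * 6371000 * asin(sqrt(h))  # Earth radius ~6371 km

def paths_crossed(mine: list, theirs: list,
                  max_dist_m: float = 10.0,
                  max_dt_s: float = 1800.0) -> bool:
    """True if any pair of pings was close in both space and time.

    The thresholds (10 m, 30 min) are illustrative assumptions,
    not values taken from Safe Paths itself.
    """
    return any(
        abs(p.timestamp - q.timestamp) <= max_dt_s
        and haversine_m(p, q) <= max_dist_m
        for p in mine for q in theirs
    )
```

In the workflow the slide describes, `theirs` would be the trace a diagnosed carrier uploads to the health authorities’ system, `mine` the trace a user keeps on their phone; a match triggers the exposure notification.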

SLIDE 41

Informed Consent

Individuals’ autonomous decision to waive their privacy rights in a specific domain

SLIDE 42

Make a case for or against allowing this app to operate

Small Group Discussion

SLIDE 43

Case for/against allowing this app to operate

Debrief

SLIDE 44

Informed Consent

People often voluntarily give up certain information about themselves, unaware that certain inferences can be made from an aggregate of such data.

  • Even worse: there seems to be no way someone could have predicted the surprising inferences that can be made from the data.

SLIDE 45

Informed Consent

  • What do we consent to?
  • Transparency Paradox: simplicity and fidelity cannot be achieved simultaneously
  • How will data be used in the future?
  • Whom do we consent for?
  • Tyranny of the minority: the willingness of a few to disclose information about themselves may implicate others who share the more easily observable traits correlated with the traits disclosed.

SLIDE 46

States of Exception, Emergency Powers and Normalization

  • How long will the information be available?
  • To whom?
  • Slippery slopes?
  • How and when do we roll back these permissions?
  • How and when do we reinstitute privacy protections?

SLIDE 47

Summarizing…

  • Privacy is the degree of control over others’ access to information about us.
  • There are reasons to care about privacy beyond the fact that it may promote our interests.
  • Whether privacy is an interest or a moral right, we need to figure out when exactly we are justified in restricting it.
  • When it comes to big data, we cannot trust naïve intuitions about consent.
  • Even under extenuating circumstances, privacy restrictions should be carefully examined.

SLIDE 48

Thank you!

https://forms.gle/vcfXwCZ2eLqQnvmq5