Privacy Seminar 2017-2018

Merel Koning

01/03/18

Organisation

  • Teachers
    – Merel Koning (m.koning@cs.ru.nl)
    – Jaap-Henk Hoepman (jhh@cs.ru.nl)
  • Blackboard is not used
    – Website: www.cs.ru.nl/~jhh/secsem.html
    – Wiki: http://wiki.science.ru.nl/privacy/

What is a seminar?

  • Seminar
    – Student lecture
    – Student paper
    – Student opposition
  • Grade = weighted average
    – But only if all grades are at least 5.5
    – If not, the lowest grade is the final grade!
  • Working in groups
  • Attendance required
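The grading rule above can be sketched in a few lines of Python (a minimal illustration; the function name, weights and example grades are assumptions, not course policy):

```python
def final_grade(grades, weights):
    """Weighted average, but only if every partial grade is >= 5.5;
    otherwise the lowest partial grade becomes the final grade."""
    if min(grades) < 5.5:
        return min(grades)
    return sum(g * w for g, w in zip(grades, weights)) / sum(weights)

# Example: lecture 7.0, paper 8.0, opposition 6.0, equal weights
print(final_grade([7.0, 8.0, 6.0], [1, 1, 1]))  # 7.0
# One grade below 5.5 -> the lowest grade is the final grade
print(final_grade([7.0, 8.0, 5.0], [1, 1, 1]))  # 5.0
```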

Preliminary course schedule


Topics

  • First come, first served:
    – location privacy
    – Internet of things
    – identity management
    – electronic voting
    – smart metering/smart grids
    – search/data retrieval
    – anonymous messaging
    – privacy in big data
    – anonymous crypto currencies
    – …your own…
  • Sign up next lecture

Research

  • analyse a particular practical case
    – what are the privacy issues (from a societal and legal perspective) and how are they dealt with
  • give a precise and concise problem description
    – in technical terms: define your model and your assumptions
  • investigate possible PETs that apply
    – summarise your analysis
  • pick one and solve the problem (involves a protocol)
    – describe this in sufficient detail!
  • (informally) prove or argue correctness

Student lecture

  • Goal of lecture
    – to inform other students about your research
  • Important
    – make the lecture interactive
    – add additional material
  • Discuss draft
    – Thursday 13:00-13:15 the week before
    – mail slides etc. at least 24 hours before

Student lecture: grading

  • Content
    – Argumentation
      • Whether your lecture provides a solid basis and backing for all statements and claims made.
    – Cohesiveness
      • Whether the relationship between the different (sub)topics of your lecture is made clear.
    – Comprehensiveness
      • Whether your lecture covers all important aspects, and clearly separates important issues from secondary details. Equal attention should be paid to technical and legal/societal issues.
  • Form
    – Structure
      • Logical ordering of your lecture, and its intelligibility.
    – Attractiveness
      • Whether your lecture captivates the audience, and whether the message comes across (i.e. whether your lecture connects to what your audience expects and understands).
  • Performance
    – Interaction
      • Level of engagement and contact with the audience, level of interactivity, the way you respond to questions.
    – Lecture technique
      • The way you speak (comprehensibility), your presence in front of the class, your usage of supporting materials (e.g. PowerPoint). The liveliness and tone of your lecture.


Student paper

  • Goal
    – Report on research
    – Express your own perspective on PETs
  • Format
    – Roughly 10 pages (excluding references)
      • A4, reasonable margins, 10-11 pt font
  • Beware
    – Find and use your own literature
    – Use input obtained during the presentation in class

Student paper

  • Typical structure
    – Context
    – Problem description
      • Including legal/social analysis
    – Proposed solution
    – Technical analysis
    – Conclusions

Student paper: planning

  • Average timespan
    – Literature study: 2 weeks
    – Perform research: 2 weeks
    – Write skeleton: 1 week
    – Write final paper: 3 weeks
  • Deadlines
    – May 3: skeleton
    – June 14: final paper
  • So start April 1 at the latest

Student paper: grading

  • Content
    – Technical quality
      • Whether the paper shows an understanding of the technical issues involved. Correctness of all technical statements and claims. Sufficient level of technical detail.
    – Analysis
      • Whether a proper argumentation is given, and whether all main aspects of the topic are addressed, with proper regard for what are the main points and what are only secondary points. (This covers the criteria argumentation, cohesiveness and comprehensiveness used for scoring the presentation.)
    – Quality of references
      • Whether you found and cite all relevant literature. Originality (finding relevant references yourself) is appreciated.
    – Own opinion
      • Whether the paper clearly expresses and argues your own opinion on the subject matter.
  • Form
    – Style
      • Clarity of writing, objectiveness, linguistic quality (in terms of spelling and grammar).
    – Structure
      • Logical structure of the paper, helping the reader understand what he is about to read, giving the paper a natural flow.
    – Attractiveness
      • Formatting of the paper, including precise formatting of the bibliography.


Remaining points

  • Contribute to the wiki
    – http://wiki.science.ru.nl/privacy/

What is privacy?


Government surveillance



Commercial surveillance


Predictions

Shopping mall tracking (I call it Mallware)


Privacy: what is privacy according to you?


VALUES/GOODS/ENDS of privacy

  • Personal value of privacy, e.g.
    Self-expression, Good Reputation, Repose, Intimacy and Formality, Human Dignity, Autonomy, Individualism
  • Societal value of privacy, e.g.
    Limited Government, Toleration, Civility
  • Both, e.g.
    Intellectual Life, Preferences and Traditions

VALUES/GOODS/ENDS of privacy: limitations

  • National Security, Law Enforcement, Public Right to Know, Administrative Costs, Public Health, Selfish Individualism, Inefficiency, (libertarian view) Excess of Protections, Privacy Rights Should be Limited

Privacy assets

  • Personal data
  • Home
  • Reputation
  • Information
  • Body
  • Etc.

Privacy threats

E.g. threats to information privacy:

  • Information Collection
    – Surveillance
    – Interrogation
  • Information Processing
    – Aggregation
    – Identification
    – Insecurity
    – Secondary Use
    – Exclusion
  • Information Dissemination
    – Breach of Confidentiality
    – Disclosure
    – Exposure
    – Increased Accessibility
    – Blackmail
    – Appropriation
    – Distortion
  • Invasion
    – Intrusion
    – Decisional Interference


[Figure: the taxonomy as a flow: Collect (Surveillance, Interrogation); Process (Aggregation, Identification, Insecurity, Secondary Use, Exclusion); Disseminate (Breach of confidentiality, Disclosure, Exposure, Increased availability, Blackmail, Appropriation, Distortion); Invade/Use (Intrusion, Decisional Interference)]

Based on: Daniel J. Solove, "A Taxonomy of Privacy", 2006.
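Solove's taxonomy can also be kept as a simple lookup table, e.g. to tag a case study with the threat groups it raises. A minimal sketch (the dictionary layout and the helper `category_of` are illustrative, not from the slides):

```python
# Solove's taxonomy of privacy threats (2006) as a lookup table
TAXONOMY = {
    "Information Collection": ["surveillance", "interrogation"],
    "Information Processing": ["aggregation", "identification", "insecurity",
                               "secondary use", "exclusion"],
    "Information Dissemination": ["breach of confidentiality", "disclosure",
                                  "exposure", "increased accessibility",
                                  "blackmail", "appropriation", "distortion"],
    "Invasion": ["intrusion", "decisional interference"],
}

def category_of(threat):
    """Return the taxonomy group a given threat belongs to, or None."""
    for group, threats in TAXONOMY.items():
        if threat.lower() in threats:
            return group
    return None

print(category_of("Aggregation"))  # Information Processing
print(category_of("Blackmail"))    # Information Dissemination
```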

Definitions of privacy

Privacy and identity theory

Warren and Brandeis 1890

  • Privacy is 'the right to be let alone'.
  • 'Hiding'

Westin 1968

  • Privacy is 'the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others'.
  • 'Control'


Agre and Rotenberg 1998

  • Privacy is 'the freedom from unreasonable constraints on the construction of one's own identity'.
  • 'Dialogue'

Nissenbaum 2004

  • Contextual integrity: the right to prevent information from flowing from one context to another
    – [Nissenbaum, 2004]

Contextual integrity

[FIDIS project]

Don't confuse these concepts!

security – privacy – data protection


Typologies and taxonomies

Privacy and identity theory

Alan Westin's 4 privacy states

  • 1960
  • Privacy is linked to the needs of an individual
  • Classification derived from case law on privacy torts (US)
    – Solitude: the most complete state of privacy; the individual is separated from others
    – Intimacy: beyond intimate relations; a state of intimacy is a prerequisite for close contact
    – Anonymity: public privacy; freedom from identification and surveillance; public spaces and anonymous publication
    – Reserve: the dynamic aspect of privacy in daily interpersonal relations; a psychological barrier against unwanted intrusions

Finn, Wright and Friedewald's types of privacy

  • 2013
  • EU data protection legislation analysis
  • Expanded from Clarke
  • Bio-informatics, drones etc.

Finn, Wright and Friedewald's seven types

  • Privacy of the person
  • Privacy of behavior and action
  • Privacy of communication
  • Privacy of data and image
  • Privacy of thoughts and feelings
  • Privacy of location and space
  • Privacy of association


RFID-enabled travel documents

  • Assets:
    – Information on the chip itself:
      • Travel routes
      • Frequent destinations
      • Rare destinations
      • Mode of transport
    – Information in the database:
      • Location, time
      • Possible co-travelers etc.
      • Routes
    – Identity

RFID-enabled travel documents

  • Threats:
    – Data collection
      • Surveillance
      • Interrogation when the card is issued or when an error occurs
    – Processing
      • Secondary use
      • Aggregation
      • Insecurity
    – Dissemination
      • Disclosure and exposure
    – Invasion
      • Decisions on identification
  • Types of privacy affected:
    – Privacy of the person
    – Privacy of personal data
    – Privacy of location and space
    – Privacy of behavior and action

Second generation biometrics

  • Measurement and analysis of biometric traits: gait analysis, voice recognition
  • Psychological biometrics: pheromone detection, heartbeat analysis, body heat etc.
  • Impacts all seven types

Second generation biometrics

  • Privacy of the person
  • Privacy of behavior and action
  • Privacy of communication
  • Privacy of data and image
  • Privacy of thoughts and feelings
  • Privacy of location and space
  • Privacy of association


Privacy

computing (1950-)

  • searching becomes efficient
  • data kept forever

networking (1980-)

  • data sharing becomes easy
  • data accessible on-line

"network effect"

Transfer

Different types of data/information

  • Volunteered
    – What you reveal explicitly when asked
  • Observed
    – What you reveal implicitly by your behaviour
  • Inferred
    – What is derived from other data about you

[World Economic Forum Report, Personal Data: The Emergence of a New Asset Class]

Data vs Metadata

  • Metadata (= behavioural data)
    – Condensed (information rich, easy to process)
    – More "true" (judge a man not on what he says but on what he does)

Why is privacy important?


"Privacy is essential for freedom, democracy, psychological well-being, individuality and creativity"

Daniel J. Solove, "Understanding Privacy", Harvard University Press, 2008.

Moral basis for data protection

  • prevention of information-based harm
    – Like guns, information may kill people
  • prevention of informational inequality
    – The "market" of information
    – Non-discrimination
  • prevention of informational injustice
    – Spheres of privacy must be protected
  • respect for moral autonomy
    – People change

Van den Hoven, Jeroen and Vermaas, Pieter E. (2007), 'Nano-Technology and Privacy: On Continuous Surveillance Outside the Panopticon', Journal of Medicine and Philosophy, 32(3), 283-297.

Searching for the right metaphor

  • Orwell / Big Brother
  • Chandler / little sister
  • Kafka / The Trial

Or: The Matrix


You’ve got nothing to hide


I have nothing to hide...

  • Everybody has something to be embarrassed about
  • Assumes that the problem is data you want to hide
    – even "innocent" data can harm you
  • Freedom of thought
    – That job offer looks interesting...
    – That woman looks "interesting"...
  • No distinction between illegal (legal) vs disgraceful (moral) vs ...: data is data
  • What is the data used for: investigation, anti-terrorism, or ...?
    – Function creep

Wrong assumption: the point is not that there is data that is a priori "wrong" or illegal (as seen by the "sender"); the point is that "innocent" data can (later) be used wrongly (by the current "receiver").

Solove, Daniel J., "I've Got Nothing to Hide", 2008.

Beyond privacy: autonomy



Resources

  • Websites
    – http://wiki.science.ru.nl/privacy/
    – https://www.eff.org/
  • Books
    – Agre & Rotenberg, Technology and Privacy: The New Landscape, MIT Press, 1998
    – Ilija Trojanow & Juli Zeh, "Aanslag op de vrijheid", De Geus, 2010
    – Daniel J. Solove, "Understanding Privacy", Harvard University Press, 2008
    – Bart de Koning, "Alles onder controle", Uitgeverij Balans, 2008