SLIDE 1

Engineering Multiagent Systems for Ethics and Privacy-Aware Social Computing

Nirav Ajmeri

(Under the guidance of Professor Munindar P. Singh) Department of Computer Science North Carolina State University

December 2018

Nirav Ajmeri EMAS for Ethics and Privacy-Aware Social Computing December 2018 1 / 27

SLIDE 2

Introduction

Outline

1

Introduction

2

Contribution Understanding Value Preferences

3

Conclusions and Directions

SLIDE 3

Introduction

NSF’s “Dear Colleague Letter” on FEAT (NSF 19-016)

Fairness in decision-making Ethics via incorporating values Accountability by social norms Transparency via understanding social context

SLIDE 4

Introduction

Engineering Social Applications

Social reality: *WWW 2019, IJCAI 2018, IC 2018, AAMAS 2017, KER 2016, IC 2016
Creativity in social computation: RE 2016, RE 2017, RE 2018
Formal specification: AAAI 2017, IS 2017, Computer 2017, IJCAI 2016
At the intersection of AI, SE, and privacy (* in review)

SLIDE 5

Introduction

Examples of Ethical Concerns

Audio leaking: Intrusion of solitude and disclosure of music taste

Source: https://twitter.com/akokitamura/status/728521725172846592

SLIDE 6

Introduction

Examples of Privacy Concerns

Location sharing

Google: Location sharing

Source: https://www.csoonline.com

Your latest location is auto-shared if you do not respond in 5 minutes

Messenger: Live location

When you choose to share, Live Location continues sharing your location even when you are not using the app

SLIDE 7

Introduction

Concepts

Social norm, as defined by Singh [2013], is a relation between two parties, a subject and an object, involving an antecedent (which brings a norm in force) and a consequent (which brings the norm to satisfaction or violation)

Social context is the circumstance under which an agent takes an action [Dey, 2001]

Deviation is a perceived violation of a norm [Nardin et al., 2016]

Values are guiding principles of humans [Schwartz, 2012; Friedman et al., 2008; Rokeach, 1973]

Ethics is subsumed in the theory of values [Friedman et al., 2008]

Privacy is a value with an ethical import [Langheinrich, 2001; Taylor, 2002]
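The norm concept above can be sketched in code. This is a minimal sketch, not Singh's formal model: the predicate representation and the evaluate() helper are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Callable

# A minimal sketch of a social norm in the sense of Singh [2013]; the
# predicate representation and the evaluate() helper are illustrative
# assumptions, not part of the original formulation.
@dataclass
class Norm:
    subject: str                          # party the norm binds
    obj: str                              # party the norm is owed to
    antecedent: Callable[[dict], bool]    # brings the norm in force
    consequent: Callable[[dict], bool]    # satisfies or violates it once in force

    def evaluate(self, context: dict) -> str:
        if not self.antecedent(context):
            return "inactive"
        return "satisfied" if self.consequent(context) else "violated"

# Example: Frank's agent is expected to share his location with Andrew at night
share_at_night = Norm(
    subject="Frank",
    obj="Andrew",
    antecedent=lambda c: c.get("time") == "night",
    consequent=lambda c: c.get("location_shared", False),
)
state = share_at_night.evaluate({"time": "night", "location_shared": True})  # "satisfied"
```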

SLIDE 8

Introduction

Research Objective

To help software developers in engineering personal agents that deliver an ethical and privacy-respecting social experience to stakeholders via modeling and reasoning about social norms, social context, and value preferences

SLIDE 9

Introduction

Socially Intelligent Personal Agent (SIPA)

A SIPA adapts to social context and supports meeting social expectations

Ethical: Seeks to balance needs of

Primary stakeholder (user), who directly interacts with the agent Secondary stakeholders, who are affected by the agent’s actions

Challenge: Understanding social reality

Modeling social intelligence
Understanding social context
Reasoning about stakeholders' values

SLIDE 10

Introduction

A SIPA: Schematically

[Schematic: The World Model (Context), Social Model (Norms, Sanctions), and Stakeholder Model (Goals, Actions, Values) feed the Decision Module, which selects an Ethically Appropriate Action]

SLIDE 11

Introduction

Research Questions

RQ Social intelligence: How can modeling social intelligence in a SIPA help it deliver a social experience that respects its stakeholders' privacy?
Arnor, a software engineering method

RQ Context: How can SIPAs share and adapt to deviation contexts, and learn contextually relevant norms?
Poros, a context reasoning approach

RQ Values: Does an ability to reason about values promoted or demoted by actions, and an understanding of preferences among these values, help a SIPA deliver a value-driven social experience to all its stakeholders?
Ainur, a decision-making framework

SLIDE 12

Contribution

Outline

1

Introduction

2

Contribution Understanding Value Preferences

3

Conclusions and Directions

SLIDE 13

Contribution Understanding Value Preferences

Norms and Values

RQValues: Does an ability to reason about values promoted or demoted by actions and an understanding of preferences among these values help a SIPA deliver a value-driven social experience to all its stakeholders?

Pichu: A location sharing SIPA

Source: https://www.csoonline.com/article/3147286/security/google-launches-trusted-contacts-location-sharing-app.html

Stakeholders

Frank, a high school student; prefers pleasure and recognition Andrew, Frank’s father; prefers safety Hope, Frank’s aunt and also an intelligence analyst; prefers privacy

SLIDE 14

Contribution Understanding Value Preferences

Stakeholder Model

A SIPA’s stakeholders and their goals and values


SLIDE 15

Contribution Understanding Value Preferences

World Model

Context in which a SIPA acts


SLIDE 16

Contribution Understanding Value Preferences

Social Model

Norms governing a SIPA’s interactions in a society and the associated sanctions


SLIDE 17

Contribution Understanding Value Preferences

Decision Module

Incorporates VIKOR [Opricovic and Tzeng, 2004], a multicriteria decision-making method

Norms may conflict with actions
Stakeholders' value preferences may not align


SLIDE 18

Contribution Understanding Value Preferences

Evaluation: Crowdsourcing Study

Participants: 58 students enrolled in a mixed graduate and undergraduate-level computer science course
Privacy attitude survey: Level of comfort in sharing personal information
Context sharing surveys: Select a context sharing policy
Phase 1: Based on context, including place and social relationship
Phase 2: Based on context and values (pleasure, privacy, recognition, safety)

SLIDE 19

Contribution Understanding Value Preferences

Evaluation: Simulation

Study unit: Pichu SIPA

Situations: Hiking at night; studying in a library; attending a graduation ceremony; visiting an airport; visiting a bar with a fake ID; visiting a drug rehab center; presenting a conference paper; being stuck in a hurricane
Companions: friend, family, colleague, stranger
Policies: Share with all; Share with common friends; Share with companions; Share with no one

Decision-making strategies:

S_Ainur: Policy based on VIKOR
S_primary: Policy based on primary stakeholder's preferences
S_conservative: Least privacy-violating sharing policy
S_majority: Most common sharing policy

Simulated societies

Mixed
Fundamentalists
Pragmatists
Unconcerneds

Privacy attitude distribution of societies

[Figure: Privacy attitudes ranging from highly unconcerned to highly concerned, spanning Fundamentalists, Pragmatists, and Unconcerneds]

SLIDE 20

Contribution Understanding Value Preferences

Metric

Mean social experience is the mean utility obtained by a society as a whole based on context sharing policy decisions

Best individual experience is the maximum utility obtained by one or more of the SIPA's stakeholders during a single interaction

Worst individual experience is the minimum utility obtained by one or more of the SIPA's stakeholders during a single interaction

Fairness is the reciprocal of the difference between the best and worst individual experience
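The metrics above can be computed directly. This is a sketch under an assumed input format (per-interaction lists of stakeholder utilities); the zero-gap handling of fairness is also an assumption.

```python
from statistics import mean

# A sketch of the four evaluation metrics defined above; the input format
# (per-interaction lists of stakeholder utilities) is an illustrative assumption.
def interaction_metrics(utilities: list[float]) -> dict[str, float]:
    """Metrics for a single interaction, given each stakeholder's utility."""
    best = max(utilities)    # best individual experience
    worst = min(utilities)   # worst individual experience
    return {
        "best": best,
        "worst": worst,
        # Fairness: reciprocal of the best-worst gap; treating a zero gap
        # as perfectly fair (infinite fairness) is our assumption.
        "fairness": 1.0 / (best - worst) if best > worst else float("inf"),
    }

def mean_social_experience(interactions: list[list[float]]) -> float:
    """Mean utility over all stakeholders across all interactions."""
    return mean(u for inter in interactions for u in inter)

m = interaction_metrics([1.7, 0.8, 1.2])  # best 1.7, worst 0.8
```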

SLIDE 21

Contribution Understanding Value Preferences

Experiment with Mixed Privacy Attitudes

Result: Ainur yields better mean social experience, mean worst individual experience, and fairness than other decision-making strategies

[Figure: Social experience over time (in 100-step intervals) for S_Ainur, S_primary, S_conservative, and S_majority]

SLIDE 22

Contribution Understanding Value Preferences

Experiments with Majority Privacy Attitudes

Result: Ainur maximizes the worst individual experience and yields better fairness than other decision-making strategies

[Figure: Social experience over time (in 100-step intervals) for Fundamentalist, Pragmatist, and Unconcerned societies]

SLIDE 23

Contribution Understanding Value Preferences

Threats to Validity and Mitigation

Threats:
Simulation as an evaluation methodology
Unreliability of self-reported attitudes
Survey sample not representative of the actual population

Limitations (because of logistical reasons):
Limited set of predetermined situations
Limited set of actions

SLIDE 24

Conclusions and Directions

Outline

1

Introduction

2

Contribution Understanding Value Preferences

3

Conclusions and Directions

SLIDE 25

Conclusions and Directions

Conclusions and Relationship to FEAT

Seeking to advance the science of privacy by tackling nuanced notions of privacy (understood as an ethical value) in personal agents

Contributions:
Modeling social intelligence: Arnor, a software engineering method to engineer privacy-aware personal agents (Fairness; Accountability)
Understanding social context: Poros, an approach that enables personal agents to infer contextually relevant social norms that preserve privacy (Accountability; Transparency)
Understanding value preferences: Ainur, a decision-making framework to design personal agents that can reason about values and act ethically (Fairness; Ethics)

SLIDE 26

Conclusions and Directions

Possible Directions for Future Dissertations

Artificial Intelligence
Social reality: White lies and affect in personal agents (building on IJCAI 2018 and TRUST 2014 works)
Formal specification: Argumentation and value-based reasoning (building on Computer 2017 and IJCAI 2016 works)

Software Engineering
Creativity: CrowdRE for privacy requirements (building on RE 2016 and RE 2018 works)
Social reality: RE for ethical systems (building on AAMAS 2017)

Privacy
Social reality: Middleware based on Ainur as a privacy-enhancing technology to support ethical decision-making
Social reality: Usable privacy and ethics

SLIDE 27

Acknowledgments

Adviser: Dr. Munindar P. Singh
Committee: Drs. Jon Doyle, Will Enck, Chris Mayhorn, Jessica Staddon, and Laurie Williams
Past and present collaborators

Engineering privacy and ethics in social applications with PK Murukannaiah and H Guo [IC 2016, AAMAS 2017, IC 2018, IJCAI 2018], and MB van Riemsdijk and P Pasotti
Reasoning about normative conflicts with J Jiang, R Chirkova, and J Doyle [IJCAI 2016; HotSoS 2016]
Sanctions and cybersecurity with H Du, BY Narron, S Al-Amin, S Goyal, E Berglund, and J Doyle [HotSoS 2015, ACySe 2015, SIMPAT 2018]
Norms and sociotechnical systems with Ö Kafalı [IS 2016, AAAI 2017]
Sanction typology with LG Nardin, T Balke-Visser, AK Kalia, and JS Sichman [KER 2016]
Trust and emotions with AK Kalia, KS Chan, JH Cho, and S Adalı [TRUST 2014]
Argumentation and secure service policies with CW Hang and SD Parsons [Computer 2017]
Analytic workflow with G Yuan, C Allred, PR Telang, and M Wilson [RCIS 2015]
Creativity, personality, crowdsourcing, and teamwork with PK Murukannaiah [RE 2016, RE 2017]
App review mining with VT Dhinakaran, R Pulle, and PK Murukannaiah [RE 2018], and H Guo and Z Zhang (ongoing)
Collective intelligence with AK Kalia, PK Murukannaiah, R Pandita, and H Du (ongoing)
Analysis of privacy news with K Sheshadri and J Staddon [PST 2017]
Preserving probe trajectory privacy with R Balu, B Xu, and M Stroila
Agile requirements evolution with S Ghaisas et al. [JSS 2013, MaRK 2013, MaRK 2011, MaRK 2010, RSSE 2010]

Labmates at Multiagent Systems and Service-Oriented Computing Lab Science of Security Lablet at North Carolina State University Laboratory for Analytic Sciences Family and friends

SLIDE 28

Appendix


SLIDE 29

Modeling Social Intelligence

Arnor: A Method to Model Social Intelligence

RQ Social intelligence: How can modeling social intelligence in a SIPA help it deliver a social experience that respects its stakeholders' privacy?

Goal modeling: identifying a SIPA's stakeholders, their goals, and plans
Context modeling: identifying the social contexts in which a SIPA's stakeholders interact; context helps in deciding which goals to bring about or plans to execute
Social expectation modeling: identifying norms and sanctions that govern stakeholders' goals and plans
Social experience modeling: identifying a SIPA's actions that improve social experience, i.e., choosing plans, goals, and norms

SLIDE 30

Modeling Social Intelligence

Evaluation: Developer Study

Participants: 30 developers
Mechanics: One factor; two alternatives
Two groups (Arnor and Xipho, a prior method), balanced on skills, developed Ringer SIPAs in six weeks (model, implement, test)
Metrics: Coverage and correctness; time and difficulty to develop
Study unit: Ringer SIPAs

[Figure: Ringer scenarios spanning places (Hunt, EB2, Carmichael, Oval, Lab), events (Seminar, Meeting, Party), relationships (friend, family, colleague, stranger), and ring modes (Loud, Vibrate, Silent)]

Result: Developers who follow Arnor find it easier to develop a SIPA and expend less time than those who follow Xipho

SLIDE 31

Modeling Social Intelligence

Evaluation: User Study (Simulations)

Developed Ringer SIPAs were simulated in varying adaptation scenarios: fixed norms; changing norms; changing context; changing sanctions

Metrics: Adaptability coverage and correctness; norm compliance; proportion of positive sanctions

Result: SIPAs developed using Arnor yield lower sanction proportions than SIPAs developed using Xipho (a previous approach)

SLIDE 32

Understanding Social Context

Interaction and Learning in Poros

RQ Context: How can SIPAs share deviation contexts and adapt to them, and learn contextually relevant norms?

Identify plans that satisfy goals
Select the plan that maximizes social experience
Perform action; reveal context
Observe action; receive revealed context
Receive sanction; evaluate action and sanction
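The interaction steps above can be sketched with minimal stub agents. The Agent class and its methods here are illustrative assumptions, not the Poros API; the learning rule (adopt an action as the norm on net-positive sanctions) is a deliberate simplification.

```python
# A minimal, self-contained sketch of the interaction loop listed above,
# using stub agents; all names are illustrative assumptions, not the Poros API.
class Agent:
    def __init__(self, name, norms=None):
        self.name = name
        self.norms = dict(norms or {})   # context -> action the agent expects

    def act(self, context, plans):
        # Select the plan expected to maximize social experience; this stub
        # prefers a known normative action and falls back to the first plan.
        return self.norms.get(context, plans[0])

    def sanction(self, context, action):
        # Observers sanction a perceived deviation negatively; a revealed
        # deviation context could justify the deviation (omitted here).
        expected = self.norms.get(context)
        return 1 if expected is None or action == expected else -1

    def evaluate(self, context, action, sanctions):
        # Learn a contextually relevant norm from net-positive feedback
        if sum(sanctions) > 0:
            self.norms[context] = action

# One interaction step: an actor acts, observers sanction, the actor learns
actor = Agent("frank")
observers = [Agent("andrew", {"library": "silent"}),
             Agent("hope", {"library": "silent"})]
action = actor.act("library", ["silent", "loud"])               # perform action
sanctions = [o.sanction("library", action) for o in observers]  # receive sanctions
actor.evaluate("library", action, sanctions)                    # actor learns the library norm
```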

SLIDE 33

Understanding Social Context

Evaluation: The Ringer Environment

Places: Emergency Room (ER), Home (H), Library (L), Meeting (M), Party (P)
Relationships: Family, Friend, Coworker, Stranger

Agent societies: Pragmatic, Considerate, Selfish
Agent types: Fixed, Sanctioning, Poros

SLIDE 34

Understanding Social Context

Evaluation: Social Simulations

Metrics:
Social cohesion measures the proportion of agents that perceive actions as norm compliant; the higher the social cohesion, the lower the number of negative sanctions
Social experience measures the goal satisfaction delivered by an agent (computed by aggregating payoffs for all stakeholders)

Results:
Pragmatic society: Social cohesion and social experience offered by Poros agents are significantly better than those offered by Fixed and Sanctioning agents
Considerate society: Average social experience drops for Sanctioning and Poros agents after they have gained enough confidence
Selfish society: Plots are similar to those in the experiment with pragmatic agent societies, but with slightly lower stabilized values
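The two metrics can be sketched as follows. The input formats (a list of per-observer compliance judgments, and a list of stakeholder payoffs) and the choice of summation as the aggregate are illustrative assumptions.

```python
# A sketch of the two simulation metrics defined above; input formats and
# the summation aggregate are illustrative assumptions.
def social_cohesion(perceptions: list[bool]) -> float:
    """Proportion of observing agents that perceive the action as norm compliant."""
    return sum(perceptions) / len(perceptions) if perceptions else 1.0

def social_experience(payoffs: list[float]) -> float:
    """Goal satisfaction delivered by an agent, aggregated over all stakeholders."""
    return sum(payoffs)

cohesion = social_cohesion([True, True, True, False])  # 0.75
```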

SLIDE 35

Understanding Social Context

Experiments on Pragmatic Agents (Varying Network Types)

[Figure: Experience payoff over 3,000 steps for Poros, Sanctioning, and Fixed agents across Large-Dense, Large-Sparse, Small-Dense, and Small-Sparse networks]

Social cohesion and social experience offered by Poros agents are significantly better than those offered by Fixed and Sanctioning agents

SLIDE 36

Understanding Social Context

Experiments on Considerate and Selfish Agents

[Figure: Experience payoff over 3,000 steps for Poros and Sanctioning agents in Considerate and Selfish societies]

The average social experience drops for considerate Sanctioning and Poros agents after they have gained enough confidence Plots for selfish agents are similar to those in the experiment with pragmatic agents, but with slightly lower stabilized values

SLIDE 37

Understanding Value Preferences

VIKOR Summary

1. Determine the best and worst numeric payoffs, f*_x and f-_x, for each value preference x over the alternative actions y to bring about a goal. That is, f*_x = max_y f_xy and f-_x = min_y f_xy.

2. For each alternative action y, compute the weighted and normalized Manhattan distance [Opricovic and Tzeng, 2004]: S_y = Σ_x w_x (f*_x - f_xy) / (f*_x - f-_x), where w_x is the weight for value preference x, subject to a stakeholder's context and preferences over values; a term is taken as 0 when f*_x = f-_x.

3. Compute the weighted and normalized Chebyshev distance [Krause, 1973]: R_y = max_x [w_x (f*_x - f_xy) / (f*_x - f-_x)], where w_x is the weight for value preference x.

4. Compute Q_y = k (S_y - S*) / (S- - S*) + (1 - k) (R_y - R*) / (R- - R*), where S* = min_y S_y, S- = max_y S_y, R* = min_y R_y, R- = max_y R_y, and k is the weight of the strategy toward maximum group or individual experience. We set k = 0.5 to select a consensus policy.

5. Rank the alternative actions by S, R, and Q in increasing order, yielding three ranked lists of actions.

6. Choose the alternative with the minimum Q as the compromise solution if it is better than the second-best alternative by a certain threshold, or if it is also ranked best by S and R.

SLIDE 38

Understanding Value Preferences

VIKOR Calculations

[Table: VIKOR calculations for policy alternatives y1 (share with all), y2 (share with common friends), and y3 (share with Andrew), scoring Frank's and Hope's values (pleasure, privacy, recognition, safety) and the resulting S_y, R_y, and Q_y; weights w_x = 1 except w_Hope-privacy = 3; k = 0.5]

SLIDE 39

Understanding Value Preferences

Places in the Simulation

Place                          Safe  Sensitive
Attending graduation ceremony  –     No
Presenting a conference paper  –     No
Studying in library            Yes   –
Visiting airport               Yes   –
Hiking at night                No    –
Being stuck in a hurricane     No    –
Visiting a bar with fake ID    –     Yes
Visiting a drug rehab center   –     Yes

SLIDE 40

Understanding Value Preferences

Example Numeric Utility Matrix for a Stakeholder

[Table: Example numeric utilities for a stakeholder by place, companion, and policy, scoring the values pleasure, privacy, recognition, and security across eight scenarios (graduation, conference, library, airport, hiking, hurricane, bar, rehab)]

SLIDE 41

Understanding Value Preferences

Comparing Social Experience and Fairness for Mixed Privacy Attitudes

Strategy        Mean   Best   Worst  Fairness  p
S_Ainur         1.361  1.715  0.767  1.05      –
S_primary       1.286  1.789  0.579  0.83      <0.01
S_conservative  1.106  1.721  0.472  0.80      <0.01
S_majority      1.339  1.836  0.570  0.78      <0.01

SLIDE 42

Understanding Value Preferences

Comparing Social Experience and Fairness for Majority Privacy Attitudes

                Fundamentalist             Pragmatist                 Unconcerned
Strategy        M.    B.    W.    F.       M.    B.    W.    F.       M.    B.    W.    F.
S_Ainur         1.535 1.664 1.233 2.27     1.329 1.531 0.867 1.51     1.242 1.457 0.768 1.45
S_primary       1.506 1.766 1.082 1.46     1.253 1.592 0.679 1.10     1.129 1.466 0.584 1.13
S_conservative  1.366 1.745 1.059 1.46     1.093 1.519 0.608 1.10     0.870 1.338 0.454 1.34
S_majority      1.551 1.858 1.007 1.18     1.318 1.699 0.575 0.89     1.176 1.534 0.518 0.98

SLIDE 43

Understanding Value Preferences

Location Sharing Survey: Policy Selection

[Survey grid: For each check-in companion (alone, colleague, friend, family member, crowd), select a policy: share with all, common friends, companions, or no one]