A few experiences on the use of trust for social control - Laurent Vercouter - PowerPoint PPT Presentation



SLIDE 1

A few experiences on the use of trust for social control

Laurent Vercouter Laurent.vercouter@insa-rouen.fr

SLIDE 2
  • What is the utility of a social concept for self-organizing systems ?
  • What is required for its implementation ?
  • How can/should it be adapted to specific application contexts ?

3 examples using trust for social control

P2P networks, sensor networks, social networks

SLIDE 3

1st case : Social control for P2P

  • Observation = messages
  • Expectations = norms + violation detection
  • Evaluation = trust calculation, gossiping, ...
  • Sanction = re-wiring, ostracism

[Figure: social control loop - Observation → Expectation → Evaluation → Sanction]

* Vercouter & Muller, « L.I.A.R.: achieving social control in open and decentralized multiagent systems », Applied Artificial Intelligence, 24:723-768, September 2010
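The four-step loop on this slide (observe messages, check them against norms, evaluate trust, sanction by re-wiring/ostracism) can be sketched as a minimal controller. This is an illustrative sketch only: the class, the exponential trust update, and the thresholds are assumptions for the example and are not the actual L.I.A.R. algorithm (gossiping of trust values is omitted).

```python
# Hypothetical sketch of the social control loop for a P2P node.
# Not the L.I.A.R. algorithm: update rule and thresholds are invented.

class PeerController:
    def __init__(self, norms):
        self.norms = norms          # list of predicates over messages
        self.trust = {}             # peer id -> trust value in [0, 1]

    def observe(self, peer, message):
        """Observation + expectation: detect norm violations."""
        violated = any(not norm(message) for norm in self.norms)
        self.evaluate(peer, violated)

    def evaluate(self, peer, violated, rate=0.5):
        """Evaluation: simple exponential trust update
        (gossiped reputation exchange is omitted here)."""
        t = self.trust.get(peer, 0.5)
        target = 0.0 if violated else 1.0
        self.trust[peer] = t + rate * (target - t)

    def sanction(self, neighbours, threshold=0.3):
        """Sanction: re-wire away from distrusted peers (ostracism)."""
        return [p for p in neighbours if self.trust.get(p, 0.5) >= threshold]

# Usage: one compliant and one norm-violating peer.
ctrl = PeerController(norms=[lambda m: m.get("signed", False)])
ctrl.observe("peer-a", {"signed": True})
ctrl.observe("peer-b", {"signed": False})
kept = ctrl.sanction(["peer-a", "peer-b"])
```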

SLIDE 4

2nd case : Privacy preservation in a social network

[Figure: social control loop - Observation → Expectation → Evaluation → Sanction]

  • Users are connected in a social network and exchange information that may be private or sensitive
  • Privacy preservation is an issue
  • Users should be assisted to protect their own privacy and to prevent them from violating others' privacy
SLIDE 5

2nd case : Privacy preservation

« We have a right to privacy, but that is neither a right to control personal information nor a right to have access to this information restricted. Instead, it is a right to live in a world in which our expectations about the flow of personal information are, for the most part, met; (...) This is the right I have called contextual integrity, achieved through the harmonious balance of social rules, or norms, with both local or general values, ends and purposes. » (H. Nissenbaum)
SLIDE 6

Contextual integrity for social networks

Mapping for Privacy Enforcement Agents (PEA)

1) Transmission context matches the nature of information
2) Recipient is part of the transmission context
3) Agents do not have incompatible relationships with the target
4) Privacy policies defined by the information subjects are satisfied

Organisational model, social relation model, trust model
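The four contextual integrity conditions above can be read as a conjunction of checks a Privacy Enforcement Agent performs before letting information flow. The sketch below illustrates that reading; the data model (`context`, `info`, policy callables) is assumed for the example and is not the PrivaCIAS data model.

```python
# Illustrative PEA check of the four contextual integrity conditions.
# The dict-based data model is an assumption for this example.

def contextual_integrity_ok(info, sender, recipient, context, policies):
    # 1) transmission context matches the nature of the information
    if info["nature"] not in context["allowed_natures"]:
        return False
    # 2) the recipient is part of the transmission context
    if recipient not in context["members"]:
        return False
    # 3) no incompatible relationship with the information's target
    if (recipient, info["target"]) in context["incompatible_relations"]:
        return False
    # 4) privacy policies of the information subjects are satisfied
    return all(policy(info, sender, recipient) for policy in policies)

# Usage: a medical context where the subject forbids forwarding
# to her employer (hypothetical names).
context = {
    "allowed_natures": {"medical"},
    "members": {"doctor", "nurse"},
    "incompatible_relations": {("employer", "alice")},
}
info = {"nature": "medical", "target": "alice"}
no_forward = lambda info, s, r: r != "employer"   # subject's policy

ok = contextual_integrity_ok(info, "doctor", "nurse", context, [no_forward])
blocked = contextual_integrity_ok(info, "doctor", "employer", context, [no_forward])
```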

SLIDE 7

Contextual integrity for social networks

Obstacle, problems, opportunities...

  • Expectation :
  • Organisational/social models are only approximations
  • Privacy violations can be detected but should be confirmed
  • User involvement is essential to deal with subjectivity
  • Many conflicts appear
  • Sanction :
  • Users have to keep control
  • No automatic re-wiring
  • PEA assists users to prevent privacy violations
  • Explanations are essential
  • Trust = privacy preserving behavior of an entity

[Figure: social control loop - Observation → Expectation → Evaluation → Sanction, with open questions on the Expectation and Sanction steps]

* Krupa and Vercouter, « Handling privacy as contextual integrity in decentralized virtual communities: the PrivaCIAS framework », Web Intelligence and Agent Systems, 10(1):105-116, 2012

SLIDE 8

3rd case : Social control for sensor networks

MANETs :

  • Nodes with 2 roles : sensing and routing
  • Resource limitations (energy, memory, communication)

[Figure: social control loop - Observation → Expectation → Evaluation → Sanction, with open questions on the Expectation and Sanction steps]

Obstacle, problems, opportunities...

  • Observation :
  • Overhearing is an opportunity
  • No identity management !
  • Shift to consider trust in neighborhood
  • Evaluation :
  • Trust calculation
  • Almost binary (full trust or distrust)
  • Distrust → backup mode (sensing only) → contribute to a quarantine
  • Node mobility
  • A justification to integrate forgiveness

* Vercouter and Jamont, « Lightweight trusted routing for wireless sensor networks », Progress in AI, 1(2):193-202, 2012
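The points above (overhearing as observation, almost-binary trust, quarantine in backup mode, forgiveness motivated by node mobility) can be sketched as a tiny per-node trust table. The threshold-free binary flip and the forgiveness timer are assumptions for the illustration, not the parameters of the cited routing protocol.

```python
# Minimal sketch of almost-binary trust for a sensor node:
# an overheard forwarding failure flips a neighbour to distrust,
# distrusted neighbours are quarantined (sensing-only backup mode),
# and a forgiveness timer lets mobile nodes be reintegrated.
# Timer value is illustrative.

FORGIVENESS_PERIOD = 10   # ticks before a distrusted neighbour may retry

class NodeTrust:
    def __init__(self):
        self.distrusted = {}   # neighbour -> tick when distrust began

    def overhear(self, neighbour, forwarded_ok, tick):
        """Observation by overhearing: a failure triggers binary distrust."""
        if not forwarded_ok:
            self.distrusted[neighbour] = tick

    def forgive(self, tick):
        """Forgiveness: release neighbours after the quarantine period
        (mobility may have moved them out of the faulty situation)."""
        self.distrusted = {n: t for n, t in self.distrusted.items()
                           if tick - t < FORGIVENESS_PERIOD}

    def routing_candidates(self, neighbours):
        """Sanction: distrusted nodes are left in sensing-only backup mode."""
        return [n for n in neighbours if n not in self.distrusted]

# Usage: n2 drops a packet, is quarantined, then forgiven later.
nt = NodeTrust()
nt.overhear("n2", forwarded_ok=False, tick=0)
before = nt.routing_candidates(["n1", "n2"])   # n2 quarantined
nt.forgive(tick=12)
after = nt.routing_candidates(["n1", "n2"])    # n2 reintegrated
```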

SLIDE 9

What can we learn from these experiences ?

Can we find a « methodology » for mapping a social concept, such as trust, to technical systems ? Are there invariants ? Essential characteristics ?