CM30174 + CM50206 Intelligent Agents: Reputation (Marina De Vos, Julian Padget)



SLIDE 1

Overview Context Trust Reputation

CM30174 + CM50206 Intelligent Agents

Marina De Vos, Julian Padget East building: x5053, x6971

Reputation

November 8, 2011

De Vos/Padget (Bath/CS) CM30174/CM50206 November 8, 2011 1 / 45

SLIDE 2

Authors/Credits for this lecture

Multiagent Systems, Chapter 9 “Trust and Reputation in Multiagent Systems” [Sabater-Mir and Vercouter, 2012]

SLIDE 3

Content

1. Overview
2. Context
3. Trust: trust models; trust for agents; sources of trust
4. Reputation: reputation models; implementation and use; drawbacks of reputation

SLIDE 4

Content

1. Overview
2. Context
3. Trust
4. Reputation

SLIDE 5

Introduction

Autonomy, reactiveness and proactivity change the “rules of the game” for software systems. A component may not do what you tell it: it is “self-interested” because it has goals and an agenda. As with human societies, mechanisms are needed to provide some degree of control and inhibit anarchy. Solutions?

• computational security – technical artifacts
• normative systems – constraints on behaviour
• trust and reputation – behavioural influence

What is the computational representation of trust and reputation?

SLIDE 6

Plan

• Context
• Trust, then reputation
• Roles of trustor and trustee: Ta→b and Ra→b
• Trust = direct experience
• Reputation = collective experience
• Clear separation of concepts
• How to build computational model(s)
• How to integrate with agent decision-making
• Analyse shortcomings

SLIDE 7

Content

1. Overview
2. Context
3. Trust
4. Reputation

SLIDE 8

What’s the problem?

How can trust or reputation be relevant to software? Likewise negotiation or argumentation. But consider the agent characteristics:

• Reactive: on-going interaction with the environment, responding to changes that occur in it (in time for the response to be useful)
• Pro-active: generates and attempts to achieve goals
• Social: able to interact with other agents (and possibly humans) via some kind of agent-communication language, and perhaps cooperate with others

These characteristics make conventional approaches to system security obsolete.

SLIDE 9

Social Ability

The real world is a multi-agent environment: we cannot go around attempting to achieve goals without taking others into account. Some goals can only be achieved with the cooperation of others. This suggests a need for:

• Information/models of other agents’ state
• Trust metrics
• Reputation models (e.g. FOAF)

Social ability in agents is the ability to interact with other agents (and possibly humans) via some kind of agent-communication language, and perhaps cooperate with others.

SLIDE 10

Norms

Meaning: “a standard or pattern of social behaviour that is accepted in or expected of a group” (OED); a way to specify (in)correct behaviour in a given context.

Agent characteristics ⇒ an agent can observe or ignore group expectations. Why should it observe? Why should it ignore? Punish! Incentivise!

But how are norms enforced? Or, who is going to enforce them?

• Individuals with enforcement powers? “police” agents
• Groups with enforcement powers? “peer” pressure
• Trust + reputation? “soft” security

SLIDE 11

Soft security

Tradeoff: protection vs. restriction. We cannot prevent all undesirable events, but we can adapt the system to reduce or prevent their future incidence. Trust and reputation do this for humans. Two aspects:

• local: integrate into agent decision-making
• global: social control mechanism

Agents observe each other, and fear of ostracism is a deterrent. This requires a representation of social evaluation.

SLIDE 12

Content

1. Overview
2. Context
3. Trust: trust models; trust for agents; sources of trust
4. Reputation

SLIDE 13

Computational representation

Different formalisms suit different styles of agent reasoning. Tradeoff: simplicity vs. expressiveness. Simplicity brings:

• Ease of calculation
• Loss of information
• Limitation of reasoning

Expressiveness... the converse.

SLIDE 14

Boolean

• TRUE ⇒ trustee is trustworthy
• FALSE ⇒ not
• Rarely used: trust (and reputation) is best graded

SLIDE 15

Numerical

Real or integer value:

• Trust in agent X is 0.4
• Reputation of agent Y is -1

Most common representation:

• [−1, 1]: ReGreT [Sabater, 2003]
• [0, 1]: [Sen and Sajja, 2002]

Indicates degree of trust (distrust) or good (bad) reputation. Permits absolute and relative comparison. Semantics: does/should 0 ⇒ neutral, etc.? Interpretation is typically subjective, which creates ambiguity if agents communicate values.
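As an illustration, a numerical trust value in [−1, 1] might be held in a small wrapper that clamps out-of-range inputs; the class and method names here are hypothetical sketches, not from ReGreT or any other published model.

```python
from dataclasses import dataclass

@dataclass
class TrustValue:
    """A numerical trust value, clamped to the range [-1, 1]."""
    value: float

    def __post_init__(self):
        # Keep the value inside the representation's range.
        self.value = max(-1.0, min(1.0, self.value))

    def is_positive(self) -> bool:
        # Subjective semantics: here we choose to treat 0 as neutral.
        return self.value > 0.0

print(TrustValue(0.4).is_positive())   # True
print(TrustValue(-1.7).value)          # -1.0 (clamped)
```

Note that the choice of 0 as the neutral point is exactly the kind of subjective interpretation the slide warns about: another agent receiving this value may read it differently.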

SLIDE 16

Qualitative labels

Human assessments are typically imprecise:

• X’s reputation is very good
• I only trust Y so far...

Is numerical any better? Compare 0.6 and 0.7. A finite set of labels may (paradoxically) help precision:

{bad, neutral, good}

or

{very bad, bad, neutral, good, very good}

We lose fine-grained comparison but gain recognizable semantics that are human accessible. See [Abdul-Rahman and Hailes, 2000].
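The trade-off can be sketched by discretizing a numerical value onto the five-label set: nearby values collapse into one label, which loses comparison but gains readable semantics. The bin boundaries below are illustrative assumptions.

```python
LABELS = ["very bad", "bad", "neutral", "good", "very good"]

def to_label(value: float) -> str:
    """Map a trust value in [-1, 1] onto five equal-width label bins."""
    for label, upper in zip(LABELS, [-0.6, -0.2, 0.2, 0.6]):
        if value <= upper:
            return label
    return LABELS[-1]

print(to_label(0.0))    # neutral
print(to_label(0.65))   # very good
print(to_label(0.7))    # very good: 0.65 and 0.7 are no longer distinguishable
```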

SLIDE 17

Probability distribution and fuzzy sets

Some applications need to express expected collective behaviour:

• What is the probability that an agent will behave badly? ... a skewed distribution
• This agent is unpredictable ... a uniform distribution
• This agent is bipolar ... a double-peaked distribution

Similar effects can be achieved using fuzzy sets.

SLIDE 18

Trust and reputation as beliefs

Integrate with the agent architecture so the agent can reason: add trust to the BDI model as beliefs. Socio-cognitive theory [Castelfranchi and Falcone, 2010]: “an agent i trusts another agent j to do an action α with respect to a goal φ”. This connects with intentional systems + speech acts. Key observation: trust is relative

• to an action, and
• to a goal

The ForTrust model formalizes this as “occurrent trust”¹: OccTrust(i, j, α, φ) for trustor i + trustee j + action α + goal φ.

¹ trust that holds here and now

SLIDE 19

BDI + RepAge

RepAge uses probability distributions (discussed earlier). It uses LBC, a belief language and logic, and adds a belief predicate S to represent the community belief. If the reputation of agent j playing the role seller is

Rep(j, seller, [0.6, 0.1, 0.1, 0.1, 0.1])

where the probability distribution is over the finite set

{VBadProduct, BadProduct, OkProduct, GoodProduct, VGoodProduct}

then the belief set is:

S(buy(j), VBadProduct, 0.6, seller)
S(buy(j), BadProduct, 0.1, seller)
S(buy(j), OkProduct, 0.1, seller)
S(buy(j), GoodProduct, 0.1, seller)
S(buy(j), VGoodProduct, 0.1, seller)

and these beliefs are available for standard BDI reasoning.
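The expansion of a reputation distribution into S beliefs can be sketched as follows; the tuple encoding of the predicate is an illustrative assumption, not RepAge's own data structures.

```python
OUTCOMES = ["VBadProduct", "BadProduct", "OkProduct", "GoodProduct", "VGoodProduct"]

def rep_to_beliefs(agent, role, dist):
    """Expand Rep(agent, role, dist) into one S(buy(agent), outcome, p, role)
    belief per outcome of the finite set."""
    assert len(dist) == len(OUTCOMES) and abs(sum(dist) - 1.0) < 1e-9
    return [("S", f"buy({agent})", outcome, p, role)
            for outcome, p in zip(OUTCOMES, dist)]

for belief in rep_to_beliefs("j", "seller", [0.6, 0.1, 0.1, 0.1, 0.1]):
    print(belief)
```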

SLIDE 20

Reliability

The issue so far has been representation, but to what extent can the value be relied upon? Some models add a reliability measure ∈ R capturing:

• opinion count
• opinion variance (higher variance ⇒ less reliable)
• opinion recency
• opinion source credibility
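One way such a measure could combine the first three factors; the multiplicative combination and decay rate below are assumptions for illustration, not a published model.

```python
import statistics

def reliability(opinions, now, half_life=10.0):
    """opinions: list of (value, timestamp). Returns a score in [0, 1]:
    more opinions, lower variance and fresher opinions => more reliable."""
    if len(opinions) < 2:
        return 0.0                                     # one opinion tells us little
    values = [v for v, _ in opinions]
    count_factor = 1.0 - 1.0 / len(opinions)           # saturates towards 1
    variance_factor = 1.0 / (1.0 + statistics.pvariance(values))
    mean_age = sum(now - t for _, t in opinions) / len(opinions)
    recency_factor = 0.5 ** (mean_age / half_life)     # exponential age decay
    return count_factor * variance_factor * recency_factor

few = reliability([(0.8, 9), (0.7, 10)], now=10)
many = reliability([(0.8, 9), (0.7, 10), (0.8, 9), (0.7, 10)], now=10)
print(few < many)   # True: more consistent opinions raise reliability
```

Source credibility could be folded in the same way, as a per-opinion weight.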

SLIDE 21

Trusting and Acting

Accepted definition [Gambetta, 2000]: “Trust is the subjective probability by which an individual, A, expects that another individual, B, performs a given action on which its welfare depends”. This is an estimation, a prediction of the future (!), “an expectation about an uncertain behaviour”. [Castelfranchi and Falcone, 2010] identify a second attribute of “trust as an act”: “the decision and the act of relying on, counting on, depending on [the trustee]”. Thus, trust is

a mental state of the agent (an evaluation) and a “decision and intention based on that disposition”

which connects evaluation with decision-making

SLIDE 22

Dual nature of trust

[Diagram: the EPISTEMIC side (social information, direct experiences, communicated experiences, images, reputations) feeds trust evaluations; the MOTIVATIONAL side (self motivations, context) feeds trust decisions and actions.]

SLIDE 23

Trust evaluation

A summary of all the beliefs etc. known by the trustor:

1. image: an evaluative belief; it tells whether the target is good or bad with respect to a given behaviour. Image is the result of internal reasoning using:
   • Direct experiences: trustor ↔ trustee
   • Communicated experiences: trustor ← X1 ← ... ← Xn ↔ trustee (more inputs, but less reliable: likelihood of noise, risk of false reports)
   • Social information: position of the trustee in (agent) society
2. reputation: shared social evaluations
   • Not the same as communicated experiences
   • The collective view of the social entity about an individual
   • May/will differ from an individual’s view

SLIDE 24

Image formation

The trustor needs different images for different situations with respect to a trustee:

• context: description of the interaction
• satisfaction criteria
• information source

This relates image to the agent’s skills. Empirical evaluation of trust models indicates that self-interested agents should not share opinions naively, because false reports destabilize the model; communicated experience should be assessed against direct experience to identify reliable sources.

SLIDE 25

Statistical aggregation

Aim: to represent an image as a single value. A possible approach: t(a, c, td), where a is the trustee, c is the context, and td ∈ {vt, t, u, vu}. The trustor evaluates experiences with a in c using {vg, g, b, vb}; if most experiences are vg, then a is considered vt. Inputs are usually combined by averaging, but weighted for parameters such as:

• recency
• source credibility
• any other factors considered relevant
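The scheme above can be sketched as a recency-weighted average; the numeric encoding of {vg, g, b, vb}, the decay factor, and the cut-points onto {vt, t, u, vu} are all illustrative assumptions.

```python
RATING = {"vb": -1.0, "b": -0.33, "g": 0.33, "vg": 1.0}

def trust_degree(experiences, decay=0.9):
    """experiences: list of (rating, age) pairs, age in interactions ago.
    Recent experiences get weight decay**age; the weighted mean is then
    mapped onto {vu, u, t, vt}."""
    weights = [decay ** age for _, age in experiences]
    score = sum(RATING[r] * w for (r, _), w in zip(experiences, weights)) / sum(weights)
    for upper, degree in [(-0.5, "vu"), (0.0, "u"), (0.5, "t")]:
        if score <= upper:
            return degree
    return "vt"

# Mostly very good experiences => the trustee is considered very trustworthy.
print(trust_degree([("vg", 0), ("vg", 1), ("g", 2), ("vg", 3)]))   # vt
print(trust_degree([("vb", 0), ("vb", 1)]))                        # vu
```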

SLIDE 26

Logic of belief generation

Constitutive elements of trust (c.f. intentional systems):

• internal attribution: the trustee should have the intention to behave as expected
• external attribution: the trustee should be able to act as expected and to achieve the goal on which the trust decision is made

This is known as dispositional trust: trust subject to some conditions. For example:

DispTrust(Alice,                       % trustor
          Bob,                         % trustee
          inform(weather),             % action
          know(Alice, weather),        % goal
          asked(Alice, Bob, weather))  % condition

Alice trusts Bob to perform the action in order to achieve the goal when the condition is true.

SLIDE 27

Trust decision

Typically: compare the trust evaluation against a threshold. This risks being too crisp. Refinement: a trust threshold well above a distrust threshold, giving three outcomes: trust, distrust, uncertain. Thresholds can (should?) depend on the criticality of the decision.
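A minimal sketch of the two-threshold refinement; the default threshold values are assumptions, and a more critical decision raises them.

```python
def decide(trust_value, trust_th=0.5, distrust_th=-0.5):
    """Three-way trust decision instead of a single crisp cut-off."""
    assert trust_th > distrust_th
    if trust_value >= trust_th:
        return "trust"
    if trust_value <= distrust_th:
        return "distrust"
    return "uncertain"

print(decide(0.6))                 # trust
print(decide(0.6, trust_th=0.8))   # uncertain: a more critical decision
print(decide(-0.7))                # distrust
```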

SLIDE 28

Trust beliefs

(BDI) Trust decision ⇒ agent intention: the trustor intends to rely on the trustee. Dispositional trust ⊢ occurrent trust (ForTrust):

DispTrust(i, j, α, φ, κ) ∧ Choice_i Fφ ∧ Bel_i κ → OccTrust(i, j, α, φ)

Trustor i should decide to trust a trustee j to perform an action α in order to achieve a goal φ if agent i:

1. has the corresponding dispositional trust when the conditions κ hold
2. has the goal φ
3. believes that the conditions κ currently hold
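The three conditions read as a forward-chaining rule; the set-of-tuples knowledge base below is an illustrative encoding, not the ForTrust formalism itself.

```python
def occurrent_trust(disp_trusts, goals, beliefs):
    """disp_trusts: {(i, j, action, goal, condition)};
    goals: {(agent, goal)} the agent has chosen; beliefs: {(agent, condition)}.
    Derives OccTrust(i, j, action, goal) when all three premises hold."""
    return {(i, j, act, goal)
            for (i, j, act, goal, cond) in disp_trusts
            if (i, goal) in goals and (i, cond) in beliefs}

disp = {("Alice", "Bob", "inform(weather)", "know(weather)", "asked(weather)")}
print(occurrent_trust(disp,
                      goals={("Alice", "know(weather)")},
                      beliefs={("Alice", "asked(weather)")}))
# Without the belief that the condition holds, nothing is derived:
print(occurrent_trust(disp, goals={("Alice", "know(weather)")}, beliefs=set()))
```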

SLIDE 29

Diversity and commonality

• Different: formalisms, calculation, decision
• Common: experience gathering, trust evaluation, trust decision
• Not interoperable

SLIDE 30

Content

1. Overview
2. Context
3. Trust
4. Reputation: reputation models; implementation and use; drawbacks of reputation

SLIDE 31

Reputation

Many definitions mix and confuse trust and reputation.

DEFINITION: what a social entity says about an individual, where

• social entity = set of individuals + set of social relations; this makes reputation efficient + effective, since the group is responsible for the evaluation

and

• “says” indicates it may not be an individual belief, and emphasizes communication

Like trust, reputation is associated with a specific behaviour/property (of an agent).

SLIDE 32

Reputation evaluation

[Diagram: inputs to reputation evaluation: image (drawing on neighbourhood, role, third-party information and social information), communicated image, communicated third-party image, communicated reputation, inherited reputation, shared evaluation, shared voice.]

SLIDE 33

Reputation evaluation

Three sources:

• communicated/third-party image
• communicated reputation – from another social entity
• social information – position of the individual in the entity/ies

We look at each of these in more detail.

SLIDE 34

Communicated image

Aggregate the images held by members of the social entity. For a social entity of agents SEi = {a1, ..., an}, n large:

reputation(SEi, ak) = ⊕_{j=1..n, j≠k} image(aj, ak)

The transition from communicated image to reputation requires:

1. that the evaluation is communicated (easy)
2. that the individuals’ images are a good sample of the images of the social entity as a whole (hard)

Generalizing from a few images is unreliable, so some models include a reliability metric. Key point: image is not reputation.
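A sketch of the aggregation, taking the ⊕ operator to be an arithmetic mean; that choice, and the function shape, are assumptions for illustration.

```python
def reputation_from_images(images, entity, target):
    """images: {(holder, target): value}. Aggregate the members' images of
    `target`, excluding the target's own self-image."""
    vals = [images[(member, target)] for member in sorted(entity)
            if member != target and (member, target) in images]
    if not vals:
        raise ValueError("no images to generalize from")
    return sum(vals) / len(vals)

imgs = {("a1", "a4"): 0.8, ("a2", "a4"): 0.6, ("a3", "a4"): 1.0}
print(round(reputation_from_images(imgs, {"a1", "a2", "a3", "a4"}, "a4"), 3))
# With only a few images the estimate is an unreliable sample of SEi.
```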

SLIDE 35

Communicated reputation

Aggregate reputation from other social entities. For a collection of social entities SE1, ..., SEn:

reputation(SEi, ak) = ⊕_{j=1..n, j≠i} reputation(SEj, ak)

Communication is (necessarily) by individuals, but the value is from the social entity:

image(ai, aj) =? reputation(SE(ai), aj)

shared voice({a1, a2, a3}, a4) = image(a1, a4) ⊕ image(a2, a4) ⊕ image(a3, a4)

SLIDE 36

Inherited reputation

Acquired from third-party agents; depends on the role of the subject. For example:

• a CEO is probably good at business
• an employee of McDonald’s is ...
• an inhabitant of Kensington/Twerton is ...
• an article in Nature/The Sun is ...

Not widely used (yet) in computational systems, but a useful contribution alongside other information sources. In ReGreT [Sabater, 2003]:

• Role played: system reputation
• Social relations: neighbourhood reputation

SLIDE 37

ReGreT model

Uses a weighted mean to aggregate sources:

R_{a→b}(φ) = Σ_{i ∈ {W,N,S,D}} ξ_i × R^i_{a→b}(φ)

where the weights take account of the reliability (RL) of each source, with a preference ordering on the sources:

• W = witness: ξ_W = RL^W_{a→b}(φ)
• N = neighbourhood: ξ_N = RL^N_{a→b}(φ) × (1 − ξ_W)
• S = system: ξ_S = RL^S_{a→b}(φ) × (1 − ξ_W − ξ_N)
• D = default: ξ_D = 1 − ξ_W − ξ_N − ξ_S

This fixed heuristic is unsuited to changing environments.
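The weight cascade can be transcribed directly; the input source values and reliabilities below are made up for illustration.

```python
def regret_reputation(sources, reliabilities, default=0.0):
    """sources: reputation value per source in {'W', 'N', 'S'};
    reliabilities: RL per source. Witness evidence is preferred over
    neighbourhood, which is preferred over system; the default value
    soaks up whatever weight remains."""
    xi_w = reliabilities["W"]
    xi_n = reliabilities["N"] * (1 - xi_w)
    xi_s = reliabilities["S"] * (1 - xi_w - xi_n)
    xi_d = 1 - xi_w - xi_n - xi_s          # the four weights sum to 1
    return (xi_w * sources["W"] + xi_n * sources["N"]
            + xi_s * sources["S"] + xi_d * default)

r = regret_reputation(sources={"W": 0.8, "N": 0.5, "S": 0.2},
                      reliabilities={"W": 0.6, "N": 0.5, "S": 0.5})
print(round(r, 3))   # 0.6
```

Note how the cascade encodes the fixed preference ordering: if witness reliability is 1, the other sources contribute nothing, which is exactly the inflexibility the slide criticizes.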

SLIDE 38

Centralized approaches

Pros:

• Reputation accuracy: uses information from all individuals
• Low sensitivity to bad data
• Public values: new participants can benefit straightaway

Cons:

• Must trust the service to be correct
• Bottleneck + vulnerability
• Security + attributability of data in the repository

Examples: eBay, Amazon, etc. These combine a rating with a comment, have a large user count but a low repeat-transaction count, and use communicated images for reputation.

SLIDE 39

Decentralized approaches

Depends on individual information.

Pros:

• No need to trust external services
• Suitable for scalable systems

Cons:

• Hard to obtain enough information for a reliable reputation value
• Hard for new participants
• Puts a greater burden on agents

Several multiagent models: ReGreT, Travos, FIRE.

SLIDE 40

Use of reputation

1. As a source of trust, when there is a shortage of direct experience
2. For social order: maintain or enforce “normal” behaviour
   • Acceptable behaviour ⇒ repeat/increased interaction
   • Unacceptable behaviour ⇒ ostracism and exclusion
SLIDE 41

Weaknesses of reputation I

Reputation is susceptible to a range of attacks from malicious agents, and defences can cause false positives.

1. Unfair ratings: giving false feedback. Defence: weight in favour of previously reliable informers
2. Ballot-stuffing: more feedback than interactions. Defences: filter feedback from suspect agents; count feedback per interaction instead of accumulated feedback
3. Dynamic personality: generate a good reputation through low-value, high-quality services, then offer high-value services. Defences: a fixed memory transaction window, or a variable (shorter when reputation falls) memory transaction window
4. Whitewashing: use identity change to lose a bad reputation
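The fixed memory-transaction-window defence against dynamic personality can be sketched with a bounded deque; the window size is an assumption.

```python
from collections import deque

class WindowedReputation:
    """Only the last `window` transactions count towards reputation, so
    goodwill banked through many low-value interactions eventually expires."""
    def __init__(self, window=5):
        self.ratings = deque(maxlen=window)   # old ratings drop off automatically

    def record(self, rating):
        self.ratings.append(rating)

    def value(self):
        return sum(self.ratings) / len(self.ratings) if self.ratings else 0.0

rep = WindowedReputation(window=5)
for _ in range(20):
    rep.record(1.0)    # long run of good, low-value transactions
for _ in range(5):
    rep.record(-1.0)   # then misbehaviour on high-value ones
print(rep.value())     # -1.0: the good history has already expired
```

The variable-window variant mentioned above would shrink `window` as the reputation value falls, making recovery of an undeserved reputation even harder.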

SLIDE 42

Weaknesses of reputation II

Attacks continued:

5. Collusion: enhances the other kinds of attack
6. Sybil attacks: use multiple identities to improve reputation through artificial feedback
7. Reputation lag exploitation: use reputation inertia to behave badly while still holding a good reputation, then behave well to recover from the bad reputation. Defences: make the reputation system more responsive; give agents the means to identify behavioural patterns

SLIDE 43

Summary

• Why we need analogues of trust and reputation
• Separation of trust and reputation
• Computational models of these concepts
• Analysis of the risks of reputation

SLIDE 44

References I

Abdul-Rahman, A. and Hailes, S. (2000). Supporting trust in virtual communities. In Proceedings of the 33rd Annual Hawaii International Conference on System Sciences, vol. 1, 9 pp.

Castelfranchi, C. and Falcone, R. (2010). Trust Theory: A Socio-Cognitive and Computational Model. Wiley Series in Agent Technology. John Wiley & Sons Ltd.

Gambetta, D. (2000). Can we trust trust? In Gambetta, D., editor, Trust: Making and Breaking Cooperative Relations, chapter 13, pages 213–237. Department of Sociology, University of Oxford. Electronic edition, http://www.sociology.ox.ac.uk/papers/gambetta213-237.pdf, retrieved 2011-11-07.

Sabater, J. (2003). Trust and reputation for agent societies. Technical Report 20, Monografies de l’Institut d’Investigació en Intel·ligència Artificial.

SLIDE 45

References II

Sabater-Mir, J. and Vercouter, L. (2012). Trust and reputation in multiagent systems. In Multiagent Systems, chapter 9. MIT Press, 2nd edition. In draft.

Sen, S. and Sajja, N. (2002). Robustness of reputation-based trust: boolean case. In Proceedings of the First International Joint Conference on Autonomous Agents and Multiagent Systems, AAMAS ’02, pages 288–293, New York, NY, USA. ACM.
