SLIDE 1

Navigating Internet Neighborhoods: Reputation, Its Impact on Security, and How to Crowd-source It

Mingyan Liu

Department of Electrical Engineering and Computer Science University of Michigan, Ann Arbor, MI

November 6, 2013

SLIDE 2

Intro Motivation Security investment Crowd sourcing Environments Discussion Conclusion

Acknowledgment

Collaborators:

  • Parinaz Naghizadeh Ardabili
  • Yang Liu, Jing Zhang, Michael Bailey, Manish Karir

Funding from:

  • Department of Homeland Security (DHS)

Liu (Michigan) Network Reputation November 6, 2013 2 / 52

SLIDE 3

Threats to Internet security and availability

From unintentional to intentional, from random maliciousness to economically driven:

  • misconfiguration
  • mismanagement
  • botnets, worms, SPAM, DoS attacks, . . .

Typical operators’ countermeasures: filtering/blocking

  • within specific network services (e.g., e-mail)
  • within the domain name system (DNS)
  • based on source and destination (e.g., firewalls)
  • within the control plane (e.g., through routing policies)

SLIDE 4

Host Reputation Block Lists (RBLs)

Commonly used RBLs:

  • daily average volume (unique entries) ranging from 146M (BRBL) to 2K (PhishTank)

  RBL Type          RBL Names
  Spam              BRBL, CBL, SpamCop, WPBL, UCEPROTECT
  Phishing/Malware  SURBL, PhishTank, hpHosts
  Active attack     Darknet scanners list, DShield

SLIDE 5

Potential impact of RBLs

Figure: Hourly traffic at Merit Network and the fraction blocked/tainted, (a) by traffic volume (bytes) and (b) by number of flows.

NetFlow records of all traffic flows at Merit Network

  • at all peering edges of the network from 6/20/2012-6/26/2012
  • sampling ratio 1:1
  • 118.4TB traffic: 5.7B flows, 175B packets.

As much as 17% of overall traffic volume (and 30% of flows) is “tainted”

SLIDE 6

How reputation lists should be/are used

Strengthen defense:

  • filter configuration, blocking mechanisms, etc.

Strengthen security posture:

  • get hosts off the list
  • install security patches, update software, etc.

Retaliation for being listed:

  • lost revenue for spammers
  • example: recent DDoS attacks against Spamhaus by Cyberbunker

Aggressive outbound filtering:

  • fixing the symptom rather than the cause
  • example: the country of Mexico

SLIDE 7

Limitations of host reputation lists

Host identities can be highly transient:

  • dynamic IP address assignment
  • policies inevitably reactive, leading to significant false positives and misses
  • potential scalability issues

RBLs are application specific:

  • a host listed for spamming can initiate a different attack

Lack of standards and transparency in how they are generated:

  • not publicly available: subscription-based, query-enabled

SLIDE 8

An alternative: network reputation

Define the notion of “reputation” for a network (suitably defined) rather than for individual hosts.

A network is typically governed by consistent policies:

  • changes in system administration occur on a much larger time scale
  • changes in resources and expertise occur on a larger time scale

Policies based on network reputation are proactive:

  • reputation reflects the security posture of the entire network, across all applications, and changes slowly over time

Enables risk-analytical approaches to security; a tradeoff between the benefits from and the risks of communication:

  • acts as a proxy for metrics/parameters otherwise unobservable

SLIDE 9

An illustration

Figure: Spatial aggregation of reputation (fraction of blacklisted IPs per AS, ranging from “BAD” to “GOOD” across ASes on a log scale)

  • Taking the union of 9 RBLs
  • % of addresses blacklisted within an autonomous system (est. total of 35–40K)

SLIDE 10

Many challenges to address

  • What is the appropriate level of aggregation?
  • How to obtain such an aggregated reputation measure, over time, space, and applications?
  • How to use these to design reputation-aware policies?
  • What effect does it have on the network’s behavior toward others and itself?
  • How to make the reputation measure an accurate representation of the quality of a network?

SLIDE 11

Outline of the talk

Impact of reputation on network behavior

  • Can the desire for good reputation (or the worry over bad reputation) positively alter a network’s investment decisions?
  • Within the context of an inter-dependent security (IDS) game: positive externality

Incentivizing input – crowd-sourcing reputation

  • Assume a certain level of aggregation
  • Each network possesses information about itself and others
  • Can we incentivize networks to participate in a collective effort to achieve accurate estimates/reputation assessment, while observing privacy and self-interest?

SLIDE 12

Interdependent Security Risks

  • Security investments of a network have positive externalities on other networks.
  • Networks’ preferences are in general heterogeneous:
    • heterogeneous costs
    • different valuations of security risks
  • Heterogeneity leads to under-investment and free-riding.

SLIDE 13

Network Security Investment Game

Originally proposed by [Jiang, Anantharam & Walrand, 2011]

  • A set of N networks.
  • Ni’s action: invest xi ≥ 0 in security, with increasing effectiveness.
  • Cost ci > 0 per unit of investment (heterogeneous).
  • fi(x): security risk/cost of Ni, where:
    • x is the vector of investments of all users
    • fi(·) is decreasing in each xj and convex
  • Ni chooses xi to minimize the cost function

      hi(x) := fi(x) + ci xi .

  • [Jiang, Anantharam & Walrand, 2011] analyzed the suboptimality of this game.

SLIDE 14

Example: a total effort model

A 2-player total-effort model: f1(x) = f2(x) = f(x1 + x2), with c1 = c2 = 1, so that h1(x) = f(x1 + x2) + x1 and h2(x) = f(x1 + x2) + x2.

  • Let x^o be the Nash Equilibrium (NE) and x* the Social Optimum (SO).
  • At NE: ∂hi/∂xi = f′(x^o_1 + x^o_2) + 1 = 0.
  • At SO: ∂(h1 + h2)/∂xi = 2 f′(x*_1 + x*_2) + 1 = 0.
  • By convexity of f(·): x^o_1 + x^o_2 ≤ x*_1 + x*_2 ⇒ under-investment.
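As a concrete instance of the two conditions above (the slide leaves f generic; f(y) = 2e^(−y) and c1 = c2 = 1 are assumptions of this sketch), both equilibria solve in closed form and exhibit the under-investment gap:

```python
import math

# Illustrative instance of the 2-player total-effort game.
# f(y) = 2*exp(-y), c1 = c2 = 1 are assumptions of this sketch.
def f_prime(y):
    return -2.0 * math.exp(-y)

# NE condition  f'(y_o) + 1 = 0       =>  y_o = ln 2
y_ne = math.log(2.0)
# SO condition  2 f'(y_star) + 1 = 0  =>  y_star = ln 4
y_so = math.log(4.0)

assert abs(f_prime(y_ne) + 1.0) < 1e-9
assert abs(2.0 * f_prime(y_so) + 1.0) < 1e-9
assert y_ne < y_so  # under-investment at the Nash equilibrium
```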

SLIDE 15

An illustration

Figure: Suboptimality gap (curves −f′(y) and −2f′(y) plotted against y := x1 + x2, marking y^NR, y^R, and y*)

SLIDE 16

The same game with reputation

The same model, with the addition:

  • Ni will be assigned a reputation based on its investment.
  • Valuation of reputation given by Ri(x): increasing and concave.
  • Ni chooses xi to minimize the cost function

hi(x) := fi(x) + cixi − Ri(x) .

SLIDE 17

The effect of reputation: the same example

One’s reputation only depends on one’s own investment: Ri(x) = Ri(xi).

  • R1(x) = k R2(x), k > 1: N1 values reputation more than N2.
  • h1(x) = f(x1 + x2) + x1 − R1(x1), h2(x) = f(x1 + x2) + x2 − R2(x2).
  • At NE: ∂hi/∂xi = f′(x^R_1 + x^R_2) + 1 − R′_i(x^R_i) = 0.
  • R′_1(x^R_1) = R′_2(x^R_2), and thus x^R_1 > x^R_2 ⇒ the one who values reputation more invests more.
  • By convexity of f(·): x^o_1 + x^o_2 ≤ x^R_1 + x^R_2 ⇒ the networks collectively invest more in security, decreasing the suboptimality gap.
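Continuing the same illustrative example (f(y) = 2e^(−y), c_i = 1) with an assumed concave reputation valuation R_i(x_i) = k_i ln(1 + x_i), k_1 > k_2 (all numbers are assumptions of this sketch, not from the talk), a small bisection recovers both claims:

```python
import math

# Assumed reputation weights, k_1 > k_2 (illustrative).
K1, K2 = 0.6, 0.3

def x_best(k, y):
    # FOC: f'(y) + 1 - R_i'(x_i) = 0  =>  k / (1 + x_i) = 1 - 2*exp(-y)
    return k / (1.0 - 2.0 * math.exp(-y)) - 1.0

def gap(y):
    # consistency condition x_1 + x_2 = y at equilibrium
    return x_best(K1, y) + x_best(K2, y) - y

lo, hi = math.log(2.0) + 1e-6, 1.0497   # gap(lo) > 0 > gap(hi); gap is decreasing
for _ in range(80):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if gap(mid) > 0 else (lo, mid)
y_R = 0.5 * (lo + hi)

y_NE = math.log(2.0)                      # equilibrium total investment without reputation
assert y_R > y_NE                         # reputation raises total investment
assert x_best(K1, y_R) > x_best(K2, y_R)  # the network valuing reputation more invests more
```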

SLIDE 18

An illustration

Figure: Driving equilibrium investments towards the social optimum (the same curves −f′(y) and −2f′(y) versus y := x1 + x2, with the shifted level 2(1 − R′) marking the reputation-augmented equilibrium y^R between y^NR and y*)

SLIDE 19

Digress for a moment: can we completely close the gap?

Short answer: yes, through mechanism design. However:

  • No voluntary participation:
    • an individual may be better off opting out than participating in the mechanism, given all others participate.
  • Key information in such models is missing in reality:
    • for instance, the risk functions fi(·)
    • another example: how to monitor/enforce the investment levels
    • information asymmetry in the security eco-system

Challenge and goal: have network reputation serve as a proxy for the unobservable.

SLIDE 20

Outline of the talk

Impact of reputation on network behavior

  • Can the desire for good reputation (or the worry over bad reputation) positively alter a network’s investment decisions?
  • Within the context of an inter-dependent security (IDS) game: positive externality

Incentivizing input – crowd-sourcing reputation

  • Assume a certain level of aggregation
  • Each network possesses information about itself and others
  • Can we incentivize networks to participate in a collective effort to achieve accurate estimates/reputation assessment, while observing privacy and self-interest?

SLIDE 21

Crowd-sourcing reputation

Basic setting:

  • A distributed multi-agent system.
  • Each agent has perceptions or beliefs about other agents.
  • The truth about each agent is known only to itself.
  • Each agent wishes to obtain the truth about others.
  • Goal: construct mechanisms that incentivize agents to participate in a collective effort to arrive at correct perceptions.

Key design challenges:

  • Participation must be voluntary.
  • Individuals may not report truthfully even if they participate.
  • Individuals may collude.

SLIDE 22

Other applicable contexts and related work

Online review/recommendation systems:

  • Example: Amazon, eBay
  • Users (e.g., sellers and buyers) rate each other

Reputation in P2P systems:

  • Sustaining cooperative behavior among self-interested individuals.
  • User participation is a given; usually perfect observation.

Elicitation and prediction mechanisms:

  • Used to quantify the performance of forecasters; rely on observable, objective ground truth.
  • Users do not attach value to the realization of the event or the outcome built by the elicitor.

SLIDE 23

The Model

  • K inter-connected networks, N1, N2, · · · , NK.
  • Network Ni’s overall quality or health condition is described by a scalar rii ∈ [0, 1]: the true or real quality of Ni.
  • A central reputation system collects input from each Ni and computes a reputation index r̂_i, the estimated quality.

SLIDE 24

Main Assumptions

  • Ni knows rii precisely, but this is its private information.
  • Ni can sufficiently monitor inbound traffic from Nj to form an estimate Rij of rjj.
  • Ni’s observation is in general incomplete and may contain noise/errors: Rij ∼ N(µij, σ²ij).
  • This distribution is known to network Nj, while Ni itself may or may not be aware of it.
  • The reputation system may have independent observations R0i for all i.
  • The reputation mechanism is common knowledge.

SLIDE 25

Designing the mechanism

  • Goal: the solution to the centralized problem, in an informationally decentralized system.
  • Choice parameters of the mechanism are:
    • Message space M: inputs requested from agents.
    • Outcome function h(·): a rule according to which the input messages are mapped to outcomes.
  • Other desirable features: budget balance and individual rationality.

SLIDE 26

The centralized problem

Systems’ Objective

Minimize the estimation error for all networks. Two possible ways of defining a reputation index:

  • Absolute index r̂_i^A: an estimate of rii.
  • Relative index r̂_i^R: given true qualities rii, r̂_i^R = rii / Σ_k rkk.

      min Σ_i |r̂_i^A − rii|    or    min Σ_i |r̂_i^R − rii / Σ_k rkk|

If the system had full information about all parameters: r̂_i^A = rii and r̂_i^R = rii / Σ_k rkk.

SLIDE 27

In a decentralized system

Ni’s Objective

The truth element (security): accurate estimates r̂_j of the networks Nj other than itself:

      I_i = −Σ_{j≠i} f_i(|r̂_j^A − rjj|)    or    I_i = −Σ_{j≠i} f_i(|r̂_j^R − rjj / Σ_k rkk|)

where the f_i(·) are increasing and convex.

The image element (reachability): a high reputation r̂_i for itself:

      II_i = g_i(r̂_i^A)    or    II_i = g_i(r̂_i^R)

where the g_i(·) are increasing and concave.

SLIDE 28

Different types of networks

  • Truth type: dominated by security concerns, e.g., DoD networks, a buyer on Amazon.
  • Image type: dominated by reachability/traffic-attraction concerns, e.g., a blog hosting site, a phishing site, a seller on Amazon.
  • Mixed type: a legitimate, non-malicious network; its preference is in general increasing in the accuracy of others’ and its own quality estimates:

      u_i = −λ Σ_{j≠i} f_i(|r̂_j^A − rjj|) + (1 − λ) g_i(r̂_i^A)

  • A homogeneous vs. a heterogeneous environment.

SLIDE 29

Reputation mechanisms

Design a simple mechanism for each type of environment and investigate its incentive features.

  • Possible forms of input:
    • cross-reports Xij, j ≠ i: Ni’s assessment of Nj’s quality
    • self-reports Xii: a network’s self-advertised quality measure
  • The qualitative features of the preferences (increasing in truth and increasing in image) are public knowledge; the functions f_i(·), g_i(·) are private information.
  • Ni is an expected-utility maximizer due to incomplete information.
  • Assume external observations are unbiased.
  • If taxation is needed, the aggregate utility of Ni is defined as v_i := u_i − t_i.

SLIDE 30

Setting I: Truth types, absolute reputation

(Model I)   u_i = −Σ_{j≠i} f_i(|r̂_j^A − rjj|)

The absolute scoring (AS) mechanism:

  • Message space M: each user reports xii ∈ [0, 1].
  • Outcome function h(·):
    • The reputation system chooses r̂_i^A = xii.
    • Ni is charged a tax t_i given by:

      t_i = |xii − R0i|² − (1/(K−1)) Σ_{j≠i} |xjj − R0j|²
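A quick sketch of the AS tax rule (the qualities and observations below are illustrative random values, not data from the talk). Budget balance holds for any report profile, since each squared error enters once with weight +1 and K−1 times with weight −1/(K−1):

```python
import random

# Sketch of the AS tax rule with illustrative random inputs.
def as_taxes(reports, R0):
    K = len(reports)
    err = [(reports[i] - R0[i]) ** 2 for i in range(K)]
    total = sum(err)
    return [err[i] - (total - err[i]) / (K - 1) for i in range(K)]

random.seed(0)
K = 5
r = [random.random() for _ in range(K)]           # true qualities r_ii
R0 = [ri + random.gauss(0.0, 0.05) for ri in r]   # system's independent observations
t = as_taxes(r, R0)                               # truthful reports x_ii = r_ii
assert abs(sum(t)) < 1e-9                         # sum_i t_i = 0: budget balanced
```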

SLIDE 31

Properties of the AS mechanism

Rationale: assign reputation indices assuming truthful reports; ensure truthful reports by choosing the appropriate t_i.

  • Truth-telling is a dominant strategy in the induced game ⇒ achieves the centralized solution.
  • Σ_i t_i = 0 ⇒ budget balanced.
  • The mechanism is individually rational ⇒ voluntary participation.

SLIDE 32

Truth revelation under AS

Truth-telling is a dominant strategy in the game induced by the AS mechanism:

      E[v_i(xii, {Xjj}_{j≠i})] = −Σ_{j≠i} E[f_i(|r̂_j^A − rjj|)] − E[|xii − R0i|²] + (1/(K−1)) Σ_{j≠i} E[|Xjj − R0j|²]

  • xii affects only the 2nd term, and is thus chosen to minimize it.
  • By assumption, Ni knows R0i ∼ N(rii, σ²0i); thus the optimal choice is xii = rii.
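A Monte-Carlo check of this argument (the values of r_ii and σ are illustrative assumptions): E|x_ii − R_0i|², the only tax term x_ii controls, is indeed minimized at the truthful report when R_0i ∼ N(r_ii, σ²):

```python
import random

# Monte-Carlo check of the dominant-strategy argument; numbers are illustrative.
random.seed(1)
r_ii, sigma, N = 0.7, 0.1, 200_000
obs = [random.gauss(r_ii, sigma) for _ in range(N)]   # samples of R_0i

def expected_penalty(x):
    # empirical E|x - R_0i|^2
    return sum((x - o) ** 2 for o in obs) / N

truthful = expected_penalty(r_ii)
assert truthful < expected_penalty(r_ii + 0.05)   # over-reporting costs more
assert truthful < expected_penalty(r_ii - 0.05)   # under-reporting costs more
```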

SLIDE 33

Individual rationality under AS

The AS mechanism is individually rational.

  • Staying out: reserved utility given by −

j=i E(fi(|Rij − rjj|)).

  • Participating: expected utility −

j=i fi(0) at equilibrium.

  • fi(·) is increasing and convex, thus

E[fi(|Rij − rjj|)] ≥ fi(E(|Rij − rjj|)) = fi(

  • 2

πσij) > fi(0), ∀j = i.

  • The AS mechanism is individually rational.
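The step E|Rij − rjj| = √(2/π) σij is the mean of a folded zero-mean normal; a quick Monte-Carlo confirmation (σ is an illustrative assumption of this sketch):

```python
import math
import random

# Mean of |Z| for Z ~ N(0, sigma^2) is sqrt(2/pi) * sigma; sigma is illustrative.
random.seed(2)
sigma = 0.2
n = 200_000
mean_abs = sum(abs(random.gauss(0.0, sigma)) for _ in range(n)) / n
assert abs(mean_abs - math.sqrt(2.0 / math.pi) * sigma) < 0.01
```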

SLIDE 34

Extended-AS Mechanism

  • What if the system does not possess independent observations?
  • Use a random ring to gather cross-observations and assess taxes.
  • Ni is asked to report Xii, as well as Xi(i+1) and Xi(i+2).

SLIDE 35

Extended-AS Mechanism

  • Ni is charged two taxes:
    • on the inaccuracy of its self-report w.r.t. what Ni−1 says about Ni
    • on the inaccuracy of its cross-report on Ni+1 w.r.t. what Ni−1 says about Ni+1

      t_i = |xii − X(i−1)i|² − (1/(K−2)) Σ_{j≠i,i+1} |Xjj − X(j−1)j|²
          + |xi(i+1) − X(i−1)(i+1)|² − (1/(K−2)) Σ_{j≠i,i+1} |Xj(j+1) − X(j−1)(j+1)|²

  • Truthful self-reports are induced by the 1st taxation term.
  • Truthful cross-reports are induced by the 2nd taxation term.
  • Other associations are also possible: e.g., random sets.

Extended-AS achieves the centralized solution.
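A sketch of the ring taxes above with illustrative random reports (indices mod K). Budget balance follows because each squared deviation appears once with weight +1 and exactly K−2 times with weight −1/(K−2):

```python
import random

# Extended-AS ring taxes, illustrative random report matrix.
random.seed(3)
K = 6
X = [[random.random() for _ in range(K)] for _ in range(K)]  # X[i][j]: Ni's report on Nj

def self_dev(j):    # |X_jj - X_{(j-1)j}|^2
    return (X[j][j] - X[(j - 1) % K][j]) ** 2

def cross_dev(j):   # |X_{j(j+1)} - X_{(j-1)(j+1)}|^2
    return (X[j][(j + 1) % K] - X[(j - 1) % K][(j + 1) % K]) ** 2

def tax(i):
    others = [j for j in range(K) if j not in (i, (i + 1) % K)]
    return (self_dev(i) - sum(self_dev(j) for j in others) / (K - 2)
            + cross_dev(i) - sum(cross_dev(j) for j in others) / (K - 2))

assert abs(sum(tax(i) for i in range(K))) < 1e-9   # budget balanced
```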

SLIDE 36

Setting II: Truth types, relative reputation

(Model II)   u_i = −Σ_{j≠i} f_i(|r̂_j^R − rjj / Σ_k rkk|)

The fair ranking (FR) mechanism:

  • Message space M: each user reports xii ∈ [0, 1].
  • Outcome function h(·):
    • the system assigns r̂_i^R = xii / Σ_k xkk.
    • No taxation is used.

SLIDE 37

Properties of the FR mechanism

  • Truth-telling is a Bayesian Nash equilibrium in the induced game:

      u_i(xii, {rkk}_{k≠i}) = −Σ_{j≠i} f_i( |rjj (xii − rii)| / ((xii + Σ_{k≠i} rkk)(Σ_k rkk)) )

    ⇒ achieves the centralized solution xii = rii.
  • The mechanism is individually rational ⇒ voluntary participation.
  • Achievable without cross-observations from other networks, direct observations by the system, or taxation.
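The loss term in the utility above can be checked directly: with truthful opponents (and taking f_i as the identity on nonnegative arguments), Ni's estimation loss vanishes exactly at x_ii = r_ii. The quality vector below is an illustrative assumption:

```python
# FR with truthful opponents; illustrative true qualities.
r = [0.8, 0.5, 0.4]
i = 0
others = sum(r) - r[i]    # sum of the other networks' true qualities

def loss(x_ii):
    # Ni's loss term from the slide, with f_i = identity
    total_true = sum(r)
    return sum(r[j] * abs(x_ii - r[i]) / ((x_ii + others) * total_true)
               for j in range(len(r)) if j != i)

assert loss(r[i]) == 0.0                               # truth gives zero estimation loss
assert loss(r[i] + 0.1) > 0.0 and loss(r[i] - 0.1) > 0.0  # any deviation hurts
```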

SLIDE 38

Setting III: Mixed types, relative reputation

(Model III)   u_i = −Σ_{j≠i} f_i(|r̂_j^R − rjj / Σ_k rkk|) + g_i(r̂_i^R)

  • The individual’s objective is no longer aligned with the system objective.
  • A direct mechanism is possible depending on the specific forms of f_i(·) and g_i(·).

SLIDE 39

Setting IV: Mixed types, absolute reputation

(Model IV)   u_i = −Σ_{j≠i} f_i(|r̂_j^A − rjj|) + g_i(r̂_i^A)

An impossibility result:

  • the centralized solution cannot be implemented in BNE.

Consider a suboptimal solution:

  • use both self- and cross-reports
  • forgo the use of taxation

SLIDE 40

A simple averaging mechanism

(Model IV)   u_i = −Σ_{j≠i} f_i(|r̂_j^A − rjj|) + g_i(r̂_i^A)

  • Solicit only cross-reports.
  • Take r̂_i^A to be the average of all Xji, j ≠ i, and R0i.
  • Used in many existing online systems: Amazon and Epinions.
  • Truthful revelation of Rji is a BNE:
    • Nj has no influence on its own estimate r̂_j^A.
    • Nj’s effective objective is to minimize the first term.
  • The simple averaging mechanism results in r̂_i^A ∼ N(rii, σ²/K).
  • r̂_i^A can be made arbitrarily close to rii as K increases.
  • (Under this mechanism, if asked, Ni will always report xii = 1.)
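The σ²/K variance claim is easy to simulate: averaging K unbiased cross-reports R_ji ∼ N(r_ii, σ²) yields an estimate with variance about σ²/K (all numbers below are illustrative assumptions):

```python
import random
import statistics

# Simulate the averaging estimate over many trials; numbers are illustrative.
random.seed(4)
r_ii, sigma, K, trials = 0.4, 0.2, 25, 20_000
estimates = [statistics.fmean(random.gauss(r_ii, sigma) for _ in range(K))
             for _ in range(trials)]
var = statistics.pvariance(estimates)
assert abs(var - sigma ** 2 / K) < 0.2 * sigma ** 2 / K   # variance ~ sigma^2 / K
```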

SLIDE 41

Can we do better?

Instead of ignoring Ni’s self-report, incentivize Ni to provide useful information:

  • Convince Ni that it can contribute to a higher estimate r̂_i^A by supplying input Xii.
  • Use cross-reports to assess Ni’s self-report, and threaten punishment if it is judged to be overly misleading.

SLIDE 42

Truthful cross-reports

Consider a mechanism in which Ni’s cross-reports are not used in calculating its own reputation estimate. Then:

  • Ni can only increase its utility by altering r̂_j^A when submitting Xij,
  • Ni doesn’t know rjj, so it can’t use a specific utility function to strategically choose Xij,
  • Ni’s best estimate of rjj is Rij,

⇒ Truthful cross-reports!

Questions:

  • Can Ni make itself look better by degrading Nj?
  • Is it in Ni’s interest to degrade Nj?
SLIDE 43

A punish-reward (PR) mechanism

Denote the output of the simple averaging mechanism by X̄0i.

      r̂_i^A(Xii, X̄0i) = (X̄0i + Xii)/2         if Xii ∈ [X̄0i − ǫ, X̄0i + ǫ]
      r̂_i^A(Xii, X̄0i) = X̄0i − |Xii − X̄0i|    if Xii ∉ [X̄0i − ǫ, X̄0i + ǫ]

  • ǫ is a fixed and known constant.
  • Take the average of Xii and X̄0i if the two are sufficiently close; else punish Ni for reporting significantly differently.

⇒ Each network only gets to optimize its self-report, knowing all cross-reports are truthful.

SLIDE 44

Choice of self-report

The self-report xii is determined by max_{xii} E[r̂_i^A(xii, X̄0i)], where X̄0i ∼ N(rii, σ²/K), assuming a common and known σ. The optimal xii, when ǫ = aσ′ = a·√(σ²/K), is given by:

      x*_ii = rii + aσ′y,    0 < y < 1

⇒ the self-report is positively biased and within the expected acceptable range.

Figure: the bias factor y as a function of a (a ∈ [0.5, 3], y ∈ [0.1, 0.5])
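A Monte-Carlo sketch of this self-report choice (r_ii, σ′, and a are illustrative assumptions, not the talk's parameters): grid-searching E[r̂_i^A(x, X̄0i)] recovers an optimum that is positively biased but stays inside the acceptance window:

```python
import random

# Monte-Carlo grid search of the optimal self-report; parameters illustrative.
random.seed(5)
r_ii, sigma_p, a = 0.6, 0.1, 1.5
eps = a * sigma_p
xbar_samples = [random.gauss(r_ii, sigma_p) for _ in range(200_000)]

def pr(x, b):
    # PR outcome rule
    return 0.5 * (b + x) if abs(x - b) <= eps else b - abs(x - b)

def expected_rep(x):
    return sum(pr(x, b) for b in xbar_samples) / len(xbar_samples)

grid = [r_ii + k * 0.01 for k in range(16)]      # candidates in [r_ii, r_ii + eps]
x_star = max(grid, key=expected_rep)
assert r_ii < x_star < r_ii + eps                # positively biased, inside the window
```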

SLIDE 45

Performance of the mechanism

How close is r̂_i^A to the real quality rii? Define e_m := E[|r̂_i^A − rii|].

  • For a large range of a values, Ni’s self-report benefits the system as well as all networks other than Ni.
  • The optimal choice of a does not depend on rii and σ′.

Figure: MAE of the proposed mechanism vs. the averaging mechanism, for rii = 0.75, σ² = 0.1

SLIDE 46

There is a mutually beneficial region a ∈ [2, 2.5]: the self-report helps Ni obtain a higher estimated reputation, while helping the system reduce its estimation error on Ni.

Figure: Mean absolute error and final estimated reputation versus a, for the proposed mechanism and the averaging mechanism

SLIDE 47

A heterogeneous environment

Example: a mix of T truth types and K − T image types, using the AS mechanism.

  • Additional conditions are needed to ensure individual rationality.
  • The higher the percentage of image types, the less likely a truth type is to participate.
  • The higher a truth type’s own accuracy, the less interested it is in participating.
  • An image type may participate if rii is small.
  • The benefit of the mechanism decreases in the fraction of image types.

SLIDE 48

Handling collusion/cliques

  • Absolute Scoring and Fair Ranking are naturally collusion-proof.
  • PR remains functional using only the cross-observations from a subset of trusted entities, or even a single observation by the reputation system.
  • If the system lacks independent observations, introducing randomness can reduce the impact of cliques:
    • e.g., the extended-AS mechanism: the tax is determined by random matching with peers.
    • An increased likelihood of being matched with non-colluding users reduces the benefit of cliques.

SLIDE 49

Other aspects

  • Other mechanisms: e.g., a weighted mean of the cross-reports.
  • Other heterogeneous environments.
  • Presence of malicious networks.

SLIDE 50

Conclusion

Network reputation as a way to capture, encourage, and inform the security quality of policies.

Impact of reputation on network behavior:

  • A reputation-augmented security investment game.
  • Reputation can increase the level of investment and drive the system closer to the social optimum.
  • Many interesting open questions.

Incentivizing input – crowd-sourcing reputation:

  • A number of preference models and environments.
  • Incentive mechanisms in each case.

SLIDE 51

References

  • P. Naghizadeh Ardabili and M. Liu, “Perceptions and Truth: A Mechanism Design Approach to Crowd-Sourcing Reputation,” under submission. arXiv:1306.0173.
  • “Establishing Network Reputation via Mechanism Design,” GameNets, May 2012.
  • “Collective Revelation through Mechanism Design,” ITA, February 2012.
  • J. Zhang, A. Chivukula, M. Bailey, M. Karir, and M. Liu, “Characterization of Blacklists and Tainted Network Traffic,” 14th Passive and Active Measurement Conference (PAM), Hong Kong, March 2013.
  • P. Naghizadeh Ardabili and M. Liu, “Closing the Price of Anarchy Gap in the Interdependent Security Game,” under submission. arXiv:1308.0979v1.

SLIDE 52

Closing the PoA gap in the IDS game

  • All participants propose an investment profile and a price profile, (xi, πi) from Ni; user utility: ui(x) = −fi(x) − ci xi − ti.
  • The regulator/mechanism computes:

      x̂ = Σ_{i=1}^N xi / N ;    t̂_i = (π_{i+1} − π_{i+2})ᵀ x̂ + balancing term

  • Achieves social optimality:

      max_{(x,t)} Σ_{i=1}^N ui(x)    s.t.    Σ_{i=1}^N ti = 0

  • Budget balanced, incentive compatible, but NOT individually rational.
  • Having the regulator act as an insurer may lead to individual rationality.
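One piece of the budget-balance claim can be verified directly: the price part of the tax, (π_{i+1} − π_{i+2})ᵀ x̂ with cyclic indices, telescopes to zero across users for any reported profiles (the inputs below are illustrative random values; the unspecified balancing term is not modeled here):

```python
import random

# Cyclic price differences sum to zero across users; illustrative random input.
random.seed(6)
N = 6
pi = [[random.random() for _ in range(N)] for _ in range(N)]   # proposed price profiles
xs = [[random.random() for _ in range(N)] for _ in range(N)]   # proposed investment profiles
xhat = [sum(xs[i][k] for i in range(N)) / N for k in range(N)] # averaged investment profile

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

taxes = [dot([pi[(i + 1) % N][k] - pi[(i + 2) % N][k] for k in range(N)], xhat)
         for i in range(N)]
assert abs(sum(taxes)) < 1e-9   # the price terms cancel over the ring
```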
