

SLIDE 1

Anonymity Loves Company: Usability and the network effect

Roger Dingledine, Nick Mathewson The Free Haven Project

SLIDE 2

Overview

  • We design and deploy anonymity systems.
  • Version 1: “You guys are studying this in academia, and we're building them. Please study us.”
  • Version 2: “Economics of anonymity are still not considered by (many) researchers.”
  • Version 3: “If you're thinking of building an anonymity system...”

SLIDE 3

Rump session follow-up.

  • Yes, usability is an excellent idea. We're working towards that.
  • But we're curious about the effects on security as we make progress on usability.
  • (Our notion of usability is very broad – e.g. anything that grows the user base.)

SLIDE 4

Security is a collaboration

  • Suppose two encryption programs:
    – HeavyCrypto is hard to use properly, but more secure if you do.
    – LightCrypto is easier to use, but can't provide as much security.
  • Which should you ask your friends to use to send encrypted mail to you? What if you use both?
  • Security is a collaboration between sender and receiver.

SLIDE 5

Usability affects security

  • There are many other cases where usability impacts security (badly labeled off switches, false sense of security, inconvenient security, bad mental models, ...).
  • But let's talk about anonymity systems: many people aggregate their traffic to gain security. So now we're talking about more than two participants.

SLIDE 6

Formally: anonymity means indistinguishability within an “anonymity set”

[Diagram: Alice1 through Alice8 all connect to Bob through the network; the attacker can't tell which Alice is talking to Bob.]
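The anonymity-set idea on this slide has a standard quantitative form in the literature (not in these slides themselves): the effective set size is 2^H, where H is the Shannon entropy of the attacker's probability distribution over possible senders. A minimal sketch, with a hypothetical function name:

```python
import math

def effective_anonymity_set(probabilities):
    """Effective anonymity-set size: 2**H, where H is the Shannon
    entropy of the attacker's distribution over candidate senders.
    A uniform distribution over n senders gives exactly n."""
    h = -sum(p * math.log2(p) for p in probabilities if p > 0)
    return 2 ** h

# Eight Alices the attacker cannot tell apart: full anonymity set of 8.
print(effective_anonymity_set([1/8] * 8))  # 8.0

# Skewed suspicion (one Alice looks far more likely): effectively
# a much smaller set, even though eight users are present.
print(effective_anonymity_set([0.65] + [0.05] * 7))
```

This is why "users need to behave similarly" later in the deck: distinguishable behavior skews the distribution and shrinks the effective set.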

SLIDE 7

We have to make some assumptions about what the attacker can do.

[Diagram: Alice connects to Bob through the anonymity network. The attacker might watch (or be!) Bob, watch Alice, control part of the network, etc.]

SLIDE 8

Anonymity serves different interests for different user groups.

[Diagram, built up over three slides: private citizens, businesses, and governments around “Anonymity”. Private citizens: “It's privacy!” Businesses: “It's network security!” Governments: “It's traffic-analysis resistance!”]

SLIDE 11

The simplest designs use a single relay to hide connections.

[Diagram: Alice1, Alice2, and Alice3 hand the relay addressed messages (Bob3, “X”), (Bob1, “Y”), (Bob2, “Z”); the relay delivers “X”, “Y”, “Z” to the matching Bobs. (Ex: some commercial proxy providers.)]
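The single-relay weakness in the diagram can be made concrete with a toy model (hypothetical names, no cryptography): an outside observer sees only the delivered messages, but the relay itself learns the complete sender-to-recipient mapping, so it can betray every Alice.

```python
def relay(batch):
    """Toy single-relay mix. batch: list of (sender, recipient, message)
    tuples. Returns (relay_log, delivered): what the relay learns vs.
    what an outside observer sees on the exit side."""
    relay_log = [(sender, recipient) for sender, recipient, _ in batch]
    delivered = [(recipient, message) for _, recipient, message in batch]
    return relay_log, delivered

log, out = relay([("Alice1", "Bob3", "X"),
                  ("Alice2", "Bob1", "Y"),
                  ("Alice3", "Bob2", "Z")])

# `out` hides the Alices from outsiders, but `log` shows the relay
# operator exactly who talked to whom -- a single point of failure.
```

Hence the next slide: spread trust across multiple relays so no single one holds the full mapping.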

SLIDE 12

So, add multiple relays so that no single one can betray Alice.

[Diagram: Alice reaches Bob through a path of relays chosen from R1–R5.]
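The multi-relay idea is usually realized with layered ("onion") wrapping. A toy sketch of the nesting only (no real cryptography; in a real design each layer is encrypted to one relay's key): each relay peels exactly one layer, so R1 learns only the next hop, never both Alice and Bob.

```python
def wrap(message, path):
    """Wrap `message` in one layer per relay on `path`, so the first
    relay's layer is outermost. Layers are modeled as tagged tuples."""
    onion = ("deliver", message)
    for hop in reversed(path):
        onion = (hop, onion)
    return onion

def unwrap(onion, hop):
    """A relay peels exactly one layer; it sees only the layer
    addressed to it and the next hop inside."""
    label, inner = onion
    assert label == hop, "this layer is not addressed to us"
    return inner

onion = wrap("hello Bob", ["R1", "R2", "R3"])
onion = unwrap(onion, "R1")  # R1 learns only that R2 comes next
onion = unwrap(onion, "R2")  # R2 learns only R1 behind, R3 ahead
onion = unwrap(onion, "R3")  # now ("deliver", "hello Bob")
```

No single relay in the path can, by itself, link Alice to Bob.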

SLIDE 13

But users need to behave similarly.

  • If two users behave entirely differently, they don't provide cover for each other.
  • Some partitioning can be avoided by constructing a better anonymity system (see next workshop).
  • But some is inevitable: using different protocols, speaking different languages, etc.
  • #1: Users need to consider how usable others will find the system, to benefit from a larger anonymity set.

SLIDE 14

But what about users with different security goals?

  • Some designs are high-latency, others low-latency. They protect against different threat models.
  • So which should you use if you're flexible?
  • High-latency: against strong attackers we're in better shape.
  • But if few others choose high-latency, we're weak against both strong and weak attackers!
  • #2: Choosing the system with the strongest security model may not get you the best security.

SLIDE 15

Options can hurt anonymity.

  • Options hurt security: users are often not the best people to make security decisions, and non-default configurations don't get tested enough.
  • They're even worse for anonymity, since they can splinter the anonymity set. E.g. Type I remailer padding settings.
  • #3: Designers must set security parameters.
SLIDE 16

The default is safer than you think.

  • Even when users' needs genuinely vary, adding options is not necessarily smart.
  • In practice, the default will be used by most people, so those who need security should use the default even when it would not otherwise be their best choice.
  • #4: Design as though the default is the only option.
SLIDE 17

Convenience vs. Security

  • How should Mixminion handle MIME-encoded data? It's hard to normalize all possible inputs. Demand that everybody use one mailer?
  • Tor path selection: some users want quick paths (one hop), whereas two or three hops seem smarter.
  • #5: If you don't support what users want, they'll do it anyway -- insecurely.
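The path-selection point can be sketched in a few lines. This is a toy stand-in, not Tor's real algorithm (which weights relays by bandwidth and enforces family/subnet constraints): it simply picks distinct relays and refuses the insecure one-hop shortcut users might want.

```python
import random

def choose_path(relays, hops=3):
    """Pick `hops` distinct relays uniformly at random. A toy sketch:
    real Tor path selection is bandwidth-weighted and adds many more
    constraints. One-hop paths are rejected, since a single relay
    could then link the client to the destination on its own."""
    if hops < 2:
        raise ValueError("one-hop paths let a single relay deanonymize you")
    return random.sample(relays, hops)  # sampling without replacement

path = choose_path(["R1", "R2", "R3", "R4", "R5"])  # e.g. ['R4', 'R1', 'R3']
```

Refusing the quick option outright illustrates the tension on this slide: if the system won't offer a fast-but-weaker mode, some users will build one themselves.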

SLIDE 18

Deployment matters too.

  • Example: Since Tor is a SOCKS proxy, you need to configure your applications to point to it.
  • This is not intuitive for novice users.
  • A larger user base doesn't help security-conscious users unless they can configure things right.
  • Need to bundle with support tools that configure everything automatically.
  • #6: The anonymity questions don't end with designing the protocol. AKA, “ZKS was right.”
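For concreteness, here is what "point your application at the SOCKS proxy" looks like today, sketched with the `requests` library's proxy settings (this assumes a Tor client listening on its default 127.0.0.1:9050, and requires the optional `requests[socks]` extra; no connection is made here). The `socks5h` scheme asks the proxy to resolve hostnames too, so DNS lookups don't leak outside Tor:

```python
# Tor's client-side SOCKS listener defaults to localhost port 9050.
# "socks5h" (vs plain "socks5") makes the proxy do DNS resolution,
# avoiding a classic misconfiguration that leaks lookups locally.
TOR_SOCKS = "socks5h://127.0.0.1:9050"

proxies = {
    "http": TOR_SOCKS,
    "https": TOR_SOCKS,
}

# Usage, assuming Tor is running locally and requests[socks] is installed:
#   import requests
#   requests.get("https://example.com", proxies=proxies)
```

Even this small snippet shows the slide's point: getting the scheme, host, port, and DNS behavior all right is exactly the kind of detail novices get wrong, which is why bundled auto-configuration matters.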

SLIDE 20

Users want to know what level of security they're getting.

  • JAP uses its anonym-o-meter. This is a great idea, but we don't think it's a good metric for low-latency systems.
  • Tor doesn't really give users a metric. We don't know what they use.
  • #7: Give users a security metric, or they'll infer it from something else.

SLIDE 21

Bootstrapping

  • Most security systems start with high-needs users (early adopters).
  • But in anonymity systems, the high-needs users will wait until there's a user base.
  • Low-needs users can break the deadlock.
  • #8: If you start your system emphasizing security rather than usability, you will never get off the ground.

SLIDE 22

Perception and Confidence

  • Our analysis so far relies on users' accurate perceptions of present and future anonymity set size.
  • #9: Expectations themselves can produce trends: the metric is not just usability, but perceived usability.
  • So marketing can improve security??
  • (This is made messier because there aren't good technical metrics to guess the number of users.)

SLIDE 23

Reputability: the perception of social value based on current users.

  • The more cancer survivors on Tor, the better for the human rights activists. The more script kiddies, the worse for the normal users.
  • Reputability impacts the growth and sustainability of the network. It also dictates how many strong attackers are attracted.
  • #10: Reputability affects anonymity, and a network's reputation can be established early.

SLIDE 24

Anonymity's network effect vs. other network effects.

  • Say I have a ham radio and a telephone. If I drop the ham radio, I lose nothing other than my investment in it. Same with VHS and Beta.
  • Whereas if I participate in both a secure and an insecure anonymity network, even if I make all my decisions well, I am still worse off.
  • People use number of customers as a signal -- “But if more customers actually improve the quality of the burger...”

SLIDE 25

Conclusions

  • Bad loop: unusability means insecurity.
  • Good loop: usability means security.
  • We can't just wait to build the most usable and most secure system: people are going to take their actions anyway, on less safe systems.