

SLIDE 1

Tor and (un)provable privacy

Roger Dingledine The Tor Project https://torproject.org/

SLIDE 2

Today's plan

  • 0) Crash course on Tor
  • 1) Anonymity attacks
  • 2) Blocking-resistance
SLIDE 3

What is Tor?

Online anonymity: 1) open source software, 2) a network, 3) a protocol

Community of researchers, developers, users, and relay operators

Funding from US DoD, Electronic Frontier Foundation, Voice of America, Google, NLnet, Human Rights Watch, NSF, US State Dept, SIDA, ...

SLIDE 4

The Tor Project, Inc.

U.S. 501(c)(3) non-profit organization dedicated to the research and development of tools for online anonymity and privacy

SLIDE 5

Estimated ~400,000? daily Tor users

SLIDE 6

Threat model: what can the attacker do?

[Diagram: Alice connects to Bob through the anonymity network. The attacker can watch (or be!) Bob, watch Alice, or control part of the network.]

SLIDE 7

Anonymity isn't encryption: Encryption just protects contents.

[Diagram: Alice sends “Hi, Bob!” to Bob; the attacker sees only <gibberish>, but can still see that Alice is talking to Bob]

SLIDE 8

Anonymity isn't just wishful thinking...

  • “You can't prove it was me!”
  • “Promise you won't look!”
  • “Promise you won't remember!”
  • “Promise you won't tell!”
  • “I didn't write my name on it!”
  • “Isn't the Internet already anonymous?”

SLIDE 9

Anonymity serves different interests for different user groups.

Anonymity

Private citizens “It's privacy!”

SLIDE 10

Anonymity serves different interests for different user groups.

Anonymity

Private citizens: “It's privacy!”
Businesses: “It's network security!”

SLIDE 11

Anonymity serves different interests for different user groups.

Anonymity

Private citizens: “It's privacy!”
Businesses: “It's network security!”
Governments: “It's traffic-analysis resistance!”

SLIDE 12

Anonymity serves different interests for different user groups.

Anonymity

Private citizens: “It's privacy!”
Businesses: “It's network security!”
Governments: “It's traffic-analysis resistance!”
Human rights activists: “It's reachability!”

SLIDE 13

The simplest designs use a single relay to hide connections.

[Diagram: Alice1, Alice2, Alice3 send E(Bob3, “X”), E(Bob1, “Y”), E(Bob2, “Z”) through a single Relay, which decrypts and delivers “X”, “Y”, “Z” to the corresponding Bobs]

(example: some commercial proxy providers)

SLIDE 14

But a single relay (or eavesdropper!) is a single point of failure.

[Diagram: the same design with an Evil Relay, which can link each Alice to her Bob and read every message]

SLIDE 15

... or a single point of bypass.

[Diagram: the same design with an Irrelevant Relay: an eavesdropper watching both sides can match flows without compromising the relay at all]

Timing analysis bridges all connections through the relay ⇒ an attractive fat target

SLIDE 16

So, add multiple relays so that no single one can betray Alice.

[Diagram: Alice reaches Bob through a network of relays R1-R5]

SLIDE 17

A corrupt first hop can tell that Alice is talking, but not to whom.

[Diagram: Alice's circuit through relays R1-R5 to Bob, with a corrupt first hop]

SLIDE 18

A corrupt final hop can tell that somebody is talking to Bob, but not who.

[Diagram: Alice's circuit through relays R1-R5 to Bob, with a corrupt final hop]

SLIDE 19

Alice makes a session key with R1... and then tunnels to R2... and to R3

[Diagram: Alice extends her circuit hop by hop through R1, R2, R3 toward Bob]
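The session keys and hop-by-hop tunneling above can be sketched as layered encryption. This is a toy model: XOR stands in for a real stream cipher, and the helper names are illustrative, not Tor's actual protocol.

```python
# Toy onion layering: Alice shares a session key with each relay and
# wraps her message once per hop. Each relay removes one layer, so no
# single relay sees both Alice's identity and her final plaintext.
# (XOR is symmetric: applying the same key twice undoes it.)

def xor_layer(data: bytes, key: bytes) -> bytes:
    # XOR with a repeating key; apply twice with the same key to undo.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def build_onion(message: bytes, hop_keys: list) -> bytes:
    # Alice wraps for the exit first, then the middle, then the guard.
    cell = message
    for key in reversed(hop_keys):
        cell = xor_layer(cell, key)
    return cell

def relay_unwrap(cell: bytes, key: bytes) -> bytes:
    # Each relay peels exactly one layer with its own session key.
    return xor_layer(cell, key)

keys = [b"guard-key", b"middle-key", b"exit-key"]
onion = build_onion(b"GET /index.html", keys)
for k in keys:                 # traverse R1 -> R2 -> R3
    onion = relay_unwrap(onion, k)
print(onion)  # -> b'GET /index.html'
```

Only after the last relay peels its layer does the original request emerge, so the exit learns the contents but not Alice's address.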

SLIDE 20

SLIDE 21

SLIDE 22

Today's plan

  • 0) Crash course on Tor
  • 1) Anonymity attacks
  • 2) Blocking-resistance
SLIDE 23

Operational attacks

  • You need to use https – correctly.
  • Don't use Flash.
  • Who runs the relays?
  • What local traces does Tor leave on the system?
  • ...Different talk.
SLIDE 24

Traffic confirmation

  • If you can see the flow into Tor and the flow out of Tor, simple math lets you correlate them.
  • Feamster's AS-level attack (2004), Edman's followup (2009), Murdoch's sampled traffic analysis attack (2007).
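The “simple math” can be sketched as binning packet timestamps on both sides of the network and comparing the per-window counts. The timestamps below are fabricated for illustration; real attacks use more robust statistics.

```python
# Traffic confirmation sketch: an observer who sees a flow entering
# Tor and candidate flows leaving it bins packet timestamps into
# windows and compares the count vectors. A near-identical shape
# (high cosine similarity) suggests the two flows are the same.

def bin_counts(timestamps, window=1.0, horizon=10.0):
    bins = [0] * int(horizon / window)
    for t in timestamps:
        if 0 <= t < horizon:
            bins[int(t / window)] += 1
    return bins

def similarity(a, b):
    # Cosine similarity between count vectors: 1.0 means same shape.
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

entry = [0.1, 0.2, 2.1, 2.3, 5.0, 5.1, 5.2, 9.4]     # Alice -> guard
exit_match = [t + 0.4 for t in entry]                 # same flow, delayed
exit_other = [1.5, 3.3, 4.0, 6.6, 7.1, 8.8]          # unrelated flow

e = bin_counts(entry)
print(similarity(e, bin_counts(exit_match)) > similarity(e, bin_counts(exit_other)))  # -> True
```

The small constant network delay barely shifts the bins, so the matching flow scores far higher than the unrelated one.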

SLIDE 25

Countermeasures?

  • Defensive dropping (2004)? Adaptive padding (2006)?
  • Traffic morphing (2009), Johnson (2010)
  • Tagging attack, traffic watermarking
SLIDE 26

Congestion attacks (1)

  • Murdoch-Danezis attack (2005) sent constant traffic through every relay, and when Alice made her connection, looked for a traffic bump in three relays.
  • Couldn't identify Alice – just the relays she picked.
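The probing idea can be sketched as follows. The latency numbers, relay names, and the fixed threshold are all invented for illustration; the paper's actual detector is statistical, not a simple ratio test.

```python
# Congestion attack sketch: probe every relay's latency continuously.
# When the target's stream is active, the relays on her circuit get
# busier, and probe latency through exactly those relays rises.

def mean(xs):
    return sum(xs) / len(xs)

def bump_detected(baseline, during, threshold=1.5):
    # Flag a relay whose mean probe latency rose noticeably (by an
    # arbitrary factor, here 1.5x) while the target stream was active.
    return mean(during) > threshold * mean(baseline)

# Probe latencies in ms: (quiet baseline, while target stream active)
probes = {
    "R1": ([10, 11, 10, 12], [21, 24, 22, 25]),   # on Alice's circuit
    "R2": ([15, 14, 16, 15], [30, 28, 33, 29]),   # on Alice's circuit
    "R3": ([12, 13, 12, 11], [12, 13, 13, 12]),   # idle relay
}
suspects = [r for r, (base, load) in probes.items() if bump_detected(base, load)]
print(suspects)  # -> ['R1', 'R2']
```

As the slide notes, this reveals which relays Alice picked, not Alice herself.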

SLIDE 27

Congestion attacks (2)

  • Hopper et al (2007) extended this to (maybe) locate Alice based on latency.
  • Chakravarty et al (2008) extended this to (maybe) locate Alice via bandwidth tests.
  • Evans et al (2009) showed the original attack doesn't work anymore (too many relays, too much noise) – but the “infinite length circuit” makes it work again?

SLIDE 28

Throughput fingerprinting

  • Mittal et al, CCS 2011
  • Build a test path through the network. See if you picked the same bottleneck node as Alice picked.

SLIDE 29

Anonymity / load balancing

  • Give more load to fast relays, but less anonymity
  • Client-side network observations, like circuit-build-timeout or congestion-aware path selection

SLIDE 30

Bandwidth measurement

  • Bauer et al (WPES 2009)
  • Clients used the bandwidth as reported by the relay
  • So you could sign up tiny relays, claim huge bandwidth, and get lots of traffic
  • Fix is active measurement. (Centralized vs distributed?)
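A toy simulation shows why self-reported bandwidth was dangerous. Relay names, capacities, and the selection routine are illustrative; Tor's real path selection is considerably more involved.

```python
# Sketch of the Bauer et al problem: if clients weight relay selection
# by self-reported bandwidth, one tiny relay that lies gets a huge
# share of circuits.
import random

def pick_relay(relays, rng):
    # Weighted choice proportional to the bandwidth each relay *claims*.
    total = sum(bw for _, bw in relays)
    r = rng.uniform(0, total)
    for name, bw in relays:
        r -= bw
        if r <= 0:
            return name
    return relays[-1][0]

# "liar" claims 50000 KB/s but actually has almost no capacity.
relays = [("honest-1", 1000), ("honest-2", 1000), ("liar", 50000)]
rng = random.Random(7)        # fixed seed for reproducibility
picks = [pick_relay(relays, rng) for _ in range(10000)]
share = picks.count("liar") / len(picks)
print(round(share, 2))
```

With these made-up numbers the lying relay attracts well over 90% of circuits, which is why the fix is to measure bandwidth actively rather than trust the relay's own claim.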

SLIDE 31

Tor gives three anonymity properties

  • #1: A local network attacker can't learn, or influence, your destination.
  • #2: No single router can link you to your destination.
  • #3: The destination, or somebody watching it, can't learn your location.

SLIDE 32

Tor's safety comes from diversity

  • #1: Diversity of relays. The more relays we have and the more diverse they are, the fewer attackers are in a position to do traffic confirmation.
  • #2: Diversity of users and reasons to use it. 60,000 users in Iran means almost all of them are normal citizens.

SLIDE 33

Long-term passive attacks

  • Matt Wright's predecessor attack
  • Overlier and Syverson, Oakland 2006
  • The more circuits you make, the more likely one of them is bad
  • The fix: guard relays
  • But: guard churn, so old guards don't accrue too many users
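The “more circuits, more risk” point can be made concrete with a one-line probability. The 10% malicious fraction is an illustrative figure, not a measurement.

```python
# Why guard relays help: if a fraction c of entry capacity is
# malicious and Alice picks a fresh entry for every circuit, the
# chance that at least one circuit ever had a bad entry approaches 1.
# With a fixed guard, the chance stays at roughly c.

def p_ever_bad_entry(c: float, n_circuits: int) -> float:
    # P(at least one of n independently chosen entries is malicious)
    return 1 - (1 - c) ** n_circuits

c = 0.10                                     # assumed malicious fraction
print(round(p_ever_bad_entry(c, 1), 3))      # -> 0.1  (the fixed-guard case)
print(round(p_ever_bad_entry(c, 100), 3))    # -> 1.0  (fresh entry each time)
```

Pinning the entry position to a small guard set trades a small chance of total compromise for near-certainty of eventual compromise.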

SLIDE 34

Website fingerprinting

  • If you can see an SSL-encrypted link, you can guess what web page is inside it based on size.
  • Does this attack work on Tor? Open-world vs closed-world analysis.
  • Considering multiple pages (e.g. via hidden Markov models) would probably make the attack even more effective.
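A minimal closed-world sketch of the size-based version of this attack. The site names and byte counts are made up; real fingerprinting classifiers use many features (packet sizes, directions, timing), not just total bytes.

```python
# Website fingerprinting sketch: the observer pre-builds a catalog of
# page sizes, then matches the total bytes seen on the encrypted link
# against it. Closed-world: the page is assumed to be in the catalog.

catalog = {
    "news-site": 412_000,
    "webmail": 188_000,
    "wiki-article": 95_000,
}

def guess_page(observed_bytes: int) -> str:
    # Pick the catalog entry whose size is closest to what we saw.
    return min(catalog, key=lambda site: abs(catalog[site] - observed_bytes))

print(guess_page(190_500))  # -> webmail
```

The open-world setting is much harder: most traffic belongs to pages outside the catalog, so the classifier also needs a confident “none of the above” answer.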

SLIDE 35

Denial of service as denial of anonymity

  • Borisov et al, CCS 2007
  • If you can't win against a circuit, kill it and see if you win the next one
  • Guard relays are also a good answer here.
SLIDE 36

Epistemic attacks on route selection

  • Danezis/Syverson (PET 2008)
  • If the list of relays gets big enough, we'd be tempted to give people random subsets of the relay list
  • But: partitioning attacks
  • Anonymous lookup? DHT? PIR?
SLIDE 37

Profiling at exit relays

  • Tor reuses the same circuit for 10 minutes before rotating to a new one.
  • (It used to be 30 seconds, but that put too much CPU load on the relays.)
  • If one of your connections identifies you, then the rest lose too.
  • What's the right algorithm for allocating connections to circuits safely?
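One candidate answer to that open question is isolating streams by destination, so a connection that identifies you only shares a circuit with other connections to the same place. This is my illustrative sketch of that policy, not Tor's actual allocation algorithm, and `circuit_for` is a hypothetical helper.

```python
# Allocation-policy sketch: reuse a circuit only for repeat visits to
# the same destination, so streams to different destinations can never
# be linked through a shared exit circuit.
import itertools

_circuit_ids = itertools.count(1)
_by_destination = {}

def circuit_for(destination: str) -> int:
    # First visit to a destination opens a fresh circuit; later visits
    # to the same destination reuse it.
    if destination not in _by_destination:
        _by_destination[destination] = next(_circuit_ids)
    return _by_destination[destination]

print(circuit_for("examplebank.example"))   # -> 1
print(circuit_for("forum.example"))         # -> 2
print(circuit_for("examplebank.example"))   # -> 1  (reused, never mixed)
```

The trade-off is cost: per-destination circuits mean more circuit building, which is exactly the CPU-load pressure that pushed the rotation interval from 30 seconds up to 10 minutes.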

SLIDE 38

Declining to extend

  • Tor's directory system prevents an attacker from spoofing the whole Tor network.
  • But your first hop can still say “sorry, that relay isn't up. Try again.”
  • Or your local network can restrict connections so you only reach relays they like.

SLIDE 39

Attacks on Tor

  • Pretty much any Tor bug seems to turn into an anonymity attack.
  • Many of the hard research problems are attacks against all low-latency anonymity systems. Tor is still the best that we know of – other than not communicating.
  • People find things because of the openness and thoroughness of our design, spec, and code. We'd love to hear from you.
SLIDE 40

Today's plan

  • 0) Crash course on Tor
  • 1) Anonymity attacks
  • 2) Blocking-resistance
SLIDE 41

Attackers can block users from connecting to the Tor network

1) By blocking the directory authorities
2) By blocking all the relay IP addresses in the directory, or the addresses of other Tor services
3) By filtering based on Tor's network fingerprint
4) By preventing users from finding the Tor software (usually by blocking the website)

SLIDE 42

Relay versus Discovery

There are two pieces to all these “proxying” schemes:
  • a relay component: building circuits, sending traffic over them, getting the crypto right
  • a discovery component: learning what relays are available

SLIDE 43

The basic Tor design uses a simple centralized directory protocol.

[Diagram: Alice, two trusted directory authorities, directory caches, and servers S1-S3]

Servers publish self-signed descriptors. Authorities publish a consensus list of all descriptors. Alice downloads the consensus and descriptors from anywhere.

SLIDE 44

[Diagram: many Alices and blocked users trying to reach Bob through relays R1-R4; the blocked users cannot connect to the Tor network directly]

SLIDE 45

How do you find a bridge?

1) https://bridges.torproject.org/ will tell you a few based on time and your IP address
2) Mail bridges@torproject.org from a gmail address and we'll send you a few
3) I mail some to a friend in Shanghai who distributes them via his social network
4) You can set up your own private bridge and tell your target users directly
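The time-and-IP-based distribution in (1) might look roughly like the sketch below. The /16 bucketing, the hashing scheme, and the answer size are my illustrative assumptions, not the actual bridge distributor's algorithm.

```python
# Bridge distribution sketch: answers depend on the requester's IP
# prefix and the current time period, so a single crawler cannot
# enumerate all bridges quickly -- it only ever sees its own bucket.
import hashlib

BRIDGES = ["bridge-%d" % i for i in range(12)]   # placeholder addresses

def bridges_for(ip: str, day: str, per_answer: int = 3):
    # Same /16 prefix + same day => same small answer set.
    prefix = ".".join(ip.split(".")[:2])
    seed = hashlib.sha256(("%s|%s" % (prefix, day)).encode()).hexdigest()
    start = int(seed, 16) % len(BRIDGES)
    return [BRIDGES[(start + i) % len(BRIDGES)] for i in range(per_answer)]

a = bridges_for("203.0.113.5", "2012-02-10")
b = bridges_for("203.0.113.77", "2012-02-10")    # same /16, same day
print(a == b)  # -> True
```

An attacker who controls many IP prefixes (attack (1) on the next slide, “overwhelm the public address distribution strategies”) can still harvest one bucket per prefix per period.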

SLIDE 46

Ways to find bridges (1)

  • 1) Overwhelm the public address distribution strategies
  • 2) Run a non-guard non-exit relay and look for connections from non-relays
  • 3) Run a guard relay and look for protocol differences
  • 4) Run a guard relay and do timing analysis
  • 5) Run a relay and probe clients as they connect to you

SLIDE 47

Ways to find bridges (2)

  • 6) Scan the Internet for things that talk Tor
  • 7) Break into Tor Project infrastructure (or break the developers)
  • 8) Watch the bridge authority do its reachability tests
  • 9) Watch your border firewall and DPI for Tor flows
  • 10) Zig-zag between bridges and users
SLIDE 48

SLIDE 49

Pluggable transports

  • In Feb 2012, Iran DPIed for all SSL flows and cut them
  • No more gmail, facebook, etc.
  • Pluggable transports:
  • Obfsproxy
  • SkypeMorph
  • StegoTorus
  • Need “obfuscation” metrics?
SLIDE 50

SLIDE 51

Only a piece of the puzzle

Assume the users aren't attacked by their hardware and software: no spyware installed, no cameras watching their screens, etc. Can users fetch a genuine copy of Tor?

SLIDE 52