  1. Tor and (un)provable privacy Roger Dingledine The Tor Project https://torproject.org/

  2. Today's plan ● 0) Crash course on Tor ● 1) Anonymity attacks ● 2) Blocking-resistance

  3. What is Tor? Online anonymity: 1) open source software, 2) network, 3) protocol. Community of researchers, developers, users, and relay operators. Funding from US DoD, Electronic Frontier Foundation, Voice of America, Google, NLnet, Human Rights Watch, NSF, US State Dept, SIDA, ...

  4. The Tor Project, Inc. U.S. 501(c)(3) non-profit organization dedicated to the research and development of tools for online anonymity and privacy

  5. Estimated ~400,000? daily Tor users

  6. Threat model: what can the attacker do? (Diagram: Alice → anonymity network → Bob.) The attacker can watch Alice, watch (or be!) Bob, or control part of the network.

  7. Anonymity isn't encryption: encryption just protects contents. (Diagram: Alice sends “Hi, Bob!”; the attacker on the wire sees only <gibberish>, yet still sees that Alice is talking to Bob.)

  8. Anonymity isn't just wishful thinking... “You can't prove it was me!” “Promise you won't look!” “Promise you won't remember!” “Promise you won't tell!” “I didn't write my name on it!” “Isn't the Internet already anonymous?”

  9. Anonymity serves different interests for different user groups. Private citizens: “It's privacy!”

  10. Anonymity serves different interests for different user groups. Businesses: “It's network security!” Private citizens: “It's privacy!”

  11. Anonymity serves different interests for different user groups. Governments: “It's traffic-analysis resistance!” Businesses: “It's network security!” Private citizens: “It's privacy!”

  12. Anonymity serves different interests for different user groups. Human rights activists: “It's reachability!” Governments: “It's traffic-analysis resistance!” Businesses: “It's network security!” Private citizens: “It's privacy!”

  13. The simplest designs use a single relay to hide connections. (Diagram: Alice1 sends E(Bob3, “X”), Alice2 sends E(Bob1, “Y”), and Alice3 sends E(Bob2, “Z”) to the relay, which decrypts each and forwards “X”, “Y”, “Z” to the right Bob.) (example: some commercial proxy providers)

  14. But a single relay (or eavesdropper!) is a single point of failure. (Same diagram, now with an evil relay: it sees every Alice-to-Bob mapping.)

  15. ... or a single point of bypass. (Same diagram: an attacker watching both sides doesn't need the relay at all.) Timing analysis bridges all connections through the relay ⇒ an attractive fat target.

  16. So, add multiple relays so that no single one can betray Alice. (Diagram: Alice reaches Bob through a path chosen from relays R1-R5.)

  17. A corrupt first hop can tell that Alice is talking, but not to whom.

  18. A corrupt final hop can tell that somebody is talking to Bob, but not who.

  19. Alice makes a session key with R1... and then tunnels to R2... and to R3.
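The telescoping circuit above amounts to layered symmetric encryption: Alice holds one session key per relay and wraps her traffic once per hop, so each relay can peel off only its own layer. A toy sketch of the layering idea (illustration only; keys, relay names, and the XOR keystream cipher are made up here, not Tor's actual TLS/AES construction):

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256 in counter mode. Illustration only.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_layer(data: bytes, key: bytes) -> bytes:
    # XOR is symmetric, so the same call adds or removes a layer.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# One negotiated session key per relay (hypothetical values).
keys = {"R1": b"key-R1", "R2": b"key-R2", "R3": b"key-R3"}

# Alice wraps the cell inside-out: R3's layer first, R1's layer last.
cell = b"GET / HTTP/1.0 (to Bob)"
for hop in ["R3", "R2", "R1"]:
    cell = xor_layer(cell, keys[hop])

# Each relay along the path peels exactly one layer, in order.
for hop in ["R1", "R2", "R3"]:
    cell = xor_layer(cell, keys[hop])

print(cell)  # the exit relay (R3) recovers the plaintext for Bob
```

The point of the construction is that R1 sees Alice but only ciphertext, while R3 sees plaintext but not Alice.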


  22. Today's plan ● 0) Crash course on Tor ● 1) Anonymity attacks ● 2) Blocking-resistance

  23. Operational attacks ● You need to use https – correctly. ● Don't use Flash. ● Who runs the relays? ● What local traces does Tor leave on the system? ● ...Different talk.

  24. Traffic confirmation ● If you can see the flow into Tor and the flow out of Tor, simple math lets you correlate them. ● Feamster's AS-level attack (2004), Edman's followup (2009), Murdoch's sampled traffic analysis attack (2007).

  25. Countermeasures? ● Defensive dropping (2004)? Adaptive padding (2006)? ● Traffic morphing (2009), Johnson (2010) ● Tagging attack, traffic watermarking

  26. Congestion attacks (1) ● Murdoch-Danezis attack (2005) sent constant traffic through every relay, and when Alice made her connection, looked for a traffic bump in three relays. ● Couldn't identify Alice – just the relays she picked.

  27. Congestion attacks (2) ● Hopper et al (2007) extended this to (maybe) locate Alice based on latency. ● Chakravarty et al (2008) extended this to (maybe) locate Alice via bandwidth tests. ● Evans et al (2009) showed the original attack doesn't work anymore (too many relays, too much noise) – but “infinite length circuit” makes it work again?

  28. Throughput fingerprinting ● Mittal et al, CCS 2011 ● Build a test path through the network. See if you picked the same bottleneck node as Alice picked.

  29. Anonymity / load balancing ● Give more load to fast relays, but less anonymity ● Client-side network observations, like circuit-build-timeout or congestion-aware path selection

  30. Bandwidth measurement ● Bauer et al (WPES 2009) ● Clients used the bandwidth as reported by the relay ● So you could sign up tiny relays, claim huge bandwidth, and get lots of traffic ● Fix is active measurement. (Centralized vs distributed?)
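The lying-relay problem above can be sketched in a few lines: when clients weight relay selection by self-reported bandwidth, one tiny relay that claims a huge number captures almost all circuits. The relay names and bandwidth figures here are hypothetical:

```python
import random
from collections import Counter

# Self-reported bandwidth in KB/s (hypothetical). The attacker's relay
# is actually tiny but advertises a huge capacity.
reported = {"honest1": 500, "honest2": 800, "honest3": 700, "evil": 50_000}

random.seed(0)  # deterministic for the example
picks = Counter(
    random.choices(list(reported), weights=list(reported.values()), k=10_000)
)
share = picks["evil"] / 10_000
print(f"evil relay chosen for {share:.0%} of circuits")  # roughly 96%
```

Active measurement breaks this by weighting relays on observed, not claimed, capacity.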

  31. Tor gives three anonymity properties ● #1 : A local network attacker can't learn, or influence, your destination. ● #2 : No single router can link you to your destination. ● #3 : The destination, or somebody watching it, can't learn your location.

  32. Tor's safety comes from diversity ● #1: Diversity of relays. The more relays we have and the more diverse they are, the fewer attackers are in a position to do traffic confirmation. ● #2: Diversity of users and reasons to use it. 60000 users in Iran means almost all of them are normal citizens.

  33. Long-term passive attacks ● Matt Wright's predecessor attack ● Overlier and Syverson, Oakland 2006 ● The more circuits you make, the more likely one of them is bad ● The fix: guard relays ● But: guard churn so old guards don't accrue too many users
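The guard intuition above can be made concrete with a back-of-the-envelope calculation: if an adversary controls a fraction c of the network, the chance that at least one of n circuits has a compromised entry and exit approaches certainty, while pinning the entry to a fixed guard caps the exposure at c. The numbers below are assumptions for illustration:

```python
# Assumed fraction of relay selection weight the adversary controls,
# and number of circuits a client builds over time.
c = 0.1
n = 1000

# Without guards: each circuit independently has a bad entry AND a bad
# exit with probability c*c, so over many circuits compromise is
# nearly certain.
p_no_guards = 1 - (1 - c * c) ** n

# With a fixed guard: either your guard is bad (probability c), or this
# attack never succeeds no matter how many circuits you build.
p_with_guard = c

print(f"no guards:  {p_no_guards:.3f}")   # approaches 1.0
print(f"with guard: {p_with_guard:.3f}")  # stays at c
```

This is why guards trade a small chance of total compromise for protection of the majority of users, and why guard churn must be slow.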

  34. Website fingerprinting ● If you can see an SSL-encrypted link, you can guess what web page is inside it based on size. ● Does this attack work on Tor? Open-world vs closed-world analysis. ● Considering multiple pages (e.g. via hidden Markov models) would probably make the attack even more effective.
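In its simplest closed-world form, the attack is nearest-neighbor matching on observed transfer size against a table of known pages. The page names and byte counts below are invented for illustration; real attacks use much richer features (packet sizes, directions, timing):

```python
# Hypothetical total bytes transferred for known pages (a closed world).
profiles = {"news-site": 412_000, "mail-login": 58_000, "wiki-article": 190_000}

def guess_page(observed_bytes: int) -> str:
    # Pick the known page whose total size is closest to what we
    # observed on the encrypted link.
    return min(profiles, key=lambda p: abs(profiles[p] - observed_bytes))

print(guess_page(61_500))  # closest profile is "mail-login"
```

The open-world question is whether this still works when the user might visit any of millions of pages the attacker never profiled.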

  35. Denial of service as denial of anonymity ● Borisov et al, CCS 2007 ● If you can't win against a circuit, kill it and see if you win the next one ● Guard relays are also a good answer here.

  36. Epistemic attacks on route selection ● Danezis/Syverson (PET 2008) ● If the list of relays gets big enough, we'd be tempted to give people random subsets of the relay list ● But, partitioning attacks ● Anonymous lookup? DHT? PIR?

  37. Profiling at exit relays ● Tor reuses the same circuit for 10 minutes before rotating to a new one. ● (It used to be 30 seconds, but that put too much CPU load on the relays.) ● If one of your connections identifies you, then the rest lose too. ● What's the right algorithm for allocating connections to circuits safely?
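The rotation policy above can be sketched as time-bucketed allocation: every stream opened within the same window rides the same circuit, which is exactly why one identifying connection taints its neighbors. A simplified model (real Tor's logic, controlled by its MaxCircuitDirtiness option, is more involved; the bucket scheme here is an illustration):

```python
MAX_CIRCUIT_DIRTINESS = 600  # seconds; the 10-minute rotation interval

def circuit_for(stream_start_time: float) -> int:
    # Simplified model: all streams beginning in the same 10-minute
    # window share one circuit, so the exit relay can link them.
    return int(stream_start_time // MAX_CIRCUIT_DIRTINESS)

# Two connections 5 minutes apart share a circuit (same bucket);
# one an hour later lands on a fresh circuit.
print(circuit_for(100), circuit_for(400), circuit_for(4000))
```

The open question on the slide is what smarter grouping (per destination? per application?) minimizes this linkage without multiplying circuit-building load.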

  38. Declining to extend ● Tor's directory system prevents an attacker from spoofing the whole Tor network. ● But your first hop can still say “sorry, that relay isn't up. Try again.” ● Or your local network can restrict connections so you only reach relays they like.

  39. Attacks on Tor ● Pretty much any Tor bug seems to turn into an anonymity attack. ● Many of the hard research problems are attacks against all low-latency anonymity systems. Tor is still the best that we know of – other than not communicating. ● People find things because of the openness and thoroughness of our design, spec, and code. We'd love to hear from you.

  40. Today's plan ● 0) Crash course on Tor ● 1) Anonymity attacks ● 2) Blocking-resistance

  41. Attackers can block users from connecting to the Tor network 1) By blocking the directory authorities 2) By blocking all the relay IP addresses in the directory, or the addresses of other Tor services 3) By filtering based on Tor's network fingerprint 4) By preventing users from finding the Tor software (usually by blocking the website)

  42. Relay versus Discovery There are two pieces to all these “proxying” schemes: ● a relay component: building circuits, sending traffic over them, getting the crypto right ● a discovery component: learning what relays are available

  43. The basic Tor design uses a simple centralized directory protocol. (Diagram: relays S1, S2, S3 publish self-signed descriptors; the trusted directory authorities publish a consensus list of all descriptors; Alice downloads the consensus and descriptors from any directory cache.)

  44. (Diagram: many Alices, several labeled “Blocked User”, alongside relays R1-R4 and Bob: the blocked users can't reach the publicly listed relays.)

  45. How do you find a bridge? 1) https://bridges.torproject.org/ will tell you a few based on time and your IP address 2) Mail bridges@torproject.org from a Gmail address and we'll send you a few 3) I mail some to a friend in Shanghai who distributes them via his social network 4) You can set up your own private bridge and tell your target users directly
