
Inferring Internet Denial-of-Service Activity - PowerPoint PPT Presentation

Inferring Internet Denial-of-Service Activity. Geoffrey M. Voelker, University of California, San Diego. Joint work with David Moore.


  1. Inferring Internet Denial-of-Service Activity
  Geoffrey M. Voelker, University of California, San Diego
  Joint work with David Moore (CAIDA/UCSD) and Stefan Savage (UCSD)

  2. Simple Question
  We were interested in answering a simple question: How prevalent are denial-of-service attacks in the Internet?
  October 17, 2001, University of Virginia

  3. Anecdotal Data
  • Press reports / analysts: "Losses … could total more than $1.2 billion" - Yankee Group report
  • Surveys: "38% of security professionals surveyed reported denial of service activity in 2000" - CSI/FBI survey

  4. Quantitative Data?
  • Is not available (i.e., no one knows)
  • Inherently hard to acquire
    » Few content or service providers collect such data
    » If they do, it's usually considered sensitive
  • Infeasible to collect at Internet scale
    » How can you monitor enough of the Internet to obtain a representative sample?

  5. Our Contributions
  • Backscatter analysis
    » New technique for estimating global denial-of-service activity
  • First data describing Internet-wide DoS activity
    » ~4,000 attacks per week (> 12,000 over 3 weeks)
    » Instantaneous loads above 600k pps
  • Characterization of attacks and victims
  • Paper appeared this August:
    » Moore, Voelker and Savage, Inferring Internet Denial-of-Service Activity, 2001 USENIX Security

  6. Overview
  • Describe backscatter analysis
  • Experimental setup
  • Series of analyses and attack characterizations
  • Tracking the Code Red Worm

  7. Key Idea
  • Flooding-style DoS attacks
    » e.g. SYN flood, ICMP flood
  • Attackers spoof source address randomly
    » True of all major attack tools
  • Victims, in turn, respond to attack packets
  • Unsolicited responses (backscatter) equally distributed across IP space
  • Received backscatter is evidence of an attacker elsewhere
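The key idea above can be checked with a toy simulation (illustrative only; the monitor prefix and packet count are assumptions, not from the talk): if the attacker spoofs uniformly random 32-bit source addresses, a monitor owning a /8 should observe roughly 1/256 of the victim's replies.

```python
import random

random.seed(1)
MONITOR_FIRST_OCTET = 10  # assumed: the monitor owns 10.0.0.0/8 (2^24 addresses)

def backscatter_seen(attack_packets):
    """Count how many of the victim's replies land in the monitored /8."""
    seen = 0
    for _ in range(attack_packets):
        spoofed_src = random.getrandbits(32)  # uniformly random spoofed source
        # The victim replies (e.g. SYN+ACK) to the spoofed address; the
        # monitor observes that reply iff the address falls in its /8.
        if spoofed_src >> 24 == MONITOR_FIRST_OCTET:
            seen += 1
    return seen

observed = backscatter_seen(1_000_000)
# expected value is 1_000_000 / 256 ≈ 3906
```

Because the spoofed addresses are uniform, the monitored fraction of backscatter depends only on the size of the monitored block, which is what makes the extrapolation on the next slides possible.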

  8. Backscatter Example
  [Diagram: the attacker floods victim V with SYN packets carrying spoofed source addresses B, C, D; V's SYN+ACK replies (backscatter) go to those random addresses.]

  9. Backscatter Analysis
  • Monitor block of n IP addresses
  • Expected # of backscatter packets given an attack of m packets:
    E(X) = nm / 2^32
  • Extrapolated attack rate R is a function of measured backscatter rate R':
    R ≥ R' × (2^32 / n)
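The two formulas on this slide can be checked numerically; a minimal sketch (the packet and rate figures are made up for illustration):

```python
ADDRESS_SPACE = 2 ** 32  # size of the IPv4 address space

def expected_backscatter(attack_packets, monitored_addresses):
    """E(X) = nm / 2^32: backscatter packets a monitor of n addresses expects
    to see from an attack of m packets with uniformly spoofed sources."""
    return monitored_addresses * attack_packets / ADDRESS_SPACE

def extrapolated_rate(backscatter_rate, monitored_addresses):
    """Lower bound on the true attack rate: R >= R' * 2^32 / n."""
    return backscatter_rate * ADDRESS_SPACE / monitored_addresses

# A /8 monitor covers 2^24 addresses, i.e. 1/256 of the address space.
n = 2 ** 24
print(expected_backscatter(1_000_000, n))  # 3906.25
print(extrapolated_rate(100, n))           # 25600.0
```

With a /8 monitor the extrapolation factor is simply 256, so 100 backscatter packets per second observed implies an attack of at least ~25,600 pps.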

  10. Assumptions and Biases
  • Address uniformity
    » Ingress filtering, reflectors, etc. cause us to underestimate # of attacks
    » Can bias rate estimation (can we test uniformity?)
  • Reliable delivery
    » Packet losses, server overload & rate limiting cause us to underestimate attack rates/durations
  • Backscatter hypothesis
    » Can be biased by purposeful unsolicited packets, e.g. port scanning (minor factor at worst in practice)
    » Do we detect backscatter at multiple sites?

  11. Experimental Setup
  [Diagram: monitor (w/ big disk) attached between the Internet and a quiescent /8 network (2^24 addresses)]

  12. Methodology
  • Collected three weeks of traces (February 2001)
  • Analyzed trace data from two perspectives
  • Flow-based analysis (categorical)
    » Number, duration, kinds of attacks
    » Keyed on victim IP address and protocol
    » Flow duration defined by explicit parameters (min threshold, timeout)
  • Event-based analysis (intensity)
    » Rate, intensity over time
    » Attack event: backscatter packets from an IP address in a 1-minute window
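The flow-based analysis could be sketched roughly as follows. The talk says only that flows are keyed on victim IP address and protocol with a minimum-packet threshold and a timeout; the parameter values and function shape below are assumptions for illustration.

```python
FLOW_TIMEOUT = 5 * 60  # seconds of silence that ends a flow (assumed value)
MIN_PACKETS = 100      # minimum backscatter packets to count an attack (assumed value)

def extract_attacks(packets):
    """packets: iterable of (timestamp, victim_ip, protocol), sorted by time.
    Returns a list of (victim_ip, protocol, start, end, count) attack flows."""
    active = {}   # (victim_ip, protocol) -> [start, last_seen, count]
    attacks = []

    def close(key):
        start, last, count = active.pop(key)
        if count >= MIN_PACKETS:          # drop flows below the threshold
            ip, proto = key
            attacks.append((ip, proto, start, last, count))

    for ts, ip, proto in packets:
        key = (ip, proto)
        # A long gap ends the previous flow for this victim/protocol pair.
        if key in active and ts - active[key][1] > FLOW_TIMEOUT:
            close(key)
        if key not in active:
            active[key] = [ts, ts, 0]
        active[key][1] = ts
        active[key][2] += 1

    for key in list(active):              # flush flows still open at trace end
        close(key)
    return attacks
```

For example, 150 backscatter packets from one victim at 1-second intervals would yield a single attack flow, while a 10-packet trickle would fall below the threshold and be discarded.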

  13. Analysis
  • Summary statistics
    » Time behavior, protocol, duration, rate
  • Victim categorization
    » DNS, top-level domain (TLD), AS
    » Popularity

  14. Attack Breakdown

                        Week 1   Week 2   Week 3
    Attacks               4173     3878     4754
    Victim IPs            1942     1821     2385
    Victim prefixes       1132     1085     1281
    Victim ASes            585      575      677
    Victim DNS domains     750      693      876
    Victim DNS TLDs         60       62       71

  15. Attacks Over Time
  (Surprisingly uniform, no diurnal effects)

  16. Periodic Attack (Daily)
  (Every day like clockwork)

  17. Punctuated Attack (1 min)
  (Fine-grained behavior as well)

  18. Attack Protocols/Services
  • Protocols
    » Mostly TCP (90-94% of attacks)
    » A few large ICMP floods (up to 43% of packets)
  • Services
    » Most attacks on multiple ports (~80%)
    » A few services (HTTP, IRC) singled out

  19. Attack Duration
  (Most between 3-30 mins; 50% > 10 mins)

  20. Attack Rate
  (50% > 350 pps; most intense is 679,000 pps)

  21. Victim Characterization (DNS)
  • Entire spectrum of commercial businesses
    » Yahoo, CNN, Amazon, etc., and many smaller businesses
  • Evidence that minor DoS attacks are used for personal vendettas
  • 10-20% of attacks target home machines
    » A few very large attacks against broadband
    » Many reverse mappings clearly compromised (e.g. is.on.the.net.illegal.ly and the.feds.cant.secure.their.shellz.ca)
  • 5% of attacks target infrastructure
    » Routers (e.g. core2-core1-oc48.paol.above.net)
    » Name servers (e.g. ns4.reliablehosting.com)

  22. Victim Top-Level Domains
  [Chart: percent of attacks by victim top-level domain (unknown, net, com, ro, br, org, edu, ca, de, uk) for Weeks 1-3]
  (net == com, edu small, ro and br unusual)

  23. Victim Autonomous Systems
  [Chart: percent of attacks by victim autonomous system for Weeks 1-3]
  (No single AS or set of ASes is targeted; long tail, too)

  24. Victim Popularity
  [Chart: percent of victims (log scale) vs. number of attacks]
  (Most victims attacked once, but a few are unfortunate favorites)

  25. Validation
  • How do we know we are seeing backscatter from attacks, and not just funky traffic to our network?
  • Backscatter not explained by port scanning
    » 98% of backscatter packets do not cause a response
  • Repeated experiment with independent monitor (3 /16's from Vern Paxson)
    » Only captured TCP SYN/ACK backscatter
    » 98% inclusion into larger dataset
  • Matched to actual attacks detected by Asta Networks on a large backbone network

  26. Summary
  • Lots of attacks, some very large
    » >12,000 attacks against >5,000 targets over three weeks
    » Most < 1,000 pps, but some over 600,000 pps
  • Everyone is a potential target
    » Targets not dominated by any TLD, 2LD, or AS
    » Targets include large e-commerce sites, mid-sized businesses, ISPs, government, universities, and end-users
    » Something weird is happening in Romania
  • New attack "styles"
    » Punctuated/periodic attacks
    » Attacks against infrastructure targets & broadband

  27. Code Red Worm
  • In July, David Moore used the same technique to track the Code Red worm
    » While collecting backscatter data (no way to predict)
  • Code Red
    » Infects MS IIS web servers via a security hole
    » Once infected, a victim tries to infect other hosts
    » Culminates in a coordinated attack against whitehouse.gov
  • Impact
    » Tremendous amount of popular press
    » FBI warning on second round of the Code Red worm
