Inferring Internet Denial-of-Service Activity - PowerPoint PPT Presentation




SLIDE 1

Inferring Internet Denial-of-Service Activity

Geoffrey M. Voelker
University of California, San Diego
Joint work with David Moore (CAIDA/UCSD) and Stefan Savage (UCSD)

SLIDE 2

October 17, 2001 University of Virginia 2

Simple Question

We were interested in answering a simple question: How prevalent are denial-of-service attacks in the Internet?

SLIDE 3

Anecdotal Data

Press reports, analysts, surveys:

“Losses … could total more than $1.2 billion”

  • Yankee Group report

“38% of security professionals surveyed reported denial of service activity in 2000”

  • CSI/FBI survey
SLIDE 4

Quantitative Data?

Is not available (i.e., no one knows)
Inherently hard to acquire

  • Few content or service providers collect such data
  • If they do, it's usually considered sensitive

Infeasible to collect at Internet scale

  • How can you monitor enough of the Internet to obtain a representative sample?

SLIDE 5

Our Contributions

Backscatter analysis

  • New technique for estimating global denial-of-service activity

First data describing Internet-wide DoS activity

  • ~4,000 attacks per week (>12,000 over 3 weeks)
  • Instantaneous loads above 600k pps
  • Characterization of attacks and victims

Paper appeared this August:

  • Moore, Voelker and Savage, Inferring Internet Denial-of-Service Activity, 2001 USENIX Security

SLIDE 6

Overview

  • Describe backscatter analysis
  • Experimental setup
  • Series of analyses and attack characterizations
  • Tracking the Code Red Worm

SLIDE 7

Key Idea

Flooding-style DoS attacks

  • e.g. SYN flood, ICMP flood

Attackers spoof source address randomly

  • True of all major attack tools

Victims, in turn, respond to attack packets
Unsolicited responses (backscatter) equally distributed across IP space
Received backscatter is evidence of an attacker elsewhere

SLIDE 8

Backscatter Example

(Figure: the attacker floods victim V with SYN packets whose source addresses are spoofed to random hosts B, C, D; V answers each SYN with a SYN+ACK, so B, C and D receive SYN+ACK backscatter)
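The example above can be checked numerically. Below is a minimal simulation sketch (not from the paper; the function name and packet count are illustrative): an attacker sends m SYN packets with uniformly random 32-bit spoofed sources, and a monitor covering a /8 counts the SYN+ACK responses that land in its block.

```python
import random

def simulate_backscatter(m_packets, monitored_prefix_bits=8, seed=1):
    """Count how many SYN+ACK responses from the victim land in a
    monitored block, given uniformly random spoofed source addresses."""
    rng = random.Random(seed)
    block_size = 2 ** (32 - monitored_prefix_bits)  # a /8 covers 2^24 addresses
    seen = 0
    for _ in range(m_packets):
        spoofed_src = rng.getrandbits(32)  # attacker picks a random 32-bit source
        # The victim's SYN+ACK goes back to spoofed_src; the monitor sees it
        # iff spoofed_src falls inside its block (here: the first 2^24 addresses).
        if spoofed_src < block_size:
            seen += 1
    return seen

observed = simulate_backscatter(1_000_000)
expected = 1_000_000 / 256  # with a /8, E(X) = m / 256
print(observed, expected)
```

For a million attack packets the monitor expects roughly 3,900 backscatter packets, which is why even a single /8 yields a usable sample of global activity.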

SLIDE 9

Backscatter Analysis

Monitor a block of n IP addresses
Expected # of backscatter packets given an attack of m packets:

  E(X) = nm / 2^32

Extrapolated attack rate R is a function of the measured backscatter rate R':

  R ≥ (2^32 / n) R'
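The two formulas, expected backscatter E(X) = nm / 2^32 and the extrapolated rate R ≥ (2^32 / n) R', translate directly into code; this is a sketch with hypothetical function names, using the /8 monitor (n = 2^24) from the study:

```python
def expected_backscatter(n_monitored, m_attack_packets):
    """E(X) = n*m / 2^32: expected backscatter packets seen by a monitor
    covering n of the 2^32 IPv4 addresses for an attack of m packets."""
    return n_monitored * m_attack_packets / 2**32

def extrapolated_rate(observed_rate, n_monitored):
    """Lower bound on the victim's actual attack rate R given the
    backscatter rate R' measured at the monitor: R >= (2^32 / n) * R'."""
    return (2**32 / n_monitored) * observed_rate

n = 2**24  # a /8 network, as in the study
print(expected_backscatter(n, 10_000))  # -> 39.0625 backscatter packets
print(extrapolated_rate(1.0, n))        # 1 pkt/s observed -> R >= 256 pps
```

The 2^32/n factor is 256 for a /8, so every backscatter packet per second seen at the monitor implies at least 256 attack packets per second at the victim.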

SLIDE 10

Assumptions and Biases

Address uniformity

  • Ingress filtering, reflectors, etc. cause us to underestimate # of attacks
  • Can bias rate estimation (can we test uniformity?)

Reliable delivery

  • Packet losses, server overload & rate limiting cause us to underestimate attack rates/durations

Backscatter hypothesis

  • Can be biased by purposeful unsolicited packets
    » Port scanning (minor factor at worst in practice)
  • Do we detect backscatter at multiple sites?

SLIDE 11

Experimental Setup

(Figure: a monitor with a large disk observes a quiescent /8 network, 2^24 addresses, attached to the Internet)

SLIDE 12

Methodology

Collected three weeks of traces (February 2001)
Analyzed trace data from two perspectives

Flow-based analysis (categorical)

  • Number, duration, kinds of attacks
  • Keyed on victim IP address and protocol
  • Flow duration defined by explicit parameters (min threshold, timeout)

Event-based analysis (intensity)

  • Rate, intensity over time
  • Attack event: backscatter packets from an IP address in a 1-minute window
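The flow-based analysis can be sketched as follows. This is an illustrative reconstruction, not the paper's code; the threshold and timeout values here are hypothetical placeholders for the explicit parameters the methodology mentions.

```python
FLOW_TIMEOUT = 5 * 60  # seconds of silence that close a flow (hypothetical value)
MIN_PACKETS = 100      # minimum backscatter packets to count as an attack (hypothetical)

def flows_from_trace(packets, timeout=FLOW_TIMEOUT, min_packets=MIN_PACKETS):
    """Group backscatter packets into attack flows keyed on
    (victim IP, protocol); a flow ends after `timeout` idle seconds.
    `packets` is an iterable of (timestamp, victim_ip, protocol)."""
    open_flows = {}  # (victim, proto) -> [start_ts, last_ts, packet_count]
    finished = []
    for ts, victim, proto in sorted(packets):
        key = (victim, proto)
        flow = open_flows.get(key)
        if flow and ts - flow[1] > timeout:  # idle too long: close the flow
            finished.append(flow + [key])
            flow = None
        if flow is None:
            open_flows[key] = [ts, ts, 1]    # start a new flow
        else:
            flow[1] = ts                     # extend the current flow
            flow[2] += 1
    finished.extend(f + [k] for k, f in open_flows.items())
    # Keep only flows large enough to count as attacks (the min threshold).
    return [f for f in finished if f[2] >= min_packets]
```

Keying on the victim's address works because backscatter packets carry the victim as their *source*; the event-based analysis would instead bucket the same packets into fixed 1-minute windows.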

SLIDE 13

Analysis

  • Summary statistics
  • Time behavior
  • Protocol
  • Duration
  • Rate
  • Victim categorization
    » DNS, top-level domain (TLD), AS
    » Popularity

SLIDE 14

Attack Breakdown

                       Week 1   Week 2   Week 3
Attacks                 4,173    3,878    4,754
Victim IPs              1,942    1,821    2,385
Victim prefixes         1,132    1,085    1,281
Victim DNS domains        750      693      876
Victim DNS TLDs            60       62       71
Victim ASes               585      575      677

SLIDE 15

Attacks Over Time

(Surprisingly uniform, no diurnal effects)

SLIDE 16

Periodic Attack (Daily)

(Every day like clockwork)

SLIDE 17

Punctuated Attack (1 min)

(Fine-grained behavior as well)

SLIDE 18

Attack Protocols/Services

Protocols

  • Mostly TCP (90-94% of attacks)
  • A few large ICMP floods (up to 43% of packets)

Services

  • Most attacks on multiple ports (~80%)
  • A few services (HTTP, IRC) singled out

SLIDE 19

Attack Duration

(50% > 10 mins) (Most between 3-30 mins)

SLIDE 20

Attack Rate

(50% > 350 pps, most intense is 679,000 pps)

SLIDE 21

Victim Characterization (DNS)

Entire spectrum of commercial businesses

  • Yahoo, CNN, Amazon, etc. and many smaller businesses

Evidence that minor DoS attacks are used for personal vendettas

  • 10-20% of attacks target home machines
  • A few very large attacks against broadband
  • Many reverse mappings clearly compromised (e.g. is.on.the.net.illegal.ly and the.feds.cant.secure.their.shellz.ca)

5% of attacks target infrastructure

  • Routers (e.g. core2-core1-oc48.paol.above.net)
  • Name servers (e.g. ns4.reliablehosting.com)

SLIDE 22

Victim Top-Level Domains

(Figure: percent of attacks by top-level domain for Weeks 1-3: unknown, net, com, ro, br, org, edu, ca, de, uk)

(net == com, edu small, ro and br unusual)

SLIDE 23

Victim Autonomous Systems

(Figure: percent of attacks by victim autonomous system for Weeks 1-3; ASes include STARNETS, NOROUTE (*), ALTERNET-AS, HOME-NET, EMBRATEL-BR, RDSNET, NETSAT-AS, TELEBAHIA, SPRINTLINK, QWEST, TELIANET-SE, TOPEDGE, BHNET, ECOSOFT and others)

(No single AS or set of ASes is targeted; there is a long tail, too)

SLIDE 24

Victim Popularity

(Figure: percent of victims, log scale, vs. number of attacks per victim)

(Most victims attacked once, but a few are unfortunate favorites)

SLIDE 25

Validation

How do we know we are seeing backscatter from attacks, and not just funky traffic to our network?

Backscatter not explained by port scanning

  • 98% of backscatter packets do not cause a response

Repeated experiment with independent monitor (3 /16's from Vern Paxson)

  • Only captured TCP SYN/ACK backscatter
  • 98% inclusion into larger dataset

Matched to actual attacks detected by Asta Networks on a large backbone network
SLIDE 26

Summary

Lots of attacks, some very large

  • >12,000 attacks against >5,000 targets in three weeks
  • Most < 1,000 pps, but some over 600,000 pps

Everyone is a potential target

  • Targets not dominated by any TLD, 2LD or AS
    » Targets include large e-commerce sites, mid-sized businesses, ISPs, government, universities and end-users
  • Something weird is happening in Romania

New attack “styles”

  • Punctuated/periodic attacks
  • Attacks against infrastructure targets & broadband

SLIDE 27

Code Red Worm

In July, David Moore used the same technique to track the Code Red worm

  • While collecting backscatter data (no way to predict)

Code Red

  • Infects MS IIS Web servers via a security hole
  • Once infected, victim tries to infect other hosts
  • Culminates in a coordinated attack against whitehouse.gov

Impact

  • Tremendous amount of popular press
    » FBI warning on second round of Code Red Worm

SLIDE 28

Monitoring Code Red

Victims randomly choose an IP address to infect

  • Try to establish an HTTP connection to that address
  • 1/256th of connection requests land in our /8 (our looking glass)
  • Easy to distinguish from backscatter

As with backscatter, can determine

  • Who: set of IP addresses of infected victims
    » Breakdown by DNS, TLD, AS, etc.
  • Infection rate: real-time spread of worm across the Internet
  • Patch rate: real-time patching, shutdown of infected hosts
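The 1/256 figure follows from the same address arithmetic as backscatter: a /8 block covers 2^24 of the 2^32 IPv4 addresses. A small sketch (the helper function is hypothetical, for illustration):

```python
# A /8 "looking glass" covers 2^24 of the 2^32 IPv4 addresses, so a worm
# probing uniformly random addresses sends 1/256 of its probes our way.
fraction_seen = 2**24 / 2**32  # = 1/256

def global_probe_rate(observed_rate, fraction=fraction_seen):
    """Extrapolate the worm's Internet-wide probe rate from the rate
    observed at the /8 monitor (hypothetical helper; same logic as the
    backscatter rate extrapolation)."""
    return observed_rate / fraction

print(fraction_seen)               # -> 0.00390625 (i.e. 1/256)
print(global_probe_rate(1_000.0))  # 1,000 probes/s seen -> 256,000/s worldwide
```

The same scaling lets the observed set of probing source addresses stand in for the global infected population.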

SLIDE 29

Rate of Infection

359,104 hosts were compromised in approximately 13 hrs.

SLIDE 30

More Info

Backscatter

  • http://www.caida.org/outreach/papers/backscatter/

Code Red

  • http://www.caida.org/analysis/security/code-red/

SLIDE 31

Protocol Breakdown (1 week)

Backscatter protocol       Attacks       BS Packets (x1000)
TCP (RST ACK)              2,027 (49)    12,656 (25)
ICMP (Host Unreachable)      699 (17)     2,892 (5.7)
ICMP (Other)                 486 (12)       580 (1.1)
ICMP (TTL Exceeded)          453 (11)    31,468 (62)
TCP (SYN ACK)                378 (9.1)      919 (1.8)
TCP (RST)                    128 (3.1)    2,309 (4.5)
TCP (Other)                    2 (0.05)       3 (0.01)

(percent of total in parentheses)

SLIDE 32

Attack Protocol Breakdown

Attack protocol    Attacks       BS Packets (x1000)
TCP                3,902 (94)    28,705 (56)
UDP                   99 (2.4)       66 (0.13)
ICMP                  88 (2.1)   22,020 (43)
Proto 0               65 (1.6)      25 (0.05)
Other                 19 (0.46)     12 (0.02)

(percent of total in parentheses)