
SLIDE 1

Notos: Building a Dynamic Reputation System for DNS

Manos Antonakakis, Roberto Perdisci, David Dagon, Wenke Lee, and Nick Feamster

College of Computing, Georgia Institute of Technology, Atlanta, Georgia

ONR MURI Review Meeting, June 10, 2010

SLIDE 2

Report Documentation Page (Standard Form 298, Rev. 8-98, OMB No. 0704-0188)

  • 1. REPORT DATE: 10 JUN 2010
  • 3. DATES COVERED: 00-00-2010 to 00-00-2010
  • 4. TITLE AND SUBTITLE: Notos: Building a Dynamic Reputation System for DNS
  • 7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES): Georgia Institute of Technology, College of Computing, Atlanta, GA, 30332
  • 12. DISTRIBUTION/AVAILABILITY STATEMENT: Approved for public release; distribution unlimited
  • 13. SUPPLEMENTARY NOTES: MURI Review, June 2010. U.S. Government or Federal Rights License
  • 16. SECURITY CLASSIFICATION OF report, abstract, and this page: unclassified
  • 17. LIMITATION OF ABSTRACT: Same as Report (SAR)
  • 18. NUMBER OF PAGES: 17
SLIDE 3

11/4/09 ONR MURI Review 2

Problems with Static Blacklisting

  • Malware families utilize a large number of domains for discovering the “up-to-date” C&C address
    – Examples are the Sinowal, Bobax, and Conficker bot families, which generate tens of thousands of new C&C domains every day
    – IP-based blocking technologies (dynamic or not) cannot keep up with the number of IP addresses that the C&C domains typically use
    – DNSBL-based technologies cannot keep up with the volume of new domain names a botnet uses every day

  • Detecting and blocking such agile botnets cannot be achieved with the current state of the art

SLIDE 4

Outline

  • Notos
    – Notations, passive DNS trends, and anchor-zones
    – Network-based profile modeling
    – Network- and zone-based profile clustering
    – Reputation function
    – System implementation
    – Results

  • Conclusions and Future Work
SLIDE 5

Notos

  • Network- and zone-based features that capture the characteristics of resource provisioning, usage, and management by domains
    – Learn the models of legitimate and malicious domains

  • Classify new domains with a very low FP rate (0.3846%) and a high TP rate (96.8%)
    – Days or even weeks before they appear on static blacklists

SLIDE 6

Notation & Terminology

  • Resource Record (RR)
    – www.example.com → 192.0.32.10

  • 2nd-level domain (2LD) and 3rd-level domain (3LD)
    – For the domain name www.example.com, the 2LD is example.com and the 3LD is www.example.com

  • Related Historic IPs (RHIPs)
    – All routable IPs that have historically been mapped to the domain name in the RR, or to any domain name under its 2LD or 3LD

  • Related Historic Domains (RHDNs)
    – All fully qualified domain names (FQDNs) that have historically been linked to the IP in the RR, its corresponding CIDR block, or its AS
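To make the notation concrete, here is a minimal sketch of how these sets could be derived. The `pdns_records` table is invented, and real RHIPs/RHDNs would also expand over CIDR blocks and ASes, which this sketch omits:

```python
# Hypothetical passive-DNS history: (fully qualified domain name, IP) pairs.
pdns_records = [
    ("www.example.com", "192.0.32.10"),
    ("mail.example.com", "192.0.32.11"),
    ("www.example.com", "198.51.100.7"),
    ("cdn.other.net", "192.0.32.10"),
]

def nld(domain: str, n: int) -> str:
    """Return the n-th-level domain, e.g. n=2 gives the 2LD."""
    return ".".join(domain.split(".")[-n:])

def rhips(domain: str, records) -> set:
    """All IPs historically mapped to the domain or to any
    name under its 2LD (CIDR/AS expansion omitted)."""
    suffix = nld(domain, 2)
    return {ip for d, ip in records
            if d == suffix or d.endswith("." + suffix)}

def rhdns(ip: str, records) -> set:
    """All FQDNs historically linked to the IP."""
    return {d for d, i in records if i == ip}

print(nld("www.example.com", 2))                 # example.com
print(sorted(rhips("www.example.com", pdns_records)))
print(sorted(rhdns("192.0.32.10", pdns_records)))
```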

SLIDE 7

Passive DNS data

  • Successful DNS resolutions that can be observed in a given network
  • Data set has traffic from two ISP sensors, one on the west coast and one on the east coast, plus data from SIE
  • We observe that different classes of zones demonstrate different passive DNS behaviors
  • The number of new domain names and IPs we observe every day is in the range of 150,000 to 200,000
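The daily counts above come from deduplicating resource records seen at the sensors. A toy sketch of that bookkeeping, with an invented resolution stream, might look like:

```python
from collections import defaultdict

# Hypothetical stream of (day, domain, ip) resolutions from the sensors.
stream = [
    (1, "a.example.com", "192.0.2.1"),
    (1, "a.example.com", "192.0.2.1"),   # duplicate resolution, same RR
    (1, "b.example.org", "198.51.100.5"),
    (2, "a.example.com", "192.0.2.2"),   # same name, new IP -> new RR
]

seen = set()
new_rrs_per_day = defaultdict(int)
for day, name, ip in stream:
    rr = (name, ip)
    if rr not in seen:        # count an RR only the first day it appears
        seen.add(rr)
        new_rrs_per_day[day] += 1

print(dict(new_rrs_per_day))  # {1: 2, 2: 1}
```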

SLIDE 8

Passive DNS trends

Anchor classes in pDNS: Akamai, CDN, Popular, DYNDNS, and Common

[Figure: passive DNS trends, panels (a)–(h); each growth panel plots unique DNs, unique IPs, and new RRs over time (days):
(a) Unique RRs in the two ISP sensors (per day)
(b) New RRs growth in the pDNS DB for all zones
(c) Akamai class growth
(d) CDN class growth
(e) Pop class growth
(f) Dynamic DNS class growth
(g) Common class growth
(h) CDF of RR growth for all classes]

SLIDE 9

Features

Notos computes three feature vectors for an RR, based on its RHIPs, RHDNs, and evidence data. The analysis of these feature vectors is forwarded to the reputation engine. The three vectors are the network-based feature vector [18], the zone-based feature vector [17], and the evidence-based feature vector [6].
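As an illustration of the style of features involved (this is a guess at the flavor of the network-based vector, not the paper's exact feature set), one could compute per-domain statistics over its RHIPs like so:

```python
def network_features(rhips, asn_of):
    """Illustrative network-based features over a domain's RHIPs:
    counts of distinct IPs, /24 prefixes (a stand-in for BGP
    prefixes), and ASes."""
    ips = set(rhips)
    prefixes = {tuple(ip.split(".")[:3]) for ip in ips}
    asns = {asn_of(ip) for ip in ips}
    return {"n_ips": len(ips), "n_prefixes": len(prefixes), "n_asns": len(asns)}

# Toy IP -> AS mapping (made up for the example).
asn_table = {"192.0.32.10": 40528, "192.0.32.11": 40528, "198.51.100.7": 64500}
feats = network_features(asn_table.keys(), asn_table.get)
print(feats)  # {'n_ips': 3, 'n_prefixes': 2, 'n_asns': 2}
```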

SLIDE 10

Network Profile Modeling

  • Train a meta-classifier based on the 5 anchor classes
  • The network feature vector of a domain name d is translated into the network modeling output NM(d)
  • NM(d) is a feature vector composed of the confidence scores for each anchor class
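A hedged sketch of the idea: the trained meta-classifier is replaced here by a toy nearest-centroid model (the centroids and the 2-D feature space are invented), so NM(d) is just a per-anchor-class confidence vector that sums to 1:

```python
import math

# Toy anchor-class centroids in a 2-D feature space, standing in
# for the real network-feature space and trained meta-classifier.
centroids = {
    "Akamai": (0.9, 0.1), "CDN": (0.7, 0.3), "Popular": (0.5, 0.5),
    "DynDNS": (0.2, 0.8), "Common": (0.4, 0.2),
}

def nm(feature_vec):
    """NM(d): one confidence score per anchor class, here a softmax
    over negative distances to each class centroid."""
    exps = {c: math.exp(-math.dist(feature_vec, mu))
            for c, mu in centroids.items()}
    total = sum(exps.values())
    return {c: e / total for c, e in exps.items()}

scores = nm((0.85, 0.15))
print(max(scores, key=scores.get))  # Akamai
```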

SLIDE 11

Domain Clustering

The network- and zone-based feature vectors of a domain d are used to produce the domain clustering output DC(d). In this step we are able to characterize unknown domains within clusters based on already-labeled domains in close proximity. DC(d) is a 5-feature vector characterizing the position of d in its cluster.
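As a rough illustration (the `labeled` set, the 2-D embedding, and this particular feature summary are all invented stand-ins for the paper's definition of DC(d)), the position of an unknown domain relative to nearby labeled domains could be summarized as:

```python
import math

# Labeled domains embedded in a toy 2-D network/zone feature space.
labeled = {
    "akamai.net": ((0.9, 0.1), "benign"),
    "google.com": ((0.8, 0.2), "benign"),
    "badcnc.info": ((0.1, 0.9), "malicious"),
    "fluxer.biz": ((0.2, 0.8), "malicious"),
}

def dc(feature_vec, k=3):
    """Stand-in for DC(d): distance statistics over the k nearest
    labeled domains, plus the fraction of them that are malicious."""
    near = sorted(labeled.values(),
                  key=lambda v: math.dist(feature_vec, v[0]))[:k]
    dists = [math.dist(feature_vec, p) for p, _ in near]
    mal = sum(1 for _, lab in near if lab == "malicious") / k
    return {"min_dist": min(dists), "mean_dist": sum(dists) / len(dists),
            "max_dist": max(dists), "k": k, "frac_malicious": mal}

print(dc((0.15, 0.85)))  # unknown domain sitting near the malicious cluster
```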

SLIDE 12

Reputation Function

  • Each domain d in our dataset is transformed by Notos into three feature vectors: NM(d), DC(d), and EV(d) (the evidence profile output); together these assemble the reputation vector v(d)
  • The reputation function f(v(d)) assigns the domain name d a score in [0,1]
  • The reputation function is a statistical classifier (a decision tree with Logistic Boost, chosen after model selection)
  • The reputation function is trained using labeled domain data
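A minimal sketch of the assembly: v(d) concatenates the three profile outputs, and a toy logistic model (with made-up weights) stands in for the trained LogitBoost decision tree, since either maps v(d) into [0,1]:

```python
import math

def reputation_vector(nm, dc, ev):
    """v(d): concatenation of the three profile outputs."""
    return list(nm) + list(dc) + list(ev)

def f(v, weights, bias=0.0):
    """Toy stand-in for the trained reputation function; a logistic
    model also yields a score in [0,1] (1 = malicious end here)."""
    z = bias + sum(w * x for w, x in zip(weights, v))
    return 1 / (1 + math.exp(-z))

# Invented profile outputs and weights, purely for illustration.
v = reputation_vector(nm=[0.1, 0.9], dc=[0.8], ev=[1.0])
score = f(v, weights=[-2.0, 3.0, 1.5, 2.0])
print(0.0 <= score <= 1.0)  # True
```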

SLIDE 13

Operational Model of Notos

  • Notos utilizes the off-line mode to train the classifiers, build the clusters, and train the reputation function
  • In the in-line mode, Notos assigns reputation scores to new RRs observed at the monitoring point

SLIDE 14

[Figure: 2-D visualization of the domain clusters (numbered 1–9). Recoverable cluster labels include: Malicious; Conficker sinkhole {8}; Akamai (net) and NTP; ISP reverse lookups {0–7}; Malicious {8}; Akamai (net); Dynamic DNS; Akadns and Google {0–5, 8}; Malicious {6, 7, 9}; Dynamic DNS {7, 9}; CDNs and Akamaitech with few spam/malicious; Malicious with few Popular and few CDN; Live.com, Facebook, Myspace, malware, and CDNs.]

SLIDE 15

Results from the Reputation Function

FP%=0.3849% and TP%=96.8%

[Figure: ROC curve (false positive rate vs. true positive rate) and TP% over all positives vs. detection threshold.]

SLIDE 16

Results from the Reputation Function (cont’d)

Number of days by which detection occurs earlier than public BLs

[Figure: volume of domain names vs. days after training, panels (a)–(d):
(a) Overall volume of malicious domains
(b) Flux and spam domain names identified
(c) Malware-dropping/trojan, exploit, and rogue AV domain names identified
(d) Botnet domain names identified (Zeus, Koobface, various bots)]

SLIDE 17

Tech Transfer

  • Damballa is actively evaluating Notos
  • ISPs are interested in having us extend this line of research
  • DNS vendors and other network operators
    – Have been spending millions of dollars and years trying to build similar systems, but fail to match Notos’ capability and performance
    – Are trying to acquire Notos technologies

SLIDE 18

Conclusions and Future Work

  • Conclusions:
    – Combining network, zone, and evidence features provides the ability to dynamically associate unknown domains with known domains and networks

  • Benefits: with limited labeled domains we can identify new malicious ones, much sooner than BLs

  • Future Work:
    – Targeted detection: use an additional clustering step based on association with a specific fraudulent domain-name class (RBN, Zeus, etc.) to enable targeted detection
    – Combine Notos with spam/flux detection systems