

SLIDE 1

TIDE: Proactive threat detection

Olivier van der Toorn <o.i.vandertoorn@utwente.nl>
2019-06-20

University of Twente, Design and Analysis of Communication Systems

SLIDE 2

Introduction

SLIDE 3

Who am I

  • Ph.D. student at the University of Twente
  • System administrator @SNT (ftp.nl.debian.org / ftp.snt.utwente.nl)
  • First FIRST conference

Contact details: tide-project.nl

  • o.i.vandertoorn@utwente.nl

SLIDE 4

Introduction

SLIDE 5

Introduction

Is there a better way?

  • Detection approaches are typically reactive, acting only as the attack happens…
  • They are based on passive measurement
  • Proof of suspicious activity is required

SLIDE 7

We propose

Proactive threat detection!

  • Transition towards proactive security
  • Use active measurement to pick up on clues of upcoming attacks
  • Proactive threat detection gives an early warning

SLIDE 9

Why do we propose this?

SLIDE 10

Why do we propose this?

We want to improve attack detection:

  • Proactive threat detection gives us, the defenders, a better chance against attacks
  • In the field of DNS this approach works; more on this later

The advantages of a proactive approach are:

  • Unbiased towards your own network (depending on the underlying measurement)
  • A possible time advantage (an alert before the attack happens)

SLIDE 16

What do we need to do proactive threat detection?

Three components:

  • Data from active measurements (DNS, ICMP, etc.)
  • Knowledge about what you are measuring (what sets the abnormal apart from the normal?)
  • Ability to use the detection results

SLIDE 20

Use cases

SLIDE 21

Use cases

  Use case               Does proactive security work?
  Snowshoe spam domains  Yes!
  DDoS domains           Maybe
  DNS TXT records        Maybe
  Combo-squat domains    No

SLIDE 26

OpenINTEL: How we measure

  • OpenINTEL performs an active measurement, sending a fixed set of queries for all covered domains once every 24 hours
  • We do this at scale, covering over 216 million domains per day:
  • gTLDs: .com, .net, .org, .info, .mobi, .aero, .asia, .name, .biz, .gov + almost 1200 "new" gTLDs (.xxx, .xyz, .amsterdam, .berlin, ...)
  • ccTLDs: .nl, .se, .nu, .ca, .fi, .at, .dk, .ru, .рф, .us, <your ccTLD here?>
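The measurement loop can be sketched as follows. This is a minimal illustration, not OpenINTEL's actual implementation: the record types in the query set and the stubbed resolver are assumptions; the real system issues live DNS queries at far larger scale.

```python
# Sketch of an OpenINTEL-style daily measurement round: the same fixed
# set of queries is sent for every covered domain, once every 24 hours.
# QUERY_TYPES and resolve() are illustrative stand-ins.

QUERY_TYPES = ["SOA", "NS", "A", "AAAA", "MX", "TXT"]  # fixed query set

def resolve(domain: str, rtype: str) -> list[str]:
    """Stub resolver; a real deployment would send an actual DNS query here."""
    return []

def measure(domains: list[str]) -> dict[str, dict[str, list[str]]]:
    """One measurement round: identical queries for every domain."""
    return {
        domain: {rtype: resolve(domain, rtype) for rtype in QUERY_TYPES}
        for domain in domains
    }

snapshot = measure(["example.com", "example.org"])
```

Because the query set is fixed, every daily snapshot has the same shape, which is what makes day-over-day comparison across hundreds of millions of domains tractable.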

SLIDE 27

Use case: Snowshoe spam

SLIDE 32

Snowshoe Spam

Internet Spam:

  • few hosts
  • many messages per host

Snowshoe Spam:

  • many hosts
  • few messages per host
SLIDE 33

Snowshoe spam: Hypothesis

While snowshoe spammers are hard to detect, but still leave a trace in the DNS.

10

slide-34
SLIDE 34

Snowshoe spam: Hypothesis

While snowshoe spammers are hard to detect, but still leave a trace in the DNS. Snowshoe spam + SPF

10

slide-35
SLIDE 35

Snowshoe spam: Hypothesis

While snowshoe spammers are hard to detect, but still leave a trace in the DNS. Snowshoe spam + SPF Many hosts + a DNS record for each host or a long SPF record

10

slide-36
SLIDE 36

Snowshoe spam: Hypothesis

While snowshoe spammers are hard to detect, but still leave a trace in the DNS. Snowshoe spam + SPF Many hosts + a DNS record for each host or a long SPF record Domain with many records or long SPF records

10

slide-37
SLIDE 37

Snowshoe spam: Hypothesis

While snowshoe spammers are hard to detect, but still leave a trace in the DNS. Snowshoe spam + SPF Many hosts + a DNS record for each host or a long SPF record Domain with many records or long SPF records Active DNS measurements are a good way to detect snowshoe spam domains.

10

SLIDE 38

Snowshoe spam: Methodology

  OpenINTEL (DNS data source) → Machine Learning (processing) → Realtime Blackhole List (storage) → SURFnet (validation)

SLIDE 39

Snowshoe spam: Datasets & Features

37 features:

  • Simple: the number of MX addresses
  • Complex: the number of IP addresses inside an SPF record

These features are not computed for every domain in OpenINTEL.
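As an illustration of what such features look like, here is a toy extractor for a few of them. The input layout (lists of strings under "MX" and "TXT") and the feature names are assumptions for illustration; the actual system uses 37 features.

```python
import re

def extract_features(records: dict) -> dict:
    """Toy feature extractor over one domain's DNS records.
    Mirrors the two examples above; not the real 37-feature set."""
    # find the SPF policy among the TXT records, if any
    spf = next((r for r in records.get("TXT", []) if r.startswith("v=spf1")), "")
    return {
        "num_mx": len(records.get("MX", [])),                # simple feature
        "num_spf_ips": len(re.findall(r"ip[46]:\S+", spf)),  # complex feature
        "spf_length": len(spf),
    }

feats = extract_features({
    "MX": ["10 mail.example.com."],
    "TXT": ["v=spf1 ip4:192.0.2.0/24 ip4:198.51.100.7 -all"],
})
```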


SLIDE 43

Snowshoe spam: Long Tail Analysis

SLIDE 44

Snowshoe spam: Methodology

  OpenINTEL (DNS data source) → Machine Learning (processing) → Realtime Blackhole List (storage) → SURFnet (validation)

SLIDE 45

Snowshoe spam: Results

[Figure: CDFs of the number of A records and of MX records for spam vs. ham domains; annotated values: 11.2 and 16.6 A records, 77.0 MX records]

SLIDE 46

Snowshoe spam: Example

          Domain         A records  MX records
  (ham)   google.com     1          5
  (spam)  giftiedan.com  61         1
  (spam)  twirlmore.com  1          253
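The asymmetry in the table can be captured by a toy rule like the one below. The threshold is made up purely for illustration; the real detector is a trained classifier over 37 features, not a single cut-off.

```python
def looks_snowshoe(num_a: int, num_mx: int, threshold: int = 20) -> bool:
    """Flag domains with an unusually large number of A or MX records.
    The threshold of 20 is an illustrative assumption."""
    return num_a > threshold or num_mx > threshold

# The three domains from the example table: (A records, MX records)
examples = {"google.com": (1, 5), "giftiedan.com": (61, 1), "twirlmore.com": (1, 253)}
flagged = [d for d, (a, mx) in examples.items() if looks_snowshoe(a, mx)]
```

Even this single-feature view separates the ham domain from the two spam domains, which is why record counts make useful features.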


SLIDE 55

RBL comparison (2 month period)

[Figure: number of detected domains (log scale, 1-100000) vs. detection in advance (days, 10-80); annotated counts: 30705 domains with Δt < 2 days, 1972 with Δt ≥ 2 days, further annotated values 1154, 1105, 971, 949]

SLIDE 56

RBL comparison (9 month period)

[Figure: number of detected domains (log scale) vs. detection in advance (days, up to 180); annotated counts: 57724 (Δt < 2 days), 6710 (Δt ≥ 2 days), 1305, 205]
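The Δt buckets compare when a domain was proactively detected against when it later appeared on the RBL. A minimal sketch of that comparison (the dates are illustrative, not measured values):

```python
from datetime import date

def detection_advance_days(detected: date, blacklisted: date) -> int:
    """Early-warning margin: days between our detection and the RBL listing."""
    return (blacklisted - detected).days

# Hypothetical domain detected a month before it was blacklisted:
dt = detection_advance_days(date(2017, 5, 24), date(2017, 6, 23))
early_warning = dt >= 2  # the comparison splits detections at Δt >= 2 days
```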

SLIDE 57

SURFnet evaluation

[Figure: per-domain timeline over observation dates 2017-05-24 to 2017-07-23 for daadzgam.com, realdrippy.com, coachspoke.com, stillscratch.com, homerope.com and quittradition.com, marking when each domain was detected and when it was blacklisted]

SLIDE 60

SURFnet evaluation

Δt < 2 days

  • 45% of received emails fall in this category
  • 18% of observed domains fall in this category

[Figure: per-domain timeline of detected vs. blacklisted periods]

SLIDE 61

SURFnet evaluation

Δt ≥ 2 days

  • 17% of received emails fall in this category
  • 26% of observed domains fall in this category

[Figure: per-domain timeline of detected vs. blacklisted periods]

SLIDE 62

SURFnet evaluation

Domain not on an existing blacklist yet:

  • 38% of received emails fall in this category
  • 57% of observed domains fall in this category

[Figure: per-domain timeline of detected vs. blacklisted periods]

SLIDE 63

SURFnet evaluation

  • 41% of emails were received in the purple areas
  • 59% of these emails have not been marked as spam

[Figure: per-domain timeline of detected vs. blacklisted periods]

SLIDE 64

Use case: DDoS domains

SLIDE 65

DDoS domains

In DDoS attacks the amplification factor is important. Domains crafted for DDoS attacks typically have:

  • Many records
  • Long (TXT) records

SLIDE 67

Lifetime of a DDoS domain

A possible methodology could be:

  1. Filter domains with a more-than-average number of records, or a longer-than-average TXT record
  2. Gather the records for the past X days
  3. Determine trend lines
  4. Predict the size of the domain, say, ten days from now
  5. Flag the domain if the predicted size is above a certain threshold

[Figure: estimated ANY response size (bytes) of a domain over time, broken down into A, NS, SOA and TXT records, with observed attacks marked]
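Steps 2-5 of the methodology above can be sketched with a simple least-squares trend fit. The sample sizes and the 2000-byte threshold are made-up values for illustration, not figures from the study.

```python
# Fit a linear trend to a domain's estimated ANY-response size over the
# past days, extrapolate ten days ahead, and flag the domain if the
# predicted size crosses a threshold.

def linear_trend(sizes: list[float]) -> tuple[float, float]:
    """Ordinary least-squares fit of size vs. day index: (slope, intercept)."""
    n = len(sizes)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(sizes) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, sizes)) / sum(
        (x - mean_x) ** 2 for x in xs
    )
    return slope, mean_y - slope * mean_x

def flag_domain(sizes: list[float], horizon: int = 10, threshold: float = 2000.0) -> bool:
    """Predict the response size `horizon` days ahead; flag if above threshold."""
    slope, intercept = linear_trend(sizes)
    predicted = slope * (len(sizes) - 1 + horizon) + intercept
    return predicted > threshold

# A domain growing ~150 bytes/day, currently ~1250 bytes, gets flagged;
# a stable domain does not.
growing = [500.0 + 150.0 * day for day in range(6)]
```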

SLIDE 75

Use case: DNS TXT records

SLIDE 76

DNS TXT records

  • The majority of TXT records are related to email (~70%)
  • 1.2% falls into the 'other' category

[Figure: number of TXT records over time (2015-07 to 2018-07, 20M-80M), categorised as Crypto Coins, Email, Encoded, Miscellaneous, Other, Patterns and Verification]
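A toy categoriser along these lines is shown below. The category names come from the figure; the matching rules are illustrative guesses, not the study's actual classification logic.

```python
import re

# First matching rule wins; anything unmatched lands in 'Other'.
RULES = [
    ("Email", re.compile(r"^v=spf1|^v=DKIM1|^v=DMARC1")),
    ("Verification", re.compile(r"site-verification|domain-verification")),
    ("Encoded", re.compile(r"^[A-Za-z0-9+/]{32,}={0,2}$")),  # base64-looking blob
]

def categorise(txt: str) -> str:
    for name, pattern in RULES:
        if pattern.search(txt):
            return name
    return "Other"

cat_spf = categorise("v=spf1 include:_spf.example.com -all")
cat_tilde = categorise("~")  # the single-character case discussed next
```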


SLIDE 79

DNS TXT records

One of the highlights of this 'other' category is single-character records.

  • More than 278K TXT records consist of a single character
  • The majority contains a ~
  • Almost all of these domains are hosted in the same AS

SLIDE 83

DNS TXT records

SLIDE 84

DNS TXT records

Are these records useful for threat detection?

  • Generally, no
  • The '~' case could be an identifier for domains from a specific AS

SLIDE 85

Use case: Combo-squat domains

SLIDE 86

Combo-squat: What is a combo-squat domain?

Many types of squatting domains (target: utwente.nl):

  Type                       Example
  Typosquatting              utwent.nl
  Combosquatting             utwente-login.nl
  Bitsquatting               utwenpe.nl
  Homograph-based squatting  utvvente.nl
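Combosquatting candidates like utwente-login.nl can be enumerated by combining a trademark with common keywords, the kind of pattern matched when a detector is fed a trademark list. The keyword and TLD lists below are illustrative assumptions.

```python
# Generate combosquatting candidates for one trademark.
KEYWORDS = ["login", "secure", "support", "account"]  # illustrative
TLDS = ["nl", "com"]                                  # illustrative

def combosquat_candidates(trademark: str) -> list[str]:
    """Trademark combined with a keyword, e.g. utwente-login.nl."""
    return [
        f"{trademark}-{kw}.{tld}"
        for kw in KEYWORDS
        for tld in TLDS
    ]

candidates = combosquat_candidates("utwente")
```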

SLIDE 87

Combo-squat: A general approach?

We started out by developing a general machine-learning based detection model. Feeding the detection model a list of trademarks worked a lot better!

  Trademark  Number of domains
  Apple      8751
  Paypal     1241
  Microsoft  711

SLIDE 90

Combo-squat: The problems with a generic approach

However, a larger problem is the lifetime of a combosquat domain.

SLIDE 92

Use cases

Where it works:

  • Snowshoe spam domains

Where it might work:

  • DDoS domains
  • Malicious TXT records

Where it doesn't work:

  • Combo-squat domains

SLIDE 95

Reflection

SLIDE 96

Reflection

What have we learned from these use cases?

  • The data needs to contain hints
  • This approach works for relatively long setup times (in our case >1 day)

SLIDE 101

Improvement?

We realize that our solution is not perfect. We think the "ultimate" solution is to combine passive and active measurements: use proactive threat detection to prime passive approaches.

SLIDE 102

Conclusion

SLIDE 103

Conclusion

We should move towards proactive threat detection.

  • Pick up on clues of an upcoming attack
  • Look beyond your own network

Use the early warning from these methods to feed passive detection approaches.

  • Combine the high level of detail of passive measurements with the time advantage of active measurements

SLIDE 106

Future work

SLIDE 107

Future work

  • Research other areas of attack:
      • DDoS domains
      • C&C domains
      • etc.
  • Collaborate with pDNS @ CERT.at
  • Are there more benefits of combining passive and active (DNS) measurements?

SLIDE 108

Thank you

Thank you for listening! Any questions?

Contact details: tide-project.nl

  • o.i.vandertoorn@utwente.nl