Simulated Penetration Testing: From “Dijkstra” to “Turing Test++” (PowerPoint PPT Presentation)

SLIDE 1

What? Classical Attack Graphs POMDPs MDPs Taxonomy And Now?

Simulated Penetration Testing: From “Dijkstra” to “Turing Test++”

Jörg Hoffmann

June 26, 2015

Simulated Penetration Testing 1/49


SLIDE 12

Details: See paper.

SLIDE 13

Details: See old town.

SLIDE 16

Agenda

1. What is this all about?
2. Classical Planning: The Core Security Model [Lucangeli et al. (2010)]
3. Attack Graphs
4. Towards Accuracy: POMDP Models [Sarraute et al. (2012)]
5. The MDP Middle Ground
6. A Model Taxonomy
7. And Now?

SLIDE 17

Network Hacking

[Network diagram: Internet, Attacker, Router, Firewall; zones DMZ (Web Server, Application Server), SENSITIVE (DB Server), USERS (Workstation)]

SLIDE 21

Penetration Testing (Pentesting)

Pentesting: actively verifying network defenses by conducting an intrusion in the same way an attacker would.

  • Well-established industry (roots back to the 60s).
  • Points out specific dangerous attacks (as opposed to vulnerability scanners).
  • Pentesting tools sold by security companies, like Core Security.
    → Core IMPACT (since 2001); Immunity Canvas (since 2002); Metasploit (since 2003). Run security checks launching exploits.
  • Core IMPACT uses Metric-FF for automation since 2010.

SLIDE 25

Automation

  • Security teams are typically small.
  • Increase testing coverage.
  • The security officer’s “rat race”.

⇒ Simulated Pentesting: Make a model of the network and exploits. Run attack planning on the model to simulate attacks. Running the rat race ≈ update the model, go drink a coffee.

SLIDE 32

The Turing Test

SLIDE 33

The Turing Test++: “Hacking, not Talking!”

Ultimate vision: realistically simulate a human hacker!

  • Yes, hacking is more technical.
  • However: socio-technical attacks, e.g. social network reconnaissance.
  • → Turing Test as a sub-problem of spying on people (e.g. [Huber et al. (2009)]).


SLIDE 40

Simulated Pentesting at Core Security

Core IMPACT system architecture:

[Architecture diagram: Pentesting Framework (Exploits & Attack Modules, Attack Workspace), PDDL Description (Actions, Initial conditions), Planner, Plan; edges labeled transform, transform, execution]

→ In practice, the attack plans are used to point out to the security team where to look.

SLIDE 44

Classical Planning

Definition: A STRIPS planning task is a tuple ⟨P, A, s0, G⟩:
  • P: set of facts (Boolean state variables).
  • A: set of actions a, each a tuple ⟨pre(a), add(a), del(a), c(a)⟩ of precondition, add list, delete list, and non-negative cost.
  • s0: initial state; G: goal.

Definition: A STRIPS planning task’s state space is a tuple ⟨S, A, T, s0, SG⟩:
  • S: set of all states; A: actions as above.
  • T: state transitions (s, a, s′).
  • s0: initial state as above; SG: goal states.

→ Objective: Find a cheapest path from s0 to (a state in) SG.
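The STRIPS semantics above can be sketched in a few lines of Python. This is a toy illustration, not Core Security’s implementation; the one-exploit task and all fact and action names below are invented for the example.

```python
from collections import namedtuple

# An action is the tuple <pre(a), add(a), del(a), c(a)> from the slide.
Action = namedtuple("Action", "name pre add dele cost")

def applicable(action, state):
    # Applicable in s iff pre(a) is a subset of s.
    return action.pre <= state

def apply_action(action, state):
    # Transition: s' = (s \ del(a)) | add(a).
    assert applicable(action, state)
    return (state - action.dele) | action.add

# Invented toy task: one exploit hop from the internet to a web host.
s0 = frozenset({"compromised(internet)"})
exploit = Action("exploit-web",
                 pre=frozenset({"compromised(internet)"}),
                 add=frozenset({"compromised(web)"}),
                 dele=frozenset(),
                 cost=10)
s1 = apply_action(exploit, s0)
print(sorted(s1))  # → ['compromised(internet)', 'compromised(web)']
```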

SLIDE 46

Core Security Attack Planning PDDL

Actions:

(:action HP_OpenView_Remote_Buffer_Overflow_Exploit
  :parameters (?s - host ?t - host)
  :precondition (and (compromised ?s)
                     (connected ?s ?t)
                     (has_OS ?t Windows)
                     (has_OS_edition ?t Professional)
                     (has_OS_servicepack ?t Sp2)
                     (has_OS_version ?t WinXp)
                     (has_architecture ?t I386)
                     (has_service ?t ovtrcd))
  :effect (and (compromised ?t)
               (increase (time) 10)))

Action cost: average execution time. Success statistic against hosts with the same/similar observable configuration parameters.

SLIDE 51

Core Security Attack Planning PDDL, ctd.

Initial state:
  • “connected” predicates: network graph.
  • “has_*” predicates: host configurations.
  • One compromised host: models the internet.

Goal: Compromise one or several goal hosts.

SLIDE 52

Remarks

History: A planning domain “of this kind” (less IT-level, also including physical actions like talking to somebody) was first proposed by [Boddy et al. (2005)]; used as a benchmark in IPC’08 and IPC’11.

The presented encoding was proposed by [Lucangeli et al. (2010)]. Used commercially by Core Security in Core INSIGHT since 2010, running a variant of Metric-FF [Hoffmann (2003)].

Do Core Security’s customers like this? I am told they do. In fact, they like it so much already that Core Security is very reluctant to invest money in making this better . . .

And now: . . . some remarks about the model.

SLIDE 57

Assumption (iii)

:precondition (and (compromised ?s) (connected ?s ?t) (has_OS ?t Windows) (has_OS_edition ?t Professional) (has_OS_servicepack ?t Sp2) (has_OS_version ?t WinXp) (has_architecture ?t I386) (has_service ?t ovtrcd))
:effect (and (compromised ?t) (increase (time) 10))

→ Which of the predicates are static? All except “compromised”.

SLIDE 60

Assumption (iv)

(:action HP_OpenView_Remote_Buffer_Overflow_Exploit
  :parameters (?s - host ?t - host)
  . . .
  :effect (and (compromised ?t) (increase (time) 10)))

→ Are you missing something? There are no delete effects.

The attack is monotonic (growing set of attack assets) = delete-relaxed planning. Metric-FF solves this once in every search state . . . Generating an attack is polynomial-time; generating an optimal attack is NP-complete.
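The monotonicity observation can be made concrete: without delete effects the set of true facts only grows, so a simple fixpoint over the actions computes every reachable attack asset, which is why generating some (not optimal) attack is polynomial-time. A minimal sketch with invented actions; this is not the Metric-FF machinery itself.

```python
# Delete-relaxed reachability: actions are illustrative (name, pre, add)
# tuples with no delete lists. The fixpoint runs at most |facts| rounds,
# each scanning all actions, hence polynomial time.

def relaxed_reachable(actions, s0):
    facts = set(s0)
    changed = True
    while changed:
        changed = False
        for _, pre, add in actions:
            if pre <= facts and not add <= facts:
                facts |= add
                changed = True
    return facts

# Invented three-hop attack chain.
actions = [
    ("exploit-web", {"compromised(internet)"}, {"compromised(web)"}),
    ("exploit-db", {"compromised(web)"}, {"compromised(db)"}),
    ("exploit-ws", {"compromised(db)"}, {"compromised(ws)"}),
]
reach = relaxed_reachable(actions, {"compromised(internet)"})
print("compromised(db)" in reach)  # → True
```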

SLIDE 64

Assumption (v)

:precondition (and (compromised ?s) (connected ?s ?t) (has_OS ?t Windows) (has_OS_edition ?t Professional) (has_OS_servicepack ?t Sp2) (has_OS_version ?t WinXp) (has_architecture ?t I386) (has_service ?t ovtrcd))

→ Which preconditions are not static? Just one: “(compromised ?s)”.

One positive precondition, one positive effect:
  • Optimal attack planning for a single goal host = Dijkstra.
  • Fixed # goal hosts: polynomial-time [Bylander (1994)].
  • Scaling # goal hosts = Steiner tree [Keyder and Geffner (2009)].

SLIDE 69

Concluding Remarks?

Simulated Pentesting at Core Security ≈ Dijkstra in the graph over network hosts, where weighted edges are defined as a function of configuration parameters and available exploits.

Why they use planning & Metric-FF anyway:
  • Extensibility to more fine-grained models of exploits, socio-technical aspects, detrimental side effects.
  • Bounded sub-optimal search to suggest several solutions, not just a single “optimal” one.
  • Quicker & cheaper than building a proprietary solver.
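The “Dijkstra over network hosts” view above can be sketched directly. The host graph, host names, and edge weights below are invented for illustration; in the real setting each edge weight would be derived from the target’s configuration parameters and the available exploits.

```python
import heapq

def dijkstra(edges, source):
    """Cheapest compromise cost from source to every reachable host.
    edges: iterable of (from_host, to_host, weight) triples."""
    graph = {}
    for s, t, w in edges:
        graph.setdefault(s, []).append((t, w))
    dist = {source: 0}
    queue = [(0, source)]
    while queue:
        d, host = heapq.heappop(queue)
        if d > dist.get(host, float("inf")):
            continue  # stale queue entry
        for nxt, w in graph.get(host, []):
            if d + w < dist.get(nxt, float("inf")):
                dist[nxt] = d + w
                heapq.heappush(queue, (d + w, nxt))
    return dist

# Invented host graph: edge weight = cost of the exploit enabling the hop.
edges = [("internet", "web", 10), ("web", "app", 5),
         ("app", "db", 20), ("web", "db", 40)]
print(dijkstra(edges, "internet")["db"])  # → 35
```

The cheapest attack on `db` goes through `app` (10 + 5 + 20 = 35) rather than taking the direct but expensive exploit from `web` (10 + 40 = 50).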


SLIDE 75

Attack Graphs in a Nutshell

Community: Application-oriented security.

Approach: Describe attack actions by preconditions and effects. Identify/give an overview of dangerous action combinations.

Example model:

RSH Connection Spoofing:
  requires with
    Trusted Partner: TP; TP.service is RSH;
    Service Active: SA; SA.service is RSH; . . .
  provides with
    push channel: PSC; PSC.using := RSH;
    remote execution: REX; REX.using := RSH; . . .

SLIDE 77

Attack Graphs in a Nutshell, ctd.

Brief overview of variants:

Who and When?                                              | What?                     | Terminology
Schneier (1999); Templeton and Levitt (2000)               | STRIPS actions            | “attack graph” = action descriptions
Ritchey and Ammann (2000)                                  | BDD model checking        | “attack graph” = state space
Ammann et al. (2002)                                       | “Attacks are monotonic!”  |
Since then, e.g. Ammann et al. (2002); Noel et al. (2009)  | Relaxed planning          | “attack graph” = relaxed planning graph

→ Attack graphs ≈ practical security-analysis tools based on variants of, and analyses on, relaxed planning graphs.
→ An “AI ⇔ attack graphs” community bridge could be quite useful . . .
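The relaxed-planning-graph structure that modern attack graphs amount to can be sketched as a layered fixpoint: fact layers grow monotonically because delete effects are ignored. Actions and fact names below are invented; real attack-graph tools add further analyses on top of this basic layered structure.

```python
# Build the fact layers of a relaxed planning graph: layer i+1 contains
# layer i plus the add effects of every action applicable in layer i.
# Actions are illustrative (name, pre, add) tuples.

def relaxed_planning_graph(actions, s0):
    layers = [set(s0)]
    while True:
        facts = set(layers[-1])
        for _, pre, add in actions:
            if pre <= layers[-1]:
                facts |= add
        if facts == layers[-1]:
            return layers  # fixpoint reached
        layers.append(facts)

# Invented two-hop attack: each layer adds one newly reachable asset.
actions = [
    ("exploit-web", {"compromised(internet)"}, {"compromised(web)"}),
    ("exploit-db", {"compromised(web)"}, {"compromised(db)"}),
]
layers = relaxed_planning_graph(actions, {"compromised(internet)"})
print(len(layers))  # → 3
```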

SLIDE 83

Dimension (B): Action Model

Two major dimensions for simulated pentesting models:
(A) Uncertainty Model: up next.
(B) Action Model: degree of interaction between individual attack components.

Dimension (B) distinction lines:
  • Explicit network graph: actions = “hops from ?s to ?t”; 1 positive precondition, 1 positive effect; subset of compromised hosts.
  • Monotonic actions: the attacker can only gain new attack assets: installed software, access rights, knowledge (e.g. passwords), etc.
  • General actions: no restrictions (STRIPS, in the simplest case); can model detrimental side effects.


SLIDE 86

An Additional Assumption . . .

SLIDE 87

Assumptions (i) and (ii)

  • Known network graph: no uncertainty about network graph topology.
  • Known host configurations: no uncertainty about host configurations.

slide-88
SLIDE 88

What? Classical Attack Graphs POMDPs MDPs Taxonomy And Now?

An Overview Before We Begin . . .

Uncertainty Model, Dimension (A): None: Classical planning. → CoreSec-Classical: Core Security’s model, as seen. Assumptions (i)–(v). Uncertainty of action outcomes: MDPs. → CoreSec-MDP: Minimal extension of CoreSec-Classical. Assumptions (ii)–(viii). Uncertainty of state: POMDPs. → CoreSec-POMDP: Minimal extension of CoreSec-Classical. Assumptions (ii)–(vii).

SLIDE 90

Partially Observable MDP (POMDP)

Definition. A POMDP is a tuple ⟨S, A, T, Ω, O, b0⟩ (respectively, some possibly factored description thereof):
S: states; A: actions; Ω: observations.
T(s, a, s′): probability of reaching state s′ when executing action a in state s.
O(s, a, o): probability of making observation o when executing action a in state s.
b0: initial belief, a probability distribution over S.
→ I’ll discuss optimization objectives later on. For now, assume observable goal states Sg, minimizing undiscounted expected cost-to-goal in a Stochastic Shortest Path (SSP) formulation.
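The belief is updated after every action/observation pair by the standard Bayes filter. A minimal sketch in Python, with an assumed dictionary encoding of T and O (and the common convention of conditioning the observation on the successor state), not from the slides:

```python
# Sketch: exact POMDP belief update, b'(s') ∝ O(s', a, o) · Σ_s T(s, a, s') · b(s).
# T and O are dicts keyed by (s, a, s') and (s', a, o); missing keys mean 0.
# States are enumerated from the prior's support (enough for this illustration).

def belief_update(b, a, o, T, O):
    """b: dict state -> probability. Posterior belief after doing a, observing o."""
    new_b = {}
    for s2 in b:
        new_b[s2] = O.get((s2, a, o), 0.0) * sum(
            T.get((s, a, s2), 0.0) * b[s] for s in b)
    z = sum(new_b.values())  # probability of observing o; must be > 0
    return {s: p / z for s, p in new_b.items()}
```

For instance, a uniform belief over a vulnerable and a safe host, with a probe that reports “open” with probability 0.9 on the vulnerable host and 0.2 on the safe one, shifts to 9/11 on the vulnerable host after one “open” observation.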

SLIDE 92

The Basic Idea [Sarraute et al. (2012)]

SLIDE 93

States

H0-win2000, H0-win2000-p445, H0-win2000-p445-SMB, H0-win2000-p445-SMB-vuln, H0-win2000-p445-SMB-agent,
H0-win2003, H0-win2003-p445, H0-win2003-p445-SMB, H0-win2003-p445-SMB-vuln, H0-win2003-p445-SMB-agent,
H0-winXPsp2, H0-winXPsp2-p445, H0-winXPsp2-p445-SMB, H0-winXPsp2-p445-SMB-vuln, H0-winXPsp2-p445-SMB-agent,
terminal

“H0”: the host. “winXXX”: OS. “p445”: is port 445 open? “SMB”: if so, SAMBA server? “vuln”: SAMBA server vulnerable? “agent”: has the attacker exploited that vulnerability yet? “terminal”: the attacker has given up.

SLIDE 95

Assumptions (vi) and (vii)

Succeed-or-nothing: Exploits have only two possible outcomes, succeed or fail. Fail has an empty effect.
→ Abstraction mainly regarding detrimental side effects.
Configuration-deterministic actions: Action outcome depends deterministically on network configuration.
→ Abstraction only in case of more fine-grained dependencies.

SLIDE 97

Exploit Actions

Same syntax:

(:action HP_OpenView_Remote_Buffer_Overflow_Exploit
  :parameters (?s - host ?t - host)
  :precondition (and (compromised ?s) (connected ?s ?t)
                     (has_OS ?t Windows) (has_OS_edition ?t Professional)
                     (has_OS_servicepack ?t Sp2) (has_OS_version ?t WinXp)
                     (has_architecture ?t I386) (has_service ?t ovtrcd))
  :effect (and (compromised ?t) (increase (time) 10)))

. . . but with a different semantics. Consider a transition s →a s′:

T(s, a, s′) = 1 if s ⊨ pre(a) and s′ = appl(s, a);
              1 if s ⊭ pre(a) and s′ = s;
              0 otherwise.

O(s, a, o) = 1 if s ⊨ pre(a), s′ = appl(s, a), and o = “success”;
             1 if s ⊭ pre(a), s′ = s, and o = “fail”;
             0 otherwise.
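Under assumptions (vi) and (vii), both the transition and the observation collapse to a single deterministic case split on whether the precondition holds. A sketch with an assumed fact-set state encoding (fact names are illustrative, not from the model):

```python
# Succeed-or-nothing, configuration-deterministic exploit: if the (hidden)
# configuration satisfies pre(a), the add effects apply and we observe
# "success"; otherwise the state is unchanged and we observe "fail".

def step(state, pre, add):
    """state: frozenset of facts; pre, add: sets of facts."""
    if pre <= state:                     # s satisfies pre(a)
        return state | add, "success"    # s' = appl(s, a)
    return state, "fail"                 # empty effect: s' = s
```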

SLIDE 99

Sensing Actions

Example:

(:action OS_Detect
  :parameters (?s - host ?t - host)
  :precondition (and (compromised ?s) (connected ?s ?t))
  :observe (and (when (has_OS ?t Windows2000) (“win”))
                (when (has_OS ?t Windows2003) (“win”))
                (when (has_OS ?t WindowsXPsp2) (“winXP”))
                (when (has_OS ?t WindowsXPsp3) (“winXP”))))

Network reconnaissance also satisfies the benign assumption: → Non-injective but deterministic function of configuration.

SLIDE 102

So, we’re done, right?

Computation! But: Can use single-machine case + decomposition.

[Figure omitted: the network decomposed into components C1–C7 across subnetworks N1–N3 and firewalls F, combining per-machine POMDPs.]

SLIDE 104

So, we’re done, right?

Modeling! But: Can we use the outcome of standard scanning scripts?

SLIDE 106

Markov Decision Process (MDP)

Definition. An MDP is a tuple ⟨S, A, T, s0⟩ (respectively, some possibly factored description thereof):
S: states; A: actions.
T(s, a, s′): probability of reaching state s′ when executing action a in state s.
s0: initial state.
→ I’ll discuss optimization objectives later on. For now, assume goal states Sg, minimizing undiscounted expected cost-to-goal in a Stochastic Shortest Path (SSP) formulation.

SLIDE 107

The Basic Idea

(:action HP_OpenView_Remote_Buffer_Overflow_Exploit
  :parameters (?s - host ?t - host)
  :precondition (and (compromised ?s) (connected ?s ?t)
                     (has_OS ?t Windows) (has_OS_edition ?t Professional)
                     (has_OS_servicepack ?t Sp2) (has_OS_version ?t WinXp)
                     (has_architecture ?t I386) (has_service ?t ovtrcd))
  :effect (and (compromised ?t) (increase (time) 10)))

(:action HP_OpenView_Remote_Buffer_Overflow_Exploit
  :parameters (?s - host ?t - host)
  :precondition (and (compromised ?s) (connected ?s ?t))
  :effect (and (probabilistic 0.3 (compromised ?t)) (increase (time) 10)))

SLIDE 111

How to Obtain Action Outcome Probabilities?

⇒ outcome occurs iff φ(host configurations)
⇒ outcome probability ≈ P(φ(host configurations), b0)
⇒ just need success probability as a function of host configurations in b0
⇒ Use Core Security success statistics as success probabilities.
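Concretely, the outcome probability is just the b0-mass of the configurations on which the exploit’s condition φ holds. A sketch (the dictionary encoding of b0 is an assumption for illustration, not from the slides):

```python
# Outcome probability ≈ P(φ(host configurations), b0): sum the prior
# probability of every configuration satisfying the exploit's condition.

def outcome_probability(b0, phi):
    """b0: dict configuration -> prior probability; phi: predicate on configurations."""
    return sum(p for config, p in b0.items() if phi(config))
```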

SLIDE 115

MDP vs. POMDP

Where did we cheat on the previous slide?
⇒ outcome prob ≈ P(φ(host configs), b0) → b0 just captures the attacker’s initial knowledge.
Hence: Inability to learn. Success probabilities develop with knowledge in the POMDP, but remain constant in the MDP.
(But: Maintain flags for partial belief-tracking in the MDP?)

SLIDE 120

Assumption (viii)

Assume that ?t doesn’t have the required configuration:

(:action HP_OpenView_Remote_Buffer_Overflow_Exploit
  :parameters (?s - host ?t - host)
  :precondition (and (compromised ?s) (connected ?s ?t))
  :effect (and (probabilistic 0.3 (compromised ?t)) (increase (time) 10)))

→ The probability of breaking into ?t eventually is 1. This contradicts our benign assumptions (iii) and (vii). Hence: Apply-once constraint: Allow to apply each exploit, on each target host, at most once.
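The arithmetic behind that “= 1” is the geometric series: with per-try success probability p, n independent retries succeed eventually with probability 1 − (1 − p)^n. A quick numeric illustration:

```python
# Without apply-once, retrying the p = 0.3 exploit drives the eventual
# success probability toward 1, even against a host that is not vulnerable
# in reality: exactly the contradiction with assumptions (iii) and (vii).

def eventual_success(p, n):
    """Probability that at least one of n independent attempts succeeds."""
    return 1.0 - (1.0 - p) ** n
```

Already 20 retries push the success probability above 0.999; the apply-once constraint caps it at the intended 0.3.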

SLIDE 123

Remember?

SLIDE 124

A Model Taxonomy

[Taxonomy figure omitted. Axes: (A) Uncertainty Model (none / action outcomes / states) × (B) Action Model (explicit network graph / monotonic actions / general actions). Entries include: CoreSec-Classical (Lucangeli et al. 2010), Attack Graphs (e.g. Ammann et al. 2002), CyberSecurity (Boddy et al. 2005), the current POMDP model (Sarraute et al. 2012), CoreSec-MDP, CoreSec-POMDP, Attack-Asset MDP and POMDP (Durkota and Lisý 2014), and the Canadian Hacker Problem (CHP) and PO-CHP, each annotated with its assumptions (i)–(viii).]

SLIDE 128

The 3rd Dimension

Three major dimensions for simulated pentesting models: (A) Uncertainty Model. (B) Action Model. (C) Optimization objective: What is the attacker trying to achieve?
Options:
Finite-horizon: Ok. But: Offline problem, horizon not meaningful unless for the overall attack (see below).
Maximize discounted reward: Ok. But: Discounting unintuitive. And who’s to set the rewards?
Minimize non-discounted expected cost-to-goal (SSP): Seems good. Non-0 action costs, give-up action. But: Give-up cost?
Limited-budget goal probability maximization (MaxProb): My favorite. Non-0 action costs, give-up action, hence finite-runs SSP. No “but” I can think of.

SLIDE 129

The Interesting Sub-Classes

[Taxonomy excerpt omitted: CoreSec-MDP and CoreSec-POMDP, the Attack-Asset MDP and POMDP (Durkota and Lisý 2014), and the Canadian Hacker Problem (CHP) and PO-CHP, arranged by (A) Uncertainty Model × (B) Action Model (explicit network graph / monotonic actions).]

SLIDE 131

We Start With:

[Taxonomy excerpt omitted: the Canadian Hacker Problem (CHP), related to CoreSec-MDP — explicit network graph, action-outcome uncertainty.]

SLIDE 132

The Canadian Traveller Problem

SLIDE 133

The Canadian Hacker Problem

Canadian Traveller Problem + action-outcome uncertainty = the Canadian Hacker Problem. [Illustrations omitted.]

SLIDE 145

The Canadian Hacker Problem: And Now?

Wrap-up: A variant of the Canadian Traveller Problem where we “have” a monotonically growing set of nodes (“no need to drive back”).
Research Challenges/Opportunities: 1001 CTP papers to be adapted to this . . .

SLIDE 146

Attack-Asset MDPs

(Durkota and Lisý 2014)

[Taxonomy excerpt omitted: the Attack-Asset MDP — monotonic actions, action-outcome uncertainty, assumptions (i), (iii), (iv), (vi)–(viii).]

SLIDE 149

Attack-Asset MDPs

Definition. An Attack-Asset MDP is a tuple ⟨P, A, s0, G⟩:
P: set of facts (Boolean state variables).
A: set of actions a, each a tuple ⟨pre(a), add(a), p(a), c(a)⟩ of precondition, add list, success probability, and non-negative cost.
s0: initial state; G: goal.
The probabilistic transitions T arise from these rules:
States: STRIPS state s, available actions A′ ⊆ A. a is applicable to (s, A′) if pre(a) ⊆ s and a ∈ A′.
With probability p(a) we obtain s′ = s ∪ add(a), and with probability 1 − p(a) we obtain s′ = s. In both cases, we pay cost c(a), and remove a from A′.
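The transition rule above can be sketched directly; the tuple encoding of actions and the use of Python sets are assumptions for illustration, not the authors’ code:

```python
import random

# One Attack-Asset MDP transition: with probability p(a) the add list fires,
# otherwise nothing happens; either way we pay c(a) and, per the apply-once
# rule, remove a from the available set.

def transition(state, available, action, rng):
    """state: frozenset of facts; available: set of actions;
    action: (pre, add, p, c) with frozenset pre/add."""
    pre, add, p, c = action
    assert pre <= state and action in available   # applicability
    remaining = available - {action}              # apply-once
    if rng.random() < p:
        return state | add, remaining, c          # success: s' = s ∪ add(a)
    return state, remaining, c                    # failure: s' = s
```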

SLIDE 158

Attack-Asset MDPs: And Now?

Wrap-up: Probabilistic delete-free STRIPS with success probabilities, no effect in case of failure, each action applicable at most once.
Research Challenges/Opportunities: E.g. determinization:
Only two outcomes, of which one is “nothing happens”.
Every probabilistic action yields a single deterministic action.
These deterministic actions have no delete effects.
Weak plans and determinization heuristics = standard delete-relaxation heuristics.
“Landmark action outcomes” = deterministic delete-relaxation landmarks.
Limited-budget goal probability maximization: landmarks reduce the budget à la [Mirkis and Domshlak (2014)].
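Since the determinized actions are delete-free, weak-plan existence reduces to delete-relaxed reachability, computable by a simple fixpoint. A sketch under an assumed fact-set encoding (illustration, not the authors’ code):

```python
# Weak plan existence for an Attack-Asset MDP: chain the determinized,
# delete-free actions to a fixpoint and test whether the goal is reached.

def relaxed_reachable(s0, actions, goal):
    """actions: iterable of (pre, add) frozenset pairs; goal: frozenset of facts."""
    facts = set(s0)
    changed = True
    while changed:
        changed = False
        for pre, add in actions:
            if pre <= facts and not add <= facts:
                facts |= add               # fire the action's add list once
                changed = True
    return goal <= facts
```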

SLIDE 167

An AI (Sequential Decision Making) Challenge

Realistically simulate a human hacker!
Model and algorithm design in a wide space of relevant complexity/accuracy trade-offs. (Sorry Scott – best modeled in PPDDL, at least the MDP variants.)
Diverse attacks, meta-criteria, situation report, suggest fixes.
Ultimately, an AI-complete problem.

SLIDE 168

References

Thanks for Your Attention!

. . . and enjoy the old city tour.

SLIDE 169

References I

Paul Ammann, Duminda Wijesekera, and Saket Kaushik. Scalable, graph-based network vulnerability analysis. In ACM Conference on Computer and Communications Security, pages 217–224, 2002.
Mark Boddy, Jonathan Gohde, Tom Haigh, and Steven Harp. Course of action generation for cyber security using classical planning. In Susanne Biundo, Karen Myers, and Kanna Rajan, editors, Proceedings of the 15th International Conference on Automated Planning and Scheduling (ICAPS-05), pages 12–21, Monterey, CA, USA, 2005. Morgan Kaufmann.
Rainer Böhme and Márk Félegyházi. Optimal information security investment with penetration testing. In Proceedings of the 1st International Conference on Decision and Game Theory for Security (GameSec’10), pages 21–37, 2010.
Tom Bylander. The computational complexity of propositional STRIPS planning. Artificial Intelligence, 69(1–2):165–204, 1994.
Russell Greiner, Ryan Hayward, Magdalena Jankowska, and Michael Molloy. Finding optimal satisficing strategies for and-or trees. Artificial Intelligence, 170(1):19–58, 2006.

slide-170
SLIDE 170

References II

Russell Greiner. Finding optimal derivation strategies in redundant knowledge bases. Artificial Intelligence, 50(1):95–115, 1991.

Jörg Hoffmann. The Metric-FF planning system: Translating "ignoring delete lists" to numeric state variables. Journal of Artificial Intelligence Research, 20:291–341, 2003.

Markus Huber, Stewart Kowalski, Marcus Nohlberg, and Simon Tjoa. Towards automating social engineering using social networking sites. In Proceedings of the 12th IEEE International Conference on Computational Science and Engineering (CSE'09), pages 117–124. IEEE Computer Society, 2009.

Emil Keyder and Hector Geffner. Trees of shortest paths vs. Steiner trees: Understanding and improving delete relaxation heuristics. In C. Boutilier, editor, Proceedings of the 21st International Joint Conference on Artificial Intelligence (IJCAI 2009), pages 1734–1739, Pasadena, California, USA, July 2009. Morgan Kaufmann.

Barbara Kordy, Sjouke Mauw, Sasa Radomirovic, and Patrick Schweitzer. Foundations of attack-defense trees. In Proceedings of the 7th International Workshop on Formal Aspects in Security and Trust (FAST'10), pages 80–95, 2010.

slide-171
SLIDE 171

References III

Barbara Kordy, Piotr Kordy, Sjouke Mauw, and Patrick Schweitzer. ADTool: Security analysis with attack-defense trees. In Proceedings of the 10th International Conference on Quantitative Evaluation of Systems (QEST'13), pages 173–176, 2013.

Viliam Lisý and Radek Píbil. Computing optimal attack strategies using unconstrained influence diagrams. In Pacific Asia Workshop on Intelligence and Security Informatics, pages 38–46, 2013.

Jorge Lucangeli, Carlos Sarraute, and Gerardo Richarte. Attack planning in the real world. In Proceedings of the 2nd Workshop on Intelligent Security (SecArt'10), 2010.

Vitaly Mirkis and Carmel Domshlak. Landmarks in oversubscription planning. In Thorsten Schaub, editor, Proceedings of the 21st European Conference on Artificial Intelligence (ECAI'14), pages 633–638, Prague, Czech Republic, August 2014. IOS Press.

Steven Noel, Matthew Elder, Sushil Jajodia, Pramod Kalapa, Scott O'Hare, and Kenneth Prole. Advances in topological vulnerability analysis. In Proceedings of the 2009 Cybersecurity Applications & Technology Conference for Homeland Security (CATCH'09), pages 124–129, 2009.

slide-172
SLIDE 172

References IV

Ronald W. Ritchey and Paul Ammann. Using model checking to analyze network vulnerabilities. In IEEE Symposium on Security and Privacy, pages 156–165, 2000.

Carlos Sarraute, Olivier Buffet, and Jörg Hoffmann. POMDPs make better hackers: Accounting for uncertainty in penetration testing. In Jörg Hoffmann and Bart Selman, editors, Proceedings of the 26th AAAI Conference on Artificial Intelligence (AAAI'12), pages 1816–1824, Toronto, ON, Canada, July 2012. AAAI Press.

B. Schneier. Attack trees. Dr. Dobb's Journal, 1999.

Milind Tambe. Security and Game Theory: Algorithms, Deployed Systems, Lessons Learned. Cambridge University Press, 2011.

Steven J. Templeton and Karl E. Levitt. A requires/provides model for computer attacks. In Proceedings of the Workshop on New Security Paradigms (NSPW'00), pages 31–38, 2000.


slide-174
SLIDE 174

Attack Trees

Community: Application-oriented security, some academic research.
Approach: "Graphical Security Models". Organize known possible attacks by top-down refinement over attack actions and sub-actions.
On the side: Many attack tree models are equivalent to AI "formula evaluation" [e.g. Greiner (1991); Greiner et al. (2006)]. Apparently unnoticed by both communities; pointed out by Lisý and Píbil (2013).
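The formula-evaluation connection can be sketched in a few lines (a hypothetical toy, not code from the cited papers): an attack tree is an AND/OR tree over basic attack actions, and checking whether the root goal is achieved by a given set of performed actions is plain Boolean formula evaluation.

```python
# Hypothetical AND/OR sketch of an attack tree: an AND node needs all
# children to succeed, an OR node needs any child, and leaves are basic
# attack actions checked against the set of achieved actions.

def evaluate(node, achieved):
    kind, rest = node[0], node[1:]
    if kind == "leaf":
        return rest[0] in achieved
    if kind == "and":
        return all(evaluate(c, achieved) for c in rest)
    if kind == "or":
        return any(evaluate(c, achieved) for c in rest)
    raise ValueError(f"unknown node kind: {kind}")

# Toy tree: steal the password directly, OR phish the admin AND keylog.
tree = ("or",
        ("leaf", "steal-password"),
        ("and", ("leaf", "phish-admin"), ("leaf", "install-keylogger")))

print(evaluate(tree, {"phish-admin", "install-keylogger"}))  # prints True
```

The optimization variants studied by Greiner et al. (which leaf tests to perform, in which order, at which cost) sit on top of exactly this evaluation structure.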


slide-177
SLIDE 177

Dimension (B): In Other Words

Explicit Network Graph: Actions = "hops from ?s to ?t".
Monotonic actions: Attacker can only gain new attack assets.
General actions: No restrictions.
→ Note that (v) implies (iv), and each of (iv) and (v) implies (iii).

slide-178
SLIDE 178

Dimension (B) Assumptions: Overview

Explicit Network Graph: Actions = "hops from ?s to ?t". Relax: More general attack assets (software/passwords . . . ).
Monotonic actions: Attacker can only gain new attack assets. Relax: E.g. detrimental side effects, crashing the host.
Static network: Host connections & configurations not affected. Relax: E.g. detrimental side effects, crashing the host.
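Under the monotonicity assumption, deciding which attack assets the attacker can ever obtain reduces to a least-fixpoint computation, as in delete-free planning. A minimal sketch, with hypothetical asset and action names:

```python
# Minimal sketch of monotone reachability (assumption (iv) above):
# since assets are only ever gained, never lost, the set of obtainable
# assets is a least fixed point, computable in polynomial time.

def reachable_assets(initial, actions):
    """actions: list of (preconditions, gains) pairs, both sets of asset names."""
    assets = set(initial)
    changed = True
    while changed:
        changed = False
        for pre, gain in actions:
            # Action applies once its preconditions are met; apply it
            # only if it still contributes something new.
            if pre <= assets and not gain <= assets:
                assets |= gain
                changed = True
    return assets

# Toy network (hypothetical names): a DMZ foothold yields internal
# credentials, which in turn allow hopping to the internal host.
actions = [
    ({"on-dmz"}, {"creds-internal"}),
    ({"on-dmz", "creds-internal"}, {"on-internal"}),
]
print(sorted(reachable_assets({"on-dmz"}, actions)))
# prints ['creds-internal', 'on-dmz', 'on-internal']
```

Dropping monotonicity (detrimental side effects, crashed hosts) is exactly what breaks this simple fixpoint view and pushes the model toward general planning.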


slide-184
SLIDE 184

Game-Theoretic Models

What about modeling the defender?
My 5 cents: How to get realistic models? Is a network intrusion actually a game? → Typically mentioned, if at all, as "detection risk", as in "potential detrimental side effect of an attack action".
GameSec series: http://www.gamesec-conf.org/
Böhme and Félegyházi (2010) introduce a model of the entire pentesting life cycle, and prove that pentesting pays off.
Attack-defense trees [Kordy et al. (2010, 2013)].
Security games (e.g. Tambe (2011)): Completely different application.