SLIDE 1

KTH ROYAL INSTITUTE OF TECHNOLOGY

Security of Cyber-Physical Systems

Henrik Sandberg

hsan@kth.se
Department of Automatic Control, KTH, Stockholm, Sweden
7th oCPS PhD School on Cyber-Physical Systems, Lucca, Italy

SLIDE 2

Outline

  • Background and motivation
  • CPS attack models
  • Risk management
  • Attack detectability and security metrics
  • Attack identification and secure state estimation

SLIDE 3

ICS-CERT = Industrial Control Systems Cyber Emergency Response Team (https://ics-cert.us-cert.gov/) Part of US Department of Homeland Security

SLIDE 4

Example 1: Industrial Control System (ICS) Infrastructure

SLIDE 5

Example 2: The Stuxnet Worm (2010)

Targets: Windows, ICS software, and PLCs connected to variable-frequency drives. Exploited 4 zero-day flaws.

  • Speculated goal: harm centrifuges at a uranium enrichment facility in Iran
  • Attack mode:
    1. Delivery with a USB stick (no internet connection necessary)
    2. Replay measurements to the control center and execute harmful controls

[“The Real Story of Stuxnet”, IEEE Spectrum, 2013] (See also http://www.zerodaysfilm.com/)

SLIDE 6

SLIDE 7

Example 3: Events in Ukraine (December, 2015)

SLIDE 8

Example 3: Events in Ukraine (December, 2015)

  • BlackEnergy (2007-)
  • From arstechnica.com:
    • “In 2014 … targeted the North Atlantic Treaty Organization, Ukrainian and Polish government agencies, and a variety of sensitive European industries”
    • “booby-trapped macro functions embedded in Microsoft Office documents”
    • “render infected computers unbootable”
    • “KillDisk, which destroys critical parts of a computer hard drive”
    • “backdoored secure shell (SSH) utility that gives attackers permanent access to infected computers”
  • A more advanced, more autonomous follow-up attack in 2016: “Crash Override”

SLIDE 9

Some Statistics

[Chart: cyber incidents reported to ICS-CERT by sector (power grid, water, nuclear, government facilities, healthcare; percentage scale) and by year (2009-2013; number of attacks), showing a 28× increase in reported attacks]

[ICS-CERT, 2013] [S. Zonouz, 2014]

Cyber incidents in critical infrastructures in the US (Voluntarily reported to ICS-CERT)

SLIDE 10

Cyber-Physical Security


Networked control systems

  • are being integrated with business/corporate networks
  • have many potential points of cyber-physical attack

We need tools and strategies to understand and mitigate attacks:

  • Which threats should we care about?
  • What impact can we expect from attacks?
  • Which resources should we protect (more), and how?

Is it enough to apply cyber (IT) security solutions?

SLIDE 11

Example of Classic Cyber Security: The Byzantine Generals Problem

  • Consider n generals, m of whom are unknown traitors. Can the n − m loyal generals always reach an agreement?
  • Traitors (“Byzantine faults”) can do anything: send different messages to different generals, send no message, change forwarded messages, …
  • An agreement protocol exists iff n ≥ 3m + 1
  • If the loyal generals use unforgeable signed messages (“authentication”), then an agreement protocol exists for any m!
  • Application to linear consensus computations: see [Pasqualetti et al., CDC, 2007], [Sundaram and Hadjicostis, ACC, 2008]

[Lamport et al., ACM TOPLAS, 1982]
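The n ≥ 3m + 1 bound can be exercised with a toy simulation of the oral-messages algorithm OM(1) for n = 4, m = 1 (a sketch; the binary message values and the traitor's lying strategy are illustrative assumptions, not from the slides):

```python
from itertools import product

def om1(commander_value, traitor, n=4):
    """One round of OM(1): commander 0 sends to lieutenants 1..n-1,
    lieutenants relay, then each takes a majority vote.
    Returns the loyal lieutenants' decisions."""
    # Round 1: commander sends its value (a traitorous commander is inconsistent)
    received = {l: (l % 2 if traitor == 0 else commander_value)
                for l in range(1, n)}
    # Round 2: each lieutenant relays what it received (a traitor lies)
    relayed = {l: {s: (1 - received[s] if s == traitor else received[s])
                   for s in range(1, n) if s != l}
               for l in range(1, n)}
    # Decision: majority over own received value and the relayed values
    return {l: int(sum([received[l], *relayed[l].values()]) * 2
                   > len(relayed[l]) + 1)
            for l in range(1, n) if l != traitor}

# n = 4 >= 3*1 + 1, so the loyal generals always agree...
for v, t in product([0, 1], range(4)):
    assert len(set(om1(v, t).values())) == 1
# ...and with a loyal commander they agree on its value:
assert om1(1, traitor=3) == {1: 1, 2: 1}
```

With only n = 3 generals the same majority vote breaks down, which is the content of the iff in the theorem.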

SLIDE 12

Special Controls Perspective Needed?

Clearly, cyber (IT) security is needed: authentication, encryption, firewalls, etc. But it is not sufficient. The interaction between physical and cyber systems makes control systems different from normal IT systems: malicious actions can enter anywhere in the closed loop and cause harm, whether the channels are secured or not. And can we trust that the interfaces and channels really are secured? (See the OpenSSL Heartbleed bug…)

[Cardenas et al., 2008]

SLIDE 13

Security Challenges in ICS

“New” vulnerabilities and “new” threats:

  • Controllers are computers (relays → microprocessors)
  • Networked (access from the corporate network)
  • Commodity IT solutions (Windows, TCP/IP, …)
  • Open design (protocols known)
  • Increasing size and functionality (new services, wireless, …)
  • Large and highly skilled global IT workforce (more IT knowledge)
  • Cybercrime (attack tools available)

[Cardenas et al., 2008]

SLIDE 14

Security Challenges in ICS

[Cardenas et al., 2008]

Differences to traditional IT systems:

  • Patching and frequent updates are not well suited for control systems
  • Real-time availability (strict operational environment)
  • Legacy systems (often no authentication or encryption)
  • Protection of information and the physical world (estimation and control algorithms)
  • Simpler network dynamics (fixed topology, regular communication, limited number of protocols, …)

SLIDE 15

CIA in Cyber Security [Bishop, 2002]

C – Confidentiality: “Privacy” (see recent work by Le Ny, Pappas, Dullerud, Cortes, Tanaka, Sandberg, …)

I – Integrity and A – Availability: “Security” (focus here; good intro in the CSM 2015 special issue)

SLIDE 16

Outline

  • Background and motivation
  • CPS attack models
  • Risk management
  • Attack detectability and security metrics
  • Attack identification and secure state estimation

SLIDE 17
Networked Control System under Attack

  • Physical Attacks
  • Disclosure Attacks
  • Deception Attacks
  • Physical plant (𝒫)
  • Feedback controller (ℱ)
  • Anomaly detector (𝒟)

[Teixeira et al., HiCoNS, 2012]

SLIDE 18

Adversary Model

  • Attack policy: What is the goal of the attack? Destroy equipment, increase costs, …
  • Model knowledge: Does the adversary know models of the plant and controller? Possibility for stealthy attacks…
  • Disruption/disclosure resources: Which channels can the adversary access?

[Teixeira et al., HiCoNS, 2012]

SLIDE 19

Networked Control System with Adversary Model

SLIDE 20

Attack Space

[Figure: attack space spanned by the adversary's model knowledge, disclosure resources, and disruption resources; example attacks: eavesdropping [M. Bishop], replay [B. Sinopoli], covert [R. Smith]] [Teixeira et al., HiCoNS, 2012]


[Teixeira et al., Automatica, 2015]

In focus today

SLIDE 21

Outline

  • Background and motivation
  • CPS attack models
  • Risk management
  • Attack detectability and security metrics
  • Attack identification and secure state estimation

SLIDE 22

Why Risk Management?

Complex control systems face numerous attack scenarios. Examples: critical infrastructures (power, transport, water, gas, oil), often with weak security guarantees. It is too costly to secure the entire system against all attack scenarios:

  • Which scenarios should be prioritized?
  • Which components should be protected?
  • When is it possible to identify attacks?

[Images: power transmission, industrial automation, transportation]

SLIDE 23

Defining Risk

Scenario

  • How to describe the system under attack?

Likelihood

  • How much effort does a given attack require?

Impact

  • What are the consequences of an attack?

Risk = (Scenario, Likelihood, Impact)

[Kaplan & Garrick, 1981], [Bishop, 2002] ([Teixeira et al., IEEE CSM, 2015])

SLIDE 24

Risk Management Cycle

Main steps in risk management

  • Scope definition

– Models, Scenarios, Objectives

  • Risk Analysis

– Threat Identification – Likelihood Assessment – Impact Assessment

  • Risk Treatment

– Prevention, Detection, Mitigation

[Sridhar et al., Proc. IEEE, 2012]

SLIDE 25

Example: Power System State Estimator

SLIDE 26

Example: Power System State Estimator

The security index β_j (to be defined) indicates sensors with inherently weak redundancy (∼ low security). These should be secured first!

[Teixeira et al., IEEE CSM, 2015], [Vukovic et al., IEEE JSAC, 2012]

SLIDE 27

Outline

  • Background and motivation
  • CPS attack models
  • Risk management
  • Attack detectability and security metrics
  • Attack identification and secure state estimation

SLIDE 28

Basic Notions: Input Observability and Detectability

Definitions:

1. The input v is observable with knowledge of x(0) if y(k) = 0 for k ≥ 0 implies v(k) = 0 for k ≥ 0, provided x(0) = 0
2. The input v is observable if y(k) = 0 for k ≥ 0 implies v(k) = 0 for k ≥ 0 (x(0) unknown)
3. The input v is detectable if y(k) = 0 for k ≥ 0 implies v(k) → 0 as k → ∞ (x(0) unknown)

[Hou and Patton, Automatica, 1998]

SLIDE 29

Basic Notions: Input Observability and Detectability

First observations:

  • A necessary condition for Definitions 1-3: the Rosenbrock system matrix P(λ) must have full column normal rank
  • This fails if the number of inputs is larger than the number of outputs
  • Necessary and sufficient conditions involve the invariant zeros (transmission zeros + uncontrollable/unobservable modes; Matlab command: tzero)

The Rosenbrock system matrix:

P(λ) = [ λI − A   −B ; C   D ]
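The invariant zeros mentioned above (Matlab: tzero) are also the finite generalized eigenvalues of the Rosenbrock pencil. A minimal Python sketch with made-up matrices (not the example system from the slides):

```python
import numpy as np
from scipy.linalg import eig

# Hypothetical system (not the slides' 3-state example): one unstable
# invariant zero at 1.9
A = np.array([[0.9, 1.0],
              [0.0, 0.8]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, -1.0]])
D = np.array([[0.0]])

n, m, p = A.shape[0], B.shape[1], C.shape[0]
# Invariant zeros = finite lambda where [[A - lambda*I, B], [C, D]] loses
# rank = finite generalized eigenvalues of the pencil (M, N):
M = np.block([[A, B], [C, D]])
N = np.block([[np.eye(n), np.zeros((n, m))],
              [np.zeros((p, n + m))]])
alpha, beta = eig(M, N, right=False, homogeneous_eigvals=True)
finite = np.abs(beta) > 1e-12
zeros = (alpha[finite] / beta[finite]).real
print(zeros)  # approximately [1.9]: an unstable zero, so the system is
              # non-minimum phase and undetectable persistent inputs exist
```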

SLIDE 30

Basic Notions: Input Observability and Detectability

  • Theorems. Suppose (A, B, C, D) is a minimal realization.

1. The input v is observable with knowledge of x(0) ⇔ P(λ) has full column normal rank
2. The input v is observable ⇔ P(λ) has full column rank for all λ ∈ ℂ (no invariant zeros)
3. The input v is detectable ⇔ (1) and all invariant zeros are stable (= system is minimum phase)

[Hou and Patton, Automatica, 1998]

SLIDE 31

Basic Notions: Input Observability and Detectability

  • Theorems. Suppose (A, B, C, D) is a possibly non-minimal realization.

1. The input v is observable with knowledge of x(0) ⇔ P(λ) has full column normal rank
2'. The input v is observable ⇔ (1) and all invariant zeros are unobservable modes
3'. The input v is detectable ⇔ (1) and all invariant zeros that are not unobservable modes are stable

[Hou and Patton, Automatica, 1998]

SLIDE 32

Fault Detection vs. Secure Control

Typical condition used in fault detection/fault-tolerant control:

1. The input v is observable with knowledge of x(0) ⇔ P(λ) has full column normal rank

Typical conditions used in secure control/estimation:

2. The input v is observable ⇔ P(λ) has full column rank for all λ (no invariant zeros)
3/3'. The input v is detectable ⇔ (1) and all invariant zeros are stable (= system is minimum phase)

[Sundaram, Tabuada] [Pasqualetti, Sandberg] [Ding, Patton]

SLIDE 33

Example

Invariant zeros = {1.1}

1. The input v is observable with knowledge of x(0): Yes!
2. The input v is observable: No!
3. The input v is detectable: No!

With x(0) = (−0.705, 0.470, 0.352) and v(k) = 1.1^k · (−0.282, 0.282), the output is y(k) = 0 for all k ≥ 0.

OK for fault detection, but perhaps not for security!

SLIDE 34

Attack and Disturbance Model

Consider the linear system y = G_d d + G_a a (the controlled infrastructure):

  • Unknown state x(k) ∈ ℝ^n (x(0) in particular)
  • Unknown (natural) disturbance d(k) ∈ ℝ^{n_d}
  • Unknown (malicious) attack a(k) ∈ ℝ^{n_a}
  • Known measurement y(k) ∈ ℝ^p
  • Known model (A, B_d, B_a, C, D_d, D_a)
  • Definition: The attack signal a is persistent if a(k) ↛ 0 as k → ∞
  • Definition: A (persistent) attack signal a is undetectable if there exist a simultaneous (masking) disturbance signal d and initial state x(0) such that y(k) = 0, k ≥ 0 (cf. Theorem 3')

SLIDE 35

Undetectable Attacks and Masking

The Rosenbrock system matrix:

P(λ) = [ λI − A   −B_d   −B_a ; C   D_d   D_a ]

  • The attack signal a(k) = λ0^k a0, with 0 ≠ a0 ∈ ℂ^{n_a} and λ0 ∈ ℂ, is undetectable iff there exist x0 ∈ ℂ^n and d0 ∈ ℂ^{n_d} such that P(λ0)(x0, d0, a0) = 0
  • An attack signal is undetectable if it is indistinguishable from the measurable (y) effects of natural noise (d) or uncertain initial states (x0) [masking]
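The construction can be checked numerically. A sketch with hypothetical matrices (not the 3-state example on the next slide; here there is an attack input but no disturbance):

```python
import numpy as np

# Hypothetical system with an unstable invariant zero at lam0 = 1.9
A = np.array([[0.9, 1.0],
              [0.0, 0.8]])
Ba = np.array([[0.0],
               [1.0]])            # attack input matrix
C = np.array([[1.0, -1.0]])

lam0 = 1.9
# (x0, a0) spans the null space of the Rosenbrock matrix P(lam0):
# (lam0*I - A) x0 = Ba a0 and C x0 = 0
x0 = np.array([1.0, 1.0])
a0 = np.array([1.1])
assert np.allclose((lam0 * np.eye(2) - A) @ x0, Ba @ a0)
assert np.allclose(C @ x0, 0)

# With x(0) = x0 and a(k) = lam0^k a0, the state stays on the zero
# direction, x(k) = lam0^k x0, and the output never moves
x, ys = x0.copy(), []
for k in range(25):
    ys.append((C @ x).item())
    x = A @ x + Ba @ (lam0**k * a0)
print(max(abs(y) for y in ys))  # ~0 despite the growing, persistent attack
```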

SLIDE 36

Example (cont’d)

Poles = {0.9, 0.9, 0.8}; invariant zeros = {1.1}

  • Undetectable attack: a(k) = 1.1^k · 0.282
  • Masking initial state: x0 = (−0.705, 0.470, 0.352)
  • Masking disturbance: d(k) = 1.1^k · (−0.282)

SLIDE 37

Undetectable Attacks and Masking (cont’d)

  • Suppose the operator observes the output y(k), and does not know the true initial state x(0) or the true disturbance d(k)
  • Let (x0, d0, a0) define an undetectable attack: 0 = G_d d0 + G_a a0 with initial state x0

Consider the cases:

1. Un-attacked system y = G_d d, with initial state x(0)
2. Attacked system y = G_d (d + d0) + G_a a0, with initial state x(0) + x0

If the initial states x(0) and x(0) + x0 and the disturbances d and d + d0 are equally likely, then it is impossible for the operator to decide which case is true ⇒ the attack is undetectable!

SLIDE 38

Undetectable Attacks and Masking (cont’d)

  • Suppose the operator observes the output y(k), and knows the true initial state x(0) = 0 and disturbance d(k) = 0, k ≥ 0
  • Suppose the system is asymptotically stable, ρ(A) < 1
  • Let (x0, a0) define an undetectable attack: 0 = G_a a0 with initial state x0

Consider the cases:

1. Un-attacked system y1(k) = 0, k ≥ 0, with initial state x(0) = 0
2. Attacked system y2(k) = (G_a a0)(k) = −C A^k x0 → 0 as k → ∞, with initial state x(0) = 0

The attacked output y2 is vanishing, and can be made arbitrarily close to y1 by scaling (x0, a0) ⇒ the attack is asymptotically undetectable!

SLIDE 39

The Security Index β_j

Notation: ‖a‖₀ := |supp(a)|; a_j is the j-th element of the attack a

Interpretation:

  • The attacker persistently targets signal component a_j (condition |λ0| ≥ 1)
  • β_j is the smallest number of attack signals that must be simultaneously accessed to stage an undetectable attack against signal a_j

Argument: large β_j ⇒ malicious cyber attacks targeting a_j are less likely

The problem is NP-hard in general (combinatorial optimization; cf. the spark of a matrix). Generalization of the static index in [Sandberg et al., SCS, 2010]
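In symbols, the index is a combinatorial optimization over the Rosenbrock matrix; a hedged reconstruction consistent with [Sandberg and Teixeira, SoSCYPS, 2016], using the notation above:

```latex
\beta_j \;=\; \min_{x_0,\, d_0,\, a_0,\, \lambda_0} \;\|a_0\|_0
\quad \text{subject to} \quad
P(\lambda_0)\begin{bmatrix} x_0 \\ d_0 \\ a_0 \end{bmatrix} = 0,
\quad |\lambda_0| \ge 1,
\quad (a_0)_j \neq 0,
```

so that a(k) = λ0^k a0 is a persistent attack on component j, masked by d(k) = λ0^k d0 and x(0) = x0.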

SLIDE 40

Simple Example of Security Index

  • Measurements are not affected by physical states or disturbances
  • 3 measurements
  • 4 attack signals with security indices:
    • β1 = 3
    • β2 = 3
    • β3 = 3
    • β4 = ∞ (by definition: even access to all attack signals is not enough to hide an attack against a4)
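The measurement matrix behind these numbers is not shown here, but indices like these can be checked by brute force on an analogous static example y = D_a · a, with a hypothetical D_a chosen to give the same indices:

```python
import numpy as np
from itertools import combinations

# Hypothetical 3x4 measurement matrix: undetectable static attacks satisfy
# Da @ a = 0, and the null space here is spanned by (1, -1, 1, 0)
Da = np.array([[1.0, 1.0, 0.0, 0.0],
               [0.0, 1.0, 1.0, 0.0],
               [0.0, 0.0, 0.0, 1.0]])

def security_index(Da, j, tol=1e-9):
    """beta_j: minimum support size of a null vector of Da with entry j nonzero."""
    na = Da.shape[1]
    for size in range(1, na + 1):
        for S in combinations(range(na), size):
            if j not in S:
                continue
            cols = Da[:, S]
            _, sv, Vt = np.linalg.svd(cols)
            rank = int(np.sum(sv > tol))
            if rank == len(S):
                continue                    # only the trivial null vector
            null_basis = Vt[rank:]          # rows span the null space of cols
            if np.linalg.norm(null_basis[:, S.index(j)]) > tol:
                return size                 # undetectable attack on these signals
    return float('inf')                     # a_j can never be hidden

print([security_index(Da, j) for j in range(4)])  # [3, 3, 3, inf]
```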

SLIDE 41

Special Case 1: Critical Attack Signals

A signal with β_j = 1 can be undetectably attacked without access to other elements ⇒ a critical attack signal

Simple test, for all j: if there is λ0 ∈ ℂ, |λ0| ≥ 1, such that rank[P_d(λ0)] = rank[P_{d,j}(λ0)], then β_j = 1 (here P_d is the Rosenbrock matrix with disturbance inputs only, and P_{d,j} additionally includes attack column j)

Even more critical case: if normalrank P_d = normalrank P_{d,j}, then there is an undetectable critical attack at all frequencies λ0

This holds generically when there are more disturbances than measurements (n_d ≥ p)!

Protect these attack signals first in risk management!

SLIDE 42

Special Case 2: Transmission Zeros

Suppose P(λ) has full column normal rank. Then undetectable attacks occur only at the finite set of transmission zeros {λ0}. Solve by inspection of the corresponding zero directions ⇒ easy in the typical case of one-dimensional zero directions

[Amin et al., ACM HSCC, 2010] [Pasqualetti et al., IEEE TAC, 2013]

SLIDE 43

Special Case 3: Sensor Attacks

P(λ) only loses rank at the eigenvalues λ0 ∈ {λ1(A), …, λn(A)}

Simple eigenvalues give one-dimensional spaces of eigenvectors x0 ⇒ simplifies the computation of β_j

Example: Suppose D_a = I_p (sensor attacks), D_d = 0, and the system is observable from each y_j, j = 1, …, p:

  • By the PBH test: β_j = p or β_j = +∞ (if all eigenvalues are stable, no persistent undetectable sensor attack exists)
  • Redundant measurements increase β_j!

[Fawzi et al., IEEE TAC, 2014] [Chen et al., IEEE ICASSP, 2015] [Lee et al., ECC, 2015]

SLIDE 44

Special Case 4: Sensor Attacks for Static Systems

Since A = I_n and B_d = B_a = 0, this is the steady-state case

The space of eigenvectors x0 is n-dimensional ⇒ typically makes the computation of β_j harder than in the dynamical case!

Practically relevant in power systems, where p > n ≫ 0:

  • The problem is NP-hard, but the power system imposes special structure on C (unimodularity etc.)
  • Several works give efficient and exact computation of β_j using min-cut/max-flow and ℓ1-relaxation ([Hendrickx et al., 2014], [Kosut, 2014], [Yamaguchi et al., 2015])

[Liu et al., ACM CCS, 2009] [Sandberg et al., SCS, 2010]

SLIDE 45

Special Case 4: Solution by MILP

Big-M reformulation (the inequality constraints hold elementwise):
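A hedged reconstruction of a big-M mixed-integer linear program for the static security index (an assumption of the general form, not the slide's exact program; undetectable attacks have the form a = Cc, M is a large constant, z are binary indicators of attacked measurements):

```latex
\beta_j \;=\; \min_{c \in \mathbb{R}^n,\; z \in \{0,1\}^p} \;\sum_{i=1}^{p} z_i
\quad \text{subject to} \quad
e_j^{\top} C c = 1,
\qquad -M z \;\le\; C c \;\le\; M z,
```

where the inequalities hold elementwise; z_i = 1 marks a measurement the attacker must corrupt, and M must exceed the largest attack magnitude at the optimum.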

SLIDE 46

Example: Power System State Estimator for IEEE 118-bus System

  • State dimension n = 118
  • Number of sensors p ≈ 490

SLIDE 47
Example: Power System State Estimator for IEEE 118-bus System (cont’d)

  • Computation time on a laptop using the min-cut method [Hendrickx et al., IEEE TAC, 2014]: 0.17 sec
  • Used for protection allocation in [Vukovic et al., IEEE JSAC, 2012]

SLIDE 48

Summary So Far

  • Dynamical security index β_j defined
  • Argued that β_j is useful in risk management for assessing the likelihood of a malicious attack against element a_j
  • Computation is NP-hard in general, but often “simple” in special cases:
    • One-dimensional zero dynamics
    • Static systems with special matrix structures (derived from potential flow problems)
  • Dynamics generally simplifies computation, and redundant sensors increase β_j
  • Fast computation enables greedy security allocation

SLIDE 49

Outline

  • Background and motivation
  • CPS attack models
  • Risk management
  • Attack detectability and security metrics
  • Attack identification and secure state estimation

SLIDE 50

Attack Identification

  • Unknown state x(k) ∈ ℝ^n
  • Unknown (natural) disturbance d(k) ∈ ℝ^{n_d}
  • Unknown (malicious) attack a(k) ∈ ℝ^{n_a}
  • Known measurement y(k) ∈ ℝ^p
  • Known model (A, B_d, B_a, C, D_d, D_a)
  • When can we decide there is an attack signal a_j ≠ 0?
  • Which elements a_j can we track (“identify”)?
  • Not equivalent to designing an unknown input observer/secure state estimator (the state is not requested here). See end of presentation

SLIDE 51

Attack Identification

Definition: A (persistent) attack signal a is

  • identifiable if for all attack signals ã ≠ a, and all corresponding disturbances d and d̃ and initial states x(0) and x̃(0), we have ỹ ≠ y;
  • j-identifiable if for all attack signals a and ã with a_j ≠ ã_j, and all corresponding disturbances d and d̃ and initial states x(0) and x̃(0), we have ỹ ≠ y

Interpretations:

  • Identifiability ⇔ (different attack a ⇒ different measurement y) ⇔ the attack signal is injectively mapped to y ⇒ the attack signal is detectable
  • j-identifiability is weaker than identifiability
  • (a is j-identifiable for all j) ⇔ a is identifiable
  • a is j-identifiable: possible to track element a_j, but not necessarily a_k, k ≠ j

SLIDE 52

Theorem

Suppose that the attacker can manipulate at most r attack elements simultaneously (‖a‖₀ ≤ r).

i. There exist persistent undetectable attacks against a_j ⇔ r ≥ β_j;
ii. All persistent attacks are j-identifiable ⇔ r < β_j/2;
iii. All persistent attacks are identifiable ⇔ r < min_j β_j/2.

  • Proof: compressed-sensing-type argument. See [Sandberg and Teixeira, SoSCYPS, 2016] for details

SLIDE 53

Simple Example of Security Index (cont’d)

Security indices: β1 = 3, β2 = 3, β3 = 3, β4 = ∞

  • r = 1: the defender can identify (and thus detect) all attacks
  • r = 2: the defender can detect (but not identify) all attacks against a1, a2, a3, and identify all attacks against a4
  • r = 3 or 4: the defender can identify all attacks against a4; there exist undetectable attacks against a1, a2, a3

SLIDE 54

Back to Risk Management

Security indices: β1 = 3, β2 = 3, β3 = 3, β4 = ∞

  • Suppose the operator can choose to block one attack signal (by installing physical protection, authentication, etc.)
  • Which signal a1, a2, a3, or a4 should she/he choose?
  • Among the one(s) with the lowest security index! Choose a1.
  • New attack model and security indices: β2 = β3 = β4 = ∞
  • By explicitly blocking one attack signal, all other attacks are implicitly blocked (they become identifiable)

SLIDE 55
Example: Power System State Estimator for IEEE 118-bus System

  • Suppose the number of attacked elements is r ≤ 7
  • Some signals are susceptible to undetectable attacks
  • For some signals, all attacks are identifiable
  • Other signals will, if attacked, always result in a non-zero output y

SLIDE 56

Secure State Estimation/Unknown Input Observer (UIO)

Secure state estimate x̂: regardless of disturbance d and attack a, the estimate satisfies x̂(k) → x(k) as k → ∞

  1. Rename and transform attacks and disturbances into a single unknown input g
  2. Compute security indices β_j with respect to g

Theorem: A secure state estimator exists iff 1. (C, A) is detectable; and 2. r < min_j β_j/2, where r is the max number of non-zero elements in g.

  • Proof: existence of a UIO by [Sundaram et al., 2007] plus the previous theorem

SLIDE 57

How to Identify an Attack Signal?

Use decoupling theory from the fault diagnosis literature [Ding, 2008]. Suppose that y = G_d d + G_a a and that the disturbances can be decoupled from the measurements. Then there exists a linear decoupling filter S with S G_d = 0, so that s := S y = S G_a a =: Δ a contains only attack effects.

SLIDE 58

How to Identify an Attack Signal?

Suppose a is identifiable (r < min_j β_j/2):

1. Decouple the disturbances to obtain the system s = Δa
2. Filter out the uncertain initial-state component in s to obtain s′ = Δa
3. Compute left inverses of Δ_J := (Δ_j)_{j∈J}, formed from the columns Δ_j of Δ, for all subsets J ⊆ {1, …, n_a} with |J| = r (bottleneck! compare with compressed sensing)
4. By identifiability, if an estimate â_J satisfies s′ = Δ â_J, then â_J ≡ a

(A similar scheme applies if a is only j-identifiable)
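Steps 3-4 amount to a search over supports. A minimal sketch on a hypothetical decoupled system s′ = Δa (the matrix is chosen so that min_j β_j/2 > 1, making any single-element attack identifiable):

```python
import numpy as np
from itertools import combinations

# Hypothetical decoupled system s' = Delta a with security indices
# (3, 3, 3, inf), so r = 1 < min_j beta_j / 2
Delta = np.array([[1.0, 1.0, 0.0, 0.0],
                  [0.0, 1.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])

def identify(s, Delta, r, tol=1e-9):
    """Search supports of size <= r; identifiability makes the fit unique."""
    na = Delta.shape[1]
    if np.linalg.norm(s) < tol:
        return np.zeros(na)                 # no visible attack
    for size in range(1, r + 1):
        for S in combinations(range(na), size):
            cols = Delta[:, S]
            a_S, *_ = np.linalg.lstsq(cols, s, rcond=None)
            if np.linalg.norm(cols @ a_S - s) < tol:
                a = np.zeros(na)
                a[list(S)] = a_S
                return a                    # the unique attack estimate
    return None                             # inconsistent with ||a||_0 <= r

a_true = np.array([0.0, 0.0, 0.0, 2.0])     # attacker corrupts element 4
print(identify(Delta @ a_true, Delta, r=1)) # [0. 0. 0. 2.]
```

For r at or above min_j β_j/2 the recovered support need no longer be unique, which is exactly why the theorem's bound appears here.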

SLIDE 59

Summary

  • There is a need for CPS security
  • Briefly introduced CPS attack models and the concept of risk management
  • Input observability and detectability ⇒ undetectable attacks with masking initial states and disturbances
  • A security metric β_j for risk management
  • Suppose the attacker has access to r resources:
    – Undetectable attacks against a_j exist iff r ≥ β_j
    – All attacks against a_j are identifiable iff r < β_j/2
  • Many useful results in the fault diagnosis literature, especially for identifiable attacks: unknown input observers, decoupling filters, etc.
  • Future research directions: more realistic attacker models, estimating attack likelihoods and impacts, cooperation with IT security, …

SLIDE 60

Further Reading

Introduction to CPS/NCS security

  • A. Cardenas, S. Amin, and S. Sastry: “Research challenges for the security of control systems”. Proceedings of the 3rd Conference on Hot Topics in Security, 2008, p. 6.
  • Special Issue on CPS Security, IEEE Control Systems Magazine, February 2015
  • D. Urbina et al.: “Survey and New Directions for Physics-Based Attack Detection in Control Systems”, NIST Report 16-010, November 2016

CPS attack models, impact, and risk management

  • A. Teixeira, I. Shames, H. Sandberg, K. H. Johansson: “A Secure Control Framework for Resource-Limited Adversaries”. Automatica, 51, pp. 135-148, January 2015.
  • A. Teixeira, K. C. Sou, H. Sandberg, K. H. Johansson: “Secure Control Systems: A Quantitative Risk Management Approach”. IEEE Control Systems Magazine, 35:1, pp. 24-45, February 2015.
  • D. Urbina et al.: “Limiting The Impact of Stealthy Attacks on Industrial Control Systems”, 23rd ACM Conference on Computer and Communications Security, October 2016.

SLIDE 61

Further Reading

Detectability and identifiability of attacks

  • S. Sundaram and C. N. Hadjicostis: “Distributed Function Calculation via Linear Iterative Strategies in the Presence of Malicious Agents”. IEEE Transactions on Automatic Control, vol. 56, no. 7, pp. 1495-1508, July 2011.
  • F. Pasqualetti, F. Dörfler, F. Bullo: “Attack Detection and Identification in Cyber-Physical Systems”. IEEE Transactions on Automatic Control, 58(11):2715-2729, 2013.
  • H. Fawzi, P. Tabuada, and S. Diggavi: “Secure estimation and control for cyber-physical systems under adversarial attacks”. IEEE Transactions on Automatic Control, vol. 59, no. 6, pp. 1454-1467, June 2014.
  • Y. Mo, S. Weerakkody, B. Sinopoli: “Physical Authentication of Control Systems”. IEEE Control Systems Magazine, vol. 35, no. 1, pp. 93-109, February 2015.
  • R. Smith: “Covert Misappropriation of Networked Control Systems”. IEEE Control Systems Magazine, vol. 35, no. 1, pp. 82-92, February 2015.
  • H. Sandberg and A. Teixeira: “From Control System Security Indices to Attack Identifiability”. Science of Security for Cyber-Physical Systems Workshop, CPS Week 2016.

SLIDE 62

Further Reading

Security metrics (security index)

  • O. Vukovic, K. C. Sou, G. Dan, H. Sandberg: “Network-aware Mitigation of Data Integrity Attacks on Power System State Estimation”. IEEE Journal on Selected Areas in Communications (JSAC), 30:6, pp. 1108-1118, 2012.
  • J. M. Hendrickx, K. H. Johansson, R. M. Jungers, H. Sandberg, K. C. Sou: “Efficient Computations of a Security Index for False Data Attacks in Power Networks”. IEEE Transactions on Automatic Control: Special Issue on Control of CPS, 59:12, pp. 3194-3208, December 2014.
  • H. Sandberg and A. Teixeira: “From Control System Security Indices to Attack Identifiability”. Science of Security for Cyber-Physical Systems Workshop, CPS Week 2016.

SLIDE 63

Acknowledgments

André M. H. Teixeira (Delft University of Technology)
Kin Cheong Sou (National Sun Yat-sen University)
György Dán, Karl Henrik Johansson, Jezdimir Milošević, David Umsonst (KTH)
