SLIDE 1

Information Flow

Gang Tan Penn State University Spring 2019

CMPSC 447, Software Security

SLIDE 2

Information Flow

• Many security requirements can be formulated as what information flow is allowed/disallowed
  • E.g., confidential data should not flow to unauthorized personnel
  • E.g., data from untrusted sources should not taint a database that stores data from trusted sources
• An information flow policy
  • Provides data with labels
    • E.g., confidential vs. non-confidential data; low-integrity vs. high-integrity data
  • Specifies how labeled data can/cannot flow in a system
    • E.g., confidential data cannot flow to the network

SLIDE 3

Multi‐Level Security (MLS)

• Used in the military to classify personnel and data
• Label L = 〈rank, compartment〉
• Dominance relation
  • L1 ≤ L2 iff rank1 ≤ rank2 and compart1 ⊆ compart2
  • Example: 〈Unclassified, Iraq〉 ≤ 〈Secret, CENTCOM〉
  • Mathematically, the relation forms a lattice
• Subjects: users or processes
  • Label(S) = clearance of S
• Objects: documents or resources
  • Label(O) = classification of O
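The dominance relation can be sketched directly. A minimal Python sketch, modeling a label as a (rank, compartment-set) pair; the helper names and rank numbering are illustrative, not from the slides:

```python
# MLS dominance: L1 <= L2 iff rank1 <= rank2 and the compartment set
# of L1 is a subset of that of L2. Ranks are modeled as integers.
RANKS = {"Unclassified": 0, "Confidential": 1, "Secret": 2, "Top Secret": 3}

def dominates(l2, l1):
    """Return True if label l2 dominates l1, i.e., l1 <= l2."""
    rank1, compart1 = l1
    rank2, compart2 = l2
    return RANKS[rank1] <= RANKS[rank2] and compart1 <= compart2  # <= is subset

# The slide's example, treating Iraq as contained in the CENTCOM
# compartment set:
print(dominates(("Secret", {"Iraq", "CENTCOM"}), ("Unclassified", {"Iraq"})))  # True
```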

SLIDE 4

MLS Example: A Linear Lattice about Confidentiality

Unclassified ≤ Confidential ≤ Secret ≤ Top Secret

[Diagram: a linear lattice, top to bottom: Top Secret (TS), Secret (S), Confidential (C), Unclassified (U). Information flows up, but not down!]

SLIDE 5

Bell‐LaPadula Model

• Simple Security Condition: subject S can read object O if and only if L(O) ≤ L(S)
  • Example: a person with top-secret clearance can read secret files
  • "No read up"
• *-Property (Star Property): subject S can write object O if and only if L(S) ≤ L(O)
  • Example: a person with secret clearance can write top-secret files
  • "No write down"

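A minimal sketch of the two Bell-LaPadula checks, assuming the linear confidentiality lattice from the previous slide (function names are hypothetical):

```python
# Bell-LaPadula over Unclassified <= Confidential <= Secret <= Top Secret.
LEVELS = ["Unclassified", "Confidential", "Secret", "Top Secret"]

def leq(a, b):
    return LEVELS.index(a) <= LEVELS.index(b)

def can_read(subject_level, object_level):
    # Simple Security Condition ("no read up"): L(O) <= L(S)
    return leq(object_level, subject_level)

def can_write(subject_level, object_level):
    # *-Property ("no write down"): L(S) <= L(O)
    return leq(subject_level, object_level)

print(can_read("Top Secret", "Secret"))   # True: reading down is allowed
print(can_write("Top Secret", "Secret"))  # False: no write down
```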
SLIDE 6

Integrity [Biba 1977]

• Information flow can be used to enforce integrity as well as confidentiality
• Integrity is the dual of confidentiality
  • Labels: low-integrity (tainted); high-integrity
  • Tainted data should not flow to places that demand high-integrity data

[Diagram: low-integrity and high-integrity levels]

SLIDE 7

What about both Confidentiality and Integrity?

• Idea: combine the following two lattices

[Diagram: the two-point confidentiality lattice (Unclassified, Secret) and the two-point integrity lattice (High integrity, Low integrity) combine into a product lattice with four labels: Secret and low integrity; Unclassified and low integrity; Secret and high integrity; Unclassified and high integrity]
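The combination is a product lattice: a flow between combined labels is allowed only if both component lattices allow it. A sketch in Python; the label encodings are illustrative:

```python
# Confidentiality: Unclassified <= Secret (information may flow up).
# Integrity: high-integrity data may flow to low-integrity places,
# but tainted (low-integrity) data may not flow to high-integrity ones.
CONF = {"Unclassified": 0, "Secret": 1}
INTEG = {"high": 0, "low": 1}

def may_flow(src, dst):
    c1, i1 = src
    c2, i2 = dst
    return CONF[c1] <= CONF[c2] and INTEG[i1] <= INTEG[i2]

print(may_flow(("Unclassified", "high"), ("Secret", "low")))   # True
print(may_flow(("Secret", "low"), ("Unclassified", "high")))   # False
```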

SLIDE 8

Compartments

• Provide finer-grained security labels
  • Suppose TOP SECRET is divided into <TOP SECRET, CAT> and <TOP SECRET, DOG>
  • Both are TOP SECRET, but information flow is restricted between the two kinds
    • A user with <TOP SECRET, CAT> clearance cannot access data with <TOP SECRET, DOG>, and vice versa
• Compartments are designed to enforce the need-to-know principle
  • Even with a top-secret clearance, a user cannot access all top-secret data

SLIDE 9

Compartments Example

• Arrows indicate allowed information flow

[Diagram: a lattice over <TOP SECRET, CAT & DOG>, <TOP SECRET, CAT>, <TOP SECRET, DOG>, <TOP SECRET>, <SECRET, CAT & DOG>, <SECRET, CAT>, <SECRET, DOG>, <SECRET>]

• Not all classifications are comparable, e.g., <TOP SECRET, CAT> vs. <SECRET, CAT & DOG>
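Incomparability falls out of the dominance relation: each of the two labels wins on one component. A sketch with hypothetical helper names:

```python
# Labels as (rank, compartment set); l1 <= l2 iff the rank is no
# higher and the compartments are a subset.
RANK = {"SECRET": 1, "TOP SECRET": 2}

def leq(l1, l2):
    return RANK[l1[0]] <= RANK[l2[0]] and l1[1] <= l2[1]

a = ("TOP SECRET", frozenset({"CAT"}))
b = ("SECRET", frozenset({"CAT", "DOG"}))
# Neither dominates the other: a has the higher rank, b the larger
# compartment set, so no direct flow is allowed in either direction.
print(leq(a, b), leq(b, a))  # False False
```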

SLIDE 10

Controlling Information Flow


* Some slides borrowed from Shmatikov

SLIDE 11

Mixing Information of Multiple Levels

• Systems (programs) often mix information of different security levels
  • E.g., an OS manages both secret and public documents and is shared by users of different clearances
• Q: how do we know such a system respects multi-level security?
  • That is, information at a higher level does not flow to information at a lower level

SLIDE 12

Noninterference

• A system is modeled as a black box with some inputs and outputs
  • Each input/output has a security level
• Noninterference requires
  • Output at a lower level does not depend on input at a higher level
  • Changing higher-level input won't change lower-level output

[Diagram: an OS takes public documents and secret documents as inputs; one output is seen by users of public clearance, another by users of secret clearance]
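Noninterference can be approximated by black-box testing: fix the low input, vary the high input, and check that the low output never changes. A sketch with a toy stand-in system (the names are illustrative, not from the slides):

```python
# A noninterfering toy system: its low output ignores the high input.
def system(high, low):
    return {"low_out": low.upper(), "high_out": (high, low)}

def check_noninterference(system, low, high_values):
    # Run the system with the same low input and every high input;
    # noninterference requires the low output to be identical each time.
    outs = {system(h, low)["low_out"] for h in high_values}
    return len(outs) == 1

print(check_noninterference(system, "request", ["key1", "key2", "key3"]))  # True
```

This only tests the sampled high inputs; proving noninterference for all inputs requires static reasoning, as later slides discuss.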

SLIDE 13

Noninterference for a Two‐Point Lattice

• High: cannot be observed by the attacker before, during, or after execution
• Low: can be observed by the attacker before and after the execution, but not during
• Some of these are inputs and some are outputs
• Example: web server
  • High input: the server's private key
  • Low input: user requests to access webpages
  • Low output: returned webpages

[Diagram: program P with high input H, low input Lin, and low output Lout]

SLIDE 14

Noninterference for a Two‐Point Lattice

• Noninterference
  • Low output does not depend on high input
  • No matter what the high input is, the system returns the same low output for the same low input
    • E.g., no matter what the private key is, the web server returns the same information for the same user webpage request
  • As a result, the attacker learns no information about the high input by observing the low output

[Diagram: program P with high input H, low input Lin, and low output Lout]

SLIDE 15

Challenges for Enforcement

• Goal: low output should not depend on high input
  • An end-to-end security policy
• Enforcement challenges
  • Various channels for information flow (e.g., implicit flows)
  • Need to track information flow inside P
  • Declassification: downgrading information

[Diagram: program P with high input H, low input Lin, and low output Lout]

SLIDE 16

Explicit Flows

• Direct transfer of information
  • E.g., via copying; in "x = y", the information of y is copied to x
  • E.g., via writing to displays, files, sockets, etc., which are visible to the attacker

SLIDE 17

Explicit Flows Example

• Example about confidentiality

String hi; // security label: secret
String lo; // security label: public

Which program fragments (may) cause problems if hi has to be kept confidential?

1. hi = lo;
2. lo = hi;
3. lo = "1234";
4. hi = "1234";
5. println(lo);
6. println(hi);
SLIDE 18

Implicit Flows

• Indirect transfer of information via the control flow of a program
  • Information in a variable x may be correlated with y's information due to control flow
• Example:

if (hi > 0) lo = 100; else lo = 1000;

  • Note there is no explicit flow such as "lo = hi"; only constants are assigned to lo
  • But at the end of the program, lo's value is correlated with hi's value
    • By observing lo, an adversary can infer information about hi!
  • We call this an implicit flow
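The correlation can be demonstrated concretely. This Python sketch mirrors the fragment above and shows the adversary recovering one bit of hi from lo alone:

```python
# Only constants are ever assigned to lo, yet lo reveals hi's sign.
def branch(hi):
    if hi > 0:
        lo = 100
    else:
        lo = 1000
    return lo

def infer_hi_positive(lo):
    # The adversary inverts the correlation by observing lo.
    return lo == 100

print(infer_hi_positive(branch(7)), infer_hi_positive(branch(-3)))  # True False
```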

SLIDE 19

Yet There are Other Channels

• Termination channel

if (hi > 0) infinite_loop();

  • If the attacker can observe the program's termination behavior, then there is a leakage
• Side channels
  • E.g., timing behavior

if (hi > 0) run_1000_cycles(); else run_1_cycle();

• An end-to-end enforcement would require us to control all these possible channels

SLIDE 20

Enforcement Challenge: Must Track Information Flow Internally

• The input flows inside a system through intermediate variables and memory
• To prevent high input from flowing to low output, we must also track internally how information flows
• E.g.,

x = hi; lo = x;

• E.g.,

if (hi > 0) x = 0; else x = 1; lo = x;

• Need to know that x contains hi's information in "lo = x"

SLIDE 21

Tracking Information Flow Inside a System

• Static enforcement
  • E.g., via a type system
• Dynamic enforcement
  • Straightforward for dynamically tracking explicit flows (taint tracking)
  • Much harder for the case of implicit flows
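A minimal sketch of dynamic taint tracking, and of why it misses implicit flows. The Tainted wrapper is illustrative, not a real taint-tracking library:

```python
# Every value carries a taint bit; operations propagate it.
class Tainted:
    def __init__(self, value, tainted):
        self.value = value
        self.tainted = tainted

    def __add__(self, other):
        # Explicit flow: the result is tainted if either operand is.
        return Tainted(self.value + other.value,
                       self.tainted or other.tainted)

hi = Tainted(42, True)   # high/secret input
lo = Tainted(1, False)   # low/public input
x = hi + lo              # intermediate variable: taint propagates
print(x.tainted)         # True: x carries hi's information

# Implicit flows are missed: this copies hi's sign into a fresh,
# untainted value, since no tainted value is ever assigned.
y = Tainted(100 if hi.value > 0 else 1000, False)
print(y.tainted)         # False, even though y depends on hi
```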

SLIDE 22

Jif

• Jif: Java with static information flow control
• Jif augments Java types with labels
  • int {Alice:Bob} x;
  • Object {L} o;
• Subtyping follows the lattice order
• Type inference
  • Programmers may omit types; Jif will infer them from how values are used in expressions

[Myers]

SLIDE 23

Implicit Flows (1)

int{Alice:} a;
int{Bob:} b;
...
if (a > 0) then { b = 4; }

This assignment leaks information contained in the program counter (PC).

[Diagram: the label lattice with bottom {}, incomparable {Alice:} and {Bob:}, and top {Alice:; Bob:}]

PC label: {} before the branch; inside the branch, {} ⊔ {Alice:} = {Alice:}

SLIDE 24

Implicit Flows (2)

int{Alice:} a;
int{Bob:} b;
...
if (a > 0) then { b = 4; }

[Diagram: the label lattice with bottom {}, incomparable {Alice:} and {Bob:}, and top {Alice:; Bob:}]

PC label: {} before the branch; inside the branch, {} ⊔ {Alice:} = {Alice:}

To assign to a variable with label L, must have PC ≤ L [Zdancewic]
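The PC-label rule can be illustrated with a small runtime monitor. Jif enforces this rule statically; the dynamic sketch below and its names are only illustrative, using a two-point lattice LOW ≤ HIGH:

```python
# Rule: to assign to a variable with label L, the current PC label
# must satisfy pc <= L.
LOW, HIGH = 0, 1

class Monitor:
    def __init__(self):
        self.pc = LOW      # label of the program counter
        self.labels = {}   # variable name -> label

    def branch_on(self, guard_label):
        # Entering a branch joins the guard's label into the PC label.
        self.pc = max(self.pc, guard_label)

    def assign(self, var, var_label):
        if self.pc > var_label:
            raise RuntimeError(f"implicit flow into low variable {var!r}")
        self.labels[var] = var_label

m = Monitor()
m.assign("b", LOW)      # fine: pc is LOW
m.branch_on(HIGH)       # entering `if (a > 0)` where a's label is HIGH
try:
    m.assign("b", LOW)  # rejected: assignment to b under a HIGH pc
except RuntimeError as e:
    print("rejected:", e)
```

A faithful monitor would also restore the PC label when control leaves the branch; that bookkeeping is omitted here.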

SLIDE 25

Jif Caveats

• No threads
  • Information flow hard to control
  • Active area of current research
• Timing channels not controlled
  • Explicit choice for practicality
• Differences from Java
  • Some exceptions are fatal
  • Restricted access to some system calls

SLIDE 26

Enforcement Challenge: Declassification

• In realistic systems, disallowing all information flow from a higher level to a lower level is too prohibitive
• Very often, information needs to be declassified to a lower level
• Jif requires explicit declassification by programmers

SLIDE 27

Declassification Example

• A password-checking system

[Diagram: Password Checker with high input H: the real password; low input Lin: the user-input password; low output Lout: yes/no]

• The low output does depend on the real password
  • It reveals some info about the real password
  • Namely, whether the user-input password is correct or not
• However, the amount of information flow is extremely small, so we can declassify that output
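A sketch of such a checker, with declassify() standing in for Jif's declassify expression (the function names are hypothetical; in real Jif, declassification changes a value's label, not the value itself):

```python
def declassify(value):
    # Deliberate, audited downgrade of a high-labeled value to low.
    return value

def check_password(real_password, user_input):
    # real_password is the high input H; the one-bit comparison result
    # is the only information intentionally released to the low output.
    match = (user_input == real_password)
    return declassify(match)

print(check_password("hunter2", "hunter2"))  # True
print(check_password("hunter2", "guess"))    # False
```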