A Map for Security Science
Fred B. Schneider*
Department of Computer Science Cornell University Ithaca, New York 14853 U.S.A.
*Funded by AFOSR, NICECAP, NSF (TRUST STC), and Microsoft.
Maps = Features + Relations
Features: Land mass
Features
Relationships
[Cartoon map of the security landscape, with features such as Port Scan, Bugburg, Geekland, Bufferville, Malwaria, Root Kit Pass, Sploit Market, Valley of the …, Sea Plus Plus, Sea Sharp, … Reproduced courtesy Fortify Software Inc.]
… includes resource utilization and input/output to the environment.
Attacks exploit vulnerabilities.
Assumptions are potential vulnerabilities.
Operational description:
Semantic characterization:
[Lamport 77]
[Alpern+Schneider 85,87]
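The cited characterization can be written out explicitly; a sketch of the standard definitions, with notation assumed (Σ^ω the set of infinite traces, Σ* the finite ones, σ[..i] the length-i prefix of σ):

```latex
% Safety: once a "bad thing" happens, no extension can repair it.
\forall \sigma \in \Sigma^{\omega}:\;
  \sigma \notin P \;\Rightarrow\;
  \exists i \ge 0:\; \forall \tau \in \Sigma^{\omega}:\;
    \sigma[..i]\,\tau \notin P

% Liveness: every finite prefix can still be extended to satisfy P.
\forall \alpha \in \Sigma^{*}:\;
  \exists \tau \in \Sigma^{\omega}:\; \alpha\tau \in P
```

Every trace property is then the intersection of a safety property and a liveness property.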
Monitoring: Attack ↔ Defense ↔ Policy
– Gets control on every policy-relevant event
– Blocks execution if allowing the event would violate the policy
– Integrity of the EM is protected from subversion
Examples of EM-enforceable policies:
Examples of non EM-enforceable policies:
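An execution monitor can enforce a policy only by observing events and truncating the run before a violation. A minimal sketch of an EM as a security automaton, using a hypothetical "no Send after FileRead" policy (all event names here are illustrative, not from the talk):

```python
# Minimal sketch of an execution monitor (EM) as a security automaton.
# Hypothetical policy: no network Send after a FileRead.
# The monitor gets control on every policy-relevant event and blocks
# execution if allowing the event would violate the policy.

class PolicyViolation(Exception):
    pass

class SendAfterReadMonitor:
    def __init__(self):
        self.read_occurred = False  # the automaton's state

    def check(self, event):
        """Called before each policy-relevant event; raises to block it."""
        if event == "FileRead":
            self.read_occurred = True
        elif event == "Send" and self.read_occurred:
            raise PolicyViolation("Send after FileRead blocked")
        return event  # event is permitted

monitor = SendAfterReadMonitor()
allowed = []
for ev in ["Send", "FileRead", "Compute", "Send"]:
    try:
        allowed.append(monitor.check(ev))
    except PolicyViolation:
        break  # the EM truncates the execution at the violation
```

Because truncation is the EM's only recourse, only safety properties (policies violated at a finite point in a single run) are enforceable this way.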
Monitoring: Attack ↔ Defense ↔ Policy
[Diagram: SASI pipeline — Policy + Application → Insert (P → P′) → Specialize (P′ → P″) → Compile → Secure application.]
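The insertion step of the pipeline can be sketched abstractly. Assuming a toy representation of a program as a list of named operations (an assumption for illustration, not the SASI implementation), an inliner merges the monitor into the application by placing a check before each policy-relevant operation:

```python
# Sketch of SASI-style rewriting: policy checks are inlined into the
# application itself rather than run in a separate reference monitor.
# Toy setup (assumed): a "program" is a list of named operations.

def insert_checks(program, relevant):
    """Rewrite P into P' by inserting a check before each relevant op."""
    rewritten = []
    for op in program:
        if op in relevant:
            rewritten.append(("check", op))  # inlined monitor step
        rewritten.append(("do", op))
    return rewritten

program = ["FileRead", "Compute", "Send"]
p_prime = insert_checks(program, {"FileRead", "Send"})
# p_prime interleaves inlined checks with the original operations
```

A later specialization pass could then remove checks that analysis proves can never fail, yielding P″.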
Monitoring: Attack ↔ Defense ↔ Policy
[Diagram: SASI pipeline (Policy + Application → Insert → Specialize → Optimize → Compile → Secure application; P → P′ → P″) alongside PCC: an application packaged with a proof Pr.]
Monitoring: Attack ↔ Defense ↔ Policy
[Diagram: reference monitor (RM) placed in the kernel, mediating the program's policy-relevant events.]
[Goguen-Meseguer 82]
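The cited notion of noninterference can be stated in one common formulation (notation assumed): deleting the commands of high-level users from any input sequence does not change what low-level users observe.

```latex
% Noninterference (one common formulation of Goguen--Meseguer):
% purge_H(w) deletes all commands issued by high-level users from w;
% out(w, u) is what user u observes after input sequence w.
\forall w \in \Sigma^{*},\; \forall u \in \mathit{Low}:\quad
  \mathit{out}(w, u) \;=\; \mathit{out}\bigl(\mathit{purge}_{H}(w),\, u\bigr)
```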
Examples: possibility, statistical performance, etc.
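Such examples are predicates over *sets* of executions rather than over single executions, which is why no execution monitor can enforce them. A sketch of the contrast, with toy traces (all data and names here are assumed, for illustration only):

```python
# Sketch: trace properties vs. policies over sets of traces.
# A trace property can be checked one execution at a time; policies such
# as statistical performance cannot. Toy traces of (event, time) pairs.

traces = [
    [("req", 0), ("resp", 3)],
    [("req", 0), ("resp", 9)],
]

# Trace property: every response in THIS run arrives within 10 time units.
def deadline_ok(trace):
    return all(t <= 10 for (_, t) in trace)

# Not a trace property: MEAN response time across all runs is <= 5.
def mean_resp_ok(trace_set):
    times = [t for tr in trace_set for (ev, t) in tr if ev == "resp"]
    return sum(times) / len(times) <= 5

per_trace = all(deadline_ok(tr) for tr in traces)   # checkable run by run
aggregate = mean_resp_ok(traces)                    # needs the whole set
```

Here every individual run meets its deadline, yet the aggregate policy fails: no single execution, inspected in isolation, reveals the violation.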
Obfuscation: Attack ↔ Defense ↔ Policy
Attacker knows:
– Obfuscator T
– Input program S
Attacker does not know:
– Random keys K1, K2, …, Kn
Knowledge of the Ki would enable attackers to automate attacks!
Converts integrity compromise into availability compromise.
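One way to illustrate the role of the keys Ki: a sketch in the spirit of instruction-set randomization (the XOR encoding and all names are assumptions for illustration, not the talk's construction). Code injected without knowledge of the key decodes to garbage, so the attack typically crashes the morph rather than corrupting it.

```python
import secrets

# Sketch of key-based obfuscation. Each "morph" of the program stores
# its code XOR-encoded under a random key Ki; the loader decodes before
# execution. An attacker who injects raw bytes without knowing Ki gets
# gibberish after decoding: integrity compromise -> availability compromise.

KEY = secrets.token_bytes(16)          # the secret Ki for this morph

def encode(code: bytes) -> bytes:
    return bytes(b ^ KEY[i % len(KEY)] for i, b in enumerate(code))

decode = encode                        # XOR is its own inverse

legit = b"ADD R1, R2"
stored = encode(legit)
assert decode(stored) == legit         # the loader recovers real code

injected = b"JMP shell"                # attacker writes raw bytes
garbled = decode(injected)             # decodes under Ki -> junk
# With overwhelming probability garbled != injected, so the injected
# "instructions" fail to execute as intended: the run crashes instead
# of doing the attacker's bidding.
```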
Obfuscation: Attack ↔ Defense ↔ Policy
– Implied by the usual notion of “strong typing”.
– A stronger type system than necessary: e.g., if x[i] = x[i] then skip is not type-safe but is not affected by T.
– A better approximation than “pointer de-ref sanity” types.
– Low-integrity value: can vary from morph to morph.
Obfuscation: Attack ↔ Defense ↔ Policy
Type systems:
Obfuscation
Defined: Characterization of policy: hyper-policies
Relationship: Class of defense (EM) and class of policies it can enforce
Relationship: Class of defense (obfuscation) and class of policies it can enforce
Science, meaning focus on process: invention, measurement + insight
Science, meaning focus on results: theorems, not artifacts
Engineering, meaning focus on artifacts: proof of concept; measurement
Is “code safety” universal for enforcement?
Can sufficiently introspective defenses always be …
Absolute security vs
Prevention vs
Perfection vs
Enforcement vs
*SSR: Single Security Researcher