Course Overview, Introduction to Economics and Security, Tyler Moore




Course Overview, Introduction to Economics and Security

Tyler Moore

CSE 7338 Computer Science & Engineering Department, SMU, Dallas, TX

Lecture 1

Outline

1. Logistics
2. Motivation
3. Intro to Economics: Key notions
4. Intro to Economics: Preferences
5. Intro to Economics: Utility
6. Intro to Economics: Expected utility
7. Economics of IT
8. Market failures
9. A very brief introduction to security

Logistics

Course website

Most info: http://lyle.smu.edu/~tylerm/courses/econsec/
Blackboard for announcements
YouTube channel for R screencasts


Syllabus

http://lyle.smu.edu/~tylerm/courses/econsec/admin/syllabus.html


Calendar

http://lyle.smu.edu/~tylerm/courses/econsec/admin/schedule.html

Motivation

Why is a computer scientist talking about economics?

The conventional CS approach to security has failed

1. Enumerate possible threats
2. Define attacker capabilities
3. Build systems to protect against these threats

Worked for encryption algorithms, but not Internet security

Why computer science alone can't fix information security

Evidence of security failures: data breaches


Evidence of security failures: phishing websites


Evidence of security failures: botnets


Evidence of security failures: critical infrastructure

Source: http://www.wired.com/images_blogs/threatlevel/2010/11/w32_stuxnet_dossier.pdf

Evidence of security failures: critical infrastructure


Evidence of security failures: critical infrastructure

Source: http://www.cl.cam.ac.uk/~fms27/papers/2011-Leverett-industrial.pdf


Why economics offers a useful perspective

But why economics?

Economics is a social science

Studies the behavior of individuals and firms in order to predict outcomes
Models of behavior are based on systematic observation
Usually cannot run experiments as in bench science, but economics has developed ways to cope with the differences inherent in observing the world

Economics studies trade-offs between conflicting interests

Recognizes that people operate strategically
Economists have devised ways to model people's interests and decision making


Economics is not just about money

Money helps to reveal preferences
Money can serve as a common measure for costs and benefits
As a discipline, economics examines much more than interactions involving money

Economics studies trade-offs between conflicting interests
Conflicting interests and incentives appear in many circumstances where money never changes hands

How economics can help information security

Attackers operate strategically

Cannot expect attackers to respect stated assumptions of behavior

Threat modeling focuses an engineer's task, which can harden a resource against particular attacks
But system design does not exist in a vacuum: attackers can adapt to find holes in areas not considered by the threat model

Must understand what motivates attackers

For cybercriminals this could be profit
For hacktivists this could be attention and disruption
In each case, attackers will seek the least costly way to reach their goal


Botnet operators operate strategically (motivated by $)


Phishing gangs operate strategically (exploit weakest link)

[Figure: phishing-site lifetime in days, March to May, broken down by .hk (Hong Kong) and .cn (China) domains]

Source: Moore & Clayton (2007), own aggregation

Take-down latency for phishing attacks targeting different registrars in spring 2007; lines are five-day moving averages broken down by top-level domain


Defenders also operate strategically

Those responsible for protecting information systems naturally must consider their own interests
Often, there are multiple stakeholders responsible for defense
Sometimes defenders' interests conflict
Sometimes the interests of defenders do not align with those of society


Let’s return to critical infrastructure protection


Incentives for critical infrastructure protection

Critical infrastructure operators

+ Upgrading to IP-based systems brings huge efficiency gains

− Maintaining physical separation of networks reduces efficiency and drives up operating costs
− Likelihood of an attack is low (based on history)
− Cost of attack largely borne by society

Consumers

+ Value reliability of service, including against attack

− Prefer low-cost service
− Cannot distinguish between security investments among firms

Governments

+ Value reliability of service, including against attack
+ Fear political consequences of an attack, given national defense remit
− Lack of budget to fund security
− Lack of expertise to improve security on privately controlled systems


So what’s the outcome?

Absent regulation to compel behavior, stakeholders act in their own interest based on their incentives and capabilities

Only operators, not consumers or governments, are capable of improving security
So their incentives matter most!
On balance, they are likely to tolerate a high level of insecurity in their systems

We can also compare this outcome to what seems ‘best’

In economics jargon, this is the search for the social optimum
The social optimum maximizes expected utility
More detail on how to compute this later on, but for now, we can intuit what the social optimum might be

Question #1: is complete security of critical infrastructures socially optimal?
Question #2: why hasn't the market delivered the socially optimal outcome?


Economics makes information security empirically grounded

Traditional threat modeling states that an attack is possible and should be protected against by definition
But what if the threats we envision differ from what actually happens?
An economic perspective approaches threat modeling by observing behavior
This allows us to construct a more accurate picture of the risks due to information insecurity


Economics suggests policies to deploy technology better

In addition to describing why security fails and how attackers and defenders operate, economics can recommend policies to improve security
Technology alone cannot fix the challenges facing information security; instead, policy can correct market limitations to help security technologies succeed
We will discuss many of the options in this course (ex ante safety regulation, ex post liability, cyberinsurance, . . . )
Today we briefly discuss one example: information disclosure


Recall our first example? Made possible through policy


Information disclosure

Louis Brandeis: “sunlight is said to be the best of disinfectants”
Information security incidents are often hidden from public view, so one light-touch intervention is to mandate disclosure


Data breach legislation

California Civil Code 1798.82 (2002): “Any person or business that conducts business in California, and that owns or licenses computerized data that includes personal information, shall disclose any breach of the security of the system following discovery or notification of the breach in the security of the data to any resident of California whose unencrypted personal information was, or is reasonably believed to have been, acquired by an unauthorized person.”

Deirdre Mulligan


Many high-profile breaches came to light


Many high-profile breaches came to light


Effect of data breach legislation

Data breach legislation has two main goals

1. Consumer empowerment: give people the chance to protect themselves following a breach
2. Offer an incentive for firms to make breaches less likely

Can also be viewed as a means of correcting information asymmetries

1. Between consumers and firms
2. Between competing firms


Recap of what economics offers information security

Means of understanding strategic behavior (for attackers and defenders)
Makes information security empirically grounded
Suggests policies to deploy technology better

Intro to Economics: Key notions

Why again are we studying economics?

Economics is a social science

Studies the behavior of individuals and firms in order to predict outcomes
Models of behavior are based on systematic observation
Usually cannot run experiments as in bench science, but economics has developed ways to cope with the differences inherent in observing the world

Economics studies trade-offs between conflicting interests

Recognizes that people operate strategically
Economists have devised ways to model people's interests and decision making


Economics is not just about money

Money helps to reveal preferences
Money can serve as a common measure for costs and benefits
As a discipline, economics examines much more than interactions involving money

Economics studies trade-offs between conflicting interests
Conflicting interests and incentives appear in many circumstances where money never changes hands


Notion of Model

[Figure: reality (a market with price and quantity) is reduced by projection to a model (supply and demand)]

All models are wrong. Some are useful.


Types of models used in economics

1. Analytical models: state plausible assumptions about agents' behavior, then examine the implications
   + Good for theoretical analysis of individual behavior
   − When models disagree, ground truth can be elusive
2. Empirical models: observe relationships in aggregate, without explaining underlying individual decisions
   + Ground truth is achievable
   − Oversimplify, can't explain underlying mechanisms
3. Measurement models: collect data to compare deviations from predictions made by analytical models
   − Directly applying empirical analysis to analytical models usually fails
   + Offers feedback to analytical models to validate predictions


Model Complexity and Scientific Discovery

[Figure: empirical observation of falling bodies, with v1 < v2 in air and v1 = v2 in vacuum]
→ Drag is part of a complex model
→ Reduction to a simple model: drag causes measurement error


Model Complexity and Generalizability

[Figure: the same data (dependent variable y versus independent variable x) fit by a simple model, y = 1.05 + 0.5x, and a complex model, y = −1.3 + 6.5x − 3.8x² + 0.6x³, with the prediction error shown for each]

Measure of complexity for predictive models: number of estimated parameters
→ Risk of overfitting increases with model complexity


Trade-off on Model Complexity

[Figure: model error decreases and modeling effort increases with the number of parameters]

Occam's razor → William of Occam († 1349): principle of model parsimony


Occam’s Razor

William of Occam, 1285–1349

entia non sunt multiplicanda praeter necessitatem

entities must not be multiplied beyond necessity

Intro to Economics: Preferences

Our first model: rational choice theory

Economics attempts to model the decisions we make when faced with multiple choices and when interacting with other strategic agents
Rational choice theory (RCT): a model for decision-making
Game theory (GT): extends RCT to model strategic interactions


Rationality defined

Intuitive definition: a rational individual acts in his or her perceived best interest
Rationality is what motivates a focus on incentives
Question: can you think of scenarios when this definition does not hold in practice?
To arrive at a precise definition: use rational choice theory to state available outcomes, articulate preferences among them, and decide accordingly


Model of preferences

An agent is faced with a range of possible outcomes o1, o2 ∈ O, the set of all possible outcomes.
Notation:
o1 ≻ o2: the agent strictly prefers o1 to o2
o1 ⪰ o2: the agent weakly prefers o1 to o2
o1 ∼ o2: the agent is indifferent between o1 and o2
Outcomes can also be viewed as tuples of different properties x̂, ŷ ∈ O, where x̂ = (x1, x2, . . . , xn) and ŷ = (y1, y2, . . . , yn)


Rational choice axioms

Rational choice theory assumes consistency in how outcomes are preferred.
Axiom (Completeness). For each pair of outcomes o1 and o2, exactly one of the following holds: o1 ≻ o2, o1 ∼ o2, or o2 ≻ o1.
⇒ Outcomes can always be compared
Axiom (Transitivity). For each triple of outcomes o1, o2, and o3, if o1 ≻ o2 and o2 ≻ o3, then o1 ≻ o3.
⇒ People make choices among many different outcomes in a consistent manner
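The two axioms are mechanical enough to check in code. A minimal sketch in Python (the outcome names, scores, and the `satisfies_axioms` helper are illustrative, not from the lecture):

```python
from itertools import permutations

def satisfies_axioms(outcomes, prefers):
    """prefers(a, b) is True when a is strictly preferred to b;
    indifference means neither direction holds."""
    # Completeness (strict part): never both a > b and b > a
    for a, b in permutations(outcomes, 2):
        if prefers(a, b) and prefers(b, a):
            return False
    # Transitivity: a > b and b > c must imply a > c
    for a, b, c in permutations(outcomes, 3):
        if prefers(a, b) and prefers(b, c) and not prefers(a, c):
            return False
    return True

# A ranking induced by a numeric score always satisfies the axioms
score = {"o1": 3, "o2": 2, "o3": 1}
print(satisfies_axioms(list(score), lambda a, b: score[a] > score[b]))  # True

# A preference cycle o1 > o2 > o3 > o1 violates transitivity
cycle = {("o1", "o2"), ("o2", "o3"), ("o3", "o1")}
print(satisfies_axioms(["o1", "o2", "o3"], lambda a, b: (a, b) in cycle))  # False
```

The second call is exactly the kind of cyclic preference the transitivity axiom rules out.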


Example: trade-off between confidentiality and availability using cryptography

[Figure: Alice sends “I love your music” to Bob; Eve eavesdrops and Mallory tampers (“hate”)]


Example: trade-off between confidentiality and availability using cryptography

Outcomes O

c⊕: mechanism achieving high confidentiality
c⊖: mechanism achieving low confidentiality
a⊕: mechanism achieving high availability
a⊖: mechanism achieving low availability

Preferences

c⊕ ≻ c⊖ and a⊕ ≻ a⊖
Taken together: (c⊕, a⊕) ≻ (c⊖, a⊖)
Question: what about high availability and low confidentiality?
Indifferent: (c⊕, a⊖) ∼ (c⊖, a⊕)


Indifference curves

[Figure: indifference curves in the (confidentiality, availability) plane; the points (a⊖, c⊕), (a⊕, c⊖), and (a◦, c◦) lie on the same curve]

Intro to Economics: Utility

From preferences to utility

It's great to express preferences, but to make mathematical analysis of decisions possible, we need to transform these preferences into numbers.
We need a measure of utility, but what does that actually mean?


We do not mean utility according to Bentham

Founder of utilitarianism: “fundamental axiom, it is the greatest happiness of the greatest number that is the measure of right and wrong”
Utility: preferring “pleasure” over “pain”
Jeremy Bentham


Utility

Rational choice theory defines utility as a way of quantifying consumer preferences.
Definition (Utility function). A utility function U maps a set of outcomes onto real-valued numbers, that is, U : O → R. U is defined such that U(o1) > U(o2) ⟺ o1 ≻ o2.
Agents make a rational decision by picking the outcome with highest utility:

o∗ = arg max_{o∈O} U(o)   (1)


Example utility functions

U(o1, o2) = u · o1 + v · o2

Useful when outcomes are substitutes
Example substitutes: processor speed and RAM

U(o1, o2) = min{u · o1, v · o2}

Useful when outcomes are complements
Example complements: operating system and third-party software
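A quick numerical sketch of the two functional forms (the weights and the quantities below are illustrative, not from the slides):

```python
def u_substitutes(o1, o2, u=1.0, v=1.0):
    # Substitutes: more of either good raises utility
    return u * o1 + v * o2

def u_complements(o1, o2, u=1.0, v=1.0):
    # Complements: utility is capped by the scarcer good
    return min(u * o1, v * o2)

# Adding more of good 2 alone helps when the goods are substitutes...
print(u_substitutes(2, 4) - u_substitutes(2, 2))  # 2.0
# ...but not when they are complements, where good 1 is the bottleneck
print(u_complements(2, 4) - u_complements(2, 2))  # 0.0
```

This is the intuition behind the examples: extra RAM helps on its own, but third-party software without an operating system does not.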


Returning to our crypto example

First, we need a utility function

U(ai, ci) = u · ai + v · ci
Question: why is this a good choice?
For simplicity, we assign a⊕ = 1, a⊖ = −1, c⊕ = 1, and c⊖ = −1
Utility is in the eye of the beholder
We consider two scenarios:
Intelligence agency (u = 1 and v = 3)
First responders (u = 3 and v = 1)


Utility of different outcomes

Outcome   | UFR (first responder) | Uintel (intelligence)
(a⊕, c⊕) | 4                     | 4
(a⊕, c⊖) | 2                     | −2
(a⊖, c⊕) | ?                     | ?
(a⊖, c⊖) | ?                     | ?
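The remaining table entries follow mechanically from U(a, c) = u·a + v·c with the slide's assignments; a sketch:

```python
def utility(a, c, u, v):
    # Linear utility over availability a and confidentiality c
    return u * a + v * c

levels = {"+": 1, "-": -1}  # a and c take value +1 (high) or -1 (low)
for a_lvl in "+-":
    for c_lvl in "+-":
        a, c = levels[a_lvl], levels[c_lvl]
        print(f"(a{a_lvl}, c{c_lvl}):",
              "U_FR =", utility(a, c, u=3, v=1),     # first responders
              "U_intel =", utility(a, c, u=1, v=3))  # intelligence agency
```

The printed rows reproduce the known entries (4/4 and 2/−2), which is a useful sanity check before filling in the question marks.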

Intro to Economics: Expected utility

Why isn’t utility theory enough?

Only rarely do the actions people take directly determine outcomes
Instead there is uncertainty about which outcome will come to pass
A more realistic model: the agent selects action a from the set of all possible actions A, and outcomes O are then associated with a probability distribution


Lotteries

Definition (Lottery). A lottery is a mapping from all outcomes (o1, o2, . . . , on) ∈ O to probabilities corresponding to each outcome (p1, p2, . . . , pn), where p1 + p2 + · · · + pn = 1. A lottery l1 is represented as l1 = o1 : p1, o2 : p2, . . . , on : pn.
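A lottery can be represented as a mapping from outcomes to probabilities; a minimal validity check (the outcome labels are illustrative):

```python
def is_lottery(lottery, tol=1e-9):
    # A valid lottery has non-negative probabilities that sum to one
    probs = lottery.values()
    return all(p >= 0 for p in probs) and abs(sum(probs) - 1.0) < tol

l1 = {"attack": 0.01, "no attack": 0.99}
print(is_lottery(l1))                    # True
print(is_lottery({"a": 0.5, "b": 0.4}))  # False: probabilities sum to 0.9
```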


Where does randomness come from?

Indeterminism in nature
Lack of knowledge
Incompleteness in the model
Uncertainty concerns which outcome will occur

⇒ Known unknowns, NOT unknown unknowns


Expected utility

Definition (Expected utility (discrete)). The expected utility of an action a ∈ A is defined by adding up the utility for all outcomes weighted by their probability of occurrence:

E[U(a)] = Σ_{o∈O} U(o) · P(o|a)   (2)

Agents make a rational decision by maximizing expected utility:

a∗ = arg max_{a∈A} E[U(a)]   (3)


Example: process control system security

Source: http://www.cl.cam.ac.uk/~fms27/papers/2011-Leverett-industrial.pdf


Example: process control system security

Actions available: A = {disconnect, connect}
Outcomes available: O = {attack, no attack}
If systems are connected, the probability of a successful attack is 0.01 (P(attack|connect) = 0.01)
If systems are disconnected, P(attack|disconnect) = 0


Example: process control system security

Action     | U(attack) | P(attack|action) | U(no attack) | P(no attack|action) | E[U(action)]
disconnect | −100      | 0                | 5            | 1                   | ?
connect    | −100      | 0.01             | 10           | 0.99                | ?

E[U(a)] = Σ_{o∈O} U(o) · P(o|a)
E[U(disconnect)] = U(attack) · P(attack|disconnect) + U(no attack) · P(no attack|disconnect) = −100 · 0 + 5 · 1 = 5


Example: process control system security

Action     | U(attack) | P(attack|action) | U(no attack) | P(no attack|action) | E[U(action)]
disconnect | −100      | 0                | 5            | 1                   | 5
connect    | −100      | 0.01             | 10           | 0.99                | ?

E[U(connect)] = U(attack) · P(attack|connect) + U(no attack) · P(no attack|connect) = −100 · 0.01 + 10 · 0.99 = 8.9
⇒ a risk-neutral IT security manager chooses to connect, since E[U(connect)] > E[U(disconnect)].
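The two expected-utility computations above can be reproduced directly from Equation (2); a sketch using the slide's numbers:

```python
def expected_utility(rows):
    # rows: (utility, probability) pairs over the outcomes of one action
    return sum(u * p for u, p in rows)

eu_disconnect = expected_utility([(-100, 0.0), (5, 1.0)])   # -100*0 + 5*1
eu_connect = expected_utility([(-100, 0.01), (10, 0.99)])   # -1 + 9.9

print(eu_disconnect)  # 5.0
# The risk-neutral choice is the action with the larger expected utility
print("connect" if eu_connect > eu_disconnect else "disconnect")  # connect
```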

Attitudes toward risk

Let’s make a deal

Option 1: Take $10
Option 2: Get $20 with a 50% chance, $0 otherwise
Which would you choose?
E[U] = 0.5 · $20 + 0.5 · $0 = $10
Prefer option 1: you're risk-averse
Prefer option 2: you're risk-seeking
Are you indifferent? If so, you're risk-neutral


Let’s make a deal (round 2)

Option 1: Take $10
Option 2: Get $150 with a 10% chance, $0 otherwise
Which would you choose?
E[U] = 0.1 · $150 + 0.9 · $0 = $15
Prefer option 1: you're risk-averse
Prefer option 2: you're risk-neutral or risk-seeking


Let’s make a deal (round 3)

Option 1: Take $10
Option 2: Get $50 with a 10% chance, $0 otherwise
Which would you choose?
E[U] = 0.1 · $50 + 0.9 · $0 = $5
Prefer option 1: you're risk-averse or risk-neutral
Prefer option 2: you've got a gambling problem


Risk attitudes depend on the behavior of the utility function

[Figure: utility U(o) as a function of outcomes o: linear for risk-neutral, concave for risk-averse, convex for risk-seeking agents]


Risk-averse prefer utility of expected value over lottery

Source: Varian, Intermediate Microeconomics, p. 225


Risk-seekers prefer lottery over utility of expected value

Source: Varian, Intermediate Microeconomics, p. 226

From attitudes to utility

Suppose that outcomes are numeric, O ⊆ R
When might that happen?
Then we can define risk attitudes by how the utility function behaves
Definition (Risk neutrality). An agent is risk-neutral when U(o) is a linear function of o.


From attitudes to utility

Definition (Risk aversion). An agent is risk-averse when U(o) is a concave function (i.e., U′′(o) < 0 for a twice-differentiable function).
Definition (Risk seeking). An agent is risk-seeking when U(o) is a convex function (i.e., U′′(o) > 0 for a twice-differentiable function).


Example: antivirus software

Suppose you have $10,000 in wealth. You have the option to buy antivirus software for $75.
Outcomes available: O = {hacked (decreases wealth by $2,000), not hacked (no change in wealth)}
Without AV software, the probability of being hacked is 0.05 (P(hacked|no antivirus) = 0.05)
With AV software, the probability of being hacked is 0 (P(hacked|antivirus) = 0)
Exercise: compute the expected utility of both buying and not buying AV if you are risk-neutral (so that U(o) = o). Would you buy AV software?
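One way to work the risk-neutral case: with U(o) = o, expected utility is just expected wealth. A sketch with the slide's numbers:

```python
WEALTH, AV_COST, LOSS = 10_000, 75, 2_000
P_HACK_NO_AV = 0.05  # from the slide; with AV the probability is 0

# Buy AV: pay $75, no hack risk remains
eu_buy = WEALTH - AV_COST
# Don't buy: keep the $75 but risk losing $2,000 with probability 0.05
eu_dont = (1 - P_HACK_NO_AV) * WEALTH + P_HACK_NO_AV * (WEALTH - LOSS)

print(eu_buy, round(eu_dont))  # 9925 9900
# A risk-neutral agent buys AV here, since 9925 > 9900
```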


Example: antivirus software

What if you are risk-averse (so that U(o) = √o)?

Action    | U(hack) | P(hack|action) | U(no hack) | P(no hack|action) | E[U(action)]
buy AV    | √9,925  | 0              | √9,925     | 1                 | 99.6
don't buy | √8,000  | 0.05           | √10,000    | 0.95              | 99.5

Exercise (on your own): how much would you pay for antivirus software if you were risk-neutral and the probability of getting hacked is 0.1 if you don't have AV installed?
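The square-root entries can be checked numerically; a sketch (rounded to one decimal as in the table):

```python
import math

def expected_sqrt_utility(rows):
    # Risk-averse agent: U(wealth) = sqrt(wealth) is concave
    return sum(math.sqrt(wealth) * p for wealth, p in rows)

eu_buy = expected_sqrt_utility([(9925, 1.0)])                   # 10000 - 75, no risk
eu_dont = expected_sqrt_utility([(8000, 0.05), (10000, 0.95)])  # hacked w.p. 0.05

print(round(eu_buy, 1), round(eu_dont, 1))  # 99.6 99.5
# The risk-averse agent also buys AV, since eu_buy > eu_dont
```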

Economics of IT

IT Economics

Economic “rules” for the IT industry differ from those for other industries (Shapiro and Varian 1998)
Rule #1: Network effects
Value of a network grows super-linearly with its size (n² or n log n)
Fax machines, operating systems, social networks, . . .
Upshot: hard to bootstrap success, hard for competitors to dislodge once successful
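The super-linear growth claim is easy to see numerically; a sketch comparing n² with the more conservative n log n estimate (the function names are just labels):

```python
import math

def metcalfe_value(n):
    # Metcalfe-style valuation: value grows with the n^2 possible pairs
    return n * n

def nlogn_value(n):
    # More conservative super-linear estimate: n log n
    return n * math.log(n)

# Doubling the network more than doubles its value under both models
print(metcalfe_value(200) / metcalfe_value(100))  # 4.0
print(nlogn_value(200) / nlogn_value(100) > 2)    # True
```

This is why bootstrapping is hard: at small n the value is tiny, but an incumbent with a large n enjoys a value gap that a new entrant cannot match.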


Consequence of insecure BGP


Network effects and information security

Many technical security solutions become effective only when many people adopt them
Introduced in 1996, S-BGP authenticates the paths routers advertise and could have prevented Pakistan Telecom from shutting down YouTube
However, S-BGP is only valuable if all ISPs switch
Why is email still sent unauthenticated?
Security protocols which have succeeded offer immediate value to adopting firms (e.g., SSH)


IT Rule #2: High fixed costs and low marginal costs of production

Traditional industry: high fixed costs and high marginal costs
IT industry: high fixed costs and very low marginal costs
Competition drives price down to the marginal cost of production (i.e., $0 for IT industries!)


IT Rule #3: Switching costs determine value

Switching from one IT product or service to another is usually expensive
Shapiro-Varian theorem: the net present value of a software company is its total switching costs
Once you have $1,000 worth of songs on iTunes, you're locked into Apple's ecosystem
Why can Microsoft still charge for Office despite free alternatives?
Beware security mechanisms used to promote lock-in (e.g., digital rights management)

Public goods

Most goods can be privately consumed (e.g., cars, food)
But some things can't be privately consumed (e.g., national defense, grazing commons)
Public goods have two characteristics that make them hard to allocate efficiently:
Non-rivalrous: individual consumption does not reduce what's available to others
Non-excludable: no practical way to exclude people from consuming
Public goods tend to be delivered at less than what is socially optimal


The IT sector faces inherent impediments to competition

Network effects tend toward dominant platforms
Information goods have practically zero marginal cost
Information goods are also non-rivalrous; firms use DRM to make them excludable and turn a profit
These factors combine to produce dominant-firm markets with a big first-mover advantage
Microsoft's philosophy of “we'll ship it Tuesday and get it right by version 3” was not perverse behavior by Bill Gates but quite rational


Market failures

When markets fail

Market failures occur when the free-market outcome is inefficient:
Monopolies/oligopolies
Public goods
Information asymmetries
Externalities
Market failures justify regulatory intervention and inform how public policy should be designed
They help explain why private investment in information security is often suboptimal


Markets with asymmetric information


Akerlof’s market for lemons

Suppose a town has 20 similar used cars for sale:
10 “cherries” valued at $2,000 each
10 “lemons” valued at $1,000 each
What is the market-clearing price?
Answer: $1,000. Why?
Buyers cannot determine car quality, so they refuse to pay a premium for a high-quality car
Sellers know this, so only owners of lemons will sell for $1,000
The market is flooded with lemons
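The unraveling logic can be sketched as a tiny simulation, using the slide's valuations; the rule that an owner sells only when the price covers the car's value is a simplifying assumption:

```python
CHERRY, LEMON = 2000, 1000
cars = [CHERRY] * 10 + [LEMON] * 10

def offered_at(price):
    # Simplifying rule: an owner sells only if the price covers the car's value
    return [v for v in cars if v <= price]

def buyer_valuation(price):
    # Buyers can't observe quality, so they value the average car on offer
    offered = offered_at(price)
    return sum(offered) / len(offered) if offered else 0

# Below $2,000 only lemons are offered, so the average car on offer is a lemon
print(buyer_valuation(1500))  # 1000.0
# Buyers thus won't pay above $1,000, and the market clears at the lemon value
print(buyer_valuation(1000))  # 1000.0
```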


Secure software is a market for lemons

Vendors may believe their software is secure, but buyers have no reason to believe them So buyers refuse to pay a premium for secure software, and vendors refuse to devote resources to do so How might the information asymmetry be reduced?

Certification schemes as a signaling device Collect more data on secure and insecure outcomes


Certification schemes

Common Criteria certification

Sometimes useful, but may be gamed
Evaluation is paid for by the vendor seeking approval, leading to test-shopping


Another certification scheme “fail”


Not all shoe sites are created equal


Adverse selection in certification schemes

Edelman used data from SiteAdvisor to identify sites distributing spam and malware as “bad”
He then found that such “bad” companies are more likely to be TRUSTe-certified: 5.4% of TRUSTe-certified sites are “bad”, compared with 2.5% of all sites
Similarly, untrustworthy sites are over-represented in paid advertisement links compared to organic search results
This is called adverse selection
In health insurance, adverse selection occurs when sick people are more likely to buy coverage than healthy people
It is a consequence of markets with asymmetric information


Moral hazard

The second classical outcome of asymmetric information is moral hazard

People may drive recklessly if fully insured with $0 deductible

Moral hazard in information security

It is often claimed that consumers engage in moral hazard due to $0 card-fraud liability
But it cuts both ways: when regulations favor banks, they can behave recklessly in combating fraud


Another option to remedy information asymmetries: disclosure requirements

California Civil Code 1798.82 (2002): “Any person or business that conducts business in California, and that owns or licenses computerized data that includes personal information, shall disclose any breach of the security of the system following discovery or notification of the breach in the security of the data to any resident of California whose unencrypted personal information was, or is reasonably believed to have been, acquired by an unauthorized person.”

Deirdre Mulligan


Externalities


Externalities

A cost (or benefit) incurred by a party who did not agree to the transaction causing the harm (or benefit)
Positive externalities tend toward under-provision
Negative externalities tend toward over-provision
Environmental pollution is a negative externality:
The factory produces a good and gets paid by the buyer
The pollution caused by production is not accounted for in the transaction
Information insecurity is a negative externality


Botnets

Source: http://en.wikipedia.org/wiki/File:Botnet.svg

Botnet infections as an externality

Botnets carry out the tasks requested by the botnet herder:
Send spam
Host phishing websites
Distribute malware
Launch denial-of-service attacks

Many tasks assigned to bots are designed to harm others more than their host

98 / 135 A very brief introduction to security Some definitions

Safety vs. Security

Safety:
Protects against accidents
Defends against nature
Can be modeled using probability theory with historical data

Security:
Protects against intentional malice
Defends against intelligent beings
Must model the strategy of adversaries


Safety vs. Security

Question: If you were in charge of a building's security, how would preparations differ for a tornado versus a terrorist attack?
Hint: When preparing for a tornado, should you consider whether neighboring buildings have been protected?


What is digital information?

Definition: digital information is information encoded in discrete numbers, e.g. “Hi!” → 0x486921
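The encoding above can be checked in a few lines (a minimal sketch; any language with hex conversion would do):

```python
# "Hi!" encoded as the discrete numbers shown above.
text = "Hi!"
encoded = "0x" + text.encode("ascii").hex().upper()
print(encoded)  # 0x486921

# And back again: the numbers decode to the original string.
decoded = bytes.fromhex(encoded[2:]).decode("ascii")
print(decoded)  # Hi!
```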

102 / 135 A very brief introduction to security Some definitions

What are the implications of digital representation of information?

1. Perfect copies can be created at no cost
2. Information can be transmitted anywhere immediately
3. Information can be remembered indefinitely

⇒ Easy to keep detailed records of transactions

4. Digitally encoded information lacks provenance

⇒ Modifications can’t be identified by just looking at the data

103 / 135 A very brief introduction to security Some definitions

What is information security?

Information security is the endeavor to achieve protection goals specific to information. What are those goals?

1. Confidentiality: information is accessible only to authorized parties
2. Integrity: modification of information can be detected
3. Availability: authorized parties can access information (and use resources) when and where it is needed

104 / 135 A very brief introduction to security Some definitions

Who are these authorized parties the definitions speak of?

Who is an authorized party? How are they authorized, and by whom? Parties: human beings controlling a computer system, or programs acting on their behalf. Authorization: the decision a principal must take on whether a party is allowed to undertake a task. The authorization decision is the fundamental challenge of security engineering.

105 / 135


A very brief introduction to security Some definitions

Identification vs. Authentication vs. Authorization

Identification, authentication and authorization answer different questions

Identification: Who are you?
Authentication: Is it really you?
Authorization: Knowing who you are, are you allowed to do something?

Common mistake: conflating these concepts

Deploying an authentication system does not solve the authorization problem

106 / 135 A very brief introduction to security Some definitions

How computers identify people

In order to authorize a user to access computer resources, systems must figure out who they’re interacting with. Computer systems store (ID, attribute) pairs. Upon encountering a user, the system prompts for the ID and attribute. IDs should be unique. If the attribute is known only to the user (e.g., a password), it can be used to authenticate the user to the system.
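A minimal sketch of the (ID, attribute) scheme; the function names are illustrative. Real systems use slow, salted password hashes (bcrypt, scrypt, argon2) rather than a single SHA-256, and never store plaintext passwords.

```python
# (ID, attribute) pairs: the ID identifies, the secret attribute authenticates.
import hashlib
import os

credentials = {}  # ID -> (salt, hash of salt + password)

def register(user_id, password):
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + password.encode()).hexdigest()
    credentials[user_id] = (salt, digest)

def authenticate(user_id, password):
    if user_id not in credentials:
        return False  # identification fails: unknown ID
    salt, digest = credentials[user_id]
    # authentication: does the claimed party know the secret attribute?
    return hashlib.sha256(salt + password.encode()).hexdigest() == digest

register("alice", "correct horse battery staple")
print(authenticate("alice", "correct horse battery staple"))  # True
print(authenticate("alice", "wrong guess"))                   # False
```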

107 / 135 A very brief introduction to security Some definitions

Case study: authentication and authorization at ATMs

ATM Bank Identification step

  • 1. Insert card

Authentication steps

  • 2. Request matching PIN
  • 3. Enter PIN

Authorization steps

  • 4. How much to withdraw?
  • 5. Request $100
  • 6. Balance≥$100?
  • 7. Approve withdrawal
  • 8. Dispense $100
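The numbered steps above can be sketched as a toy function (the account data and return messages are invented; a real ATM performs these checks over an authenticated channel to the bank):

```python
# Toy model of the ATM flow: identification, then authentication,
# then authorization, then dispensing.
accounts = {"4111": {"pin": "1234", "balance": 250}}

def withdraw(card_id, pin, amount):
    acct = accounts.get(card_id)    # 1. identification: insert card
    if acct is None:
        return "unknown card"
    if acct["pin"] != pin:          # 2-3. authentication: matching PIN?
        return "wrong PIN"
    if acct["balance"] < amount:    # 5-6. authorization: balance >= amount?
        return "insufficient funds"
    acct["balance"] -= amount       # 7-8. approve and dispense
    return "dispense $" + str(amount)

print(withdraw("4111", "1234", 100))  # dispense $100
print(withdraw("4111", "0000", 100))  # wrong PIN
```

Note that each failure mode in the slides that follow corresponds to one of these checks going wrong.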

108 / 135 A very brief introduction to security Some definitions

Authentication failure: ATM fails to authenticate user

ATM Bank Identification step

  • 1. Insert card

Authentication steps

  • 2. Request matching PIN
  • 3. Enter PIN

Authorization steps

  • 4. How much to withdraw?
  • 5. Request $100
  • 6. Balance≥$100?
  • 7. Approve withdrawal
  • 8. Dispense $100

Attack: Mallory guesses the PIN

109 / 135


A very brief introduction to security Some definitions

Card skimmers: ATM incorrectly authenticates user

Source: http://krebsonsecurity.com/all-about-skimmers/ 110 / 135 A very brief introduction to security Some definitions

Authentication failure: User fails to authenticate ATM

ATM Bank Identification step

  • 1. Insert card

Authentication steps

  • 2. Request matching PIN
  • 3. Enter PIN

Authorization steps

  • 4. How much to withdraw?
  • 5. Request $100
  • 6. Balance≥$100?
  • 7. Approve withdrawal
  • 8. Dispense $100

Attack: Mallory presents a fake ATM to the user

111 / 135 A very brief introduction to security Some definitions

Fake ATMs: User fails to authenticate ATM

Source: http://www.wired.com/threatlevel/2009/08/malicious-atm-catches-hackers/ 112 / 135 A very brief introduction to security Some definitions

Question: how does authentication fail on phishing websites?

113 / 135


A very brief introduction to security Computer systems and networks

Four fundamental ideas of computer architecture

1. Code is data
2. Layers of abstraction
3. Moore’s law
4. The halting problem

114 / 135 A very brief introduction to security Computer systems and networks

The von Neumann computer architecture

The pervasive von Neumann computer architecture does not distinguish between the instructions of computer programs and data. Consequently, code is data.

⇒ Enables great flexibility in reprogramming computers
⇒ Programs can be costlessly reproduced, not just data

There are unfortunate security implications
(Pictured: John von Neumann)
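The point can be demonstrated in miniature: a Python string (data) is compiled and executed as code. This is a toy illustration only; executing untrusted data is precisely the unfortunate security implication.

```python
# "Code is data": a string is just bytes in memory...
source = "result = 2 + 2"
namespace = {}
# ...until the machine is told to treat those bytes as instructions.
exec(compile(source, "<data>", "exec"), namespace)
print(namespace["result"])  # 4
```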

115 / 135 A very brief introduction to security Computer systems and networks

The dark side of “Code is Data”

Source: http://www.cl.cam.ac.uk/~rja14/Papers/SE-04.pdf 116 / 135 A very brief introduction to security Computer systems and networks

Layers of abstraction

Abstraction: specifying the meaning and behavior of software while hiding implementation details. Modular code exploits abstraction and enables composition and reuse. Abstraction and code modularity enable rapid software development (which has in turn led to a rapid rise in software complexity). Unlike in mechanical engineering, in software engineering there is no practical limit to the potential combinations of code.

117 / 135


A very brief introduction to security Computer systems and networks

Abstraction solves everything?

“All problems in computer science can be solved by another level of indirection.” — David Wheeler

118 / 135 A very brief introduction to security Computer systems and networks

Layered computer architecture

Hardware: Intel x86
Operating system: Microsoft Windows
Libraries: Gecko, NSPR, OJI, . . .
Application: Mozilla Firefox
Active content: Facebook

119 / 135 A very brief introduction to security Computer systems and networks

Layers – good or bad?

+ Abstraction enables greater compatibility, since a higher layer interacts only with the next layer down
+ The layered approach means that developers can ignore problems already solved at other layers

  • Higher layers cannot identify or prevent malfunctions at lower layers
  • Vulnerabilities propagate up the stack
  • Flaws in a single layer can affect all software developed on top

⇒ Think back to when Windows was riddled with holes

Question: at what layer can a strategic attacker wreak the most havoc at least cost?

120 / 135 A very brief introduction to security Computer systems and networks

Abstraction solves everything?

“All problems in computer science can be solved by another level of indirection”, except security problems.

121 / 135


A very brief introduction to security Computer systems and networks

Moore’s law

Intel co-founder Gordon Moore noticed in 1965 that integrated circuit density had been doubling since the 1950s, and he predicted the trend would continue. Moore’s Law: computer performance roughly doubles every 18 months.
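As stated, the law implies exponential growth; a one-liner makes the arithmetic explicit (assuming the 18-month doubling period quoted above):

```python
# Performance multiplier after `years` years, doubling every 18 months.
def moore_factor(years, doubling_months=18):
    return 2 ** (years * 12 / doubling_months)

print(moore_factor(15))  # 2^10 = 1024: roughly a thousandfold in 15 years
```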

Figure from Moore’s original paper speculating on the implications of exponential growth in computing power 122 / 135 A very brief introduction to security Computer systems and networks

The halting problem

In 1936, Alan Turing proved that it is impossible to write a general-purpose program that can determine whether another program will halt. Bear this in mind the next time someone complains that software developers should be able to find and remove all vulnerabilities in their code.
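Turing’s diagonal argument can be sketched in code: assume a general `halts()` oracle exists and derive a contradiction. The functions below are hypothetical stubs for illustration only.

```python
# Sketch of the halting-problem argument.
def halts(program, data):
    # Turing (1936): no general-purpose implementation of this can exist.
    raise NotImplementedError("undecidable in general")

def paradox(program):
    # If halts() worked, paradox(paradox) would halt iff it does not halt:
    if halts(program, program):
        while True:   # oracle says "halts" -> loop forever
            pass
    # oracle says "loops forever" -> halt immediately
```

Since `paradox(paradox)` contradicts whatever `halts` answers about it, no correct general `halts` can be written.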

123 / 135 A very brief introduction to security Computer systems and networks

Information security overview

Protection goals: Confidentiality, Integrity, Availability

1. Engineer defenses to satisfy the goals
2. Security threats
3. Countering security threats
  • 3. Countering security threats

124 / 135 A very brief introduction to security Computer systems and networks

Threat models

All security is relative, but relative to what?

⇒ Threat models codify assumed adversary behavior

A threat model articulates the assumed adversary’s:

1. Goal: disrupt the defender’s protection goals, make money, wreak havoc

2. Knowledge: does the attacker know how the defense works?

3. Capabilities: computational power available, time available to target defenders, local vs. global eavesdropping, active vs. passive
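One way to make “codified assumptions” concrete is to write them down as a data structure (an illustrative sketch; the field names and example values are invented):

```python
# A threat model as an explicit record of assumptions about the adversary.
from dataclasses import dataclass

@dataclass
class ThreatModel:
    goal: str            # e.g. "make money" or "wreak havoc"
    knows_defense: bool  # per Kerckhoffs, usually assumed True
    compute: str         # e.g. "commodity botnet" vs. "nation-state"
    eavesdropping: str   # "local" or "global"
    active: bool         # active attacker vs. passive eavesdropper

spammer = ThreatModel(goal="make money", knows_defense=True,
                      compute="commodity botnet",
                      eavesdropping="local", active=True)
print(spammer.knows_defense)  # True
```

Writing the assumptions down this explicitly makes it easier to ask which ones an attacker might simply bypass.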

Question: could a threat model be fully specified by assuming a certain level of financial resources available to the adversary?

125 / 135


A very brief introduction to security Security threats

Security threats: assumptions gone awry

System vulnerabilities: violate engineering assumptions
Cryptanalysis: violates physical or mathematical assumptions
People just don’t behave as designers expect:

Violated assumptions about attacker behavior
Violated assumptions about defender behavior

126 / 135 A very brief introduction to security Security threats

Cryptanalysis

The goal of cryptanalysis: descramble ciphertext without knowing the decryption key. Simplest approach: brute force.

A key of length ℓ bits ⇒ 2^(ℓ−1) expected guesses. For AES-128, ℓ = 128, so brute force requires about 2^127 attempts (100 times a trillion times a trillion times a trillion)
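The arithmetic is easy to sanity-check (Python integers are arbitrary precision, so 2^127 is exact):

```python
# Expected brute-force effort for AES-128: half the keyspace.
key_bits = 128
guesses = 2 ** (key_bits - 1)
print(guesses)
# 170141183460469231731687303715884105728 -- 39 digits, i.e. ~1.7 * 10^38,
# matching "100 times a trillion times a trillion times a trillion" (10^38).
```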

Cryptanalysts look for shortcuts (so that only 2^k guesses are required, where k < ℓ). What do the shortcuts look like?

Mathematical assumptions can fail; look for patterns in ciphertext (i.e., loss of randomness)
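A toy illustration of such a pattern: a Caesar shift (a deliberately weak cipher, used here only for illustration) preserves letter frequencies, so its ciphertext is far from random and the most common ciphertext letter points back to a common plaintext letter.

```python
# Letter frequencies survive a simple substitution, leaking information.
from collections import Counter

def caesar(text, shift):
    # shift each uppercase letter by `shift` positions, wrapping at Z
    return "".join(chr((ord(c) - 65 + shift) % 26 + 65) for c in text)

plaintext = "ATTACKATDAWN"
ciphertext = caesar(plaintext, 3)
print(ciphertext)  # DWWDFNDWGDZQ

most_common = Counter(ciphertext).most_common(1)[0][0]
print(most_common)  # 'D' -- the image of the most frequent plaintext 'A'
```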

127 / 135 A very brief introduction to security Security threats

Distribution of letters in English

128 / 135 A very brief introduction to security Security threats

Kerckhoffs’ Principle

Cryptographic algorithms must be public. Security depends only on the secrecy of the keys.

Rationales:
– avoid blind trust
– more eyes find more flaws
– err on the side of caution
– changing keys is easier than changing the system
– the only reasonable assumption when protecting a public infrastructure

→ No security by obscurity

129 / 135


A very brief introduction to security Security threats

What can happen if you ignore Kerckhoffs

Source: http://www.schneier.com/blog/archives/2008/08/hacking_mifare.html 130 / 135 A very brief introduction to security Security threats

Do cryptanalysts have the right threat model?

Adi Shamir: “Cryptography is usually bypassed. I am not aware of any major world-class security system employing cryptography in which the hackers penetrated the system by actually going through the cryptanalysis. [. . . ] Usually there are much simpler ways of penetrating the security system.”

131 / 135 A very brief introduction to security Security threats

Most attackers bypass the threat model

This shouldn’t be surprising: a well-engineered system is designed so that the attacks its designers planned for are hard to carry out. Threat models can go wrong in two ways:

1. Ascribing too much power to an attacker, or focusing too much on a particular mode of attack

⇒ Leads to “over-engineering” and over-investment in defenses against certain threats

2. Missing attacks by not accounting for attacker behaviors and capabilities

132 / 135 A very brief introduction to security Security threats

The threat model adopted by cryptanalysts fails on both counts

It often overestimates attacker capability (focus on nation-states as the adversary). The singular focus on decrypting ciphertext without access to the encryption key ignores how most attacks take place: it is much cheaper for an attacker to find a way to recover the key.

133 / 135


A very brief introduction to security Security threats

Users don’t always behave as system designers envision

Many successful attacks trick users into sharing keys and passwords. Systems security is predicated on users taking only actions that are in their own interest.

⇒ This assumption makes the system designers’ job tractable
⇒ But it offloads the hard decision of whether to allow untrusted software to execute onto the end user

134 / 135 A very brief introduction to security Security threats

Prompts condition users to ignore security warnings

135 / 135 A very brief introduction to security Security threats

Countering security threats

When weaknesses are discovered in security defenses, defenders have two choices

1. Make fundamental changes to defenses
2. Counter the attacks directly

The first approach could lead to improved security in the long run, but it is slow. The second approach is reactive, but more responsive:

1. Ex post countermeasures: counter attacks and flaws after they are encountered
2. Ex ante countermeasures: counter flaws before an attack is realized

136 / 135
