SLIDE 1

Insider Problem and Elections

Matt Bishop
Computer Security Lab
Dept. of Computer Science
University of California, Davis

Matt Bishop, February 18, 2016 Slide #1

SLIDE 2

Opening Thought

  • There’s no sense in being precise when you don’t even know what you’re talking about. — John von Neumann

SLIDE 3

What is an Insider?

  • Government intelligence analyst
    – By day: analyzes information obtained from monitoring electronic signals to determine what the adversary is up to
    – By night: provides this information to the adversary so they know what the analyst’s government is being told

SLIDE 4

Legendary Example

  • Greeks wanted to get inside Troy
  • They built a wooden horse and put soldiers inside
  • The Trojans pulled the horse into the city
  • At night, the Greeks inside the horse got out and opened the gates
  • The Greeks entered the city (and sacked it)

SLIDE 5

Real-Life Examples

  • In World War II, the Abwehr sent spies to England
    – Germany fed them information about other spies
    – The British had captured all of them, turned many of them, and so got the information
  • But the Soviets had penetrated British counter-intelligence
    – Kim Philby was a high-ranking British official
    – He was also a Soviet spy

SLIDE 6

University of California, San Francisco

  • A transcriber in Pakistan said she would post patient records on the Internet unless UCSF would help her get money owed from “Tom”, the subcontractor who hired her
  • Tom subcontracted by Sonya
  • Sonya subcontracted by Transcription Stat
  • Transcription Stat contracted with UCSF (for past 20 years!)

SLIDE 7

Defining the Insider

  • “an already trusted person with access to sensitive information and information systems”
    – Understanding the Insider Threat, RAND (2004), xi
  • “anyone with access, privileges, or knowledge of information systems and services”
    – Same report, 10
  • Anyone operating inside the security perimeter
    – New Incident Response Best Practices: Patch and Process is No Longer Acceptable Incident Response, Guidance Software (2003)

SLIDE 8

More Definitions

  • “legitimate users who maliciously leverage their system privileges, and familiarity and proximity to their computational environment to compromise valuable information or inflict damage”
    – “Towards a Theory of Insider Threat Assessment”, Proc. 2008 Intl. Conf. Dependable Systems and Networks (2008)
  • “a person with legitimate access to an organization’s computers and networks”
    – “Insiders Behaving Badly: Addressing Bad Actors and Their Actions”, IEEE Trans. Infor. Forensics and Security (2010)
  • “users with privileged knowledge about a system”
    – “Designing Host and Network Sensors to Mitigate the Insider Threat”, IEEE Security & Privacy (2009)

SLIDE 9

Still More Definitions

  • “Insider attacks—that is, attacks by users with privileged knowledge about a system”
    – “Designing Host and Network Sensors to Mitigate the Insider Threat,” IEEE Security & Privacy (2009)
  • “legitimate users in an IT Infrastructure”
    – “Towards an Insider Threat Prediction Specification Language,” Information Management & Computer Security (2006)

SLIDE 10

And Still More Definitions

  • “a person [who] has been legitimately empowered with the right to access, represent, or decide about one or more assets of the organization’s structure”
    – “Countering Insider Threats”, Schloss Dagstuhl (2008)
  • “a human entity that has/had access to the information system of an organization and does not comply with the security policy of the organization”
    – “An Insider Threat Prediction Model”, Proc. 7th Intl. Conf. Trust, Privacy, and Security (2010)

SLIDE 11

And a Final Definition

  • “A current or former employee, contractor, or business partner who
    – has or had authorized access to an organization’s network, system, or data and
    – intentionally exceeded or misused that access in a manner that negatively affected the confidentiality, integrity, or availability of the organization’s information or information systems”
  • Common Sense Guide to Prevention and Detection of Insider Threats, 3rd Edition — Version 3.1 (2009)

SLIDE 12

Perimeters

[Diagram: the organization, behind a firewall, separates insiders from outsiders on the Internet; mobile devices appear on both sides of the perimeter.]

SLIDE 13

Problems

  • How well defined is your perimeter?
    – Mobile computing, especially BYOD
    – Virtual private networks
    – Remote sites
    – Unknown modems, etc.
  • How does physical access play into this?
    – Authorized users
    – Others, such as janitors

SLIDE 14

Supply Chain Problem

  • Someone sells you a program to solve a problem your company has
  • When you use it, it copies data from your computer (including medical records) to a server on the Internet
  • So . . . are you an insider?
  • And . . . is the person who wrote it an insider?
  • And . . . is the person who sold it an insider?

SLIDE 15

Example: Voting Machines

  • Rumor: the son of a U.S. presidential candidate ran an investment firm that was rumored to have a stake in Hart InterCivic, a maker of e-voting systems used in Ohio, USA
    – No evidence this is true; but suppose it is
  • The implication is that someone in the company, on instructions from a stakeholder, could corrupt the e-voting system to deliver votes as desired
    – A supply chain attack, as the person is not an election official; so, is that person an insider? Is the stakeholder?

SLIDE 16

Not Just Computer Scientists

  • Insider trading
    – In U.S. law, defined by the agency that regulates the stock exchanges (the Securities and Exchange Commission)
    – Extensively litigated over the years
    – Considerable grey area makes it difficult to know whether a particular transaction is legal or not

SLIDE 17

Common Notions in Definitions

  • Access
    – Without access, nothing can happen
    – Access can be direct or indirect
  • Hunker and Probst list 3 other categories of attributes of insiders
    – Knowledge
    – Ability to represent something
    – Trust by the organization
  • All require some form of (direct or indirect) access
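The direct/indirect distinction can be made concrete: indirect access is reachability in an access graph. A minimal sketch, with hypothetical entities and edges:

```python
# Hypothetical access graph: an edge A -> B means A has direct access to B.
# Indirect access is then reachability: A can reach an asset through a
# chain of intermediaries (e.g., a janitor can reach the backup tapes via
# physical access to the server room).
from collections import deque

direct = {
    "janitor":     ["server room"],
    "server room": ["backup tapes"],
    "analyst":     ["signals database"],
}

def can_access(who, asset):
    """True if `who` has direct or indirect access to `asset`."""
    seen, frontier = {who}, deque([who])
    while frontier:
        node = frontier.popleft()
        for nxt in direct.get(node, []):
            if nxt == asset:
                return True
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False

print(can_access("janitor", "backup tapes"))   # True (indirect, via server room)
print(can_access("analyst", "backup tapes"))   # False
```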

SLIDE 18

Back to Basics (For a Moment)

  • Security policy defines security
  • Security mechanisms enforce security
  • The mechanisms are imprecise
    – Do not enforce the security policy precisely
    – Jones and Lipton result: no generic procedure for developing security mechanisms that are both secure and precise (except trivial cases)

SLIDE 19

Policies

  • Idea: define policies that minimize access
    – Principles of least privilege, fail-safe defaults
    – Look for inconsistencies that could enable violation of the security policy

SLIDE 20

Critical Assumption

  • The stated policy is complete, precise, and correct
    – Reality: this is rarely (if ever) true
    – Indeed, you may not know the policy you need!

SLIDE 21

Analyzing It More

  • Apply the notion of “layers of abstraction” to the security policy
  • Examine the discrepancies between different layers
  • Can integrate intention into these layers

SLIDE 22

Issues

  • Feasibility
    – Computer systems understand accounts, not people
    – Computer systems understand actions, not intentions

SLIDE 23

Example

  • Policy: Alice is authorized to read medical records for the purpose of transcription
  • Implementation: account alice is authorized read access to files labeled “medical records”
  • Gaps
    – Anyone with access to account alice can read files labeled “medical records”
    – Account alice can read medical records and then do anything with that data (including posting the data to the Web)
    – Account alice can read any file labeled “medical record”, whether it is a medical record or not
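The gaps can be seen directly in code. A minimal sketch of the implementation-level check, using hypothetical file names; the check sees only an account and a label, so none of the three gaps is closed:

```python
# Hypothetical files and labels. One file is mislabeled: it carries the
# "medical records" label but is not a real medical record (gap #3).
labels = {
    "patient_123.txt": "medical records",
    "notes_fake.txt":  "medical records",   # mislabeled, not a real record
}
# Authorization is account-level, not person-level, and says nothing
# about purpose (transcription) or later use of the data (gaps #1, #2).
authorized = {("alice", "medical records")}

def may_read(account, filename):
    """Implementation: account may read file iff (account, label) is authorized."""
    return (account, labels.get(filename)) in authorized

print(may_read("alice", "patient_123.txt"))  # True
print(may_read("alice", "notes_fake.txt"))   # True: label, not content, is checked
print(may_read("bob", "patient_123.txt"))    # False
```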

SLIDE 24

Unifying Policy Hierarchy

  • Ideal policy: what you want, even if not stated explicitly
  • Feasible policy: what you can actually implement on an actual system
  • Configured policy: what you actually configure your system to enforce
  • Run-time policy: what the system actually enforces (including what vulnerabilities enable)
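One way to make the hierarchy concrete is to treat each level as a set of (subject, object, right) triples, so a gap between adjacent levels is just a set difference. A minimal sketch with hypothetical entries:

```python
# Each policy level as a set of (subject, object, right) triples.
# The ideal level speaks of people and purposes; the lower levels can
# only speak of accounts and actions, which is itself an Ideal/Feasible gap.
ideal      = {("Alice", "medrec", "read-for-transcription")}
feasible   = {("alice", "medrec", "read")}
configured = {("alice", "medrec", "read")}
runtime    = {("alice", "medrec", "read"),
              ("mallory", "medrec", "read")}   # access added by a vulnerability

# Extra access that exists only at run time: the Configured/Run-Time gap.
print(runtime - configured)   # {('mallory', 'medrec', 'read')}
```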

SLIDE 25

Simple Example

  • Bob is surfing the web using a browser vulnerable to a remote exploit
  • He accidentally surfs to a site with an attack that exploits it
    – If the attacker, Alice, gets access to Bob’s account, she’s in the Configured/Run-Time gap
  • He deliberately surfs to a site with an attack that exploits it; claims “oops, I didn’t know!”
    – Now Bob gives access to Alice, so he’s in the Ideal/Feasible gap

SLIDE 26

The Threats

  • Someone has more access at a lower policy level than at a higher policy level
  • Someone has less access at a lower policy level than at a higher policy level
  • This provides a policy-based definition of “insider” – an entity that falls into these gaps

SLIDE 27

Analyzing Access

  • Problem: how do we take physical (as opposed to logical) access into account?
  • Informal approach: figure out who has access by looking at the environment
  • More rigorous approach: model the procedures, or process, that uses the compromised artifacts, and determine the actors affecting the process

SLIDE 28

How to Analyze a Process for Access

  • Deal with insiders in the context of a process
    – Not so much based upon how a computer system works, except as that is a process
  • Processes defined rigorously, allowing us to reason about them
    – Identify potential points of failure
      • These are potential vulnerabilities to attacks
    – Look at agents (insiders) that can cause failure

SLIDE 29

Example: Election Process

  • Use the Yolo County, CA, USA election process as an example
  • Model it using the Little-JIL process modeling language
    – Hierarchical (tree) structure of steps
    – How a leaf step is executed is up to the step’s agent
    – How an intermediate step is executed is defined by its substeps and a sequence badge indicating the order of substeps
SLIDE 30

Before Election Day

  • Qualified people register to vote
    – Need to present proof of residence
    – Nature of proof varies
  • Election officials create, print ballots
    – Each jurisdiction has its own rules for ballot formatting, languages
    – Often need to use special paper
  • Election officials choose locations for polling stations

SLIDE 31

Before Election Day

  • Election officials train poll workers, inspectors, judges
  • They then distribute the election materials that will be used at the polling stations
    – How and when this is done varies widely

SLIDE 32

On Election Day(s)

  • Voter goes to polling station, authenticates
    – Manner of authentication varies among jurisdictions
  • Voter is given ballot
    – Physical or electronic
  • Voter enters booth, votes
    – May be done electronically
    – If on paper, ballot placed in protective sleeve
  • Voter leaves booth
    – If paper ballot, drops it into ballot box
    – “Box” may be a scanner that records ballot markings

SLIDE 33

At End of Election Day

  • Election officials take voting instruments back to Election Central
    – Includes cards from e-voting systems
  • Election officials remove ballots from ballot box
  • Ballots run through automatic counters
    – Results go onto computer to tally votes
    – Counters enable humans to intervene if ballot markings are ambiguous (overvotes, undervotes, stray markings)

SLIDE 34

After Election Day

  • Canvass begins to assure correctness of election results
    – Procedures vary, but usually involve some kind of partial recount that is then compared to tallies from the automatic counter
  • Results transmitted to chief election official for certification
    – In California, this is the Secretary of State

SLIDE 35

Election Process in Little-JIL

  • Graphical process definition language with formal semantics; process represented as a hierarchical decomposition of steps

[Diagram: top-level step "conduct election" decomposes into "pre-polling activities", "prepare for and conduct election at precinct" (one per precinct), and "count votes"; a Vote Count Inconsistent Exception triggers "do recount".]

SLIDE 36

Our Focus: Count Votes

  1. Initialize counts
  2. Count votes from all precincts
     – Count each precinct independently
  3. Perform random audit
  4. Report final vote totals to Secretary of State
  5. Securely store election artifacts
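The steps above can be sketched as a small program. The ballot data, function names, and sampling are hypothetical; the exception name mirrors the process model’s VoteCountInconsistentException:

```python
# Sketch of the "count votes" subprocess: initialize counts, count each
# precinct independently, audit a randomly chosen precinct by hand
# recount, and report the totals. Data is hypothetical.
import random

class VoteCountInconsistentException(Exception):
    pass

precinct_ballots = {            # ground truth paper ballots, per precinct
    "P1": ["yes", "no", "yes"],
    "P2": ["no", "no"],
}

def count_precinct(ballots):
    """Tally one precinct's ballots."""
    tally = {}
    for b in ballots:
        tally[b] = tally.get(b, 0) + 1
    return tally

def count_votes(scanner_tallies):
    totals = {}                                    # 1. initialize counts
    for p in precinct_ballots:                     # 2. count each precinct
        for choice, n in scanner_tallies[p].items():
            totals[choice] = totals.get(choice, 0) + n
    audited = random.choice(list(precinct_ballots))  # 3. random audit
    if count_precinct(precinct_ballots[audited]) != scanner_tallies[audited]:
        raise VoteCountInconsistentException(audited)
    return totals                                  # 4. report final totals

scans = {p: count_precinct(b) for p, b in precinct_ballots.items()}
print(count_votes(scans))   # {'yes': 2, 'no': 3}
```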

SLIDE 37

Who Are the Insiders?

  • Voters
    – Ideal policy: intended vote
    – Feasible policy: computer captures intended vote
      • May mismark it, or record it incorrectly
  • Election officials manage the counting
    – Intended policy: count the votes properly
    – Feasible policy: what each step of the counting should do
    – Run-time policy: the artifacts given to and produced by each step, and what each step actually does

SLIDE 38

Subprocess “count votes”

[Diagram: Little-JIL decomposition of count votes (1) into initialize counts (2); count votes from all precincts (3), done separately per precinct; perform random audit (4); report final vote totals to Secretary of State (5); securely store election artifacts (6); recount votes (7); perform ballot and vote count (8); report intermediate vote totals to Secretary of State (9); select precincts for 1% mandatory manual audit (10); manually count votes (11); confirm audit tallies are consistent (12); perform reconciliations (13); scan votes (14); confirm tallies match (15); add vote count to vote total (16); handle discrepancy (17); reconcile voting roll and cover sheet (18); reconcile total ballots and counted ballots (19); rescan (20); manually count votes (21). Several steps may throw VoteCountInconsistentException, and the process continues after handling it; inputs include votingRoll, coverSheet, repository, and paperTrail.]
SLIDE 39

Artifacts and Agents

  (ref #) step                                     | input artifacts                                | output artifacts | agent
  (2)  initialize counts                           |                                                | totalTallies     | ElectionOfficial
  (13) perform reconciliations                     | coverSheet; paperTrail; repository; votingRoll |                  | ElectionOfficial
  (18) reconcile voting roll and cover sheet       | coverSheet; votingRoll                         |                  | ElectionOfficial
  (19) reconcile total ballots and counted ballots | coverSheet; paperTrail; repository             |                  | ElectionOfficial
  (39) check off voter as voted                    | votingRoll                                     | timeStamp        | ElectionOfficial
  (44) put ballot in repository                    | repository                                     | timeStamp        | ElectionOfficial

SLIDE 40

Insiders and Attacks

  • Insider: step agent who has access to any process artifact that can affect the outcome of the process and for which general access is restricted
  • Sabotage attack insider: agent that can change the value of artifacts used in the computation of any of the final outputs of the process
  • Data exfiltration attack insider: agent that has access to any artifacts needed to synthesize an artifact whose access should be restricted

SLIDE 41

Identifying Threats of Insider Sabotage Attack

  • Identify a hazard as the delivery of an incorrect artifact to a step in the process that delivers the artifact as a final process output
  • From the process definition, automatically generate a fault tree showing how the hazard can occur
SLIDE 42

[Fault tree diagram. Selected hazard: final tallies to the Secretary of State are wrong.]

SLIDE 43

Example

  • Hazard: wrong finalTallies delivered to the step report final vote totals to Secretary of State
    – Meaning the reported election results are wrong
  • Automatically generate fault tree
  • Use fault tree analysis tool to calculate minimal cut sets (MCSs)
    – Look for sets of activities where all agents are insiders and can modify the final output (finalTallies) or an artifact used to create the final output
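Minimal cut set calculation can be sketched for a toy AND/OR fault tree. The tree below is a hypothetical simplification, not the tool-generated tree from the slides:

```python
# Minimal cut set computation for a small AND/OR fault tree. A gate is
# ("AND", [children]) or ("OR", [children]); leaves are basic events.
from itertools import product

def cut_sets(node):
    """Return a list of cut sets (frozensets of basic events) for node."""
    if isinstance(node, str):                       # basic event (leaf)
        return [frozenset([node])]
    gate, children = node
    child_sets = [cut_sets(c) for c in children]
    if gate == "OR":                                # any child's cut set suffices
        return [cs for sets in child_sets for cs in sets]
    # AND: pick one cut set from each child and union them
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimal(sets):
    """Keep only cut sets that contain no smaller cut set."""
    return [s for s in sets if not any(t < s for t in sets)]

# Hypothetical tree for the hazard "wrong finalTallies": the tallies are
# corrupted (scan or rescan wrong) AND both checks fail to fire.
tree = ("AND",
        [("OR", ["scan votes produces wrong tallies",
                 "rescan produces wrong tallies"]),
         "confirm tallies match throws no exception",
         "perform random audit throws no exception"])

for mcs in minimal(cut_sets(tree)):
    print(sorted(mcs))
```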

SLIDE 44

12 MCSs; Example Results

  • MCS 1
    – Step rescan produces wrong artifact tallies
    – Step perform random audit does not throw exception VoteCountInconsistentException
  • MCS 2
    – Step recount votes produces wrong artifact recountedVoteTotals
  • MCS 3
    – Step scan votes produces wrong artifact tallies
    – Step confirm tallies match does not throw exception VoteCountInconsistentException
    – Step perform random audit does not throw exception VoteCountInconsistentException

SLIDE 45

Data Exfiltration Attack

  • In the election context, associating a specific voter with a specific ballot
    – Done in Ohio, USA by correlating time-stamped ballots with poll books listing times
  • For expository purposes, voters vote on an electronic voting machine that time-stamps the paper record of the ballot
    – In Yolo, almost everyone uses paper, which is never time-stamped

SLIDE 46

Excerpt of “conduct election” process

[Diagram: conduct election (30) decomposes into pre-polling activities (31), prepare for and conduct election at precinct (32, for each precinct), and count votes (33). The precinct step includes pre-polling check (34), authenticate and vote (35, for each voter), add unused ballots to repository (36), and count ballots and reconcile (37). Authenticate and vote includes perform pre-vote authentication (38, may throw VoterIneligibleException; continues after handling it), check off voter as voted (39, by adding a timestamp next to the voter’s name in the voting roll), issue regular ballot (40) or issue provisional ballot (42), record voter preference (41), fill out ballot (43), put ballot in repository (44, a timestamp is added on the ballot when it is put in the repository), and handle spoiled ballot (45).]

SLIDE 47

Analysis

  • If the process is executed as specified, only the voter should know how she voted
  • But . . .
    – Step 39: add timestamp next to name in roll
    – Step 44: add timestamp to ballot when placed in repository
  • When can these be combined?
    – Artifacts are votingRoll (step 39), repository (step 44)
  • Look in the process model for a step, or sibling steps, using these artifacts
    – Steps 18, 19 here; parent is step 13, requiring both
    – ElectionOfficial is agent
    – So ElectionOfficial can exfiltrate data
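The combination check can be sketched as a coverage test over a step/agent/artifact table (hypothetical, loosely based on slide 39): an agent can exfiltrate if the artifacts reachable across its steps cover everything needed to synthesize the restricted association:

```python
# Hypothetical access table: (step number, agent, artifacts the step uses).
# The voter-ballot association needs both votingRoll (timestamps next to
# names) and repository (timestamps on ballots).
access = [
    (18, "ElectionOfficial", {"coverSheet", "votingRoll"}),
    (19, "ElectionOfficial", {"coverSheet", "paperTrail", "repository"}),
    (43, "Voter",            {"ballot"}),
]

def exfiltration_insiders(needed):
    """Agents whose combined artifact access covers all `needed` artifacts."""
    reach = {}
    for _, agent, artifacts in access:
        reach.setdefault(agent, set()).update(artifacts)
    return {a for a, arts in reach.items() if needed <= arts}

# ElectionOfficial touches votingRoll (step 18) and repository (step 19),
# so the official can correlate the two timestamp streams.
print(exfiltration_insiders({"votingRoll", "repository"}))   # {'ElectionOfficial'}
```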

SLIDE 48

Evaluation

  • Subjective
    – Process model validated by domain experts
    – Domain experts are better able to identify the most worrisome types of insider attacks
  • Objective
    – Focus on effectiveness, efficiency of the process definition and analysis approaches
    – Little-JIL allows iterative process improvement based on feedback from domain experts

SLIDE 49

Limitations

  • Techniques are not always precise enough to fully describe the vulnerabilities and explain how they arise
  • Analysis does not take into account full control and data dependencies of all steps
  • Current agent descriptions are coarse
  • Need to improve analysis of types of agents assigned to steps
  • Use original analysis to suggest process modifications (automated or semi-automated)

SLIDE 50

Conclusion

  • Problem: instantiating the model
    – In particular, how do you get the ideal policy?
    – And how do you find the run-time policy?
  • Need to determine threats
  • How do you gather, analyze psychosocial information?
    – Social networks, media very useful for this
    – But one heck of an invasion of privacy!

SLIDE 51

Key Points

  • Treat attackers as a continuum, not as binary “inside” and “outside” divisions
  • Policies aren’t precise, so think of them as layers of rules
    – Very useful for separating “intent” from “what’s actually implemented” at various levels
  • Understand the entire process, not just the computer use!
    – Physical access often more important than computer access

SLIDE 52

Closing Thought

To those accustomed to the precise, structured methods of conventional system development, exploratory development techniques may seem messy, inelegant, and unsatisfying. But it’s a question of congruence: precision and flexibility may be just as dysfunctional in novel, uncertain situations as sloppiness and vacillation are in familiar, well-defined ones. Those who admire the massive, rigid bone structures of dinosaurs should remember that jellyfish still enjoy their very secure ecological niche. — Beau Sheil, “Power Tools for Programmers”

SLIDE 53

About Me

Matt Bishop
Computer Security Laboratory
Department of Computer Science
University of California at Davis
1 Shields Ave.
Davis, CA 95616-8562 USA
phone: +1 (530) 752-8060
email: bishop@ucdavis.edu
web: http://seclab.cs.ucdavis.edu/~bishop

SLIDE 54

Joint Work With …

  • Dr. Carrie Gates, Dell Research
  • Dr. Sean Peisert, Lawrence Berkeley National Laboratory and University of California at Davis
  • Prof. Lee Osterweil, Prof. Lori Clarke, Prof. George Avrunin, Bobby Simidchieva, Heather Conboy, Huang Pham, University of Massachusetts, Amherst
  • Prof. Sophie Engle, University of San Francisco
  • Dr. Sean Whalen, University of California at San Francisco