SLIDE 1

Cognitive Security for Personal Devices

Rachel Greenstadt (greenie@cs.drexel.edu) Jake Beal (jakebeal@mit.edu)

AISec October 28, 2008

SLIDE 2

I must be dancing with Jake, after all, this guy knows Jake’s private key....

SLIDE 3

Human-style authentication

Looks like Jake. Dances like Jake. Sounds like Jake.

SLIDE 4

It seems this is Mako and not, in fact, Jake. Computers could recognize other cues:

  • Typing patterns
  • Touchpad patterns
  • Use patterns
  • Camera image
  • Posture/device placement
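The first cue on this list, typing patterns, is usually captured as keystroke dynamics: how long each key is held (dwell time) and the gap between releasing one key and pressing the next (flight time). A minimal sketch of the feature extraction; the event format, the stored profile value, and the comparison threshold are illustrative assumptions, not from the talk:

```python
# Sketch: keystroke-dynamics features from (key, press_time, release_time)
# events. Event format, profile value, and threshold are assumptions.

def keystroke_features(events):
    """events: list of (key, press_t, release_t) in seconds, in typing order.
    Returns dwell times (hold duration per key) and flight times
    (gap between one key's release and the next key's press)."""
    dwells = [release - press for _, press, release in events]
    flights = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return dwells, flights

def mean(xs):
    return sum(xs) / len(xs)

# Toy usage: compare a fresh sample's mean dwell time to a stored profile.
profile_mean_dwell = 0.11  # assumed value learned from the legitimate user
sample = [("j", 0.00, 0.10), ("a", 0.15, 0.27), ("k", 0.32, 0.43), ("e", 0.50, 0.61)]
dwells, flights = keystroke_features(sample)
same_user = abs(mean(dwells) - profile_mean_dwell) < 0.05
```

A real deployment would compare full dwell/flight distributions per key pair rather than a single mean, but the feature definitions are the standard ones.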

SLIDE 5

Cognitive Security

  • Humans have rich and subtle mechanisms for handling trust and security
  • Goal: Intelligent agents mediate security decisions between users and applications
  • Build user models via continuously-deployed multi-modal behavioral biometrics
  • Use models to aid security decisions
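The "multi-modal" point can be read as score-level fusion: each behavioral classifier emits a confidence that the current user is the device's owner, and a weighted combination drives the security decision. A sketch; the modality names, weights, and threshold are invented for illustration:

```python
# Sketch: fusing per-modality behavioral-biometric scores into one
# authentication confidence. Modalities, weights, and the decision
# threshold are illustrative assumptions, not values from the talk.

def fuse_scores(scores, weights):
    """scores, weights: dicts keyed by modality name; scores in [0, 1].
    Returns the weighted-average confidence that this is the owner."""
    total_w = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total_w

WEIGHTS = {"keystrokes": 0.5, "touchpad": 0.3, "usage": 0.2}  # assumed

scores = {"keystrokes": 0.9, "touchpad": 0.7, "usage": 0.8}
confidence = fuse_scores(scores, WEIGHTS)  # 0.9*0.5 + 0.7*0.3 + 0.8*0.2 = 0.82
is_owner = confidence > 0.75
```

Because fusion is continuous rather than a one-time login check, a modality that degrades (say, the user wears gloves) only lowers the combined confidence instead of locking the user out outright.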
SLIDE 6

Mismatch Between Users and Machines: An AI and HCI Problem

  • We must use human mechanisms sometimes
  • Example: passwords to keys
  • Security automation considered harmful? [Edwards, Poole, and Stoll 2007]
  • Context-dependent security decisions
  • Can't be pre-baked in
  • Need an agent to observe the context
SLIDE 7

Machines imprint on their users and develop models of their behavior

Obviously not appropriate for all scenarios...

SLIDE 8

Architecture for Machine Integrity

  • Sensitive information requires isolation
  • Lots of research in this sort of model already
  • Overhead? (VMMs, classifiers, etc.) Perhaps...

SLIDE 9

Once computers know their users, they can infer beliefs and goals

Alice:

  • Knows she wants to visit her bank
  • Doesn't know she's not at her bank

Alice's device:

  • Knows Alice is not visiting her bank
  • Doesn't know that Alice believes she is at her bank
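The Alice example amounts to comparing the user's believed destination with the device's observed one, and intervening when they diverge. A toy sketch; the belief-inference rule (a lookup from the visible link text the user clicked) and the domain names are assumptions for illustration:

```python
# Sketch: detecting a belief/reality mismatch for anti-phishing.
# Alice's belief is inferred from the link text she clicked; the device
# observes the actual domain. The inference rule, the user-model table,
# and all domain names are illustrative assumptions.

from urllib.parse import urlparse

KNOWN_SITES = {"my bank": "bank.example.com"}  # assumed per-user model

def belief_mismatch(clicked_text, actual_url):
    """Return the believed domain if it conflicts with the actual one,
    else None."""
    believed = KNOWN_SITES.get(clicked_text.lower())
    actual = urlparse(actual_url).hostname
    if believed is not None and believed != actual:
        return believed  # Alice believes she's here, but she isn't
    return None

# Alice clicks "My Bank" but lands on a look-alike domain:
warn = belief_mismatch("My Bank", "http://bank-example.com.evil.example/login")
# warn holds the believed domain, so the agent should warn or escalate
```

The point of the slide is exactly this asymmetry of knowledge: neither Alice nor the device alone can see the mismatch, but an agent holding both models can.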

SLIDE 10

Adjustably Autonomous Security

  • Model users’ beliefs, desires, intentions
  • Understand concepts
  • private information
  • expected program behavior
  • simulate users’ judgment
  • pass decisions up when appropriate
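"Pass decisions up when appropriate" suggests confidence-gated autonomy: the agent acts on its simulated judgment only when confident, and otherwise defers to the user. A minimal sketch; the thresholds and action names are invented for illustration:

```python
# Sketch: adjustably autonomous security decisions. The agent estimates
# how the user would decide; uncertain cases are passed up to the user.
# Thresholds and action labels are illustrative assumptions.

def decide(confidence_allow, lo=0.2, hi=0.8):
    """confidence_allow: agent's estimated probability that the user
    would allow the action. Acts autonomously at the extremes and
    escalates in between."""
    if confidence_allow >= hi:
        return "allow"
    if confidence_allow <= lo:
        return "deny"
    return "ask_user"  # pass the decision up

# Toy usage: routine action allowed, clear violation denied,
# ambiguous case escalated to the user.
decisions = [decide(0.95), decide(0.05), decide(0.5)]
```

Widening or narrowing the (lo, hi) band is what makes the autonomy "adjustable": a cautious user shrinks the autonomous region, a trusting one grows it.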
SLIDE 11

Current work

  • Authentication
  • Keystrokes
  • Stylometry
  • Anti-phishing
SLIDE 12

Thank You

  • Questions?
  • More detail available as MIT CSAIL Tech Report 2008-016

  • http://dspace.mit.edu/handle/1721.1/40810
  • Email: greenie@cs.drexel.edu, jakebeal@mit.edu