

SLIDE 1

Hardware Security

Chester Rebeiro, IIT Madras

SLIDE 2

Physically Unclonable Functions

Physical Unclonable Functions and Applications: A Tutorial
http://ieeexplore.ieee.org/document/6823677/

SLIDE 3

Edge Devices

Thousands of them are expected to be deployed. Low power (solar or battery powered). Small footprint. Connected to sensors and actuators. Expected to operate 24x7, almost unmanned. These devices will be continuously pumping data into the system, which may influence the way cities operate. They will affect us in multiple ways, and we may not even know that they exist.

SLIDE 4

Authenticating Edge Devices

  • Stored keys

– EEPROM manufacture is an overhead
– Public key cryptography is heavy
– Stored keys can be easily copied / cloned

(Diagram: encryption done in the edge device; public keys stored in the server; private keys.)

SLIDE 5

Physically Unclonable Functions

  • No stored keys
  • No public key cryptography
  • Cannot be cloned / copied
  • Uses nano-scale variations in manufacture. No two devices are exactly identical.

(Diagram: encryption done in the edge device; public keys stored in the server; challenge / response.)

Digital Fingerprints

SLIDE 6

PUFs

A function whose output depends on the input as well as the device executing it.

SLIDE 7

What is Expected of a PUF? (Inter and Intra Differences)

(Reliable) Same challenge to the same PUF: the difference between responses must be small on expectation, irrespective of temperature, noise, aging, etc. (Unique) Same challenge to different PUFs: the difference between responses must be large on expectation, due to significant variation in manufacture.

SLIDE 8

What is Expected of a PUF? (Unpredictability)

It must be difficult to predict the output of a PUF for a randomly chosen challenge when one does not have access to the device.

SLIDE 9

Intrinsic PUFs

  • Completely within the chip

– PUF
– Measurement circuit
– Post-processing

  • No fancy processing steps!

– e.g. most silicon-based PUFs

SLIDE 10

Silicon PUFs

  • e.g. Ring Oscillator PUF

f = 1 / (2nt)

where f is the frequency of the ring oscillator, n is the number of stages, and t is the delay of each stage.

A ring oscillator is built with an odd number of inverting gates. Its frequency is affected by process variation.
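As a sanity check on f = 1/(2nt), here is a small numerical sketch; the stage count, nominal delay, and 5% Gaussian variation are illustrative assumptions, not values from the slides:

```python
import random

def ro_frequency(stage_delays):
    # f = 1 / (2 * n * t) when all stages share delay t,
    # i.e. 1 / (2 * sum of per-stage delays) in general
    return 1.0 / (2.0 * sum(stage_delays))

def make_ro(n_stages=5, nominal_delay=100.0, variation=0.05, rng=None):
    # Per-stage delays with random process variation (5% is assumed)
    rng = rng or random.Random()
    return [nominal_delay * (1.0 + rng.gauss(0.0, variation))
            for _ in range(n_stages)]

rng = random.Random(1)
chip_a = make_ro(rng=rng)  # two identically designed oscillators...
chip_b = make_ro(rng=rng)  # ...differ only through manufacturing variation
print(ro_frequency(chip_a), ro_frequency(chip_b))
```

Two nominally identical oscillators end up with slightly different frequencies, which is exactly the effect the RO PUF measures.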

SLIDE 11

Why does variation occur?

When the gate voltage is less than the threshold, no current flows. When the gate voltage is greater than the threshold, current flows from source to drain. The threshold voltage is a function of doping concentration and oxide thickness. Delay depends on capacitance.

Process variations:

  • Oxide thickness
  • Doping concentration
  • Capacitance

(Figures: MOS Transistor; CMOS Inverter)

SLIDE 12

Silicon PUFs

  • e.g. Ring Oscillator PUF

An N-bit challenge selects a pair of ring oscillators, RA and RB. Counters count their oscillations while enabled, and comparing the counts yields a 1-bit response:

response = 1 if fA > fB, and 0 if fA ≤ fB
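The challenge-selects-a-pair behaviour can be sketched as follows; the bank size, the 1% variation, and the seed standing in for a physical device are all assumptions:

```python
import random

def make_ro_bank(n_ros=16, nominal=1.0, sigma=0.01, seed=0):
    # Frequencies of a bank of ring oscillators; the seed stands in
    # for a device's physical variation (assumed 1% sigma)
    rng = random.Random(seed)
    return [nominal * (1.0 + rng.gauss(0.0, sigma)) for _ in range(n_ros)]

def respond(bank, challenge):
    # challenge = (i, j) selects a pair; response = 1 iff f_i > f_j
    fa, fb = bank[challenge[0]], bank[challenge[1]]
    return 1 if fa > fb else 0

chip = make_ro_bank(seed=42)
bits = [respond(chip, (i, i + 1)) for i in range(len(chip) - 1)]
print(bits)
```

A different seed (a different "chip") produces a different bit pattern for the same challenges.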

SLIDE 13

Results of a RO PUF

15 Xilinx Virtex 4 FPGAs; 1024 ROs in each FPGA; each RO had 5 inverter stages and 1 AND gate.

Physical Unclonable Functions for Device Authentication and Secret Key Generation
https://people.csail.mit.edu/devadas/pubs/puf-dac07.pdf

Inter-chip variations (uniqueness measurement): when 128-bit responses are produced, on average 59.1 out of 128 bits differ between chips.
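The uniqueness figure (59.1 of 128 bits, a fractional Hamming distance of roughly 0.46, close to the ideal 0.5) is an average pairwise Hamming distance across devices; a minimal sketch with made-up 4-bit responses:

```python
def hamming_distance(r1, r2):
    # Number of differing bits between two equal-length responses
    return sum(a != b for a, b in zip(r1, r2))

def uniqueness(responses):
    # Average pairwise fractional Hamming distance across devices
    # for one challenge (ideal value: 0.5)
    n = len(responses)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            total += hamming_distance(responses[i], responses[j]) / len(responses[i])
            pairs += 1
    return total / pairs

# Toy 4-bit responses from three hypothetical chips
print(uniqueness([[0, 0, 1, 1], [1, 1, 1, 1], [0, 0, 0, 0]]))
```

The same metric computed between repeated responses of one chip gives the intra-chip (reliability) figure on the next slide.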

SLIDE 14

Results of a RO PUF

15 Xilinx Virtex 4 FPGAs; 1024 ROs in each FPGA; each RO had 5 inverter stages and 1 AND gate.

Physical Unclonable Functions for Device Authentication and Secret Key Generation
https://people.csail.mit.edu/devadas/pubs/puf-dac07.pdf

Intra-chip variations (reproducibility measurement): 0.61 bits on average out of 128 bits differ when the same chip is queried under different operating conditions (120°C at 1.08V vs 20°C at 1.2V).

SLIDE 15

Arbiter PUF

Ideally, the delay difference between the red and blue lines should be 0 if they are symmetrically laid out. In practice, variation in the manufacturing process will introduce random delays between the two paths.

(Figure: switch stages)

SLIDE 16

Arbiter

A D flip-flop acts as the arbiter: if the signal at D arrives first, Q is set to 1; if the signal at clk arrives first, Q is set to 0.

SLIDE 17

Arbiter PUF

A rising edge races through the switch stages along two paths, configured by the challenge bits. The response is 1 if the top path is faster, else 0.

(Figure: 13.56 MHz chip for the ISO 14443 A spec.)

SLIDE 18

Results for RO PUF

Design and Implementation of PUF-Based "Unclonable" RFID ICs for Anti-Counterfeiting and Security Applications, IEEE Int. Conf. on RFID, 2008, S. Devadas et al.

SLIDE 19

Comparing RO and Arbiter PUF

RO PUF: the number of challenge-response pairs is C(N, 2) = N(N-1)/2, so #CRPs grows only polynomially with the number of components. WEAK PUF.

Arbiter PUF: the number of challenge-response pairs is 2^N, so #CRPs grows exponentially with the number of components. STRONG PUF.
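The two growth rates can be tabulated directly; `ro_puf_crps` and `arbiter_puf_crps` are hypothetical helper names for the C(N, 2) and 2^N counts above:

```python
from math import comb

def ro_puf_crps(n):
    # Weak PUF: a challenge picks an unordered pair out of n oscillators
    return comb(n, 2)

def arbiter_puf_crps(n):
    # Strong PUF: each of the n challenge bits doubles the number of
    # distinct delay paths
    return 2 ** n

for n in (8, 32, 128):
    print(n, ro_puf_crps(n), arbiter_puf_crps(n))
```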

SLIDE 20

Weak PUF vs Strong PUF

Weak PUF:

  • Comparatively few Challenge-Response Pairs (CRPs)
  • CRPs must be kept secret, because an attacker may be able to enumerate all possible CRPs
  • Typically used along with a cryptographic scheme (like encryption / HMAC etc.) to hide the CRP (since the CRPs must be kept secret)
  • Weak PUFs are useful for creating cryptographic keys
  • Very good inter and intra differences

Strong PUF:

  • Huge number of Challenge-Response Pairs (CRPs)
  • It is assumed that an attacker cannot enumerate all CRPs within a fixed time interval; therefore CRPs can be made public
  • Formally, an adversary given a poly-sized sample of adaptively chosen CRPs cannot predict the response to a new randomly chosen challenge
  • Does not require any cryptographic scheme, since CRPs can be public
SLIDE 21

PUF Based Authentication (with Strong PUF)

Bootstrapping: at manufacture, the server builds a database of CRPs for each device. At deployment, the server picks a random challenge from the database, queries the device, and validates the response.
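The bootstrapping and challenge-response flow can be sketched as below. The class and function names are assumptions, and the keyed hash merely stands in for a device's physical variation (a real PUF has no stored secret):

```python
import hashlib
import secrets

def make_puf(device_secret):
    # Toy stand-in for a physical PUF: deterministic per device,
    # unpredictable without access to the device
    def puf(challenge):
        h = hashlib.sha256(device_secret + challenge.to_bytes(8, "big"))
        return h.digest()[:8]
    return puf

class Server:
    def __init__(self):
        self.crp_db = {}  # device_id -> {challenge: response}

    def enroll(self, device_id, puf, n_crps=16):
        # Bootstrapping at manufacture: record CRPs while the device is trusted
        table = {}
        for _ in range(n_crps):
            c = secrets.randbits(64)
            table[c] = puf(c)
        self.crp_db[device_id] = table

    def authenticate(self, device_id, query_device):
        # Pop the CRP so it is never reused (defeats a man in the middle)
        challenge, expected = self.crp_db[device_id].popitem()
        return query_device(challenge) == expected

device = make_puf(b"physical-variation-of-device-17")
server = Server()
server.enroll("dev17", device)
print(server.authenticate("dev17", device))
```

Popping each CRP after use anticipates the man-in-the-middle issue on the next slide; the finite table it drains is the scalability problem discussed after that.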

SLIDE 22

PUF Based Authentication: Man in the Middle

A man in the middle may be able to build a database of CRPs. To prevent this, CRPs are not used more than once.

SLIDE 23

PUF Based Authentication: CRP Tables

Each device would require its own CRP table, securely stored in a trusted server. Tables must be large enough to cater to the entire lifetime of the device, or need to be recharged periodically (scalability issues).

SLIDE 24

PUF Based Authentication (Alleviating the CRP Problem): Secret Model of the PUF

The server keeps the gate delays of the PUF components. Bootstrapping: at manufacture, the server builds a database of the gate delays of each component in the PUF. At deployment, the server picks a random challenge, constructs the expected response from the secret model, queries the device, and validates the response. Still requires secure bootstrapping and secure storage.

SLIDE 25

PUF Based Authentication (Alleviating the CRP Problem)

  • PPUF: Public Model PUF

The gate delays of the PUF components are public, held by a trusted server (PKI). Bootstrapping: download the public model of the PUF from the trusted server. At deployment, the server picks a random challenge, constructs the expected response from the public model, queries the device, and validates the response. If the time taken for the response is less than a threshold (T < T0), the response is accepted; otherwise it is rejected. Assumption: a device takes much less time to compute a PUF response than an attacker who simulates a model of the PUF.
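The timing check T < T0 can be sketched as follows; the 50 ms threshold and the toy device and attacker functions are assumptions for illustration:

```python
import time

def verify(query_device, challenge, expected, threshold_s=0.05):
    # PPUF check: the response must be correct AND arrive faster than an
    # attacker could simulate the public model (the 50 ms threshold is an
    # arbitrary assumption for illustration)
    start = time.monotonic()
    response = query_device(challenge)
    elapsed = time.monotonic() - start
    return response == expected and elapsed < threshold_s

fast_device = lambda c: c ^ 1      # toy "PUF": answers essentially instantly

def simulating_attacker(c):
    time.sleep(0.1)                # model simulation is slow, by assumption
    return c ^ 1

print(verify(fast_device, 5, 4), verify(simulating_attacker, 5, 4))
```

The attacker computes the right answer but too slowly, so the server rejects it.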

SLIDE 26

PUF Based Authentication (Alleviating the CRP Problem): Homomorphic Encryption

(Diagram: encrypted CRPs stored on an untrusted cloud; challenge / response.)
SLIDE 27

Conclusions

  • Different types of PUFs are being explored

– Analog PUFs, Sensor PUFs, etc.

  • The CRP issue is still a big problem
  • Several attacks are feasible on PUFs

– Model-building attacks (SVMs)
– Tampering with the PUF computation (e.g., forcing a sine wave on the ground plane can alter the results of the PUF)

  • PUFs are a very promising way for lightweight authentication of edge devices.

SLIDE 28

Hardware Trojans

Hardware Security: Design, Threats, and Safeguards; D. Mukhopadhyay and R.S. Chakraborty

SLIDE 29

https://www.theguardian.com/technology/2012/may/29/cyber-attack-concerns-boeing-chip
https://techcrunch.com/2013/09/05/nsa-subverts-most-encryption-works-with-tech-companies-for-back-door-access-report-says/
https://www.theregister.co.uk/2013/07/29/lenovo_accused_backdoors_intel_ban/
https://www.technologyreview.com/s/519661/nsas-own-hardware-backdoors-may-still-be-a-problem-from-hell/

SLIDE 30

IC Life Cycle (Vulnerable Steps)

IP, Tools, Std. Cells, Models → Design → Specifications → Fab Interface → Mask → Fab → Wafer Probe → Dice and Package → Package Test → Deploy and Monitor

(Stages are marked Trusted, Untrusted, or Either; fabrication is offshore / third-party.)

*http://www.darpa.mil/MTO/solicitations/baa07-24/index.html

SLIDE 31

Malware in Third Party IPs

  • Third party IPs

– Can they be trusted?
– Will they contain malicious backdoors?

  • Developers don't / can't search 1000s of lines of code looking out for trojans.

SLIDE 32

FANCI: Identification of Stealthy Malicious Logic

  • FANCI evaluates hardware designs automatically to determine whether any possible backdoors are hidden in them.
  • The goal is to point testers to possible trojan locations in a huge piece of code.

http://www.cs.columbia.edu/~simha/preprint_ccs13.pdf (some of the following slides are borrowed from Waksman's CCS talk)

SLIDE 33

Hardware Trojan Structure

Trigger Circuit + Payload

Trigger circuit: based on a seldom-occurring event. For example:

  • the address on the address bus is 0xdeadbeef
  • a particularly rare packet arrives on the network
  • some time has elapsed

Payload: do something nefarious:

  • Make a page in memory (un)privileged
  • Leak information to the outside world through the network, covert channels, etc.
  • Cause the system to fail

A trojan can be inserted anywhere during the manufacturing process (e.g., in purchased third party IP cores, by the fabrication plant, etc.).

SLIDE 34

Trojan = Trigger + Payload

SLIDE 35

Trojan = Trigger + Payload

SLIDE 36

Backdoors are Stealthy

  • Small

– Typically a few lines of code / a small area

  • Stealthy

– Cannot be detected by regular testing methodologies (rare triggers)
– Passive when not triggered

SLIDE 37

Unfortunately…

With so much code, it is highly likely that stealthy portions of the code are missed or not tested properly. FANCI will detect these stealthy circuits, since these parts are the most likely to have trojans. The aim is to have no false negatives; a few false positives are acceptable.

SLIDE 38

Control Values

By how much does an input influence the output O?

(Truth table of a function O = f(A, B, C))

SLIDE 39

Control Values

By how much does an input influence the output O?

A has a control value of 0.5 on the output (A matters in this function).

(Truth table of a function O = f(A, B, C))

SLIDE 40

Control Values

By how much does an input influence the output O?

A has a control value of 0 on the output (A does not matter in this function; A is called unaffecting).

(Truth table of a function O = f(A, B, C))
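For small functions, a control value can be computed by exhaustive truth-table enumeration; the example functions below are my own illustrations, not the ones on the slides:

```python
from itertools import product

def control_value(f, n_inputs, idx):
    # Fraction of input patterns for which flipping input `idx`
    # flips the output of boolean function `f`
    flips, total = 0, 0
    for bits in product((0, 1), repeat=n_inputs):
        flipped = list(bits)
        flipped[idx] ^= 1
        if f(*bits) != f(*flipped):
            flips += 1
        total += 1
    return flips / total

# A matters here: O = A AND (B OR C)
print(control_value(lambda a, b, c: a & (b | c), 3, 0))

# A is unaffecting here: O = B XOR C
print(control_value(lambda a, b, c: b ^ c, 3, 0))

# Trigger-style comparators have tiny control values: each bit of an
# 8-bit "addr == 0xEE" trigger flips the output for only 2 of 256 patterns
trigger = lambda *bits: int(bits == (1, 1, 1, 0, 1, 1, 1, 0))
print(control_value(trigger, 8, 0))
```

The last example previews the next slide: as the comparator widens to 32 bits, the per-bit control value shrinks toward 1/2^32.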

SLIDE 41

Control Values for a Trigger in a Trojan

if (addr == 0xdeadbeef) then { trigger = 1 }

Each address bit (A31 ... A0) has a control value of 1/2^32 on the trigger: a very low chance of affecting the output. It is easier to hide a trojan when larger input sets are considered; this lends itself to stealthiness, making malicious code easier to hide.

SLIDE 42

An Example of a Mux

Control values: <A, B, C, D, S1, S2> = <0.25, 0.25, 0.25, 0.25, 0.5, 0.5>. No trojan present here (intuitively): all mux inputs have a control value around mid-range (not too close to 0).

SLIDE 43

An Example of a Malicious Mux

Extra select lines (S3 to S66) modify M only when they are set to a particular value. The control values of E and of S3 to S66 are suspicious because they rarely influence the value of M: perfect for disguising malicious backdoors. Just searching for minimum control values is often not enough; better metrics are needed.

SLIDE 44

Computing Stealth from Control

SLIDE 45

Computing Stealth from Control

SLIDE 46

FANCI: The Complete Algorithm

SLIDE 47

IC Life Cycle (The Fab)

IP, Tools, Std. Cells, Models → Design → Specifications → Fab Interface → Mask → Fab → Wafer Probe → Dice and Package → Package Test → Deploy and Monitor

(Stages are marked Trusted, Untrusted, or Either; fabrication is offshore / third-party.)

*http://www.darpa.mil/MTO/solicitations/baa07-24/index.html

SLIDE 48

Detecting Trojans in ICs

  • Optical inspection based techniques

– Scanning Optical Microscopy (SOM), Scanning Electron Microscopy (SEM), and pico-second imaging circuit analysis (PICA)
– Drawbacks: cost and time!

  • Testing techniques

– Not a very powerful technique

  • Side channel based techniques

– Non-intrusive technique
– Compare side channels with a golden model

A Survey on Hardware Trojan Detection Techniques
http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=7169073

SLIDE 49

Side Channel Based Trojan Detection

Power traces of a lightweight PRESENT implementation. Hardware trojan design and detection: a practical evaluation, https://dl.acm.org/citation.cfm?id=2527318
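Comparing the power-trace distributions of a golden (trojan-free) chip against a device under test can be sketched with Welch's t-statistic, one plausible choice of statistic for the "difference of distributions" idea on the following slides; the trace data below are simulated assumptions:

```python
import random
from statistics import mean, variance

def welch_t(golden_traces, dut_traces):
    # Welch's t-statistic between power samples of a golden chip and a
    # device under test; a large |t| indicates the distributions differ,
    # hinting at extra (trojan) switching activity
    n1, n2 = len(golden_traces), len(dut_traces)
    m1, m2 = mean(golden_traces), mean(dut_traces)
    v1, v2 = variance(golden_traces), variance(dut_traces)
    return (m1 - m2) / (v1 / n1 + v2 / n2) ** 0.5

rng = random.Random(0)
golden = [rng.gauss(1.00, 0.05) for _ in range(500)]
trojan = [rng.gauss(1.03, 0.05) for _ in range(500)]  # assumed extra leakage
print(abs(welch_t(golden, golden)), abs(welch_t(golden, trojan)))
```

A trojan-free comparison yields |t| near zero, while the extra leakage pushes |t| far from zero.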

SLIDE 50

Side Channel Based Trojan Detection (IC with Trojan)

SLIDE 51

Difference of Distributions
slide-52
SLIDE 52

Hardware Trojan PrevenRon

(If you can’t detect then prevent)

52

Silencing Hardware Backdoors www.cs.columbia.edu/~simha/preprint_oakland11.pdf Slides taken from Adam Waksman’s Oakland talk

SLIDE 53

Hardware Trojan Prevention

Ensure that a hardware trojan is never delivered its correct trigger.

SLIDE 54

Example (A 5-stage processor)

SLIDE 55

Example (A 5-stage processor)

SLIDE 56

Types of Trojans

SLIDE 57

Ticking Timebomb

SLIDE 58

Ticking Timebomb

SLIDE 59

Cheat Codes

SLIDE 60

Cheat Codes

SLIDE 61

Sequence Cheat Codes

SLIDE 62

Hardware Trojan Silencing (with Obfuscation)

SLIDE 63

Silencing Ticking Timebombs

  • Power resets: flush the pipeline, write the current IP and registers to memory, save branch history targets

SLIDE 64

Silencing Ticking Timebombs

  • Can a trigger be stored to architectural state and restored later?

– No. Unit validation tests prevent this.
– Reason for trusting the validation epoch: large validation teams, organized hierarchically

  • Can triggers be stored in non-volatile state internal to the unit?

– e.g., malware configures a hidden non-volatile memory

  • Unmaskable interrupts?

– Use a FIFO to store unmaskable interrupts

  • Performance counters are hidden timebombs

SLIDE 65

Data Obfuscation

Homomorphic encryption (Gentry 2009) is the ideal solution, but has practical hurdles.

SLIDE 66

Data Obfuscation

SLIDE 67

Data Obfuscation

Store Data 5 to Address 7

SLIDE 68

Data Obfuscation (Computational Case)

SLIDE 69

Sequence Breaking (Reordering)

Ensure functionality is maintained.

SLIDE 70

Sequence Breaking (Inserting events)

Insert arbitrary events when reordering is difficult.

SLIDE 71

Catch All (Duplication)

Expensive. Non-recurring: design and verification costs due to duplication. Recurring: power and energy costs.