SLIDE 1

A Challenge Code for Maximizing the Entropy of PUF Responses

Olivier Rioul¹, Patrick Solé¹, Sylvain Guilley¹,² and Jean-Luc Danger¹,²

¹ LTCI, CNRS, Télécom ParisTech, Université Paris-Saclay, 75013 Paris, France. Email: firstname.lastname@telecom-paristech.fr

² Secure-IC S.A.S., 15 Rue Claude Chappe, Bât. B, ZAC des Champs Blancs, 35510 Cesson-Sévigné, France. Email: firstname.lastname@secure-ic.com

SLIDE 2

Outline

Entropy of PUF: Concept, Estimation, Prevision
Running example: the Loop-PUF (LPUF): SRAM PUF example, L-PUF in detail [CDGB12]
Theory: Definitions, Results (Main result, Beyond n bits)
Conclusions

2 / 30

June 23, 2016

Sylvain Guilley A Challenge Code for Maximizing the Entropy of PUF Responses

SLIDE 4

Entropy of PUFs

(Figure: PUF 1, PUF 2, …, PUF M, drawn i.i.d.)

PUFs are instantiations of a blueprint by a fab plant.

SLIDE 6

After fabrication (estimation P̂)

(Figure: two example estimated response distributions P̂, (a) and (b), over the 2^128 possible 128-bit responses.)

Which PUF is the most entropic? Recall

H = − Σ_{c = 0x00...00}^{0xff...ff} P(R = PUF(c)) log P(R = PUF(c)).
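This plug-in estimate of H can be sketched in a few lines of Python (the `shannon_entropy` helper name is ours, purely for illustration): count how often each response occurs, then sum the empirical frequencies against their logs.

```python
import math
from collections import Counter

def shannon_entropy(responses):
    """Plug-in estimate of H = -sum_r P(r) log2 P(r) from observed responses."""
    counts = Counter(responses)
    total = len(responses)
    # log2(total/k) = -log2(k/total); this form avoids returning -0.0.
    return sum((k / total) * math.log2(total / k) for k in counts.values())

# A degenerate PUF that always returns the same response has zero entropy;
# responses uniform over four values carry 2 bits.
print(shannon_entropy([0xAB] * 100))       # 0.0
print(shannon_entropy([0, 1, 2, 3] * 25))  # 2.0
```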

SLIDE 7

Before fabrication

Stochastic model: an active discussion at ISO sub-committee 27.

SLIDE 9

Non-delay PUF: SRAM PUF

(Figure: n memory elements elt. 1, elt. 2, …, elt. n; the challenge c addresses one element, whose power-up state gives the response B_c.)

Challenge: c. Response: B_c.

Amount of entropy: n bits.

SLIDE 10

Delay PUF: core delay element

(Figure: delay element i with four elementary delays d_i^{T1}, d_i^{T2}, d_i^{B1}, d_i^{B2}, selected by the control bit c_i; the signal enters as x_i = y_{i−1} from element i − 1 and leaves as y_i = x_{i+1} toward element i + 1.)

Same idea as in other delay PUFs, like the arbiter PUF, etc.

SLIDE 11

Let d(c_i) be the corresponding delay. As time is an extensive physical quantity:

d(c_i) = d_i^{T1} + d_i^{B2} = d_i^{TB}  if c_i = −1,
d(c_i) = d_i^{B1} + d_i^{T2} = d_i^{BT}  if c_i = +1.

The delays d_i^{TB} and d_i^{BT} are modeled as i.i.d. normal random variables selected at fabrication [PDW89].

Figure: Monte-Carlo simulation (with 500 runs) of the delays in a chain of 60 basic buffers implemented in a 55 nm CMOS technology.

SLIDE 12

Delay PUF: Loop PUF

(Figure: an oscillator built from N controllable delay elements with per-element delays d_{1,i}, d_{0,i} selected by challenge bits C_1, …, C_N; a frequency measurement yields the ID.)

Amount of entropy: > n? Nota bene: here, d(c) is expressed in number of clock cycles.

SLIDE 13

LPUF is not self-contained

It needs a protocol.

Loop-PUF: n i.i.d. normal random variables Δ_i. Challenge c ∈ {±1}^n. Response B_c ∈ {±1}, with B_c = sign(Σ_{i=1}^n c_i Δ_i).

Input: challenge c. Output: response B_c.
1. Set challenge c
2. Measure d1 ← ⌊N Σ_{i=1}^n d(c_i)⌋
3. Set challenge −c
4. Measure d2 ← ⌊N Σ_{i=1}^n d(−c_i)⌋
5. Return B_c = sign(d1 − d2)

Algorithm 1: Protocol to get one bit out of the LPUF.
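Algorithm 1 can be simulated in a few lines of Python. This is an illustrative sketch, not the authors' implementation: the delay mean and spread below are arbitrary placeholder values, and the helper names (`loop_delay`, `lpuf_bit`) are ours.

```python
import numpy as np

rng = np.random.default_rng(42)

n = 8
# Per-element delays d_i^TB and d_i^BT, drawn once at "fabrication".
# loc/scale are arbitrary placeholders, not values from the slides.
d_TB = rng.normal(loc=10.0, scale=0.1, size=n)
d_BT = rng.normal(loc=10.0, scale=0.1, size=n)

def loop_delay(c):
    """Total loop delay d(c) for a challenge c in {+1, -1}^n."""
    return np.where(c == -1, d_TB, d_BT).sum()

def lpuf_bit(c):
    """Algorithm 1: measure with c, then with -c, and return the sign bit."""
    c = np.asarray(c)
    d1 = loop_delay(c)    # measurement with challenge c
    d2 = loop_delay(-c)   # measurement with the complemented challenge
    return 1 if d1 - d2 > 0 else -1

c = np.array([1, -1, 1, 1, -1, 1, -1, -1])
# By construction d1 - d2 = sum_i c_i * (d_BT[i] - d_TB[i]), i.e. the
# sign of sum_i c_i * Delta_i; complementing the challenge flips the bit.
print(lpuf_bit(c), lpuf_bit(-c))
```

The differential measurement (c, then −c) is what cancels the common-mode delay and leaves only the sign of Σ c_i Δ_i.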

SLIDE 14

Our result: SRAM-PUF vs Loop-PUF

For n = 8: SRAM-PUF (0 ≤ M ≤ n) vs LPUF (0 ≤ M ≤ 2^{n−1}).

SLIDE 16

Challenge

Definition

A challenge c is a vector of n control bits c = (c_1, c_2, …, c_n) ∈ {±1}^n. Let Δ_1, Δ_2, …, Δ_n be i.i.d. zero-mean normal (Gaussian) variables characterizing the technological dispersion. A bit response to challenge c is defined as

B_c = sign(Δ_c) ∈ {±1}  (1)

where

Δ_c = c_1Δ_1 + c_2Δ_2 + · · · + c_nΔ_n.  (2)

SLIDE 17

Challenge code

Definition

A challenge code C is a set of M n-bit challenges that form an (n, M) binary code. We shall identify C with the M × n matrix of ±1's whose rows are the challenges. The M codewords and their complements are used to challenge the PUF elements. The corresponding identifier is the M-bit vector

B = (B_c)_{c ∈ C}.  (3)

The entropy of the PUF responses is denoted by H = H(B).

SLIDE 18

Orthant probabilities

Let X_1, X_2, …, X_n be zero-mean, jointly Gaussian (not necessarily independent) and identically distributed. As a prerequisite to the derivations that follow, we wish to compute the orthant probability

P(X_1 > 0, X_2 > 0, …, X_n > 0).

The probabilities associated with other sign combinations can easily be deduced from it using the symmetry properties of the Gaussian distribution. Since the value of the orthant probability does not depend on the common variance of the random variables, we may assume without loss of generality that each X_i has unit variance: X_i ∼ N(0, 1). The orthant probability then depends only on the correlation coefficients

ρ_{i,j} = E(X_i X_j)  (i ≠ j).  (4)

SLIDE 19

Some lemmas

Lemma (Quadrant probability of a bivariate normal)

P(X_1 > 0, X_2 > 0) = 1/4 + arcsin(ρ_{1,2})/(2π).  (5)

Lemma (Orthant probability of a trivariate normal)

P(X_1 > 0, X_2 > 0, X_3 > 0) = 1/8 + (arcsin ρ_{1,2} + arcsin ρ_{2,3} + arcsin ρ_{1,3})/(4π).  (6)

Lemma (no closed-form formula exists for n > 3…)
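The quadrant lemma (5) is easy to sanity-check numerically. The following Monte Carlo sketch is illustrative only: ρ = 0.5 and the sample size are arbitrary choices of ours.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.5          # arbitrary correlation coefficient for the check
N = 1_000_000

# Two correlated standard normals: X2 = rho*X1 + sqrt(1 - rho^2)*Z.
x1 = rng.standard_normal(N)
x2 = rho * x1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(N)

mc = np.mean((x1 > 0) & (x2 > 0))                  # empirical quadrant frequency
closed_form = 0.25 + np.arcsin(rho) / (2 * np.pi)  # Lemma (5); equals 1/3 here

print(mc, closed_form)
```

For ρ = 0.5 the closed form is 1/4 + (π/6)/(2π) = 1/3, and the empirical frequency agrees to within Monte Carlo error.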

SLIDE 22

Main Result: Hadamard Codes

We have M response bits, so H(B) ≤ M bits. When is it possible to attain the maximum value H(B) = M bits?

Theorem

H(B) = M implies M ≤ n. H(B) = M = n bits if and only if C is a Hadamard (n, n) code.

Proof.

H(B) = M means that all bits B_c are independent, i.e., all Y_j = Σ_{i=1}^n c_i^{(j)} X_i are independent (uncorrelated), i.e., all M (n-bit) challenges c^{(j)} are orthogonal.

SLIDE 23

Hadamard Codes

n orthogonal binary ±1 vectors form a Hadamard code:

n = 1: C = (1), H = 1 bit;

n = 2: C = ( 1  1
             1 −1 ), H = 2 bits;

n = 3: No Hadamard code! But any (3, 3) code ≡

( 1  1  1
  1 −1  1
  1  1 −1 ),

for which Σ = (1/3) C Cᵗ = (  1   1/3  1/3
                             1/3   1  −1/3
                             1/3 −1/3   1  ), gives

H = −6 (1/8 + arcsin(1/3)/(4π)) log(1/8 + arcsin(1/3)/(4π)) − 2 (1/8 − 3 arcsin(1/3)/(4π)) log(1/8 − 3 arcsin(1/3)/(4π)) ≈ 2.875 < 3 bits.
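The ≈ 2.875 value follows directly from the trivariate orthant lemma; a small illustrative Python check:

```python
from math import asin, log2, pi

# Correlations of the three response bits for the (3, 3) code:
# rho12 = rho13 = 1/3, rho23 = -1/3.
a = asin(1/3) / (4 * pi)
p = 1/8 + a       # orthant probability of six of the eight sign patterns
q = 1/8 - 3 * a   # orthant probability of the remaining two patterns

assert abs(6 * p + 2 * q - 1) < 1e-12  # the eight probabilities sum to 1

H = -6 * p * log2(p) - 2 * q * log2(q)
print(round(H, 3))  # 2.875
```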

SLIDE 24

Hadamard Codes (cont’d)

n = 4:

C = ( 1  1  1  1
      1 −1  1 −1
      1  1 −1 −1
      1 −1 −1  1 ),  H = 4 bits

n = 8:

C = ( 1  1  1  1  1  1  1  1
      1 −1  1 −1  1 −1  1 −1
      1  1 −1 −1  1  1 −1 −1
      1 −1 −1  1  1 −1 −1  1
      1  1  1  1 −1 −1 −1 −1
      1 −1  1 −1 −1  1 −1  1
      1  1 −1 −1 −1 −1  1  1
      1 −1 −1  1 −1  1  1 −1 ),  H = 8 bits
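Hadamard matrices of size 2^k, such as the two above, can be generated by the standard Sylvester (Kronecker) construction; a short sketch (the `sylvester` function name is ours):

```python
import numpy as np

def sylvester(k):
    """2^k x 2^k Hadamard matrix via the Sylvester (Kronecker) construction."""
    H = np.array([[1]])
    H2 = np.array([[1, 1], [1, -1]])
    for _ in range(k):
        H = np.kron(H2, H)
    return H

H8 = sylvester(3)
# Rows are pairwise orthogonal (H8 @ H8.T == 8*I), so the 8 challenges give
# uncorrelated response bits and the entropy reaches the maximum H(B) = 8 bits.
print(np.array_equal(H8 @ H8.T, 8 * np.eye(8, dtype=int)))  # True
```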

SLIDE 25

Hadamard Codes (cont’d)

n = 12:

(A 12 × 12 Hadamard matrix of ±1's.)

H = 12 bits

SLIDE 31

Beyond n bits

n = 1 element ⇒ H = 1 bit;
n = 2 elements ⇒ H = 2 bits;
H = M (max entropy = number of challenges) ⇒ H ≤ n bits.

Common belief: n elements give at most n bits of entropy (SRAM PUFs, delay PUFs).

Q: Can we obtain more than n bits by taking more challenges, M > n?
A: Yes!

n < H < M: for n elements, using M > n challenges, the entropy can increase beyond n bits, albeit staying strictly below M.

SLIDE 36

n = 3 elements

M = 1: C₁ = ( 1 1 1 ) gives H = 1 bit.

M = 2: C₂ = ( 1 1  1
              1 1 −1 ) gives

H = −(1/2 + arcsin(1/3)/π) log(1/4 + arcsin(1/3)/(2π)) − (1/2 − arcsin(1/3)/π) log(1/4 − arcsin(1/3)/(2π)) ≈ 1.966 bits.

M = 3: C₃ = ( 1  1  1
              1  1 −1
              1 −1  1 ) gives

H = −(3/4 + 3 arcsin(1/3)/(2π)) log(1/8 + arcsin(1/3)/(4π)) − (1/4 − 3 arcsin(1/3)/(2π)) log(1/8 − 3 arcsin(1/3)/(4π)) ≈ 2.875 bits.

M = 4: C₄ = (  1  1  1
               1  1 −1
               1 −1  1
              −1  1  1 ) gives H ≈ 3.666 bits.
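The M = 4 figure can be checked by simulation: draw many PUF instances Δ ∈ ℝ³, form the four response bits, and estimate the entropy of the 4-bit identifier empirically. This is an illustrative sketch of ours; the sample size is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

# The M = 4 challenge code C4 on n = 3 elements from the slide.
C4 = np.array([[ 1,  1,  1],
               [ 1,  1, -1],
               [ 1, -1,  1],
               [-1,  1,  1]])

N = 1_000_000
deltas = rng.standard_normal((N, 3))      # N simulated PUF instances
bits = (deltas @ C4.T > 0).astype(int)    # four response bits per instance

ids = bits @ (1 << np.arange(4))          # pack the 4 bits into integers 0..15
probs = np.bincount(ids, minlength=16) / N
probs = probs[probs > 0]
H = -(probs * np.log2(probs)).sum()
print(round(H, 2))  # above n = 3 and below M = 4, near the slide's 3.666
```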

SLIDE 37

n = 4 elements

C₈ = (  1  1  1  1
        1 −1  1 −1
        1  1 −1 −1
        1 −1 −1  1
       −1  1  1  1
        1 −1  1  1
        1  1 −1  1
        1  1  1 −1 )

H = 6.251 bits.

SLIDE 39

n = 8 elements, etc.

(Figure: entropy H_n as a function of the number of elements n.)

SLIDE 41

Conclusions and Perspectives

Conclusions

H_n = n bits of entropy obtained using a Hadamard challenge code;
H_n > n bits of entropy obtained using a challenge code made of several Hadamard "chunks".
Related talk given at ISIT 2016 [RSGD16].

SLIDE 42

[CDGB12] Zouha Cherif, Jean-Luc Danger, Sylvain Guilley, and Lilian Bossuet. An easy-to-design PUF based on a single oscillator: The loop PUF. In 15th Euromicro Conference on Digital System Design, DSD 2012, Çeşme, Izmir, Turkey, September 5–8, 2012, pages 156–162. IEEE Computer Society, 2012.

[PDW89] Marcel J.M. Pelgrom, Aad C.J. Duinmaijer, and Anton P.G. Welbers. Matching properties of MOS transistors. IEEE Journal of Solid-State Circuits, 24(5):1433–1439, 1989. DOI: 10.1109/JSSC.1989.572629.

[RSGD16] Olivier Rioul, Patrick Solé, Sylvain Guilley, and Jean-Luc Danger. On the entropy of physically unclonable functions. In ISIT 2016, IEEE International Symposium on Information Theory, Barcelona, Spain, July 2016.
