
SLIDE 1

Soft Decision Error Correction for Compact Memory-Based PUFs using a Single Enrollment

Vincent van der Leest (Intrinsic-ID) Bart Preneel (KU Leuven and IBBT) Erik van der Sluis (Intrinsic-ID) CHES workshop 2012, Leuven Tuesday, September 11, 2012

SLIDE 2

Confidential Tuesday, September 11, 2012 2

Introduction

  • PUFs
    – IC identification based on physical characteristics
    – Measurements are noisy and require error correction
  • Use Case: Secure Key Storage
    – Error correct the noisy PUF response to produce a stable key
  • Error correction
    – Overhead on PUF size; efficient codes are required
    – Soft decision decoding is more efficient than hard decision
    – Soft decision algorithms with multiple measurements exist
    – We introduce soft decision decoding using a single measurement

SLIDE 3

Memory-based PUFs

  • Memory-based PUFs: deriving a PUF fingerprint from the start-up
    pattern of (standard-cell) memory in an IC
  • Examples: SRAM, D Flip-Flop, Latch, Buskeeper…
  • Start-up patterns are required to be:
    – Robust (stable under different operating conditions)
    – Unique (random and unpredictable)
  • Memory-based PUF used here: SRAM PUF
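Robustness and uniqueness are commonly quantified with the fractional Hamming distance between start-up patterns: small between repeated readings of the same device, near 0.5 between different devices. A minimal sketch (the byte strings below are made-up stand-ins for real SRAM start-up data):

```python
import os

def fractional_hamming_distance(a: bytes, b: bytes) -> float:
    """Fraction of differing bits between two equal-length responses."""
    assert len(a) == len(b)
    diff = sum(bin(x ^ y).count("1") for x, y in zip(a, b))
    return diff / (8 * len(a))

device_a = os.urandom(16)   # stand-in for one device's SRAM start-up pattern
# Re-reading the same device: flip roughly one bit per ten bytes as toy noise
noisy_a = bytes(x ^ (1 if i % 10 == 0 else 0) for i, x in enumerate(device_a))
device_b = os.urandom(16)   # stand-in for a different device

print(fractional_hamming_distance(device_a, noisy_a))   # small -> robust
print(fractional_hamming_distance(device_a, device_b))  # typically near 0.5 -> unique
```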

SLIDE 4

Use Case: Secure key storage

In a secure environment (enrollment):
  • “Program” the key
  • Derive helper data
  • Store helper data

During operation (reconstruction):
  • Retrieve the secret key using the helper data and the PUF response
  • The secret is reproducible thanks to error correction
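The enrollment/reconstruction flow above is commonly realized with a code-offset construction. A heavily simplified sketch, assuming a bare 5x repetition code (a real design concatenates codes and post-processes the recovered key; this only illustrates the helper-data mechanics):

```python
import random

REP = 5  # assumed toy repetition length, not a value from the slides

def rep_encode(bits):
    return [b for b in bits for _ in range(REP)]

def rep_decode(bits):
    # Majority vote per repetition block
    return [int(sum(bits[i:i + REP]) > REP // 2) for i in range(0, len(bits), REP)]

def enroll(key_bits, puf_response):
    # Helper data = codeword XOR PUF response; it hides the key
    # as long as the PUF response is uniformly random.
    return [c ^ r for c, r in zip(rep_encode(key_bits), puf_response)]

def reconstruct(helper, noisy_response):
    # XOR with the (noisy) PUF response yields a noisy codeword to decode
    noisy_codeword = [h ^ r for h, r in zip(helper, noisy_response)]
    return rep_decode(noisy_codeword)

random.seed(0)
key = [1, 0, 1, 1]
response = [random.randint(0, 1) for _ in range(len(key) * REP)]
helper = enroll(key, response)

noisy = list(response)
noisy[3] ^= 1          # two flipped PUF bits in different repetition blocks
noisy[12] ^= 1
print(reconstruct(helper, noisy) == key)   # True: key recovered despite noise
```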

SLIDE 5

Soft decision decoding: state of the art

  • Soft decision decoding for memory-based PUFs*:
    – Enrollment:
      • Perform multiple measurements
      • Derive the error probability of each PUF bit
      • Store the error probability with the helper data (= soft information)
    – Reconstruction:
      • Use the error probabilities as a confidence level for each bit
      • Fewer PUF bits are required to reconstruct the secret

* [Maes-Tuyls-Verbauwhede’09]
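A rough sketch of this multiple-measurement enrollment, simplified from the cited scheme: estimate each bit's error probability from repeated readings and turn it into a log-likelihood confidence (the clamping constant and the toy readings are our own assumptions):

```python
import math

def bit_confidence(measurements):
    """measurements: several enrollment readings, each a list of 0/1 bits.
    Returns (enrolled_bits, per-bit log-likelihood confidences)."""
    n = len(measurements)
    enrolled, confidence = [], []
    for column in zip(*measurements):          # look at one bit position at a time
        ones = sum(column)
        bit = int(ones > n // 2)               # majority value becomes the enrolled bit
        # Estimated error probability of this bit, clamped away from zero
        p_err = max(min(ones, n - ones) / n, 1.0 / (2 * n))
        confidence.append(math.log((1 - p_err) / p_err))
        enrolled.append(bit)
    return enrolled, confidence

# Four toy enrollment readings; bit position 2 flips once, so it is less reliable
runs = [[1, 0, 1, 1], [1, 0, 0, 1], [1, 0, 1, 1], [1, 0, 1, 1]]
bits, conf = bit_confidence(runs)
print(bits)   # [1, 0, 1, 1]
print(conf)   # position 2 gets a lower confidence than the stable positions
```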

SLIDE 6

Motivation for new construction

  • Using multiple enrollment measurements leads to:
    – Requiring non-volatile memory during enrollment
    – A footprint that grows with the number of measurements
    – Additional enrollment time in the production line
  • These drawbacks make soft decision decoding for PUFs practically
    and commercially inapplicable

SLIDE 7

Our proposal (high level)

  • Hard decision decoding using concatenated codes*

  Encoding (enrollment):      Linear Encoder → Repetition Encoder
  Decoding (reconstruction):  Repetition Decoder → Linear Decoder

* [Bösch-Guajardo-Sadeghi-Shokrollahi-Tuyls’08]

SLIDE 8

Our proposal (high level)

  • Soft decision decoding using concatenated codes
  • Quantizer: only a single enrollment measurement required

  Encoding (enrollment):      Linear Encoder → Repetition Encoder
  Decoding (reconstruction):  Quantizer → Soft Decoder
                              (replacing the Repetition Decoder → Linear Decoder chain)
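A rough sketch of the quantizer idea (our own simplified model, not the paper's exact construction): instead of majority-voting each repetition block into a hard bit, sum the block into a soft value whose magnitude reflects confidence. Only the single reconstruction-time measurement is needed; no per-bit statistics are stored at enrollment.

```python
def quantize(xored_bits, rep):
    """xored_bits: PUF response XOR helper data, in repetition blocks of size rep.
    Maps each block to a soft value in [-rep, +rep]:
    +rep = confidently 0, -rep = confidently 1 (sign convention is ours)."""
    soft = []
    for i in range(0, len(xored_bits), rep):
        block = xored_bits[i:i + rep]
        soft.append(sum(1 if b == 0 else -1 for b in block))
    return soft

# A 7-bit repetition block with one flipped bit still gives a confident value:
print(quantize([0, 0, 1, 0, 0, 0, 0], rep=7))   # [5] -> leans strongly towards 0
# A very noisy block yields a value near zero (low confidence):
print(quantize([0, 1, 0, 1, 0, 1, 1], rep=7))   # [-1] -> nearly uninformative
```

The soft decoder can then weigh confident blocks more heavily than noisy ones, which is where the saving in PUF bits over hard majority voting comes from.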

SLIDE 9

Soft decoder examples

  • Decoders with efficient hardware implementations
  • Brute force decoder:
    – Codes with a limited set of codewords
    – Calculate the Euclidean distance from the input to all codewords
    – Select the most likely codeword for decoding
    – Examples: Reed-Muller [16,5,8] and [8,4,4]
  • Hackett decoder:
    – Golay [24,12,8] decoder with soft input
    – Hard decision decoding with 8 different input patterns
    – Input patterns selected based on soft information
    – Most likely output selected based on Euclidean distance
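The brute force variant can be sketched directly: enumerate all codewords of a small code and pick the one at minimum Euclidean distance from the soft input. We use the first-order Reed-Muller code RM(1,3), an [8,4,4] code with only 16 codewords (the soft input values below are illustrative assumptions):

```python
from itertools import product

def rm_1_3_codewords():
    """All 16 codewords of RM(1,3): evaluations of the affine functions
    a0 + a1*x1 + a2*x2 + a3*x3 over the 8 points of GF(2)^3."""
    words = {}
    points = list(product((0, 1), repeat=3))
    for msg in product((0, 1), repeat=4):
        a0, a1, a2, a3 = msg
        cw = tuple((a0 + a1*x1 + a2*x2 + a3*x3) % 2 for x1, x2, x3 in points)
        words[msg] = cw
    return words

def brute_force_soft_decode(soft, codebook):
    """soft: per-bit values, +1 ~ confident 0, -1 ~ confident 1.
    Returns the message whose codeword (bits mapped 0->+1, 1->-1)
    is closest to the input in Euclidean distance."""
    def dist(cw):
        return sum((s - (1 if b == 0 else -1)) ** 2 for s, b in zip(soft, cw))
    return min(codebook, key=lambda m: dist(codebook[m]))

book = rm_1_3_codewords()
# Soft input close to the codeword of message (1,0,1,0),
# with one bit flipped but carrying very low confidence:
target = book[(1, 0, 1, 0)]
soft = [(0.9 if b == 0 else -0.9) for b in target]
soft[5] = -soft[5] * 0.1
print(brute_force_soft_decode(soft, book))   # (1, 0, 1, 0)
```

Because the wrong bit carries little confidence, it barely contributes to the distance, which is exactly the advantage soft information gives over hard majority decisions.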

SLIDE 10

Calculating hard decision performance

The hard decision FRR can be calculated based on the length of the repetition
code (equations are available for concatenated codes). Based on these results,
the codes require the following repetition lengths:
  • RM[16,5,8]:     13 bits
  • RM[8,4,4]:      23 bits
  • Golay[24,12,8]: 13 bits
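The building block of this calculation is the failure probability of a majority-decoded repetition code. A sketch of that inner-code term only (the bit error rate of 15% is an assumed illustrative value, and the full concatenated-code FRR equations from the literature are not reproduced here):

```python
from math import comb

def repetition_block_error(p, r):
    """Probability that more than r//2 of r bits flip (odd r assumed),
    i.e. that majority decoding of one repetition block fails,
    given an independent per-bit error rate p."""
    return sum(comb(r, t) * p**t * (1 - p)**(r - t)
               for t in range(r // 2 + 1, r + 1))

# Assumed illustrative noise rate; longer repetition shrinks the error fast
for r in (13, 23):
    print(r, repetition_block_error(0.15, r))
```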

SLIDE 11

Simulating soft decision performance

No equations are available for calculating the FRR of soft decision codes, so
simulations were performed. Based on these simulations, the codes require the
following repetition/quantizer lengths:
  • RM[16,5,8]:     7 bits
  • RM[8,4,4]:      14 bits
  • Golay[24,12,8]: 8 bits

SLIDE 12

Comparing the amount of SRAM required

  Code            Type   Repetition length   FRR          SRAM (bytes)
  RM[16,5,8]      Hard   13                  1.6·10^-7    910
  RM[16,5,8]      Soft   7                   3.7·10^-7    490
  RM[8,4,4]       Hard   25                  3.4·10^-7    1075
  RM[8,4,4]       Soft   14                  3.3·10^-7    602
  Golay[24,12,8]  Hard   13                  4.0·10^-7    585
  Golay[24,12,8]  Soft   8                   4.8·10^-7    360

These results show that soft decision decoding decreases the amount of SRAM
required by 38-47% in these examples.
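The 38-47% range follows directly from the SRAM columns of the table. Recomputing the hard-to-soft reduction per code:

```python
# SRAM figures (bytes) taken from the comparison table: (hard, soft) per code
sram_bytes = {
    "RM[16,5,8]":     (910, 490),
    "RM[8,4,4]":      (1075, 602),
    "Golay[24,12,8]": (585, 360),
}
reduction = {code: 100 * (hard - soft) / hard
             for code, (hard, soft) in sram_bytes.items()}
for code, pct in reduction.items():
    print(f"{code}: {pct:.1f}% less SRAM")   # 46.2%, 44.0%, 38.5%
```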

SLIDE 13

Comparing total footprint

The impact of the SRAM on the footprint changes with:
  • FRR
  • Noise rate
  • Key length
  • Number of keys

In this example: SRAM cell ≈ 1 GE

[Bar chart: total footprint in kGE (scale 1-10) per implementation, broken down
into SRAM, Soft/Linear decoder, Quantizer/Repetition decoder, and Encoder]

SLIDE 14

Conclusions

  • New soft decoding method for memory-based PUFs:
    – Uses only a single enrollment measurement
    – Requires 38-47% fewer PUF bits than hard decoding
    – Solves the issues of the old method (NVM, footprint, enrollment time)
    – All example codes are implemented efficiently in hardware
  • The new method comes at a limited cost in resources
  • The size of the PUF is more dominant in the footprint, so the cost decreases
  • The decoder implementation should be chosen based on:
    – What to minimize: PUF size, footprint, …
    – Values of FRR, noise rate, key length, number of keys, …

SLIDE 15

Questions?