SLIDE 1

Evaluating the Security of Implementations Against Side Channel Attacks

Activities from ANSSI Laboratory for Embedded Security Emmanuel Prouff emmanuel.prouff@ssi.gouv.fr

Agence Nationale de la Sécurité des Systèmes d'Information

Summer School, Šibenik, Croatia – June 17-21, 2019

  • E. PROUFF, ANSSI

Evaluating the Security of Implementations Against SCA

slide-2
SLIDE 2

ANSSI Presentation LSC Missions

ANSSI Core Missions

◮ Prevent threats by supporting the development of trusted products and services for governmental entities and economic actors.
◮ Provide reliable advice and support to governmental entities and operators of Critical Infrastructure.
◮ Keep companies and the general public informed about information security threats and the related means of protection through an active communication policy.
◮ Give support to security evaluation labs (ITSEF) and to the French national certification centre (CCN).

SLIDE 3

ANSSI Presentation LSC Missions

Certification Body

Certification Body: 10 agents. The list of certified products is available on the ANSSI website: www.ssi.gouv.fr

Some statistics about French Common Criteria evaluations:

◮ 50% smartcard evaluations ◮ 35% microcontroller evaluations ◮ 15% software, network, misc.

SLIDE 4

ANSSI Presentation Certification

Certification Labels for Security Products

CSPN - certification for first level security

black-box

◮ fast and easy procedure (e.g. allows labelling freeware) ◮ evaluation made by ITSEFs ◮ compliance with the security target ◮ efficiency of security functionalities ◮ 25-35 man-days

Common Criteria - CC certification

white-box

◮ longer procedure, recognized outside of France ◮ evaluation made by ITSEFs ◮ compliance with security target ◮ eval. of each security functionality ◮ different assurance levels: EAL1, . . ., EAL7

SLIDE 5

Security Evaluation Industry

Security Evaluation in the Industry

Context

Mandatory for some security products (e.g. banking cards, ePassports or secure platforms for embedded systems). Not always mandatory, but an economic advantage for many others (e.g. USIM or access control).

General Framework

◮ Developers implement countermeasures against SoA attacks (passive, semi-invasive or invasive) ◮ Independent labs evaluate the security w.r.t. SoA attacks (e.g. those listed by the JHAS group) ◮ Certification authorities (e.g. ANSSI, BSI or EMVCo) validate the evaluation and deliver the certificates.

SLIDE 6

Security Evaluation Industry

Security Evaluation in the Industry

Facts

Attacks are each year more and more powerful

◮ Semi-invasive attacks with multiple faults ◮ Template Attacks, Second-Order SCA or Horizontal SCA ◮ Use of HPC

... each year more numerous (≃ 100 publications/year). Security is costly: development/testing time, decreased performance, loss of code genericity, expertise cost. How to increase the coverage and accuracy of evaluations while decreasing their cost?

SLIDE 7

Security Evaluation Needs

Security Evaluation in the Industry

Some needs...

Automate evaluations without quality loss. Increase trust in evaluation results

◮ Failure due to countermeasures or to evaluator weakness?

Quantify the security instead of testing a set of attacks

◮ Too many attacks, too many parametrizations, etc. ◮ Need to always stay up to date ◮ Failure with 10^6 measurements, but what if 10^7 are available?

Measure the information leakage

◮ Portability gain ◮ Allows for comparison between evaluations

Identify Points of Interest

◮ Exchange "experts' how-to" for sound and repeatable techniques

To sum up: estimate the efficiency of the most powerful attacks in a minimum of time.

SLIDE 9

Security Evaluation Needs

What do we do at ANSSI related to these subjects?

Some Examples...

Define generic frameworks to encompass most of the SoA attacks. Use the latter frameworks to build generic and modular testing libraries. Define methods to accurately measure the information leakage from a chip. Define methods to evaluate the success rate of SoA attacks based on the latter measure. Adapt methods from Machine Learning to identify PoI.

SLIDE 10

Side-Channel Attacks Generic Framework

Advanced Side Channel Attacks (DPA like attacks)

Side Channel Analysis: General Framework.

Figure: a chip runs the implementation (e.g. AES) holding secrets; the leakage reaches the side-channel adversary through a channel, and the adversary applies statistical tools, optionally together with a model of the chip.

SLIDE 11

Side-Channel Attacks Generic Framework

Advanced Side Channel Attacks

Side Channel Analysis: General Framework (Theoretical)

Context: attack during the manipulation of S(X + k).

1. Measurement: get a sample of leakages (ℓ_{k,i})_i related to a sample (x_i)_i of plaintexts.
2. Model Selection: design/select a function m(·).
3. Prediction: for every key hypothesis k̂, compute m_{k̂,i} = m(S(x_i + k̂)).
4. Distinguisher Selection: choose a statistical distinguisher ∆.
5. Key Discrimination: for every k̂, compute the distinguishing value ∆_{k̂} = ∆((ℓ_{k,i})_i, (m_{k̂,i})_i).
6. Key Candidate Selection: deduce k̂ from all the values ∆_{k̂}.
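The six steps above can be sketched as a minimal first-order CPA. Everything below is a hypothetical illustration on simulated traces: a 4-bit S-box (PRESENT's, chosen for convenience), Hamming-weight leakage, and made-up noise and trace counts; the Hamming-weight model and Pearson correlation are classic choices for m(·) and ∆, not the only ones.

```python
import numpy as np

# Toy 4-bit S-box (PRESENT) and Hamming-weight model table
SBOX = np.array([12, 5, 6, 11, 9, 0, 10, 13, 3, 14, 15, 8, 4, 7, 1, 2])
HW = np.array([bin(v).count("1") for v in range(16)])

def cpa(plaintexts, leakages):
    """Return the key guess maximizing |Pearson correlation|."""
    scores = []
    for k in range(16):                        # step 3: predictions
        preds = HW[SBOX[plaintexts ^ k]]       # model m(S(x + k))
        scores.append(abs(np.corrcoef(preds, leakages)[0, 1]))  # steps 4-5
    return int(np.argmax(scores))              # step 6: key selection

rng = np.random.default_rng(1)
true_k = 7
pts = rng.integers(0, 16, 2000)
# step 1: simulated noisy Hamming-weight leakage of S(x + k)
leak = HW[SBOX[pts ^ true_k]] + rng.normal(0, 1.0, 2000)
print(cpa(pts, leak))                          # recovers true_k
```

With 2,000 simulated traces and this noise level, the correct key hypothesis dominates the correlation scores.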

SLIDE 12

Side-Channel Attacks Generic Framework

Advanced Side Channel Attacks

Side Channel Analysis: attack Description Sheet

Attack Description Sheet

Type of Leakage: e.g. power consumption or electromagnetic emanation. Model Function: e.g. one bit of Z or its Hamming weight. Statistical Distinguisher: e.g. difference of means, correlation or entropy. Key Candidate Selection: e.g. the candidate that maximizes the scores.

SLIDE 13

Leakage Assessment for Designers

A designer/evaluator POV

Security of a device against SCA is tested by designers/evaluators. Large set of SCA to test: CPA, MIA, LRA, DPA, ML, etc. Little time, limited means, constrained resources. Strong knowledge of my device.

SLIDE 17

Leakage Assessment for Designers First Order Case

Leakage assessment: is there information in the traces?

must be very efficient in the number of traces. must be as generic as possible: any kind of information must be revealed.

↪ independent from leakage functions. ↪ takes into account as many intermediate variables as possible.

Intuitions

First focus on first-order leakages, i.e. the information is contained in the conditional mean of the traces: E[T | Z = z] ≠ E[T] for some z. A secure implementation would behave as if manipulating random values (so E[T | Z = z] = E[T] for every z).

SLIDE 18

Leakage Assessment for Designers First Order Case

Test Vector Leakage Assessment (TVLA) [Becker et al., White Paper, CRI]

Acquire some sets of traces:
S1: Plaintexts and Keys are both fixed to well-chosen values.
S2: Plaintexts are randomly chosen and Keys are fixed.
S3: Plaintexts are fixed and Keys are randomly chosen.
...

Welch t-test between Si and S1: compute, for each time sample t,

score(t) = (Ê[Si] − Ê[S1]) / √( V̂[Si]/Ni + V̂[S1]/N1 )

where Ê and V̂ are estimations of the mean and of the variance respectively. If score(t) > threshold, then there is a leakage...
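The per-sample Welch t-test can be sketched as follows on simulated traces; the leakage amplitude (0.5), noise level, leaking sample index, and trace counts are all made-up values for illustration.

```python
import numpy as np

def tvla_scores(set_a, set_b):
    """Welch t-statistic for every time sample.
    set_a, set_b: 2-D arrays of shape (n_traces, n_samples)."""
    ma, mb = set_a.mean(axis=0), set_b.mean(axis=0)
    va, vb = set_a.var(axis=0, ddof=1), set_b.var(axis=0, ddof=1)
    na, nb = len(set_a), len(set_b)
    return (ma - mb) / np.sqrt(va / na + vb / nb)

rng = np.random.default_rng(0)
fixed = rng.normal(0.0, 1.0, (5000, 100))        # S1: fixed plaintexts/keys
fixed[:, 50] += 0.5                              # simulated first-order leak
random_set = rng.normal(0.0, 1.0, (5000, 100))   # S2: random plaintexts
scores = np.abs(tvla_scores(random_set, fixed))
print(int(scores.argmax()))                      # the leaking sample stands out
```

The score at the leaking sample is far above the usual |t| > 4.5 threshold, while all other samples stay near noise level.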

SLIDE 19

Leakage Assessment for Designers First Order Case

An example on AES implem. (8-bit ATMega)

200,000 observations, random plaintexts vs. fixed set. Threshold = mean + 4.5 × std(score).

SLIDE 22

Leakage Assessment for Designers First Order Case

TVLA output...

There are first-order leakages! But which sensitive value is leaking?

+ TVLA gives some first-order leakages and their time samples.

• These leakages may not be sensitive... ↪ use S3.

• These leakages may depend on sensitive values in any way:

◮ several bytes of plaintext/key may be involved. ◮ the relationship between these leakages and intermediate variables may be tricky.

SLIDE 25

Leakage Assessment for Designers First Order Case

TVLA output...

There are first-order leakages! But which sensitive value is leaking?

Find strategies to:

◮ identify the plaintext/key bytes involved ◮ minimize the number of subsequent acquisition campaigns ◮ use generic tools to observe leakages: t-test, SNR, etc.
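The SNR mentioned above can be sketched as the variance of the per-class mean traces over the mean of the per-class variances, the classes being the values of a known intermediate byte Z. All data below are simulated, with a made-up linear leakage at one sample.

```python
import numpy as np

def snr(traces, z):
    """Per-sample SNR: Var of class means / mean of class variances.
    traces: (n, n_samples); z: (n,) known intermediate values."""
    classes = np.unique(z)
    means = np.array([traces[z == c].mean(axis=0) for c in classes])
    variances = np.array([traces[z == c].var(axis=0) for c in classes])
    return means.var(axis=0) / variances.mean(axis=0)

rng = np.random.default_rng(4)
z = rng.integers(0, 16, 8000)               # known 4-bit intermediate value
traces = rng.normal(0, 1, (8000, 64))
traces[:, 20] += 0.4 * z                    # sample 20 leaks Z linearly
print(int(snr(traces, z).argmax()))         # the leaking sample stands out
```

Unlike the fixed-vs-random t-test, this requires knowing (or controlling) the intermediate value, but it points directly at the samples that depend on it.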

SLIDE 26

Leakage Assessment for Designers 2nd-Order Case

Higher Order Side Channel Attacks

Core Principle

First-order masking: M0 = Z ⊕ M1 ⟹ second-order SCA.

SLIDE 27

Leakage Assessment for Designers 2nd-Order Case

Higher Order Side Channel Attacks

Core Principle

Masking of order d: M0 = Z ⊕ M1 ⊕ · · · ⊕ Md ⟹ attack of order d + 1.

SLIDE 28

Leakage Assessment for Designers 2nd-Order Case

Higher-Order SCA

Context: attack during the manipulation of the shares m0, m1, ..., md with S(X + k) = m0 ⊕ · · · ⊕ md.

1. Measurement: get a sample of leakage vectors (ℓ_{k,i})_i related to a sample (x_i)_i of plaintexts.
2. Pre-processing and Model Selection: select a combination function f(ℓ); design/select a function m(s).
3. Prediction: for every k̂, compute m_{k̂,i} = m(S(x_i + k̂)).
4. Distinguisher Selection: choose a statistical distinguisher ∆.
5. Key Discrimination: for every k̂, compute the distinguishing value ∆_{k̂} = ∆((f(ℓ_{k,i}))_i, (m_{k̂,i})_i).
6. Key Candidate Selection: deduce k̂ from all the values ∆_{k̂}.

SLIDE 29

Leakage Assessment for Designers 2nd-Order Case

An example on AES implem. (8-bit ATMega)

200,000 observations, random plaintexts vs. fixed set. Combination function: centered product.
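The centered-product combination can be sketched on simulated shares (Hamming-weight leakage with a made-up noise level): each share's leakage alone is independent of the sensitive value Z, but the product of the centered leakages correlates with HW(Z).

```python
import numpy as np

def centered_product(l0, l1):
    """Combine two leakage samples, one per share."""
    return (l0 - l0.mean()) * (l1 - l1.mean())

rng = np.random.default_rng(2)
HW = np.array([bin(v).count("1") for v in range(256)])
z = rng.integers(0, 256, 20000)           # sensitive value
m1 = rng.integers(0, 256, 20000)          # random mask
m0 = z ^ m1                               # first-order masking: M0 = Z xor M1
l0 = HW[m0] + rng.normal(0, 0.5, 20000)   # simulated leakage of each share
l1 = HW[m1] + rng.normal(0, 0.5, 20000)
comb = centered_product(l0, l1)
r = np.corrcoef(comb, HW[z])[0, 1]
print(r)                                  # noticeably non-zero correlation
```

For Hamming-weight leakage one can check that E[centered product | Z] is an affine function of HW(Z), which is why the second-order correlation appears; the price is a reduced correlation magnitude and hence many more traces.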

SLIDE 32

Leakage Assessment for Designers 2nd-Order Case

TVLA 2nd-order output...

There are second-order leakages! The trace sizes are squared...

+ The centered product + TVLA allow identifying second-order leakages.

• The treatment complexity increases exponentially with the order.
• The number of traces increases exponentially with the order.
SLIDE 33

Leakage Assessment for Designers 2nd-Order Case

Open Issues with TVLA

Difficulty to deal with false positives: how to specify efficient and smart acquisition campaigns? The adversary may be considered too strong for some contexts, e.g. knowledge of the masking material, ability to profile the device, etc. Is it an issue? How to deal with it? Relation between SNR peak amplitudes and attack efficiency.

SLIDE 34

Evaluate Attack Success Rates from Profiling Context

Security of a device: practice

Problem: Is my device secure against an attack?

Perform SCA → Success / Failure

SLIDE 35

Evaluate Attack Success Rates from Profiling Context

Security of a device: much better

Problem: Is my device secure against an attack? Perform the SCA many times and count the number of successes → SCA success rate (%). Issue: might be too expensive (acquisitions, computations, ...).

SLIDE 37

Evaluate Attack Success Rates from Profiling Context

An example: 2O-CPA

1,000 × 10,000 = 10 million observations.

Figure: success rate (%) vs. number of observations (up to 10,000).

SLIDE 38

Evaluate Attack Success Rates from Profiling Context

An example: 4O-CPA

1,000 × 10^7 = 10 billion observations.

Figure: success rate (%) vs. number of messages (up to 2 × 10^7).

SLIDE 39

Evaluate Attack Success Rates from Profiling Context

Knowledge of the device

The designer/evaluator has a strong knowledge of the device:

◮ Leakage functions. ◮ Noise distributions. ◮ ...

Total control over inputs:

◮ Plaintexts. ◮ Keys. ◮ Random values.

credit:ZeptoBars

SLIDE 40

Evaluate Attack Success Rates from Profiling Context

Security of a device: designer

Problem: Is my device secure against an attack?

Methodology (costly) → SCA success rate (%)

SLIDE 41

Evaluate Attack Success Rates from Profiling A new methodology

PCD Methodology

Outline:

1. Profile the device parameters.
2. Compute some formulas using these parameters.
3. Deduce the success rate.

Purpose: works for any additive distinguisher, such as DPA, CPA, Maximum Likelihood, etc. Generalizes to HO versions. Enables to clearly identify the impact of each device parameter on its security.

SLIDE 44

Evaluate Attack Success Rates from Profiling A new methodology

Distribution of the Scores Vector

Score vector

Vector of scores given by the SCA (when targeting 8-bit secrets):

d = (d_{k0}, d_{k1}, ..., d_{k255})

Comparison vector

Vector of differences of scores between the correct key k⋆ and each candidate k:

c = (d_{k⋆} − d_{k0}, d_{k⋆} − d_{k1}, ..., d_{k⋆} − d_{k255})

Attack success ⇔ c > 0. SR = P[c > 0].

SLIDE 46

Evaluate Attack Success Rates from Profiling A new methodology

Distribution of the Scores Vector

Distribution of the score vector

As #observations → ∞, d ∼ N(m_d, Σ_d) (multivariate CLT). m_d, Σ_d can be computed/deduced from leakage profiling

◮ even when the masking material is unknown (by averaging the combining of leakage points)!

Distribution of the comparison vector

As #observations → ∞, c ∼ N(m_c, Σ_c) (multivariate CLT). m_c, Σ_c can be computed/deduced from leakage profiling. SR = P[c > 0] = Φ_{m_c,Σ_c}(0, ∞).
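The success-rate formula SR = P[c > 0] is an orthant probability of a multivariate normal. As a numerical sketch (with made-up mean and covariance, and a toy 3-rival key space instead of 255), it can be checked by Monte Carlo sampling; the methodology itself evaluates the multivariate normal CDF directly.

```python
import numpy as np

# Made-up parameters of the comparison vector c (toy 3-rival key space)
m_c = np.array([1.0, 1.5, 2.0])            # mean score gaps vs. each rival key
sigma_c = np.array([[1.0, 0.3, 0.2],
                    [0.3, 1.0, 0.3],
                    [0.2, 0.3, 1.0]])      # covariance of the gaps

# SR = P[c > 0]: orthant probability, estimated here by Monte Carlo
rng = np.random.default_rng(5)
samples = rng.multivariate_normal(m_c, sigma_c, 200_000)
sr = np.mean(np.all(samples > 0, axis=1))
print(round(float(sr), 2))
```

Once m_c and Σ_c come from profiling instead of being made up, this single evaluation replaces re-running the full attack many times.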

SLIDE 48

Evaluate Attack Success Rates from Profiling A new methodology

Evaluation of SR: PCD Methodology

1. Profile the leakage of every share.
2. Compute the parameters m_d and Σ_d of the score vector.
3. Deduce the parameters m_c and Σ_c and evaluate the success rate thanks to the multivariate normal CDF.

SLIDE 52

Evaluate Attack Success Rates from Profiling A new methodology

Validation of the approach

Context

Two ’real life’ devices: 130nm and 350nm architectures. Masked AES, output of S-box. EM radiations.

Methodology

Estimation of leakage parameters using linear regression techniques on 200,000 samples. HO-CPAs using the normalized product combination function and the HW model function.

SLIDE 54

Evaluate Attack Success Rates from Profiling A new methodology

Results

Figure: 130nm, 2O-CPA — success rate (%) vs. number of observations (up to 14,000).

Figure: 350nm, 3O-CPA — success rate (%) vs. number of observations (up to 10,000).

Figure: 350nm, 2O-CPA — success rate (%) vs. number of observations (up to 10,000).

Figure: 350nm, 4O-CPA — success rate (%) vs. number of observations (up to 50,000).

SLIDE 55

Evaluate Attack Success Rates from Profiling A new methodology

Impact of leakage profiling

What happens when we regress/profile with less samples ?

Leak(X) = c0 + c1X1 + c2X2 + c3X3 + c4X4 + c5X5 + c6X6 + c7X7 + Noise

Figure: estimated coefficients vs. number of profiling observations (500 to 2,500).
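The regression above can be sketched as a least-squares fit of the bit coefficients from simulated profiling traces. This is a hypothetical illustration: the true coefficients, noise level, and sample count are made up (here with the 8 bits of an 8-bit X plus an intercept), and the point is that a modest number of profiling samples already recovers them accurately.

```python
import numpy as np

rng = np.random.default_rng(3)
# Made-up true model: Leak(X) = c0 + sum_i ci * Xi + Noise
true_c = np.array([2.0, 0.9, 1.1, 1.0, 0.8, 1.2, 1.0, 0.95, 1.05])
x = rng.integers(0, 256, 1500)                     # 1,500 profiling samples
bits = (x[:, None] >> np.arange(8)) & 1            # bit decomposition of X
design = np.hstack([np.ones((len(x), 1)), bits])   # columns [1, X1..X8]
leak = design @ true_c + rng.normal(0, 0.5, len(x))
coeffs, *_ = np.linalg.lstsq(design, leak, rcond=None)
print(np.round(coeffs, 2))                         # close to true_c
```

With fewer profiling observations the estimates degrade, which is exactly the convergence behavior the figure above illustrates.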

SLIDE 56

Evaluate Attack Success Rates from Profiling A new methodology

Results

Figure: 350nm, 2O-CPA — success rate (%) vs. number of observations (up to 10,000).

Conclusion: 1,500 samples are enough to accurately assess the efficiency of this attack (instead of 10 million!).

SLIDE 57

Evaluate Attack Success Rates from Profiling A new methodology

Increasing the order

Number of observations: constant (instead of exponential). Number of operations: linear (instead of exponential).

SLIDE 58

Evaluate Attack Success Rates from Profiling A new methodology

What about our 4O-CPA?

1,000 × 10^7 = 10 billion observations.

Figure: success rate (%) vs. number of messages (up to 2 × 10^7).

SLIDE 59

Evaluate Attack Success Rates from Profiling A new methodology

What about our 4O-CPA?

1,500 observations.

Figure: predicted success rate (%) vs. number of messages (up to 2 × 10^7).

SLIDE 60

Evaluate Attack Success Rates from Profiling A new methodology

Conclusion about SR and Profiling

From a good leakage assessment, the success rates of the main side-channel attacks can be deduced.

◮ Do we still need to perform the attacks?

The formulas moreover indicate the impact of the device's parameters on the SR.

◮ Related to recent works by Bruneau, Heuser, Guilley and Rioul.

Possibility to precisely know the SR of attacks requiring a lot of observations, using only a very limited number of acquisitions! Opens an important issue: specify sound campaigns to compute SNRs that can easily be related to information leakage.

SLIDE 66

Machine Learning and Points of Interest

Machine Learning and PoI

Some Open Issues

Sound choice among the first components returned by Principal Component Analysis (PCA) or Linear Discriminant Analysis (LDA)

◮ maximizing intra/inter-class variances is not equivalent to maximizing information leakage!

Combine the use of kernel functions with PCA/LDA to get efficient methods to identify PoI (or tuples of PoI against masked implementations). Compare PCA/LDA + kernel functions with recent methods based on Projection Pursuits.
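The first caveat above can be illustrated with a small simulation (all data made up): PCA ranks directions by variance, so a high-variance but non-sensitive sample can dominate the first component while the true point of interest does not.

```python
import numpy as np

rng = np.random.default_rng(6)
n, n_samples = 5000, 32
z = rng.integers(0, 2, n)                     # one sensitive bit
traces = rng.normal(0, 1, (n, n_samples))
traces[:, 10] += 0.5 * z                      # weak sensitive leak at sample 10
traces[:, 25] += rng.normal(0, 5, n)          # big non-sensitive variance at 25
centered = traces - traces.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
pc1 = np.abs(vt[0])                           # first principal direction
print(int(pc1.argmax()))                      # dominated by sample 25, not 10
```

A class-aware criterion (LDA, SNR) would instead single out sample 10, which motivates the comparisons proposed above.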

SLIDE 67

Conclusion

Conclusion

Leakage assessment is a sound alternative to the "attacks testing" approach (at least for designers/developers). Profiling + leakage assessment enables to soundly estimate the efficiency of the main SCA. Machine Learning provides us with many tools to answer our questions (key extraction, dimension reduction, ...) BUT adaptations and further studies are needed! Genericity and automation is our Grail

◮ it does not go against human expertise, it helps to identify the central problems!

SLIDE 68

Conclusion

Acknowledgement: parts of the slides come from presentations given by Eleonora Cagli, Thomas Roche and Adrian Thillard.
