

  1. Masking Proofs are Tight (and How to Exploit it in Security Evaluations) Vincent Grosso, François-Xavier Standaert Radboud University Nijmegen (The Netherlands), UCL (Belgium) EUROCRYPT 2018, Tel Aviv, Israel

  2. Motivation (side-channel security evaluation) • Current practice (simplified): attack-based evaluations [plot: computation (2^0 to 2^128) vs. number of measurements]

  3. Motivation (side-channel security evaluation) • Current practice (simplified): attack-based evaluations [plot: computation vs. measurements — how far beyond the evaluated number of measurements does security extend?]

  4. Motivation (side-channel security evaluation) • Current practice vs. proposed approach (simplified) • Open designs (Kerckhoffs): attack-based evaluations vs. proof-based evaluations [plots: computation vs. measurements for both approaches]

  5. Example: masked encoding • Probing security (Ishai, Sahai, Wagner 2003): y = y(1) ⊕ y(2) ⊕ ⋯ ⊕ y(d−1) ⊕ y(d)

  6. Example: masked encoding • Probing security (Ishai, Sahai, Wagner 2003): y = y(1) ⊕ y(2) ⊕ ⋯ ⊕ y(d−1) ⊕ y(d) • d − 1 probes do not reveal anything about y

  7. Example: masked encoding • Probing security (Ishai, Sahai, Wagner 2003): y = y(1) ⊕ y(2) ⊕ ⋯ ⊕ y(d−1) ⊕ y(d) • But d probes completely reveal y
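The d-share Boolean encoding above is easy to exercise in code. A minimal sketch (illustrative, not from the talk): the first d − 1 shares are drawn uniformly at random and the last one is chosen so that all shares XOR to y, so any d − 1 shares are uniformly distributed independently of y, while all d shares recombine to y.

```python
import secrets
from functools import reduce

def share(y: int, d: int, bits: int = 8) -> list[int]:
    """Encode y into d shares with y = y(1) ^ y(2) ^ ... ^ y(d)."""
    shares = [secrets.randbits(bits) for _ in range(d - 1)]
    # the last share forces the XOR of all shares to equal y
    shares.append(reduce(lambda a, b: a ^ b, shares, y))
    return shares

def recombine(shares: list[int]) -> int:
    """d probes (all the shares) completely reveal y."""
    return reduce(lambda a, b: a ^ b, shares)

y = 0xA7
s = share(y, d=4)
assert recombine(s) == y
```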

  8. Example: masked encoding • Probing security (Ishai, Sahai, Wagner 2003): y = y(1) ⊕ y(2) ⊕ ⋯ ⊕ y(d−1) ⊕ y(d) • Noisy leakage security (Prouff, Rivain 2013)

  9. Example: masked encoding • Probing security (Ishai, Sahai, Wagner 2003): y = y(1) ⊕ y(2) ⊕ ⋯ ⊕ y(d−1) ⊕ y(d) • Noisy leakage security (Prouff, Rivain 2013) • Noise and independence reduction (Duc, Dziembowski, Faust 2014)

  10. Example: masked encoding • Probing security (Ishai, Sahai, Wagner 2003): y = y(1) ⊕ y(2) ⊕ ⋯ ⊕ y(d−1) ⊕ y(d) • Noisy leakage security (Prouff, Rivain 2013) • Noise and independence reduction (Duc, Dziembowski, Faust 2014): MI(Y; 𝑳) < O(MI(Y(k); 𝑳(k))^d)

  11. Contributions • Previous work: masking proofs are tight for the encodings (Duc, Faust, Standaert, EC15/JoC18)

  12. Contributions • Previous work: masking proofs are tight for the encodings (Duc, Faust, Standaert, EC15/JoC18) • This work: 1. The same holds for circuits (e.g., S-boxes) made from simple gadgets (e.g., add. & mult.)

  13. Contributions • Previous work: masking proofs are tight for the encodings (Duc, Faust, Standaert, EC15/JoC18) • This work: 1. The same holds for circuits (e.g., S-boxes) made from simple gadgets (e.g., add. & mult.) 2. Proofs can considerably simplify evaluations • Under noise & independence assumptions • Limited to divide & conquer attacks

  14. Outline 1. Evaluation settings 2. Case studies: worst-case evaluation time complexity • Case #1: low d, one tuple vs. multiple tuples • Independent Operations' Leakages (IOL) • Case #2: higher d, single tuple • Independent Shares' Leakages (ISL), DFS bound • Case #3: multiplication leakages • ISL assumption + PR bound • Case #4: higher d, worst-case attacks • Shares repetition, security graphs 3. Concrete attacks (i.e., why worst-case data complexity is needed) 4. Conclusions & future research


  16. Evaluation settings (I) • Target implementation:

  17. Evaluation settings (I) • Target implementation: • C1 Adv: one d-tuple, 𝑳 = 𝒍₁₀ = [l₁₀(1), …, l₁₀(d)] — leakage matrix 𝑳 (all leaks), leakage vector 𝒍 (one d-tuple), leakage sample l(k) (one share)

  18. Evaluation settings (I) • Target implementation: • C2 Adv: ten d-tuples, 𝑳 = [𝒍₁, 𝒍₂, …, 𝒍₁₀]

  19. Evaluation settings (I) • Target implementation: • C3 Adv: multiplication leaks, some 𝒍_j's become d²-tuples — or even 2d²-tuples (log/alog tables) [figure: masked multiplication — partial products b_i·c_j, refreshing, compression]

  20. Evaluation settings (I) • Target implementation: • C3 Adv: multiplication leaks, some 𝒍_j's become d²-tuples — or even 2d²-tuples (log/alog tables) • 8-bit y = y(1) ⊕ ⋯ ⊕ y(d), l(k) = HW(y(k)) + n • Noise variance σ_n², SNR = σ²(8-bit HW) / σ_n² = 2 / σ_n² • (Correlated noise analyzed in the paper)
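The SNR definition on this slide can be checked directly: the Hamming weight of a uniform 8-bit value is the sum of 8 independent bits of variance 1/4 each, so the signal variance is 2 and SNR = 2/σ_n². A small sketch (the noise variance σ_n² = 1 is an arbitrary choice, not a value from the talk):

```python
from statistics import pvariance

# signal: Hamming weight of a uniform 8-bit value
hw = [bin(v).count("1") for v in range(256)]
signal_var = pvariance(hw)       # 8 bits x variance 1/4 each = 2.0
assert abs(signal_var - 2.0) < 1e-12

sigma_n2 = 1.0                   # hypothetical noise variance
snr = signal_var / sigma_n2      # SNR = 2 / sigma_n^2, as on the slide
```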

  21. Evaluation settings (II) • Exact worst-case evaluations ≈ computing: MI(Y; X, 𝑳) = H[Y] + Σ_y Pr[y] · Σ_x Pr[x] · Σ_𝒚* Pr[𝒚*] · Pr[𝒍 | y, x, 𝒚*] · log2(Pr[y | x, 𝒍]), with 𝒚* ranging over the shares vectors — a d-dimensional integral ⇒ O(2^d) • Which can be computationally hard…
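The cost of this exact evaluation comes from the inner sum over all shares vectors. A toy sketch (2-bit target, d = 2 shares, Hamming-weight leakage with a made-up discrete noise distribution — none of these are the talk's exact parameters) computes MI(Y; 𝑳) by full enumeration; the sum over 2^(n·(d−1)) shares vectors per leakage point is exactly the blow-up the slide points to:

```python
from itertools import product
from math import log2

BITS, D = 2, 2                        # toy sizes: 2-bit secret, 2 shares
VALS = list(range(1 << BITS))
NOISE = {-1: 0.25, 0: 0.5, 1: 0.25}   # hypothetical discrete noise

def hw(v):
    return bin(v).count("1")

def pr_leaks_given_y(leaks, y):
    """Pr[leaks | y]: average over all shares vectors encoding y."""
    total = 0.0
    for vec in product(VALS, repeat=D - 1):
        last = y
        for s in vec:
            last ^= s                 # last share fixed by y and the others
        shares = list(vec) + [last]
        p = 1.0
        for l, s in zip(leaks, shares):
            p *= NOISE.get(l - hw(s), 0.0)  # leak = HW(share) + noise
        total += p
    return total / (len(VALS) ** (D - 1))

mi = BITS                             # H[Y] for a uniform Y
for leaks in product(range(-1, BITS + 2), repeat=D):
    pls = {y: pr_leaks_given_y(leaks, y) for y in VALS}
    pl = sum(pls.values()) / len(VALS)      # Pr[leaks]
    for y in VALS:
        if pls[y] > 0:
            # Pr[y | leaks] = Pr[y] * Pr[leaks | y] / Pr[leaks]
            mi += (1 / len(VALS)) * pls[y] * log2(pls[y] / (len(VALS) * pl))
```

Already at 8-bit targets and d ≥ 4 this enumeration becomes impractical, which is why the IOL/ISL shortcuts of the next slides matter.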


  23. Case #1 • d = 1, 2, C1 Adv ⇒ exhaustive analysis possible

  24. Case #1 • d = 1, 2, C2 Adv ⇒ exhaustive analysis possible

  25. Case #1 • d = 1, 2, C2 Adv ⇒ exhaustive analysis possible • But the IOL assumption leads to faster evaluation • i.e., MI(Y; 𝑳) ≈ 10 · MI(Y; 𝒍_j)

  26. Case #1 • d = 1, 2, C2 Adv ⇒ exhaustive analysis possible • But the IOL assumption leads to faster evaluation • Conservative (dependencies linearly decrease the MI)

  27. Case #2 • Larger d's, C1 Adv ⇒ exhaustive analysis hard

  28. Case #2 • Larger d's, C1 Adv ⇒ exhaustive analysis hard • But the ISL assumption leads to much faster evaluation • i.e., MI(Y; 𝒍_j) < MI(Y(k); l_j(k))^d [DFS15,18]

  29. Case #2 • Larger d's, C1 Adv ⇒ exhaustive analysis hard • But the ISL assumption leads to much faster evaluation • Critical (dependencies exponentially increase the MI)
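The exponential dependence on d in the ISL bound can be observed directly in a toy model (a 1-bit secret with d XOR shares, each share read through a bit-flip channel with probability p — an illustrative stand-in for the talk's Hamming-weight model, not its actual setting). Computing MI(Y; 𝑳) exactly by enumeration shows the MI shrinking by a roughly constant factor per extra share, which is the tightness the title refers to:

```python
from itertools import product
from math import log2

P = 0.1  # per-share observation flip probability (hypothetical)

def mi_bits(d):
    """Exact MI(Y; L): 1-bit secret, d XOR shares, each share's bit
    observed through a binary symmetric channel with error P."""
    mi = 1.0  # H[Y] for a uniform bit
    for leaks in product((0, 1), repeat=d):
        pls = {}
        for y in (0, 1):
            total = 0.0
            for vec in product((0, 1), repeat=d - 1):
                last = y
                for s in vec:
                    last ^= s           # last share fixed by the others
                shares = list(vec) + [last]
                p = 1.0
                for l, s in zip(leaks, shares):
                    p *= (1 - P) if l == s else P
                total += p
            pls[y] = total / (1 << (d - 1))
        pl = (pls[0] + pls[1]) / 2      # Pr[leaks]
        for y in (0, 1):
            if pls[y] > 0:
                mi += 0.5 * pls[y] * log2(pls[y] / (2 * pl))
    return mi

mis = [mi_bits(d) for d in (1, 2, 3)]
assert mis[0] > mis[1] > mis[2] > 0     # MI decays with every extra share
```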

  30. Case #3 • Mult. leaks ⇒ analysis even harder (2-bit example)

  31. Case #3 • Mult. leaks ⇒ analysis even harder (2-bit example) • The ISL assumption leads to much faster evaluation [PR13] • MI(d² partial prod.) ≈ 1.72 · d · MI(d-tuple)

  32. Case #4: putting things together (I) • Full S-box analysis, large d's, C1 & C3 Adv

  33. Case #4: putting things together (I) • Full S-box analysis, large d's, C1 & C3 Adv • C1 → C3: MI increases linearly in the # of tuples • i.e., MI(Y; 𝑳) ≈ (# tuples) · MI(Y(k); l_j(k))^d

  34. Case #4: putting things together (I) • Full S-box analysis, large d's, C1 & C3 Adv • C1 → C3: MI increases linearly in the # of tuples • ∝ the "circuit size" parameter of masking proofs

  35. Case #4: putting things together (II) • Things get (much) worse if shares are re-used • e.g., ISW: each share is used d times: [b₁c₁ b₁c₂ b₁c₃ / b₂c₁ b₂c₂ b₂c₃ / b₃c₁ b₃c₂ b₃c₃]

  36. Case #4: putting things together (II) • Things get (much) worse if shares are re-used • Adv. can average the l_j(k)'s & increase the MI exponentially in d

  37. Case #4: putting things together (II) • Things get (much) worse if shares are re-used • Adv. can average the l_j(k)'s & increase the MI exponentially in d • i.e., MI(Y; 𝑳) ≈ (d · MI(Y(k); l_j(k)))^d

  38. Case #4: putting things together (II) • Things get (much) worse if shares are re-used • Adv. can average the l_j(k)'s & increase the MI exponentially in d • ∝ the "noise condition" of masking security proofs
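The averaging effect behind the noise condition is elementary statistics: if a share is manipulated d times, the adversary gets d noisy samples of the same value, and their mean has noise variance σ_n²/d — i.e., d times the SNR, and hence roughly d times the per-share MI in the low-SNR regime. A quick sketch with made-up parameters:

```python
import random
from statistics import pvariance

random.seed(0)
SIGMA2, D, N = 4.0, 4, 20000   # noise variance, reuse count, trials

# one noisy sample per trial vs. the mean of D repeated noisy samples
single = [random.gauss(0, SIGMA2 ** 0.5) for _ in range(N)]
averaged = [sum(random.gauss(0, SIGMA2 ** 0.5) for _ in range(D)) / D
            for _ in range(N)]

# averaging D repeated samples divides the noise variance by D
assert abs(pvariance(single) - SIGMA2) < 0.3
assert abs(pvariance(averaged) - SIGMA2 / D) < 0.1
```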


  40. Link to the bigger picture • From MI(Y; X, 𝑳) one can directly obtain a bound on the attack's overall complexity • Example for MI(Y; X, 𝑳) = 10⁻⁷
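As a back-of-the-envelope reading of such a bound (the constant c below is an assumption for illustration, not a value from the talk): the data complexity of a divide-and-conquer attack scales inversely with the MI, so MI(Y; X, 𝑳) = 10⁻⁷ translates to on the order of 10⁷ measurements per subkey before any success is possible.

```python
# hedged rule of thumb: measurements needed ~ c / MI, c a small constant
mi = 1e-7
c = 1.0           # assumed constant; the paper derives concrete bounds
n = c / mi        # on the order of 10^7 measurements
print(f"~{n:.0e} measurements")
```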
