
Grow & Cut - Using Representations in Cryptanalysis

Alexander Meurer, Ruhr-Universität Bochum. ECRYPT Summer School on Tools, Mykonos, 2012. Motivation: a generic technique for combinatorial problems; simple (no complicated …)


  1. Knapsack Representation Technique: Expand the haystack H into a larger haystack H'.

  2-5. Expanding the Knapsack Haystack: Recall the classical MitM haystack H = {0,1}^{n/2} × {0}^{n/2} of size 2^{n/2}; it contains the unique needle (x_1, …, x_{n/2}). The new, expanded haystack is H' = { x ∈ {0,1}^n : wt(x) = n/4 }.
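
A quick numeric check (my own sketch, not slide content) makes the "grow" step concrete: the classical haystack has 2^{n/2} elements, while the expanded haystack of weight-n/4 vectors has C(n, n/4) ≈ 2^{0.8113n} elements.

```python
from math import comb, log2

# Compare |H| = 2^(n/2) (classical MitM haystack) with
# |H'| = C(n, n/4) (expanded haystack of weight-n/4 vectors).
for n in (32, 64, 128):
    size_H = 2 ** (n // 2)
    size_Hp = comb(n, n // 4)
    # Print the exponents per bit; the second column approaches H(1/4) ~ 0.8113.
    print(n, log2(size_H) / n, round(log2(size_Hp) / n, 4))
```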

  6. Knapsack Representation Technique: Expanding to H' introduces r representations N_1, …, N_r.

  7-15. Number of Knapsack Representations: Example (n = 8). Denote by x* = (1 0 1 1 0 0 1 0) the solution. Then x* can be represented as
      (1 0 1 0 0 0 0 0) + (0 0 0 1 0 0 1 0)
      (1 0 0 1 0 0 0 0) + (0 0 1 0 0 0 1 0)
      (1 0 0 0 0 0 1 0) + (0 0 1 1 0 0 0 0)
      (0 0 1 1 0 0 0 0) + (1 0 0 0 0 0 1 0)
      (0 0 1 0 0 0 1 0) + (1 0 0 1 0 0 0 0)
      (0 0 0 1 0 0 1 0) + (1 0 1 0 0 0 0 0)
      We can choose 2 out of 4 of the 1-entries, so there are C(4, 2) = 6 representations. In general there are C(n/2, n/4) ≈ 2^{n/2} many representations.
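
The n = 8 example is easy to reproduce. The sketch below (illustrative Python, not part of the slides; the names x_star and reps are mine) enumerates all ways of splitting the weight-4 solution into two weight-2 summands and confirms the count of C(4, 2) = 6.

```python
from itertools import combinations
from math import comb

x_star = (1, 0, 1, 1, 0, 0, 1, 0)          # the solution from the slide, weight 4
ones = [i for i, b in enumerate(x_star) if b]

# A representation splits the 1-positions of x* into two disjoint halves y, z.
reps = []
for half in combinations(ones, len(ones) // 2):
    y = tuple(1 if i in half else 0 for i in range(len(x_star)))
    z = tuple(a - b for a, b in zip(x_star, y))
    reps.append((y, z))

print(len(reps), comb(len(ones), len(ones) // 2))   # 6 6
```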

  16. Knapsack Representation Technique: Examine a 1/r-fraction of H' to find one N_i.

  17-21. The Cutting Phase: We need to filter out a 2^{-n/2} fraction of H' = { x ∈ {0,1}^n : wt(x) = n/4 }. For a modulus M ≈ 2^{n/2} and a random target value R, compute two lists L_1 and L_2 of weight-n/4 vectors whose knapsack labels are congruent to R and S - R modulo M, respectively; then find one pair (x, y) ∈ L_1 × L_2 whose labels sum to S over the integers. Complexity: dominated by the list sizes, about |H'| · 2^{-n/2} ≈ 2^{0.3113n}. But: computing the lists L_i is in fact more expensive!
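
The cutting step can be prototyped for toy parameters. The following sketch is my own illustration of such a modular filter; the names a, S, M, R and the retry loop are assumptions, not taken from the slides. It plants a weight-n/2 solution, keeps only weight-n/4 vectors whose label lies in a prescribed residue class modulo M ≈ 2^{n/2}, and looks for a pair of survivors that recombines to a solution.

```python
from itertools import combinations
import random

def weight_vectors(n, w):
    """All x in {0,1}^n with wt(x) = w."""
    for pos in combinations(range(n), w):
        x = [0] * n
        for i in pos:
            x[i] = 1
        yield tuple(x)

def label(a, x):
    """Knapsack label <a, x>."""
    return sum(ai * xi for ai, xi in zip(a, x))

n = 16
a = [random.randrange(2 ** n) for _ in range(n)]    # knapsack weights
x_star = [1] * (n // 2) + [0] * (n // 2)            # planted weight-n/2 solution
random.shuffle(x_star)
S = label(a, x_star)                                # target sum

M = 2 ** (n // 2)                                   # modulus ~ 2^(n/2)

found = []
while not found:
    R = random.randrange(M)                         # random residue class
    L1 = [x for x in weight_vectors(n, n // 4) if label(a, x) % M == R]
    L2 = [y for y in weight_vectors(n, n // 4) if label(a, y) % M == (S - R) % M]
    # A pair with disjoint supports and exact label sum S recombines to a solution.
    found = [(x, y) for x in L1 for y in L2
             if label(a, x) + label(a, y) == S
             and all(xi + yi <= 1 for xi, yi in zip(x, y))]
print(len(L1), len(L2), len(found))
```

On average roughly one representation survives the filter; if none does for the chosen R, the loop simply retries with a fresh residue.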

  22-24. Computing L_1 - 1st Attempt: Idea: do it in a Meet-in-the-Middle way. Choose a (random) partition of the positions into a left and a right half, choose labels for the partial sums on each half, and merge the two half-lists.
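
A minimal sketch of one way to realize such a MitM computation (my own toy code; build_L1, half_vectors and the parameters a, M, R are hypothetical names, and the slides' exact label maps are not reproduced here): split the positions into two random halves, enumerate weight-n/8 vectors on each half, and join pairs whose partial labels add up to the required residue. Note that this only produces the elements of L_1 whose weight splits evenly across the partition.

```python
from itertools import combinations
from collections import defaultdict
import random

def half_vectors(positions, n, w):
    """Weight-w vectors supported on `positions`, embedded into {0,1}^n."""
    for pos in combinations(positions, w):
        x = [0] * n
        for i in pos:
            x[i] = 1
        yield tuple(x)

def build_L1(a, n, M, R):
    """Balanced part of L_1 = {x : wt(x) = n/4, <a,x> = R (mod M)} via MitM."""
    perm = list(range(n))
    random.shuffle(perm)
    P_left, P_right = perm[: n // 2], perm[n // 2:]   # random partition of the positions

    # Bucket the right-hand partial labels by their residue mod M.
    buckets = defaultdict(list)
    for z in half_vectors(P_right, n, n // 8):
        buckets[sum(a[i] for i in range(n) if z[i]) % M].append(z)

    # Join: left partial label + right partial label = R (mod M).
    L1 = []
    for y in half_vectors(P_left, n, n // 8):
        lam = sum(a[i] for i in range(n) if y[i]) % M
        for z in buckets[(R - lam) % M]:
            L1.append(tuple(yi + zi for yi, zi in zip(y, z)))
    return L1

# Toy usage (n must be divisible by 8 here):
n = 16
a = [random.randrange(2 ** n) for _ in range(n)]
print(len(build_L1(a, n, M=2 ** (n // 2), R=7)))
```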

  25-31. Complexity Analysis: 1st Attempt. List sizes: each half-list consists of the weight-n/8 vectors on n/2 positions, i.e. C(n/2, n/8) ≈ 2^{0.4057n} elements. Complexity* = Max of the list sizes = 2^{0.4057n}. (* if λ_L and λ_R assign uniform labels)
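
The exponent 0.4057 is a binary-entropy computation: each half-list holds the weight-n/8 vectors on n/2 positions, so its size is C(n/2, n/8) ≈ 2^{(n/2)·H(1/4)}. A one-line check (my own arithmetic):

```python
from math import log2

def H(p):
    """Binary entropy in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

# C(n/2, n/8) ~= 2^{(n/2) * H(1/4)}, so the exponent per bit is:
print(H(1 / 4) / 2)   # ~0.4056, matching the 2^{0.4057n} on the slide
```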

  32-33. λ_L and λ_R are almost always good: Recall the partial-label maps λ_L and λ_R from the MitM split. One can show that the fraction of bad knapsacks (a_1, …, a_n) is exponentially small. Example of good and bad distributed labels (n = 20): random a_i versus a_i = 0 for i = 1, …, n/2.
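
The "almost always good" claim can be eyeballed with a tiny experiment in the spirit of the n = 20 example (illustrative code; partial_labels and the specific parameters are mine): count how many distinct partial labels the left half produces for a random knapsack versus the degenerate knapsack with a_i = 0 on that half.

```python
from itertools import combinations
import random

def partial_labels(a, positions, w):
    """Set of partial knapsack labels over all weight-w choices from `positions`."""
    return {sum(a[i] for i in pos) for pos in combinations(positions, w)}

n = 20
left = range(n // 2)

a_good = [random.randrange(2 ** n) for _ in range(n)]                        # random knapsack
a_bad = [0] * (n // 2) + [random.randrange(2 ** n) for _ in range(n // 2)]   # a_i = 0 on the left half

print(len(partial_labels(a_good, left, n // 8)))  # many distinct labels: well spread
print(len(partial_labels(a_bad, left, n // 8)))   # a single label (0): badly spread
```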

  34-35. Computing L_1 - 2nd Attempt: Aim: compute a 2^{-n/2} fraction of H' = { x ∈ {0,1}^n : wt(x) = n/4 }. Idea: use representations again! Decompose x ∈ H' as x = y + z where y, z ∈ H'' and H'' = { y ∈ {0,1}^n : wt(y) = n/8 }. → Every x has C(n/4, n/8) ≈ 2^{n/4} representations (y, z).

  36-40. Computing L_1 - 2nd Attempt (cont.): Choose coprime moduli M^{(1)} and M^{(2)} of size 2^{0.25n}, and choose random target values (of size 2^{0.25n}) for the two modular constraints. Compute the lists of weight-n/8 candidates satisfying these constraints (they can be computed as before by a MitM approach!) and merge them into L_1.
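
The point of choosing M^{(1)} and M^{(2)} coprime is that the two constraints combine, by the Chinese Remainder Theorem, into a single constraint modulo M^{(1)}·M^{(2)} ≈ 2^{n/2}, i.e. exactly the 2^{-n/2} filter we are after. A short check on toy numbers (my own illustration):

```python
from math import gcd

M1, M2 = 251, 256                     # coprime toy moduli (stand-ins for size ~2^{0.25n})
assert gcd(M1, M2) == 1

R1, R2 = 17, 200                      # target residues for the two constraints
# Values satisfying both single-modulus constraints...
both = [v for v in range(M1 * M2) if v % M1 == R1 and v % M2 == R2]
# ...form exactly one residue class modulo M1*M2, i.e. a 1/(M1*M2) filter.
print(both)                           # a single value in [0, M1*M2)
```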

  41-51. Complexity Analysis: 2nd Attempt. List sizes: the randomly partitioned base lists are merged under the two modular constraints (modulo M^{(1)} and M^{(2)}). Complexity = Max of the list sizes = 2^{0.3113n}. Lower bound achieved? No! We forgot the merge costs!
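
The 2^{0.3113n} lower bound is again a binary-entropy computation: the filtered list is a 2^{-n/2} fraction of H', so |L_1| ≈ C(n, n/4)·2^{-n/2} ≈ 2^{(H(1/4) - 1/2)·n} (my own arithmetic):

```python
from math import log2

def H(p):
    """Binary entropy in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

# |L_1| ~= C(n, n/4) * 2^{-n/2} ~= 2^{(H(1/4) - 1/2) n}
print(H(1 / 4) - 1 / 2)   # ~0.3113, the exponent on the slide
```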

  52. Summary (so far): The Representation Technique gives the fastest (randomized) algorithm for random hard knapsacks.

                             Time          Space
      Brute-Force            2^n           1
      MitM                   2^{n/2}       2^{n/2}
      Schroeppel-Shamir      2^{n/2}       2^{n/4}
      Representations        2^{0.3373n}   2^{0.2936n}

  53. Generic Decoding of Random Binary Linear Codes

  54. Recap: Binary Linear Codes. C = random binary [n, k, d] code; n = length, k = dimension, d = minimum distance. Bounded Distance Decoding (BDD): given x = c + e with c ∈ C and w := wt(e) = ⌊(d-1)/2⌋, find e and thus c = x + e.

  55. Why do we care about BDD? BDD is connected to: message recovery for classical McEliece (c = m·G + e); key recovery for Stern's identification scheme (id = H·s); hardness of the LPN problem (Learning Parity with Noise).

  56-57. Comparing Running Times: How to compare the performance of decoding algorithms? Consider the running time T(n, k, d) for a fixed code rate R = k/n. For n → ∞, k and d are related via the Gilbert-Varshamov bound, thus T(n, k, d) = T(n, k) = T(n, R). Compare algorithms by their complexity coefficient F(R), i.e. T(n, R) = 2^{F(R)·n + o(n)}. Goal: minimize F(R)!
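
To make the F(R) framework concrete, here is a small numeric sketch (my own code and assumptions): it inverts the binary entropy function by bisection to obtain d/n = H^{-1}(1-R) from the Gilbert-Varshamov bound, sets w ≈ (d-1)/2 as in BDD, and evaluates the complexity coefficient of a brute-force search over the roughly C(n, w) weight-w candidates.

```python
from math import log2

def H(p):
    """Binary entropy (in bits)."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def H_inv(y, lo=0.0, hi=0.5):
    """Invert H on [0, 1/2] by bisection."""
    for _ in range(60):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if H(mid) < y else (lo, mid)
    return (lo + hi) / 2

def brute_force_F(R):
    """Exponent of enumerating ~C(n, w) weight-w errors, with w/n ~ H^{-1}(1-R)/2."""
    w_over_n = H_inv(1 - R) / 2        # GV bound: d/n = H^{-1}(1-R); BDD: w ~ d/2
    return H(w_over_n)                 # C(n, w) ~= 2^{H(w/n) * n}

for R in (0.3, 0.5, 0.7):
    print(R, round(brute_force_F(R), 4))
```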

  58-59. Syndrome Decoding (BDD): Given x = c + e with c ∈ C and wt(e) = w, find e! Let H be the parity-check matrix and consider the syndrome s := s(x) = H·x = H·(c + e) = H·e. → Find a linear combination of w columns of H matching s, i.e. a weight-w vector e of length n with H·e = s, where H has n - k rows. Brute-Force complexity: T(n, k, d) ≈ C(n, w) (test every weight-w candidate).
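
A direct implementation of this brute-force approach is only a few lines (a toy sketch; the helper names and parameters are mine): enumerate all weight-w error vectors e and test H·e = s over GF(2).

```python
from itertools import combinations
import random

def syndrome(Hm, e):
    """H * e over GF(2); Hm is a list of rows of length n, e a 0/1 vector of length n."""
    return tuple(sum(row[i] * e[i] for i in range(len(e))) % 2 for row in Hm)

def brute_force_decode(Hm, s, n, w):
    """Find a weight-w error e with H*e = s by exhaustive search over C(n, w) candidates."""
    for pos in combinations(range(n), w):
        e = [0] * n
        for i in pos:
            e[i] = 1
        if syndrome(Hm, e) == s:
            return e
    return None

# Toy instance: random parity-check matrix and planted weight-w error.
n, k, w = 16, 8, 2
Hm = [[random.randrange(2) for _ in range(n)] for _ in range(n - k)]
e_true = [0] * n
for i in random.sample(range(n), w):
    e_true[i] = 1
s = syndrome(Hm, e_true)
print(brute_force_decode(Hm, s, n, w))   # a weight-w error with syndrome s (often e_true itself)
```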

  60. Complexity Coefficients (BDD): plot of the brute-force complexity coefficient F(R) as a function of the code rate R.
