Generalization of the Ball-Collision Algorithm
Violetta Weger, joint work with Carmelo Interlando, Karan Khathuria, Nicole Rohrer and Joachim Rosenthal
University of Zurich, Munich, 18 July 2019


  1. Generalization of the Ball-Collision Algorithm. Violetta Weger, joint work with Carmelo Interlando, Karan Khathuria, Nicole Rohrer and Joachim Rosenthal. University of Zurich, Munich, 18 July 2019.

  2. Outline: 1 Motivation, 2 Introduction, 3 Prange's Algorithm, 4 Improvements overview, 5 Ball-collision Algorithm, 6 New directions, 7 Comparison of Complexities, 8 Open questions, 9 Surprise.

  3. Motivation. When proposing a code-based cryptosystem, one has to consider structural attacks and non-structural attacks; for the latter, the attacks to consider are Information Set Decoding (ISD) algorithms.

  5. ISD algorithms and the syndrome decoding problem. 1978, Berlekamp, McEliece and van Tilborg: decoding a random linear code is NP-complete. Problem (Syndrome decoding problem): given a parity check matrix H of a (binary) code of length n and dimension k, a syndrome s = Hx⊺ ∈ F_2^(n−k) and the error correction capacity t, we want to find e ∈ F_2^n of weight t such that s = He⊺.
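The objects in the problem statement can be made concrete in a few lines of Python; this is a sketch over F_2 using dense numpy arrays, with toy parameters n, k, t chosen for illustration (they come from no real system):

```python
import numpy as np

rng = np.random.default_rng(0)

n, k, t = 12, 6, 2                      # toy code parameters
H = rng.integers(0, 2, size=(n - k, n), dtype=np.uint8)  # parity check matrix

# plant an error vector e of Hamming weight t
e = np.zeros(n, dtype=np.uint8)
e[rng.choice(n, size=t, replace=False)] = 1

s = H @ e % 2                           # the syndrome s = He^T over F_2

# the syndrome decoding problem asks: given only H, s and t, find some
# e' of weight t with He'^T = s (here we merely verify the planted one)
assert int(e.sum()) == t
assert np.array_equal(H @ e % 2, s)
```

All arithmetic over F_2 is done by reducing ordinary integer arithmetic mod 2, which is the simplest (not the fastest) way to prototype these objects.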

  6. ISD algorithms and the syndrome decoding problem. The syndrome decoding problem is equivalent to the decoding problem. Problem (Decoding problem): given a generator matrix G of a (binary) code of length n and dimension k, a corrupted codeword c = mG + e ∈ F_2^n and the error correction capacity t, we want to find e ∈ F_2^n of weight t. It is also equivalent to finding a minimum weight codeword, since in C + {0, c} the error vector e is now the minimum weight codeword.

  7. Information set. Notation: let c ∈ F_q^n and A ∈ F_q^(k×n), and let S ⊂ {1, ..., n}. We denote by c_S the restriction of c to the entries indexed by S, and by A_S the columns of A indexed by S. For a code C ⊂ F_q^n, we denote C_S = {c_S | c ∈ C}. Definition (Information set): let C ⊂ F_q^n be a code of dimension k. If I ⊂ {1, ..., n} of size k is such that |C| = |C_I|, then we call I an information set of C.

  9. Information set. Definition (Information set): let G be the k × n generator matrix of C. If I ⊂ {1, ..., n} of size k is such that G_I is invertible, then I is an information set of C. Definition (Information set): let H be the (n − k) × n parity check matrix of C. If I ⊂ {1, ..., n} of size k is such that H_(I^c) is invertible, then I is an information set of C.
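The generator-matrix characterization is easy to test in code: I is an information set iff the k columns G_I have full rank over F_2. Below is a minimal sketch (the helper names rank_gf2 and is_information_set are mine, and indices are 0-based, unlike the 1-based sets in the slides):

```python
import numpy as np

def rank_gf2(M):
    """Rank of a binary matrix over F_2 via Gaussian elimination."""
    M = (M % 2).astype(np.uint8).copy()
    rank, rows, cols = 0, M.shape[0], M.shape[1]
    for c in range(cols):
        pivot = next((r for r in range(rank, rows) if M[r, c]), None)
        if pivot is None:
            continue                      # no pivot in this column
        M[[rank, pivot]] = M[[pivot, rank]]
        for r in range(rows):
            if r != rank and M[r, c]:
                M[r] ^= M[rank]           # clear column c in other rows
        rank += 1
        if rank == rows:
            break
    return rank

def is_information_set(G, I):
    """I (list of k column indices, 0-based) is an information set of
    the code generated by G iff G_I is invertible over F_2."""
    k = G.shape[0]
    return len(I) == k and rank_gf2(G[:, I]) == k

# systematic generator matrix G = (Id_k | M): the first k positions
# always form an information set
k = 4
M = np.array([[1, 0, 1, 1], [0, 1, 1, 0],
              [1, 1, 0, 1], [0, 0, 1, 1]], dtype=np.uint8)
G = np.concatenate([np.eye(k, dtype=np.uint8), M], axis=1)
```

Here M is chosen so that its rows are dependent (row 2 = row 0 + row 1), so the last four positions do not form an information set while the first four do.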

  11. Prange's algorithm. 1962, Prange proposes the first ISD algorithm. Assumption: all t errors occur outside of the information set. Input: H ∈ F_2^((n−k)×n), s ∈ F_2^(n−k), t ∈ N. Output: e ∈ F_2^n with wt(e) = t and He⊺ = s. Steps: 1 Choose an information set I ⊂ {1, ..., n} of size k. 2 Find an invertible matrix U ∈ F_2^((n−k)×(n−k)) such that (UH)_I = A and (UH)_(I^c) = Id_(n−k). 3 If wt(Us) = t, then e_I = 0 and e_(I^c) = Us. 4 Else start over.

  12–14. Prange's algorithm (steps 1–3 worked out). Choose an information set I ⊂ {1, ..., n} of size k; let us assume for simplicity that I = {1, ..., k}. After finding an invertible U with (UH)_I = A and (UH)_(I^c) = Id_(n−k), we have UH = (A | Id_(n−k)), hence UHe⊺ = (A | Id_(n−k)) (e_I, e_(I^c))⊺ = A e_I⊺ + e_(I^c)⊺ = Us. Under the assumption e_I = 0, this yields the condition e_(I^c) = Us.
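The loop on the previous slides can be sketched in Python. Rather than building U explicitly, the version below guesses the complement J = I^c directly and solves H_J · e_J = s over F_2, which is equivalent to computing Us; all function names are mine, and this is an unoptimized illustration, not how real ISD implementations work:

```python
import numpy as np

def gf2_solve(A, b):
    """Solve A x = b over F_2 for square A; return None if A is singular."""
    m = A.shape[0]
    M = np.concatenate([A % 2, b.reshape(-1, 1) % 2], axis=1).astype(np.uint8)
    for c in range(m):
        pivot = next((r for r in range(c, m) if M[r, c]), None)
        if pivot is None:
            return None                   # A is singular over F_2
        M[[c, pivot]] = M[[pivot, c]]
        for r in range(m):
            if r != c and M[r, c]:
                M[r] ^= M[c]
    return M[:, -1]

def prange(H, s, t, rng, max_iter=10_000):
    """One Prange-style search: guess that all t errors lie on the
    n-k positions J outside a random information set."""
    n_k, n = H.shape
    for _ in range(max_iter):
        J = rng.choice(n, size=n_k, replace=False)   # guess of I^c
        x = gf2_solve(H[:, J], s)                    # candidate e restricted to J
        if x is None or int(x.sum()) != t:
            continue                                 # singular guess or wrong weight
        e = np.zeros(n, dtype=np.uint8)
        e[J] = x
        return e
    return None

# toy instance: plant a weight-t error, then recover a weight-t solution
rng = np.random.default_rng(42)
n, k, t = 14, 7, 2
H = rng.integers(0, 2, size=(n - k, n), dtype=np.uint8)
e_true = np.zeros(n, dtype=np.uint8)
e_true[rng.choice(n, size=t, replace=False)] = 1
s = H @ e_true % 2
e_found = prange(H, s, t, rng)
```

Note that the algorithm only promises some weight-t solution of He⊺ = s, which at these toy sizes need not coincide with the planted vector.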

  15. Prange's algorithm. The cost of an ISD algorithm is the product of the cost of one iteration and the inverted success probability, i.e. the average number of iterations needed. The success probability is given by the weight distribution of the error vector. Example (Success probability of Prange's algorithm): C(n−k, t) · C(n, t)^(−1). Remark: brute force ≠ ISD.
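This success probability and the resulting average iteration count are straightforward to evaluate; a sketch with stdlib math.comb (the function name is mine, and n = 1024, k = 524, t = 50 are the classical McEliece parameters, used purely as an example):

```python
from math import comb

def prange_success_probability(n, k, t):
    """Probability that all t error positions avoid a fixed size-k
    information set: C(n-k, t) / C(n, t)."""
    return comb(n - k, t) / comb(n, t)

p = prange_success_probability(1024, 524, 50)
avg_iterations = 1 / p    # average number of Prange iterations
```

The contrast with brute force is visible here: brute force would enumerate on the order of C(n, t) weight-t vectors, while Prange pays C(n, t)/C(n−k, t) iterations of cheap linear algebra.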


  19. Improvements Overview.

  20. Ball-collision Algorithm.

  21–25. Ball-collision Algorithm (one iteration). 1 Choose an information set I; let us assume for simplicity that I = {1, ..., k}. 2 Partition I into X_1 and X_2. 3 Partition the complement Y = {1, ..., n} \ I into Y_1, Y_2, Y_3 (of sizes ℓ_1, ℓ_2, ℓ_3). 4 Bring H into systematic form: UHe⊺ = [ A_1 Id_(ℓ_1+ℓ_2) 0 ; A_2 0 Id_(ℓ_3) ] (e_1, e_2, e_3)⊺ = (s_1, s_2)⊺ = Us. We get the conditions A_1 e_1 + e_2 = s_1 and A_2 e_1 + e_3 = s_2.

  26. Ball-collision Algorithm. Conditions: A_1 e_1 + e_2 = s_1, A_2 e_1 + e_3 = s_2. Assumptions: (a) e_1 has support in I = X_1 ∪ X_2 and weight 2v; (b) e_2 has support in Y_1 ∪ Y_2 and weight 2w; (c) e_3 has support in Y_3 and weight t − 2v − 2w.
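As with Prange, the success probability of one iteration is the probability that the error vector matches assumptions (a)–(c). The sketch below follows the counting in the Bernstein–Lange–Peters analysis, assuming (my simplifying choices, not stated on the slide) an even split |X_1| = |X_2| = k/2 and weight v on each X_i and w on each Y_i:

```python
from math import comb

def ball_collision_success_probability(n, k, l1, l2, t, v, w):
    """Probability that a uniform weight-t error matches (a)-(c):
    weight v on each half of the information set, weight w on each of
    Y1 (size l1) and Y2 (size l2), and the rest on Y3."""
    l3 = n - k - l1 - l2
    good = (comb(k // 2, v) ** 2 * comb(l1, w) * comb(l2, w)
            * comb(l3, t - 2 * v - 2 * w))
    return good / comb(n, t)

# the choice v = w = 0 (and empty Y1, Y2) recovers Prange's probability
assert ball_collision_success_probability(1024, 524, 0, 0, 50, 0, 0) \
    == comb(500, 50) / comb(1024, 50)
```

Allowing v, w > 0 trades a more expensive iteration (the collision search below) against a better chance that the error matches the assumed pattern.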

  29. Ball-collision Algorithm. A_1 e_1 + e_2 = s_1, (1) A_2 e_1 + e_3 = s_2. (2) Condition (1): go through all choices of e_1 and e_2 and check via a collision search whether (1) is satisfied. Condition (2): define e_3 = s_2 − A_2 e_1 and check whether e_3 has weight t − 2v − 2w.
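The collision check for condition (1) can be sketched as a meet-in-the-middle search: tabulate the left halves A_1(x_1 | 0) + (y_1 | 0) and look up the right halves s_1 + A_1(0 | x_2) + (0 | y_2). All names are mine, and I again assume an even split of I with weight v on each half and weight w on each of Y_1, Y_2:

```python
import itertools
import numpy as np

def weight_vecs(length, weight):
    """All binary vectors of a given length and Hamming weight."""
    for support in itertools.combinations(range(length), weight):
        vec = np.zeros(length, dtype=np.uint8)
        vec[list(support)] = 1
        yield vec

def collision_search(A1, s1, l1, l2, v, w):
    """All (e1, e2) with A1 e1 + e2 = s1 over F_2, where e1 has weight
    v on each half of the information set and e2 has weight w on each
    of Y1 (size l1) and Y2 (size l2)."""
    k = A1.shape[1]
    k1 = k // 2                                   # |X1|, even split assumed
    zero_k1 = np.zeros(k1, dtype=np.uint8)
    zero_k2 = np.zeros(k - k1, dtype=np.uint8)
    zero_l1 = np.zeros(l1, dtype=np.uint8)
    zero_l2 = np.zeros(l2, dtype=np.uint8)

    # left table: A1 (x1 | 0) + (y1 | 0) for every choice of x1, y1
    table = {}
    for x1 in weight_vecs(k1, v):
        u = A1 @ np.concatenate([x1, zero_k2]) % 2
        for y1 in weight_vecs(l1, w):
            key = tuple((u + np.concatenate([y1, zero_l2])) % 2)
            table.setdefault(key, []).append((x1, y1))

    # right side: s1 + A1 (0 | x2) + (0 | y2); a key collision means
    # A1 e1 + e2 = s1 for e1 = (x1 | x2), e2 = (y1 | y2)
    solutions = []
    for x2 in weight_vecs(k - k1, v):
        u = (s1 + A1 @ np.concatenate([zero_k1, x2])) % 2
        for y2 in weight_vecs(l2, w):
            key = tuple((u + np.concatenate([zero_l1, y2])) % 2)
            for x1, y1 in table.get(key, []):
                solutions.append((np.concatenate([x1, x2]),
                                  np.concatenate([y1, y2])))
    return solutions

# toy instance with a planted solution of the assumed weight pattern
rng = np.random.default_rng(1)
k, l1, l2, v, w = 6, 2, 2, 1, 1
A1 = rng.integers(0, 2, size=(l1 + l2, k), dtype=np.uint8)
e1 = np.array([1, 0, 0, 0, 1, 0], dtype=np.uint8)
e2 = np.array([0, 1, 1, 0], dtype=np.uint8)
s1 = (A1 @ e1 + e2) % 2
sols = collision_search(A1, s1, l1, l2, v, w)
```

Each surviving pair would then be checked against condition (2) by computing e_3 = s_2 − A_2 e_1 and testing its weight, as the slide describes.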
