Improved Information Set Decoding
Alexander Meurer, Ruhr-Universität Bochum
  1. Improved Information Set Decoding Alexander Meurer, Ruhr-Universität Bochum CBC Workshop 2012, Lyngby

  2. The Asymptotic Playground ● We are interested in asymptotically fastest algorithms ● Prominent example: matrix multiplication ● Measure runtime as O(n^ω) for n x n matrices: naive ω = 3, Strassen ω = 2.808, Coppersmith-Winograd (CW) ω = 2.376, Williams ω = 2.3727 ● Strassen still performs best in practice (for reasonable n) ● This talk: recent (asymptotic) progress in ISD.
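Strassen's recursion mentioned above can be sketched in a few lines; this is an illustrative pure-Python version (all function names are our own), assuming square integer matrices whose size is a power of two. It replaces the 8 block products of the naive method by 7, giving the ω = log2(7) ≈ 2.808 exponent:

```python
def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_sub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def naive_mul(A, B):
    # Schoolbook product: n^3 scalar multiplications.
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

def strassen(A, B, cutoff=2):
    n = len(A)
    if n <= cutoff:
        return naive_mul(A, B)
    h = n // 2
    # Split both matrices into quadrants.
    A11 = [r[:h] for r in A[:h]]; A12 = [r[h:] for r in A[:h]]
    A21 = [r[:h] for r in A[h:]]; A22 = [r[h:] for r in A[h:]]
    B11 = [r[:h] for r in B[:h]]; B12 = [r[h:] for r in B[:h]]
    B21 = [r[:h] for r in B[h:]]; B22 = [r[h:] for r in B[h:]]
    # Strassen's 7 recursive products instead of 8.
    M1 = strassen(mat_add(A11, A22), mat_add(B11, B22), cutoff)
    M2 = strassen(mat_add(A21, A22), B11, cutoff)
    M3 = strassen(A11, mat_sub(B12, B22), cutoff)
    M4 = strassen(A22, mat_sub(B21, B11), cutoff)
    M5 = strassen(mat_add(A11, A12), B22, cutoff)
    M6 = strassen(mat_sub(A21, A11), mat_add(B11, B12), cutoff)
    M7 = strassen(mat_sub(A12, A22), mat_add(B21, B22), cutoff)
    C11 = mat_add(mat_sub(mat_add(M1, M4), M5), M7)
    C12 = mat_add(M3, M5)
    C21 = mat_add(M2, M4)
    C22 = mat_add(mat_add(mat_sub(M1, M2), M3), M6)
    # Reassemble the quadrants.
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bot = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bot
```

The `cutoff` fallback to the naive method mirrors the slide's practical point: below some size the asymptotically slower algorithm wins.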

  9. Recap Binary Linear Codes ● C = random binary [n,k,d] code ● n = length / k = dimension / d = minimum distance ● Bounded Distance Decoding (BDD): given x = c + e with c ∈ C and w := wt(e) ≤ ⌊(d-1)/2⌋, find e and thus c = x + e

  10. Comparing Running Times How to compare the performance of decoding algorithms: ● Running time T(n,k,d) ● Fixed code rate R = k/n ● For n→∞, k and d are related via the Gilbert-Varshamov bound, thus T(n,k,d) = T(n,k) ● Compare algorithms by the complexity coefficient F(k), i.e. T(n,k) = 2^(F(k)·n + o(n)) ● Minimize F(k)!

  12. Syndrome Decoding (BDD) Given x = c + e with c ∈ C and wt(e) = w, find e! ● H = (n-k) x n parity check matrix ● Consider the syndrome s := s(x) = H·x = H·(c + e) = H·e → find a linear combination of w columns of H matching s ● Brute-force complexity: try all C(n,w) weight-w patterns, i.e. T(n,k,d) = C(n,w) up to polynomial factors
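The brute-force column search described above can be sketched directly (an illustrative toy implementation; the function name and the list-of-columns representation of H are our own):

```python
# Brute-force syndrome decoding over F_2: try every weight-w selection
# of columns of H until their XOR equals the syndrome s.
# Cost: up to C(n, w) column sums.
from itertools import combinations

def brute_force_decode(H, s, w):
    """H: list of n columns (each a list of n-k bits), s: syndrome bits.
    Returns an error vector e of weight w with H·e = s, or None."""
    n, r = len(H), len(s)
    for picks in combinations(range(n), w):
        acc = [0] * r
        for j in picks:                     # XOR the chosen columns
            acc = [a ^ b for a, b in zip(acc, H[j])]
        if acc == s:                        # match: rebuild e from picks
            e = [0] * n
            for j in picks:
                e[j] = 1
            return e
    return None
```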

  14. Complexity Coefficients (BDD) F(k) maxima: Brute-Force 0.3868 / Prange 0.0576 / Stern 0.0557
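The Prange entry above can be checked numerically. The sketch below assumes the standard formula for Prange's algorithm, F(R) = H2(ω) − (1−R)·H2(ω/(1−R)) with ω = H2^(-1)(1−R)/2 (errors up to half the Gilbert-Varshamov distance, from the cost C(n,w)/C(n−k,w)); this formula is not stated on the slides, and all function names are our own:

```python
from math import log2

def H2(x):
    """Binary entropy function."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * log2(x) - (1 - x) * log2(1 - x)

def H2_inv(y):
    """Inverse of H2 on [0, 1/2] by bisection."""
    lo, hi = 0.0, 0.5
    for _ in range(60):
        mid = (lo + hi) / 2
        if H2(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def prange_coeff(R):
    # Gilbert-Varshamov: d/n -> H2^{-1}(1-R); BDD decodes w ~ d/2.
    omega = H2_inv(1 - R) / 2
    return H2(omega) - (1 - R) * H2(omega / (1 - R))

# Worst case over the rate R (grid search); peaks near R ~ 0.45.
best = max((prange_coeff(R), R) for R in [i / 1000 for i in range(1, 1000)])
```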

  17. Some Basic Observations for BDD Allowed (linear algebra) transformations: ● Permuting the columns of H does not change the problem (it only permutes the coordinates of e and preserves its weight) ● Elementary row operations on H do not change the problem (multiplying by an invertible (n-k) x (n-k) matrix U turns H·e = s into UH·e = U·s)

  23. Randomized quasi-systematic form ● Work on a randomly column-permuted version of H ● Transform H into quasi-systematic form: the top l rows consist of columns q_1, ..., q_{k+l} followed by zeros, the bottom n-k-l rows consist of a block Q' followed by the identity I_{n-k-l} ● First used in the generalized ISD framework of [FS09]
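The randomized quasi-systematic transformation might be sketched as follows, assuming H is given as a list of rows over F_2 and that swapping rows into the top l positions is permitted; this is our own toy implementation, not the [FS09] one:

```python
import random

def quasi_systematic(H, l, rng=random):
    """Bring H ((n-k) x n, rows of 0/1) into quasi-systematic form
    [[Q1, 0], [Q2, I]] on a randomly column-permuted copy, where the
    identity I occupies the last n-k-l columns of the bottom rows.
    Returns (M, perm); retries with a fresh permutation on failure."""
    r, n = len(H), len(H[0])
    while True:
        perm = list(range(n))
        rng.shuffle(perm)
        M = [[H[i][j] for j in perm] for i in range(r)]
        done = set()                    # row positions already pivoted
        ok = True
        for t in range(r - l):          # build the identity column by column
            row, col = l + t, n - (r - l) + t
            piv = next((i for i in range(r)
                        if i not in done and M[i][col]), None)
            if piv is None:             # dependent column: bad permutation
                ok = False
                break
            M[row], M[piv] = M[piv], M[row]
            done.add(row)
            for i in range(r):          # clear the column in all other rows
                if i != row and M[i][col]:
                    M[i] = [a ^ b for a, b in zip(M[i], M[row])]
        if ok:
            return M, perm
```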

  24. Information Set Decoding "Reducing the brute-force search space by linear algebra."

  25. The ISD Principle ● The quasi-systematic structure of H allows us to split e = (e_1, e_2), where e_1 covers the first k+l coordinates and e_2 the remaining n-k-l ● On the first l coordinates the syndrome equation involves only e_1, so: find all e_1 of weight p matching s on the first l coordinates ● Each such e_1 then determines e_2 (of weight w-p) on the remaining n-k-l coordinates ● The method only recovers particular error patterns: wt(e_1) = p, wt(e_2) = w-p ● If no solution is found: rerandomize H

  31. The ISD Principle ● 1st step (randomization): compute a "fresh" random quasi-systematic form of H ● 2nd step (search): try to find a solution e among all e = (e_1, e_2) with wt(e_1) = p and wt(e_2) = w-p satisfying (q_1, ..., q_{k+l}) · e_1 = s on the first l coordinates ● Overall runtime: T = Pr["good randomization"]^(-1) · T[search]
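The two-step loop above can be illustrated with its simplest instantiation, Prange's original algorithm (p = 0, l = 0): each iteration rerandomizes with a fresh column permutation, and the search step succeeds iff the transformed syndrome already has weight w. A self-contained sketch (all names our own):

```python
import random

def prange_isd(H, s, w, rng):
    """Find e with H·e = s (mod 2) and wt(e) = w, Prange-style.
    H: rows of 0/1; s: syndrome bits; rng: source of permutations."""
    r, n = len(H), len(H[0])
    while True:
        # 1st step (randomization): fresh column permutation, then
        # Gaussian elimination to systematic form [Q | I], carrying s
        # along as an augmented column.
        perm = list(range(n))
        rng.shuffle(perm)
        M = [[H[i][j] for j in perm] + [s[i]] for i in range(r)]
        ok = True
        for t in range(r):
            col = n - r + t
            piv = next((i for i in range(t, r) if M[i][col]), None)
            if piv is None:             # bad permutation: retry
                ok = False
                break
            M[t], M[piv] = M[piv], M[t]
            for i in range(r):
                if i != t and M[i][col]:
                    M[i] = [a ^ b for a, b in zip(M[i], M[t])]
        if not ok:
            continue
        # 2nd step (search): with p = 0 the only candidate is
        # e = (0, s'), which works iff the transformed syndrome s'
        # has weight exactly w.
        s2 = [M[i][n] for i in range(r)]
        if sum(s2) != w:
            continue
        e_perm = [0] * n
        for t in range(r):
            e_perm[n - r + t] = s2[t]
        e = [0] * n
        for j in range(n):              # undo the column permutation
            e[perm[j]] = e_perm[j]
        return e
```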

  33. The ISD Search Step (Notation) ● Find a vector e_1 of weight p with (q_1, ..., q_{k+l}) · e_1 = s on the first l coordinates ● Equivalently: find a selection of p columns among q_1, ..., q_{k+l} summing to the target ● We exploit 1+1 = 0 (arithmetic over F_2) to find e_1 more efficiently!

  36. A Meet-in-the-Middle Approach Find a selection of p columns (out of k+l) summing to the target ● Disjoint partition into a left and a right half: pick p/2 columns from each of the two halves of size (k+l)/2, replacing one search over C(k+l, p) selections by two searches over C((k+l)/2, p/2) selections each

  37. A Meet-in-the-Middle Approach Find a selection of p columns summing to the target ● To find it, run a Meet-in-the-Middle algorithm based on the disjoint left/right partition ● Same F(k) as the recent Ball-Collision decoding [BLP11], as shown in [MMT11]
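The Meet-in-the-Middle search over the disjoint partition can be sketched as follows, assuming p is even and the l x (k+l) block is given as a list of k+l columns (a toy version; a full implementation would keep all colliding pairs, not just one representative per partial sum):

```python
# Meet-in-the-Middle over F_2: store all weight-p/2 partial sums of the
# left half in a dictionary, then enumerate the right half and look up
# the complement of the target. Cost ~ 2·C((k+l)/2, p/2) enumerations
# instead of C(k+l, p).
from itertools import combinations

def mitm_match(Q, t, p):
    """Q: list of k+l columns (tuples of l bits), t: target tuple.
    Returns a sorted list of p column indices whose XOR is t, or None."""
    m = len(Q)
    half = m // 2

    def colsum(idx):
        acc = (0,) * len(t)
        for j in idx:
            acc = tuple(a ^ b for a, b in zip(acc, Q[j]))
        return acc

    left = {}
    for idx in combinations(range(half), p // 2):
        left.setdefault(colsum(idx), idx)   # one representative per sum
    for idx in combinations(range(half, m), p // 2):
        need = tuple(a ^ b for a, b in zip(t, colsum(idx)))
        if need in left:                    # collision: halves combine to t
            return sorted(left[need] + idx)
    return None
```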

  38. Complexity Coefficients (BDD) F(k) maxima: Brute-Force 0.3868 / Prange 0.0576 / Stern 0.0557 / Ball-Collision 0.0556

  39. The Representation Technique [HGJ10] How to find a needle N in a haystack H... ● Expand H into a larger stack H' ● The expansion introduces r many representations N_1, ..., N_r of the needle ● Examine a 1/r fraction of H' to find one N_i
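The representation count r can be checked concretely for the disjoint splitting used in MMT-style ISD: a fixed weight-p vector e_1 over F_2 decomposes as e_1 = x + y with wt(x) = wt(y) = p/2 in exactly C(p, p/2) ways, since each representation chooses which half of e_1's support goes to x. A small demonstration (names our own):

```python
# Count all weight-(p/2) vectors x such that y = e_1 + x also has
# weight p/2; these (x, y) pairs are the representations N_1, ..., N_r.
from itertools import combinations

def count_representations(e1, half_weight):
    n = len(e1)
    count = 0
    for supp in combinations(range(n), half_weight):
        x = [1 if j in supp else 0 for j in range(n)]
        y = [a ^ b for a, b in zip(e1, x)]   # y = e1 + x over F_2
        if sum(y) == half_weight:
            count += 1
    return count
```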
