

slide-1
SLIDE 1

Generalization of the Ball-Collision Algorithm

Violetta Weger joint work with Carmelo Interlando, Karan Khathuria, Nicole Rohrer and Joachim Rosenthal

University of Zurich

7th Code-Based Cryptography Workshop 19 May 2019

Violetta Weger Ball-Collision Algorithm

slide-2
SLIDE 2

Outline

1 Motivation
2 Introduction
3 Prange's Algorithm
4 Improvements overview
5 Ball-collision Algorithm
6 New directions
7 Comparison of Complexities
8 Open questions


slide-3
SLIDE 3

Motivation

When proposing a code-based cryptosystem, one has to consider:
• Structural attacks
• Nonstructural attacks, in particular Information Set Decoding (ISD)



slide-5
SLIDE 5

ISD algorithms and the syndrome decoding problem

1978, Berlekamp, McEliece and van Tilborg: decoding a random linear code is NP-complete.

Problem (Syndrome decoding problem)
Given a parity check matrix H of a (binary) code of length n and dimension k, a syndrome s = Hx⊺ ∈ F_2^(n−k) and the error correction capacity t, we want to find e ∈ F_2^n of weight t such that s = He⊺.
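The problem statement can be made concrete with a brute-force solver over F_2. This is only an illustrative sketch (the parity check matrix H and the toy parameters below are made up, not from the talk): it enumerates all weight-t error vectors, which is exactly the exponential search that ISD algorithms try to beat.

```python
from itertools import combinations

def syndrome(H, e):
    """Compute He^T over F_2: H is a list of rows, e a list of bits."""
    return [sum(h & b for h, b in zip(row, e)) % 2 for row in H]

def brute_force_sdp(H, s, t):
    """Solve the syndrome decoding problem by exhaustive search:
    try every e of length n and weight t until He^T = s.
    Exponential in t -- only illustrates the problem statement."""
    n = len(H[0])
    for support in combinations(range(n), t):
        e = [1 if j in support else 0 for j in range(n)]
        if syndrome(H, e) == s:
            return e
    return None

# Toy instance (hypothetical): n = 6, k = 3, t = 1.
H = [[1, 0, 1, 1, 0, 0],
     [1, 1, 0, 0, 1, 0],
     [0, 1, 1, 0, 0, 1]]
e_true = [0, 0, 1, 0, 0, 0]
s = syndrome(H, e_true)          # s = column 2 of H = [1, 0, 1]
print(brute_force_sdp(H, s, 1))  # recovers [0, 0, 1, 0, 0, 0]
```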


slide-6
SLIDE 6

ISD algorithms and the syndrome decoding problem

  • The syndrome decoding problem is equivalent to the decoding problem:

Problem (Decoding problem)
Given a generator matrix G of a (binary) code of length n and dimension k, a corrupted codeword c = mG + e ∈ F_2^n and the error correction capacity t, we want to find e ∈ F_2^n of weight t.

  • It is also equivalent to finding a minimum weight codeword, since in C + {0, c} the error vector e is now the minimum weight codeword.


slide-7
SLIDE 7

Information set

Notation
Let c ∈ F_q^n and A ∈ F_q^(k×n), and let S ⊂ {1, …, n}. We denote by c_S the restriction of c to the entries indexed by S, and by A_S the columns of A indexed by S. For a code C ⊂ F_q^n, we denote C_S = {c_S | c ∈ C}.

Definition (Information set)
Let C ⊂ F_q^n be a code of dimension k. If I ⊂ {1, …, n} of size k is such that |C| = |C_I|, then we call I an information set of C.



slide-9
SLIDE 9

Information set

Definition (Information set)
Let G be the k × n generator matrix of C. If I ⊂ {1, …, n} of size k is such that G_I is invertible, then I is an information set of C.

Definition (Information set)
Let H be the (n − k) × n parity check matrix of C. If I ⊂ {1, …, n} of size k is such that H_{I^c} is invertible, then I is an information set of C.
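The generator-matrix criterion can be checked computationally: I is an information set precisely when the k × k submatrix G_I has full rank over F_2. A minimal sketch (the generator matrix G and the helper names below are illustrative assumptions, not from the slides):

```python
def rank_f2(M):
    """Rank of a binary matrix over F_2 via Gaussian elimination."""
    M = [row[:] for row in M]
    rank = 0
    for col in range(len(M[0])):
        pivot = next((r for r in range(rank, len(M)) if M[r][col]), None)
        if pivot is None:
            continue
        M[rank], M[pivot] = M[pivot], M[rank]
        for r in range(len(M)):
            if r != rank and M[r][col]:
                M[r] = [a ^ b for a, b in zip(M[r], M[rank])]
        rank += 1
    return rank

def is_information_set(G, I):
    """I (a set of k column indices) is an information set of the code
    generated by G iff the k x k submatrix G_I is invertible over F_2."""
    k = len(G)
    if len(I) != k:
        return False
    GI = [[row[j] for j in sorted(I)] for row in G]
    return rank_f2(GI) == k

# Toy generator matrix of a [6, 3] binary code (hypothetical example).
G = [[1, 0, 0, 1, 1, 0],
     [0, 1, 0, 1, 0, 1],
     [0, 0, 1, 0, 1, 1]]
print(is_information_set(G, {0, 1, 2}))  # True: G_I = Id_3
print(is_information_set(G, {3, 4, 5}))  # False: those columns sum to zero over F_2
```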



slide-11
SLIDE 11

Prange’s algorithm

1962: Prange proposes the first ISD algorithm.
Assumption: all t errors occur outside of the information set.

Input: H ∈ F_2^((n−k)×n), s ∈ F_2^(n−k), t ∈ ℕ.
Output: e ∈ F_2^n with wt(e) = t and He⊺ = s.

1 Choose an information set I ⊂ {1, …, n} of size k.
2 Find an invertible matrix U ∈ F_2^((n−k)×(n−k)) such that (UH)_I = A and (UH)_{I^c} = Id_{n−k}.
3 If wt(Us) = t, then e_I = 0 and e_{I^c} = Us.
4 Else start over.
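The four steps above can be sketched in code. This is a simplified illustration under the stated assumption (the function names, the Gaussian-elimination details and the toy instance are mine, not from the talk): row-reducing the columns outside I to an identity realizes U, and we repeat until wt(Us) = t.

```python
import random

def prange(H, s, t, max_iter=10000):
    """Prange's ISD, simplified sketch: pick a random information set I,
    row-reduce so the columns outside I form Id_{n-k} (this realizes U),
    and succeed when all t errors lie outside I, i.e. wt(Us) = t."""
    n_k, n = len(H), len(H[0])
    k = n - n_k
    for _ in range(max_iter):
        I = set(random.sample(range(n), k))
        Ic = [j for j in range(n) if j not in I]
        # Row-reduce the augmented matrix [H | s] on the columns in I^c.
        M = [H[i][:] + [s[i]] for i in range(n_k)]
        ok = True
        for r, col in enumerate(Ic):
            piv = next((i for i in range(r, n_k) if M[i][col]), None)
            if piv is None:      # the I^c-block is not invertible: new I
                ok = False
                break
            M[r], M[piv] = M[piv], M[r]
            for i in range(n_k):
                if i != r and M[i][col]:
                    M[i] = [a ^ b for a, b in zip(M[i], M[r])]
        if not ok:
            continue
        Us = [M[i][n] for i in range(n_k)]   # transformed syndrome Us
        if sum(Us) == t:                     # wt(Us) = t: success
            e = [0] * n                      # e_I = 0
            for r, col in enumerate(Ic):
                e[col] = Us[r]               # e_{I^c} = Us
            return e
    return None

# Toy instance (hypothetical): [6, 3] code with a unique weight-1 error.
H = [[1, 0, 1, 1, 0, 0],
     [1, 1, 0, 0, 1, 0],
     [0, 1, 1, 0, 0, 1]]
s = [1, 0, 1]                # syndrome of the error supported on {2}
random.seed(1)
print(prange(H, s, 1))       # finds [0, 0, 1, 0, 0, 0], the unique weight-1 solution
```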



slide-14
SLIDE 14

Prange’s algorithm

1 Choose an information set I ⊂ {1, …, n} of size k.
2 Find an invertible matrix U ∈ F_2^((n−k)×(n−k)) such that (UH)_I = A and (UH)_{I^c} = Id_{n−k}.
3 If wt(Us) = t, then e_I = 0 and e_{I^c} = Us.
4 Else start over.

Let us assume for simplicity that I = {1, …, k}. Then UH = ( A  Id_{n−k} ), hence

UHe⊺ = ( A  Id_{n−k} ) ( 0  e_{I^c} )⊺ = Us,

from which we get the condition e_{I^c} = Us.


slide-15
SLIDE 15

Prange’s algorithm

The cost of an ISD algorithm is given by the product of the cost of one iteration and the inverse of the success probability, i.e. the average number of iterations needed. The success probability is given by the weight distribution of the error vector.

Example (Success probability of Prange's algorithm)

$\binom{n-k}{t}\binom{n}{t}^{-1}$.
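As a small illustration, the iteration count implied by this success probability can be computed directly with binomial coefficients (the parameters n, k, t below are a made-up McEliece-like example, not values from the talk):

```python
from math import comb

def prange_success_prob(n, k, t):
    """P(all t errors miss a random size-k information set)
    = C(n-k, t) / C(n, t)."""
    return comb(n - k, t) / comb(n, t)

# Hypothetical McEliece-like parameters, for illustration only.
n, k, t = 1024, 524, 50
p = prange_success_prob(n, k, t)
print(f"success probability ~ {p:.3e}")
print(f"expected number of iterations ~ {1 / p:.3e}")
```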



slide-18
SLIDE 18

Improvements Overview


slide-19
SLIDE 19

Ball-collision Algorithm


slide-20
SLIDE 20

Ball-collision Algorithm

1 Choose an information set I.
2 Partition I into X1 and X2.
3 Partition Y into Y1, Y2, Y3.
4 Bring H in systematic form.

Let us assume for simplicity that I = {1, …, k}.



slide-23
SLIDE 23

Ball-collision Algorithm

1 Choose an information set I.
2 Partition I into X1 and X2.
3 Partition Y into Y1, Y2, Y3.
4 Bring H in systematic form.

$UHe^\top = \begin{pmatrix} A_1 & \mathrm{Id}_{\ell_1+\ell_2} & 0 \\ A_2 & 0 & \mathrm{Id}_{\ell_3} \end{pmatrix} \begin{pmatrix} e_1 \\ e_2 \\ e_3 \end{pmatrix} = \begin{pmatrix} s_1 \\ s_2 \end{pmatrix} = Us.$

We get the conditions A1e1 + e2 = s1 and A2e1 + e3 = s2.



slide-25
SLIDE 25

Ball-collision Algorithm

Conditions:
A1e1 + e2 = s1,
A2e1 + e3 = s2.

Assumptions:
(a) e1 has support in I = X1 ∪ X2 and weight 2v,
(b) e2 has support in Y1 ∪ Y2 and weight 2w,
(c) e3 has support in Y3 and weight t − 2v − 2w.



slide-28
SLIDE 28

Ball-collision Algorithm

A1e1 + e2 = s1, (1)
A2e1 + e3 = s2. (2)

For condition (1): go through all choices of e1 and e2 and check via a collision search whether (1) is satisfied.
For condition (2): define e3 = s2 − A2e1 and check if e3 has weight t − 2v − 2w.
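The collision search for condition (1) is a meet-in-the-middle step: writing e1 = x1 + x2 (weight v on each of X1, X2) and e2 = y1 + y2 (weight w on each of Y1, Y2), the condition becomes A1x1 + y1 = s1 + A1x2 + y2, and the two sides are matched in a hash table. A toy sketch of just this step (all function names and the small instance are illustrative assumptions, not the talk's notation):

```python
from itertools import combinations

def vec_add(u, v):
    return [a ^ b for a, b in zip(u, v)]

def mat_vec(A, x):
    return [sum(a & b for a, b in zip(row, x)) % 2 for row in A]

def weight_vectors(positions, wt, length):
    """All binary vectors of the given length with weight wt on `positions`."""
    for supp in combinations(positions, wt):
        yield [1 if j in supp else 0 for j in range(length)]

def collision_step(A1, s1, X1, X2, Y1, Y2, v, w, k, l):
    """Find all (e1, e2) with A1 e1 + e2 = s1, where e1 has weight v on
    each of X1, X2 and e2 has weight w on each of Y1, Y2, by hashing the
    left sides A1 x1 + y1 and probing with the right sides s1 + A1 x2 + y2."""
    table = {}
    for x1 in weight_vectors(X1, v, k):
        for y1 in weight_vectors(Y1, w, l):
            key = tuple(vec_add(mat_vec(A1, x1), y1))
            table.setdefault(key, []).append((x1, y1))
    for x2 in weight_vectors(X2, v, k):
        for y2 in weight_vectors(Y2, w, l):
            probe = tuple(vec_add(s1, vec_add(mat_vec(A1, x2), y2)))
            for x1, y1 in table.get(probe, []):
                yield vec_add(x1, x2), vec_add(y1, y2)

# Toy instance: k = 4, l = 2, v = 1, w = 0 (so e2 = 0 and A1 e1 = s1).
A1 = [[1, 0, 1, 0],
      [0, 1, 0, 1]]
for e1, e2 in collision_step(A1, [1, 1], [0, 1], [2, 3], [0], [1], 1, 0, 4, 2):
    print(e1, e2)   # two solutions: [0,1,1,0] and [1,0,0,1], each with e2 = [0,0]
```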


slide-29
SLIDE 29

Ball-collision Algorithm

Success probability:

$\binom{\lfloor k/2 \rfloor}{v} \binom{\lceil k/2 \rceil}{v} \binom{\lfloor \ell/2 \rfloor}{w} \binom{\lceil \ell/2 \rceil}{w} \binom{n-k-\ell}{t-2v-2w} \binom{n}{t}^{-1}$.
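This probability (and Prange's, for comparison) is easy to evaluate numerically. A quick sketch; the parameters n, k, t, ℓ, v, w below are illustrative choices rather than optimized values from the talk, and the success probability alone does not decide which algorithm is faster, since the per-iteration cost differs:

```python
from math import comb

def prange_prob(n, k, t):
    return comb(n - k, t) / comb(n, t)

def ball_collision_prob(n, k, t, l, v, w):
    """C(floor(k/2), v) C(ceil(k/2), v) C(floor(l/2), w) C(ceil(l/2), w)
    C(n-k-l, t-2v-2w) / C(n, t)."""
    return (comb(k // 2, v) * comb((k + 1) // 2, v)
            * comb(l // 2, w) * comb((l + 1) // 2, w)
            * comb(n - k - l, t - 2 * v - 2 * w)) / comb(n, t)

# Hypothetical parameters, for illustration only.
n, k, t = 1024, 524, 50
print(prange_prob(n, k, t))
print(ball_collision_prob(n, k, t, l=30, v=2, w=1))
# With l = v = w = 0 the formula collapses to Prange's probability.
```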


slide-30
SLIDE 30

New directions

Idea of overlapping sets:
2009, Finiasz and Sendrier: X1 and X2 can overlap.
2012, Becker, Joux, May and Meurer: one can add redundant errors in the overlap.


slide-31
SLIDE 31

New directions

New parameters:
α: overlap ratio
δ: amount of redundant errors

2009, Finiasz and Sendrier: α = 1/2, δ = 0.
2012, BJMM: α = 1/2, δ > 0.


slide-32
SLIDE 32

Comparison of Complexities

Let F(q, R) be the exponent of the optimized asymptotic complexity. The asymptotic complexity of half-distance decoding at rate R over F_q is then given by q^(F(q,R)·n + o(n)).

 q | q-Stern  | q-Stern-MO | q-Ball-collision | q-BJMM-MO
 2 | 0.05563  | 0.05498    | 0.055573         | 0.04730
 3 | 0.05217  | 0.05242    | 0.052145         | 0.04427
 4 | 0.04987  | 0.05032    | 0.049846         | 0.04294
 5 | 0.04815  | 0.04864    | 0.048140         | 0.03955
 7 | 0.04571  | 0.04614    | 0.045697         | 0.03706
 8 | 0.04478  | 0.04519    | 0.044770         | 0.03593
11 | 0.04266  | 0.04299    | 0.042656         | 0.03335


slide-33
SLIDE 33

Open questions

  • Is partitioning into more sets giving us better asymptotic complexities?

  • With new code-based cryptographic schemes, e.g. using rank-metric codes, can we adapt these ideas to these metrics?

  • Can we use some structure, e.g. of cyclic codes, to improve the ISD algorithms in these cases?


slide-34
SLIDE 34

Thank you!
