SLIDE 1

Generalization of the Ball-Collision Algorithm

Violetta Weger joint work with Carmelo Interlando, Karan Khathuria, Nicole Rohrer and Joachim Rosenthal

University of Zurich

Munich, 18 July 2019

Violetta Weger Ball-Collision Algorithm

SLIDE 2

Outline

1 Motivation
2 Introduction
3 Prange’s Algorithm
4 Improvements overview
5 Ball-collision Algorithm
6 New directions
7 Comparison of Complexities
8 Open questions
9 Surprise

SLIDE 3

Motivation

Proposing a code-based cryptosystem:
- Structural attacks
- Nonstructural attacks
Have to consider Information Set Decoding (ISD).

SLIDE 5

ISD algorithms and the syndrome decoding problem

1978 Berlekamp, McEliece and van Tilborg: decoding a random linear code is NP-complete.

Problem (Syndrome decoding problem)
Given a parity check matrix H of a (binary) code of length n and dimension k, a syndrome s = Hx^⊺ ∈ F_2^{n−k} and the error correction capacity t, we want to find e ∈ F_2^n of weight t such that s = He^⊺.
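As a toy illustration of the problem statement (the instance and function names below are ours, not from the talk), the syndrome decoding problem can be solved by brute force for tiny parameters:

```python
from itertools import combinations

def syndrome(H, e):
    """Compute H e^T over F_2, with H a list of rows and e a list of bits."""
    return tuple(sum(h * x for h, x in zip(row, e)) % 2 for row in H)

def brute_force_sdp(H, s, t):
    """Exhaustive search for e of Hamming weight t with H e^T = s."""
    n = len(H[0])
    for support in combinations(range(n), t):
        e = [1 if i in support else 0 for i in range(n)]
        if syndrome(H, e) == s:
            return e
    return None

# Parity check matrix of the [7, 4] Hamming code; error capacity t = 1.
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
e_true = [0, 0, 0, 0, 1, 0, 0]
s = syndrome(H, e_true)
assert brute_force_sdp(H, s, 1) == e_true  # recovers the planted error
```

For real parameters this search over all weight-t vectors is infeasible, which is exactly why ISD algorithms exist.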

SLIDE 6

ISD algorithms and the syndrome decoding problem

The syndrome decoding problem is equivalent to the decoding problem:

Problem (Decoding problem)
Given a generator matrix G of a (binary) code of length n and dimension k, a corrupted codeword c = mG + e ∈ F_2^n and the error correction capacity t, we want to find e ∈ F_2^n of weight t.

It is also equivalent to finding a minimum weight codeword, since in C + {0, c} the error vector e is now the minimum weight codeword.

SLIDE 7

Information set

Notation
Let c ∈ F_q^n and A ∈ F_q^{k×n}, and let S ⊂ {1, . . . , n}. We denote by c_S the restriction of c to the entries indexed by S, and by A_S the columns of A indexed by S. For a code C ⊂ F_q^n we denote C_S = {c_S | c ∈ C}.

Definition (Information set)
Let C ⊂ F_q^n be a code of dimension k. If I ⊂ {1, . . . , n} of size k is such that |C| = |C_I|, then we call I an information set of C.

SLIDE 9

Information set

Definition (Information set)
Let G be the k × n generator matrix of C. If I ⊂ {1, . . . , n} of size k is such that G_I is invertible, then I is an information set of C.

Definition (Information set)
Let H be the (n − k) × n parity check matrix of C. If I ⊂ {1, . . . , n} of size k is such that H_{I^c} is invertible, then I is an information set of C.

SLIDE 11

Prange’s algorithm

1962 Prange proposes the first ISD algorithm.
Assumption: all t errors occur outside of the information set.

Input: H ∈ F_2^{(n−k)×n}, s ∈ F_2^{n−k}, t ∈ N.
Output: e ∈ F_2^n with wt(e) = t and He^⊺ = s.

1 Choose an information set I ⊂ {1, . . . , n} of size k.
2 Find an invertible matrix U ∈ F_2^{(n−k)×(n−k)} such that (UH)_I = A and (UH)_{I^c} = Id_{n−k}.
3 If wt(Us) = t, then e_I = 0 and e_{I^c} = Us.
4 Else start over.
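The four steps above can be sketched in Python (a toy implementation under our own naming; Gaussian elimination over F_2 on the columns outside I plays the role of the invertible matrix U):

```python
import random

def prange(H, s, t, max_iters=10000):
    """Search for e with wt(e) = t and H e^T = s, all arithmetic over F_2."""
    n_k, n = len(H), len(H[0])
    k = n - n_k
    for _ in range(max_iters):
        I = set(random.sample(range(n), k))        # candidate information set
        Ic = [j for j in range(n) if j not in I]   # its complement
        # Row-reduce [H | s] so the columns indexed by Ic become Id_{n-k};
        # the accumulated row operations are exactly the matrix U.
        M = [list(row) + [s[i]] for i, row in enumerate(H)]
        ok = True
        for r, col in enumerate(Ic):
            piv = next((i for i in range(r, n_k) if M[i][col] == 1), None)
            if piv is None:        # H_{I^c} not invertible: resample I
                ok = False
                break
            M[r], M[piv] = M[piv], M[r]
            for i in range(n_k):
                if i != r and M[i][col] == 1:
                    M[i] = [(a + b) % 2 for a, b in zip(M[i], M[r])]
        if not ok:
            continue
        Us = [M[r][n] for r in range(n_k)]         # the transformed syndrome Us
        if sum(Us) == t:                           # all t errors outside I
            e = [0] * n
            for r, col in enumerate(Ic):
                e[col] = Us[r]
            return e
    return None
```

On the [7, 4] Hamming parity check matrix with t = 1 this recovers the unique weight-1 error after a few random choices of I.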

SLIDE 14

Prange’s algorithm

1 Choose an information set I ⊂ {1, . . . , n} of size k. Let us assume for simplicity that I = {1, . . . , k}.
2 Find an invertible matrix U ∈ F_2^{(n−k)×(n−k)} such that (UH)_I = A and (UH)_{I^c} = Id_{n−k}, i.e.

UH = ( A  Id_{n−k} ),  hence  UHe^⊺ = Ae_I^⊺ + e_{I^c}^⊺ = Us.

3 If wt(Us) = t, then e_I = 0 and e_{I^c} = Us.

From this we get the condition e_{I^c} = Us.

SLIDE 15

Prange’s algorithm

The cost of an ISD algorithm is the product of the cost of one iteration and the inverse of the success probability, i.e. the average number of iterations needed. The success probability is given by the weight distribution assumed for the error vector.

Example (Success probability of Prange’s algorithm)
C(n − k, t) · C(n, t)^{−1}, where C(a, b) denotes the binomial coefficient.

Remark: Brute force = ISD.
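As a quick numerical sketch (function name ours), the success probability and the resulting expected iteration count can be computed directly:

```python
from math import comb

def prange_success_probability(n, k, t):
    """P[all t errors avoid a fixed size-k information set] = C(n-k, t) / C(n, t)."""
    return comb(n - k, t) / comb(n, t)

# Classic McEliece parameters n = 1024, k = 524, t = 50:
p = prange_success_probability(1024, 524, 50)
print(f"success probability per iteration ~ {p:.3e}, "
      f"average iterations ~ {1 / p:.3e}")
```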

SLIDE 19

Improvements Overview

SLIDE 20

Ball-collision Algorithm

SLIDE 24

Ball-collision Algorithm

1 Choose an information set I.
2 Partition I into X1 and X2.
3 Partition Y = I^c into Y1, Y2, Y3 (with ℓ_i = |Y_i|).
4 Bring H into systematic form:

UHe^⊺ = ( A1  Id_{ℓ1+ℓ2}  0       ) (e1, e2, e3)^⊺ = (s1, s2)^⊺ = Us.
        ( A2  0           Id_{ℓ3} )

We get the conditions A1e1 + e2 = s1, A2e1 + e3 = s2.

SLIDE 26

Ball-collision Algorithm

Conditions: A1e1 + e2 = s1, A2e1 + e3 = s2.
Assumptions:
a) e1 has support in I = X1 ∪ X2 and weight 2v
b) e2 has support in Y1 ∪ Y2 and weight 2w
c) e3 has support in Y3 and weight t − 2v − 2w

SLIDE 29

Ball-collision Algorithm

A1e1 + e2 = s1, (1)
A2e1 + e3 = s2. (2)

Condition (1): go through all choices of e1 and e2 and check via a collision search whether (1) is satisfied.
Condition (2): define e3 = s2 − A2e1 and check whether e3 has weight t − 2v − 2w.

SLIDE 30

Hidden details

Collision check
For e1 = x1 + x2 and e2 = y1 + y2 we want to check that A1(x1 + x2) + (y1 + y2) = s1.
1. For all x1 with support in X1 and weight v, and all y1 with support in Y1 and weight w: compute A1x1 + y1.
2. For all x2 with support in X2 and weight v, and all y2 with support in Y2 and weight w: compute s1 − y2 − A1x2.
If A1x1 + y1 = s1 − y2 − A1x2, it follows that A1(x1 + x2) + (y1 + y2) = s1.
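The collision step can be sketched as a meet-in-the-middle search with a hash table (a toy Python sketch; all names and the small instance are ours, and over F_2 subtraction equals addition):

```python
from itertools import combinations

def matvec(A, x):
    """A x^T over F_2."""
    return tuple(sum(a * b for a, b in zip(row, x)) % 2 for row in A)

def add(u, v):
    """Componentwise sum over F_2."""
    return tuple((a + b) % 2 for a, b in zip(u, v))

def collisions(A1, s1, X1, X2, Y1, Y2, v, w, k, l):
    """Yield (e1, e2) with supp(e1) in X1 u X2, supp(e2) in Y1 u Y2,
    weights 2v and 2w, satisfying A1 e1 + e2 = s1."""
    def vectors(positions, weight, length):
        for supp in combinations(positions, weight):
            yield tuple(1 if i in supp else 0 for i in range(length))
    table = {}
    for x1 in vectors(X1, v, k):            # left halves: store A1 x1 + y1
        for y1 in vectors(Y1, w, l):
            table.setdefault(add(matvec(A1, x1), y1), []).append((x1, y1))
    for x2 in vectors(X2, v, k):            # right halves: look up s1 - y2 - A1 x2
        for y2 in vectors(Y2, w, l):
            key = add(add(s1, y2), matvec(A1, x2))
            for x1, y1 in table.get(key, []):
                yield add(x1, x2), add(y1, y2)
```

A match of the two intermediate values is exactly the collision condition of the slide, found in time roughly the size of the two enumeration lists rather than their product.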

SLIDE 31

Hidden details

Collision check: cost
Given A ∈ F_q^{k×n}. Instead of computing Ax^⊺ for all x of a fixed weight w, one can use intermediate sums:
1. Compute Ax1^⊺ for all x1 of weight 1, i.e. compute the scalar multiples of the columns of A. Cost: n(q − 1) log_2(q)^2 bit operations.
2. Compute Ax2^⊺ for all x2 of weight 2, i.e. choose two columns and add their already computed scalar multiples. Cost: C(n, 2)(q − 1)^2 k log_2(q) bit operations.
. . .
Total cost: Σ_{i=2}^{w} C(n, i)(q − 1)^i k log_2(q) + n(q − 1) log_2(q)^2 bit operations.
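The intermediate-sums idea can be sketched in Python (toy instance, function names ours): the weight-1 products are scalar multiples of single columns, and each weight-2 product reuses two stored weight-1 results with one vector addition instead of a full matrix-vector product.

```python
from itertools import combinations

def all_products_weight_le2(A, q):
    """All A x^T over F_q for x of weight 1 and 2, built incrementally."""
    n = len(A[0])
    cols = [tuple(row[j] for row in A) for j in range(n)]
    w1 = {}                      # (position, scalar) -> scalar * column
    for j, col in enumerate(cols):
        for c in range(1, q):
            w1[(j, c)] = tuple((c * a) % q for a in col)
    w2 = {}                      # pair of weight-1 keys -> their vector sum
    for (j1, c1), (j2, c2) in combinations(w1, 2):
        if j1 != j2:             # two distinct columns => weight-2 vector
            w2[((j1, c1), (j2, c2))] = tuple(
                (a + b) % q for a, b in zip(w1[(j1, c1)], w1[(j2, c2)]))
    return w1, w2

A = [[1, 2, 0], [0, 1, 1]]       # toy 2x3 matrix over F_3
w1, w2 = all_products_weight_le2(A, 3)
```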

SLIDE 32

Hidden details

Average number of collisions: assuming a uniform distribution, the average number of collisions between a set S and a set T in F_q^ℓ is given by |S||T| / q^ℓ.

Early abort: to check whether x + y has weight t, we compute the entries and simultaneously track the weight; as soon as weight t + 1 is reached, we abort. An entry in F_q has on average weight (q − 1)/q, hence on average we can abort after computing q/(q − 1) · (t + 1) entries of the solution.

SLIDE 33

New directions

Idea of overlapping sets:
2009 Finiasz and Sendrier: X1 and X2 can overlap
2012 Becker, Joux, May and Meurer: one can add redundant errors in the overlap

SLIDE 34

New directions

New parameters:
α: overlap ratio
δ: amount of redundant errors
2009 Finiasz and Sendrier: α = 1/2, δ = 0
2012 BJMM: α = 1/2, δ > 0

SLIDE 35

Comparison of Complexities

Let F(q, R) be the exponent of the optimized asymptotic complexity. The asymptotic complexity of half-distance decoding at rate R over F_q is then given by q^{F(q,R)n + o(n)}.

q  | q-Stern | q-Stern-MO | q-Ball-collision | q-BJMM-MO
---|---------|------------|------------------|----------
2  | 0.05563 | 0.05498    | 0.055573         | 0.04730
3  | 0.05217 | 0.05242    | 0.052145         | 0.04427
4  | 0.04987 | 0.05032    | 0.049846         | 0.04294
5  | 0.04815 | 0.04864    | 0.048140         | 0.03955
7  | 0.04571 | 0.04614    | 0.045697         | 0.03706
8  | 0.04478 | 0.04519    | 0.044770         | 0.03593
11 | 0.04266 | 0.04299    | 0.042656         | 0.03335

SLIDE 36

Comparison of Complexities

If preferred in base 2: the asymptotic complexity of half-distance decoding at rate R over F_q is then given by 2^{F(q,R) log_2(q) n + o(n)}.

q  | q-Stern | q-Stern-MO | q-Ball-collision | q-BJMM-MO
---|---------|------------|------------------|----------
2  | 0.05563 | 0.05498    | 0.055573         | 0.04730
3  | 0.08269 | 0.08308    | 0.082648         | 0.07017
4  | 0.09974 | 0.10064    | 0.099692         | 0.08588
5  | 0.11180 | 0.11294    | 0.111778         | 0.09183
7  | 0.12832 | 0.12953    | 0.128288         | 0.10404
8  | 0.13434 | 0.13557    | 0.13431          | 0.10779
11 | 0.14758 | 0.14872    | 0.147566         | 0.11537

SLIDE 37

Open questions

- Partitioning more
- Mixture of both directions
- Cyclic structure
- Rank metric ISD
- Other metrics

SLIDE 38

Open questions

- Partitioning more
- Cyclic structure (Tillich and Canto-Torres: speeding up decoding of a code with a non-trivial automorphism group by up to an exponential factor)
- Rank metric ISD
- Other metrics

BONUS ROUND

SLIDE 39

Lee metric

Bonus Round: the Lee metric
Joint work with Annalena Horlemann-Trautmann
Joint work with Marco Baldi, Franco Chiaraluce, Paolo Santini, Massimo Battaglioni

SLIDE 40

Definitions

We will consider the integer residue ring Z/mZ =: Z_m, and for x ∈ Z_m we will always consider its representative in {0, . . . , m − 1}.

Definition (Lee weight)
For x ∈ Z_m, the Lee weight is defined as wt_L(x) = min{x, m − x}. For v ∈ Z_m^n, the Lee weight is defined as
wt_L(v) = Σ_{i=1}^{n} wt_L(v_i).
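The definition transcribes directly into Python (function names ours):

```python
def wt_lee(x, m):
    """Lee weight of x in Z_m: the distance to 0 around the m-cycle."""
    x %= m
    return min(x, m - x)

def wt_lee_vec(v, m):
    """Lee weight of a vector over Z_m: the sum of coordinate Lee weights."""
    return sum(wt_lee(x, m) for x in v)

print([wt_lee(x, 7) for x in range(1, 7)])  # [1, 2, 3, 3, 2, 1]
```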

SLIDE 41

Definitions

Definition (Lee distance)
For x, y ∈ Z_m, the Lee distance is defined as d_L(x, y) = wt_L(x − y) = min{|x − y|, m − |x − y|}. For v, w ∈ Z_m^n, the Lee distance is defined as
d_L(v, w) = Σ_{i=1}^{n} d_L(v_i, w_i).

SLIDE 42

Example

Example (Lee weight in Z_7)
x:       1 2 3 4 5 6
wt_L(x): 1 2 3 3 2 1

SLIDE 43

Quaternary codes

The case m = 4 is the most studied one in ring-linear coding theory.

Definition (Quaternary codes)
We say that C is a quaternary code of length n if C is an additive subgroup of Z_4^n.

Reason: we have a connection between ring-linear coding theory and classical coding theory over finite fields:

SLIDE 44

Gray Isometry

Definition (Gray isometry)
φ : (Z_4, wt_L) → (F_2^2, wt_H),
0 → (0, 0), 1 → (0, 1), 2 → (1, 1), 3 → (1, 0).
The Gray isometry can be extended componentwise to φ : (Z_4^n, wt_L) → (F_2^{2n}, wt_H).

SLIDE 45

Main Differences

Main changes from F_q to Z_4:
- Systematic form ⇒ have to change the algorithm
- Number of vectors of fixed Lee weight ⇒ new concept of intermediate sums and success probability

SLIDE 46

Main Differences

Systematic form
Definition (Parity check matrix)
Let C be a quaternary code of length n and type |C| = 4^{k1} 2^{k2}. C has an (n − k1) × n parity check matrix

H = ( D   E         Id_{n−k1−k2} )
    ( 2F  2Id_{k2}  0            ),

where D ∈ Z_4^{(n−k1−k2)×k1}, E ∈ Z_2^{(n−k1−k2)×k2} and F ∈ Z_2^{k2×k1}.

SLIDE 47

Main Differences

Systematic form
Definition (Generator matrix)
Let C be a quaternary code of length n and type |C| = 4^{k1} 2^{k2}. C has a (k1 + k2) × n generator matrix

G = ( Id_{k1}  A         B  )
    ( 0        2Id_{k2}  2C ),

where A ∈ Z_2^{k1×k2}, B ∈ Z_4^{k1×(n−k1−k2)} and C ∈ Z_2^{k2×(n−k1−k2)}.

SLIDE 48

Main Differences

Number of vectors of fixed Lee weight
The number of vectors in Z_4^n having Lee weight w is

c(n, w) = Σ_{i=0}^{⌊w/2⌋} C(n, i) C(n − i, w − 2i) 2^{w−2i} = C(2n, w).

Hence many concepts can be easily translated using the Gray isometry, replacing the parameter n by 2n.
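The identity c(n, w) = C(2n, w) is easy to confirm numerically for small n, comparing the formula against brute-force enumeration of Z_4^n (function names ours):

```python
from math import comb
from itertools import product

def c_formula(n, w):
    """Number of vectors in Z_4^n of Lee weight w, via the sum formula."""
    return sum(comb(n, i) * comb(n - i, w - 2 * i) * 2 ** (w - 2 * i)
               for i in range(w // 2 + 1))

def c_brute(n, w):
    """Same count by enumerating all of Z_4^n (only feasible for tiny n)."""
    wt = lambda x: min(x, 4 - x)
    return sum(1 for v in product(range(4), repeat=n)
               if sum(wt(x) for x in v) == w)

n = 4
for w in range(2 * n + 1):
    assert c_formula(n, w) == c_brute(n, w) == comb(2 * n, w)
```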

SLIDE 50

GV bound

Again using the Gray isometry, we get a Gilbert-Varshamov bound for quaternary codes:

Proposition (Quaternary GV bound)
Let C be a quaternary code of length n and minimum Lee distance d. Then

|C| ≥ 4^n / Σ_{j=0}^{d−1} C(2n, j).
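A quick numerical sketch of the bound (function name and parameters are illustrative; integer division keeps a valid lower bound):

```python
from math import comb

def quaternary_gv_lower_bound(n, d):
    """Floor of 4^n / sum_{j=0}^{d-1} C(2n, j): a valid lower bound on |C|."""
    return 4 ** n // sum(comb(2 * n, j) for j in range(d))

# d = 1 imposes no constraint, so the whole space Z_4^n qualifies:
assert quaternary_gv_lower_bound(1, 1) == 4
print(quaternary_gv_lower_bound(10, 3))
```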

SLIDE 51

Theoretical Key Sizes

Assuming the quaternary GV bound and half minimum distance error correction, theoretical parameters for the McEliece cryptosystem using quaternary codes:

system     | n    | k    | k1 | k2  | t  | key size | security
quaternary | 425  |      | 55 | 370 | 42 | 20355    | 128
binary     | 425  | 240  |    |     | 42 | 44400    | 62
binary     | 850  | 240  |    |     | 42 | 146400   | 37
Goppa      | 2960 | 2288 |    |     |    | 1537536  | 128

SLIDE 52

Difficulties and open questions

Find a suitable quaternary code: for example, Kerdock codes of length n = 2^m have a generator matrix of size (2 + m) × 2^m. Increasing m to m + 1 gives exponential growth in the key size but adds only 2 bits to the security level of m.

Generalizing ISD algorithms to Z_{p^m}: it is already difficult to compute an exact formula for the number of vectors in Z_{p^m}^n having Lee weight w.

NP-completeness:
1978 Berlekamp, McEliece, van Tilborg: the syndrome decoding problem in the binary case is NP-complete
1994 Barg: the syndrome decoding problem over F_q is NP-complete
2016 Gaborit, Zemor: probabilistic proof of NP-completeness of the rank metric syndrome decoding problem

Other metrics?

SLIDE 53

Thank you!
