List-Decoding of Polar Codes, by Ido Tal and Alexander Vardy (PowerPoint presentation)

SLIDE 1

List-Decoding of Polar Codes

Ido Tal and Alexander Vardy

University of California San Diego

9500 Gilman Drive, La Jolla, CA 92093, USA

SLIDE 2

Problem and goal

Channel polarization is slow. For short to moderate code lengths, polar codes have disappointing performance.

[Figure: bit error rate (10⁻⁵ to 10⁻¹) vs. signal-to-noise ratio Eb/N0 (1.0 to 3.0 dB). Curves: successive cancellation (n = 2048, k = 1024) and WiMAX-standard LDPC (n = 2304).]

In this talk, we present a generalization of the SC decoder which greatly improves performance at short code lengths.

SLIDE 3

Avenues for improvement

From here onward, consider a polar code of length n = 2048 and rate R = 0.5, optimized for a BPSK-AWGN channel with Eb/N0 = 2.0 dB.

[Figure: bit error rate (10⁻⁵ to 10⁻¹) vs. signal-to-noise ratio Eb/N0 (1.0 to 3.0 dB). Curves: successive cancellation (n = 2048, k = 1024) and WiMAX-standard LDPC (n = 2304).]

Why is our polar code under-performing? Is the SC decoder to blame, or are the polar codes themselves weak at this length?

slide-4
SLIDE 4

A critical look at successive cancellation

Successive cancellation decoding

for i = 0, 1, …, n − 1 do
    if ui is frozen then
        set ui accordingly
    else if W_i(y_0^(n−1), u_0^(i−1) | 0) > W_i(y_0^(n−1), u_0^(i−1) | 1) then
        set ui ← 0
    else
        set ui ← 1

Potential weaknesses (which interplay):

• Once an unfrozen bit is set, there is "no going back": a bit that was set at step i cannot be changed at any step j > i.
• Knowledge of the values of future frozen bits is not taken into account.
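The loop above can be sketched in Python. The helper `bit_probability(i, u_prev, b)` is a hypothetical stand-in for W_i(y_0^(n−1), u_0^(i−1) | b) with the channel output y fixed; computing it efficiently is the actual work of the SC decoder, which this sketch leaves abstract.

```python
def sc_decode(n, frozen, frozen_vals, bit_probability):
    """Successive-cancellation decoding skeleton.

    frozen: set of frozen bit indices; frozen_vals[i] gives the known value.
    bit_probability(i, u_prev, b): stand-in for W_i(y, u_prev | b).
    """
    u = []
    for i in range(n):
        if i in frozen:
            u.append(frozen_vals[i])  # frozen bit: value is known in advance
        elif bit_probability(i, u, 0) > bit_probability(i, u, 1):
            u.append(0)               # hard decision; there is no going back
        else:
            u.append(1)
    return u
```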

SLIDES 5–10

List decoding of polar codes

Key idea: Each time a decision on ui is needed, split the current decoding path into two paths: try both ui = 0 and ui = 1.

[Figure: animation of the decoding tree doubling as paths split, with L = 4.]

When the number of paths grows beyond a prescribed threshold L, discard the worst (least probable) paths, and keep only the L best paths. At the end, select the single most likely path.
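The split-and-prune procedure can be sketched as follows. Each path is a (bits, probability) pair, and `bit_score` is a hypothetical stand-in for the per-bit channel probabilities; note this is the naive version, without the sharing tricks discussed later.

```python
def list_decode(n, frozen, frozen_vals, bit_score, L):
    """Split every decoding path at each unfrozen bit; keep the L best."""
    paths = [([], 1.0)]  # (decided bits, path probability)
    for i in range(n):
        nxt = []
        for bits, p in paths:
            if i in frozen:
                b = frozen_vals[i]
                nxt.append((bits + [b], p * bit_score(i, bits, b)))
            else:  # split: try both u_i = 0 and u_i = 1
                for b in (0, 1):
                    nxt.append((bits + [b], p * bit_score(i, bits, b)))
        # prune: keep only the L most probable paths
        paths = sorted(nxt, key=lambda t: t[1], reverse=True)[:L]
    return max(paths, key=lambda t: t[1])[0]  # most likely path at the end
```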

SLIDE 11

List-decoding: complexity issues

The idea of branching while decoding is not new. In fact, a very similar idea was applied to Reed–Muller codes:

  • I. Dumer and K. Shabunov, "Soft-decision decoding of Reed–Muller codes: recursive lists," IEEE Trans. Inform. Theory, vol. 52, pp. 1260–1266, 2006.

Our contribution

We consider list decoding of polar codes. In a naive implementation, the running time would be O(L · n²). We show that decoding can be done in O(L · n log n) time and O(L · n) space.

We will return to the complexity issue later. For now, let’s see how decoding performance is affected.

SLIDES 12–20

Approaching ML performance

[Figure: word error rate (10⁻⁴ to 10⁻¹) vs. signal-to-noise ratio Eb/N0 (1.0 to 3.0 dB), for n = 2048 with list sizes L = 1, 2, 4, 8, 16, 32, and the ML bound.]

List-decoding performance quickly approaches that of maximum-likelihood decoding as a function of list size. Good: our decoder is essentially optimal. Bad: still not competitive with LDPC. Conclusion: we must somehow "fix" the polar code itself.

SLIDE 21

A simple concatenation scheme

Recall that the last step of decoding was "pick the most likely codeword from the list". An error occurs when the transmitted codeword is not the most likely codeword in the list. However, very often, the transmitted codeword is still a member of the list.

We need a "genie" to single out the transmitted codeword. Idea: let there be k + r unfrozen bits. Of these,

• use the first k bits to encode information;
• use the last r unfrozen bits to encode the CRC value of the first k bits.

Then pick the most probable codeword on the list with a correct CRC.
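The CRC-aided selection rule can be sketched as follows; Python's `binascii.crc32` truncated to r bits stands in for the CRC-16 of the slides, purely for illustration.

```python
import binascii

def crc_bits(info_bits, r=16):
    """Hash the information bits to r check bits (an illustrative
    stand-in for the CRC-16 on the slide)."""
    crc = binascii.crc32(bytes(info_bits)) & ((1 << r) - 1)
    return [(crc >> j) & 1 for j in range(r)]

def pick_codeword(candidates, k, r=16):
    """Among list-decoder outputs ordered most probable first, return the
    first whose last r unfrozen bits match the CRC of its first k bits."""
    for word in candidates:
        if word[k:k + r] == crc_bits(word[:k], r):
            return word
    return None  # no candidate passes the CRC: report a decoding failure
```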

SLIDES 22–24

Approaching LDPC performance

Simulation results for a polar code of length n = 2048 and rate R = 0.5, optimized for a BPSK-AWGN channel with Eb/N0 = 2.0 dB.

[Figure: bit error rate (10⁻⁶ to 10⁻¹) vs. signal-to-noise ratio (1.0 to 3.0 dB). Curves: successive cancellation; list decoding (L = 32); WiMAX turbo (n = 960); WiMAX LDPC (n = 2304); list + CRC-16 (n = 2048).]

Polar codes (+CRC) under list decoding are competitive with the best LDPC codes at lengths as short as n = 2048.

SLIDE 25

Quadratic complexity of list decoding

Naive implementation recap

In a naive implementation, the decoding paths are independent: they share no information. Each decoding path has a set of variables associated with it. For example, at stage i, each decoding path must remember the values of the bits u0, u1, …, ui−1. It turns out (as we shall see) that each decoding path has Θ(n) memory associated with it.

When a path is split in two, one decoding path keeps the original variables while the other must be handed a copy of them. Each copy operation takes O(n) time. Thus, the overall time complexity is O(L · n²).

SLIDES 26–39

A closer look at successive cancellation

[Figure: animation over the length-2 polar transform, u0, u1 → x0 → y0, x1 → y1. Each node carries a probability-pair variable and a boolean variable (bit):]

• (P(y0 | x0 = 0), P(y0 | x0 = 1)) and bit x0
• (P(y1 | x1 = 0), P(y1 | x1 = 1)) and bit x1
• (P(y0 y1 | u0 = 0), P(y0 y1 | u0 = 1)) and bit u0
• (P(y0 y1 u0 | u1 = 0), P(y0 y1 u0 | u1 = 1)) and bit u1
SLIDES 40–71

A larger example

Key point: the memory needed to hold the variables at level t is O(n / 2^t).

[Figure: animation over the length-16 decoding graph, with channel outputs y0, …, y15, bits u0, …, u15, and levels labeled 4, 3, 2, 1.]
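Since the per-level sizes halve with t, the total memory over all levels is a geometric series bounded by 2n, i.e. O(n) per decoding path. A quick numerical check (using one indexing convention among several, with levels t = 0, …, log₂ n):

```python
def total_memory(n):
    """Sum the O(n / 2^t) per-level sizes for t = 0 .. log2(n).
    The geometric series stays below 2n, i.e. O(n) overall."""
    m = n.bit_length() - 1  # n = 2^m
    return sum(n // 2**t for t in range(m + 1))

# For n = 16: 16 + 8 + 4 + 2 + 1 = 31, which is below 2 * 16 = 32.
```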

SLIDES 72–103

A larger example

Key point: level t is written to only once every O(2^(m−t)) stages, where n = 2^m.

[Figure: the same length-16 decoding graph, animated to show the write schedule at each level.]

SLIDE 104

Application to list decoding

In a naive implementation, at each split we make a copy of the variables. We can do better:

• At each split, flag the corresponding variables as belonging to both paths.
• Give each path its own private variable (make a copy) only just before that variable is written to.
• If a path is killed, deflag its corresponding variables.

Thus, instead of spending O(n) time on copy operations at each split, we typically perform only a small number of copy operations.

This was a mile-high view; there are many details to fill in (book-keeping, data structures), but the end result is a running time of O(L · n log n) with O(L · n) memory requirements.
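The flag-then-copy-on-write idea can be sketched with a shared reference count: a split only increments a counter, and an array is physically copied just before a write. This is a toy sketch, not the paper's actual data structure.

```python
class LazyArray:
    """Copy-on-write array: splitting a path shares the buffer;
    a physical copy happens only right before a write."""
    def __init__(self, data):
        self._data = data
        self._refs = [1]  # how many paths share this buffer

    def split(self):
        """Flag the buffer as belonging to one more path: O(1), no copy."""
        self._refs[0] += 1
        clone = LazyArray.__new__(LazyArray)
        clone._data, clone._refs = self._data, self._refs
        return clone

    def write(self, i, v):
        if self._refs[0] > 1:       # shared: copy just before writing
            self._refs[0] -= 1
            self._data = list(self._data)
            self._refs = [1]
        self._data[i] = v

    def read(self, i):
        return self._data[i]
```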

SLIDES 105–106

Very recent results

Gross and Sarkis (McGill University) have recently obtained the following results: full independent verification of our simulation data, and a further performance improvement using systematic polar codes.

  • E. Arıkan, "Systematic polar codes," IEEE Communications Letters, accepted for publication.

[Figure: bit error rate (10⁻⁶ to 10⁻¹) vs. signal-to-noise ratio (1.0 to 3.0 dB). Curves: successive cancellation; list decoding (L = 32); WiMAX turbo (n = 960); WiMAX LDPC (n = 2304); list + CRC-16 (n = 2048); systematic + list + CRC-16 (n = 2048).]