List Decoding for Universal Polar Codes - Boaz Shuval and Ido Tal - PowerPoint PPT Presentation

list decoding for universal polar codes



SLIDE 1

List Decoding for Universal Polar Codes

Boaz Shuval and Ido Tal

1/15

SLIDE 2

Overview

Universal polarization:

memoryless setting: Şaşoğlu & Wang, ISIT 2011
settings with memory: Shuval & Tal, ISIT 2019

Error probability analysis based on successive cancellation (SC) decoding
Potential for better results using successive cancellation list (SCL) decoding

This talk:

An efficient implementation of SCL for universal polarization

2/15

SLIDE 3

Setting

Communication with uncertainty:

Encoder: knows the channel belongs to a set of channels
Decoder: knows the channel statistics (e.g., via estimation)

Memory:

In channels
In input distribution

Universal polar code achieves:

Vanishing error probability over the set
Best rate (infimal information rate over the set)

3/15

SLIDE 4

List decoding improves probability of error!

[Figure, built up over several slides: word error rate vs. channel capacity for the BSC, list sizes L = 1, 2, 4, 8, 16, and the SC upper bound; error rates from 10^-5 to 10^0, capacities 0.25 to 0.5]

4/15

SLIDE 8

List decoding improves probability of error!

[Figure: word error rate vs. channel capacity for the BEC, list sizes L = 1, 2, 4, 8, 16, and the SC upper bound]

SLIDE 9

List decoding improves probability of error!

[Figure: word error rate vs. channel capacity for the AWGN channel, list sizes L = 1, 2, 4, 8, 16, and the SC upper bound]

SLIDE 10

SCL in a nutshell

Parameter: list size L
Successively decode u_0, u_1, ..., u_{N-1}
Decoding path: sequence of decoding decisions û^{i-1} = û_0 û_1 ... û_{i-1}

  • If u_i is not frozen, split each decoding path into two paths, û^{i-1} 0 and û^{i-1} 1
  • For each i, keep the L best decoding paths

Final decoding: choose the best path

5/15
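The path bookkeeping on this slide can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: it assumes one fixed decision LLR per bit (a real SC/SCL decoder recomputes each bit's LLR along every path), and `path_metric` is a hypothetical helper.

```python
import math

def path_metric(llr, u):
    # Log-probability of deciding bit u given decision LLR llr,
    # up to a constant: -log(1 + exp(-(1 - 2u) * llr)).
    return -math.log1p(math.exp(-(1 - 2 * u) * llr))

def scl_decode(llrs, frozen, L):
    """Sketch of SCL path management: split on information bits,
    keep the L most likely paths, pick the best path at the end."""
    paths = [([], 0.0)]  # (decisions so far, accumulated log-probability)
    for i, llr in enumerate(llrs):
        candidates = []
        for bits, logp in paths:
            options = (0,) if frozen[i] else (0, 1)  # frozen bits are 0
            for u in options:
                candidates.append((bits + [u], logp + path_metric(llr, u)))
        candidates.sort(key=lambda p: p[1], reverse=True)
        paths = candidates[:L]  # keep the L best decoding paths
    return max(paths, key=lambda p: p[1])[0]  # choose the best path
```

For example, with decision LLRs [2.0, -2.0, 3.0, 1.0], the first bit frozen, and L = 2, the sketch returns the path 0, 1, 0, 0.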

SLIDE 11

The Universal Construction

Concatenation of two transform blocks
[Diagram, built up over two slides: input U^{N-1} passes through a fast transform block and a slow transform block before the channel, with channel input X^{N-1} and output Y^{N-1}]

Slow — for universality
Fast (Arıkan) — for vanishing error probability

Each block contains multiple copies of basic transforms
Polar: certain indices of U^{N-1} are data bits, other indices "frozen" (Honda & Yamamoto 2013)
Universal: data bit locations are the same regardless of channel

6/15

SLIDE 13

Fast Transform vs. Slow Transform

[Diagram, built up over several slides: the fast (Arıkan) transform beside the slow transform, with lateral and medial connections labeled and index labels i and i + 1]

7/15

SLIDE 18

The Rules of Slow

Any medial on left is a child of medial− and medial+ on right
Update of a medial+ on left updates both medial− and medial+ on right

Corollary
Update of a medial+ on left results in a cascade of updates, all the way to the channel on the far right

[Diagram, built up over several slides: adjacent layers of the slow transform, with lateral nodes at the edges and alternating medial−/medial+ node pairs between them]

8/15

SLIDE 25

List Decoding — Fast Transform vs. Slow Transform

Fast Transform
The layer at depth λ gets updated once every 2^λ writes to the U vector
Deep layers get updated infrequently: good news for sharing data between list decoding paths

Slow Transform
The layer at depth λ gets updated every other write to the U vector
All layers get updated frequently: bad news for sharing data between list decoding paths

9/15
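The fast transform's schedule is easy to check numerically. The sketch below is our own illustration, not from the talk: layer depth λ is refreshed after every 2^λ-th write to the U vector, each refresh touching 2^λ indices, so every layer does N index-updates in total, giving the familiar N log N work.

```python
def update_counts(n):
    """For a fast (Arikan) transform with N = 2**n inputs, count how many
    times each layer depth is refreshed over one full pass of writes.
    Depth lam is refreshed at writes i = 2**lam - 1, 2 * 2**lam - 1, ...,
    i.e. once every 2**lam writes."""
    N = 2 ** n
    counts = [0] * (n + 1)
    for i in range(N):
        lam = 0
        while lam <= n and (i + 1) % (2 ** lam) == 0:
            counts[lam] += 1
            lam += 1
    return counts
```

For n = 3 this returns [8, 4, 2, 1]: depth 0 is refreshed on every write, depth 3 only once, and counts[lam] * 2**lam equals N = 8 at every depth.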

SLIDE 27

Is All Hope Lost?

Fast Transform
Layer at depth λ is updated once every 2^λ writes
At a layer of depth λ, 2^λ indices are updated at once

Slow Transform
Layer at depth λ is updated at every other write
At a layer of depth λ, at most two indices are updated

Our Saving Grace
The updates are cyclic!

10/15

SLIDE 29

A Bespoke Data Structure

Aim: mimic the fast transform’s update regime
Frequent small updates vs. infrequent large updates

Bespoke data structure: Cyclic Exponential Array
Main idea: an array comprised of sub-arrays, growing exponentially in length

11/15
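One possible reading of this data structure in Python follows. The exact layout and write rule here are our assumptions; the slides only state that the array is split into exponentially growing sub-arrays and that updates wrap around cyclically.

```python
class CyclicExponentialArray:
    """Sketch (our interpretation, not the authors' design): n sub-arrays
    of lengths 1, 2, 4, ..., 2**(n-1), total capacity 2**n - 1.  A
    sequential write stream wraps cyclically: write number t lands at
    logical position t mod (2**n - 1), inside sub-array
    floor(log2(position + 1)).  After a full cycle, new writes overwrite
    the small sub-arrays first while the large sub-arrays still hold the
    previous cycle's values, as in the example slides."""

    def __init__(self, n):
        self.capacity = (1 << n) - 1
        self.subs = [[None] * (1 << k) for k in range(n)]
        self.t = 0  # total number of writes so far

    def append(self, value):
        pos = self.t % self.capacity       # wrap after a full cycle
        k = (pos + 1).bit_length() - 1     # which sub-array
        self.subs[k][pos - ((1 << k) - 1)] = value
        self.t += 1

    def last_written(self):
        # The most recently written value.
        pos = (self.t - 1) % self.capacity
        k = (pos + 1).bit_length() - 1
        return self.subs[k][pos - ((1 << k) - 1)]
```

Under this interpretation, appending 1 through 15 (one full cycle for n = 4) fills the sub-arrays with [1], [2, 3], [4, 5, 6, 7], [8, ..., 15]; a 16th append overwrites only the length-1 sub-array, while the larger sub-arrays still hold the previous cycle's values.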

SLIDE 30

Example

[Animation, built up over many slides: values 1, 2, ..., 15 are written one by one; the entire array keeps all of them, while the cyclic exponential array stores each value in a sub-array of exponentially growing length, cyclically overwriting older entries so that only the last written value survives in each cell. After a full cycle, new values a, b, c, ... begin overwriting the small sub-arrays while the larger sub-arrays still hold the previous cycle's values.]

And so on...

12/15

SLIDE 51

Theorem

Let:
  • blocklength N
  • channel with or without memory (S states)
  • list size L

Our SCL decoder for universal polar codes has:
  • running time: O(L · S³ · N log N)
  • space complexity: O(L · S² · N)

13/15

SLIDE 52

We made the deadline, thanks to

Computing resources (cores galore)

Amir Baer Goel Samuel

And we parallelized using the excellent

GNU Parallel by Ole Tange¹

Origami artwork by Bernie Peyton

  • ¹ O. Tange (2018): GNU Parallel 2018, March 2018, https://doi.org/10.5281/zenodo.1146014

14/15

SLIDE 53

Something to look forward to

Look out for our code on github!

For the impatient: email us at polarbear@technion.ac.il
  • Universal / non-universal
  • Memory / no memory
  • Polar encoding / decoding with SCL and CRC

15/15