List Decoding for Universal Polar Codes
Boaz Shuval and Ido Tal
1/15
Universal polarization:
memoryless setting: Şaşoğlu & Wang, ISIT 2011
settings with memory: Shuval & Tal, ISIT 2019

Error probability analysis based on successive cancellation (SC) decoding
Potential for better results using successive cancellation list (SCL) decoding
This talk:
An efficient implementation of SCL for universal polarization
2/15
Communication with uncertainty:
Encoder: knows the channel belongs to a set of channels
Decoder: knows the channel statistics (e.g., via estimation)

Memory:
In channels
In input distribution

Universal polar code achieves:
Vanishing error probability over the set
Best rate (infimal information rate over the set)
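In symbols (notation ours, restating the bullet above): for a channel set $\mathcal{W}$, the best achievable universal rate is the infimal information rate,

```latex
R^{*} = \inf_{W \in \mathcal{W}} I(W)
```

where $I(W)$ denotes the information rate of channel $W$ (mutual information in the memoryless case).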
3/15
[Plots: word error rate vs. capacity for BSC, BEC, and AWGN channels under SCL decoding with list sizes L = 1, 2, 4, 8, 16, compared against the SC upper bound]
4/15
Parameter: list size L
Successively decode u_0, u_1, ..., u_{N-1}
Decoding path: sequence of decoding decisions û^{i-1} = û_0 û_1 ... û_{i-1}
Final decoding: choose best path
5/15
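As a rough illustration of the path bookkeeping just described, here is a toy SCL skeleton (our own sketch: the path metric below is a stand-in, not the paper's likelihood computation, and the names are ours).

```python
# Toy sketch of SCL path management (our illustration; the path metric is a
# placeholder, not the actual polar-code likelihood computation).

def scl_skeleton(N: int, L: int, path_metric):
    """Keep the L best-scoring prefixes û_0 ... û_{i-1} at every step."""
    paths = [((), 0.0)]                      # (decisions so far, score)
    for _ in range(N):
        candidates = []
        for prefix, score in paths:
            for bit in (0, 1):               # each decision forks a path in two
                new_prefix = prefix + (bit,)
                candidates.append((new_prefix, score + path_metric(new_prefix)))
        # Prune: keep only the L best-scoring paths.
        candidates.sort(key=lambda ps: ps[1], reverse=True)
        paths = candidates[:L]
    # Final decoding: choose the best surviving path.
    return max(paths, key=lambda ps: ps[1])[0]

# Example with a toy metric that rewards bits matching a fixed word:
target = (1, 0, 1, 1)
metric = lambda prefix: 1.0 if prefix[-1] == target[len(prefix) - 1] else 0.0
print(scl_skeleton(4, L=2, path_metric=metric))  # (1, 0, 1, 1)
```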
Concatenation of two transform blocks

[Diagram: the transform from U^{N-1} to X^{N-1} concatenates slow and fast blocks; X^{N-1} enters the channel, producing Y^{N-1}; the overall scheme is universal]

Slow — for universality
Fast (Arıkan) — for vanishing error probability
Each block contains multiple copies of basic transforms
Polar: certain indices of U^{N-1} are data bits, depending on the channel
Universal: data bit locations are the same regardless of the channel
6/15
Fast (Arıkan) transform vs. slow transform

[Diagrams of the two basic transforms; their outer (lateral) and inner (medial) positions are marked, along with index labels i and i + 1]
7/15
Any medial on the left is a child of a medial − and a medial + on the right

[Diagram: layers of lateral and medial −/+ nodes; updating a medial + propagates rightward through successive layers]

Corollary
Update of a medial + on the left results in a cascade of updates, all the way to the channel on the far right
8/15
Fast transform: the layer at depth λ gets updated once every 2^λ writes to the U vector
Deep layers get updated infrequently: good news for sharing data between list decoding paths

Slow transform: the layer at depth λ gets updated at every other write to the U vector
All layers get updated frequently: bad news for sharing data between list decoding paths
9/15
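The fast transform's update schedule can be tabulated directly. A minimal sketch, assuming the standard SC update rule that layer λ is written on the i-th write exactly when 2^λ divides i + 1 (an assumption consistent with "once every 2^λ writes"):

```python
# Count how often each layer of a fast (Arıkan) transform is written
# while decoding u_0 ... u_{N-1}, under the assumed rule that
# layer lam is touched on write i iff (i + 1) is divisible by 2**lam.

def layers_updated(i: int, n: int):
    """Layers touched by the i-th write (0-indexed) in an N = 2**n transform."""
    return [lam for lam in range(n) if (i + 1) % (2 ** lam) == 0]

n = 4                          # N = 16
counts = [0] * n
for i in range(2 ** n):
    for lam in layers_updated(i, n):
        counts[lam] += 1

# Layer lam is written N / 2**lam times in total: deep layers are rare.
print(counts)  # [16, 8, 4, 2]
```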
Fast transform: layer at depth λ is updated once every 2^λ writes; at a layer of depth λ, 2^λ indices are updated at a time
Slow transform: layer at depth λ is updated at every other write; at a layer of depth λ, at most two indices are updated
Our Saving Grace
The updates are cyclic!
10/15
Aim: mimic the fast transform's update regime
Frequent small updates → infrequent large updates

Bespoke data structure: Cyclic Exponential Array
Main idea: an array composed of sub-arrays whose lengths grow exponentially
11/15
Entire array vs. cyclic exponential array

[Animation: values 1, 2, 3, ... are written to the entire array and mirrored in the cyclic exponential array; each sub-array is twice the length of the previous one and is overwritten cyclically, so at any moment it holds the most recently written values, with older entries (a, b, c, ...) left over from the previous cycle]
12/15
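The sub-array mechanics of the example above can be sketched in code. This is our own toy illustration under the stated assumption of cyclically overwritten sub-arrays of lengths 1, 2, 4, 8, ...; the class and method names are ours, not the authors' implementation.

```python
# Toy sketch (our illustration): a "cyclic exponential array" keeps
# sub-arrays of lengths 1, 2, 4, 8, ..., each overwritten cyclically,
# so a write is cheap and only logarithmically many sub-arrays exist.

class CyclicExponentialArray:
    def __init__(self, num_subarrays: int):
        # Sub-array j has length 2**j and its own cyclic write position.
        self.subarrays = [[None] * (2 ** j) for j in range(num_subarrays)]
        self.positions = [0] * num_subarrays

    def write(self, j: int, value):
        """Write `value` into sub-array j at its current cyclic position."""
        sub = self.subarrays[j]
        sub[self.positions[j]] = value
        self.positions[j] = (self.positions[j] + 1) % len(sub)

    def last_written(self, j: int):
        """Most recently written value of sub-array j."""
        sub = self.subarrays[j]
        return sub[(self.positions[j] - 1) % len(sub)]


cea = CyclicExponentialArray(4)   # sub-arrays of lengths 1, 2, 4, 8
for i in range(1, 6):             # five writes into sub-array 2 (length 4)
    cea.write(2, i)
print(cea.last_written(2))        # 5
print(cea.subarrays[2])           # [5, 2, 3, 4]  (write 5 wrapped around)
```

The wrap-around on the fifth write mirrors the "previous cycle" values in the animation: stale entries are simply left in place until overwritten.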
Let:
blocklength N
channel with or without memory (S states)
list size L

Theorem
Our SCL decoder for universal polar codes has:
running time: O(L · S^3 · N log N)
space complexity: O(L · S^2 · N)
13/15
Computing resources (cores galore)
Amir Baer Goel Samuel
And we parallelized using the excellent GNU Parallel by Ole Tange (https://doi.org/10.5281/zenodo.1146014)
Origami artwork by Bernie Peyton
14/15
For the impatient: email us at polarbear@technion.ac.il
Universal / Non-universal
Memory / No memory
Polar encoding/decoding with SCL and CRC
15/15