Algorithmic and Combinatorial Methods to Discover Low Weight Pseudo-Codewords


  1. Algorithmic and Combinatorial Methods to Discover Low Weight Pseudo-Codewords. Shashi Kiran Chilappagari (1), Bane Vasic (1), Mikhail Stepanov (2), Michael Chertkov (3). (1) Dept. of Elec. and Comp. Eng., University of Arizona, Tucson, AZ. (2) Dept. of Mathematics, University of Arizona, Tucson, AZ. (3) T-4, Theoretical Division, LANL, Los Alamos, NM. Physics of Algorithms Workshop, Santa Fe, NM, September 2, 2009.

  2. LDPC Code Representation. Codes based on sparse bipartite graphs: the Tanner graph with variable node set V and check node set C. Both $(d_v, d_c)$-regular bipartite graphs and irregular bipartite graphs are considered.
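To make the Tanner-graph view concrete, here is a minimal sketch (not from the talk) that derives the check-node and variable-node neighborhoods from a parity-check matrix; the matrix H below is an assumed $(d_v, d_c) = (2, 3)$-regular toy example, not a code discussed in the presentation.

```python
# Rows of H are check nodes, columns are variable nodes, and edges of the
# Tanner graph connect a check node to the variable nodes it involves.
# H is an illustrative (2, 3)-regular toy matrix, not from the talk.
import numpy as np

H = np.array([[1, 1, 1, 0, 0, 0],
              [1, 0, 0, 1, 1, 0],
              [0, 1, 0, 1, 0, 1],
              [0, 0, 1, 0, 1, 1]])

# Tanner-graph neighborhoods: N(j) for check nodes, N(i) for variable nodes.
check_neighbors = {j: np.flatnonzero(H[j]).tolist() for j in range(H.shape[0])}
variable_neighbors = {i: np.flatnonzero(H[:, i]).tolist() for i in range(H.shape[1])}

print(check_neighbors)     # each check node has d_c = 3 variable-node neighbors
print(variable_neighbors)  # each variable node has d_v = 2 check-node neighbors
```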

  3. Preliminaries. Decoding algorithms: belief propagation (BP), the linear programming (LP) decoder, Gallager-type decoders, bit-flipping decoders. Asymptotic analysis: density evolution, EXIT charts, expansion arguments (which can be applied to finite-length codes too). Error floor: an abrupt degradation in performance in the high-SNR region, due to the sub-optimality of the decoder.

  4. Error Surface. [Figure: a sketch in the noise plane (axes noise 1 and noise 2). The error surface (ES), which is decoding specific, separates the region around the point 0 where decoding succeeds (no errors) from the region where it fails (errors); the marked point is the point on the ES closest to "0".]

  5. The LP Decoder (Feldman et al. '05). Channel: a codeword $y = (y_1, \ldots, y_n)$ is transmitted over a symmetric memoryless channel and received as $\hat{y} = (\hat{y}_1, \ldots, \hat{y}_n)$. The log-likelihood ratio (LLR) corresponding to variable node $i$ is $\gamma_i = \log\left(\frac{\Pr(\hat{y}_i \mid y_i = 0)}{\Pr(\hat{y}_i \mid y_i = 1)}\right)$. ML-LP decoder: let poly(C) be the codeword polytope whose vertices correspond to codewords in C; find $f = (f_1, \ldots, f_n)$ minimizing the cost function $\sum_{i \in V} \gamma_i f_i$ subject to the constraint $f \in \mathrm{poly}(C)$.
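As a small illustration of the LLR definition above, the following sketch computes $\gamma$ for a BSC; the crossover probability eps and the received vector are assumed values chosen for the example, not parameters from the talk.

```python
import math

def bsc_llr(y_hat, eps):
    """gamma_i = log( Pr(y_hat_i | y_i = 0) / Pr(y_hat_i | y_i = 1) ) on a BSC."""
    pos, neg = math.log((1 - eps) / eps), math.log(eps / (1 - eps))
    return [pos if bit == 0 else neg for bit in y_hat]

print(bsc_llr([0, 1, 0, 0], eps=0.1))   # -> [2.197..., -2.197..., 2.197..., 2.197...]
```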

  6. Relaxed Polytope. Associating $(f_1, \ldots, f_n)$ with the bits of the code, we require $0 \le f_i \le 1$ for all $i \in V$ (1). For every check node $j$, let $N(j)$ denote the set of variable nodes which are neighbors of $j$, and let $E_j = \{T \subseteq N(j) : |T| \text{ is even}\}$. The polytope $Q_j$ associated with the check node $j$ is defined as the set of points $(f, w)$ for which the following constraints hold: $0 \le w_{j,T} \le 1$ for all $T \in E_j$ (2); $\sum_{T \in E_j} w_{j,T} = 1$ (3); $f_i = \sum_{T \in E_j,\, T \ni i} w_{j,T}$ for all $i \in N(j)$ (4). Now, let $Q = \cap_j Q_j$ be the set of points $(f, w)$ such that (1)-(4) hold for all $j \in C$. The Linear Code Linear Program (LCLP) can be stated as $\min_{(f,w)} \sum_{i \in V} \gamma_i f_i$, s.t. $(f, w) \in Q$.
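A minimal sketch of how the LCLP relaxation above can be set up and solved with an off-the-shelf LP solver (scipy.optimize.linprog). The function name lp_decode and the explicit enumeration of the auxiliary variables $w_{j,T}$ are my own scaffolding, not code from the talk, and enumerating all even subsets of $N(j)$ is only practical for low-degree check nodes.

```python
from itertools import combinations

import numpy as np
from scipy.optimize import linprog


def lp_decode(H, gamma):
    """Solve the LCLP: minimize sum_i gamma_i * f_i over the relaxed polytope Q."""
    m, n = H.shape
    # Auxiliary variables w_{j,T}: one per even-sized subset T of N(j).
    w_index = []                                   # list of (check j, frozenset T)
    for j in range(m):
        Nj = np.flatnonzero(H[j]).tolist()
        for size in range(0, len(Nj) + 1, 2):
            for T in combinations(Nj, size):
                w_index.append((j, frozenset(T)))
    num_vars = n + len(w_index)                    # ordering: (f_1..f_n, w_{j,T}, ...)

    A_eq, b_eq = [], []
    # Constraint (3): sum_{T in E_j} w_{j,T} = 1 for every check node j.
    for j in range(m):
        row = np.zeros(num_vars)
        for k, (jj, _) in enumerate(w_index):
            if jj == j:
                row[n + k] = 1.0
        A_eq.append(row)
        b_eq.append(1.0)
    # Constraint (4): f_i = sum_{T in E_j, T containing i} w_{j,T} for every edge (i, j).
    for j in range(m):
        for i in np.flatnonzero(H[j]).tolist():
            row = np.zeros(num_vars)
            row[i] = 1.0
            for k, (jj, T) in enumerate(w_index):
                if jj == j and i in T:
                    row[n + k] = -1.0
            A_eq.append(row)
            b_eq.append(0.0)

    # Constraints (1) and (2): every f_i and every w_{j,T} lies in [0, 1].
    bounds = [(0.0, 1.0)] * num_vars
    # Objective: only the f variables carry cost; the w variables have zero cost.
    c = np.concatenate([np.asarray(gamma, dtype=float), np.zeros(len(w_index))])
    res = linprog(c, A_eq=np.vstack(A_eq), b_eq=b_eq, bounds=bounds, method="highs")
    return res.x[:n]                               # the pseudo-codeword (f_1, ..., f_n)
```

For a tiny code (e.g. the toy H above together with an LLR vector such as the BSC sketch), the returned vector lies in $[0,1]^n$; if it is integral it is a codeword, otherwise it is a fractional pseudo-codeword.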

  7. Features of LP Decoding. ML certificate: an integer solution is a codeword, so if the LP decoder outputs a codeword, then the ML decoder would also output the same codeword. Other advantages of LP decoding: discrete output (no numerical issues), attractive from the analysis point of view, systematic sequential improvement. Assumptions: the channel is the binary symmetric channel (BSC), and the all-zero codeword is transmitted.

  8. Pseudo-codewords. A pseudo-codeword $p = (p_1, \ldots, p_n)$, where $0 \le p_i \le 1$ for all $i \in V$, is simply the output of the LP decoder. Cost: for the BSC, $\gamma_i = 1$ if $r_i = 0$ and $\gamma_i = -1$ if $r_i = 1$. The output of the LP decoder on a received vector $r$ is the pseudo-codeword $p$ with the minimum associated cost, where $\mathrm{cost}(r, p) = \sum_{i \notin \mathrm{supp}(r)} p_i - \sum_{i \in \mathrm{supp}(r)} p_i$. The cost associated with decoding any vector $r$ to the all-zero codeword is zero.

  9. Example. Consider a code of length 7 and let $p = (1, 0, \tfrac{1}{2}, \tfrac{2}{3}, \tfrac{1}{3}, 0, \tfrac{1}{2})$ be a pseudo-codeword. Let the received vector be $r = (1, 1, 0, 0, 0, 0, 0)$. Then $\gamma = (-1, -1, 1, 1, 1, 1, 1)$, $\mathrm{cost}(r, 0) = 0$ and $\mathrm{cost}(r, p) = 1$. If $r = (1, 0, 0, 1, 0, 0, 0)$, then $\gamma = (-1, 1, 1, -1, 1, 1, 1)$, $\mathrm{cost}(r, 0) = 0$ and $\mathrm{cost}(r, p) = -\tfrac{1}{3}$.
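The arithmetic in this example is easy to check directly; the following sketch (0-based indices, unlike the slide) implements the cost expression from the previous slide and reproduces the two values above.

```python
def cost(r, p):
    """cost(r, p) = sum_{i not in supp(r)} p_i  -  sum_{i in supp(r)} p_i."""
    return sum(pi if ri == 0 else -pi for ri, pi in zip(r, p))

p = [1, 0, 1/2, 2/3, 1/3, 0, 1/2]          # the pseudo-codeword from this slide
print(cost([1, 1, 0, 0, 0, 0, 0], p))      # -> 1.0
print(cost([1, 0, 0, 1, 0, 0, 0], p))      # -> -0.333...  (i.e. -1/3)
print(cost([1, 1, 0, 0, 0, 0, 0], [0]*7))  # cost(r, 0) = 0 for any r
```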

  10. Weight and Median. Weight of a pseudo-codeword (Forney et al. '01): let $p = (p_1, \ldots, p_n)$ be a pseudo-codeword distinct from the all-zero codeword, and let $e$ be the smallest number such that the sum of the $e$ largest $p_i$'s is at least $\left(\sum_{i \in V} p_i\right)/2$. Then the BSC pseudo-codeword weight of $p$ is $w_{\mathrm{BSC}}(p) = 2e$ if the $e$ largest $p_i$'s sum to exactly $\left(\sum_{i \in V} p_i\right)/2$, and $w_{\mathrm{BSC}}(p) = 2e - 1$ if they sum to more than $\left(\sum_{i \in V} p_i\right)/2$. Median: the median noise vector (or simply the median) $M(p)$ of a pseudo-codeword $p$ distinct from the all-zero codeword is a binary vector with support $S = \{i_1, i_2, \ldots, i_e\}$ such that $p_{i_1}, \ldots, p_{i_e}$ are the $e$ $(= \lceil w_{\mathrm{BSC}}(p)/2 \rceil)$ largest components of $p$.
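A small sketch of these two definitions, checked against the slide-9 pseudo-codeword: its BSC weight is 3 ($e = 2$, since $1 + 2/3$ already exceeds half of $\sum_i p_i = 3$) and its median is supported on positions 1 and 4 (1-based). The function names are my own.

```python
def w_bsc(p):
    """BSC pseudo-codeword weight of a non-zero pseudo-codeword p (Forney et al. '01)."""
    half, acc = sum(p) / 2.0, 0.0
    for e, x in enumerate(sorted(p, reverse=True), start=1):
        acc += x
        if acc >= half:
            # 2e when the e largest components sum to exactly half, else 2e - 1
            return 2 * e if abs(acc - half) < 1e-12 else 2 * e - 1


def median(p):
    """Binary vector supported on the e = ceil(w_bsc(p)/2) largest components of p."""
    e = (w_bsc(p) + 1) // 2
    largest = set(sorted(range(len(p)), key=lambda i: p[i], reverse=True)[:e])
    return [1 if i in largest else 0 for i in range(len(p))]


p = [1, 0, 1/2, 2/3, 1/3, 0, 1/2]      # the pseudo-codeword from slide 9
print(w_bsc(p))                        # -> 3   (e = 2, since 1 + 2/3 > 3/2)
print(median(p))                       # -> [1, 0, 0, 1, 0, 0, 0]
```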

  11. Instanton. The BSC instanton $i$ is a binary vector with the following properties: (1) there exists a pseudo-codeword $p$ such that $\mathrm{cost}(i, p) \le \mathrm{cost}(i, 0) = 0$; (2) for any binary vector $r$ such that $\mathrm{supp}(r) \subset \mathrm{supp}(i)$, there exists no pseudo-codeword with $\mathrm{cost}(r, p) \le 0$. The size of an instanton is the cardinality of its support. Theorem: let $i$ be an instanton. Then for any binary vector $r$ such that $\mathrm{supp}(i) \subset \mathrm{supp}(r)$, there exists a pseudo-codeword $p$ satisfying $\mathrm{cost}(r, p) \le 0$.

  12. Instanton Search Algorithm. Initialization ($l = 0$) step: initialize to a binary input vector $r$ containing a sufficient number of flips so that the LP decoder decodes it into a pseudo-codeword different from the all-zero codeword. Apply the LP decoder to $r$ and denote the pseudo-codeword output of LP by $p_1$. $l \ge 1$ step: take the pseudo-codeword $p_l$ (the output of the $(l-1)$ step) and calculate its median $M(p_l)$. Apply the LP decoder to $M(p_l)$ and denote the output by $p^M_l$. Only two cases arise. (i) $w_{\mathrm{BSC}}(p^M_l) < w_{\mathrm{BSC}}(p_l)$: then $p_{l+1} = p^M_l$ becomes the $l$-th step output / $(l+1)$ step input. (ii) $w_{\mathrm{BSC}}(p^M_l) = w_{\mathrm{BSC}}(p_l)$: let the support of $M(p_l)$ be $S = \{i_1, \ldots, i_{k_l}\}$, let $S_{i_t} = S \setminus \{i_t\}$ for some $i_t \in S$, and let $r_{i_t}$ be a binary vector with support $S_{i_t}$. Apply the LP decoder to all $r_{i_t}$ and denote the $i_t$-output by $p_{i_t}$. If $p_{i_t} = 0$ for all $i_t$, then $M(p_l)$ is the desired instanton and the algorithm halts. Else, some $p_{i_t} \ne 0$ becomes the $l$-th step output / $(l+1)$ step input. A Python sketch of this loop is given below.
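A sketch of the ISA loop described above. The LP decoder, the weight, and the median routines are passed in as callables (for instance the lp_decode, w_bsc, and median sketches from the earlier slides); this is only a transcription of the steps on this slide, not the authors' implementation.

```python
import random


def isa(lp_decode, w_bsc, median, n, k0, tol=1e-9):
    """Instanton Search Algorithm: returns a binary instanton vector of length n."""
    # Initialization (l = 0): draw random k0-flip inputs until the LP decoder
    # outputs a pseudo-codeword different from the all-zero codeword.
    while True:
        r = [0] * n
        for i in random.sample(range(n), k0):
            r[i] = 1
        p = lp_decode(r)
        if any(x > tol for x in p):
            break
    # l >= 1 steps.
    while True:
        m = median(p)                       # the median M(p_l), a binary vector
        pm = lp_decode(m)
        if w_bsc(pm) < w_bsc(p):            # case (i): weight strictly decreases
            p = pm
            continue
        # case (ii): equal weight -- drop one support position of M(p_l) at a time
        for it in (i for i, bit in enumerate(m) if bit == 1):
            r_it = list(m)
            r_it[it] = 0
            p_it = lp_decode(r_it)
            if any(x > tol for x in p_it):  # a non-zero pseudo-codeword: continue from it
                p = p_it
                break
        else:                               # all reduced vectors decode to 0:
            return m                        # M(p_l) is the desired instanton
```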

  13. Illustration of the ISA. Figure (panels (a)-(e)): squares represent pseudo-codewords and circles represent medians or related noise configurations. (a) LP decodes the median of a pseudo-codeword into another pseudo-codeword of smaller weight. (b) LP decodes the median of a pseudo-codeword into another pseudo-codeword of the same weight. (c) LP decodes the median of a pseudo-codeword into the same pseudo-codeword. (d) A reduced subset (three different green circles) of a noise configuration (e.g. of a median from the previous step of the ISA) is decoded by the LP decoder into three different pseudo-codewords. (e) LP decodes the median (blue circle) of a pseudo-codeword (lower red square) into another pseudo-codeword of the same weight (upper red square); the reduced subsets of the median (three configurations depicted as green circles) are all decoded by LP into the all-zero codeword, so the median is an instanton.

  14. The ISA Converges. Lemma 1: let $M(p)$ be a median of $p$ with support $S$. Then the result of LP decoding of any binary vector with support $S' \subset S$ and $|S'| < |S|$ is distinct from $p$. Lemma 2: if the output of the LP decoder on $M(p)$ is a pseudo-codeword $p^M \ne p$, then $w_{\mathrm{BSC}}(p^M) \le w_{\mathrm{BSC}}(p)$; also, $\mathrm{cost}(M(p), p^M) \le \mathrm{cost}(M(p), p)$. Theorem: $w_{\mathrm{BSC}}(p_l)$ and $|\mathrm{supp}(M(p_l))|$ are monotonically decreasing. Also, the ISA terminates in at most $2 k_0$ steps, where $k_0$ is the number of flips in the input.

  15. Instanton Statistics. Table: instanton statistics obtained by running the ISA with 20 random flips for 10000 initiations for the Tanner code and the MacKay code.

                                Number of instantons of weight
      Code                    4      5      6      7      8      9
      Tanner code   Total     -   3506   1049   1235   1145   1457
                    Unique    -    155    675   1028   1129   1453
      MacKay code   Total   213    749   2054   2906   2418   1168
                    Unique   26    239   1695   2864   2417   1168
