  1. Linear-Programming Decoding of Tanner Codes with Local-Optimality Certificates
     Nissim Halabi, Guy Even
     School of Electrical Engineering, Tel-Aviv University
     July 6, 2012

  2. Communication Over a Noisy Channel
     Encoder: maps an information word u ∈ {0,1}^k to a codeword c ∈ C ⊂ {0,1}^N. The noisy channel outputs a noisy codeword y, from which the LLR vector λ(y) ∈ R^N is computed; the decoder outputs ĉ ∈ {0,1}^N.
     MBIOS channel: memoryless, binary-input, output-symmetric.
     Log-Likelihood Ratio (LLR): λ_i(y_i) ≜ ln( Pr(y_i | c_i = 0) / Pr(y_i | c_i = 1) ).
     Linear code: C ⊆ {0,1}^N is a subspace of F_2^N of dimension k.
     Optimal decoding: Maximum-Likelihood (ML) decoding. Input: y. Output: ml(y) ≜ argmax_{x ∈ C} Pr{y | c = x} = argmin_{x ∈ C} ⟨λ(y), x⟩.
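The LLR and ML-decoding definitions above can be made concrete with a small sketch (not part of the slides): a binary symmetric channel with crossover probability p as the MBIOS channel, and brute-force minimization of ⟨λ, x⟩ over a toy repetition code. The channel choice, the code, and the parameter values are illustrative assumptions.

```python
import math

def bsc_llr(y, p):
    """Per-bit LLR lambda_i = ln(Pr(y_i|c_i=0)/Pr(y_i|c_i=1)) for a
    binary symmetric channel with crossover probability p (a hypothetical
    example of an MBIOS channel)."""
    L = math.log((1 - p) / p)
    return [L if yi == 0 else -L for yi in y]

def ml_decode(code, llr):
    """ML decoding rephrased as argmin over the code of <lambda, x>."""
    return min(code, key=lambda x: sum(l * xi for l, xi in zip(llr, x)))

# Toy code: the [3,1] repetition code {000, 111}.
code = [(0, 0, 0), (1, 1, 1)]
y = (0, 1, 0)                    # received word with one flipped bit
lam = bsc_llr(y, p=0.1)
print(ml_decode(code, lam))      # -> (0, 0, 0)
```

The brute-force minimum is only feasible for tiny codes; the point of the rest of the deck is to certify optimality without this exhaustive search.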

  3. Tanner Graphs and Tanner Codes
     A Tanner code C(G, C^J) is represented by a bipartite graph G = (V ∪ J, E) with variable nodes V and local-code nodes J: x ∈ C(G, C^J) iff, for every j ∈ {1, ..., |J|}, the subvector of x on the neighborhood of j lies in the local code C_j.
     Degrees can be regular, irregular, bounded, or arbitrary, and the local codes can be arbitrary linear codes.
     Minimum local distance: d* ≜ min_j d_j.
     Examples: LDPC codes [Gallager '63], expander codes [Sipser-Spielman '96].
     [Figure: a Tanner graph with variable nodes v_1, ..., v_10 (values x_1, ..., x_10) and local-code nodes C_1, ..., C_5.]
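The membership condition — x ∈ C(G, C^J) iff the subvector of x on each local-code node's neighborhood lies in that local code — can be sketched directly. The toy graph and the single-parity-check local codes below are hypothetical choices for illustration.

```python
from itertools import product

def in_tanner_code(x, checks, local_codes):
    """x is in C(G, C^J) iff, for every local-code node j, the subvector
    of x on j's neighborhood is a word of the local code C_j."""
    return all(tuple(x[i] for i in nbrs) in local_codes[j]
               for j, nbrs in enumerate(checks))

# Hypothetical toy instance: N = 4 variables, two local-code nodes, each
# attached to 3 variables, each local code a single parity check (the
# length-3 even-weight code).
checks = [(0, 1, 2), (1, 2, 3)]
spc = {w for w in product((0, 1), repeat=3) if sum(w) % 2 == 0}
local_codes = [spc, spc]

print(in_tanner_code((1, 1, 0, 1), checks, local_codes))  # -> True
print(in_tanner_code((1, 0, 0, 0), checks, local_codes))  # -> False
```

With single-parity-check local codes this reduces to an ordinary LDPC code; swapping `spc` for any other linear code's word set gives a generalized Tanner code.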

  4. Linear-Programming (LP) Decoding
     conv(X) ⊆ R^N denotes the convex hull of a set of points X ⊆ R^N. ML decoding can be rephrased as ml(y) ≜ argmin_{x ∈ conv(C)} ⟨λ(y), x⟩.
     The generalized fundamental polytope of a Tanner code C(G, C^J) is a relaxation of conv(C) [following Feldman-Wainwright-Karger '05]:
     P(G, C^J) ≜ ∩_{C_j ∈ C^J} conv(C_j).
     LP decoding: lp(y) ≜ argmin_{x ∈ P(G, C^J)} ⟨λ(y), x⟩.

  5. LP Decoding with an ML Certificate
     LP-decode(λ):
       solve the LP: x̂_lp ← argmin_{x ∈ P(G, C^J)} ⟨λ, x⟩
       if x̂_lp ∈ {0,1}^N then return x̂_lp (an ML codeword)
       else return fail
     A polynomial-time algorithm that applies to any MBIOS channel. An integral solution ⇒ an ML certificate.
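A minimal sketch of LP-decode for the special case where every local code is a single parity check, in which conv(C_j) is described by the odd-subset inequalities of [Feldman-Wainwright-Karger '05]. The (7,4) Hamming-style check structure, the LLR values, and the use of scipy.optimize.linprog as the solver are assumptions of this sketch, not part of the slides.

```python
from itertools import combinations
import numpy as np
from scipy.optimize import linprog  # assumed available as the LP solver

def lp_decode(llr, checks, n):
    """LP decoding over the fundamental polytope for single-parity-check
    local codes: for each check j and each odd-sized S subset of N(j),
    sum_{i in S} x_i - sum_{i in N(j)\\S} x_i <= |S| - 1."""
    A, b = [], []
    for nbrs in checks:
        for k in range(1, len(nbrs) + 1, 2):       # odd-sized subsets S
            for S in combinations(nbrs, k):
                row = np.zeros(n)
                row[list(S)] = 1.0
                row[[i for i in nbrs if i not in S]] = -1.0
                A.append(row)
                b.append(len(S) - 1)
    res = linprog(llr, A_ub=np.array(A), b_ub=np.array(b),
                  bounds=[(0, 1)] * n, method="highs")
    x = res.x
    if np.allclose(x, np.round(x)):                # integral => ML certificate
        return np.round(x).astype(int), True
    return x, False

# Hamming (7,4) check structure; all-zero codeword sent, one bit flipped.
checks = [(0, 1, 2, 4), (0, 1, 3, 5), (0, 2, 3, 6)]
llr = [2.2, 2.2, 2.2, 2.2, -2.2, 2.2, 2.2]
x_hat, certified = lp_decode(llr, checks, 7)
print(x_hat, certified)
```

When the returned vertex is fractional (a pseudocodeword), the decoder reports failure even though the true ML codeword might still be recoverable — that is exactly the one-sided error discussed later.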

  6. Goal: Analysis of Finite-Length Codes
     Problem (Finite-Length Analysis).
     Design: a constant-rate code C(G, C^J) and an efficient decoding algorithm dec.
     Analyze: if SNR > t, then Pr(dec(λ) ≠ x | c = x) ≤ exp(−N^α) for some α > 0.
     Goal: minimize t (the lower bound on the SNR).
     Remarks: this is not an asymptotic problem; the code is not chosen randomly from an ensemble; successful decoding ≠ ML decoding.

  7. Certificate for ML-Optimality / LP-Optimality
     Problem (Optimality Certificate).
     Input: a channel observation λ and a codeword x ∈ C.
     Question 1: Is x ML-optimal with respect to λ? Is it unique? (NP-hard)
     Question 2: Is x LP-optimal with respect to λ? Is it unique?
     Relaxation: a test with one-sided error. A positive answer is a certificate for the optimality of x w.r.t. λ; a negative answer means "don't know whether x is optimal or not" (one-sided error is allowed).
     Preferred: an efficient test via local computations ⇒ a "local-optimality" criterion.

  8. Definition of Local Optimality [Feldman '03]
     For x ∈ {0,1}^N and f ∈ [0,1]^N ⊆ R^N, define the relative point x ⊕ f by (x ⊕ f)_i ≜ |x_i − f_i|.
     Consider a finite set of "deviations" B ⊂ [0,1]^N.
     Definition (following [Arora-Daskalakis-Steurer '09]). A codeword x ∈ C is locally optimal w.r.t. λ ∈ R^N if ⟨λ, x ⊕ β⟩ > ⟨λ, x⟩ for all vectors β ∈ B.
     Goal: find a set B such that:
     1. x ∈ lo(λ) ⇒ x = ml(λ) and is unique.
     2. x ∈ lo(λ) ⇒ x = lp(λ) and is unique.
     3. Pr_λ{ x ∈ lo(λ) | c = x } = 1 − o(1).
     [Figure: the sets ML(λ), LP(λ), LO(λ) for an observation λ.]
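The definition can be tested mechanically once B is given as an explicit finite list. The vectors below are hypothetical; in the paper, B is the structured set of projected weighted d-trees introduced on the next slides, not a hand-picked list.

```python
def relative_point(x, f):
    """The relative point: (x XOR f)_i = |x_i - f_i|."""
    return [abs(xi - fi) for xi, fi in zip(x, f)]

def is_locally_optimal(x, llr, B):
    """x is locally optimal w.r.t. llr if <llr, x XOR beta> > <llr, x>
    for every deviation beta in the finite set B."""
    dot = lambda a, b: sum(ai * bi for ai, bi in zip(a, b))
    base = dot(llr, x)
    return all(dot(llr, relative_point(x, beta)) > base for beta in B)

# Hypothetical example: all-zero codeword, mildly noisy LLR vector,
# and two fractional deviations in [0,1]^4.
x = [0, 0, 0, 0]
llr = [1.0, -0.2, 0.8, 0.5]
B = [[0.5, 0.5, 0, 0], [0, 0.5, 0.5, 0]]
print(is_locally_optimal(x, llr, B))  # -> True
```

Note the test is local and one-sided by construction: it never enumerates the code C, and a `False` answer only means no certificate was found.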

  9. The Set B: Projections of Normalized Weighted Subtrees in Computation Trees of the Tanner Graph
     Construction of a deviation:
     1. Start from the Tanner graph G.
     2. Build the computation tree of G of height 2h, rooted at a variable node r.
     3. Choose a d-tree T: a subtree in which every local-code node has degree d (the figure shows a 3-tree, d = 3).
     4. Assign a weight function w_T : V(T) → R to the variable nodes, with a weight w_1, w_2, ... per level.
     5. Project the weighted d-tree back onto the Tanner graph.
     A deviation β ∈ R^N is the projection of a weighted d-tree.

  10. Local Optimality Based on the Deviation Set B_d(w)
      The set of deviations B_d(w) is the set of projections of w-weighted d-trees:
      B_d(w) ≜ { β ∈ R^N | ∃ a w-weighted d-tree T : β = π(T) }.
      Definition. A codeword x ∈ C is (h, w, d)-locally optimal w.r.t. λ ∈ R^N if ⟨λ, x ⊕ β⟩ > ⟨λ, x⟩ for all vectors β ∈ B_d(w).
      Parameters: 2h — the tree height (h weighted levels); w ∈ R_+^h — the tree-level weights; d ≥ 2 — the degree of the local-code nodes.

  11. d-Trees Get "Fatter" as d Increases
      [Figure: a 2-tree (the "skinny tree" of [ADS'09]), a 3-tree, and a 4-tree.]
      Over an MBIOS channel, the probability of a local-optimality certificate increases as the deviations become denser.

  12. Local Optimality ⇒ Unique ML Codeword
      Theorem. Let d ≥ 2. If x is (h, w, d)-locally optimal w.r.t. λ, then x is the unique ML codeword w.r.t. λ.
      Hard: deciding whether x is the unique ML codeword. Easy: deciding whether x is locally optimal (dynamic programming).
      Proof method:
      Lemma (Decomposition Lemma). Every codeword is a conic combination of projections of weighted d-trees in computation trees of G: there exist α > 0 and a distribution ρ over B_d(w) such that x = α · E_{β ∼ ρ}[β].
      Following [ADS'09]: decomposition lemma ⇒ unique ML.

  13. Local Optimality ⇒ Unique LP Optimality
      Theorem. If x is an (h, w, d)-locally optimal codeword w.r.t. λ, then x is also the unique optimal LP solution given λ.
      Proof method, using graph-cover decoding [Koetter-Vontobel '05]:
      1. In graph covers, realizations of the LP optimum and of an ML codeword coincide.
      2. Lemma: local optimality is invariant under lifting to covering graphs.
      3. The lift of a locally optimal codeword is the unique ML codeword in the graph cover.
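One ingredient of this argument, the lift of a codeword to a graph cover, can be sketched concretely: in an M-cover, every node has M copies and each Tanner-graph edge becomes a matching between copies, and the lift that assigns x_v to all copies of v satisfies every lifted local constraint regardless of which matchings are chosen. The toy single-parity-check instance and the random-permutation cover below are assumptions for illustration.

```python
import random

def lift_satisfies_checks(x, checks, M, seed=0):
    """Lift codeword x to a random M-cover of the Tanner graph and verify
    that every copy of every local-code node is satisfied. Local codes are
    single parity checks in this demo."""
    rng = random.Random(seed)
    X = {v: [xv] * M for v, xv in enumerate(x)}   # lift: every copy of v gets x_v
    for nbrs in checks:
        # edge (v, j) becomes a matching: copy m of check j sees copy perm[v][m] of v
        perm = {v: rng.sample(range(M), M) for v in nbrs}
        for m in range(M):
            local = tuple(X[v][perm[v][m]] for v in nbrs)
            if sum(local) % 2 != 0:
                return False
    return True

checks = [(0, 1, 2), (1, 2, 3)]
x = (1, 1, 0, 1)                               # a codeword of the toy Tanner code
print(lift_satisfies_checks(x, checks, M=3))   # -> True
```

Because every copy of v carries the same value x_v, each check copy sees exactly x restricted to its neighborhood — which is why the matchings drop out and codeword lifts are always valid, while fractional LP solutions correspond to genuinely different cover configurations.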

  14. Local Optimality for LP Decoding — Comparison
      Previous work [KV'06] [ADS'09] [HE'11] vs. the current work:
      - Deviation height: previously h < girth(G)/4, with a characterization via local isomorphism; here h is unbounded, with a characterization via computation trees.
      - Regularity: previously regular Tanner graphs; here irregular Tanner graphs, with normalization factors added according to node degrees [Von'10].
      - Constraints: previously single-parity-check codes; here arbitrary linear local codes, with a tighter relaxation of the generalized fundamental polytope (also in [Von'10]).
      - Deviations: previously "skinny", locally satisfying the parity checks; here "fat", not necessarily satisfying the local codes.
      - LP-solution analysis: previously dual/primal LP and polyhedral analysis; here a reduction to ML via the characterization of graph-cover decoding.

  15. Probabilistic Analysis for Regular Tanner Codes — Examples
      Form of the finite-length bounds: there exist c > 1 and t such that for all noise levels below t, Pr{LP decoder fails} ≤ exp(−c^girth). If girth = Θ(log N), then Pr{LP decoder fails} ≤ exp(−N^α) for some 0 < α < 1. As N → ∞, t is a lower bound on the threshold of LP decoding with a local-optimality certificate.
      Comparison:
      - [Skachek-Roth '03]: iterative decoder; bit-flipping channel (worst case); expansion-based technique; example with d_R >> 2 and d* >> 2: p_iterat > 0.0016.
      - [Feldman-Stein '05]: LP decoder; bit-flipping channel (worst case); expansion-based technique; example with d_R >> 2 and d* >> 2: p_lp > 0.0008.
      - Current work: LP decoder; MBIOS channels (average case); density evolution of a sum-min-sum random process; example: a (2, d_R)-regular Tanner code with d_R = 16, d* = 4, rate 0.375, BSC(p) threshold p_lp > 0.044.

  16. Summary
      Conclusions — this work follows a line of works based on combinatorial characterizations of local optimality [KV'06] [ADS'09] [Von'10] [HE'11]:
      1. A new combinatorial characterization of local optimality for irregular Tanner codes.
      2. Local optimality ⇒ ML optimality.
      3. Local optimality ⇒ LP optimality.
      4. An efficiently computable certificate (dynamic programming).
      5. Upper bounds on the word-error probability of LP decoding.
      Open questions:
      - Prove bounds on noise thresholds for LP decoding that are better than p_lp ≈ 0.05 for rate-1/2 codes [ADS'09].
      - Probabilistic analysis for irregular Tanner codes.
      - Probabilistic analysis beyond the girth.
