Universal Polarization for Processes with Memory


  1. Universal Polarization for Processes with Memory
     Boaz Shuval and Ido Tal
     Andrew and Erna Viterbi Department of Electrical Engineering
     Technion — Israel Institute of Technology, Haifa, 32000, Israel
     July 2019

  2. Setting
     • Communication with uncertainty:
       ◦ Encoder: knows only that the channel belongs to a given set of channels
       ◦ Decoder: knows the channel statistics (e.g., via estimation)
     • Memory:
       ◦ in the channel
       ◦ in the input distribution
     • Universal code:
       ◦ vanishing error probability over the entire set
       ◦ best rate (the infimal information rate over the set)
     Goal: a universal code based on polarization

  3. Why?
     • Polar codes have many good properties:
       ◦ rate-optimal (even under memory!)
       ◦ vanishing error probability
       ◦ low-complexity encoding/decoding/construction
     • But...
       ◦ polar codes must be tailored to the channel at hand
       ◦ sometimes the channel isn't known a priori to the encoder
     • Example: frequency-selective fading ⇒ ISI:
       $Y_n = h_0 X_n + \sum_{i=1}^{m} h_i X_{n-i} + \text{noise}$
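For concreteness, here is a short simulation of such an ISI channel; the taps, input alphabet, and noise level below are illustrative assumptions rather than values from the talk.

```python
# Illustrative ISI channel: Y_n = h_0 X_n + sum_{i=1}^{m} h_i X_{n-i} + noise.
# All parameters are assumed for the sake of the example.
import numpy as np

rng = np.random.default_rng(0)

h = np.array([1.0, 0.6, 0.3])              # taps h_0, h_1, h_2 (so m = 2)
x = rng.choice([-1.0, 1.0], size=1000)     # BPSK channel inputs X_1^N
noise = 0.1 * rng.standard_normal(len(x))

# np.convolve realizes the memory: past inputs X_{n-i} leak into Y_n.
y = np.convolve(x, h)[: len(x)] + noise
```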

  4. Polar Codes: lightning reminder
     $X_1^N \to \text{Channel} \to Y_1^N$
     • Goal: decode $X_1^N$ from $Y_1^N$

  5. Polar Codes: lightning reminder
     $X_1^N \to \text{Channel} \to Y_1^N$, with $F_1^N = f_{\text{Arıkan}}(X_1^N)$
     • Goal: decode $X_1^N$ from $Y_1^N$
     • The transform $f_{\text{Arıkan}}$ is one-to-one and onto, and recursively defined
     • Decoding $F_1^N$ ⟺ decoding $X_1^N$

  6. Polar Codes: lightning reminder
     $F_1^N = f_{\text{Arıkan}}(X_1^N)$, $G_i = (F_1^{i-1}, Y_1^N)$
     • Successive-cancellation decoding:
       ◦ compute $G_i$ from the decoded $F_1^{i-1}$
       ◦ decode $F_i$ from $G_i$
     • Polarization: fix $\beta < 1/2$
       ◦ low-entropy set: $\mathcal{L}_N = \{ i \mid H(F_i \mid G_i) < 2^{-N^\beta} \}$
       ◦ high-entropy set: $\mathcal{H}_N = \{ i \mid H(F_i \mid G_i) > 1 - 2^{-N^\beta} \}$
       ◦ for $N$ large, $|\mathcal{L}_N| + |\mathcal{H}_N| \approx N$
     • Coding scheme (simplified):
       ◦ $i \in \mathcal{L}_N$ ⇒ transmit data
       ◦ $i \in \mathcal{H}_N$ ⇒ reveal to decoder

  7. Polar Codes: lightning reminder (cont.)
     • Same scheme as above, but: Not universal! The sets $\mathcal{L}_N, \mathcal{H}_N$ are channel-dependent
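The slides use $f_{\text{Arıkan}}$ as a black box; for intuition, below is a minimal sketch of the standard recursive transform (the $(u, v) \mapsto (u \oplus v, v)$ butterfly applied at every level). This is the textbook transform, shown only as an illustration, not the authors' implementation.

```python
# Minimal sketch of the recursive Arikan transform F_1^N = f_Arikan(X_1^N),
# for N a power of two.
import numpy as np

def arikan_transform(x: np.ndarray) -> np.ndarray:
    """Polar transform of a 0/1 vector over GF(2)."""
    if len(x) == 1:
        return x.copy()
    # Butterfly on adjacent pairs: (u, v) -> (u ^ v, v), then recurse on
    # the "sum" half and the "pass-through" half separately.
    return np.concatenate([arikan_transform(x[0::2] ^ x[1::2]),
                           arikan_transform(x[1::2])])

x = np.array([0, 1, 0, 0], dtype=np.uint8)
f = arikan_transform(x)                           # -> [1, 0, 1, 0]
assert np.array_equal(arikan_transform(f), x)     # self-inverse over GF(2)
```

Being its own inverse over GF(2) is one way to see that the map is one-to-one and onto, the property the reminder slide relies on.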

  8.–9. Previous Work on Universal Polarization
     • All for the memoryless case
     • Works with memoryless settings similar to ours:
       ◦ Hassani & Urbanke, 2014
       ◦ Şaşoğlu & Wang, 2016 (conference version: 2014)

  10. Our Construction
     • Simplified generalization of the Şaşoğlu–Wang construction
     • Memory at the channel and/or the input
     • Two stages: “slow” and “fast”

  11. Our Construction (cont.)
     [Figure: the slow transform $f$ maps $X_1^N$ to $F_1^N$; its output indices are partitioned into sets $\mathcal{L}$ and $\mathcal{H}$]
     • $f$ is one-to-one and onto, and recursively defined
     • $(\eta, \mathcal{L}, \mathcal{H})$-monopolarization: for any $\eta > 0$, there exist $N$ and index sets $\mathcal{L}, \mathcal{H}$ such that
       either $H(F_i \mid G_i) < \eta$ for all $i \in \mathcal{L}$,
       or $H(F_i \mid G_i) > 1 - \eta$ for all $i \in \mathcal{H}$
     • Universal: $\mathcal{L}, \mathcal{H}$ are process-independent
     • This is the “slow” stage

  12. Our Construction (cont.)
     [Figure: $\hat{N}$ copies of the slow transform $f$, each with output sets $\mathcal{L}$ and $\mathcal{H}$; $i \in \mathcal{L}$ ⇒ $H(F_i \mid G_i) < \eta$]

  13. Our Construction (cont.)
     [Figure: the “fast” stage is added: a length-$\hat{N}$ Arıkan transform $f_{\text{Arıkan}}$ applied across the $\hat{N}$ slow blocks]

  14. Our Construction (cont.)
     [Figure: $|\mathcal{L}|$ copies of $f_{\text{Arıkan}}$, one per index in $\mathcal{L}$, each acting across the $\hat{N}$ slow blocks]

  15. Our Construction (cont.)
     [Figure: as above, annotated with the resulting performance]
     • $P_e \leq |\mathcal{L}| \cdot 2^{-\hat{N}^\beta}$
     • Rate $\approx |\mathcal{L}| / N$

  16. Our Construction (cont.)
     [Figure: as above; “Our focus” marks the slow stage]
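A hedged sketch of how the two stages compose, as the figure suggests: $\hat{N}$ independent copies of the slow transform are followed, for each index $i \in \mathcal{L}$, by one length-$\hat{N}$ Arıkan transform applied across the copies. Here `slow_transform` is only a stand-in placeholder (the real $f$ is defined on the next slides), and the index set is supplied externally.

```python
# Sketch of the two-stage wiring (assumptions: slow_transform is a placeholder
# for f, N_hat is a power of two, and universal_set stands in for L).
import numpy as np

def arikan_transform(x: np.ndarray) -> np.ndarray:
    if len(x) == 1:
        return x.copy()
    return np.concatenate([arikan_transform(x[0::2] ^ x[1::2]),
                           arikan_transform(x[1::2])])

def slow_transform(x: np.ndarray) -> np.ndarray:
    return x.copy()   # placeholder: the real f is the recursive slow stage

def two_stage(blocks: np.ndarray, universal_set: list) -> np.ndarray:
    """blocks: (N_hat, N) array of bits -- N_hat length-N input blocks."""
    F = np.array([slow_transform(b) for b in blocks])   # "slow" stage per block
    for i in universal_set:                             # "fast" stage: polarize
        F[:, i] = arikan_transform(F[:, i])             # across the N_hat copies
    return F

rng = np.random.default_rng(0)
blocks = rng.integers(0, 2, size=(8, 16), dtype=np.uint8)  # N_hat = 8, N = 16
out = two_stage(blocks, universal_set=[0, 1, 2])           # stand-in for L
```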

  17. A framework for memory
     • Stationary process $(S_i, X_i, Y_i)_{i=1}^{N}$
     • Finite number of states: $S_i \in \mathcal{S}$, where $|\mathcal{S}| < \infty$
     • Hidden state: $S_i$ is unknown to encoder and decoder
     • Markov property: $P(s_i, x_i, y_i \mid \{s_j, x_j, y_j\}_{j<i}) = P(s_i, x_i, y_i \mid s_{i-1})$
     • FAIM: the state sequence is a Finite-state, Aperiodic, Irreducible Markov chain
     • $(X_i, Y_i)_{i=1}^{N}$ is then a FAIM-derived process
     • FAIM ⇒ mixing: if $M - N$ is large enough, $(X_{-\infty}^{N}, Y_{-\infty}^{N})$ and $(X_{M}^{\infty}, Y_{M}^{\infty})$ are almost independent
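As a concrete (assumed) instance, the sketch below samples a FAIM-derived process resembling a Gilbert–Elliott channel: a hidden two-state aperiodic, irreducible Markov chain, with $(S_i, X_i, Y_i)$ drawn given $S_{i-1}$ alone. All of the numbers are illustrative choices, not values from the talk.

```python
# Sampling a FAIM-derived process (X_i, Y_i): the hidden state S_i follows an
# aperiodic, irreducible Markov chain, and each (S_i, X_i, Y_i) depends on the
# past only through S_{i-1}. Distributions here are assumed for illustration.
import numpy as np

rng = np.random.default_rng(1)

P = np.array([[0.9, 0.1],        # P(s_i | s_{i-1}): aperiodic and irreducible
              [0.2, 0.8]])
flip = np.array([0.05, 0.30])    # per-state crossover P(Y_i != X_i | S_i)

def sample_faim(n: int):
    s, xs, ys = 0, [], []
    for _ in range(n):
        s = rng.choice(2, p=P[s])            # Markov property: depends on s only
        x = int(rng.integers(0, 2))          # uniform channel input X_i
        y = x ^ int(rng.random() < flip[s])  # state-dependent BSC output Y_i
        xs.append(x)
        ys.append(y)
    return xs, ys

xs, ys = sample_faim(10_000)
```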

  18. Forgetfulness
     • Required for the proof of monopolarization
     • A FAIM process $(S_i, X_i, Y_i)$ is forgetful if for any $\epsilon > 0$ there exists a natural $\lambda$ such that for all $k \geq \lambda$:
       $I(S_1; S_k \mid X_1^k, Y_1^k) \leq \epsilon$ and $I(S_1; S_k \mid Y_1^k) \leq \epsilon$
     • Neither inequality implies the other
     • FAIM does not imply forgetfulness
     • We have a sufficient condition for forgetfulness
       ◦ under it, $\epsilon$ decreases exponentially with $\lambda$

  19. FAIM Does Not Imply Forgetfulness
     [Figure: a four-state Markov chain on states 1, 2, 3, 4]
     $Y_j = \begin{cases} a, & S_j \in \{1, 2\} \\ b, & S_j \in \{3, 4\} \end{cases}$
     $I(S_1; S_k \mid Y_1^k) \not\to 0$
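The chain itself is specified by the figure, which did not survive this transcript. The sketch below therefore builds one four-state chain consistent with the emission rule above; its transition structure is an assumption (the state is a pair (class, bit), the observed class flips with probability p, and the hidden bit flips exactly on steps that land in class b). It computes $I(S_1; S_k \mid Y_1^k)$ exactly by enumerating state paths; for this chain the value stays at one bit for every k, since the observations pin down the class but never the bit, even though the chain is FAIM and hence mixing.

```python
# Exact computation of I(S_1; S_k | Y_1^k) for a small hidden-Markov chain by
# brute-force enumeration of state paths. The transition matrix is an assumed
# stand-in for the chain in the slide's figure.
import itertools
import numpy as np

# Code states 0..3 = (class, bit): 0=(a,0), 1=(a,1), 2=(b,0), 3=(b,1).
p = 0.3
P = np.array([[1 - p, 0.0,   0.0,   p    ],
              [0.0,   1 - p, p,     0.0  ],
              [p,     0.0,   0.0,   1 - p],
              [0.0,   p,     1 - p, 0.0  ]])
pi = np.full(4, 0.25)            # doubly stochastic => uniform stationary law
emit = ['a', 'a', 'b', 'b']      # Y_j = a on slide states {1,2}, b on {3,4}

def cmi(k: int) -> float:
    """I(S_1; S_k | Y_1^k) in bits."""
    joint = {}                                    # (y_1^k, s_1, s_k) -> prob
    for path in itertools.product(range(4), repeat=k):
        pr = pi[path[0]]
        for a, b in zip(path, path[1:]):
            pr *= P[a, b]
        if pr == 0.0:
            continue
        key = (''.join(emit[s] for s in path), path[0], path[-1])
        joint[key] = joint.get(key, 0.0) + pr
    py, ps1, psk = {}, {}, {}
    for (y, s1, sk), pr in joint.items():
        py[y] = py.get(y, 0.0) + pr
        ps1[y, s1] = ps1.get((y, s1), 0.0) + pr
        psk[y, sk] = psk.get((y, sk), 0.0) + pr
    return sum(pr * np.log2(pr * py[y] / (ps1[y, s1] * psk[y, sk]))
               for (y, s1, sk), pr in joint.items())

for k in (2, 4, 6, 8):
    print(k, cmi(k))   # stays at 1.0 bit: this chain is FAIM but not forgetful
```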

  20. Why Forgetfulness?
     • $(S_i, X_i, Y_i)$ is forgetful if for any $\epsilon > 0$ there exists $\lambda$ such that
       $k \geq \lambda \;\Rightarrow\; I(S_1; S_k \mid X_1^k, Y_1^k) \leq \epsilon$ and $I(S_1; S_k \mid Y_1^k) \leq \epsilon$
     • Can show: for any $k + 1 \leq i \leq N - k$,
       $0 \leq H(X_i \mid X_{i-k}^{i-1}, Y_{i-k}^{i+k}) - H(X_i \mid X_1^{i-1}, Y_1^N) \leq 2\epsilon$
       (the lower bound holds because conditioning on the full past and output cannot increase entropy)
     Takeaway point: only a “window” surrounding $i$ really matters

  21. Slow Stage is Monopolarizing
     • FAIM-derived: $(X_i, Y_i)$ is derived from $(S_i, X_i, Y_i)$ such that
       $P(s_i, x_i, y_i \mid \{s_j, x_j, y_j\}_{j<i}) = P(s_i, x_i, y_i \mid s_{i-1})$,
       with $S_i$ a finite-state, aperiodic, irreducible Markov chain
     • Forgetful: for any $\epsilon > 0$ there exists $\lambda$ such that for all $k \geq \lambda$,
       $I(S_1; S_k \mid X_1^k, Y_1^k) \leq \epsilon$ and $I(S_1; S_k \mid Y_1^k) \leq \epsilon$
     Main Result (simplified): If the process $(X_i, Y_i)$ is FAIM-derived and forgetful, the slow stage is monopolarizing, with universal $\mathcal{L}, \mathcal{H}$ (unrelated to the process)

  22. Slow Stage
     • Presented for the case $|\mathcal{L}| = |\mathcal{H}|$
     • Transforms the pairs $X_1^{N_n} ↣ Y_1^{N_n}$ (transmitted ↣ received) into $F_1^{N_n} ↣ G_1^{N_n}$; decode $F_i$ from $G_i$
     • Recursively defined
     • Parameters $L_0$, $M_0$:
       ◦ level-0 length: $N_0 = 2 L_0 + M_0$
       ◦ level-$n$ length: $N_n = 2 N_{n-1}$
     • Index types at level $n$:
       ◦ first $L_n$ indices: lateral
       ◦ middle $M_n$ indices: medial
       ◦ last $L_n$ indices: lateral
     [Figure: a level-$n$ block, $F_1 ↣ G_1$ through $F_{N_n} ↣ G_{N_n}$, partitioned lateral / medial / lateral]

  23. Slow Stage — Lateral Recursion
     [Figure: two level-$n$ blocks, $U ↣ Q$ and $V ↣ R$, combine into one level-$(n+1)$ block $F ↣ G$]
     • $L_{n+1} = 2 L_n + 1$ lateral indices
     • $M_{n+1} = 2 (M_n - 1)$ medial indices
     • Lateral indices always remain lateral
     • Two medial indices become lateral
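The bookkeeping implied by these recursions can be checked directly; a small sketch (the parameter values are assumed):

```python
# Index bookkeeping of the slow stage, from the recursions on this slide:
# L_{n+1} = 2 L_n + 1 and M_{n+1} = 2 (M_n - 1), which gives
# N_{n+1} = 2 L_{n+1} + M_{n+1} = 2 (2 L_n + M_n) = 2 N_n (lengths double).
def slow_stage_sizes(L0: int, M0: int, levels: int) -> None:
    L, M = L0, M0
    for n in range(levels + 1):
        print(f"level {n}: lateral L_n = {L}, medial M_n = {M}, "
              f"length N_n = {2 * L + M}")
        L, M = 2 * L + 1, 2 * (M - 1)

slow_stage_sizes(L0=1, M0=6, levels=3)   # example parameters (assumed)
```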

  24. Slow Stage — Medial Recursion
     [Figure: two level-$n$ blocks, $U ↣ Q$ and $V ↣ R$, with their medial indices labeled alternately H, L]
     • Two types of medial indices: H and L
       ◦ alternating: H, L, H, L, ...
     • Two medial indices become lateral: $U_{L_n + 1}$ and $V_{L_n + M_n}$
     • Join an H index from one block with an L index from the other

  25. Slow Stage — Medial Recursion (cont.)
     [Figure: the first medial joins of the level-$(n+1)$ block: $U_{L_n+2}$ (type H) is joined with $V_{L_n+1}$ (type L), producing the level-$(n+1)$ indices $F_{2L_n+2}$ and $F_{2L_n+3}$]

  26. Slow Stage — Medial Recursion (cont.)
     [Figure: the joins continue alternately across the two blocks: $V_{L_n+2}$ with $U_{L_n+3}$, producing $F_{2L_n+4}$ and $F_{2L_n+5}$, and so on]
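To summarize the type bookkeeping in code: a hedged sketch of the level-$n$ index labeling (first and last $L_n$ indices lateral, $M_n$ medial indices alternating H, L). The exact join order of the H and L medials follows the figure, which this transcript only partially preserves.

```python
# Index-type labeling at level n, per the slides: lateral / medial / lateral,
# with medial types alternating H, L, H, L, ...  (The precise H-with-L join
# order across the two merged blocks follows the figure and is not fully
# recoverable here.)
def index_types(L: int, M: int) -> list:
    lateral = ['lat'] * L
    medial = ['H' if j % 2 == 0 else 'L' for j in range(M)]   # H, L, H, L, ...
    return lateral + medial + lateral

types = index_types(L=3, M=10)            # level-1 sizes from L0=1, M0=6
assert types.count('lat') == 6            # 2 * L_n lateral indices
assert types.count('H') == types.count('L') == 5   # |H| = |L| medials
```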
