Achieving channel capacity


1. Introduction

Outline: Introduction; Turbo Principle; SISO (Soft Input Soft Output); Example of a product code.

Achieving channel capacity: Shannon says this is possible. How? By making use of the gap between the source rate and the channel capacity: a coding scheme.

Claude Shannon, 1948: a scheme of coding and decoding can be found allowing correction of all transmission errors, if the information rate is less than or equal to the channel capacity.

J.-M. Brossier, Turbo codes.

2. Turbo Principle: turbo coding

Two short systematic codes are used to build a large code. The data u feed Code 1, which produces the redundancy p; the same data, passed through an interleaver, feed Code 2, which produces the redundancy q. The transmitted word is made of the data u, the redundancy p of the first coder, and the redundancy q of the second coder.
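
As a concrete illustration of this structure, here is a minimal sketch in Python. It assumes a toy systematic "accumulator" code (a running XOR) for both constituent encoders and a small fixed interleaver; both are hypothetical stand-ins, since real turbo codes use recursive systematic convolutional codes and much longer interleavers.

```python
def accumulator_parity(bits):
    """Toy systematic encoder: the parity stream is a running XOR of the data."""
    acc, parity = 0, []
    for b in bits:
        acc ^= b
        parity.append(acc)
    return parity

def turbo_encode(u, perm):
    """The same data u feed both encoders, the second one through
    the interleaver perm; transmit (u, p, q)."""
    p = accumulator_parity(u)                      # redundancy of the first coder
    q = accumulator_parity([u[i] for i in perm])   # redundancy of the second coder
    return u, p, q

u = [1, 0, 1, 1]
perm = [2, 0, 3, 1]            # hypothetical fixed interleaver
print(turbo_encode(u, perm))
```

The overall rate here is 1/3 (one data bit for two parity bits), which is why the text speaks of building a large code from two short systematic codes.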

3. Turbo Principle: turbo decoding, iterative scheme (first iteration)

The two decoders provide a first estimation of the transmitted symbols. Each decoder then transmits its output to the input of the other one for the second iteration.

4. Turbo Principle: turbo decoding, iterative scheme (second iteration)

Using the outputs computed at the first iteration, the two decoders provide a second estimation of the transmitted symbols. The same sequence of operations is applied at every following iteration.
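
The exchange of estimates between the two decoders can be sketched as the following iteration skeleton. `dummy_siso` is a hypothetical stand-in for a real SISO decoder (SOVA or BCJR): it ignores the parity stream and just returns a damped combination of its inputs, so only the loop structure is meaningful here.

```python
def dummy_siso(llr_sys, llr_par, apriori):
    """Hypothetical SISO stand-in: a small extrinsic value from its inputs."""
    return [0.1 * (s + a) for s, a in zip(llr_sys, apriori)]

def turbo_decode(llr_u, llr_p, llr_q, perm, siso1, siso2, n_iter=4):
    """Iteration skeleton: each decoder refines the symbol estimates using
    the other decoder's extrinsic output as a priori information."""
    n = len(llr_u)
    inv = [0] * n
    for i, j in enumerate(perm):
        inv[j] = i                           # inverse interleaver
    ext2 = [0.0] * n                         # no a priori info at iteration 1
    for _ in range(n_iter):
        ext1 = siso1(llr_u, llr_p, ext2)     # decoder 1, natural order
        ext2_perm = siso2([llr_u[j] for j in perm], llr_q,
                          [ext1[j] for j in perm])    # decoder 2, interleaved order
        ext2 = [ext2_perm[inv[j]] for j in range(n)]  # de-interleave
    # final hard decision on the combined soft values
    return [1 if llr_u[j] + ext1[j] + ext2[j] >= 0 else -1 for j in range(n)]

print(turbo_decode([2.0, -1.0, 3.0, -0.5], [0] * 4, [0] * 4,
                   [2, 0, 3, 1], dummy_siso, dummy_siso))
```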

5. Soft information

What is soft information? A log-likelihood ratio.

Example: the additive white Gaussian noise (AWGN) channel. Its output is r = x + b, with x = ±1 and b a zero-mean Gaussian random variable of variance σ².

LLR (log-likelihood ratio):

LLR(r) = log [ p(r | +1) / p(r | −1) ]
       = log [ exp(−(r − 1)² / (2σ²)) / exp(−(r + 1)² / (2σ²)) ]
       = 2r / σ²

Interpretation: the sign of the LLR is a hard decision; its magnitude indicates the reliability of this decision.
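
A minimal numeric check of this formula, assuming the AWGN model above (the Gaussian normalisation constant cancels in the ratio):

```python
import math

def awgn_llr(r, sigma):
    """LLR of x from r = x + b with x = ±1 and b ~ N(0, sigma^2).
    The ratio of the two Gaussian likelihoods simplifies to 2r / sigma^2."""
    num = math.exp(-(r - 1.0) ** 2 / (2 * sigma ** 2))   # ∝ p(r | +1)
    den = math.exp(-(r + 1.0) ** 2 / (2 * sigma ** 2))   # ∝ p(r | -1)
    return math.log(num / den)

print(awgn_llr(0.3, 0.8), 2 * 0.3 / 0.8 ** 2)   # the two values coincide
```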

6. Modelling the decoder input

The emitted codeword is X = (X₁, …, Xₙ). The received soft sequence is R = (LLR₁, …, LLRₙ), where the LLRᵢ are log-likelihood ratios.

Decomposition of the information:
Hard decision: the sign of the LLR provides a hard decision, Yᵢ = sgn[LLRᵢ].
Reliability: the magnitude of the LLR provides its reliability, αᵢ = |LLRᵢ|.

At iteration 1, the information can be hard or soft; at the following iterations, only soft information is exchanged.
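
This decomposition of R into hard decisions Y and reliabilities α is a one-liner; a small sketch, with a hypothetical received sequence:

```python
def split_llr(llrs):
    """Decompose a soft sequence R into hard decisions Y_i = sgn(LLR_i)
    and reliabilities alpha_i = |LLR_i|."""
    y = [1 if l >= 0 else -1 for l in llrs]
    alpha = [abs(l) for l in llrs]
    return y, alpha

R = [1.8, -0.3, 0.05, -2.1]     # example LLRs (assumed values)
Y, alpha = split_llr(R)
print(Y, alpha)                 # position 2 is a +1 decision, but a shaky one
```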

7. Several kinds of decoders: incomplete decoder (hard input)

Notations:
Vector of errors: Zᵐ = Y ⊕ Xᵐ
Weight of errors: W(Zᵐ) = Σᵢ₌₁ⁿ Zᵢᵐ
Analog weight (soft): W_α(Zᵐ) = Σᵢ₌₁ⁿ αᵢ Zᵢᵐ

The incomplete decoder uses only the hard information. It returns the word Xᵐ = (X₁ᵐ, …, Xₙᵐ) whose Hamming distance to Y = (Y₁, …, Yₙ) is minimum. A word is found if W(Zᵐ) ≤ ⌊(d_min − 1)/2⌋; otherwise no word is found. The decision is right if the number of errors is at most ⌊(d_min − 1)/2⌋.
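
A sketch of this hard-input behaviour on a toy (4,1) repetition code (d_min = 4, so the guaranteed radius is t = 1); the codebook is an assumption chosen for illustration:

```python
def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

def incomplete_decode(y, codebook, d_min):
    """Hard-input decoder: nearest codeword in Hamming distance,
    failure (None) if the error weight exceeds the guaranteed radius t."""
    t = (d_min - 1) // 2
    best = min(codebook, key=lambda c: hamming_distance(y, c))
    return best if hamming_distance(y, best) <= t else None

codebook = [[0, 0, 0, 0], [1, 1, 1, 1]]   # toy (4,1) repetition code
print(incomplete_decode([0, 1, 0, 0], codebook, 4))   # 1 error: corrected
print(incomplete_decode([0, 1, 1, 0], codebook, 4))   # 2 errors: decoding failure
```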

8. Several kinds of decoders: complete decoder (soft input)

With the same notations, the complete decoder uses the whole information: it can provide a codeword even if the number of errors is greater than ⌊(d_min − 1)/2⌋. The complete soft decoder chooses min over m of W_α(Y ⊕ Xᵐ), with the analog weight W_α(Zᵐ) defined above.
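
The same toy repetition code illustrates the difference: with two hard errors the incomplete decoder fails, but the analog weight resolves the case because the two erroneous positions are unreliable. The codebook and reliabilities are assumptions for illustration.

```python
def analog_weight(y, c, alpha):
    """W_alpha(Z) = sum of alpha_i over the positions where y and c disagree."""
    return sum(a for yi, ci, a in zip(y, c, alpha) if yi != ci)

def complete_soft_decode(y, alpha, codebook):
    """Soft-input complete decoder: always returns the codeword of
    minimum analog weight, even beyond the hard-decoding radius."""
    return min(codebook, key=lambda c: analog_weight(y, c, alpha))

codebook = [[0, 0, 0, 0], [1, 1, 1, 1]]
y = [0, 1, 1, 0]                  # 2 hard errors: hard decoding fails here
alpha = [3.0, 0.2, 0.1, 2.5]      # the two 1s carry very low reliability
print(complete_soft_decode(y, alpha, codebook))
```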

9. SISO convolutional decoding

Soft input: the Viterbi algorithm is able to use soft inputs; it only needs a Euclidean metric.
Soft output: the Viterbi algorithm must be modified, giving the Soft-Output Viterbi Algorithm (SOVA). Idea: keep more than a single path; the difference between the metrics of the two best paths is an indication of the reliability of the decision.

10. Block turbo codes: soft input, the Chase algorithm

If only hard decisions are known: a vector R is received; a hard version Y of R is usable by a standard algebraic decoder, which provides a codeword X_A. For an incomplete decoder, the procedure stops here.

But if reliabilities are known, the estimation can be improved:
find the weak positions (weak LLRs);
modify Y at these positions and produce a small set of decoded codewords from this set of test decisions about R;
select the codeword whose analog distance to R is minimum.
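
These three steps can be sketched as follows. A nearest-codeword search over a toy ±1 parity-check codebook stands in for the algebraic decoder, and the codebook, received vector, and number of flipped positions are all assumptions for illustration.

```python
from itertools import product

def chase_decode(r, codebook, n_weak=2):
    """Chase-style sketch: flip the n_weak least reliable hard decisions,
    decode each test pattern, keep the candidate closest to r (Euclidean)."""
    y = [1 if ri >= 0 else -1 for ri in r]     # hard version Y of R
    alpha = [abs(ri) for ri in r]              # reliabilities
    weak = sorted(range(len(r)), key=lambda i: alpha[i])[:n_weak]
    candidates = set()
    for flips in product([1, -1], repeat=n_weak):
        test = list(y)
        for i, f in zip(weak, flips):
            test[i] *= f
        # stand-in for the algebraic decoder: nearest codeword in Hamming sense
        cw = min(codebook, key=lambda c: sum(t != ci for t, ci in zip(test, c)))
        candidates.add(tuple(cw))
    # select the codeword whose (squared) Euclidean distance to r is minimum
    return min(candidates, key=lambda c: sum((ri - ci) ** 2 for ri, ci in zip(r, c)))

codebook = [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1)]  # toy parity code
print(chase_decode([0.9, -0.2, 0.1], codebook))
```

Note that the hard word Y = (1, −1, 1) is not a codeword at all, yet the flipped test patterns reach valid candidates and the soft selection picks the one closest to R.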

12. Block turbo codes: soft outputs, the Pyndiah algorithm

Aim: computation of the reliabilities

Λ(d_j) = log [ P(a_j = +1 | R) / P(a_j = −1 | R) ]

Let S_j^{±1} be the set of codewords with c_j^i = ±1. Then

P(a_j = ±1 | R) = Σ over C^i ∈ S_j^{±1} of P(E = C^i | R)

13. Block turbo codes: soft outputs, the Pyndiah algorithm (continued)

Λ(d_j) = log [ Σ_{C^i ∈ S_j^{+1}} P(R | E = C^i) / Σ_{C^i ∈ S_j^{−1}} P(R | E = C^i) ]

with P(R | E = C^i) = (1 / (σ√(2π)))ⁿ exp( −|R − C^i|² / (2σ²) ).
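
This exact soft output is computable directly on a small code; a sketch over the same toy ±1 parity-check codebook as above (an assumption), where the common (1/(σ√(2π)))ⁿ factor cancels in the ratio:

```python
import math

def exact_soft_output(r, codebook, sigma, j):
    """Exact Lambda(d_j): log-ratio of summed Gaussian likelihoods over
    the codewords having +1 (resp. -1) in position j."""
    def lik(c):
        d2 = sum((ri - ci) ** 2 for ri, ci in zip(r, c))
        return math.exp(-d2 / (2 * sigma ** 2))
    num = sum(lik(c) for c in codebook if c[j] == +1)
    den = sum(lik(c) for c in codebook if c[j] == -1)
    return math.log(num / den)

codebook = [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1)]  # toy parity code
r = [0.9, -0.2, 0.1]
print([exact_soft_output(r, codebook, 1.0, j) for j in range(3)])
```

The sums run over the whole codebook, which is why this exact form is only practical for short codes; the next slide gives the usual approximation.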

14. Block turbo codes: soft outputs, the Pyndiah algorithm (approximation)

A good approximation of the reliabilities is given by

Λ(d_j) ≈ (1 / (2σ²)) ( |R − C^{−1(j)}|² − |R − C^{+1(j)}|² )

where C^{±1(j)} is the word of S_j^{±1} whose Euclidean distance to R is minimum, and S_j^{±1} is the set of words with c_j^i = ±1.
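
The approximation keeps only the nearest codeword on each side of bit j (a max-log simplification of the previous sums); a sketch over the same toy codebook (an assumption):

```python
def approx_soft_output(r, codebook, sigma, j):
    """Approximate Lambda(d_j) from the two competing nearest codewords."""
    def d2(c):
        return sum((ri - ci) ** 2 for ri, ci in zip(r, c))
    d_plus = min(d2(c) for c in codebook if c[j] == +1)    # |R - C^{+1(j)}|^2
    d_minus = min(d2(c) for c in codebook if c[j] == -1)   # |R - C^{-1(j)}|^2
    return (d_minus - d_plus) / (2 * sigma ** 2)

codebook = [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1)]  # toy parity code
r = [0.9, -0.2, 0.1]
print([approx_soft_output(r, codebook, 1.0, j) for j in range(3)])
```

On this small example the approximation agrees in sign with the exact reliabilities, and only two distance computations per bit survive, which is what makes the Pyndiah decoder practical inside an iterative loop.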
