
Coding and decoding with convolutional codes. The Viterbi Algorithm.

J.-M. Brossier, 2008.


  1. Coding and decoding with convolutional codes. The Viterbi Algorithm. J.-M. Brossier, 2008. Outline: Convolutional encoding; Finite State Machine; Channel models; The Viterbi algorithm.

  2. Block codes: main ideas. Repetition code. Sent without protection, TX: CODING THEORY arrives as RX: C P DING T O EORY; there is no way to recover from transmission errors, so we need to add some redundancy at the transmitter side. Repeating each transmitted symbol makes detection and correction possible:
     TX: CCC OOO DDD III NNN GGG TTT HHH EEE OOO RRR YYY
     RX: CCC OPO DDD III NNN GGD TTT OHO EEE OOO RRR YYY
     Majority voting yields C O D I N G T O E O R Y: 2 corrections (OPO, GGD), 1 detection (the double error in OHO cannot be corrected). Beyond repetition, better codes exist.
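
The majority-vote decoding used implicitly above is easy to make concrete. A minimal sketch, not part of the original slides (the helper names repeat_encode and repeat_decode are mine), for the three-fold repetition code:

```python
from collections import Counter

def repeat_encode(text, n=3):
    """Repeat every symbol n times: 'COD' -> 'CCCOOODDD'."""
    return "".join(ch * n for ch in text)

def repeat_decode(received, n=3):
    """Majority vote inside each block of n received symbols."""
    decoded = []
    for i in range(0, len(received), n):
        block = received[i:i + n]
        symbol, count = Counter(block).most_common(1)[0]
        decoded.append(symbol)          # a single error per block is outvoted
    return "".join(decoded)

print(repeat_encode("CODING"))                 # -> CCCOOODDDIIINNNGGG
rx = "CCCOPODDDIIINNNGGDTTTOHOEEEOOORRRYYY"    # the corrupted blocks from the slide
print(repeat_decode(rx))                       # -> CODINGTOEORY
```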


  3. Block codes: main ideas. Geometric view. [Figure: the eight length-3 binary words drawn as the vertices of a cube, with the two codewords (000) and (111) at opposite corners.] With k = 1 bit of information there are 2 codewords of length n = 3: (000) and (111). At the receiver, the decision is made in favour of the nearest codeword: (001), (010), (100) are detected and corrected to (000); (110), (101), (011) are detected and corrected to (111); (000) and (111) are accepted as (probably) right.
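
The decision rule in the geometric view is minimum Hamming distance. A small sketch of it for this (n = 3, k = 1) code; the function names are illustrative, not from the slides:

```python
CODEWORDS = ("000", "111")

def hamming(a, b):
    """Number of positions in which two words differ."""
    return sum(x != y for x, y in zip(a, b))

def decode(word):
    """Pick the codeword closest to the received word."""
    return min(CODEWORDS, key=lambda c: hamming(word, c))

for w in ("000", "001", "010", "100", "110", "101", "011", "111"):
    print(w, "->", decode(w))           # single errors are pulled back to a codeword
```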

  4. Block codes: main ideas. Linear block codes, e.g. Hamming codes. A binary linear block code takes k information bits at its input and computes n bits. If the 2^k codewords are far enough apart in the n-dimensional space, it is possible to detect or even correct errors. In 1950, Hamming introduced the (7,4) Hamming code. It encodes 4 data bits into 7 bits by adding three parity bits. It can detect and correct single-bit errors but can only detect double-bit errors. The code parity-check matrix is:

     H = \begin{pmatrix} 1 & 0 & 1 & 0 & 1 & 0 & 1 \\ 0 & 1 & 1 & 0 & 0 & 1 & 1 \\ 0 & 0 & 0 & 1 & 1 & 1 & 1 \end{pmatrix}
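
With this H, the syndrome of a received word points directly at a single flipped bit, because column i of H, read with the first row as the least significant bit, is the binary expansion of i. A sketch of syndrome decoding under that observation (the codeword in the demo is simply one word satisfying H c^T = 0, chosen for illustration):

```python
import numpy as np

H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def correct_single_error(r):
    """Correct at most one flipped bit in a length-7 received word r."""
    r = np.array(r) % 2
    s = H @ r % 2                        # 3-bit syndrome
    if s.any():
        pos = s[0] + 2 * s[1] + 4 * s[2] # syndrome read as a number = error position
        r[pos - 1] ^= 1
    return r

c = np.array([1, 0, 1, 1, 0, 1, 0])      # a valid codeword: H @ c % 2 == 0
r = c.copy()
r[4] ^= 1                                # flip bit 5
print(correct_single_error(r))           # -> [1 0 1 1 0 1 0], the codeword is recovered
```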

  5. Convolutional encoding: main ideas. In convolutional codes, each block of k bits is mapped into a block of n bits, but these n bits are determined not only by the present k information bits but also by previous information bits. This dependence can be captured by a finite state machine. The mapping is achieved using several linear filtering operations: each convolution imposes a constraint between bits, and using several convolutions introduces the redundancy.
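
As a concrete illustration of "present plus previous bits through several convolutions", here is a sketch of a rate-1/2 encoder. The slides do not fix a particular code, so the generators g0 = 1 + D + D^2 and g1 = 1 + D^2 (the common (7, 5) octal pair; k = 1, n = 2, M = 2) are an assumption of mine:

```python
def conv_encode(bits, generators=((1, 1, 1), (1, 0, 1))):
    """Rate-1/2 convolutional encoder: 2 output bits per input bit."""
    state = [0, 0]                       # the M = 2 previous input bits
    coded = []
    for b in bits:
        window = [b] + state             # present bit followed by the memory
        for taps in generators:          # one modulo-2 convolution per output bit
            coded.append(sum(t * w for t, w in zip(taps, window)) % 2)
        state = [b] + state[:-1]         # shift the register
    return coded

print(conv_encode([1, 0, 1, 1]))         # -> [1, 1, 1, 0, 0, 0, 0, 1]
```

Each tuple of taps is one of the linear filtering operations mentioned above; together they produce the redundancy.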

  6. Infinite generator matrix. A convolutional code can be described by an "infinite matrix":

     G = \begin{pmatrix}
           G_0            & G_1    & \cdots & G_M    & 0_{k \times n} & \cdots \\
           0_{k \times n} & G_0    & G_1    & \cdots & G_M            & \ddots \\
           \vdots         & \ddots & \ddots & \ddots &                & \ddots
         \end{pmatrix}

     This matrix is built from the K = M + 1 sub-matrices \{G_i\}_{i=0..M}, each of size k × n. K is known as the constraint length of the code.

  7. Infinite generator matrix. Encoding multiplies the information sequence by G:

     (C_0, C_1, \cdots) = (I_0, I_1, \cdots)
       \begin{pmatrix}
         G_0            & G_1    & \cdots & G_M    & 0_{k \times n} & \cdots \\
         0_{k \times n} & G_0    & G_1    & \cdots & G_M            & \ddots \\
         \vdots         & \ddots & \ddots & \ddots &                & \ddots
       \end{pmatrix}

     It looks like a block coding: C = IG.

  8. Infinite generator matrix. Denote by I_j = (I_{j1} \cdots I_{jk}) the j-th block of k information bits and by C_j = (C_{j1} \cdots C_{jn}) the corresponding block of n coded bits at the output. Coding an infinite sequence I = (I_0 I_1 \cdots) of blocks of length k produces an infinite sequence C = (C_0 C_1 \cdots) of coded blocks of length n:

     C_0 = I_0 G_0
     C_1 = I_0 G_1 + I_1 G_0
     \vdots
     C_M = I_0 G_M + I_1 G_{M-1} + \cdots + I_M G_0
     \vdots
     C_j = I_{j-M} G_M + \cdots + I_j G_0   for j \ge M

     In block form the scheme again looks like a block coding: C = IG.
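
The band structure of G is easy to reproduce for a finite input. The sketch below builds the truncated kL × n(L+M) matrix from the sub-matrices G_0, ..., G_M and computes C = IG mod 2; the particular G_0 = (1 1), G_1 = (1 0), G_2 = (1 1) (k = 1, n = 2, M = 2) are only an example, matching the generators assumed in the earlier encoder sketch:

```python
import numpy as np

G_sub = [np.array([[1, 1]]),             # G_0
         np.array([[1, 0]]),             # G_1
         np.array([[1, 1]])]             # G_2  (assumed example, k = 1, n = 2, M = 2)
k, n, M = 1, 2, len(G_sub) - 1

def truncated_generator(L):
    """kL x n(L+M) band matrix: block row j carries G_0..G_M shifted by j block columns."""
    G = np.zeros((k * L, n * (L + M)), dtype=int)
    for j in range(L):
        for l, Gl in enumerate(G_sub):
            G[j * k:(j + 1) * k, (j + l) * n:(j + l + 1) * n] = Gl
    return G

I = np.array([1, 0, 1, 1])               # L = 4 information blocks of k = 1 bit
C = I @ truncated_generator(len(I)) % 2
print(C)                                  # 12 coded bits, blocks C_0 ... C_5
```

The first 2L printed bits coincide with the output of the shift-register encoder sketched earlier; the trailing nM bits are the tail produced while the memory empties.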

  9. The infinite generator matrix performs a convolution. Using the convention I_i = 0 for i < 0, the encoding rule C = IG is clearly a convolution:

     C_j = \sum_{l=0}^{M} I_{j-l} G_l .

     For an information sequence I of finite length, only L < +\infty blocks of k bits are different from zero at the coder input: I = (I_0 \cdots I_{L-1}). The sequence C = (C_0 \cdots C_{L-1+M}) at the coder output is then finite too. This truncated coded sequence is generated by a linear block code whose generator matrix is a sub-matrix of G of size kL × n(L+M).
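
The same coded sequence can be produced directly from the convolution with I_i = 0 for i < 0, which makes the L + M non-zero output blocks of the truncated code explicit. A self-contained sketch under the same assumed example sub-matrices:

```python
import numpy as np

G_sub = [np.array([[1, 1]]), np.array([[1, 0]]), np.array([[1, 1]])]   # G_0, G_1, G_2
M, n = len(G_sub) - 1, G_sub[0].shape[1]

def encode_by_convolution(I_blocks):
    """C_j = sum_l I_{j-l} G_l (mod 2), with I_i = 0 outside 0..L-1."""
    L = len(I_blocks)
    C = []
    for j in range(L + M):                # L + M non-zero output blocks
        Cj = np.zeros(n, dtype=int)
        for l, Gl in enumerate(G_sub):
            if 0 <= j - l < L:
                Cj = (Cj + I_blocks[j - l] @ Gl) % 2
        C.append(Cj)
    return np.concatenate(C)

I_blocks = [np.array([b]) for b in (1, 0, 1, 1)]
print(encode_by_convolution(I_blocks))    # same 12 bits as the C = IG sketch above
```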

  10. Shift-register-based realization. Write g^{(l)}_{\alpha\beta} for the elements of the matrix G_l. Expanding the convolution C_j = \sum_{l=0}^{M} I_{j-l} G_l makes the n components C_{j1}, \cdots, C_{jn} of each output block C_j explicit:

     C_j = [C_{j1}, \cdots, C_{jn}]
         = \left[ \sum_{l=0}^{M} \sum_{\alpha=1}^{k} I_{j-l,\alpha} \, g^{(l)}_{\alpha 1}, \; \cdots, \; \sum_{l=0}^{M} \sum_{\alpha=1}^{k} I_{j-l,\alpha} \, g^{(l)}_{\alpha n} \right]

     If the length of the shift register is L, there are M^L different internal configurations, so the behavior of the convolutional coder can be captured by an M^L-state machine.

  11. Shift-register-based realization. The output bit

     C_{j\beta} = \sum_{\alpha=1}^{k} \sum_{l=0}^{M} I_{j-l,\alpha} \, g^{(l)}_{\alpha\beta}

     depends on the present input block I_j and on the M previous input blocks I_{j-1}, \cdots, I_{j-M}. C_{j\beta} can therefore be computed by memorizing M past input values in shift registers: one shift register for each of the k input bit streams \alpha \in \{1, \cdots, k\}, and, for register \alpha, only the memory cells for which g^{(l)}_{\alpha\beta} = 1 are connected to adder \beta \in \{1, \cdots, n\}.
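
To see the state machine behind the registers, the sketch below enumerates, for the same assumed k = 1, n = 2, M = 2 example, the four possible register contents and tabulates the next state and output block for each input bit:

```python
from itertools import product

TAPS = ((1, 1, 1), (1, 0, 1))            # g^(l)_{1,beta}: which cells feed adder beta

def step(state, b):
    """One encoder step: returns (next register contents, output block)."""
    window = (b,) + state                # present bit followed by the M stored bits
    out = tuple(sum(t * w for t, w in zip(taps, window)) % 2 for taps in TAPS)
    return (b,) + state[:-1], out

for state in product((0, 1), repeat=2):  # the 4 possible register contents
    for b in (0, 1):
        nxt, out = step(state, b)
        print(f"state {state}, input {b} -> next state {nxt}, output {out}")
```

The trellis that a Viterbi decoder searches is built from exactly this state-transition table.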
