Lecture 4: Noisy Channel Coding
Channel Capacity and the Weak Converse; Achievability Proof and Source-Channel Separation
I-Hsiang Wang, Department of Electrical Engineering, National Taiwan University
ihwang@ntu.edu.tw
October 9, 2015


  1. (Title slide) Lecture 4: Noisy Channel Coding. Channel Capacity and the Weak Converse; Achievability Proof and Source-Channel Separation. I-Hsiang Wang, Department of Electrical Engineering, National Taiwan University.

  2. Meta Description: The Channel Coding Problem. [Block diagram: w → Encoder → x^N → Noisy Channel → y^N → Decoder → ŵ]
     1. Message: a random message W ∼ Unif[1 : 2^K].
     2. Channel: consists of an input alphabet X, an output alphabet Y, and a family of conditional distributions {p(y_k | x^k, y^{k−1}) | k ∈ ℕ} determining the stochastic relationship between the output symbol y_k and the input symbol x_k along with all past signals (x^{k−1}, y^{k−1}).
     3. Encoder: encodes the message w into a length-N codeword x^N ∈ X^N.
     4. Decoder: reconstructs the message ŵ from the channel output y^N.
     5. Efficiency: maximize the code rate R ≜ K/N bits/channel use, given a certain decoding criterion.

  3. Decoding Criterion: Vanishing Error Probability. [Block diagram: w → Encoder → x^N → Noisy Channel → y^N → Decoder → ŵ]
     A key performance measure: the error probability P_e^{(N)} ≜ P{W ≠ Ŵ}.
     Question: Is it possible to get zero error probability?
     Ans: Probably not, unless the channel noise has some special structure.
     Following the development of lossless source coding, Shannon turned attention to answering the following question: Is it possible to have a sequence of encoder/decoder pairs such that P_e^{(N)} → 0 as N → ∞? If so, what is the largest possible rate?
     Recall: In lossless source coding, we saw that the infimum of compression code rates R at which vanishing error probability is possible is H({S_i}).

  4. [Figure: tradeoff triangle among rate R, block length N, and probability of error P_e^{(N)}]
     Capacity: take N → ∞, require P_e^{(N)} → 0 ⇒ sup R = C.
     Error exponent: take N → ∞, fix the rate R ⇒ min P_e^{(N)} ≈ 2^{−N E(R)}.
     Finite block length: fix N, require P_e^{(N)} ≤ ε ⇒ sup R = C − √(V/N) Q^{−1}(ε) + O(log N / N).
     Remark: For source coding, one can establish a similar framework.
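The finite-block-length expression above can be evaluated numerically. Below is a minimal sketch for a binary symmetric channel, assuming the standard closed forms C = 1 − H₂(p) for its capacity and V = p(1 − p)·log₂²((1−p)/p) for its dispersion; the parameter values p = 0.11 and ε = 10⁻³ are illustrative choices of mine, not from the slides.

```python
import math

def H2(p):
    """Binary entropy function in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def Qinv(eps):
    """Inverse Gaussian Q-function via bisection; Q(x) = 0.5*erfc(x/sqrt(2))."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if 0.5 * math.erfc(mid / math.sqrt(2)) > eps:  # Q is decreasing
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

p, eps = 0.11, 1e-3                                 # illustrative BSC parameters
C = 1 - H2(p)                                       # BSC capacity, bits/channel use
V = p * (1 - p) * (math.log2((1 - p) / p)) ** 2     # BSC channel dispersion
for N in (100, 1000, 10000):
    R = C - math.sqrt(V / N) * Qinv(eps)            # O(log N / N) term dropped
    print(f"N = {N:5d}: best achievable rate ~ {R:.4f} (capacity {C:.4f})")
```

The printout shows how the back-off from capacity shrinks like 1/√N as the block length grows.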

  5. In this lecture we only focus on capacity. In other words, we ignore the issue of finite block length (FBL). FBL performance can be obtained via techniques extending from the CLT. We do not pursue finer analysis of the error probability via large-deviation techniques either.

  6. Discrete Memoryless Channel (DMC). In order to demonstrate the key ideas in channel coding, in this lecture we shall focus on discrete memoryless channels (DMC), defined below.
     Definition 1 (Discrete Memoryless Channel): A discrete channel (X, {p(y_k | x^k, y^{k−1}) | k ∈ ℕ}, Y) is memoryless if ∀ k ∈ ℕ, p(y_k | x^k, y^{k−1}) = p_{Y|X}(y_k | x_k). In other words, Y_k − X_k − (X^{k−1}, Y^{k−1}) forms a Markov chain. Here the conditional p.m.f. p_{Y|X} is called the channel law or channel transition function.
     Question: Is our definition of a channel sufficient to specify p(y^N | x^N), the stochastic relationship between the channel input (codeword) x^N and the channel output y^N?
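One concrete way to represent a channel law p_{Y|X} is as a row-stochastic table. The sketch below (my own encoding, not from the slides) builds two classic DMCs, the binary symmetric channel and the binary erasure channel, and a `transmit` helper that uses the channel memorylessly: each y_k is sampled from p_{Y|X}(· | x_k) independently of the past.

```python
import random

def bsc(p):
    """Binary symmetric channel: flips the input bit with probability p."""
    return {0: {0: 1 - p, 1: p}, 1: {0: p, 1: 1 - p}}

def bec(e):
    """Binary erasure channel: erases the input (to '?') with probability e."""
    return {0: {0: 1 - e, '?': e}, 1: {1: 1 - e, '?': e}}

def transmit(channel, x_seq, rng=random):
    """Memoryless channel use: each y_k is drawn from p(. | x_k) independently."""
    out = []
    for x in x_seq:
        symbols, probs = zip(*channel[x].items())
        out.append(rng.choices(symbols, weights=probs)[0])
    return out

print(transmit(bsc(0.1), [0, 1, 0, 1, 1], random.Random(0)))
print(transmit(bec(0.3), [0, 1, 0, 1, 1], random.Random(0)))
```

Each inner dict is one row of the transition matrix, so its values must sum to 1.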

  7. Observe that
     p(y^N | x^N) = p(x^N, y^N) / p(x^N), where
     p(x^N, y^N) = ∏_{k=1}^{N} p(x_k, y_k | x^{k−1}, y^{k−1}) = ∏_{k=1}^{N} p(y_k | x^k, y^{k−1}) p(x_k | x^{k−1}, y^{k−1}).
     Hence, we need to further specify {p(x_k | x^{k−1}, y^{k−1}) | k ∈ ℕ}, since p(x^N) cannot be obtained from {p(y_k | x^k, y^{k−1}) | k ∈ ℕ} alone.
     Interpretation: p(x_k | x^{k−1}, y^{k−1}) is induced by the encoding function, which implies that the encoder can potentially make use of the past channel output, i.e., feedback.

  8. DMC without Feedback. [Block diagrams: without feedback, w → Encoder → x_k → Noisy Channel → y_k; with feedback, the encoder additionally observes the delayed output y^{k−1}]
     Suppose the encoder has no knowledge about the realization of the channel output. Then p(x_k | x^{k−1}, y^{k−1}) = p(x_k | x^{k−1}) for all k ∈ ℕ, and the channel is said to have no feedback. In this case, specifying {p(y_k | x^k, y^{k−1}) | k ∈ ℕ} suffices to specify p(y^N | x^N).
     Proposition 1 (DMC without Feedback): For a DMC (X, p_{Y|X}, Y) without feedback, p(y^N | x^N) = ∏_{k=1}^{N} p_{Y|X}(y_k | x_k).
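Proposition 1 can be checked concretely: without feedback, p(y^N | x^N) is just a product of per-symbol channel-law terms, and summing it over all output sequences must give 1. A small sketch, using an arbitrary made-up binary channel law:

```python
from itertools import product

# An arbitrary (made-up) channel law p_{Y|X} for a DMC with X = Y = {0, 1}.
p_y_given_x = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}

def prob_yN_given_xN(yN, xN):
    """Proposition 1: without feedback, p(y^N | x^N) factorizes symbol by symbol."""
    prob = 1.0
    for x, y in zip(xN, yN):
        prob *= p_y_given_x[x][y]
    return prob

xN = (0, 1, 1, 0)
total = sum(prob_yN_given_xN(yN, xN) for yN in product((0, 1), repeat=len(xN)))
print(total)  # ~ 1.0: a valid conditional distribution over all y^N
```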

  9. Overview. In this lecture, we would like to establish the following (informally described) noisy channel coding theorem due to Shannon: For a DMC (X, p_{Y|X}, Y), the maximum code rate with vanishing error probability is the channel capacity C ≜ max_{p_X(·)} I(X; Y). The above holds regardless of the availability of feedback.
     To demonstrate this result, we organize the lecture as follows:
     1. Give the problem formulation, state the main theorem, and visit a couple of examples to show how to compute channel capacity.
     2. Prove the converse part: an achievable rate cannot exceed C.
     3. Prove the achievability part with a random coding argument.

  10. Outline:
      1. Channel Capacity and the Weak Converse: Channel Capacity; Proof of the Weak Converse; Feedback Capacity.
      2. Achievability Proof and Source-Channel Separation: Achievability Proof; Source-Channel Separation.

  11. Outline (repeated):
      1. Channel Capacity and the Weak Converse: Channel Capacity; Proof of the Weak Converse; Feedback Capacity.
      2. Achievability Proof and Source-Channel Separation: Achievability Proof; Source-Channel Separation.

  12. Channel Coding without Feedback: Problem Setup. [Block diagram: w → Encoder → x^N → Noisy Channel → y^N → Decoder → ŵ]
      1. A (2^{NR}, N) channel code consists of
         - an encoding function (encoder) enc_N : [1 : 2^K] → X^N that maps each message w to a length-N codeword x^N, where K ≜ ⌈NR⌉;
         - a decoding function (decoder) dec_N : Y^N → [1 : 2^K] ∪ {∗} that maps a channel output sequence y^N to a reconstructed message ŵ or an error message ∗.
      2. The error probability is defined as P_e^{(N)} ≜ P{W ≠ Ŵ}.
      3. A rate R is said to be achievable if there exists a sequence of (2^{NR}, N) codes such that P_e^{(N)} → 0 as N → ∞. The channel capacity is defined as C ≜ sup{R : R is achievable}.
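The simplest instance of this setup is the rate-1/N repetition code over a binary symmetric channel with majority-vote decoding. The sketch below (parameter values are illustrative choices of mine) estimates P_e^{(N)} by simulation; it also shows why the capacity question is interesting: this naive way of driving P_e down also drives the rate R = 1/N to zero.

```python
import random

def enc(w, N):
    """Repetition encoder: message w in {0, 1} -> length-N codeword (rate R = 1/N)."""
    return [w] * N

def dec(y):
    """Majority-vote decoder."""
    return 1 if sum(y) > len(y) / 2 else 0

def bsc_out(x, p, rng):
    """Pass codeword x through a BSC with crossover probability p."""
    return [b ^ (rng.random() < p) for b in x]

rng = random.Random(1)
p, N, trials = 0.1, 9, 20000
errors = 0
for _ in range(trials):
    w = rng.randrange(2)                              # W ~ Unif{0, 1}
    errors += dec(bsc_out(enc(w, N), p, rng)) != w
print(f"empirical P_e ~ {errors / trials:.4f} at rate R = 1/{N}")
```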

  13. Channel Coding Theorem for Discrete Memoryless Channels.
      Theorem 1 (Channel Coding Theorem for DMC without Feedback): The capacity C of the DMC p(y|x) without feedback is given by
      C = max_{p(x)} I(X; Y).   (1)
      The capacity formula (1) is intuitive, since I(X; Y) represents the amount of information about the channel input X that one can infer from the channel output Y. The maximization over p(x) stands for choosing the best possible input distribution so that the amount of information transfer is maximized.
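The maximization in (1) can be carried out numerically. As a sketch (my own example, not from the slides), the code below grid-searches over Bernoulli(q) input distributions for a BSC with crossover probability p and compares the result with the known closed form C = 1 − H₂(p); for a general DMC one would typically use the Blahut-Arimoto algorithm instead.

```python
import math

def H2(t):
    """Binary entropy in bits."""
    return 0.0 if t in (0.0, 1.0) else -t * math.log2(t) - (1 - t) * math.log2(1 - t)

def mutual_info_bsc(q, p):
    """I(X; Y) = H(Y) - H(Y|X) for X ~ Bern(q) through a BSC with crossover p."""
    r = q * (1 - p) + (1 - q) * p    # P(Y = 1)
    return H2(r) - H2(p)

p = 0.1
C = max(mutual_info_bsc(i / 1000, p) for i in range(1001))  # grid over input laws
print(f"grid search: C ~ {C:.4f}; closed form 1 - H2(p) = {1 - H2(p):.4f}")
```

The maximum is attained at q = 1/2, i.e., the uniform input distribution.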

  14. Rest of the lecture:
      1. First we give some examples of noisy channels to show how to compute capacity.
      2. Then, we prove that for any rate R > C, it is impossible to have vanishing error probability (converse).
      3. Finally, we prove that for any R < C, there exists a sequence of encoding/decoding schemes such that the error probability vanishes as the blocklength tends to ∞ (achievability), based on a probabilistic argument called random coding.
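A toy Monte Carlo sketch of the random coding idea (all names and parameter values are my own, and the block lengths are far too small to be conclusive): draw roughly 2^{NR} codewords i.i.d. uniform, transmit one over a BSC, and decode by minimum Hamming distance. For R below capacity the empirical error rate should shrink as N grows.

```python
import random

def hamming(a, b):
    return sum(u != v for u, v in zip(a, b))

def trial(N, R, p, rng):
    """One random coding experiment: random codebook, BSC(p), min-distance decoding."""
    M = 2 ** max(1, int(N * R))                      # roughly 2^{NR} messages
    code = [[rng.randrange(2) for _ in range(N)] for _ in range(M)]
    w = rng.randrange(M)                             # uniform message
    y = [b ^ (rng.random() < p) for b in code[w]]    # transmit over the BSC
    w_hat = min(range(M), key=lambda m: hamming(code[m], y))
    return w_hat != w

rng = random.Random(0)
p, R = 0.05, 0.3              # R is well below C = 1 - H2(0.05), about 0.71 bits
for N in (8, 16, 32):
    err = sum(trial(N, R, p, rng) for _ in range(400)) / 400
    print(f"N = {N:2d}: empirical P_e ~ {err:.3f}")
```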
