Lecture 4: Noisy Channel Coding


  1. Lecture 4: Noisy Channel Coding
     Channel Capacity and the Weak Converse
     Achievability Proof and Source-Channel Separation
     Channel Coding with Input Cost
     I-Hsiang Wang
     Department of Electrical Engineering, National Taiwan University
     ihwang@ntu.edu.tw
     November 2, 2015

  2. Meta Description: The Channel Coding Problem
     [Block diagram: w → Encoder → x^N → Noisy Channel → y^N → Decoder → ŵ]
     1. Message: a random message W ∼ Unif[1 : 2^K].
     2. Channel: consists of an input alphabet X, an output alphabet Y, and a family of conditional distributions {p(y_k | x^k, y^(k−1))}_{k∈ℕ} determining the stochastic relationship between the output symbol y_k and the input symbol x_k along with all past signals (x^(k−1), y^(k−1)).
     3. Encoder: encodes the message w into a length-N codeword x^N ∈ X^N.
     4. Decoder: reconstructs the message ŵ from the channel output y^N.
     5. Efficiency: maximize the code rate R ≜ K/N bits/channel use, given a certain decoding criterion.
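To make the block diagram concrete, here is a minimal Python sketch of the message → encoder → channel → decoder chain. The binary symmetric channel, the 3× repetition encoder, and all parameter values are illustrative choices of ours, not part of the slide.

```python
# Minimal sketch of the channel coding block diagram (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

K, N = 2, 6                      # K message bits carried by N channel uses
R = K / N                        # code rate in bits per channel use

def encode(w_bits):
    """Encoder enc_N: map K message bits to a length-N codeword (3x repetition here)."""
    return np.repeat(w_bits, N // K)

def channel(x, p=0.1):
    """Memoryless binary symmetric channel: flip each symbol independently w.p. p."""
    flips = rng.random(x.shape) < p
    return np.bitwise_xor(x, flips.astype(int))

def decode(y):
    """Decoder dec_N: majority vote within each block of N/K received symbols."""
    blocks = y.reshape(K, N // K)
    return (blocks.sum(axis=1) > (N // K) / 2).astype(int)

w = rng.integers(0, 2, size=K)   # random message, uniform over {0,1}^K
w_hat = decode(channel(encode(w)))
print(R, w, w_hat)
```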

  3. Decoding Criterion: Vanishing Error Probability
     [Block diagram: w → Encoder → x^N → Noisy Channel → y^N → Decoder → ŵ]
     A key performance measure is the error probability P_e^(N) ≜ P{W ≠ Ŵ}.
     Question: Is it possible to get zero error probability?
     Ans: Probably not, unless the channel noise has some special structure.
     Following the development of lossless source coding, Shannon turned his attention to answering the following question:
     Is it possible to have a sequence of encoder/decoder pairs such that P_e^(N) → 0 as N → ∞? If so, what is the largest possible code rate R at which vanishing error probability is possible?
     Recall: In lossless source coding, we saw that the infimum of compression rates at which vanishing error probability is possible is H({S_i}).
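As an illustration of the question above (our own example, not from the slides): an N-fold repetition code over a BSC(p) drives P_e^(N) → 0 as N grows, but its rate 1/N also vanishes, which is precisely why the interesting question is the largest rate achievable with vanishing error.

```python
# Exact error probability of an N-fold repetition code with majority-vote
# decoding over a BSC(p): the error vanishes, but so does the rate 1/N.
from math import comb

def repetition_error_prob(N, p=0.1):
    """P(majority of the N received symbols is flipped), for odd N."""
    return sum(comb(N, k) * p**k * (1 - p)**(N - k) for k in range(N // 2 + 1, N + 1))

for N in (1, 5, 11, 21, 51):
    print(N, 1 / N, repetition_error_prob(N))
```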

  4. [Trade-off among three quantities: rate R, block length N, probability of error P_e^(N).]
     Capacity: take N → ∞, require P_e^(N) → 0 ⇒ sup R = C.
     Error exponent: take N → ∞, fix the rate R ⇒ min P_e^(N) ≈ 2^(−N·E(R)).
     Finite block length: fix N, require P_e^(N) ≤ ε ⇒ sup R = C − √(V/N)·Q^(−1)(ε) + O(log N / N).
     Remark: For source coding, one can establish a similar framework.
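The sketch below (our addition, requiring NumPy/SciPy) evaluates the finite-block-length normal approximation numerically for a BSC(p), using the standard closed forms C = 1 − h(p) for the capacity and V = p(1 − p)·log₂²((1 − p)/p) for the channel dispersion; the slide itself does not fix any particular channel.

```python
# Numerical illustration of sup R ~ C - sqrt(V/N) * Q^{-1}(eps) for a BSC(p).
import numpy as np
from scipy.stats import norm

def bsc_capacity_dispersion(p):
    h = -p * np.log2(p) - (1 - p) * np.log2(1 - p)   # binary entropy h(p)
    C = 1 - h
    V = p * (1 - p) * np.log2((1 - p) / p) ** 2      # channel dispersion (bits^2)
    return C, V

p, eps = 0.11, 1e-3
C, V = bsc_capacity_dispersion(p)
for N in (100, 1000, 10000, 100000):
    # Q^{-1}(eps) is the inverse Gaussian tail function, i.e. norm.isf(eps)
    print(N, C - np.sqrt(V / N) * norm.isf(eps))
```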

  5. In this lecture we focus only on capacity. In other words, we ignore the issue of finite block length (FBL). FBL performance can be obtained via techniques extending from the CLT. We do not pursue finer analysis of the error probability via large-deviation techniques either.

  6. Discrete Memoryless Channel (DMC)
     In order to demonstrate the key ideas in channel coding, in this lecture we shall focus on discrete memoryless channels (DMC), defined below.
     Definition 1 (Discrete Memoryless Channel)
     A discrete channel (X, {p(y_k | x^k, y^(k−1))}_{k∈ℕ}, Y) is memoryless if
       ∀ k ∈ ℕ, p(y_k | x^k, y^(k−1)) = p_{Y|X}(y_k | x_k).
     In other words, Y_k − X_k − (X^(k−1), Y^(k−1)) forms a Markov chain.
     Here the conditional p.m.f. p_{Y|X} is called the channel law or channel transition function.
     Question: Is our definition of a channel sufficient to specify p(y^N | x^N), the stochastic relationship between the channel input (codeword) x^N and the channel output y^N?
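For finite alphabets, the channel law p_{Y|X} is simply a row-stochastic |X| × |Y| matrix, and memorylessness means each output symbol is drawn from the row selected by the current input alone. A small sketch, with a BSC(0.1) as an illustrative choice (the definition on the slide is channel-agnostic):

```python
# A DMC described by its channel law matrix; y_k depends only on x_k.
import numpy as np

p = 0.1
P_Y_given_X = np.array([[1 - p, p],     # row for input x = 0
                        [p, 1 - p]])    # row for input x = 1

rng = np.random.default_rng(1)

def dmc_output(x_seq, W=P_Y_given_X):
    """Memoryless channel use: each y_k is drawn from row W[x_k], independent of the past."""
    return np.array([rng.choice(W.shape[1], p=W[x]) for x in x_seq])

print(dmc_output([0, 1, 1, 0, 1]))
```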

  7. p(y^N | x^N) = p(x^N, y^N) / p(x^N), where
       p(x^N, y^N) = ∏_{k=1}^{N} p(x_k, y_k | x^(k−1), y^(k−1))
                   = ∏_{k=1}^{N} p(x_k | x^(k−1), y^(k−1)) · p(y_k | x^k, y^(k−1)).
     Hence, we need to further specify {p(x_k | x^(k−1), y^(k−1))}_{k∈ℕ}, which cannot be obtained from p(x^N).
     Interpretation: p(x_k | x^(k−1), y^(k−1)) is induced by the encoding function, which implies that the encoder can potentially make use of the past channel output, i.e., feedback.

  8. DMC without Feedback
     [Block diagrams: channel coding with feedback (encoder sees y^(k−1)) vs. without feedback.]
     Suppose the encoder has no knowledge about the realization of the channel output; then p(x_k | x^(k−1), y^(k−1)) = p(x_k | x^(k−1)) for all k ∈ ℕ, and the channel is said to have no feedback. In this case, specifying {p(y_k | x^k, y^(k−1))}_{k∈ℕ} suffices to specify p(y^N | x^N).
     Proposition 1 (DMC without Feedback)
     For a DMC (X, p_{Y|X}, Y) without feedback,
       p(y^N | x^N) = ∏_{k=1}^{N} p_{Y|X}(y_k | x_k).
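Proposition 1 translates directly into code: without feedback, the block likelihood p(y^N | x^N) is a product of single-letter channel-law terms. A minimal sketch (the channel matrix and the sequences are illustrative choices, not from the slide):

```python
# Block likelihood of a DMC without feedback: product of per-symbol terms.
import numpy as np

p = 0.1
P_Y_given_X = np.array([[1 - p, p],
                        [p, 1 - p]])

def block_likelihood(x_seq, y_seq, W=P_Y_given_X):
    """p(y^N | x^N) = prod_k p_{Y|X}(y_k | x_k)."""
    return float(np.prod([W[x, y] for x, y in zip(x_seq, y_seq)]))

print(block_likelihood([0, 1, 1, 0], [0, 1, 0, 0]))   # one flip: 0.9^3 * 0.1
```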

  9. Overview
     In this lecture, we would like to establish the following (informally described) noisy channel coding theorem due to Shannon:
     For a DMC (X, p_{Y|X}, Y), the maximum code rate with vanishing error probability is the channel capacity
       C ≜ max_{p_X(·)} I(X; Y).
     The above holds regardless of the availability of feedback.
     To demonstrate this result, we organize the lecture as follows:
     1. Give the problem formulation, state the main theorem, and visit a couple of examples to show how to compute channel capacity.
     2. Prove the converse part: an achievable rate cannot exceed C.
     3. Prove the achievability part with a random coding argument.
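As a quick numerical sanity check of the capacity formula (our own illustration, not in the slides), a brute-force search over binary input distributions for a BSC(0.1) recovers the closed-form value 1 − h(0.1) ≈ 0.531 bits, attained by the uniform input:

```python
# Brute-force check of C = max_{p_X} I(X;Y) for a BSC(0.1).
import numpy as np

def mutual_information(p_x, W):
    """I(X;Y) in bits for input pmf p_x and channel law W (rows: x, cols: y)."""
    p_xy = p_x[:, None] * W
    p_y = p_xy.sum(axis=0)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y[None, :])[mask])))

p = 0.1
W = np.array([[1 - p, p], [p, 1 - p]])
grid = np.linspace(0.01, 0.99, 99)
C_numeric = max(mutual_information(np.array([a, 1 - a]), W) for a in grid)
C_closed = 1 - (-p * np.log2(p) - (1 - p) * np.log2(1 - p))
print(C_numeric, C_closed)   # both ~0.531 bits
```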

 10. Outline
     1. Channel Capacity and the Weak Converse
        Channel Capacity · Proof of the Weak Converse · Feedback Capacity
     2. Achievability Proof and Source-Channel Separation
        Achievability Proof · Source-Channel Separation
     3. Channel Coding with Input Cost
        Cost Constraints · Converse Proof · Achievability Proof

 11. Outline
     1. Channel Capacity and the Weak Converse
        Channel Capacity · Proof of the Weak Converse · Feedback Capacity
     2. Achievability Proof and Source-Channel Separation
        Achievability Proof · Source-Channel Separation
     3. Channel Coding with Input Cost
        Cost Constraints · Converse Proof · Achievability Proof

 12. Channel Coding without Feedback: Problem Setup
     [Block diagram: w → Encoder → x^N → Noisy Channel → y^N → Decoder → ŵ]
     1. A (2^(NR), N) channel code consists of
        an encoding function (encoder) enc_N : [1 : 2^K] → X^N that maps each message w to a length-N codeword x^N, where K ≜ ⌈NR⌉;
        a decoding function (decoder) dec_N : Y^N → [1 : 2^K] ∪ {∗} that maps a channel output sequence y^N to a reconstructed message ŵ or an error message ∗.
     2. The error probability is defined as P_e^(N) ≜ P{W ≠ Ŵ}.
     3. A rate R is said to be achievable if there exists a sequence of (2^(NR), N) codes such that P_e^(N) → 0 as N → ∞. The channel capacity is defined as C ≜ sup{R : R is achievable}.
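The definitions above can be exercised with a toy Monte-Carlo experiment (ours, not from the slide): draw one random (2^(NR), N) codebook, decode by minimum Hamming distance (maximum likelihood for a BSC), and estimate P_e^(N). All parameter values are illustrative.

```python
# Monte-Carlo estimate of P_e^(N) for one randomly drawn (2^{NR}, N) code on a BSC.
import numpy as np

rng = np.random.default_rng(2)
p, R, N = 0.05, 0.5, 16
M = int(2 ** np.ceil(N * R))                 # 2^K messages, K = ceil(NR)
codebook = rng.integers(0, 2, size=(M, N))   # i.i.d. Bernoulli(1/2) codewords

def estimate_Pe(trials=2000):
    errors = 0
    for _ in range(trials):
        w = rng.integers(M)                              # W ~ Unif[1 : 2^K]
        y = codebook[w] ^ (rng.random(N) < p)            # BSC noise
        w_hat = np.argmin((codebook ^ y).sum(axis=1))    # min-distance (ML) decoding
        errors += (w_hat != w)
    return errors / trials

print(estimate_Pe())
```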

 13. Channel Coding Theorem for Discrete Memoryless Channels
     Theorem 1 (Channel Coding Theorem for DMC without Feedback)
     The capacity C of the DMC p(y|x) without feedback is given by
       C = max_{p(x)} I(X; Y).    (1)
     The capacity formula (1) is intuitive, since I(X; Y) represents the amount of information about the channel input X that one can infer from the channel output Y.
     The maximization over p(x) stands for choosing the best possible input distribution so that the amount of information transfer is maximized.
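One standard way to carry out the maximization over p(x) numerically is the Blahut-Arimoto algorithm; the slides do not cover it, so the sketch below is only an illustrative aid. It is checked on a binary erasure channel BEC(0.3), whose capacity 1 − 0.3 = 0.7 bits is known in closed form.

```python
# Blahut-Arimoto sketch for the maximization over p(x) in formula (1).
import numpy as np

def blahut_arimoto(W, iters=200):
    """Return (C in bits, input pmf) for a channel law W[x, y]."""
    nx = W.shape[0]
    r = np.full(nx, 1.0 / nx)                    # start from the uniform input
    for _ in range(iters):
        q = r[:, None] * W                       # joint p(x, y)
        q /= q.sum(axis=0, keepdims=True)        # posterior q(x | y)
        r = np.prod(q ** W, axis=1)              # r(x) proportional to prod_y q(x|y)^{W(y|x)}
        r /= r.sum()
    p_y = (r[:, None] * W).sum(axis=0)
    mask = W > 0
    C = np.sum((r[:, None] * W)[mask] * np.log2((W / p_y[None, :])[mask]))
    return float(C), r

# BEC(0.3): capacity should come out as 0.7 bits, achieved by the uniform input.
W_bec = np.array([[0.7, 0.0, 0.3],
                  [0.0, 0.7, 0.3]])
print(blahut_arimoto(W_bec))
```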
