Lecture 6: Channel Coding over Continuous Channels



  1. Lecture 6: Channel Coding over Continuous Channels.
     I-Hsiang Wang, Department of Electrical Engineering, National Taiwan University.
     ihwang@ntu.edu.tw. November 9, 2015.

  2. We have investigated the measures of information for continuous r.v.'s:
     - The amount of uncertainty (entropy) is mostly infinite.
     - Mutual information and KL divergence are well defined.
     - Differential entropy is a useful entity for computing and manipulating measures of information for continuous r.v.'s.
     Question: How about coding theorems? Is there a general way or framework to extend coding theorems from discrete (memoryless) sources/channels to continuous (memoryless) sources/channels?
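Since the slides lean on differential entropy as the workhorse quantity, a quick numerical check may help fix ideas. This is my own illustration, not from the lecture: the numerically integrated differential entropy of a Gaussian matches the closed form ½ log₂(2πeσ²).

```python
import numpy as np

# Differential entropy of N(0, sigma^2), numerically vs. closed form.
sigma = 2.0
y = np.linspace(-30, 30, 300001)                 # wide grid; density ~0 at the edges
f = np.exp(-y**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
h_numeric = -np.trapz(f * np.log2(f), y)         # h(Y) = -∫ f log2 f dy
h_closed = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)
print(h_numeric, h_closed)                       # both ≈ 3.047 bits for sigma = 2
```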

  3. Discrete memoryless channel (DMC):
         w → Encoder → x^N → p_{Y|X} → y^N → Decoder → ŵ,
     with capacity
         C(B) = max_{X : E[b(X)] ≤ B} I(X; Y).
     Continuous memoryless channel (CMC):
         w → Encoder → x^N → f_{Y|X} → y^N → Decoder → ŵ.
     Is the capacity
         C(B) = sup_{X : E[b(X)] ≤ B} I(X; Y) ?
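One way to see why the DMC case is written with a max: for finite alphabets, I(X;Y) is maximized over a compact probability simplex, and the maximizer can even be computed. The sketch below is my own aside (not from the slides): a minimal Blahut-Arimoto iteration for the unconstrained case (cost constraint inactive), with a binary symmetric channel as the test case. For a CMC, the input distribution has no finite parametrization in general, hence the sup.

```python
import numpy as np

def blahut_arimoto(W, n_iter=200):
    """Capacity C = max_{p_X} I(X;Y) of a DMC with transition matrix
    W[x, y] = p(y|x), via the Blahut-Arimoto fixed-point iteration."""
    p = np.full(W.shape[0], 1.0 / W.shape[0])      # start from the uniform input
    for _ in range(n_iter):
        q = p @ W                                  # output distribution induced by p
        # d[x] = D( W(.|x) || q ), with 0 log 0 treated as 0
        d = np.where(W > 0, W * np.log2(W / q), 0.0).sum(axis=1)
        p = p * np.exp2(d)                         # multiplicative update
        p /= p.sum()
    q = p @ W
    d = np.where(W > 0, W * np.log2(W / q), 0.0).sum(axis=1)
    return float(p @ d), p                         # (capacity in bits, optimal p_X)

eps = 0.1
bsc = np.array([[1 - eps, eps], [eps, 1 - eps]])   # binary symmetric channel
C, p_opt = blahut_arimoto(bsc)
print(C, p_opt)   # ~0.531 bits (= 1 - H_2(0.1)); the uniform input is optimal
```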

  4. Coding Theorems: from Discrete to Continuous (1).
     Two main techniques for extending the achievability part of coding theorems from the discrete world to the continuous world:
     1 Discretization: Discretize the source and channel input/output to create a discrete system, and then make the discretization finer and finer to prove the achievability.
     2 New typicality: Extend weak typicality to continuous r.v.'s. In particular, replace the entropy terms in the definitions of weakly typical sequences by differential entropy terms, and repeat the arguments in a similar way.
     Using discretization to derive the achievability of Gaussian channel capacity follows Gallager [2] and El Gamal & Kim [6]. Cover & Thomas [1] and Yeung [5] use weak typicality for continuous r.v.'s. Moser [4] uses a threshold decoder, similar to weak typicality.

  5. Coding Theorems: from Discrete to Continuous (2).
     In this lecture, we use discretization for the achievability proof.
     Pros: No need for new tools (e.g., typicality) for continuous r.v.'s. Extends naturally to multi-terminal settings, so one can focus on discrete memoryless networks.
     Cons: Technical; not much insight into how to achieve capacity. Hence, we also use a geometric argument to provide insight into how to achieve capacity.
     Disclaimer: We will not be 100% rigorous in deriving the results in this lecture; rigorous treatments can be found in the references.

  6. Outline.
     1 First, we formulate the channel coding problem over continuous memoryless channels (CMC), state the coding theorem, and sketch the converse and achievability proofs.
     2 Second, we introduce the additive Gaussian noise (AGN) channel, derive the Gaussian channel capacity, and provide insights based on geometric arguments.
     3 We then explore extensions: parallel Gaussian channels, Gaussian channels with correlated noises, and continuous-time bandlimited Gaussian channels.

  7. 1 Channel Coding over Continuous Memoryless Channels: Continuous Memoryless Channel; Gaussian Channel Capacity.
     2 Parallel Gaussian Channel: Parallel Channel with Independent Noises; Parallel Channel with Colored Noises.

  8. (Outline repeated as a section divider: the lecture enters Channel Coding over Continuous Memoryless Channels.)

  9. Continuous Memoryless Channel.
     w → Encoder → x^N → f_{Y|X} → y^N → Decoder → ŵ.
     1 Input/output alphabet X = Y = ℝ.
     2 Continuous Memoryless Channel (CMC): the channel law is governed by the conditional density (p.d.f.) f_{Y|X}. Memoryless: Y_k − X_k − (X^{k−1}, Y^{k−1}) forms a Markov chain.
     3 Average input cost constraint B: (1/N) ∑_{k=1}^{N} b(x_k) ≤ B, where b : ℝ → [0, ∞) is the (single-letter) cost function.
     The definitions of error probability, achievable rate, and capacity are the same as those in channel coding over a DMC.

  10. Channel Coding Theorem.
      Theorem 1 (Continuous Memoryless Channel Capacity). The capacity of the CMC (ℝ, f_{Y|X}, ℝ) with input cost constraint B is
          C = sup_{X : E[b(X)] ≤ B} I(X; Y).   (1)
      Note: The input distribution of X need not have a density; in other words, it could also be discrete.
      Converse proof: exactly the same as that in the DMC case.
      How to compute h(Y|X) when X has no density? Recall
          h(Y|X) = E_X[ −∫_{supp(Y)} f(y|X) log f(y|X) dy ],
      where f(y|x) is the conditional density of Y given X.
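A small numerical sketch of this remark (my own illustration; it assumes an additive Gaussian law Y = X + Z, which the lecture introduces later): take X = ±1 with equal probability, so X has no density, yet f(y|x) is Gaussian and both h(Y|X) and I(X;Y) = h(Y) − h(Y|X) are well defined.

```python
import numpy as np

sigma = 1.0
xs, px = np.array([-1.0, 1.0]), np.array([0.5, 0.5])  # discrete input: no density

def f_cond(y, x):              # conditional density f(y|x), i.e. N(x, sigma^2)
    return np.exp(-(y - x)**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

y = np.linspace(-10, 10, 20001)

# h(Y|X) = E_X[ -∫ f(y|X) log2 f(y|X) dy ]: average over the two mass points.
h_y_given_x = sum(p * -np.trapz(f_cond(y, x) * np.log2(f_cond(y, x)), y)
                  for x, p in zip(xs, px))

f_y = sum(p * f_cond(y, x) for x, p in zip(xs, px))   # mixture density of Y
h_y = -np.trapz(f_y * np.log2(f_y), y)

print(h_y_given_x)        # = 0.5 * log2(2*pi*e*sigma^2) ≈ 2.047 bits
print(h_y - h_y_given_x)  # I(X;Y) ≈ 0.49 bits for this binary input
```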

  11. Sketch of the Achievability (1): Discretization.
      The proof of achievability makes use of discretization, so that one can apply the result for the DMC with input cost.
      w → ENC → x^N → f_{Y|X} → y^N → DEC → ŵ.

  12. Sketch of the Achievability (1): Discretization.
      w → ENC → Q_in → f_{Y|X} → Q_out → DEC → ŵ.
      Q_in: (single-letter) discretization that maps X ∈ ℝ to X_d ∈ X_d.
      Q_out: (single-letter) discretization that maps Y ∈ ℝ to Y_d ∈ Y_d.
      Note that both X_d and Y_d are discrete (countable) alphabets.

  13. Sketch of the Achievability (1): Discretization.
      Idea: With the two discretization blocks Q_in and Q_out, one can build an equivalent DMC (X_d, p_{Y_d|X_d}, Y_d), as shown above: group ENC with Q_in into a new encoder, and view the cascade of f_{Y|X} and Q_out, driven by inputs in X_d, as a DMC with transition probability p_{Y_d|X_d}. (A concrete construction sketch follows below.)
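To make the equivalent DMC concrete, here is a small construction sketch (my own, assuming a Gaussian channel law Y = X + Z with Z ~ N(0, σ²) and uniform quantizers of width Δ; all names are mine): each row of p_{Y_d|X_d} is the probability mass that the output Gaussian puts in the corresponding Q_out bin.

```python
import numpy as np
from scipy.stats import norm

sigma, delta = 1.0, 0.5
x_d = np.arange(-3.0, 3.0 + delta, delta)    # Q_in representative points (alphabet X_d)
edges = np.arange(-6.0, 6.0 + delta, delta)  # Q_out bin edges defining Y_d
edges[0], edges[-1] = -np.inf, np.inf        # fold the tails into the boundary bins

# Equivalent DMC: p(y_d | x_d) = ∫_bin f(y | x_d) dy with f(.|x) = N(x, sigma^2).
P = np.array([np.diff(norm.cdf(edges, loc=x, scale=sigma)) for x in x_d])
assert np.allclose(P.sum(axis=1), 1.0)       # every row is a valid pmf
print(P.shape)                               # |X_d| x |Y_d| transition matrix
```

Feeding this finite matrix to the DMC achievability result gives rates up to I(X_d; Y_d); the remaining work is the limit in step 4 of the next slide.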

  14. Sketch of the Achievability (2): Arguments.
      w → New ENC → x_d^N → p_{Y_d|X_d} (equivalent DMC) → y_d^N → DEC → ŵ.
      1 Random codebook generation: Generate the codebook randomly based on the original (continuous) r.v. X, satisfying E[b(X)] ≤ B.
      2 Choice of discretization: Choose Q_in such that the cost constraint will not be violated after discretization. Specifically, E[b(X_d)] ≤ B.
      3 Achievability in the equivalent DMC: By the achievability part of the channel coding theorem for the DMC with input constraint, any rate R < I(X_d; Y_d) is achievable.
      4 Achievability in the original CMC: Prove that as the discretization in Q_in and Q_out gets finer and finer, I(X_d; Y_d) → I(X; Y). (A numerical sanity check follows below.)
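The sketch below is a numerical sanity check of step 4, my own and not from the slides. It assumes an AWGN channel with Gaussian input, for which I(X;Y) = ½ log₂(1 + P/σ²) is known in closed form, and estimates I(X_d; Y_d) from a histogram of quantized samples; the estimate climbs toward the continuous value as the bin width Δ shrinks.

```python
import numpy as np

rng = np.random.default_rng(0)
P_pow, sigma, n = 1.0, 1.0, 1_000_000
x = rng.normal(0, np.sqrt(P_pow), n)        # Gaussian input with E[X^2] = P
y = x + rng.normal(0, sigma, n)             # AWGN: Y = X + Z

def mi_quantized(delta):
    """Plug-in estimate of I(X_d; Y_d) after quantizing X, Y into width-delta bins."""
    bx = np.arange(x.min(), x.max() + delta, delta)
    by = np.arange(y.min(), y.max() + delta, delta)
    pxy, _, _ = np.histogram2d(x, y, bins=[bx, by])
    pxy /= pxy.sum()                                    # empirical joint pmf
    px, py = pxy.sum(1, keepdims=True), pxy.sum(0, keepdims=True)
    mask = pxy > 0
    return np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask]))

print(0.5 * np.log2(1 + P_pow / sigma**2))  # I(X;Y) = 0.5 bit
for delta in (2.0, 1.0, 0.5, 0.25):
    print(delta, mi_quantized(delta))       # increases toward 0.5 as bins shrink
```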

  15. (Outline repeated as a section divider: the lecture moves on to Gaussian Channel Capacity.)

  16. Additive White Gaussian Noise (AWGN) Channel.
      w → Encoder → x^N → ⊕ (noise z^N added) → y^N → Decoder → ŵ.
      1 Input/output alphabet X = Y = ℝ.
      2 AWGN channel: the conditional p.d.f. f_{Y|X} is given by Y = X + Z, Z ~ N(0, σ²), Z ⊥ X. {Z_k} form an i.i.d. (white) Gaussian random process with Z_k ~ N(0, σ²) for all k. Memoryless: Z_k ⊥ (W, X^{k−1}, Z^{k−1}). Without feedback: Z^N ⊥ X^N.
      3 Average input power constraint P: (1/N) ∑_{k=1}^{N} |x_k|² ≤ P.
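A minimal simulation of this channel model, as a sketch of my own (the capacity value printed at the end, ½ log₂(1 + P/σ²), is the standard AWGN result that the subsequent slides derive):

```python
import numpy as np

rng = np.random.default_rng(1)
N, P, sigma2 = 100_000, 4.0, 1.0

x = rng.normal(0, np.sqrt(P), N)          # one block of codeword symbols
x *= np.sqrt(P / np.mean(x**2))           # enforce (1/N) sum |x_k|^2 <= P exactly
z = rng.normal(0, np.sqrt(sigma2), N)     # i.i.d. (white) Gaussian noise, Z ⊥ X
y = x + z                                 # memoryless AWGN: Y_k = X_k + Z_k

print(np.mean(y**2))                      # ≈ P + sigma^2 = 5, since Z ⊥ X
print(0.5 * np.log2(1 + P / sigma2))      # ≈ 1.16 bits per channel use
```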
