Chapter 9 Gaussian Channel
Peng-Hua Wang
Graduate Inst. of Comm. Engineering National Taipei University
Chapter Outline
9.1 Gaussian Channel: Definitions
9.2 Converse to the Coding Theorem for Gaussian Channels
9.3 Band-Limited Channels
9.4 Parallel Gaussian Channels
Peng-Hua Wang, May 14, 2012 Information Theory, Chap. 9 - p. 2/31
■ Xi: input, Yi: output, Zi: noise, with Yi = Xi + Zi and Zi ~ N(0, N) independent of Xi.
■ Without further constraint, the capacity of this channel may be infinite.
◆ If the noise variance N is zero, the channel can transmit an arbitrary real number without error in a single use.
◆ If the noise variance N is nonzero, we can choose an infinite subset of inputs arbitrarily far apart, so that they are distinguishable at the output with arbitrarily small probability of error.
■ The most common limitation on the input is an energy or power constraint.
■ We assume an average power constraint. For any codeword (x1, x2, . . . , xn), we require

(1/n) Σ_{i=1}^n x_i² ≤ P.
■ The information capacity of the Gaussian channel with power constraint P is

C = max_{f(x): E[X²]≤P} I(X; Y).

■ Since Y = X + Z with Var(Y) ≤ P + N, and a random variable with variance σ² has differential entropy at most (1/2) log 2πeσ², we obtain

I(X; Y) = h(Y) − h(Z) ≤ (1/2) log 2πe(P + N) − (1/2) log 2πeN.
■ Therefore the information capacity is

C = max_{f(x): E[X²]≤P} I(X; Y) = (1/2) log(1 + P/N),

achieved when X ~ N(0, P).
■ Next, we will show that this capacity is achievable.
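As a quick numeric check (a minimal Python sketch, not part of the original slides), the capacity formula can be evaluated directly:

```python
import math

def awgn_capacity(P, N):
    """C = (1/2) log2(1 + P/N), in bits per channel use."""
    return 0.5 * math.log2(1 + P / N)

# With signal power P = 3 and noise power N = 1, the SNR is 3 and
# C = (1/2) log2(4) = 1 bit per transmission.
print(awgn_capacity(3, 1))  # → 1.0
```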
■ An (M, n) code for the Gaussian channel with power constraint P requires every codeword x^n(w), w = 1, 2, . . . , M, to satisfy

(1/n) Σ_{i=1}^n x_i²(w) ≤ P,
■ The conditional probability of error for message w is λ_w = Pr(g(Y^n) ≠ w | X^n = x^n(w)), where g is the decoding function and I(·) is the indicator function.
■ The maximal probability of error is

λ^(n) = max_{i∈{1,2,...,M}} λ_i.

■ The average probability of decoding error is

P_e^(n) = (1/M) Σ_{i=1}^M λ_i.
where A_n is the constant for calculating the volume of an n-dimensional sphere, i.e. the volume of a sphere of radius r is A_n r^n. For example, A_2 = π and A_3 = (4/3)π. Since A_n cancels in the ratio of the received-sphere volume to the decoding-sphere volume, the number of distinguishable codewords is at most

(n(P + N))^{n/2} / (nN)^{n/2} = (1 + P/N)^{n/2}.

Therefore, the capacity is at most (1/2) log(1 + P/N) bits per transmission.
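The volume-ratio count can be checked numerically; a small sketch (function name is mine, not from the slides) confirms that the sphere-packing estimate gives the same per-use rate for every n:

```python
import math

def sphere_packing_rate(P, N, n):
    """Rate implied by packing decoding spheres of radius sqrt(n*N)
    into the received sphere of radius sqrt(n*(P+N)); the sphere-volume
    constant A_n cancels in the ratio."""
    num_codewords = ((P + N) / N) ** (n / 2)
    return math.log2(num_codewords) / n

# The per-use rate equals (1/2) log2(1 + P/N) regardless of n.
print(sphere_packing_rate(3.0, 1.0, 100))  # → 1.0
```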
■ Codebook. Let X_i(w), i = 1, 2, . . . , n, w = 1, 2, . . . , 2^{nR}, be i.i.d. ~ N(0, P − ǫ). By the law of large numbers, (1/n) Σ_{i=1}^n X_i²(w) → P − ǫ, so the power constraint is satisfied with high probability.
■ Encoding. The codebook is revealed to both the sender and the receiver. To send message w, transmit the codeword X^n(w).
■ Decoding. The receiver searches for the one codeword that is jointly typical with the received vector. If it finds exactly one, say X^n(w), it declares Ŵ = w; otherwise it declares an error.
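A small Monte-Carlo sketch of this random-coding scheme, using minimum-distance decoding as a practical stand-in for joint-typicality decoding (all names and parameter values below are illustrative, not from the slides):

```python
import math
import random

def simulate_awgn(P=10.0, N=1.0, n=20, R=0.25, trials=50, seed=0):
    """Estimate the error rate of a random Gaussian codebook at rate R."""
    rng = random.Random(seed)
    M = int(round(2 ** (n * R)))  # 2^{nR} codewords
    # Codebook: i.i.d. N(0, P) entries (the proof uses variance P - eps).
    code = [[rng.gauss(0.0, math.sqrt(P)) for _ in range(n)] for _ in range(M)]
    errors = 0
    for _ in range(trials):
        w = rng.randrange(M)
        y = [x + rng.gauss(0.0, math.sqrt(N)) for x in code[w]]
        # Minimum-distance decoding, which is ML for Gaussian noise.
        w_hat = min(range(M),
                    key=lambda v: sum((a - b) ** 2 for a, b in zip(code[v], y)))
        errors += (w_hat != w)
    return errors / trials

# R = 0.25 is far below C = (1/2) log2(1 + 10) ≈ 1.73, so errors are rare.
print(simulate_awgn())
```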
■ Probability of error. Assume that codeword 1 was sent. Three kinds of error can occur:
◆ The power constraint is violated, (1/n) Σ_{j=1}^n X_j²(1) > P. ⇒ E_0 occurs.
◆ The transmitted codeword and the received sequence are not jointly typical. ⇒ E_1^c occurs.
◆ A wrong codeword is jointly typical with the received sequence. ⇒ E_i occurs for some i ≠ 1.
■ By the union bound, the probability of error averaged over codebooks is

P_e^(n) = P(E_0 ∪ E_1^c ∪ E_2 ∪ · · · ∪ E_{2^{nR}})
≤ P(E_0) + P(E_1^c) + Σ_{i=2}^{2^{nR}} P(E_i)
≤ ǫ + ǫ + (2^{nR} − 1) 2^{−n(I(X;Y)−3ǫ)}
≤ 3ǫ

for n sufficiently large and R < I(X; Y) − 3ǫ.
■ Since the average probability of error over codebooks is less than 3ǫ, there exists at least one codebook C∗ whose average probability of error is less than 3ǫ.
◆ C∗ can be found by an exhaustive search over all codes.
■ Deleting the worst half of the codewords in C∗, we obtain a code with maximal probability of error less than 6ǫ.
■ Converse. We must show that if P_e^(n) → 0, then R ≤ (1/2) log(1 + P/N). Let W be uniformly distributed over {1, 2, . . . , 2^{nR}}.
■ By Fano's inequality, H(W | Ŵ) ≤ 1 + nR P_e^(n) = nǫ_n, where ǫ_n → 0 as P_e^(n) → 0. Hence

nR = H(W) = I(W; Ŵ) + H(W | Ŵ)
≤ I(X^n; Y^n) + nǫ_n
≤ Σ_{i=1}^n (1/2) log(1 + P_i/N) + nǫ_n
≤ (n/2) log(1 + P/N) + nǫ_n,

where P_i is the average power of the ith component of the codewords, (1/n) Σ_i P_i ≤ P, and the last step uses the concavity of the logarithm (Jensen's inequality). Dividing by n and letting n → ∞ gives R ≤ (1/2) log(1 + P/N).
■ Suppose the output of a channel band-limited to W Hz can be represented by samples taken 1/(2W) seconds apart (the Nyquist rate).
■ The sampling frequency is 2W. If the channel is used over the time interval [0, T], there are 2WT samples.
■ If the noise has power spectral density N_0/2 watts/Hz, each noise sample has variance N_0/2, and a signal of power P watts spread over 2W samples per second has energy P/(2W) per sample.
■ The capacity per sample is

(1/2) log(1 + (P/(2W)) / (N_0/2)) = (1/2) log(1 + P/(N_0 W)) bits,

so with 2W samples per second the capacity is

C = W log(1 + P/(N_0 W)) bits per second.
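A numeric sketch of the band-limited capacity formula (Python; the channel parameters below are illustrative, not from the slides):

```python
import math

def bandlimited_capacity(P, N0, W):
    """C = W * log2(1 + P / (N0 * W)), in bits per second."""
    return W * math.log2(1 + P / (N0 * W))

# A telephone-like channel: W = 3000 Hz and SNR P/(N0*W) = 255
# gives C = 3000 * log2(256) = 24000 bits per second.
print(bandlimited_capacity(255.0 * 3000.0, 1.0, 3000.0))  # → 24000.0
```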
■ In this section we consider k independent Gaussian channels in parallel with a common power constraint:

Y_j = X_j + Z_j,  Z_j ~ N(0, N_j),  j = 1, 2, . . . , k,

where the noise components are independent of each other and of the inputs.
■ The capacity is

C = max_{f(x_1,...,x_k): Σ_i E[X_i²] ≤ P} I(X_1, X_2, . . . , X_k; Y_1, Y_2, . . . , Y_k),

achieved when the inputs are independent with X_i ~ N(0, P_i) and Σ_i P_i = P.
■ Maximizing Σ_i (1/2) log(1 + P_i/N_i) subject to Σ_i P_i = P by Lagrange multipliers gives

P_i + N_i = 1/(2λ) = ν, a constant,

or, with multipliers µ_i enforcing P_i ≥ 0,

P_i + N_i = 1/(2(λ − µ_i)).

■ By the Kuhn-Tucker conditions, the solution is the water-filling allocation

P_i = (ν − N_i)^+, where ν is chosen so that Σ_i P_i = Σ_i (ν − N_i)^+ = P.
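The water level ν can be found numerically by bisection, since Σ_i (ν − N_i)^+ is increasing in ν. A minimal sketch (function name is mine, not from the slides):

```python
def water_filling(noise, P, tol=1e-9):
    """Return powers P_i = max(nu - N_i, 0) with sum(P_i) = P,
    where the water level nu is found by bisection."""
    lo, hi = min(noise), max(noise) + P  # nu must lie in this interval
    while hi - lo > tol:
        nu = (lo + hi) / 2.0
        if sum(max(nu - n, 0.0) for n in noise) < P:
            lo = nu  # water level too low: allocate more power
        else:
            hi = nu
    nu = (lo + hi) / 2.0
    return [max(nu - n, 0.0) for n in noise]

# Noise levels 1, 2, 3 with total power 3: the water level is nu = 3,
# so the powers are 2, 1, 0 (the noisiest channel gets nothing).
print(water_filling([1.0, 2.0, 3.0], P=3.0))
```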