Channel Coding with Input Cost over DMC: Problem Setup
[Block diagram: message w → Channel Encoder → x^N → Noisy Channel → y^N → Channel Decoder → ŵ]
1 A (2^{NR}, N, B) channel code consists of
  - an encoding function (encoder) enc_N : [1 : 2^K] → X^N that maps each message w to a length-N codeword x^N, where K := ⌈NR⌉. Each codeword satisfies the input cost constraint
    (1/N) ∑_{i=1}^N b(x_i) ≤ B.
  - a decoding function (decoder) dec_N : Y^N → [1 : 2^K] ∪ {∗} that maps a channel output y^N to a reconstructed message ŵ or an error symbol ∗.
2 The error probability is defined as P_e^{(N)} := Pr{ Ŵ ≠ W }.
3 A rate R is said to be achievable with input cost B if there exists a sequence of (2^{NR}, N, B) codes such that P_e^{(N)} → 0 as N → ∞. The channel capacity with input cost B is defined as C(B) := sup{ R : R achievable }.
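The definitions above can be made concrete with a toy Monte Carlo sketch. The code below is an illustration, not part of the lecture: it assumes a binary symmetric channel BSC(p), a Hamming-weight cost b(x) = x with budget B = 0.5, and a hypothetical two-codeword code (so R = 1/N) decoded by minimum Hamming distance. It checks the input cost constraint (1/N) ∑ b(x_i) ≤ B and estimates P_e^{(N)} empirically.

```python
import random

def cost(x, b=lambda s: s):
    # Average input cost (1/N) * sum_i b(x_i); b(x) = x (Hamming weight) is an assumed example cost.
    return sum(b(s) for s in x) / len(x)

def bsc(x, p, rng):
    # BSC(p): flip each input bit independently with probability p.
    return [s ^ (rng.random() < p) for s in x]

def hamming(a, b):
    return sum(u != v for u, v in zip(a, b))

def estimate_pe(codebook, p, trials=20000, seed=0):
    # Monte Carlo estimate of P_e^{(N)} = Pr{ Ŵ ≠ W } under a uniform message W
    # and a minimum-distance decoder (a simple stand-in for dec_N).
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        w = rng.randrange(len(codebook))           # draw message W uniformly
        y = bsc(codebook[w], p, rng)               # channel output y^N
        w_hat = min(range(len(codebook)),          # min-distance decoding
                    key=lambda m: hamming(codebook[m], y))
        errors += (w_hat != w)
    return errors / trials

N, B = 10, 0.5
codebook = [[0] * N, [1] * 5 + [0] * 5]            # hypothetical code, R = 1/N
assert all(cost(c) <= B for c in codebook)         # input cost constraint holds
```

With p = 0.1 the two codewords differ in 5 positions, so a decoding error needs at least 3 flips among them; the estimate comes out well below 1%, illustrating how P_e^{(N)} can be driven small while respecting the cost budget.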
19 / 34 I-Hsiang Wang NIT Lecture 5