Part V. AWGN Channel Capacity: AWGN Capacity Formula; Sphere Packing; Resources in AWGN Channel (PowerPoint PPT Presentation)



SLIDE 1


Part V. AWGN Channel Capacity

AWGN Capacity Formula; Sphere Packing; Resources in AWGN Channel

SLIDE 2


Channel coding theorem

  • For every memoryless channel, there is a definite (computable) number C such that:
  • If the data rate R < C, then there exists a coding scheme that can deliver data at rate R over the channel with vanishing error probability as the block length n → ∞
  • Conversely, if the data rate R > C, then no matter what coding scheme is used, the error probability will converge to 1 as n → ∞
  • C is called the capacity of the channel, and it has a computable formula for all kinds of channels (depending on the channel statistics)
  • We focus on the additive white Gaussian noise (AWGN) channel in this course, and give a heuristic argument to derive the AWGN channel capacity

SLIDE 3

AWGN channel model (discrete-time, real-valued)


  • Channel: Vₘ = uₘ + Zₘ, m = 1, …, n, with i.i.d. noise Zₘ ∼ N(0, σ²)
  • Codebook: C comprises 2^{nR} length-n codewords u ∈ Rⁿ
  • Power constraint: (1/n) Σ_{m=1}^{n} |uₘ|² = (1/n)‖u‖² ≤ P for all u ∈ C (unit: joules per channel use)

Code design problem is equivalent to placing 2^{nR} n-dimensional vectors within a sphere of radius √(nP), so that the error probability is minimized
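As a concrete illustration of this discrete-time model, the sketch below (plain Python, with hypothetical parameter values for n, P, and σ²) draws one codeword scaled to meet the power constraint and passes it through the AWGN channel:

```python
import math
import random

random.seed(0)

n = 1000        # block length (hypothetical value)
P = 1.0         # power constraint, joules per channel use
sigma2 = 0.25   # noise variance sigma^2

# A codeword u in R^n, scaled so that (1/n)||u||^2 = P exactly
u = [random.gauss(0.0, 1.0) for _ in range(n)]
norm = math.sqrt(sum(x * x for x in u))
u = [x * math.sqrt(n * P) / norm for x in u]

assert sum(x * x for x in u) / n <= P + 1e-9  # power constraint holds

# AWGN channel: V_m = u_m + Z_m, with Z_m ~ N(0, sigma^2) i.i.d.
Z = [random.gauss(0.0, math.sqrt(sigma2)) for _ in range(n)]
V = [um + zm for um, zm in zip(u, Z)]
```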

SLIDE 4

Sphere packing interpretation


[Figure: in Rⁿ, received vector V = u + Z; a small noise sphere of radius √(nσ²) around the codeword u, inside the large sphere of radius √(n(P + σ²))]

  • By LLN, as n → ∞, the received V will lie at the surface of the n-dimensional sphere centered at u with radius √(nσ²) with probability 1
  • Also by LLN, as n → ∞, the received V will lie within the n-dimensional sphere with radius √(n(P + σ²)) with probability 1
  • Asymptotically, vanishing error probability is equivalent to non-overlapping spheres
  • How many non-overlapping spheres can be packed into the large sphere?
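The LLN concentration behind both bullets can be checked numerically: for a large block length, ‖Z‖²/n sits very close to σ² and ‖V‖²/n very close to P + σ². A minimal sketch (hypothetical values of P and σ²):

```python
import math
import random

random.seed(1)

n = 200_000          # large block length so the LLN effect is visible
P, sigma2 = 1.0, 0.25

# Codeword on the power sphere, noise, and received vector V = u + Z
u = [math.sqrt(P)] * n                  # (1/n)||u||^2 = P
Z = [random.gauss(0.0, math.sqrt(sigma2)) for _ in range(n)]
V = [um + zm for um, zm in zip(u, Z)]

# ||V - u||^2 / n = ||Z||^2 / n concentrates around sigma^2
noise_radius2 = sum(z * z for z in Z) / n
# ||V||^2 / n concentrates around P + sigma^2 (the big-sphere radius^2 / n)
big_radius2 = sum(v * v for v in V) / n

assert abs(noise_radius2 - sigma2) < 0.01
assert abs(big_radius2 - (P + sigma2)) < 0.02
```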

SLIDE 5

Necessary condition: capacity upper bound

  • maximum # of non-overlapping spheres = maximum # of codewords that can be reliably delivered
  • A necessary condition is

      2^{nR} ≤ (√(n(P + σ²)))ⁿ / (√(nσ²))ⁿ  ⟺  R ≤ (1/n) log [ (n(P + σ²))^{n/2} / (nσ²)^{n/2} ] = (1/2) log(1 + P/σ²)

  • The channel capacity is hence upper bounded by (1/2) log(1 + P/σ²)
  • How to achieve it?
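The resulting sphere-packing bound is a one-liner to evaluate. A small sketch (function name is ours, not from the slides):

```python
import math

def awgn_capacity_upper_bound(P, sigma2):
    """Sphere-packing upper bound (1/2) log2(1 + P/sigma^2), bits per channel use."""
    return 0.5 * math.log2(1.0 + P / sigma2)

# At 0 dB SNR (P = sigma^2) the bound is half a bit per channel use
assert abs(awgn_capacity_upper_bound(1.0, 1.0) - 0.5) < 1e-12
# At P/sigma^2 = 3 it is exactly one bit per channel use
assert abs(awgn_capacity_upper_bound(3.0, 1.0) - 1.0) < 1e-12
```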

SLIDE 6

Achieving capacity (1)

[Figure: codewords u₁, u₂ of the random codebook C inside the “u-sphere” of radius √(nP)]

  • Prove the existence of a good codebook by random coding, as we did before for linear block codes:
  • Randomly generate 2^{nR} length-n codewords uniformly inside the “u-sphere” of radius √(nP)
  • Goal: ensure the average-over-random-code average probability of error vanishes as n → ∞
  • Decoding, with the MMSE coefficient α ≜ P/(P + σ²):

      V → [MMSE scaling] → αV → [Nearest Neighbor] → û
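The decoder chain above can be sketched end to end. This toy version uses a tiny block length and, for simplicity, samples codewords on the surface of the u-sphere rather than uniformly inside it (an assumption of ours); correct decoding is only guaranteed w.h.p. for large n:

```python
import math
import random

random.seed(2)

n, P, sigma2 = 16, 1.0, 0.25
R = 0.25                      # rate below (1/2)log2(1 + P/sigma^2) ~ 1.16
M = 2 ** int(n * R)           # 2^{nR} = 16 codewords

def random_codeword():
    # Gaussian direction scaled onto the u-sphere of radius sqrt(nP)
    g = [random.gauss(0.0, 1.0) for _ in range(n)]
    s = math.sqrt(n * P) / math.sqrt(sum(x * x for x in g))
    return [x * s for x in g]

codebook = [random_codeword() for _ in range(M)]

# Transmit codeword 0 through the AWGN channel
u = codebook[0]
V = [um + random.gauss(0.0, math.sqrt(sigma2)) for um in u]

# Decode: MMSE scaling by alpha = P/(P + sigma^2), then nearest neighbor
alpha = P / (P + sigma2)
aV = [alpha * v for v in V]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

m_hat = min(range(M), key=lambda m: dist2(aV, codebook[m]))
```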

SLIDE 7

Achieving capacity (2)

  • Due to symmetry, we can assume WLOG the true codeword sent by Tx is u₁
  • By LLN, the distance between αV and u₁:

      ‖αV − u₁‖² = ‖αZ + (α − 1)u₁‖² ≈ α²nσ² + (α − 1)²nP = n P σ²/(P + σ²)

  • As long as αV lies inside the sphere centered at u₁ with radius √(n P σ²/(P + σ²)), decoding will be correct w.h.p.
  • Pairwise probability of error P{E_{u₁→u₂}}:
  • The probability that a random u₂ falls inside the sphere!
  • Ratio of the volume of the two spheres.

[Figure: u-sphere of radius √(nP); αV near u₁ inside a small sphere of radius √(n P σ²/(P + σ²)); competing codeword u₂]
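The algebraic step α²σ² + (α − 1)²P = Pσ²/(P + σ²) for α = P/(P + σ²) is easy to sanity-check numerically over a few (P, σ²) pairs (values chosen by us for illustration):

```python
# Verify alpha^2*sigma^2 + (alpha - 1)^2*P = P*sigma^2/(P + sigma^2)
# for the MMSE coefficient alpha = P/(P + sigma^2).
for P, sigma2 in [(1.0, 0.25), (2.0, 1.0), (10.0, 0.5)]:
    alpha = P / (P + sigma2)
    lhs = alpha ** 2 * sigma2 + (alpha - 1.0) ** 2 * P
    rhs = P * sigma2 / (P + sigma2)
    assert abs(lhs - rhs) < 1e-12
```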

SLIDE 8

Achieving capacity (3)

  • Pairwise probability of error = ratio of the volume of the two spheres:

      P{E_{u₁→u₂}} = (√(n P σ²/(P + σ²)))ⁿ / (√(nP))ⁿ = (σ²/(P + σ²))^{n/2}

  • Union bound:

      Pₑ⁽ⁿ⁾ ≤ (2^{nR} − 1) P{E_{u₁→u₂}} ≤ 2^{nR} (σ²/(P + σ²))^{n/2} = 2^{n(R − (1/2) log(1 + P/σ²))}

  • Sufficient condition for vanishing Pₑ⁽ⁿ⁾:

      R < (1/2) log(1 + P/σ²)
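The exponent in the union bound makes the threshold behavior explicit: below capacity the bound decays geometrically in n, above capacity it is vacuous. A sketch (function name and parameter values are ours):

```python
import math

def union_bound(n, R, P, sigma2):
    """2^{n(R - (1/2)log2(1 + P/sigma^2))}: bound on the average error probability."""
    C = 0.5 * math.log2(1.0 + P / sigma2)
    return 2.0 ** (n * (R - C))

P, sigma2 = 1.0, 0.25   # C = (1/2)log2(5) ~ 1.16 bits per channel use

# Below capacity (R = 1.0 < C): the bound shrinks as n grows
assert union_bound(200, 1.0, P, sigma2) < union_bound(100, 1.0, P, sigma2) < 1.0
# Above capacity (R = 1.3 > C): this bound exceeds 1 and says nothing
assert union_bound(100, 1.3, P, sigma2) > 1.0
```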

SLIDE 9

Continuous-time AWGN channel capacity

  • For the continuous-time (waveform) channel model considered in Lecture 03:
  • Power constraint P watts
  • White Gaussian noise PSD N₀/2 joules per second per hertz
  • Total bandwidth W hertz (symbol duration T = 1/W)
  • Recall we can convert the waveform channel to an equivalent discrete-time complex-valued AWGN channel:
  • Power constraint is PT = P/W joules per channel use
  • Variance of the circular symmetric complex Gaussian noise is N₀ joules per channel use
  • For each real dimension, its capacity is (1/2) log(1 + (P/2W)/(N₀/2)) bits per channel use
  • The channel capacity is 2 × (1/2) log(1 + (P/2W)/(N₀/2)) = log(1 + P/(W N₀)) bits per channel use, i.e.

      C = W log(1 + P/(W N₀)) bits per second
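Plugging numbers into the bits-per-second formula: with hypothetical values P = 1 μW, N₀ = 10⁻¹² W/Hz, and W = 1 MHz, the SNR is exactly 1 and the capacity is 1 Mbps:

```python
import math

def awgn_capacity_bps(P, W, N0):
    """C = W log2(1 + P/(N0 W)), in bits per second."""
    return W * math.log2(1.0 + P / (N0 * W))

# Hypothetical numbers: SNR = P/(N0 W) = 1, so C = W * log2(2) = 1 Mbps
assert abs(awgn_capacity_bps(1e-6, 1e6, 1e-12) - 1e6) < 1.0
```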

SLIDE 10

AWGN channel capacity

  • The capacity formula provides a high-level way of thinking about how the performance fundamentally depends on the basic resources in the channel
  • No need to go into details of specific coding and modulation schemes
  • Basic resources: power P and bandwidth W

      C_AWGN(P, W) = W log(1 + P/(N₀W))

      C_AWGN(P, W)/W = log(1 + SNR)  (“spectral efficiency”),  where SNR ≜ P/(N₀W)

SLIDE 11

Resources in AWGN channel

[Figure: capacity C(W) in Mbps vs. bandwidth W in MHz, for fixed P; bandwidth-limited region at small W, power-limited region at large W; C(W) approaches the limit (P/N₀) log₂ e as W → ∞]

      C(W) = W log(1 + P/(N₀W)) ≈ W · (P/(N₀W)) log₂ e = (P/N₀) log₂ e   (fix P, W → ∞)
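The saturation of C(W) at fixed P is easy to see numerically: capacity is monotone increasing in W but converges to (P/N₀) log₂ e. A sketch with normalized values P = N₀ = 1 (our choice):

```python
import math

def C(W, P, N0):
    return W * math.log2(1.0 + P / (N0 * W))

P, N0 = 1.0, 1.0
limit = (P / N0) * math.log2(math.e)     # ~1.4427 bits/s

# C(W) increases with W but saturates at (P/N0) log2 e
caps = [C(W, P, N0) for W in (1, 10, 100, 1000)]
assert all(a < b for a, b in zip(caps, caps[1:]))   # monotone increasing
assert caps[-1] < limit                              # always below the limit
assert limit - caps[-1] < 0.001                      # and already very close
```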

SLIDE 12

Bandwidth-limited vs. power-limited

  • When SNR ≪ 1: power-limited regime
  • Linear in power; insensitive to bandwidth

      C_AWGN(P, W) = W log(1 + P/(N₀W)) ≈ W · (P/(N₀W)) log₂ e = (P/N₀) log₂ e

  • When SNR ≫ 1: bandwidth-limited regime
  • Logarithmic in power; approximately linear in bandwidth

      C_AWGN(P, W) ≈ W log(P/(N₀W))
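Both regime approximations hold to within a percent at moderately extreme SNRs. A small check, with N₀ normalized to 1 and SNR values chosen by us:

```python
import math

def C_exact(P, W, N0):
    return W * math.log2(1.0 + P / (N0 * W))

N0 = 1.0

# Power-limited regime (SNR << 1): C ~ (P/N0) log2 e, insensitive to W
P, W = 0.01, 1.0                          # SNR = 0.01
approx = (P / N0) * math.log2(math.e)
assert abs(C_exact(P, W, N0) - approx) / approx < 0.01

# Bandwidth-limited regime (SNR >> 1): C ~ W log2(P/(N0 W))
P, W = 1000.0, 1.0                        # SNR = 1000
approx = W * math.log2(P / (N0 * W))
assert abs(C_exact(P, W, N0) - approx) / approx < 0.01
```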