Lecture 4: Capacity of Wireless Channels

  1. Lecture 4: Capacity of Wireless Channels. I-Hsiang Wang, ihwang@ntu.edu.tw, 3/20, 2014

  2. What we have learned
  • So far: looked at specific schemes and techniques
  • Lecture 2: point-to-point wireless channel
    - Diversity: combat fading by exploiting inherent diversity
    - Coding: combat noise, and further exploit degrees of freedom
  • Lecture 3: cellular system
    - Multiple access: TDMA, CDMA, OFDMA
    - Interference management: orthogonalization (partial frequency reuse), treat-interference-as-noise (interference averaging)

  3. Information Theory
  • Is there a framework to ...
    - Compare all schemes and techniques fairly?
    - Assert the fundamental limit on how much rate can be reliably delivered over a wireless channel?
  • Information theory!
    - Provides a fundamental limit to (coded) performance
    - Identifies the impact of channel resources on performance
    - Suggests novel techniques to communicate over wireless channels
  • Information theory provides the basis for the modern development of wireless communication

  4. Historical Perspective
  • G. Marconi, 1901: first radio built 100+ years ago. Great stride in technology, but design was somewhat ad hoc.
  • C. Shannon, 1948: information theory shows that every channel has a capacity, and provides a systematic view of all communication problems.
  • Engineering meets science; new points of view arise.

  5. Modern View on Multipath Fading
  (Figure: channel quality fluctuating over time)
  • Classical view: fading channels are unreliable
    - Diversity techniques: average out the variation
  • Modern view: exploit fading to gain spectral efficiency
    - Thanks to the study of fading channels through the lens of information theory!

  6. Plot
  • Use a heuristic (geometric) argument to introduce the capacity of the AWGN channel
  • Discuss the two key resources in the AWGN channel:
    - Power
    - Bandwidth
  • The AWGN channel capacity serves as a building block towards fading channel capacity:
    - Slow fading channel: outage capacity
    - Fast fading channel: ergodic capacity

  7. Outline
  • AWGN Channel Capacity
  • Resources of the AWGN Channel
  • Capacity of some LTI Gaussian Channels
  • Capacity of Fading Channels

  8. AWGN Channel Capacity

  9. Channel Capacity
  • Capacity := the highest data rate that can be delivered reliably over a channel
    - Reliably ≡ vanishing error probability
  • Before Shannon, it was widely believed that to communicate with error probability → 0, the data rate must also → 0
  • Repetition coding (with M-level PAM) over N time slots on the AWGN channel:
    - Error probability ∼ 2 Q( √(6N·SNR) / M )
    - Data rate = log₂(M) / N
    - As long as M ≤ N^(1/3), the error probability → 0 as N → ∞
    - But the data rate ≤ log₂(N^(1/3))/N = log₂(N)/(3N) still → 0 as N → ∞
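
The trade-off on slide 9 can be checked numerically. A minimal sketch, assuming the slide's heuristic error expression 2Q(√(6N·SNR)/M) and fixing SNR = 1 (0 dB) as an illustrative choice:

```python
import math

def repetition_rate_and_reliability(N):
    """Repetition coding with M-level PAM over N slots, M = round(N**(1/3)).

    Returns (data rate log2(M)/N, Q-function argument sqrt(6*N*SNR)/M).
    Error probability ~ 2Q(argument), so a growing argument means
    vanishing error probability; both follow the slide's heuristic.
    """
    SNR = 1.0  # illustrative: 0 dB
    M = max(2, int(round(N ** (1 / 3))))
    rate = math.log2(M) / N
    q_arg = math.sqrt(6 * N * SNR) / M
    return rate, q_arg

# Reliability improves (Q argument grows) but the rate collapses to zero.
for N in (10, 100, 1000, 10000):
    r, q = repetition_rate_and_reliability(N)
    print(N, round(r, 5), round(q, 2))
```

This makes the slide's point concrete: repetition coding is reliable but zero-rate in the limit.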

  10. Channel Coding Theorem
  • For every memoryless channel, there is a definite, computable number C such that:
    - If the data rate R < C, then there exists a coding scheme that can deliver rate-R data over the channel with error probability → 0 as the coding block length N → ∞
    - Conversely, if the data rate R > C, then no matter what coding scheme is used, the error probability → 1 as N → ∞
  • We shall focus on the additive white Gaussian noise (AWGN) channel
    - Give a heuristic argument to derive the AWGN channel capacity

  11. AWGN Channel
  • Channel model: y[n] = x[n] + z[n], with noise z[n] ∼ N(0, σ²) and power constraint Σ_{n=1}^{N} |x[n]|² ≤ NP
  • We consider the real-valued Gaussian channel
  • As mentioned earlier, repetition coding yields zero rate if the error probability is required to vanish as N → ∞
  • This is because all codewords are spread along a single dimension of an N-dimensional space
  • How to do better?

  12. Sphere Packing Interpretation
  • Received vector: y = x + z ∈ R^N
  • By the law of large numbers, as N → ∞, most y will lie inside the N-dimensional sphere of radius √(N(P + σ²))
  • Also by the LLN, as N → ∞, y will lie near the surface of the N-dimensional sphere centered at x with radius √(Nσ²)
  • Vanishing error probability ⟹ non-overlapping spheres
  • How many non-overlapping spheres can be packed into the large sphere?

  13. Why Repetition Coding is Bad
  • In the space of y = x + z ∈ R^N (the sphere of radius √(N(P + σ²))), repetition coding uses only one dimension out of N!

  14. Capacity Upper Bound
  • Maximum # of non-overlapping spheres = maximum # of codewords that can be reliably delivered
  • Packing noise spheres of radius √(Nσ²) into the y-sphere of radius √(N(P + σ²)):
    2^{NR} ≤ ( √(N(P + σ²)) / √(Nσ²) )^N
    ⇒ R ≤ (1/N) log ( √(N(P + σ²)) / √(Nσ²) )^N = (1/2) log(1 + P/σ²)
  • This is hence an upper bound on the capacity C. How to achieve it?
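
The counting bound above is easy to verify numerically: the √N factors cancel in the volume ratio, so the bound equals (1/2) log₂(1 + P/σ²) for every N. A small sketch (P and σ² are arbitrary example values):

```python
import math

def sphere_packing_bound(P, sigma2, N):
    """Rate bound from the volume ratio of the big y-sphere (radius
    sqrt(N*(P+sigma2))) to a noise sphere (radius sqrt(N*sigma2)):
    R <= (1/N) * log2( (big_radius / small_radius)**N ).
    """
    big = math.sqrt(N * (P + sigma2))
    small = math.sqrt(N * sigma2)
    return (1 / N) * math.log2((big / small) ** N)

P, s2 = 4.0, 1.0
closed_form = 0.5 * math.log2(1 + P / s2)
# The sqrt(N) factors cancel, so the bound is identical for every N.
print(sphere_packing_bound(P, s2, 10), closed_form)
```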

  15. Achieving Capacity (1/3)
  • (Random) encoding: randomly generate 2^{NR} codewords {x₁, x₂, ...} lying inside the "x-sphere" of radius √(NP)
  • Decoding: y → (MMSE scaling) → αy → (nearest neighbor) → x̂, where α := P/(P + σ²)
  • Performance analysis: WLOG let x₁ be sent
    - By the LLN, ||αy − x₁||² = ||αz + (α − 1)x₁||² ≈ α²Nσ² + (α − 1)²NP = N·Pσ²/(P + σ²)
    - As long as x₁ lies inside the uncertainty sphere centered at αy with radius √(N·Pσ²/(P + σ²)), decoding will be correct
    - Pairwise error probability (see next slide) = (σ²/(P + σ²))^{N/2}
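
The LLN approximation for the MMSE residual can be sanity-checked by Monte Carlo. A sketch with illustrative parameters, drawing codeword entries i.i.d. N(0, P) as in the random-coding argument:

```python
import math
import random

def mmse_residual_energy(P, sigma2, N, trials=200):
    """Monte Carlo estimate of ||alpha*y - x||^2, which should
    concentrate around N * P * sigma2 / (P + sigma2) for
    alpha = P / (P + sigma2)."""
    alpha = P / (P + sigma2)
    total = 0.0
    for _ in range(trials):
        x = [random.gauss(0, math.sqrt(P)) for _ in range(N)]
        y = [xi + random.gauss(0, math.sqrt(sigma2)) for xi in x]
        total += sum((alpha * yi - xi) ** 2 for xi, yi in zip(x, y))
    return total / trials

P, s2, N = 4.0, 1.0, 2000
predicted = N * P * s2 / (P + s2)
measured = mmse_residual_energy(P, s2, N)
print(measured / predicted)  # close to 1
```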

  16. Achieving Capacity (2/3)
  • When does an error occur?
    - Ans: when another codeword x₂ falls inside the uncertainty sphere of αy
  • What is that probability (pairwise error probability)?
    - Ans: the ratio of the volume of the uncertainty sphere (radius √(N·Pσ²/(P + σ²))) to that of the x-sphere (radius √(NP)):
      Pr{x₁ → x₂} = ( √(N·Pσ²/(P + σ²)) / √(NP) )^N = (σ²/(P + σ²))^{N/2}
  • Union bound: total error probability ≤ 2^{NR} (σ²/(P + σ²))^{N/2}

  17. Achieving Capacity (3/3)
  • Total error probability (by the union bound):
    Pr{E} ≤ 2^{NR} (σ²/(P + σ²))^{N/2} = 2^{N(R − (1/2) log(1 + P/σ²))}
  • As long as R < (1/2) log(1 + P/σ²), Pr{E} → 0 as N → ∞
  • Hence, the capacity is indeed C = (1/2) log(1 + P/σ²) bits per symbol time
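
The union-bound exponent can be evaluated directly; a sketch showing the bound collapsing for any rate below (1/2) log₂(1 + P/σ²) (P and σ² are example values):

```python
import math

def union_bound_error(R, P, sigma2, N):
    """Union-bound estimate 2^(N*R) * (sigma2/(P+sigma2))^(N/2),
    which equals 2^(N*(R - C)) with C = 0.5*log2(1 + P/sigma2)."""
    C = 0.5 * math.log2(1 + P / sigma2)
    return 2.0 ** (N * (R - C))

P, s2 = 4.0, 1.0
C = 0.5 * math.log2(1 + P / s2)  # about 1.16 bits per symbol time
# Operate 10% below capacity: the bound decays exponentially in N.
for N in (10, 100, 1000):
    print(union_bound_error(0.9 * C, P, s2, N))
```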

  18. Resources of the AWGN Channel

  19. Continuous-Time AWGN Channel
  • System parameters:
    - Power constraint: P watts; bandwidth: W Hz
    - Spectral density of the white Gaussian noise: N₀/2
  • Equivalent discrete-time baseband channel (complex): y[n] = x[n] + z[n], with z[n] ∼ CN(0, N₀W) and power constraint Σ_{n=1}^{N} |x[n]|² ≤ NP
    - 1 complex symbol = 2 real symbols
  • Capacity:
    C_AWGN(P, W) = 2 × (1/2) log(1 + (P/2)/(N₀W/2)) = log(1 + SNR) bits per complex symbol, where SNR := P/(N₀W) is the SNR per complex symbol
    = W log(1 + P/(N₀W)) bits/s = log(1 + SNR) bits/s/Hz
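
A quick sketch of the bits-per-second formula; the power, bandwidth, and noise-density numbers below are hypothetical:

```python
import math

def awgn_capacity_bps(P, W, N0):
    """C = W * log2(1 + P / (N0 * W)) bits per second."""
    snr = P / (N0 * W)
    return W * math.log2(1 + snr)

# Hypothetical link: 1 mW over 1 MHz with N0 = 1e-12 W/Hz -> SNR = 1000.
P, W, N0 = 1e-3, 1e6, 1e-12
print(awgn_capacity_bps(P, W, N0) / 1e6, "Mbps")  # ~9.97 Mbps
```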

  20. Complex AWGN Channel Capacity
    C_AWGN(P, W) = W log(1 + P/(N₀W)) bits/s = log(1 + SNR) bits/s/Hz (spectral efficiency)
  • The capacity formula provides a high-level way of thinking about how the performance fundamentally depends on the basic resources available in the channel
  • No need to go into details of specific coding and modulation schemes
  • Basic resources: power P and bandwidth W

  21. Power
  (Figure: log(1 + SNR) vs. SNR = P/(N₀W), for fixed W)
  • High SNR: C = log(1 + SNR) ≈ log SNR
    - Logarithmic growth with power
  • Low SNR: C = log(1 + SNR) ≈ SNR · log₂e
    - Linear growth with power
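
The two approximations can be compared numerically; a minimal sketch (the SNR values are arbitrary examples of the two regimes):

```python
import math

def capacity(snr):
    """Exact spectral efficiency, bits/s/Hz."""
    return math.log2(1 + snr)

def high_snr_approx(snr):
    """log2(SNR): logarithmic growth with power."""
    return math.log2(snr)

def low_snr_approx(snr):
    """SNR * log2(e): linear growth with power."""
    return snr * math.log2(math.e)

print(capacity(1000), high_snr_approx(1000))  # nearly equal at high SNR
print(capacity(0.01), low_snr_approx(0.01))   # nearly equal at low SNR
```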

  22. Bandwidth
  • Fix P: C(W) = W log(1 + P/(N₀W)) ≈ W · (P/(N₀W)) log₂e = (P/N₀) log₂e for large W
  (Figure: capacity C(W) in Mbps vs. bandwidth W in MHz, rising from the bandwidth-limited region through the power-limited region toward the limit (P/N₀) log₂e as W → ∞)

  23. Bandwidth-limited vs. Power-limited
    C_AWGN(P, W) = W log(1 + P/(N₀W)) bits/s, SNR = P/(N₀W)
  • When SNR ≪ 1 (power-limited regime):
    C_AWGN(P, W) ≈ W · (P/(N₀W)) log₂e = (P/N₀) log₂e
    - Linear in power; insensitive to bandwidth
  • When SNR ≫ 1 (bandwidth-limited regime):
    C_AWGN(P, W) ≈ W log(P/(N₀W))
    - Logarithmic in power; approximately linear in bandwidth
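
The wideband limit (P/N₀) log₂e can be seen numerically; a sketch with illustrative P and N₀:

```python
import math

def capacity_vs_bandwidth(P, N0, W):
    """C(W) = W * log2(1 + P/(N0*W)) bits/s for fixed power P."""
    return W * math.log2(1 + P / (N0 * W))

P, N0 = 1.0, 1e-6              # illustrative values
limit = (P / N0) * math.log2(math.e)  # wideband limit (P/N0)*log2(e)
# Capacity approaches the limit from below as bandwidth grows.
for W in (1e5, 1e6, 1e7, 1e8):
    print(capacity_vs_bandwidth(P, N0, W) / limit)
```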

  24. Capacity of Some LTI Gaussian Channels

  25. SIMO Channel
  • Channel: y = h x + w ∈ C^L, with noise w ∼ CN(0, σ²I_L) and power constraint P
  • MRC (project onto h): ỹ = (h*/||h||) y = ||h|| x + w̃, where w̃ ∼ CN(0, σ²)
  • MRC is a lossless operation: we can generate y from ỹ: y = ỹ (h/||h||)
  • Hence the SIMO channel capacity equals the capacity of the equivalent AWGN channel:
    C_SIMO = log(1 + ||h||² P / σ²)
    - Power gain ||h||² due to Rx beamforming
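
A small sketch of the SIMO capacity formula; the channel vector h below is a made-up example:

```python
import math

def simo_capacity(h, P, sigma2):
    """C = log2(1 + ||h||^2 * P / sigma2): receive beamforming (MRC)
    collects the full power gain ||h||^2 across the L antennas."""
    gain = sum(abs(hi) ** 2 for hi in h)
    return math.log2(1 + gain * P / sigma2)

# L = 4 receive antennas with a hypothetical fixed channel vector.
h = [1 + 0j, 0.5 + 0.5j, -0.3 + 0.1j, 0.2 - 0.4j]
print(simo_capacity(h, P=1.0, sigma2=0.1))
```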

  26. MISO Channel
  • Channel: y = h* x + w ∈ C, with x, h ∈ C^L and power constraint Σ_{n=1}^{N} ||x[n]||² ≤ NP
  • Goal: maximize the received signal power |h* x|²
    - The answer is ||h||² P! (Check; hint: Cauchy-Schwarz inequality)
  • Achieved by Tx beamforming: x = x̃ h/||h||, so that y = ||h|| x̃ + w
    - Send a scalar symbol x̃ along the direction of h
    - Power constraint on x̃: still P
  • Capacity: C_MISO = log(1 + ||h||² P / σ²)
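
The Cauchy-Schwarz claim that no transmit direction beats h/||h|| can be spot-checked numerically (h and P below are illustrative):

```python
import math
import random

def received_power(h, x_vec):
    """|h^* x|^2 for a transmit vector x_vec."""
    inner = sum(hi.conjugate() * xi for hi, xi in zip(h, x_vec))
    return abs(inner) ** 2

h = [1 + 1j, 0.5 - 0.2j, -0.7 + 0.3j]  # hypothetical channel
P = 2.0
hn = math.sqrt(sum(abs(hi) ** 2 for hi in h))

# Beamforming: put all power sqrt(P) along the direction h/||h||.
bf = [math.sqrt(P) * hi / hn for hi in h]
best = received_power(h, bf)  # equals ||h||^2 * P
print(best, hn ** 2 * P)

# Any other unit-power direction does no better (Cauchy-Schwarz).
random.seed(0)
for _ in range(5):
    v = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in h]
    vn = math.sqrt(sum(abs(vi) ** 2 for vi in v))
    x = [math.sqrt(P) * vi / vn for vi in v]
    assert received_power(h, x) <= best + 1e-9
```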
