

1. Capacity and Coding for Multi-Antenna Broadcast Channels
   Wei Yu, Electrical Engineering Department, Stanford University
   February 20, 2002

2. Introduction
   • Consider a communication situation involving multiple transmitters and receivers:
     [Figure: a multiuser channel with inputs $x_1, \ldots, x_n$, noises $z_1, \ldots, z_m$, and outputs $y_1, \ldots, y_m$]
   – What is the value of cooperation?

3. Motivation: Multiuser DSL Environment
   • The DSL environment is interference-limited.
     [Figure: a central office serving users 1 through $n$, with FEXT and NEXT crosstalk coupling the upstream and downstream lines]
   – Explore the benefit of cooperation.

4. Gaussian Vector Channel
   • Capacity: $C = \max I(X; Y)$.
     [Figure: $W \in 2^{nC} \to X^n \to H \to {+}\,Z^n \to Y^n \to \hat{W}(Y^n)$]
   • Optimum spectrum:
     $$\begin{aligned} \text{maximize} \quad & \tfrac{1}{2} \log \frac{|H K_{xx} H^T + K_{zz}|}{|K_{zz}|} \\ \text{subject to} \quad & \mathrm{tr}(K_{xx}) \le P, \quad K_{xx} \succeq 0. \end{aligned}$$
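
The maximizing $K_{xx}$ is the classic water-filling spectrum. As a minimal numeric sketch (the channel matrix and power budget below are made up for illustration, not from the talk), one can whiten the noise, diagonalize the channel by SVD, and bisect on the water level:

```python
import numpy as np

def waterfill_capacity(H, Kzz, P):
    """Capacity of y = Hx + z, z ~ N(0, Kzz), under trace(Kxx) <= P."""
    # Whiten the noise: the equivalent channel gains are the
    # eigenvalues of H^T Kzz^{-1} H.
    W = np.linalg.cholesky(np.linalg.inv(Kzz))
    g = np.linalg.svd(W.T @ H, compute_uv=False) ** 2
    lo, hi = 0.0, P + 1.0 / g.min()
    for _ in range(100):                      # bisect on the water level mu
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - 1.0 / g, 0.0).sum() < P:
            lo = mu
        else:
            hi = mu
    p = np.maximum(mu - 1.0 / g, 0.0)         # per-mode power allocation
    return 0.5 * np.sum(np.log(1.0 + g * p))  # capacity in nats

# Hypothetical 2x2 channel with white noise and power budget P = 10.
H = np.array([[1.0, 0.3], [0.2, 0.8]])
print(waterfill_capacity(H, np.eye(2), 10.0))
```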

5. Gaussian Vector Broadcast Channel
   • Capacity region: $\{(R_1, \cdots, R_K) : \Pr(W_k \neq \hat{W}_k) \to 0,\ k = 1, \cdots, K\}$.
     [Figure: $W_1 \in 2^{nR_1}, \ldots, W_K \in 2^{nR_K} \to X^n \to H \to {+}\,Z^n_k \to Y^n_k \to \hat{W}_k(Y^n_k)$]
   • Capacity is known only in special cases.
   – This talk focuses on sum capacity: $C = \max\{R_1 + \cdots + R_K\}$.

6. Broadcast Channel: Prior Work
   • Introduced by Cover ('72)
     – Superposition coding: Cover ('72)
     – Degraded broadcast channel: Bergmans ('74), Gallager ('74)
     – Coding using binning: Marton ('79), El Gamal, van der Meulen ('81)
     – Sum and product channels: El Gamal ('80)
     – Gaussian vector channel, 2×2 case: Caire, Shamai ('00)
   • The general capacity region remains unknown.

7. Degraded Broadcast Channel
   [Figure: $X = X_1 + X_2$ with $X_1 \sim \mathcal{N}(0, P_1)$, $X_2 \sim \mathcal{N}(0, P_2)$; $Y_1 = X + Z_1$ with $Z_1 \sim \mathcal{N}(0, \sigma_1^2)$; $Y_2 = Y_1 + Z_2$ with $Z_2 \sim \mathcal{N}(0, \sigma_2^2 - \sigma_1^2)$]
   • Superposition and successive decoding achieve capacity (Cover '72):
     $$R_1 = I(X_1; Y_1 \mid X_2) = \tfrac{1}{2} \log\left(1 + \frac{P_1}{\sigma_1^2}\right)$$
     $$R_2 = I(X_2; Y_2) = \tfrac{1}{2} \log\left(1 + \frac{P_2}{P_1 + \sigma_2^2}\right)$$
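
A quick numeric instance of these rate expressions (values made up for illustration): with $P_1 = P_2 = 1$, $\sigma_1^2 = 1$, and $\sigma_2^2 = 2$,

$$R_1 = \tfrac{1}{2}\log\left(1 + \tfrac{1}{1}\right) \approx 0.347 \text{ nats}, \qquad R_2 = \tfrac{1}{2}\log\left(1 + \tfrac{1}{1+2}\right) \approx 0.144 \text{ nats}.$$

Note that user 2's rate is reduced by user 1's power $P_1$, which the weaker receiver must treat as noise.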

8. Gaussian Vector Broadcast Channel
   [Figure: $X = X_1 + X_2$ with $X_1 \sim \mathcal{N}(0, K_1)$, $X_2 \sim \mathcal{N}(0, K_2)$; $Y_1 = H_1 X + Z_1$, $Y_2 = H_2 X + Z_2$]
   • Superposition coding gives:
     $$R_1 = I(X_1; Y_1) = \tfrac{1}{2} \log \frac{|H_1 K_1 H_1^T + H_1 K_2 H_1^T + K_{z_1 z_1}|}{|H_1 K_2 H_1^T + K_{z_1 z_1}|}$$
     $$R_2 = I(X_2; Y_2) = \tfrac{1}{2} \log \frac{|H_2 K_2 H_2^T + H_2 K_1 H_2^T + K_{z_2 z_2}|}{|H_2 K_1 H_2^T + K_{z_2 z_2}|}$$

9. Channel with Transmitter Side Information
   [Figure, left — Gaussian channel: $X \to {+}\,Z \to Y$ with $Z \sim \mathcal{N}(0, N)$ and power $P$; right — Gaussian channel with transmitter side information: $X \to {+}\,S + Z \to Y$ with $S \sim \mathcal{N}(0, Q)$, $Z \sim \mathcal{N}(0, N)$, and power $P$]
   $$C = \tfrac{1}{2} \log\left(1 + \frac{P}{N}\right) \qquad\qquad C = \,?$$

10. Writing on Dirty Paper
   • A surprising result due to Costa ('83):
     [Figure: $W \in 2^{nR} \to X^n(W, S^n) \to {+}\,S^n + Z^n \to Y^n \to \hat{W}(Y^n)$, with $S^n \sim \mathcal{N}(0, Q)$, $Z^n \sim \mathcal{N}(0, N)$]
     $$C = \tfrac{1}{2} \log\left(1 + \frac{P}{N}\right)$$
   • This inspired Caire and Shamai's work on the 2×2 broadcast channel ('01).

11. Channel with Side Information
   [Figure: $W \in 2^{nR} \to X^n(W, S^n) \to p(y \mid x, s) \to Y^n \to \hat{W}(Y^n)$, with the state $S^n$ known at the encoder]
   • Gel'fand and Pinsker ('80), Heegard and El Gamal ('83):
     $$C = \max_{p(u,x|s)} \{ I(U; Y) - I(U; S) \}$$
   • Key question: what is the appropriate auxiliary random variable $U$?

12. Random Binning and Joint Typicality
   [Figure: geometric relationship among $u$, $s$, and $x = u - \alpha s$]
   • Randomly choose $u^n(i)$, $i \in 2^{nI(U;Y)}$. Bin them using $B : 2^{nI(U;Y)} \to 2^{nC}$.
   • Encode: given $s^n$ and message $W$, find $i$ such that $(u^n(i), s^n)$ is jointly typical and $B(i) = W$. Send $x^n = u^n(i) - \alpha s^n$.
   • Decode: find $\hat{i}$ such that $(y^n, u^n(\hat{i}))$ is jointly typical. Recover $\hat{W} = B(\hat{i})$.
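
Why this achieves $C = I(U;Y) - I(U;S)$: there are $2^{nC}$ bins, so each bin contains $2^{n(I(U;Y) - C)}$ sequences, and the encoder finds a jointly typical match with $s^n$ inside the desired bin with high probability only if the bin holds at least $2^{nI(U;S)}$ sequences. This requires

$$I(U;Y) - C \ge I(U;S) \quad\Longrightarrow\quad C \le I(U;Y) - I(U;S).$$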

13. Costa's Choice for U
   [Figure: $W \in 2^{nR} \to X^n(W, S^n) \to {+}\,S^n + Z^n \to Y^n \to \hat{W}(Y^n)$, with $S^n \sim \mathcal{N}(0, Q)$, $Z^n \sim \mathcal{N}(0, N)$]
   • For i.i.d. $S$ and $Z$:
     – Let $U = X + \alpha S$, where $\alpha = P / (P + N)$.
     – Let $X$ be independent of $S$.
     – This gives the optimal joint distribution on $(S, X, U, Y, Z)$:
       $$C = I(U; Y) - I(U; S) = \tfrac{1}{2} \log\left(1 + \frac{P}{N}\right).$$
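
Costa's claim can be checked numerically with a small sketch (illustrative values, not from the talk): all variables are jointly Gaussian, so mutual information reduces to log-determinants of covariance matrices.

```python
import numpy as np

# Numeric check of Costa's result: with U = X + a*S, a = P/(P+N),
# and X independent of S, the Gel'fand-Pinsker rate I(U;Y) - I(U;S)
# equals the no-interference capacity 0.5*log(1 + P/N).
P, Q, N = 1.0, 5.0, 2.0
a = P / (P + N)

def mi(K, idx_a, idx_b):
    """I(A;B) for jointly Gaussian variables with covariance K."""
    det = np.linalg.det
    Ka = K[np.ix_(idx_a, idx_a)]
    Kb = K[np.ix_(idx_b, idx_b)]
    Kab = K[np.ix_(idx_a + idx_b, idx_a + idx_b)]
    return 0.5 * np.log(det(Ka) * det(Kb) / det(Kab))

# (X, S, Z) has covariance diag(P, Q, N); map it to (U, Y, S) with
# U = X + a*S and Y = X + S + Z.
A = np.array([[1, a, 0],    # U
              [1, 1, 1],    # Y
              [0, 1, 0]])   # S
K = A @ np.diag([P, Q, N]) @ A.T     # covariance of (U, Y, S)
rate = mi(K, [0], [1]) - mi(K, [0], [2])
print(rate, 0.5 * np.log(1 + P / N))  # both ~ 0.2027 nats
```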

14. Colored Gaussian Channel with Side Information
   [Figure: $W \in 2^{nR} \to X^n(W, S^n) \to {+}\,S^n + Z^n \to Y^n \to \hat{W}(Y^n)$, with $S^n \sim \mathcal{N}(0, K_{ss})$, $Z^n \sim \mathcal{N}(0, K_{zz})$]
   • For colored $S$ and $Z$:
     – Let $U = X + FS$, where $F = K_{xx}(K_{xx} + K_{zz})^{-1}$.
     – Let $X$ be independent of $S$.
     $$C = I(U; Y) - I(U; S) = \tfrac{1}{2} \log \frac{|K_{xx} + K_{zz}|}{|K_{zz}|}$$

15. Wiener Filtering
   • The optimal non-causal estimate of $X$ given $X + Z$ is $\hat{X} = F(X + Z)$, where $F = K_{xx}(K_{xx} + K_{zz})^{-1}$.
   • The optimal auxiliary random variable for the channel with non-causal transmitter side information is $U = X + FS$, where $F = K_{xx}(K_{xx} + K_{zz})^{-1}$.
   • Curiously, the two filters are the same.
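
The first statement is the standard conditional-mean calculation for jointly Gaussian vectors, $\mathbb{E}[A \mid B] = K_{AB} K_{BB}^{-1} B$: taking $A = X$ and $B = X + Z$ with $X$ and $Z$ independent gives $K_{AB} = K_{xx}$ and $K_{BB} = K_{xx} + K_{zz}$, so

$$\hat{X} = \mathbb{E}[X \mid X+Z] = K_{xx}(K_{xx} + K_{zz})^{-1}(X + Z).$$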

16. Writing on Colored Paper
   [Figure, left — Gaussian channel: $X \to {+}\,Z \to Y$ with $Z \sim \mathcal{N}(0, K_{zz})$; right — Gaussian channel with transmitter side information: $X \to {+}\,S + Z \to Y$ with $S \sim \mathcal{N}(0, K_{ss})$, $Z \sim \mathcal{N}(0, K_{zz})$]
   $$C = \tfrac{1}{2} \log \frac{|K_{xx} + K_{zz}|}{|K_{zz}|} \qquad\qquad C = \tfrac{1}{2} \log \frac{|K_{xx} + K_{zz}|}{|K_{zz}|}$$
   • The capacities are the same if $S$ is known non-causally at the transmitter.
   – Several other proofs have been found by Cohen and Lapidoth ('01) and by Zamir, Shamai, and Erez ('01) under different assumptions.

17. New Achievable Region
   [Figure: encoder 1 produces $X_1^n(W_1, X_2^n)$ with $W_1 \in 2^{nR_1}$; encoder 2 produces $X_2^n(W_2)$ with $W_2 \in 2^{nR_2}$; $Y_1 = H_1 X + Z_1$, $Y_2 = H_2 X + Z_2$]
   $$R_1 = I(X_1; Y_1 \mid X_2) = \tfrac{1}{2} \log \frac{|H_1 K_1 H_1^T + K_{z_1 z_1}|}{|K_{z_1 z_1}|}$$
   $$R_2 = I(X_2; Y_2) = \tfrac{1}{2} \log \frac{|H_2 K_2 H_2^T + H_2 K_1 H_2^T + K_{z_2 z_2}|}{|H_2 K_1 H_2^T + K_{z_2 z_2}|}$$

18. Converse
   • Broadcast capacity does not depend on the noise correlation: Sato ('78).
     [Figure: the broadcast channel with noise $(z_1, z_2)$ equals one with noise $(z'_1, z'_2)$, and both are upper-bounded by a channel whose receivers cooperate]
     If $p(z_1) = p(z'_1)$ and $p(z_2) = p(z'_2)$, the broadcast capacities are equal, even though it is not necessarily the case that $p(z_1, z_2) = p(z'_1, z'_2)$.
   • Thus, the sum capacity satisfies $C \le \min_{K_{zz}} \max_{K_{xx}} I(X; Y)$.

19. Strategy for Proving Achievability
   1. Find the worst-case noise correlation $z \sim \mathcal{N}(0, K_{zz})$.
   2. Design an optimal receiver for the vector channel with worst-case noise: $y = Hx + z$.
   3. Precode $x$ so that receiver coordination is not necessary.
   • Tools:
     – Convex optimization
     – Generalized decision-feedback equalization (GDFE): Cioffi, Forney ('95); Varanasi, Guess ('97)

20. Least Favorable Noise
   • Fix a Gaussian input $K_{xx}$:
     $$\begin{aligned} \text{minimize} \quad & \tfrac{1}{2} \log \frac{|H K_{xx} H^T + K_{zz}|}{|K_{zz}|} \\ \text{subject to} \quad & K_{zz} = \begin{bmatrix} K_{z_1 z_1} & \star \\ \star & K_{z_2 z_2} \end{bmatrix}, \quad K_{zz} \succeq 0. \end{aligned}$$
   • This minimizes a convex function over convex constraints.
   • Optimality condition: $K_{zz}^{-1} - (H K_{xx} H^T + K_{zz})^{-1} = \begin{bmatrix} \Psi_1 & 0 \\ 0 & \Psi_2 \end{bmatrix}$.
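
For two single-antenna users the constraint fixes the diagonal of $K_{zz}$, leaving only the noise correlation $\rho$ free, so the least favorable noise reduces to a one-dimensional search. A minimal sketch (channel and covariances made up for illustration):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# 2x2 case: with the diagonal of Kzz fixed to 1, search for the
# correlation rho minimizing 0.5*(log|H Kxx H^T + Kzz| - log|Kzz|).
H = np.array([[1.0, 0.4], [0.3, 1.0]])
Kxx = np.diag([1.0, 0.5])
S = H @ Kxx @ H.T                        # received signal covariance

def f(rho):
    Kzz = np.array([[1.0, rho], [rho, 1.0]])
    _, ld1 = np.linalg.slogdet(S + Kzz)
    _, ld2 = np.linalg.slogdet(Kzz)
    return 0.5 * (ld1 - ld2)

res = minimize_scalar(f, bounds=(-0.999, 0.999), method='bounded')
rho = res.x
Kzz = np.array([[1.0, rho], [rho, 1.0]])
# Optimality check: Kzz^{-1} - (H Kxx H^T + Kzz)^{-1} should be
# diagonal at the minimizer (off-diagonal entries ~ 0).
M = np.linalg.inv(Kzz) - np.linalg.inv(S + Kzz)
print(rho, M)
```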

21. Generalized Decision-Feedback Equalizer
   • Key idea: MMSE estimation is capacity-lossless.
     [Figure: $x \to H \to {+}\,z \to y \to (H^T H + K_{xx}^{-1})^{-1} H^T \to \hat{x}$]
   • The channel can be triangularized: $(H^T H + K_{xx}^{-1})^{-1} = G^{-1} \Delta^{-1} G^{-T}$.
     [Figure: $x = (x_1, x_2) \to H \to {+}\,z \to y \to H^T \to \Delta^{-1} G^{-T} \to$ Decision $\to \hat{x} = (\hat{x}_1, \hat{x}_2)$, with feedback filter $I - G$]
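
The triangularization is a Cholesky-type factorization and is easy to verify numerically. A sketch with an arbitrary random channel (illustrative; $K_{xx} = I$ is chosen so that $\tfrac{1}{2}\log|H^T H + K_{xx}^{-1}|$ equals $I(X;Y)$):

```python
import numpy as np

# Factor M = H^T H + Kxx^{-1} as M = G^T Delta G with G monic upper
# triangular, so that M^{-1} = G^{-1} Delta^{-1} G^{-T}.
rng = np.random.default_rng(0)
H = rng.standard_normal((3, 3))
Kxx = np.eye(3)

M = H.T @ H + np.linalg.inv(Kxx)
L = np.linalg.cholesky(M)          # M = L L^T, L lower triangular
d = np.diag(L)
G = (L / d).T                      # monic upper triangular
Delta = np.diag(d ** 2)

assert np.allclose(G.T @ Delta @ G, M)
# The diagonal of Delta gives the per-stream gains of the successive
# decision-feedback stages; their product recovers |M|, so the total
# rate matches 0.5*log|M| (= I(X;Y) here, since Kxx = I).
print(0.5 * np.sum(np.log(np.diag(Delta))), "vs",
      0.5 * np.linalg.slogdet(M)[1])
```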

22. GDFE with Transmit Filter
   [Figure: $u \to F \to x \to H \to {+}\,z \to$ noise whitener $\Lambda^{-1/2} Q^T \to \tilde{H}^T (\tilde{H} \tilde{H}^T + I)^{-1} \to \hat{u}$, where $\tilde{H} = \Lambda^{-1/2} Q^T H F$; the receiver front end is MMSE estimation]
   • Set $z \sim \mathcal{N}(0, K_{zz})$ to be the least favorable noise, with eigendecomposition $K_{zz} = Q \Lambda Q^T$.
   • Fix $x \sim \mathcal{N}(0, K_{xx})$ and $u \sim \mathcal{N}(0, I)$. Choose a transmit filter $F$.

23. GDFE Precoder
   [Figure: $\tilde{u} \to \tilde{H}^T \to \Delta^{-1} G^{-T}$ (feedforward filter) $\to$ Decision $\to \hat{u} = (\hat{u}_1, \hat{u}_2)$, with feedback filter $I - G$ and noise $\tilde{z}$]
   • Decision feedback may be moved to the transmitter by precoding.
   • Least favorable noise ⟺ the feedforward/whitening filter is diagonal!
   • Hence $C = \min_{K_{zz}} I(X; Y)$ (i.e., with the least favorable noise) is achievable.

24. Gaussian Broadcast Channel Sum Capacity
   • Achievability: $C \ge \max_{K_{xx}} \min_{K_{zz}} I(X; Y)$.
   • Converse (Sato): $C \le \min_{K_{zz}} \max_{K_{xx}} I(X; Y)$.
   • Diggavi, Cover ('98): $\min_{K_{zz}} \max_{K_{xx}} I(X; Y) = \max_{K_{xx}} \min_{K_{zz}} I(X; Y)$.
   Theorem 1. The Gaussian vector broadcast channel sum capacity is
   $$C = \max_{K_{xx}} \min_{K_{zz}} \tfrac{1}{2} \log \frac{|H K_{xx} H^T + K_{zz}|}{|K_{zz}|}.$$

25. Gaussian Mutual Information Game
   [Figure: $X \sim \mathcal{N}(0, K_{xx}) \to H \to {+}\,Z \to Y$, with $Z \sim \mathcal{N}(0, K_{zz})$]
   • Signal player — strategy: $\{K_{xx} : \mathrm{trace}(K_{xx}) \le P\}$; objective: $\max I(X; Y)$.
   • Fictitious noise player — strategy: $\left\{K_{zz} : K_{zz} = \begin{bmatrix} K_{z_1 z_1} & \star \\ \star & K_{z_2 z_2} \end{bmatrix} \succeq 0\right\}$; objective: $\min I(X; Y)$.
   • A Nash equilibrium exists.

26. Saddle-Point is the Broadcast Capacity
   [Figure: the surface $C(K_{xx}, K_{zz})$ with a saddle point at $(K^*_{xx}, K^*_{zz})$]
   • The optimum $K^*_{xx}$ is a water-filling covariance against $K^*_{zz}$.
   • The optimum $K^*_{zz}$ is a least favorable noise for $K^*_{xx}$.
   Broadcast channel sum capacity = Nash equilibrium.
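
Equivalently, $(K^*_{xx}, K^*_{zz})$ is a saddle point — neither player gains by deviating unilaterally:

$$C(K_{xx}, K^*_{zz}) \;\le\; C(K^*_{xx}, K^*_{zz}) \;\le\; C(K^*_{xx}, K_{zz}) \qquad \text{for all feasible } K_{xx}, K_{zz}.$$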

27. The Value of Cooperation
   [Figure: three two-user channels — no cooperation, full cooperation, and transmitter-only cooperation (broadcast)]
   $$\max_{K_{xx}} I(X; X+Z) \qquad\quad \max_{K_{xx}} I(X; X+Z) \qquad\quad \min_{K_{zz}} \max_{K_{xx}} I(X; X+Z)$$
   – No cooperation: $K_{xx} = \begin{bmatrix} K_1 & 0 \\ 0 & K_2 \end{bmatrix}$, subject to $\mathrm{trace}(K_i) \le P_i$.
   – Full cooperation: subject to $\mathrm{trace}(K_{xx}) \le P$.
   – Broadcast: $K_{zz} = \begin{bmatrix} K_{z_1 z_1} & \star \\ \star & K_{z_2 z_2} \end{bmatrix}$, subject to $\mathrm{trace}(K_{xx}) \le P$.

28. Application: Vector Transmission in DSL
   [Figure: three coupled lines with $X_1 \to Y_1$, $X_2 \to Y_2$, $X_3 \to Y_3$ and crosstalk $Z$ between them]
   • If the interference is known in advance, it can be pre-subtracted:
     – Send $X'_1 = X_1 - X_2 - X_3$.
   • Problem: energy enhancement. Since the signals are independent, $\|X'_1\|^2 = \|X_1\|^2 + \|X_2\|^2 + \|X_3\|^2$.

29. Reducing Energy Enhancement: Tomlinson Precoder
   [Figure: $U \to$ Mod-$M \to X \to {+}\,Z \to Y \to$ Mod-$M \to \hat{U}$; equivalent points at $\ldots, -\tfrac{3M}{2}, -\tfrac{M}{2}, \tfrac{M}{2}, \tfrac{3M}{2}, \ldots$]
   • Key idea: use a modulo operation to reduce energy enhancement.
     – $X$ is uniformly distributed in $[-\tfrac{M}{2}, \tfrac{M}{2}]$.
   • Capacity loss due to shaping: 1.53 dB (Erez, Shamai, Zamir '00).
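
A minimal one-sample-per-use sketch of the modulo trick (all values made up for illustration): the transmit signal stays bounded no matter how large the known interference is, and the receiver's modulo undoes the wrap.

```python
import numpy as np

M = 4.0

def mod(v):
    """Wrap v into the interval [-M/2, M/2)."""
    return (v + M / 2) % M - M / 2

rng = np.random.default_rng(1)
u = rng.uniform(-M / 2, M / 2, 10_000)   # data symbols
s = rng.normal(0.0, 10.0, 10_000)        # known interference (can be huge)

x = mod(u - s)                           # transmitted: bounded energy
y = x + s                                # channel adds the interference back
u_hat = mod(y)                           # receiver modulo recovers u

print(np.allclose(u_hat, u))             # True: interference removed
print(np.mean(x ** 2))                   # ~ M^2/12, regardless of how large s is
```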
