Outline: I. Discrete Alphabets, II. AWGN Channels, III. Network Applications



1. Outline: I. Discrete Alphabets, II. AWGN Channels, III. Network Applications

2. Gaussian Multiple-Access Channel Rate Region
Encoders E_1, E_2 map messages w_1, w_2 to x_1, x_2 under power constraints P_1, P_2; the decoder D observes y = x_1 + x_2 + z, where the noise z has variance N, and outputs ŵ_1, ŵ_2. The rate region is
• R_1 < (1/2) log(1 + P_1 / N)
• R_2 < (1/2) log(1 + P_2 / N)
• R_1 + R_2 < (1/2) log(1 + (P_1 + P_2) / N)
Successive cancellation achieves the corner point (R_1, R_2) = ((1/2) log(1 + P_1 / (N + P_2)), (1/2) log(1 + P_2 / N)):
1. Decode x_1, treating x_2 as noise.
2. Subtract x_1 from y.
3. Decode x_2.
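
As a quick numerical sanity check (not part of the original slides), a minimal Python sketch of the successive-cancellation corner point; the helper name sc_corner_point and the example values P_1 = 4, P_2 = 2, N = 1 are illustrative only.

```python
import math

def sc_corner_point(P1, P2, N):
    """Successive-cancellation corner point of the Gaussian MAC rate region."""
    R1 = 0.5 * math.log2(1 + P1 / (N + P2))  # decode x1 first, x2 treated as noise
    R2 = 0.5 * math.log2(1 + P2 / N)         # decode x2 after subtracting x1
    return R1, R2

P1, P2, N = 4.0, 2.0, 1.0
R1, R2 = sc_corner_point(P1, P2, N)
sum_capacity = 0.5 * math.log2(1 + (P1 + P2) / N)
print(R1, R2)
print(R1 + R2, sum_capacity)  # the corner point lies on the sum-rate boundary
```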

3. Lattice Achievability “Recipe” – Multiple-Access Corner Point
Codebook Generation
• Select a nested lattice code:
  - Coarse lattice Λ = B Z^n for shaping.
  - Fine lattice from a q-ary linear code G for coding.
Encoding
• Map messages w_1, w_2 to lattice points t_1 = [B γ G w_1] mod Λ and t_2 = [B γ G w_2] mod Λ.
• Choose independent dithers d_1, d_2 uniformly over the Voronoi region V.
• Add the dithers to the lattice points and take mod Λ to get the transmitted signals x_1 = [t_1 + d_1] mod Λ and x_2 = [t_2 + d_2] mod Λ.
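
To make the encoding step concrete, here is a toy, Construction-A-style sketch. It assumes the simplest shaping matrix B = b·I (so mod Λ is componentwise), sets γ = b/q, and draws a random q-ary generator matrix G; these specific choices are illustrative assumptions, not the slides' construction.

```python
import numpy as np

rng = np.random.default_rng(0)
q, n, k = 5, 8, 4            # field size, lattice dimension, message length (toy values)
b = 2.0                      # coarse lattice Λ = b * Z^n, i.e. B = b * I
gamma = b / q                # scaling that nests the fine lattice inside Λ
G = rng.integers(0, q, size=(n, k))   # q-ary generator matrix (random, for illustration)

def mod_coarse(x):
    """Reduce x modulo the coarse lattice b*Z^n (componentwise since B = b*I)."""
    return np.mod(x, b)

def encode(w, d):
    """Map a message w in F_q^k to the lattice point t and the dithered input [t + d] mod Λ."""
    t = mod_coarse(gamma * np.mod(G @ w, q))   # t = [B*gamma*G*w] mod Λ
    return t, mod_coarse(t + d)

w1 = rng.integers(0, q, size=k)
d1 = rng.uniform(0, b, size=n)   # dither uniform over the (cubic) Voronoi region
t1, x1 = encode(w1, d1)
print(np.round(t1, 2))
print(np.round(x1, 2))
```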

4. Lattice Achievability “Recipe” – Multiple-Access Corner Point
Decoding
Receiver observes y = x_1 + x_2 + z.
• Scale by α.
• Subtract dither d_1.
• Take mod Λ.
• Decode to the nearest codeword.
[α y − d_1] mod Λ = [α (x_1 + x_2 + z) − d_1] mod Λ
                  = [x_1 − d_1 + α z + α x_2 − (1 − α) x_1] mod Λ
                  = [[t_1 + d_1] mod Λ − d_1 + α z + α x_2 − (1 − α) x_1] mod Λ
                  = [t_1 + α z + α x_2 − (1 − α) x_1] mod Λ
where α z + α x_2 − (1 − α) x_1 is the effective noise.
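
The chain of mod-Λ identities can be verified numerically. A self-contained sketch with an assumed cubic coarse lattice b·Z^n and stand-in codewords drawn from its Voronoi cube (the nearest-codeword step is omitted; this only checks that the dither cancels).

```python
import numpy as np

rng = np.random.default_rng(0)
n, b, alpha = 8, 2.0, 0.6                 # dimension, coarse lattice b*Z^n, scale factor
t1, t2 = rng.uniform(0, b, size=(2, n))   # stand-ins for two fine-lattice codewords
d1, d2 = rng.uniform(0, b, size=(2, n))   # dithers, uniform over the cubic Voronoi region
z = rng.normal(0.0, 0.1, size=n)

x1 = np.mod(t1 + d1, b)
x2 = np.mod(t2 + d2, b)
y = x1 + x2 + z

lhs = np.mod(alpha * y - d1, b)
rhs = np.mod(t1 + alpha * z + alpha * x2 - (1 - alpha) * x1, b)
print(np.allclose(lhs, rhs))              # True: the dither d1 cancels modulo Λ
```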

5. Lattice Achievability “Recipe” – Multiple-Access Corner Point
• Effective noise after scaling is N_EFFEC = α^2 (N + P_2) + (1 − α)^2 P_1.
• Minimized by setting α to the MMSE coefficient α_MMSE = P_1 / (N + P_1 + P_2).
• Plugging in, we get N_EFFEC = (N + P_2) P_1 / (N + P_1 + P_2).
• Resulting rate is R = (1/2) log(P_1 / N_EFFEC) = (1/2) log(1 + P_1 / (N + P_2)).
• To obtain different rates for x_1 and x_2, use nested linear codes G_1 and G_2 inside the Voronoi region V.
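
A small numeric check of the MMSE step, with illustrative values P_1 = 4, P_2 = 2, N = 1 (the values are assumptions, not from the slides).

```python
import math

P1, P2, N = 4.0, 2.0, 1.0
alpha = P1 / (N + P1 + P2)                          # MMSE coefficient
N_eff = alpha**2 * (N + P2) + (1 - alpha)**2 * P1   # effective noise variance
print(N_eff, (N + P2) * P1 / (N + P1 + P2))         # both equal 12/7
print(0.5 * math.log2(P1 / N_eff),
      0.5 * math.log2(1 + P1 / (N + P2)))           # both equal the corner-point rate
```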

6. AWGN Two-Way Relay Channel – Symmetric Rates
User 1 has w_1 and wants w_2; User 2 has w_2 and wants w_1. They communicate through a relay.

7. AWGN Two-Way Relay Channel – Symmetric Rates
Uplink (MAC phase): Users 1 and 2 transmit x_1, x_2 and the relay observes y_MAC through noise z_MAC. Downlink (BC phase): the relay transmits x_BC, which the users observe through noises z_1, z_2 to decode ŵ_2 and ŵ_1.
• Equal power constraints P.
• Equal noise variances N.
• Equal rates R.
• Upper Bound: R ≤ (1/2) log(1 + P / N)
• Decode-and-Forward: Relay decodes w_1, w_2 and transmits w_1 ⊕ w_2. R = (1/4) log(1 + 2P / N)
• Compress-and-Forward: Relay transmits a quantized y. R = (1/2) log(1 + (P / N) · (P / (3P + N)))
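
These baseline expressions are easy to tabulate. A minimal sketch over an assumed SNR grid (the function name and grid are illustrative only).

```python
import math

def twrc_baselines(P, N):
    """Per-user symmetric rates for the AWGN two-way relay channel (slide expressions)."""
    upper    = 0.5  * math.log2(1 + P / N)
    decode   = 0.25 * math.log2(1 + 2 * P / N)
    compress = 0.5  * math.log2(1 + (P / N) * (P / (3 * P + N)))
    return upper, decode, compress

for snr_db in (0, 10, 20):
    P = 10 ** (snr_db / 10)
    print(snr_db, [round(r, 3) for r in twrc_baselines(P, N=1.0)])
```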

8. AWGN Two-Way Relay Channel – Symmetric Rates
[Plot: rate per user versus SNR (0 to 20 dB) for the upper bound, compress-and-forward, and decode-and-forward.]

9. Decoding the Sum of Lattice Codewords
• Encoders use the same nested lattice codebook.
• Transmit lattice codewords: x_1 = t_1, x_2 = t_2.
• Decoder recovers the modulo sum v = [t_1 + t_2] mod Λ.
[y] mod Λ = [x_1 + x_2 + z] mod Λ
          = [t_1 + t_2 + z] mod Λ
          = [[t_1 + t_2] mod Λ + z] mod Λ    (distributive law)
          = [v + z] mod Λ
• Achievable rate: R = (1/2) log(P / N)
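
The distributive-law step is a property of the modulo operation and can be checked in one line; a quick sketch with an assumed cubic coarse lattice b·Z^n.

```python
import numpy as np

rng = np.random.default_rng(1)
b = 2.0
t1, t2 = rng.uniform(0, b, size=(2, 6))    # two codewords in the Voronoi cube
z = rng.normal(0.0, 0.1, size=6)

direct = np.mod(t1 + t2 + z, b)            # [t1 + t2 + z] mod Λ
nested = np.mod(np.mod(t1 + t2, b) + z, b) # [[t1 + t2] mod Λ + z] mod Λ
print(np.allclose(direct, nested))         # True: the mod distributes
```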

10. Decoding the Sum of Lattice Codewords – MMSE Scaling
• Encoders use the same nested lattice codebook.
• Transmit dithered codewords: x_1 = [t_1 + d_1] mod Λ, x_2 = [t_2 + d_2] mod Λ.
• Decoder scales by α, removes the dithers, and recovers the modulo sum v = [t_1 + t_2] mod Λ.
[α y − d_1 − d_2] mod Λ = [α (x_1 + x_2 + z) − d_1 − d_2] mod Λ
                        = [x_1 + x_2 − (1 − α)(x_1 + x_2) + α z − d_1 − d_2] mod Λ
                        = [[t_1 + t_2] mod Λ − (1 − α)(x_1 + x_2) + α z] mod Λ
                        = [v − (1 − α)(x_1 + x_2) + α z] mod Λ
The effective noise −(1 − α)(x_1 + x_2) + α z has variance N_EFFEC = (1 − α)^2 2P + α^2 N.
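
The same kind of numeric check applies to the dithered, scaled receiver; a sketch with the assumed cubic coarse lattice used above.

```python
import numpy as np

rng = np.random.default_rng(3)
b, alpha = 2.0, 0.8
t1, t2 = rng.uniform(0, b, size=(2, 6))   # stand-in lattice codewords
d1, d2 = rng.uniform(0, b, size=(2, 6))   # dithers, uniform over the Voronoi cube
z = rng.normal(0.0, 0.1, size=6)

x1 = np.mod(t1 + d1, b)
x2 = np.mod(t2 + d2, b)
y = x1 + x2 + z

v = np.mod(t1 + t2, b)
lhs = np.mod(alpha * y - d1 - d2, b)
rhs = np.mod(v - (1 - alpha) * (x1 + x2) + alpha * z, b)
print(np.allclose(lhs, rhs))              # True: both dithers cancel modulo Λ
```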

11. Decoding the Sum of Lattice Codewords – MMSE Scaling
• Effective noise after scaling is N_EFFEC = (1 − α)^2 2P + α^2 N.
• Minimized by setting α to the MMSE coefficient α_MMSE = 2P / (N + 2P).
• Plugging in, we get N_EFFEC = 2NP / (N + 2P).
• Resulting rate is R = (1/2) log(P / N_EFFEC) = (1/2) log(1/2 + P / N).
• Getting the full “one plus” term is an open challenge. Does not seem possible with nested lattices.
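
A quick numeric check of the MMSE step for the sum, with the illustrative (assumed) values P = 5, N = 1.

```python
import math

P, N = 5.0, 1.0
alpha = 2 * P / (N + 2 * P)                     # MMSE coefficient
N_eff = (1 - alpha)**2 * 2 * P + alpha**2 * N   # effective noise variance
print(N_eff, 2 * N * P / (N + 2 * P))           # both equal 10/11
print(0.5 * math.log2(P / N_eff),
      0.5 * math.log2(0.5 + P / N))             # both equal the computation rate
```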

12. From Messages to Lattice Points and Back
• Map messages to lattice points: t_1 = φ(w_1) = [B γ G w_1] mod Λ and t_2 = φ(w_2) = [B γ G w_2] mod Λ.
• The mapping between finite field messages and lattice codewords preserves linearity: φ^{-1}([t_1 + t_2] mod Λ) = w_1 ⊕ w_2.
• This means that after decoding a mod-Λ equation of lattice points we can immediately recover the finite field equation of the messages. See Nazer-Gastpar ’11 for more details.
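
A toy illustration of the linearity-preserving map φ. To stay self-contained it assumes the trivial code G = I (each symbol mapped separately), Λ = b·Z^n, and γ = b/q; the actual construction runs the message through the q-ary generator matrix G first.

```python
import numpy as np

q, b = 5, 2.0
gamma = b / q

def phi(w):
    """Map an F_q vector to a lattice point in [0, b)^n (toy G = I construction)."""
    return np.mod(gamma * w, b)

def phi_inv(t):
    """Inverse map: recover the F_q vector from a fine-lattice point."""
    return np.mod(np.rint(t / gamma).astype(int), q)

rng = np.random.default_rng(2)
w1, w2 = rng.integers(0, q, size=(2, 6))

t_sum = np.mod(phi(w1) + phi(w2), b)      # [t1 + t2] mod Λ
print(phi_inv(t_sum))                     # equals w1 ⊕ w2 ...
print(np.mod(w1 + w2, q))                 # ... i.e. (w1 + w2) mod q
```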

13. Finite Field Computation over a Gaussian MAC
• Map messages to lattice points: t_1 = φ(w_1), t_2 = φ(w_2).
• Transmit dithered codewords: x_1 = [t_1 + d_1] mod Λ, x_2 = [t_2 + d_2] mod Λ.
• The decoder observes y = x_1 + x_2 + z and wants u = w_1 ⊕ w_2.
• If the decoder can recover [t_1 + t_2] mod Λ, it can also get the sum of the messages: w_1 ⊕ w_2 = φ^{-1}([t_1 + t_2] mod Λ).
• Achievable rate: R = (1/2) log(1/2 + P / N).
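
Putting the pieces together, a toy end-to-end run that computes w_1 ⊕ w_2 from a noisy sum. It reuses the G = I mapping above, skips the α scaling, and keeps the noise small so symbolwise rounding stands in for nearest-codeword lattice decoding; all of these simplifications are assumptions, not the slides' scheme.

```python
import numpy as np

rng = np.random.default_rng(4)
q, b, n = 5, 2.0, 8
gamma = b / q

w1, w2 = rng.integers(0, q, size=(2, n))
d1, d2 = rng.uniform(0, b, size=(2, n))
x1 = np.mod(gamma * w1 + d1, b)          # dithered codeword for w1 (G = I toy code)
x2 = np.mod(gamma * w2 + d2, b)          # dithered codeword for w2
z = rng.normal(0.0, 0.02, size=n)        # low noise so rounding succeeds
y = x1 + x2 + z

s = np.mod(y - d1 - d2, b)               # remove dithers, reduce mod Λ
u_hat = np.mod(np.rint(s / gamma).astype(int), q)   # round to the nearest fine-lattice point
print(u_hat)
print(np.mod(w1 + w2, q))                # u_hat matches w1 ⊕ w2
```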

14. AWGN Two-Way Relay Channel – Symmetric Rates
• Equal power constraints P, equal noise variances N, equal rates R.
• Upper Bound: R ≤ (1/2) log(1 + P / N)
• Compute-and-Forward: Relay decodes w_1 ⊕ w_2 and retransmits it. R = (1/2) log(1/2 + P / N)
• Wilson-Narayanan-Pfister-Sprintson ’10: applies nested lattice codes to the two-way relay channel.
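
For comparison with the baselines tabulated earlier, a sketch that adds the compute-and-forward expression over the same assumed SNR grid. From the two formulas, the gap to the upper bound shrinks toward zero as the SNR grows.

```python
import math

def compute_and_forward(P, N):
    """Per-user rate when the relay decodes w1 ⊕ w2 with nested lattice codes."""
    return 0.5 * math.log2(0.5 + P / N)

def upper_bound(P, N):
    return 0.5 * math.log2(1 + P / N)

for snr_db in (0, 10, 20):
    P = 10 ** (snr_db / 10)
    print(snr_db, round(compute_and_forward(P, 1.0), 3), round(upper_bound(P, 1.0), 3))
```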

15. AWGN Two-Way Relay Channel – Symmetric Rates
[Plot: rate per user versus SNR (0 to 20 dB) for the upper bound, compute-and-forward, compress-and-forward, and decode-and-forward.]

16. Compute-and-Forward Illustration
[Diagram: w_1 and w_2 are encoded to x_1 and x_2; the receiver observes y = x_1 + x_2 + z and decodes the sum w_1 ⊕ w_2.]
