Capacity Region of the Gaussian Arbitrarily-Varying Broadcast Channel



  1. Capacity Region of the Gaussian Arbitrarily-Varying Broadcast Channel
     Fatemeh Hosseinigoki and Oliver Kosut, Arizona State University. ISIT 2020.

  2. Gaussian Arbitrarily-Varying Broadcast Channel
     - Transmitter power constraint: $\|X\|^2 \le nP$
     - Jammer power constraints: $\|S_j\|^2 \le n\Lambda_j$, $j = 1, 2$
     - We want to decode the messages no matter what the jammers do (a toy simulation of the model follows below).
     - Assumptions: oblivious adversary (it does not see the transmitted signal), but it knows the code.
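
As a concrete picture of the model in slide 2, here is a minimal simulation sketch (not the authors' code): the block length, powers, and the choice of Gaussian jamming sequences are illustrative assumptions; the model itself lets each jammer pick any sequence meeting its power constraint.

```python
import numpy as np

rng = np.random.default_rng(0)

n, P = 1000, 1.0                 # block length and transmitter power (illustrative)
Lam = {1: 0.5, 2: 0.8}           # jammer power limits Lambda_1, Lambda_2 (illustrative)
N = {1: 0.1, 2: 0.3}             # receiver noise variances N_1, N_2 (illustrative)

# Transmitter: any codeword with ||X||^2 <= nP.
X = rng.normal(size=n)
X *= np.sqrt(n * P) / np.linalg.norm(X)          # scale to meet the power constraint exactly

Y = {}
for j in (1, 2):
    # Jammer j: an arbitrary sequence with ||S_j||^2 <= n * Lambda_j
    # (Gaussian here only for illustration; the model does not require it).
    S = rng.normal(size=n)
    S *= np.sqrt(n * Lam[j]) / np.linalg.norm(S)
    V = rng.normal(scale=np.sqrt(N[j]), size=n)  # thermal noise at receiver j
    Y[j] = X + S + V                             # receiver j observes X + S_j + V_j

print({j: round(float(np.mean(Y[j] ** 2)), 3) for j in (1, 2)})  # empirical received powers
```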

  3. Gaussian Arbitrarily-Varying Channel
     - Transmitter power constraint: $\|X\|^2 \le nP$
     - Jammer power constraint: $\|S\|^2 \le n\Lambda$
     - [Csiszár-Narayan 1991]: the capacity is
       $$C = \begin{cases} C\!\left(\dfrac{P}{\Lambda + N}\right), & \Lambda < P \\ 0, & \Lambda \ge P, \end{cases} \qquad \text{where } C(x) = \tfrac{1}{2}\log(1 + x).$$
     - As long as the jammer's power is less than the legitimate transmitter's, the worst the jammer can do is send noise (see the code sketch below).
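
The Csiszár-Narayan formula translates directly into a few lines of code; a sketch follows (the function names are mine, and rates are in bits per channel use via log base 2).

```python
import math

def C(x: float) -> float:
    """C(x) = (1/2) * log2(1 + x), in bits per channel use."""
    return 0.5 * math.log2(1.0 + x)

def gavc_capacity(P: float, Lam: float, N: float) -> float:
    """Deterministic-code capacity of the Gaussian AVC [Csiszar-Narayan 1991]."""
    if Lam >= P:
        return 0.0                  # symmetrizable: capacity drops to zero
    return C(P / (Lam + N))

print(gavc_capacity(P=1.0, Lam=0.5, N=0.1))   # ~0.71 bits/use: jammer acts like extra noise
print(gavc_capacity(P=1.0, Lam=1.5, N=0.1))   # 0.0: jammer power exceeds transmitter power
```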

  4. Symmetrizability
     - If $\Lambda \ge P$, then the jammer can choose a false message $m'$ and transmit $S = X(m')$.
     - The receiver gets $Y = X(m) + X(m') + V$ and cannot tell which message is correct (toy illustration below).
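
A toy rendering of the symmetrizing attack (my own sketch, with a tiny made-up codebook): when $\Lambda \ge P$ the jammer, which knows the codebook but not the chosen message, simply transmits some other codeword, and the received signal is symmetric in the true and false messages.

```python
import numpy as np

rng = np.random.default_rng(1)
n, P, N = 64, 1.0, 0.1
M = 4                                     # tiny codebook, purely for illustration

# The jammer knows this codebook (it knows the code, not the chosen message).
codebook = rng.normal(scale=np.sqrt(P), size=(M, n))

m, m_false = 0, 3
X = codebook[m]                           # legitimate transmission of message m
S = codebook[m_false]                     # jammer spoofs codeword m_false (needs Lambda >= P)
V = rng.normal(scale=np.sqrt(N), size=n)  # receiver noise
Y = X + S + V

# Y contains codeword m plus codeword m_false plus noise; swapping the roles of m and
# m_false would give statistically the same output, so no decoder can pick the right one.
residual = Y - codebook[m] - codebook[m_false]
print(float(np.linalg.norm(residual) ** 2 / n))   # ~N: only the noise remains
```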

  5. Gaussian AVC with Common Randomness
     - [Hughes-Narayan 1987]: if the transmitter and receiver share common randomness (unknown to the adversary) of at least $O(\log n)$ bits, then
       $$C = C\!\left(\frac{P}{N + \Lambda}\right).$$
     - Symmetrizability is no longer a problem, because the two possibilities can be distinguished via the common randomness (numeric comparison below).
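
A quick numeric contrast between the deterministic-code and randomized-code capacities (a sketch with illustrative numbers, reusing the closed-form expressions above):

```python
import math

def C(x: float) -> float:
    return 0.5 * math.log2(1.0 + x)

P, N, Lam = 1.0, 0.1, 1.5          # illustrative values with Lambda >= P

det_capacity  = 0.0 if Lam >= P else C(P / (N + Lam))   # no common randomness
rand_capacity = C(P / (N + Lam))                        # with O(log n) bits of common randomness

print(det_capacity, round(rand_capacity, 3))  # 0.0 vs ~0.35 bits/use
```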

  6. Gaussian AVBC Capacity Region
     - Let $\mathcal{C}_{\text{no-adv}}(S_1, S_2)$ be the capacity region of the no-adversary Gaussian BC with SNRs $S_1, S_2$; i.e., if $S_1 \ge S_2$, then
       $$\mathcal{C}_{\text{no-adv}}(S_1, S_2) = \left\{ (R_1, R_2) :\ R_1 \le C(\alpha S_1),\ R_2 \le C\!\left(\frac{\bar{\alpha} S_2}{\alpha S_2 + 1}\right) \text{ for some } \alpha \in [0, 1] \right\}.$$
     - Theorem: $(R_1, R_2) \in \mathcal{C}_{\mathrm{GAVBC}}$ if and only if
       $$(R_1, R_2) \in \mathcal{C}_{\text{no-adv}}\!\left(\frac{P}{N_1 + \Lambda_1}, \frac{P}{N_2 + \Lambda_2}\right),$$
       with the additional constraints that $R_1 = 0$ if $\Lambda_1 \ge P$ and $R_2 = 0$ if $\Lambda_2 \ge P$ (a numeric membership check follows below).
     - The converse is straightforward: (a) the adversaries send Gaussian noise, (b) symmetrizability.
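
A numeric rendering of the theorem (my own sketch: the grid over alpha and the example parameters are arbitrary choices). A rate pair is tested against the no-adversary superposition region evaluated at the reduced SNRs $P/(N_j + \Lambda_j)$, plus the rule that $R_j$ must be zero whenever $\Lambda_j \ge P$.

```python
import math
import numpy as np

def C(x: float) -> float:
    return 0.5 * math.log2(1.0 + x)

def in_no_adv_region(R1, R2, S1, S2, grid=10001):
    """Check (R1, R2) against the superposition region of a Gaussian BC (sketch)."""
    if S1 < S2:                                  # relabel so receiver 1 is the stronger one
        R1, R2, S1, S2 = R2, R1, S2, S1
    for a in np.linspace(0.0, 1.0, grid):        # power split alpha
        if R1 <= C(a * S1) and R2 <= C((1 - a) * S2 / (a * S2 + 1)):
            return True
    return False

def in_gavbc_region(R1, R2, P, N1, N2, Lam1, Lam2):
    """Capacity region of the Gaussian AVBC, per the theorem on slide 6 (sketch)."""
    if (Lam1 >= P and R1 > 0) or (Lam2 >= P and R2 > 0):
        return False
    return in_no_adv_region(R1, R2, P / (N1 + Lam1), P / (N2 + Lam2))

print(in_gavbc_region(0.3, 0.1, P=1.0, N1=0.1, N2=0.3, Lam1=0.2, Lam2=0.4))  # True
print(in_gavbc_region(0.3, 0.1, P=1.0, N1=0.1, N2=0.3, Lam1=1.2, Lam2=0.4))  # False: Lambda_1 >= P
```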

  7. Gaussian AVBC with Common Randomness
     - It is easy to show that, with common randomness, the capacity region is $\mathcal{C}_{\text{no-adv}}\!\left(\frac{P}{N_1 + \Lambda_1}, \frac{P}{N_2 + \Lambda_2}\right)$.
     - Problem: how to send the common randomness...
       1. without the adversary preventing it,
       2. without conflicting with superposition coding.

  8. Idea that doesn't work #1
     - [Ahlswede 1978]: elimination of correlation; works for unconstrained discrete channels, e.g. [Jahn 1981].
     - Doesn't work in the power-constrained setting, because the adversary can inject far more power during the short common-randomness transmission (see the arithmetic sketch below).
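
A back-of-the-envelope version of why this fails here (my own illustrative numbers): if the common-randomness setup occupies only k out of n symbols, an adversary that stays silent elsewhere can spend its whole energy budget $n\Lambda$ on those k symbols.

```python
n, P, Lam = 1_000_000, 1.0, 0.5
k = 1_000                              # short common-randomness phase, k << n (illustrative)

per_symbol_jam_power = n * Lam / k     # adversary concentrates its entire budget on the CR phase
print(per_symbol_jam_power, ">>", P)   # 500.0 >> P: the CR phase is easily overwhelmed
```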

  9. Idea that doesn't work #2
     - Use the common message as common randomness (codebook sketch below).
     - Codebooks: $X_1(m_1, m_2) \sim \mathcal{N}(0, \alpha P)$ and $X_2(m_2) \sim \mathcal{N}(0, \bar{\alpha} P)$, with $X = X_1(m_1, m_2) + X_2(m_2)$.
     - Decoder 2 decodes $m_2$, treating $X_1$ as noise.
     - Decoder 1 decodes $m_2$, then decodes $m_1$ using $m_2$ as common randomness.
     - Problem: to avoid symmetrizability, we need $\Lambda_1 < \bar{\alpha} P$ and $\Lambda_2 < \bar{\alpha} P$.
     - This idea does work for the interference channel [1].
       [1] F. Hosseinigoki and O. Kosut, "The Gaussian Interference Channel in the Presence of a Malicious Jammer," Allerton, 2016.
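
A sketch of the superposition codebooks used in this (ultimately insufficient) scheme; the message-set sizes and the power split alpha are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(2)
n, P, alpha = 256, 1.0, 0.6
M1, M2 = 8, 4                       # private / common message set sizes (tiny, for illustration)

# X1(m1, m2) ~ N(0, alpha * P): one satellite codebook per common message m2.
X1 = rng.normal(scale=np.sqrt(alpha * P), size=(M2, M1, n))
# X2(m2) ~ N(0, (1 - alpha) * P): cloud centers.
X2 = rng.normal(scale=np.sqrt((1 - alpha) * P), size=(M2, n))

m1, m2 = 3, 1
X = X1[m2, m1] + X2[m2]             # transmitted word for the message pair (m1, m2)

# Decoder 2 decodes m2 while treating X1 as noise; decoder 1 decodes m2 first and then
# reuses it (the satellite-codebook index) as common randomness when decoding m1.
print(round(float(np.mean(X ** 2)), 3))   # average power close to P
```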

  10. Idea that works
     - Divide the length-$n$ block into $\sqrt{n}$ segments of length $\sqrt{n}$.
     - In each segment, transmit at either full power or zero (toy sketch below).
     - The main BC code is contained in the full-power segments.
     - The on/off sequence contains the common randomness.
     - Wait... can't the adversary mess up the on/off sequence?
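
A toy rendering of the segment structure (my own sketch): the block is split into sqrt(n) segments, each segment is either on or off, and the on/off pattern carries the shared randomness. Boosting the on-segment power by the inverse of the on fraction, so that the total energy still meets $\|X\|^2 \le nP$, is my assumption about what "full power" means here.

```python
import numpy as np

rng = np.random.default_rng(3)
n, P = 10_000, 1.0
L = int(np.sqrt(n))                    # number of segments = segment length = sqrt(n)

on = rng.integers(0, 2, size=L).astype(bool)   # on/off pattern: carries the common randomness
f = on.mean()                                  # fraction of "on" segments

X = np.zeros((L, L))                           # rows are segments
# The main BC codeword lives in the "on" segments, with power boosted by 1/f so that
# the overall energy budget n * P is still respected (assumption, see lead-in above).
X[on] = rng.normal(scale=np.sqrt(P / f), size=(int(on.sum()), L))

print(round(float(np.sum(X ** 2) / (n * P)), 3))   # ~1.0: power budget met on average
```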

  11. Common randomness coding
     - At the decoders, determine whether the received power in each segment is high or low:

       Transmitter power | Adversary power | Received power
       ------------------|-----------------|---------------
       P                 | anything        | high
       0                 | > P             | high
       0                 | < P             | low

     - This is essentially a binary "or" channel.
     - If $\Lambda < P$, the adversary can only send power $> P$ in a limited fraction of the segments.
     - It looks like a discrete AVC with a cost-constrained adversary (energy-detector sketch below).
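
A sketch of the per-segment energy detection and the induced "or" channel (the threshold, the adversary's per-symbol attack power, and the use of expected rather than sampled energies are my simplifications): an off segment flips to "high" only if the adversary spends more than P per symbol there, and with total budget $n\Lambda < nP$ it can afford that in less than a $\Lambda/P$ fraction of the $\sqrt{n}$ segments.

```python
import numpy as np

rng = np.random.default_rng(4)
n, P, Lam, N = 10_000, 1.0, 0.5, 0.05
L = int(np.sqrt(n))                               # sqrt(n) segments of length sqrt(n)

on = rng.integers(0, 2, size=L).astype(bool)      # transmitter's on/off pattern

# Adversary strategy: pour more than P per symbol into as many "off" segments as the
# energy budget n * Lambda allows (2P per symbol is an illustrative choice).
attack, budget = 2.0 * P, n * Lam
jam = np.zeros(L)
for seg in np.flatnonzero(~on):
    if budget < attack * L:
        break
    jam[seg] = attack
    budget -= attack * L

# Expected received energy per segment, and a detection threshold between 0 and P.
rx_energy = (on * P + jam + N) * L
detected_high = rx_energy > 0.5 * P * L

flipped = detected_high & ~on                     # "off" segments the adversary turned "high"
# Binary "or" channel: a segment reads high whenever transmitter OR jammer put power in it;
# the flipped fraction is bounded by Lambda / P of all segments.
print(float(flipped.mean()), "<=", Lam / P)       # 0.25 <= 0.5
```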
