

  1. Adaptive Coding for Two-Way Lossy Source-Channel Communication
Jian-Jia Weng, Fady Alajaji, and Tamás Linder
Department of Mathematics and Statistics, Queen's University, Kingston, Canada
IEEE International Symposium on Information Theory, June 2020

  2. Two-Way Communication Channel [Shannon '61]
• Shannon's two-way channel (TWC) provides in-band full-duplex data transfer between two terminals T1 and T2
• Discrete memoryless two-way channel (DM-TWC):
  • Channel inputs and outputs: X_j ∈ 𝒳_j and Y_j ∈ 𝒴_j, j = 1, 2
  • Channel transition probability: P_{Y1,Y2|X1,X2}
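As a minimal illustration (not from the talk), a DM-TWC is fully specified by the transition probability P_{Y1,Y2|X1,X2}. The sketch below stores a hypothetical binary DM-TWC as a 4-D probability tensor and samples one simultaneous channel use; all numbers are randomly generated placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_twc(nx1=2, nx2=2, ny1=2, ny2=2):
    """Return a random transition tensor P[x1, x2, y1, y2] that sums to 1
    over (y1, y2) for every input pair (x1, x2)."""
    P = rng.random((nx1, nx2, ny1, ny2))
    P /= P.sum(axis=(2, 3), keepdims=True)
    return P

def transmit(P, x1, x2):
    """One use of the two-way channel: both terminals send simultaneously
    and each receives its own (generally noisy) output."""
    flat = P[x1, x2].ravel()
    idx = rng.choice(flat.size, p=flat)
    y1, y2 = np.unravel_index(idx, P.shape[2:])
    return int(y1), int(y2)

P = make_twc()
y1, y2 = transmit(P, 0, 1)
```

The key structural point is that a single conditional law couples both directions of communication: neither terminal's output can be sampled without knowing both inputs.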

  3. The Capacity Region C of DM-TWCs
• The region C can be fully characterized using limiting expressions [Shannon, 1961], [Kramer, 1998], but these are often incomputable
• In general, it is only known that C_I ⊆ C ⊆ C_O
  [Figure: inner bound C_I and outer bound C_O in the (R_1, R_2) rate plane]

  4. Some Results on the Capacity Region of DM-TWCs
• Inner bounds C_I:
  • Shannon (1961): general TWCs
  • Schalkwijk (1982, 1983): binary multiplying TWC
  • Han (1984): general TWCs
  • Sabag and Permuter (2018): common-output TWCs
• Outer bounds C_O:
  • Shannon (1961): general TWCs
  • Zhang, Berger, and Schalkwijk (1986): general TWCs
  • Hekstra and Willems (1989): common-output TWCs
• Tightness conditions for Shannon's inner bound: [Shannon, 1961], [Hekstra et al., 1989], [Varshney, 2013], [Chaaban et al., 2017], [Weng et al., 2019]

  5. Lossy Transmission of Correlated Sources over DM-TWCs
  [Figure: terminals T1 and T2 exchange source blocks S_1^K and S_2^K over the DM-TWC P_{Y1,Y2|X1,X2}, producing reconstructions Ŝ_2^K at T1 and Ŝ_1^K at T2]
• Correlated sources:
  • K: block length of the source messages
  • {(S_{1,k}, S_{2,k})} is a memoryless stationary process with S_{j,k} ∈ 𝒮_j for finite source alphabets 𝒮_j, j = 1, 2
• Reconstruction and average distortion:
  • Ŝ_{j,k} ∈ 𝒮̂_j: the reconstruction of S_{j,k}
  • D_j ≜ K⁻¹ Σ_{k=1}^K E[d_j(S_{j,k}, Ŝ_{j,k})], where d_j is a single-letter distortion measure
• Overall rate: K/N (source symbols/channel use), where N is the total number of channel uses for the overall transmission
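The average distortion D_j above is a per-symbol expectation over the block; a minimal empirical version, with Hamming distortion and invented toy blocks purely for illustration, looks like:

```python
# Empirical average distortion D_j = (1/K) * sum_k d_j(s_{j,k}, shat_{j,k})
# over one realized source block and its reconstruction.
def avg_distortion(s_block, shat_block, d):
    K = len(s_block)
    return sum(d(s, sh) for s, sh in zip(s_block, shat_block)) / K

# Hamming distortion: 1 per mismatched symbol.
hamming = lambda a, b: 0 if a == b else 1

# Toy block of K = 4 symbols with exactly one reconstruction error.
D = avg_distortion([0, 1, 1, 0], [0, 1, 0, 0], hamming)  # -> 0.25
```

In the actual setting D_j is an expectation over the source and channel randomness; averaging many such realizations estimates it.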

  6. Joint Source-Channel Codes
• Adaptive encoding: for j = 1, 2 and 1 ≤ n ≤ N,
  • f_j = (f_{j,1}, f_{j,2}, ..., f_{j,N})
  • X_{j,n} = f_{j,n}(S_j^K, Y_j^{n−1}), where S_j^K = (S_{j,1}, S_{j,2}, ..., S_{j,K})
• Decoding with side information: for j, j' = 1, 2 with j ≠ j' and 1 ≤ k ≤ K,
  • g_j = (g_{j,1}, g_{j,2}, ..., g_{j,K})
  • Ŝ_{j',k} = g_{j,k}(S_j^K, Y_j^N)
  [Figure: encoders f_1, f_2 and decoders g_1, g_2 connected through the DM-TWC]
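The adaptive encoder X_{j,n} = f_{j,n}(S_j^K, Y_j^{n−1}) conditions each channel input on the whole source block and on all past channel outputs. A toy sketch of this interface; the rule `toy_f` is an invented illustration, not the paper's encoder:

```python
def adaptive_encoder(f, s_block, past_outputs):
    """Compute X_{j,n} = f_{j,n}(S_j^K, Y_j^{n-1}); the time index n is
    implied by how many outputs have been observed so far."""
    n = len(past_outputs) + 1  # current channel use (unused by toy_f)
    return f(n, tuple(s_block), tuple(past_outputs))

def toy_f(n, s_block, past_y):
    """Hypothetical rule: block parity XORed with the last received symbol,
    so the next input genuinely adapts to the channel feedback."""
    parity = sum(s_block) % 2
    last_y = past_y[-1] if past_y else 0
    return parity ^ last_y

x = adaptive_encoder(toy_f, [1, 0, 1], [0, 1])  # parity 0, last output 1 -> 1
```

The point of the signature, rather than the toy rule, is that adaptation makes the channel inputs statistically dependent across the two terminals, which is what makes TWC coding harder than two one-way problems.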

  7. Our Research Problem
For a pair of correlated sources and a given DM-TWC, we seek a forward achievability joint source-channel coding (JSCC) theorem for transmissibility under the fidelity constraints E[d_1(S_1^K, Ŝ_1^K)] ≤ D_1 and E[d_2(S_2^K, Ŝ_2^K)] ≤ D_2
  [Figure: T1 and T2 exchange S_1^K and S_2^K over the DM-TWC, each reconstructing the other's source under its distortion constraint]

  8. Related Work
• Correlation-preserving coding scheme for almost lossless transmission of correlated sources [Gündüz et al., 2009]
• Two-way lossy transmission of correlated sources [Weng et al., 2017, 2019]:
  • separate source-channel coding (SSCC) scheme that combines Wyner-Ziv (WZ) source coding and Shannon's channel coding
  • two-way hybrid analog/digital coding scheme
• Two-way interactive lossy transmission of correlated sources:
  • noiseless TWCs [Kaspi, 1985]
  • orthogonal one-way noisy channels [Maor and Merhav, 2008]

  9. Contributions
• We propose an adaptive coding scheme and use it to prove a forward JSCC theorem
• We show that the proposed scheme strictly generalizes prior results
• Our scheme also yields a simple SSCC scheme that combines Wyner-Ziv (WZ) source coding and Han's adaptive channel coding

  10. Main Idea
We couple the two terminals' encoding and transmission processes through a stationary Markov chain
  [Figure: the two-way coded channel takes inputs (S_1, U_1, S̃_1, Ũ_1, W̃_1) and (S_2, U_2, S̃_2, Ũ_2, W̃_2), maps them via F_1 and F_2 to the channel inputs X_1 and X_2 of the DM-TWC with outputs Y_1 and Y_2; this loop forms the Markov transmission process]

  11. Auxiliary Coded Two-Way Channels
• Function F_j transforms the inputs of the coded channel into physical channel inputs
• S_j and U_j: current source message and its coded data
• S̃_j and Ũ_j: some prior source message and its coded data
• W̃_j: some prior channel inputs and outputs
• Joint input distribution of the coded channel:
  P_{S1,S2,U1,U2,S̃1,S̃2,Ũ1,Ũ2,W̃1,W̃2} = P_{S1,S2} P_{U1|S1} P_{U2|S2} P_{S̃1,S̃2,Ũ1,Ũ2,W̃1,W̃2}

  12. Markov Transmission Process - State Space
• A time-homogeneous Markov chain {Z^(t)} is constructed with state space
  𝒮_1 × 𝒮_2 × 𝒰_1 × 𝒰_2 × S̃_1 × S̃_2 × Ũ_1 × Ũ_2 × W̃_1 × W̃_2 × 𝒳_1 × 𝒳_2 × 𝒴_1 × 𝒴_2
• For all t, (S_1^(t), S_2^(t), U_1^(t), U_2^(t)) is independent of (S̃_1^(t), S̃_2^(t), Ũ_1^(t), Ũ_2^(t), W̃_1^(t), W̃_2^(t))
• For t ≥ 2 and j = 1, 2, we set S̃_j^(t) = S_j^(t−1), Ũ_j^(t) = U_j^(t−1), and W̃_j^(t) = (X_j^(t−1), Y_j^(t−1))

  13. Markov Transmission Process - Transition Kernel
• For t ≥ 2, the transition kernel of {Z^(t)} is given by
  P_{Z^(t)|Z^(t−1)}(s_1, s_2, u_1, u_2, s̃_1, s̃_2, ũ_1, ũ_2, w̃_1, w̃_2, x_1, x_2, y_1, y_2 | s'_1, s'_2, u'_1, u'_2, s̃'_1, s̃'_2, ũ'_1, ũ'_2, w̃'_1, w̃'_2, x'_1, x'_2, y'_1, y'_2)
  = P_{S1,S2}(s_1, s_2) P_{U1|S1}(u_1 | s_1) P_{U2|S2}(u_2 | s_2)
  · 𝟙{s̃_1 = s'_1} 𝟙{s̃_2 = s'_2} 𝟙{ũ_1 = u'_1} 𝟙{ũ_2 = u'_2} 𝟙{w̃_1 = (x'_1, y'_1)} 𝟙{w̃_2 = (x'_2, y'_2)}
  · 𝟙{x_1 = F_1(s_1, u_1, s̃_1, ũ_1, w̃_1)} 𝟙{x_2 = F_2(s_2, u_2, s̃_2, ũ_2, w̃_2)}
  · P_{Y1,Y2|X1,X2}(y_1, y_2 | x_1, x_2),
  where 𝟙{·} denotes the indicator function
• Parameters: F_1, F_2, P_{U1|S1}, P_{U2|S2}, P_{S̃1^(1),S̃2^(1),Ũ1^(1),Ũ2^(1)}, and P_{W̃1^(1),W̃2^(1) | S̃1^(1),S̃2^(1),Ũ1^(1),Ũ2^(1)}
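One step of this kernel can be sketched as: draw fresh (S_j, U_j), copy last round's variables into the tilded slots, apply F_j deterministically, then sample the channel. In the toy instance below, all distributions, the maps F_j, and the noiseless "swap" channel are illustrative assumptions, not the paper's construction:

```python
import random
random.seed(1)

def step(prev, P_S, P_U_given_S, F1, F2, channel):
    """Advance the Markov transmission process by one round.
    `prev` holds last round's (s, u, x, y) pairs for terminals 1 and 2."""
    s1, s2 = P_S()                              # fresh source pair
    u1, u2 = P_U_given_S(s1), P_U_given_S(s2)   # coded data for each source
    st = {"s": (s1, s2), "u": (u1, u2),
          # tilded variables copy the previous round, as in the kernel
          "s_t": prev["s"], "u_t": prev["u"],
          "w_t": ((prev["x"][0], prev["y"][0]), (prev["x"][1], prev["y"][1]))}
    x1 = F1(s1, u1, st["s_t"][0], st["u_t"][0], st["w_t"][0])
    x2 = F2(s2, u2, st["s_t"][1], st["u_t"][1], st["w_t"][1])
    y1, y2 = channel(x1, x2)
    st["x"], st["y"] = (x1, x2), (y1, y2)
    return st

P_S = lambda: (random.randint(0, 1), random.randint(0, 1))  # independent bits
P_U = lambda s: s ^ random.randint(0, 1)                    # noisy coding
F = lambda s, u, st, ut, wt: u           # toy map: just transmit the code
swap = lambda x1, x2: (x2, x1)           # noiseless: each side hears the other
state = {"s": (0, 0), "u": (0, 0), "x": (0, 0), "y": (0, 0)}
state = step(state, P_S, P_U, F, F, swap)
```

Iterating `step` makes the chain's time-homogeneity concrete: the same kernel is applied every round, with the previous round's realization entering only through the tilded coordinates.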

  14. Markov Transmission Process - Stationary Configuration
• To obtain time-invariant achievability conditions, we only consider stationary chains, in which P_{S̃1^(t),S̃2^(t),Ũ1^(t),Ũ2^(t)} = P_{S1,S2} P_{U1|S1} P_{U2|S2} for all t
• For given F_j and P_{Uj|Sj}, j = 1, 2, we find an appropriate P_{W̃1^(1),W̃2^(1) | S̃1^(1),S̃2^(1),Ũ1^(1),Ũ2^(1)}
• For source reconstruction, we also need to consider decoding functions G_j: Ũ_{j'} × S_j × U_j × S̃_j × Ũ_j × W̃_j × Y_j → Ŝ_{j'}
• Stationary configuration: {P_{U1|S1}, P_{U2|S2}, P_{W̃1,W̃2 | S̃1,S̃2,Ũ1,Ũ2}, F_1, F_2, G_1, G_2}
• Π_Z(D_1, D_2): the set of all stationary configurations with E[d_j(S̃_j, Ŝ̃_j)] ≤ D_j, j = 1, 2

  15. Adaptive Joint Source-Channel Coding Theorem
A distortion pair (D_1, D_2) is achievable for the rate-one lossy transmission of correlated sources over a DM-TWC if there exists a stationary configuration in Π_Z(D_1, D_2) such that
  I(S̃_1; Ũ_1) < I(Ũ_1; S_2, U_2, S̃_2, Ũ_2, W̃_2, X_2, Y_2),
  I(S̃_2; Ũ_2) < I(Ũ_2; S_1, U_1, S̃_1, Ũ_1, W̃_1, X_1, Y_1).
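Each condition compares a source-coding rate I(S̃_j; Ũ_j) against a channel-side mutual information. A small numerical sketch of such a comparison, using invented toy joint pmfs in place of the actual configuration-induced distributions:

```python
import numpy as np

def mutual_information(P_xy):
    """I(X;Y) in bits, computed from a joint pmf matrix P_xy."""
    Px = P_xy.sum(axis=1, keepdims=True)
    Py = P_xy.sum(axis=0, keepdims=True)
    mask = P_xy > 0  # skip zero-probability cells (0 log 0 = 0)
    return float((P_xy[mask] * np.log2(P_xy[mask] / (Px @ Py)[mask])).sum())

# Left side: toy joint pmf of (S~1, U~1), correlated but noisy, so
# I(S~1; U~1) is strictly between 0 and 1 bit.
P_src = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
# Right side: stand-in for the channel term, here a perfectly correlated
# binary pair giving exactly 1 bit.
P_chn = np.array([[0.5, 0.0],
                  [0.0, 0.5]])

lhs = mutual_information(P_src)
rhs = mutual_information(P_chn)
achievable = lhs < rhs  # theorem-style strict inequality holds here
```

In the theorem the right-hand side is a mutual information over the full coded-channel state, not a two-variable pmf; the sketch only shows the mechanics of evaluating and comparing the two quantities.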

  16. Special Cases
• Uncoded transmission scheme
• Correlation-preserving coding scheme
• An SSCC scheme based on WZ source coding and Shannon's channel coding
• Two-way hybrid analog/digital coding
• An SSCC scheme based on WZ source coding and Han's adaptive channel coding

  17. Coding Scheme Used in the Proof
• Block-wise encoder structure combining three coding components: hybrid analog/digital coding, superposition coding, and adaptive channel coding
  [Figure: source blocks S_j^(1), ..., S_j^(B) are coded into U_j^(1), ..., U_j^(B) and sent over B+1 channel blocks X_j^(1), ..., X_j^(B+1), with corresponding channel outputs Y_j^(1), ..., Y_j^(B+1)]
• Rate: B/(B+1), which approaches 1 as B → ∞
• Sliding-window decoder with a window size of two blocks

  18. A Simple Example
• Independent binary uniform sources S_1 and S_2
• Dueck's DM-TWC model: 𝒳_j = {0,1}², 𝒴_j = {0,1}³, j = 1, 2
• Channel input: X_j = (X_{j,1}, X_{j,2})
• Channel outputs: Y_1 = (X_{1,1} · X_{2,1}, X_{2,2} ⊕ N_1, N_2) and Y_2 = (X_{1,1} · X_{2,1}, X_{1,2} ⊕ N_2, N_1), where ⊕ is binary (mod-2) addition
• N_1 and N_2 are correlated, with joint distribution P_{N1,N2}(0, 0) = 0 and P_{N1,N2}(n_1, n_2) = 1/3 for (n_1, n_2) ≠ (0, 0); they are also independent of the S_j's and X_j's
• Hamming distortion measure
• Let K = 1 and choose B large enough
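This channel is easy to simulate: the sketch below draws the correlated noise pair uniformly from its three allowed values and forms both outputs exactly as defined on the slide.

```python
import random
random.seed(0)

def dueck_channel(x1, x2):
    """One use of Dueck's TWC. Inputs x1, x2 are bit pairs (X_{j,1}, X_{j,2});
    the noise (N1, N2) is uniform on {(0,1), (1,0), (1,1)} and never (0,0)."""
    n1, n2 = random.choice([(0, 1), (1, 0), (1, 1)])
    common = x1[0] * x2[0]                # AND of the first input bits
    y1 = (common, x2[1] ^ n1, n2)         # T1 sees T2's bit through N1, plus N2
    y2 = (common, x1[1] ^ n2, n1)         # T2 sees T1's bit through N2, plus N1
    return y1, y2

y1, y2 = dueck_channel((1, 0), (1, 1))
# Note the structure: each terminal observes the *other* noise bit, and since
# (N1, N2) = (0, 0) is impossible, observing a 0 there pins down its own noise.
```

For instance, if T1 observes N2 = 0, then N1 = 1 with certainty, so T1 can invert the flip on its received data bit; this correlation is what an adaptive scheme can exploit.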

