

SLIDE 1

Adaptive Coding for Two-Way Lossy Source-Channel Communication

Jian-Jia Weng, Fady Alajaji, and Tamás Linder

Department of Mathematics and Statistics Queen’s University, Kingston, Canada

IEEE International Symposium on Information Theory, June 2020

SLIDE 2

Two-Way Communication Channel [Shannon’61]

  • Shannon’s two-way channel (TWC) provides in-band, full-duplex data transfer between two terminals

[Diagram: terminals T1 and T2 connected through a TWC]

  • Discrete memoryless two-way channel (DM-TWC):
  • Channel inputs and outputs: Xj ∈ Xj and Yj ∈ Yj, j = 1, 2
  • Channel transition probability: PY1,Y2|X1,X2
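To make the definition concrete, here is a minimal Python sketch (illustrative, not from the slides) of one use of a DM-TWC given a finite representation of PY1,Y2|X1,X2; the dictionary encoding and the BSC-pair example channel are assumptions for illustration only.

```python
import random

def dm_twc_step(p_y1y2_given_x1x2, x1, x2, rng=random):
    """Sample one use of a discrete memoryless two-way channel.

    p_y1y2_given_x1x2: dict mapping (x1, x2) -> {(y1, y2): probability},
    a finite table for the transition law P_{Y1,Y2|X1,X2}.
    Returns one output pair (y1, y2) seen by terminals T1 and T2.
    """
    dist = p_y1y2_given_x1x2[(x1, x2)]
    outcomes = list(dist.keys())
    probs = list(dist.values())
    return rng.choices(outcomes, weights=probs, k=1)[0]

# Illustrative channel: each terminal sees the other's bit through a BSC(0.1)
eps = 0.1
P = {(x1, x2): {(y1, y2): (eps if y1 != x2 else 1 - eps)
                          * (eps if y2 != x1 else 1 - eps)
                for y1 in (0, 1) for y2 in (0, 1)}
     for x1 in (0, 1) for x2 in (0, 1)}
y1, y2 = dm_twc_step(P, x1=0, x2=1)
```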

SLIDE 3

The Capacity Region C of DM-TWCs

  • The region C can be fully characterized using limiting expressions [Shannon, 1961], [Kramer, 1998], but these are often incomputable

  • In general, it is only known that CI ⊆ C ⊆ CO:

[Plot: the (R1, R2)-plane with the inner bound CI contained in the outer bound CO]

SLIDE 4

Some Results on the Capacity Region of DM-TWCs

  • Inner bounds CI:
  • Shannon (1961): general TWCs
  • Schalkwijk (1982, 1983): binary multiplying TWC
  • Han (1984): general TWCs
  • Sabag and Permuter (2018): common-output TWCs
  • Outer bounds CO:
  • Shannon (1961): general TWCs
  • Zhang, Berger, and Schalkwijk (1986): general TWCs
  • Hekstra and Willems (1989): common-output TWCs
  • Tightness conditions for Shannon’s inner bound: [Shannon, 1961], [Hekstra et al., 1989], [Varshney, 2013], [Chaaban et al., 2017], [Weng et al., 2019]

SLIDE 5

Lossy Transmission of Correlated Sources over DM-TWCs

[System diagram: terminal T1 observes S1^K and terminal T2 observes S2^K; they send X1^N and X2^N over the DM-TWC PY1,Y2|X1,X2, receive Y1^N and Y2^N, and output the reconstructions Ŝ1^K and Ŝ2^K]

  • Correlated sources:
  • K: block length of the source messages
  • {(S1,k, S2,k)} is a memoryless stationary process with Sj,k ∈ Sj for finite source alphabets Sj, j = 1, 2
  • Reconstruction and average distortion:
  • Ŝj,k ∈ Ŝj: the reconstruction of Sj,k
  • Dj ≜ K⁻¹ ∑_{k=1}^{K} E[dj(Sj,k, Ŝj,k)], where dj is a single-letter distortion measure
  • Overall rate: K/N (source symbols per channel use), where N is the total number of channel uses for the overall transmission
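The distortion and rate definitions above can be computed directly; the following sketch (function names are illustrative assumptions, not from the paper) evaluates the empirical per-letter distortion Dj and the overall rate K/N for a toy block.

```python
def average_distortion(source_block, recon_block, d):
    """Empirical D_j = K^{-1} * sum_{k=1}^{K} d(S_{j,k}, Shat_{j,k})."""
    K = len(source_block)
    return sum(d(s, s_hat) for s, s_hat in zip(source_block, recon_block)) / K

def hamming(s, s_hat):
    """Hamming (0-1) distortion, one common single-letter measure."""
    return 0 if s == s_hat else 1

# K = 4 source symbols, reconstructed with one error, over N = 5 channel uses
D = average_distortion([0, 1, 1, 0], [0, 1, 0, 0], hamming)  # 1/4
rate = 4 / 5                                                 # K / N
```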

SLIDE 6

Joint Source-Channel Codes

  • Adaptive encoding: for j = 1, 2 and 1 ≤ n ≤ N,
  • fj = (fj,1, fj,2, . . . , fj,N)
  • Xj,n = fj,n(Sj^K, Yj^(n−1)), where Sj^K = (Sj,1, Sj,2, . . . , Sj,K)

  • Decoding with side information: for j, j′ = 1, 2 with j ≠ j′ and 1 ≤ k ≤ K,
  • gj = (gj,1, gj,2, . . . , gj,K)
  • Ŝj′,k = gj,k(Sj^K, Yj^N)

[Diagram: each terminal Tj runs the encoder fj and the decoder gj, using its own source Sj^K to produce the reconstruction Ŝj′^K of the other terminal’s source]
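The causal structure Xj,n = fj,n(Sj^K, Yj^(n−1)) can be sketched as a simulation loop; this is an illustrative sketch under assumed names (adaptive_transmit is not from the paper), showing that each input may depend on the terminal's own past channel outputs.

```python
def adaptive_transmit(f1, f2, channel, s1_block, s2_block, N):
    """Run N channel uses with adaptive encoders.

    f1, f2: length-N lists of functions f_{j,n}(source_block, past_outputs).
    channel: function (x1, x2) -> (y1, y2), one memoryless channel use.
    Returns the output sequences Y_1^N and Y_2^N.
    """
    y1_hist, y2_hist = [], []
    for n in range(N):
        x1 = f1[n](s1_block, tuple(y1_hist))  # X_{1,n} = f_{1,n}(S_1^K, Y_1^(n-1))
        x2 = f2[n](s2_block, tuple(y2_hist))  # X_{2,n} = f_{2,n}(S_2^K, Y_2^(n-1))
        y1, y2 = channel(x1, x2)
        y1_hist.append(y1)
        y2_hist.append(y2)
    return y1_hist, y2_hist

# Noiseless "swap" channel: each terminal observes the other's current input
f1 = [lambda s, past: s[0]] * 3
f2 = [lambda s, past: s[0]] * 3
Y1, Y2 = adaptive_transmit(f1, f2, lambda a, b: (b, a), [1], [0], 3)
```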

SLIDE 7

Our Research Problem

For a pair of correlated sources and a given DM-TWC, we seek a forward achievability joint source-channel coding (JSCC) theorem for transmissibility under the fidelity constraints

E[d1(S1^K, Ŝ1^K)] ≤ D1 and E[d2(S2^K, Ŝ2^K)] ≤ D2

[System diagram: T1 and T2 exchange X1^N and X2^N over the DM-TWC, observe Y1^N and Y2^N, and form the reconstructions Ŝ2^K and Ŝ1^K]

SLIDE 8

Related Work

  • Correlation-preserving coding scheme for almost lossless transmission of correlated sources [Gündüz et al., 2009]

  • Two-way lossy transmission of correlated sources [Weng et al., 2017, 2019]
  • separate source-channel coding (SSCC) scheme that combines Wyner-Ziv (WZ) source coding and Shannon’s channel coding

  • two-way hybrid analog/digital coding scheme
  • Two-way interactive lossy transmission of correlated sources:
  • noiseless TWCs [Kaspi, 1985]
  • orthogonal one-way noisy channels [Maor and Merhav, 2008]

SLIDE 9

Contributions

  • We propose an adaptive coding scheme and use it to prove a forward JSCC theorem
  • We show that the proposed scheme strictly generalizes prior results
  • Our scheme also yields a simple SSCC scheme that combines Wyner-Ziv (WZ) source coding and Han’s adaptive channel coding

SLIDE 10

Main Idea

We couple the two terminals’ encoding and transmission processes through a stationary Markov chain

[Diagram: F1 and F2 map the coded-channel inputs (S1, U1, S̃1, Ũ1, W̃1) and (S2, U2, S̃2, Ũ2, W̃2) to the physical inputs X1, X2 of the DM-TWC, which returns Y1, Y2; together they form the two-way coded channel, driven by the Markov transmission process]
SLIDE 11

Auxiliary Coded Two-Way Channels

  • The function Fj transforms the inputs of the coded channel into physical channel inputs
  • Sj and Uj: the current source message and its coded data
  • S̃j and Ũj: some prior source message and its coded data
  • W̃j: some prior channel inputs and outputs

  • Joint input distribution of the coded channel:

PS1,S2,U1,U2,S̃1,S̃2,Ũ1,Ũ2,W̃1,W̃2 = PS1,S2 PU1|S1 PU2|S2 PS̃1,S̃2,Ũ1,Ũ2,W̃1,W̃2

[Coded two-way channel diagram, as on Slide 10]

SLIDE 12

Markov Transmission Process - State Space

  • A time-homogeneous Markov chain {Z(t)} is constructed with state space

S1 × S2 × U1 × U2 × S̃1 × S̃2 × Ũ1 × Ũ2 × W̃1 × W̃2 × X1 × X2 × Y1 × Y2

  • For all t, (S1^(t), S2^(t), U1^(t), U2^(t)) is independent of (S̃1^(t), S̃2^(t), Ũ1^(t), Ũ2^(t), W̃1^(t), W̃2^(t))
  • For t ≥ 2 and j = 1, 2, we set

S̃j^(t) = Sj^(t−1), Ũj^(t) = Uj^(t−1), and W̃j^(t) = (Xj^(t−1), Yj^(t−1))

[Coded two-way channel diagram, as on Slide 10]

SLIDE 13

Markov Transmission Process - Transition Kernel

  • For t ≥ 2, the transition kernel of {Z(t)} is given by

PZ(t)|Z(t−1)(s1, s2, u1, u2, s̃1, s̃2, ũ1, ũ2, w̃1, w̃2, x1, x2, y1, y2 | s′1, s′2, u′1, u′2, s̃′1, s̃′2, ũ′1, ũ′2, w̃′1, w̃′2, x′1, x′2, y′1, y′2)
= PS1,S2(s1, s2) PU1|S1(u1|s1) PU2|S2(u2|s2)
· ✶{s̃1 = s′1} ✶{s̃2 = s′2} ✶{ũ1 = u′1} ✶{ũ2 = u′2} ✶{w̃1 = (x′1, y′1)} ✶{w̃2 = (x′2, y′2)}
· ✶{x1 = F1(s1, u1, s̃1, ũ1, w̃1)} ✶{x2 = F2(s2, u2, s̃2, ũ2, w̃2)}
· PY1,Y2|X1,X2(y1, y2 | x1, x2),

where ✶{·} denotes the indicator function

  • Parameters: F1, F2, PU1|S1, PU2|S2, the initial distribution of (S̃1^(1), S̃2^(1), Ũ1^(1), Ũ2^(1)), and the initial conditional distribution of (W̃1^(1), W̃2^(1)) given (S̃1^(1), S̃2^(1), Ũ1^(1), Ũ2^(1))
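One step of this kernel factorizes into three moves: draw fresh (S1, S2, U1, U2), copy the previous block into the tilde variables (the indicator terms), then apply F1, F2 and use the channel once. A toy sketch of that factorization (all names here are illustrative assumptions, not from the paper):

```python
def markov_step(prev, draw_sources, F1, F2, channel):
    """One transition Z^(t-1) -> Z^(t) of the Markov transmission process.

    prev: dict with the previous block's 's1','s2','u1','u2','x1','x2','y1','y2'.
    draw_sources: () -> fresh (s1, s2, u1, u2) ~ P_{S1,S2} P_{U1|S1} P_{U2|S2}.
    F1, F2: coded-channel maps (s, u, s_tilde, u_tilde, w_tilde) -> channel input.
    channel: (x1, x2) -> (y1, y2).
    """
    s1, s2, u1, u2 = draw_sources()
    # Tilde variables are deterministic copies of the previous block
    # (the indicator terms of the kernel).
    w1t = (prev['x1'], prev['y1'])
    w2t = (prev['x2'], prev['y2'])
    x1 = F1(s1, u1, prev['s1'], prev['u1'], w1t)
    x2 = F2(s2, u2, prev['s2'], prev['u2'], w2t)
    y1, y2 = channel(x1, x2)
    return {'s1': s1, 's2': s2, 'u1': u1, 'u2': u2,
            'x1': x1, 'x2': x2, 'y1': y1, 'y2': y2}

# Toy deterministic instance: send the fresh source bit over a "swap" channel
z0 = {'s1': 0, 's2': 0, 'u1': 0, 'u2': 0, 'x1': 0, 'x2': 0, 'y1': 0, 'y2': 0}
F = lambda s, u, st, ut, wt: s
z1 = markov_step(z0, lambda: (1, 0, 1, 0), F, F, lambda a, b: (b, a))
```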

SLIDE 14

Markov Transmission Process - Stationary Configuration

  • To obtain time-invariant achievability conditions, we only consider a stationary chain, in which the distribution of (S̃1^(t), S̃2^(t), Ũ1^(t), Ũ2^(t)) equals PS1,S2 PU1|S1 PU2|S2 for all t
  • For given Fj and PUj|Sj, j = 1, 2, we find an appropriate conditional distribution of (W̃1^(1), W̃2^(1)) given (S̃1^(1), S̃2^(1), Ũ1^(1), Ũ2^(1))
  • For source reconstruction, we also need the decoding functions

Gj : Ũj′ × Sj × Uj × S̃j × Ũj × W̃j × Yj → Ŝj′

  • Stationary configuration: {PU1|S1, PU2|S2, PS̃1,S̃2,Ũ1,Ũ2, PW̃1,W̃2|S̃1,S̃2,Ũ1,Ũ2, F1, F2, G1, G2}
  • ΠZ(D1, D2): the set of all stationary configurations with E[dj(S̃j, ˆS̃j)] ≤ Dj, j = 1, 2

SLIDE 15

Adaptive Joint Source-Channel Coding

Theorem. A distortion pair (D1, D2) is achievable for the rate-one lossy transmission of correlated sources over a DM-TWC if there exists a stationary configuration in ΠZ(D1, D2) such that

I(S̃1; Ũ1) < I(Ũ1; S2, U2, S̃2, Ũ2, W̃2, X2, Y2),
I(S̃2; Ũ2) < I(Ũ2; S1, U1, S̃1, Ũ1, W̃1, X1, Y1).
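Checking the theorem's conditions for a candidate configuration reduces to evaluating mutual informations of finite-alphabet variables. A generic helper for that computation (an illustrative sketch, not tied to any particular configuration from the paper):

```python
from math import log2

def mutual_information(joint):
    """I(A; B) in bits for a joint pmf given as {(a, b): probability}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

# I(S; U) = 1 bit when U = S for a uniform bit; 0 bits when independent
dependent = {(0, 0): 0.5, (1, 1): 0.5}
independent = {(a, b): 0.25 for a in (0, 1) for b in (0, 1)}
```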

SLIDE 16

Special Cases

  • Uncoded transmission scheme
  • Correlation-preserving coding scheme
  • An SSCC scheme based on WZ source coding and Shannon’s channel coding
  • Two-way hybrid analog/digital coding
  • An SSCC scheme based on WZ source coding and Han’s adaptive channel coding

SLIDE 17

Coding Scheme used in the Proof

  • Block-wise encoder structure (with three coding components: superposition coding, adaptive channel coding, and hybrid analog/digital coding):

[Block diagram: source blocks Sj^(1), Sj^(2), . . . , Sj^(B) are mapped to coded data Uj^(1), . . . , Uj^(B) and channel inputs Xj^(1), . . . , Xj^(B+1), with corresponding channel outputs Yj^(1), . . . , Yj^(B+1)]

  • Rate: B/(B+1), which approaches 1 as B → ∞

  • Sliding-window decoder with a window size of two blocks

SLIDE 18

A Simple Example

  • Independent binary uniform sources S1 and S2
  • Dueck’s DM-TWC model: Xj = {0, 1}², Yj = {0, 1}³ for j = 1, 2
  • Channel input: Xj = (Xj,1, Xj,2)
  • Channel output: Y1 = (X1,1 · X2,1, X2,2 ⊕ N1, N2) and Y2 = (X1,1 · X2,1, X1,2 ⊕ N2, N1), where ⊕ is binary (mod-2) addition
  • N1 and N2 are correlated with joint distribution PN1,N2(0, 0) = 0 and PN1,N2(n1, n2) = 1/3 for (n1, n2) ≠ (0, 0); they are also independent of the Sj’s and Xj’s
  • Hamming distortion measure
  • Let K = 1 and choose B large enough
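This channel can be simulated directly. The encoding below is an illustrative sketch (bit-tuples for inputs, `&` for the binary product, `^` for ⊕); the function name is an assumption, not from the paper.

```python
import random

def dueck_twc(x1, x2, rng=random):
    """One use of Dueck's DM-TWC with inputs x1 = (x11, x12), x2 = (x21, x22).

    (N1, N2) is uniform over {(0,1), (1,0), (1,1)}, i.e. P_{N1,N2}(0,0) = 0,
    and independent of the inputs.
    """
    n1, n2 = rng.choice([(0, 1), (1, 0), (1, 1)])
    y1 = (x1[0] & x2[0], x2[1] ^ n1, n2)  # Y1 = (X11*X21, X22 + N1, N2)
    y2 = (x1[0] & x2[0], x1[1] ^ n2, n1)  # Y2 = (X11*X21, X12 + N2, N1)
    return y1, y2

y1, y2 = dueck_twc((1, 0), (1, 1), rng=random.Random(0))
```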

SLIDE 19

Adaptive Coding with the Sliding-Window Decoder

  • It can be shown that no non-adaptive coding scheme can achieve (D1, D2) = (0, 0)
  • However, our coding scheme provides error-free transmission

Encoder (bth and (b + 1)st blocks):

  • X1,2^(b) = S1^(b−1), so Y2,2^(b) = X1,2^(b) ⊕ N2^(b) = S1^(b−1) ⊕ N2^(b)
  • X1,1^(b+1) = Y1,3^(b) = N2^(b) and X2,1^(b+1) = Y2,3^(b) = N1^(b)
  • Hence Y2,1^(b+1) = X1,1^(b+1) · X2,1^(b+1) = N2^(b) · N1^(b)

Noise decoder at Terminal 2: from Y2,3^(b) = N1^(b) and Y2,1^(b+1) = N1^(b) · N2^(b), Terminal 2 recovers N2^(b) (possible since PN1,N2(0, 0) = 0)

Sliding-window decoder at Terminal 2: Ŝ1^(b−1) = Y2,2^(b) ⊕ N2^(b) = S1^(b−1)

Error-free reconstruction!
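The decoding step hinges on PN1,N2(0, 0) = 0: Terminal 2 knows N1^(b) and the product N1^(b) · N2^(b), which pins down N2^(b). A sketch of this logic with an exhaustive check over all admissible noise pairs (function name assumed for illustration):

```python
def recover_n2(n1, n1_times_n2):
    """Recover N2 from N1 and N1*N2, valid because P_{N1,N2}(0,0) = 0."""
    if n1 == 1:
        return n1_times_n2   # the product reveals N2 directly
    return 1                 # N1 = 0 forces N2 = 1

# Decoding S1 is error-free for every admissible noise pair
for n1, n2 in [(0, 1), (1, 0), (1, 1)]:
    for s1 in (0, 1):
        y22 = s1 ^ n2         # Y_{2,2}^(b)   = S_1^(b-1) xor N_2^(b)
        y21_next = n1 & n2    # Y_{2,1}^(b+1) = N_1^(b) * N_2^(b)
        assert (y22 ^ recover_n2(n1, y21_next)) == s1
```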

SLIDE 20

Conclusion

  • We demonstrated a way to coordinate the terminals’ independent transmissions
  • Our coding scheme gives rise to various systems with differing complexity and performance trade-offs

  • Future directions include
  • refining our achievability result
  • designing coding components that enable adaptive source compression
