On the Role of Interaction in Network Information Theory
Young-Han Kim, University of California, San Diego
Banff Workshop on Interactive Information Theory, January
Networked Information Processing System
Communication network
System: Internet, peer-to-peer network, sensor network, ...
Sources: Data, speech, music, images, video, sensor data
Nodes: Handsets, base stations, processors, servers, sensor nodes, ...
Network: Wired, wireless, or a hybrid of the two
Task: Communicate the sources, or compute/make decisions based on them
Young-Han Kim (UCSD), Role of Interaction in NIT, Banff, January
Network Information Theory
Communication network
Network information flow questions:
• What is the limit on the amount of communication needed?
• What are the coding schemes/techniques that achieve this limit?
Challenges:
• Many networks inherently allow for two-way interactions
• Most coding schemes are limited to one-way communication
Objectives of the Talk
Review coding schemes that utilize two-way interactions
Focus on the channel coding side of the story (given yesterday's talks)
Draw mostly from a few classical examples and open problems (El Gamal–K)
Discrete Memoryless Channel (DMC) with Feedback
[Diagram: M → Encoder → Xi → p(y|x) → Yi → Decoder → M̂, with feedback Y^{i−1} to the encoder]
Feedback does not increase the capacity of a DMC (Shannon):
C_FB = max_{p(x)} I(X; Y) = C
Nonetheless, feedback can help communication in several important ways
• Feedback can simplify coding and improve reliability (Schalkwijk–Kailath)
• Feedback can increase the capacity of channels with memory (Butman)
• Feedback can enlarge the capacity region of DM multiuser channels (Gaarder–Wolf)
Insights on the fundamental limit of two-way interactive communication
Iterative Refinement
Binary erasure channel: X → Y, each symbol passed with probability 1 − p and erased to e otherwise
Basic idea:
• First send a message at a rate higher than the channel capacity (without coding)
• Then iteratively refine the receiver's knowledge about the message
Examples:
• Schalkwijk–Kailath coding scheme
• Horstein's coding scheme
• Posterior matching scheme (Shayevitz–Feder)
• Block feedback coding scheme (Weldon, Ahlswede, Ooi–Wornell)
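For the BEC, the simplest instance of iterative refinement is uncoded transmission plus retransmission of erasures. A quick sanity check (my own illustrative script, not from the talk) confirms the empirical rate approaches the capacity 1 − p:

```python
import random

def bec_repetition_rate(p, num_bits=100_000, seed=1):
    """Send bits uncoded over a BEC(p); perfect feedback tells the encoder
    which bits were erased, and each bit is repeated until it gets through.
    Returns the empirical rate in bits per channel use."""
    rng = random.Random(seed)
    uses = 0
    for _ in range(num_bits):
        while True:
            uses += 1
            if rng.random() >= p:   # bit survives with probability 1 - p
                break
    return num_bits / uses

rate = bec_repetition_rate(p=0.3)
# The number of uses per bit is geometric with mean 1/(1 - p), so the rate
# concentrates around 1 - p = 0.7, the BEC capacity.
```

Feedback thus makes the most naive strategy capacity-achieving on this channel.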
Gaussian Channel with Feedback
[Diagram: Y = X + Z, with feedback Y^{i−1} to the encoder]
Expected average transmitted power constraint:
∑_{i=1}^n E(x_i^2(m, Y^{i−1})) ≤ nP,  m ∈ [1 : 2^{nR}]
Schalkwijk–Kailath coding scheme (Schalkwijk–Kailath, Schalkwijk):
X_1 ∝ θ,  X_i ∝ θ − θ̂_{i−1}(Y^{i−1})
Doubly exponentially small probability of error
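The error variance recursion behind this claim can be checked by simulation (my own sketch, assuming unit-variance noise and power P; parameter names are mine):

```python
import math
import random

def sk_error_variance(P=1.0, n=10, trials=20000, seed=0):
    """Monte Carlo check of the Schalkwijk-Kailath recursion on Y = X + Z,
    Z ~ N(0, 1): transmit the scaled estimation error of the message point
    theta; the MMSE error variance shrinks by a factor (1 + P) per use."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        theta = rng.gauss(0.0, 1.0)
        est, var = 0.0, 1.0
        for _ in range(n):
            x = math.sqrt(P / var) * (theta - est)     # power-P transmission
            y = x + rng.gauss(0.0, 1.0)
            est += math.sqrt(P * var) / (P + 1.0) * y  # linear MMSE update
            var /= P + 1.0                             # theoretical recursion
        total += (theta - est) ** 2
    return total / trials

emp = sk_error_variance()
# emp should track (1 + P)^(-n); with message points spaced 2^(-nR), R < C,
# this exponential variance decay yields the doubly exponential error bound.
```

The empirical variance matches the theoretical (1 + P)^(−n) closely, which is what drives the doubly exponential error decay.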
Posterior Matching Scheme (Shayevitz–Feder )
Recall the Schalkwijk–Kailath coding scheme:
X_1 ∝ Θ ∼ N(0, 1),  X_i ∝ Θ − Θ̂_{i−1}(Y^{i−1}) ∝ X_{i−1} − E(X_{i−1} | Y^{i−1}) ⊥ Y^{i−1}
• Y1, Y2, . . . are i.i.d.
Consider a general DMC p(y|x) with a capacity-achieving input pmf p(x):
X_1 = F_X^{−1}(F_Θ(Θ)),  Θ ∼ Unif[0, 1)
X_i = F_X^{−1}(F_{Θ|Y^{i−1}}(Θ | Y^{i−1})) ⊥ Y^{i−1}
• Y1, Y2, . . . are i.i.d.
Generalizes repetition for BEC, S–K for Gaussian, and Horstein for BSC
Actual proof involves properties of iterated random functions
Question: Elementary proof (say, for BSC)?
Block Feedback Coding Scheme
[Diagram: BSC(p): X → Y, equivalently Y = X ⊕ Z with Z ∼ Bern(p)]
Implementation of iterative refinement at the block level (Weldon):
• Initially, transmit k bits uncoded
• Learn the error (via feedback), compress it using kH(p) bits, and transmit the compression index uncoded
• Communicate the error about the error (kH(p)^2 bits)
• Communicate the error about the error about the error, and so on
Achievable rate: k/(k + kH(p) + kH(p)^2 + kH(p)^3 + ⋅⋅⋅) = 1 − H(p)
Extensions (Ahlswede, Ooi–Wornell)
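The geometric series in this rate expression can be evaluated numerically (illustrative check; H2 denotes the binary entropy):

```python
import math

def H2(p):
    """Binary entropy function in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def weldon_rate(p, rounds=60):
    """k message bits cost k*(1 + H(p) + H(p)^2 + ...) channel uses in total,
    since each round describes the previous round's error pattern."""
    uses_per_bit = sum(H2(p) ** i for i in range(rounds))
    return 1.0 / uses_per_bit

p = 0.11
# The series sums to 1/(1 - H(p)), so weldon_rate(p) equals 1 - H(p),
# the capacity of the BSC(p).
```

The truncation at 60 rounds is harmless since H(p) < 1 for p ≠ 1/2.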
Multiple Access Channel (MAC) with Feedback
[Diagram: encoder 1 maps (M1, Y^{i−1}) to X1i, encoder 2 maps (M2, Y^{i−1}) to X2i; MAC p(y|x1, x2); decoder outputs (M̂1, M̂2) from Y^n]
Transmission cooperation: x1i(M1, Y^{i−1}), x2i(M2, Y^{i−1})
Capacity region C is not known in general
Example: Binary Erasure MAC
X1 ∈ {0, 1}, X2 ∈ {0, 1}, Y = X1 + X2 ∈ {0, 1, 2}
Capacity region without feedback: R1 ≤ 1, R2 ≤ 1, R1 + R2 ≤ 3/2
Block feedback coding scheme (Gaarder–Wolf):
• Rsym = 2/3: k uncoded transmissions + k/2 one-sided retransmissions
• Rsym = 3/4: k uncoded transmissions + k/4 two-sided retransmissions + k/16 + ⋅⋅⋅
• Rsym = 0.7602: k uncoded transmissions + k/(2 log 3) cooperative retransmissions
R*sym = 0.7911 (Cover–Leung, Willems)
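The three Gaarder–Wolf rates follow from simple overhead accounting (my own check; I read "log 3" as log base 2, which reproduces 0.7602):

```python
import math

def sym_rate(overhead_per_bit):
    """Symmetric rate: k message bits per user over k*(1 + overhead) uses."""
    return 1.0 / (1.0 + overhead_per_bit)

r_one_sided = sym_rate(1 / 2)                 # k/2 one-sided retransmissions
r_two_sided = sym_rate(sum((1 / 4) ** i for i in range(1, 40)))  # k/4 + k/16 + ...
r_coop = sym_rate(1 / (2 * math.log2(3)))     # cooperative retransmissions
# r_one_sided = 2/3, r_two_sided = 3/4, r_coop ≈ 0.7602
```

The two-sided overhead series sums to k/3, giving exactly 3/4.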
Cover–Leung Coding Scheme
[Diagram: block-Markov operation: encoder 1 sends X_1^n(j) carrying (M̃_{2,j−1}, M_{1j}), encoder 2 sends X_2^n(j) carrying (M_{2,j−1}, M_{2j}); decoder processes Y^n(j−1), Y^n(j)]
Block Markov coding
Backward decoding (Willems–van der Meulen, Zeng–Kuhlmann–Buzo)
Willems condition: optimal when X1 is a function of (X2, Y)
Not optimal for the Gaussian MAC (Ozarow)
Question: Posterior matching for MAC?
Question: Optimality of Cover–Leung for one-sided feedback?
Broadcast Channel (BC) with Feedback
[Diagram: encoder maps (M1, M2, Y_1^{i−1}, Y_2^{i−1}) to Xi; BC p(y1, y2|x); decoder 1 outputs M̂1 from Y_1^n, decoder 2 outputs M̂2 from Y_2^n]
Receivers operate separately (regardless of feedback)
Physically degraded BC p(y1|x)p(y2|y1):
• Feedback does not enlarge the capacity region (El Gamal)
How can feedback help?
Dueck’s Example
X = (X0, X1, X2)
Y1 = (X0, X1 ⊕ Z), Y2 = (X0, X2 ⊕ Z), Z ∼ Bern(1/2)
Capacity region without feedback: {(R1, R2) : R1 + R2 ≤ 1}
Capacity region with feedback (Dueck): {(R1, R2) : R1 ≤ 1, R2 ≤ 1}
Feedback scheme: transmit X0i = Zi−1 (learned via feedback) along with fresh bits X1i, X2i
Y1i = (Zi−1, X1i ⊕ Zi) recovers X1,i−1; Y2i = (Zi−1, X2i ⊕ Zi) recovers X2,i−1
Feedback helps by letting the encoder broadcast common channel information
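The scheme can be checked with a toy simulation (my own illustrative code, not from the talk): the encoder pipes fresh private bits to each receiver while broadcasting yesterday's noise, and each receiver cancels the noise one step later.

```python
import random

def dueck_scheme(bits1, bits2, seed=0):
    """Simulate Dueck's feedback scheme for the BC X = (X0, X1, X2),
    Y1 = (X0, X1 ^ Z), Y2 = (X0, X2 ^ Z), Z ~ Bern(1/2).
    Via feedback the encoder knows Z_{i-1} and broadcasts it on X0,
    so each receiver recovers its previous private bit: (R1, R2) = (1, 1)."""
    rng = random.Random(seed)
    n = len(bits1)
    z_prev = 0
    rx1, rx2 = [], []
    noisy1 = noisy2 = None
    for i in range(n + 1):          # one extra use to flush the last bit
        z = rng.randrange(2)        # common noise Z_i
        x0 = z_prev                 # broadcast yesterday's noise
        x1 = bits1[i] if i < n else 0
        x2 = bits2[i] if i < n else 0
        y1 = (x0, x1 ^ z)           # receiver 1's channel output
        y2 = (x0, x2 ^ z)
        if noisy1 is not None:      # cancel yesterday's noise using X0 = Z_{i-1}
            rx1.append(noisy1 ^ y1[0])
            rx2.append(noisy2 ^ y2[0])
        noisy1, noisy2 = y1[1], y2[1]
        z_prev = z
    return rx1, rx2

rng1, rng2 = random.Random(1), random.Random(2)
b1 = [rng1.randrange(2) for _ in range(100)]
b2 = [rng2.randrange(2) for _ in range(100)]
r1, r2 = dueck_scheme(b1, b2)
# r1 == b1 and r2 == b2: both users get ~1 bit per channel use.
```

Without feedback the noise cannot be described to both receivers for free, which is why the sum rate then collapses to 1.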
Extension to general BC (Shayevitz–Wigger)
"Learn from the past, don't predict the future" (Tse)
Gaussian BC: Schalkwijk–Kailath coding scheme to LQG control (Ozarow–Leung, Elia, Ardestanizadeh–Minero–Franceschetti)
Question: What's going on with the Gaussian case? (Exactly why does feedback help?)
Two-Way Channel
[Diagram: two nodes, each with an encoder and a decoder, connected through p(y1, y2|x1, x2)]
The first multiuser channel model (Shannon)
Capacity region C is not known in general
Main difficulties:
• Two information flows share the same channel, interfering with each other
• Each node has to play the two competing roles of communicating its own message and providing feedback to help the other node
Two-way channel with common output: Y1 = Y2 = Y
Bounds on the Capacity Region
Simple inner bound (Shannon): a rate pair (R1, R2) is achievable if
R1 < I(X1; Y | X2, Q), R2 < I(X2; Y | X1, Q)
for some p(q)p(x1|q)p(x2|q)
• One-way communication
• Can be improved using time sharing
• Not tight in general (Dueck, Schalkwijk)
Simple outer bound (Shannon): if a rate pair (R1, R2) is achievable, then
R1 ≤ I(X1; Y | X2), R2 ≤ I(X2; Y | X1)
for some p(x1, x2)
Dependence balance bound (Hekstra–Willems):
R1 ≤ I(X1; Y | X2, U), R2 ≤ I(X2; Y | X1, U)
for some p(u, x1, x2) such that I(X1; X2 | U) ≤ I(X1; X2 | Y, U)
Multiletter Characterization of the Capacity Region
Causally conditional pmf: p(x^k ‖ y^{k−1}) = ∏_{i=1}^k p(x_i | x^{i−1}, y^{i−1})
Causally conditional directed information (Marko, Massey):
I(X^n → Y^n ‖ Z^n) = ∑_{i=1}^n I(X^i; Y_i | Y^{i−1}, Z^i)
Capacity region (Kramer): let C_k be the set of rate pairs (R1, R2) such that
R1 ≤ (1/k) I(X_1^k → Y^k ‖ X_2^k), R2 ≤ (1/k) I(X_2^k → Y^k ‖ X_1^k)
for some p(x_1^k ‖ y^{k−1}) p(x_2^k ‖ y^{k−1}). Then C = ∪_k C_k
• Similar characterizations can be found for general TWC and MAC with feedback
• Each choice of k and p(x_1^k ‖ y^{k−1}) p(x_2^k ‖ y^{k−1}) leads to an inner bound
• Not computable
Interactive Coding Scheme
[Diagram: channel uses interleaved into k streams: inputs X_{1,1}^k, X_{1,k+1}^{2k}, X_{1,2k+1}^{3k}, . . . , X_{1,(n−1)k+1}^{nk} and outputs Y_1^k, Y_{k+1}^{2k}, Y_{2k+1}^{3k}, . . . , Y_{(n−1)k+1}^{nk}; block j highlighted]
Code over interleaved blocks (block j = time indices j, k + j, 2k + j, . . . , (n − 1)k + j)
Block j: input X1j, output (X_2^k, Y_j^k), causal channel state (X_1^{j−1}, Y^{j−1})
R1j < I(X1j; X_2^k, Y_j^k | X_1^{j−1}, Y^{j−1}) is achievable
Summing over blocks shows that ∑_{j=1}^k R1j < I(X_1^k → Y^k ‖ X_2^k) is achievable
Example: Shannon–Blackwell Binary Multiplying Channel
[Diagram: Y = X1 · X2, common output at both nodes]
Simple bounds on the symmetric capacity (Shannon):
max_{p(x1)p(x2)} ½(I(X1; Y | X2) + I(X2; Y | X1)) ≤ Csym ≤ max_{p(x1,x2)} ½(I(X1; Y | X2) + I(X2; Y | X1))
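Shannon's inner bound can be evaluated numerically for this channel (my own grid search, not from the talk): with independent inputs P(X1=1)=p, P(X2=1)=q, the channel is clean when the other input is 1 and useless when it is 0, so I(X1; Y | X2) = q·H(p).

```python
import math

def H2(x):
    """Binary entropy in bits, with H2(0) = H2(1) = 0."""
    return 0.0 if x <= 0 or x >= 1 else -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def bmc_inner_bound(steps=400):
    """Shannon's symmetric-rate inner bound for the binary multiplying
    channel Y = X1 * X2 with independent inputs, via a grid search over
    (p, q) of (1/2) * (q*H(p) + p*H(q))."""
    best = 0.0
    for i in range(steps + 1):
        p = i / steps
        for j in range(steps + 1):
            q = j / steps
            best = max(best, 0.5 * (q * H2(p) + p * H2(q)))
    return best

bound = bmc_inner_bound()
# bound ≈ 0.6170, attained near p = q ≈ 0.70, matching the slide's lower bound.
```

The outer bound 0.6942 requires a joint input pmf and an analogous 3-parameter search.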
Simple bounds on the symmetric capacity (Shannon): 0.6170 ≤ Csym ≤ 0.6942
DB bound + channel augmentation (Hekstra–Willems): Csym ≤ 0.6463
Schalkwijk's lower bounds:
• Iterative refinement coding scheme (Schalkwijk): 0.6191 ≤ Csym
• + Slepian–Wolf (Schalkwijk): 0.6306 ≤ Csym
• Further extension (Meeuwissen–Schalkwijk–Bloemen): 0.6307 ≤ Csym
Directed information inner bound: (1/2k)(I(X_1^k → Y^k ‖ X_2^k) + I(X_2^k → Y^k ‖ X_1^k))
• Ardestanizadeh: 0.6191 ≤ Csym
Question: Can we outperform Schalkwijk (via directed information expression)?
Intermission: Interactive Source Coding and Computing
[Diagram: node 1 holds X_1^n, node 2 holds X_2^n; they exchange messages M_l(X_1^n, M^{l−1}), M_{l+1}(X_2^n, M^l) and output estimates Ẑ_1^n, Ẑ_2^n]
Two-way lossless source coding:
• Interaction does not enlarge the optimal rate region
• One-way Slepian–Wolf coding is optimal (Csiszár–Narayan)
Two-way lossy source coding:
• Interaction enlarges the rate–distortion region for correlated sources
• q-round interactions (Kaspi)
Two-way lossless computing:
• Interaction enlarges the optimal rate region even for independent sources
• Infinite-round interactions (Ma–Ishwar)
Relay Network
[Diagram: network p(y1, . . . , yN | x1, . . . , xN) over nodes 1, 2, 3, . . . , j, k, . . . , N; message M at node 1, estimates M̂j, M̂k, M̂N at destination nodes]
Topology of the network is defined through p(y^N | x^N)
Unicast vs. broadcast vs. multicast
Capacity is not known in general
Many coding schemes have been proposed
Dictionary of Coding Schemes
Standard parlance: decode–forward, compress–forward, amplify–forward
Extended vocabulary: partial decode–forward, noncoherent decode–forward, coherent compress–forward, generalized amplify–forward
Recent coinages: hash–forward, compute–forward, quantize–map–forward, rematch–forward
Loanwords: analog network coding, noisy network coding, hybrid coding
Dialects: calculate–forward, clean–forward, combine–forward, demodulate–forward, denoise–forward, detect–forward, estimate–forward, flip–forward, mix–forward, quantize–forward, rotate–forward, scale–forward, (randomly) select–forward, sum–forward, truncate–forward
Basic Coding Schemes
Decode–forward (Cover–El Gamal)
Compress–forward (Cover–El Gamal)
Amplify–forward (Schein–Gallager)
*–forward and extensions (Ahlswede–Cai–Li–Yeung, Kramer–Gastpar–Gupta, Avestimehr–Diggavi–Tse, Lim–Kim–El Gamal–Chung): no/limited interaction
Broadcast Relay Channel (BRC)
[Diagram: encoder maps M to X_1^n; BRC p(y2, y3|x1); decoder 2 observes Y_2^n, decoder 3 observes Y_3^n; they output M̂2, M̂3 while exchanging messages at rates R2, R3]
A common message M is to be broadcast to both receivers (Draper–Frey–Kschischang)
Dual to MAC with partially cooperating encoders (Willems)
Capacity C(R2 + R3) is not known in general
Example: Binary BRC (Xiang–Wang–K )
[Diagram: X1 → (Y2, Y3) with crossover parameters 3 − 2√2 and √2 − 1, so that X1 = Y2 · Y3; each marginal X1 → Y2, X1 → Y3 is a Z channel with parameter √2 − 1]
C(0) = 0.3941 (Z channel capacity)
C(2) = 1
C(R) = ?
Example: Binary BRC (Xiang–Wang–K )
[Plot: C(R) vs. R, between the partial decode–forward lower bound and the cutset upper bound; C(0) = 0.3941, C(R) = 1 for R ≥ 1.2338]
Cutset: max_{p(x1)} min{I(X1; Y2) + R/2, I(X1; Y2, Y3)} (C(R) = 1 for R ≥ 1.2338)
Partial decode–forward: C(0)
R*: interactive computing of X1 = Y2 · Y3
Example: Binary BRC (Xiang–Wang–K )
[Plot: exchange rate R for the various schemes: cutset 1.2338, CF∞ 1.4346, CF·DF = CF2 1.4893, CF 1.7449]
Compress–forward (Orlitsky–Roche): HG(Y2|Y3) + HG(Y3|Y2) = 1.7449
Interactive relaying:
• Compress–forward and decode–forward (Draper–Frey–Kschischang): 1 − I(X1; Y2) + HG(Y2|Y3) = H(Y2) + H(X1|Y3) = 1.4893
• Two-round compress–forward: H(Y2) + H(X1|Y3) = 1.4893
• Three-round compress–forward: 1.4488
• Four-round compress–forward: 1.4427
Infinite-round compress–forward (Ma–Ishwar): (1 + p)H(p) + p log(p e^{1−p}) |_{p=1/√2} = 1.4346 < CF_{q−1}·DF = CF_q
Questions: Optimality? Generalizations? Implications?
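The closed-form infinite-round rate evaluates as claimed (quick numerical check of my own; logs are base 2, e is the natural base as written):

```python
import math

def H2(x):
    """Binary entropy in bits."""
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def cf_infinity(p):
    """Infinite-round compress-forward rate (1+p)H(p) + p*log2(p*e^(1-p))."""
    return (1 + p) * H2(p) + p * math.log2(p * math.exp(1 - p))

r = cf_infinity(1 / math.sqrt(2))
# r ≈ 1.4346, strictly below the two-round value 1.4893: more rounds help.
```

Each additional round of interaction strictly reduces the required exchange rate toward this limit.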
Concluding Remarks
Interaction enables richer cooperation among network users
• Coherent transmission (MAC with feedback)
• Channel information broadcasting (BC with feedback)
• Sequential coding (two-way channel)
• Cooperative decoding (broadcast relay channel)
Theoretical challenges:
• Capacity still open for many basic problems
• Inherently multiletter solutions (Permuter–Cuff–Van Roy–Weissman, Ma–Ishwar, K)
Practical relevance:
• How to use feedback (beyond channel estimation, ARQ)
• Coordinated multipoint (CoMP) transmission/reception
References
Ahlswede, R. A constructive proof of the coding theorem for discrete memoryless channels in case of complete feedback. In Trans. Prague Conf. Inf. Theory, Statist. Decision Functions, Random Processes. Academia, Prague.
Ahlswede, R., Cai, N., Li, S.-Y. R., and Yeung, R. W. Network information flow. IEEE Trans. Inf. Theory.
Ardestanizadeh, E. Feedback communication systems: Fundamental limits and control-theoretic approach. Ph.D. thesis, University of California, San Diego, La Jolla, CA.
Ardestanizadeh, E., Minero, P., and Franceschetti, M. LQG control approach to Gaussian broadcast channels with feedback.
Avestimehr, A. S., Diggavi, S. N., and Tse, D. N. C. Wireless network information flow: A deterministic approach. IEEE Trans. Inf. Theory.
Butman, S. A general formulation of linear feedback communication systems with solutions. IEEE Trans. Inf. Theory.
Cover, T. M. and El Gamal, A. Capacity theorems for the relay channel. IEEE Trans. Inf. Theory.
Cover, T. M. and Leung, C. S. K. An achievable rate region for the multiple-access channel with feedback. IEEE Trans. Inf. Theory.
Csiszár, I. and Narayan, P. Secrecy capacities for multiple terminals. IEEE Trans. Inf. Theory.
References (cont.)
Draper, S. C., Frey, B. J., and Kschischang, F. R. Interactive decoding of a broadcast message. In Proc. Ann. Allerton Conf. Comm. Control Comput., Monticello, IL.
Dueck, G. The capacity region of the two-way channel can exceed the inner bound. Inf. Control.
Dueck, G. Partial feedback for two-way and broadcast channels. Inf. Control.
El Gamal, A. The feedback capacity of degraded broadcast channels. IEEE Trans. Inf. Theory.
El Gamal, A. and Kim, Y.-H. Network Information Theory. Cambridge University Press, Cambridge.
Elia, N. When Bode meets Shannon: Control-oriented feedback communication schemes. IEEE Trans. Automat. Control.
Gaarder, N. T. and Wolf, J. K. The capacity region of a multiple-access discrete memoryless channel can increase with feedback. IEEE Trans. Inf. Theory.
Hekstra, A. P. and Willems, F. M. J. Dependence balance bounds for single-output two-way channels. IEEE Trans. Inf. Theory.
Horstein, M. Sequential transmission using noiseless feedback. IEEE Trans. Inf. Theory.
Kaspi, A. H. Two-way source coding with a fidelity criterion. IEEE Trans. Inf. Theory.
Kim, Y.-H. Feedback capacity of stationary Gaussian channels. IEEE Trans. Inf. Theory.
Kramer, G. Capacity results for the discrete memoryless network. IEEE Trans. Inf. Theory.
References (cont.)
Kramer, G., Gastpar, M., and Gupta, P. Cooperative strategies and capacity theorems for relay networks. IEEE Trans. Inf. Theory.
Lim, S. H., Kim, Y.-H., El Gamal, A., and Chung, S.-Y. Noisy network coding. IEEE Trans. Inf. Theory.
Ma, N. and Ishwar, P. Two-terminal distributed source coding with alternating messages for function computation. In Proc. IEEE Int. Symp. Inf. Theory, Toronto, Canada.
Ma, N. and Ishwar, P. Infinite-message distributed source coding for two-terminal interactive computing. In Proc. Ann. Allerton Conf. Comm. Control Comput., Monticello, IL.
Marko, H. The bidirectional communication theory: A generalization of information theory. IEEE Trans. Comm.
Massey, J. L. Causality, feedback, and directed information. In Proc. IEEE Int. Symp. Inf. Theory Appl., Honolulu, HI.
Meeuwissen, H. B., Schalkwijk, J. P. M., and Bloemen, A. H. A. Extension of the achievable rate region of Schalkwijk's coding strategy for the binary multiplying channel. In Proc. IEEE Int. Symp. Inf. Theory, Whistler, BC.
Ooi, J. M. and Wornell, G. W. Fast iterative coding techniques for feedback channels. IEEE Trans. Inf. Theory.
Orlitsky, A. and Roche, J. R. Coding for computing. IEEE Trans. Inf. Theory.
References (cont.)
Ozarow, L. H. The capacity of the white Gaussian multiple access channel with feedback. IEEE Trans. Inf. Theory.
Ozarow, L. H. and Leung, C. S. K. An achievable region and outer bound for the Gaussian broadcast channel with feedback. IEEE Trans. Inf. Theory.
Permuter, H. H., Cuff, P., Van Roy, B., and Weissman, T. Capacity of the trapdoor channel with feedback. IEEE Trans. Inf. Theory.
Schalkwijk, J. P. M. A coding scheme for additive noise channels with feedback—II: Band-limited signals. IEEE Trans. Inf. Theory.
Schalkwijk, J. P. M. The binary multiplying channel: A coding scheme that operates beyond Shannon's inner bound region. IEEE Trans. Inf. Theory.
Schalkwijk, J. P. M. On an extension of an achievable rate region for the binary multiplying channel. IEEE Trans. Inf. Theory.
Schalkwijk, J. P. M. and Kailath, T. A coding scheme for additive noise channels with feedback—I: No bandwidth constraint. IEEE Trans. Inf. Theory.
Schein, B. and Gallager, R. G. The Gaussian parallel relay channel. In Proc. IEEE Int. Symp. Inf. Theory, Sorrento, Italy.
Shannon, C. E. The zero error capacity of a noisy channel. IRE Trans. Inf. Theory.
Shannon, C. E. Two-way communication channels. In Proc. Berkeley Symp. Math. Statist. Probab., vol. I. University of California Press, Berkeley.
References (cont.)
Shayevitz, O. and Feder, M. Optimal feedback communication via posterior matching. IEEE Trans. Inf. Theory.
Shayevitz, O. and Wigger, M. A. An achievable region for the discrete memoryless broadcast channel with feedback. In Proc. IEEE Int. Symp. Inf. Theory, Austin, TX.
Tse, D. N. C. Feedback in networks: Learn from the past, don't predict the future. In Proc. UCSD Inf. Theory Appl. Workshop, La Jolla, CA.
Weldon, E. J., Jr. Asymptotic error coding bounds for the binary symmetric channel with feedback. Ph.D. thesis, University of Florida, Gainesville, FL.
Willems, F. M. J. The feedback capacity region of a class of discrete memoryless multiple access channels. IEEE Trans. Inf. Theory.
Willems, F. M. J. The discrete memoryless multiple access channel with partially cooperating encoders. IEEE Trans. Inf. Theory.
Willems, F. M. J. and van der Meulen, E. C. The discrete memoryless multiple-access channel with cribbing encoders. IEEE Trans. Inf. Theory.
Xiang, Y., Wang, L., and Kim, Y.-H. Information flooding. In Proc. Ann. Allerton Conf. Comm. Control Comput., Monticello, IL.
Zeng, C.-M., Kuhlmann, F., and Buzo, A. Achievability proof of some multiuser channel coding theorems using backward decoding. IEEE Trans. Inf. Theory.