

SLIDE 1

Technische Universität München
Institute for Communications Engineering

Capacity Bounds for Diamond Networks

Gerhard Kramer (TUM), joint work with Shirin Saeedi Bidokhti (TUM & Stanford)
DIMACS Workshop on Network Coding, Rutgers University, NJ, December 15, 2015


SLIDE 3

Technische Universität München

What is a “Diamond Network”?

  • Cascade of a 2-receiver broadcast channel (BC) and a 2-transmitter multi-access channel (MAC)
  • Simplifications: (1) MAC is two bit-pipes; (2) BC is two bit-pipes

[Figure: source message W → encoder X → BC → (Y1, Y2) → relay outputs (X1, X2) on links of rates R1, R2 → MAC → Y → decoder Ŵ → sink; a message of B bits is sent in n symbols, so the rate is R = B/n]

SLIDE 4

Technische Universität München

Background

General problem:
  • B. E. Schein, Distributed coordination in network information theory. PhD Dissertation, MIT, 2001

MAC is 2 bit-pipes:
  • A. Sanderovich, S. Shamai, Y. Steinberg, G. Kramer, “Communication via decentralized processing,” IEEE Trans. IT, 2008

BC is 2 bit-pipes:
  • D. Traskov, G. Kramer, “Reliable communication in networks with multi-access interference,” ITW 2007
  • W. Kang, N. Liu, and W. Chong, “The Gaussian multiple access diamond channel,” arXiv 2011 (v1) and 2015 (v2)

SLIDE 5

Technische Universität München

Here: BC is two bit-pipes

  • Capacity limitations C1 and C2. The problem seems difficult!
  • Gaussian MAC partially solved by Kang-Liu (2011) using Ozarow’s trick (1980)
  • Contribution: a new capacity upper bound for discrete MACs
  • Contribution: solved the binary adder MAC capacity by extending Mrs. Gerber’s Lemma

[Figure: source W → encoder → (V1, V2) on links of rates R1, R2 → (X1, X2) → MAC → Y → decoder Ŵ → sink]

SLIDE 6

Outline

  • The Problem Setup
  • A Lower Bound
  • An Upper Bound
  • Examples: the Gaussian MAC, the binary adder MAC

3 / 28


SLIDE 9

The Problem Setup

[Figure: Source W → Encoder → Relay 1 (X1^n) and Relay 2 (X2^n) → MAC p(y|x1,x2) → Y^n → Decoder Ŵ → Sink]

  • W: message of rate R
  • Bit-pipes of capacities C1, C2
  • Goal: what is the highest rate R such that Pr(W ≠ Ŵ) → 0?

4 / 28

SLIDE 10

A Lower Bound

[Figure: same network as above]

  • Rate splitting: W = (W12, W1, W2)
  • Superposition coding: W12 encoded in V^n; X1^n, X2^n superposed on V^n … a sophisticated superposition
  • Marton’s coding

5 / 28

SLIDE 11

Technische Universität München

Rate Bounds

  • Rate-splitting bounds
  • Now apply Fourier-Motzkin elimination
SLIDE 12

A Lower Bound (Cont.)

Theorem (Lower Bound)
The rate R is achievable if it satisfies the following condition for some pmf p(v, x1, x2, y) = p(v, x1, x2) p(y|x1, x2):

R < min { C1 + C2 − I(X1; X2|V),
          C2 + I(X1; Y|X2 V),
          C1 + I(X2; Y|X1 V),
          (1/2)(C1 + C2 + I(X1 X2; Y|V) − I(X1; X2|V)),
          I(X1 X2; Y) }

where V ∈ V and |V| ≤ min{ |X1||X2| + 2, |Y| + 4 }.

6 / 28

  • S. Saeedi Bidokhti, G. Kramer, “Capacity bounds for a class of diamond networks,” ISIT 2014
  • W. Kang, N. Liu, W. Chong, “The Gaussian multiple access diamond channel,” arXiv:1104.3300, v2, 2015
SLIDE 13

The Cut-Set Bound

Cut-set bound: R is achievable only if it satisfies the following bounds for some p(x1, x2):

R ≤ C1 + C2
R ≤ C1 + I(X2; Y|X1)
R ≤ C2 + I(X1; Y|X2)
R ≤ I(X1 X2; Y)

Four cuts. [Figure: network with source, links of capacities C1, C2, inputs X1, X2, output Y, sink]

7 / 28
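The four cut expressions above can be evaluated numerically. Below is a minimal sketch for the binary adder MAC used in the later examples (Y = X1 + X2), maximizing the min of the four cuts over a grid of joint pmfs; the function name and grid step are illustrative choices, not from the talk:

```python
import math
from itertools import product

def h(probs):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 1e-12)

def cutset_bound(C1, C2, step=0.05):
    """Grid search over joint pmfs p(x1, x2) on {0,1}^2 for the binary
    adder MAC Y = X1 + X2; returns the max over pmfs of the min of the
    four cut expressions."""
    best = 0.0
    n = int(round(1 / step))
    for i, j, k in product(range(n + 1), repeat=3):
        p00, p01, p10 = i * step, j * step, k * step
        p11 = 1.0 - p00 - p01 - p10
        if p11 < -1e-9:
            continue
        p11 = max(p11, 0.0)
        Hjoint = h([p00, p01, p10, p11])    # H(X1, X2)
        HX1 = h([p00 + p01, p10 + p11])     # H(X1)
        HX2 = h([p00 + p10, p01 + p11])     # H(X2)
        HY = h([p00, p01 + p10, p11])       # H(Y); Y is a function of (X1, X2)
        # Y is deterministic given (X1, X2), so I(X1;Y|X2) = H(X1|X2), etc.
        val = min(C1 + C2,
                  C2 + Hjoint - HX2,        # C2 + I(X1; Y | X2)
                  C1 + Hjoint - HX1,        # C1 + I(X2; Y | X1)
                  HY)                       # I(X1 X2; Y)
        best = max(best, val)
    return best
```

For small link capacities the C1 + C2 cut binds; for large capacities the bound approaches log2(3), the maximum of H(Y).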


SLIDE 15

Example I: binary adder MAC

  • X1 = X2 = {0, 1}, Y = {0, 1, 2}
  • Y = X1 + X2

[Plot: Rate R vs. link capacity C (C ≈ 0.74–0.86, R ≈ 1.46–1.58); curves: cut-set bound; lower bound and capacity]

8 / 28


SLIDE 20

Example II: Gaussian MAC

  • Y = X1 + X2 + Z, Z ∼ N(0, 1)
  • (1/n) Σ_{i=1}^n E(X1,i²) ≤ P1, (1/n) Σ_{i=1}^n E(X2,i²) ≤ P2, with P1 = P2 = 1

[Plot: Rate R vs. link capacity C (C ≈ 0.4–1.4, R ≈ 0.7–1.2); curves: cut-set bound; lower bound (no superposition coding); lower bound (joint Gaussian dist.); lower bound (mixture of two Gaussian dist.). The lower bound is tight where indicated.]

9 / 28


SLIDE 22

Is the Cut-Set Bound Tight?

Cut-set bound:

R ≤ C1 + C2
R ≤ C1 + I(X2; Y|X1)
R ≤ C2 + I(X1; Y|X2)
R ≤ I(X1 X2; Y)

Maximize over p(x1, x2).

[Figure: network with source, links of capacities C1, C2, inputs X1, X2, output Y, sink]

It turns out that the cut-set bound is not tight. One culprit is the cut ({source}, {R1, R2, sink}).

10 / 28

SLIDE 23

Refining the Cut-Set Bound

  • Motivated by [Ozarow’80, KangLiu’11] (cf. [TraskovKramer’07])

11 / 28


SLIDE 25

Refining the Cut-Set Bound

  • Motivated by [Ozarow’80, KangLiu’11]

nR ≤ nC1 + nC2 − I(X1^n; X2^n)

  • For any U^n:

I(X1^n; X2^n) = I(X1^n X2^n; U^n) − I(X1^n; U^n|X2^n) − I(X2^n; U^n|X1^n) + I(X1^n; X2^n|U^n)

This is basically the Hekstra-Willems dependence balance bound (IT’89)! See Gastpar-Kramer (ITW’06).

11 / 28
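The mutual-information identity on this slide holds for any joint distribution, so it is easy to sanity-check numerically. A minimal sketch with a random ternary joint pmf (all names and the distribution are illustrative):

```python
import math
import random

def H(probs):
    """Entropy in bits of a list of probabilities."""
    return -sum(v * math.log2(v) for v in probs if v > 1e-15)

# A random joint pmf on (X1, X2, U), each variable ternary.
random.seed(1)
vals = range(3)
raw = {(a, b, c): random.random() for a in vals for b in vals for c in vals}
tot = sum(raw.values())
pmf = {k: v / tot for k, v in raw.items()}

def marg(keep):
    """Marginal probabilities over the coordinate subset `keep`."""
    m = {}
    for k, v in pmf.items():
        key = tuple(k[i] for i in keep)
        m[key] = m.get(key, 0.0) + v
    return list(m.values())

H12U = H(list(pmf.values()))
H12, H1U, H2U = H(marg((0, 1))), H(marg((0, 2))), H(marg((1, 2)))
H1, H2, HU = H(marg((0,))), H(marg((1,))), H(marg((2,)))

I_12   = H1 + H2 - H12            # I(X1; X2)
I_12gU = H1U + H2U - HU - H12U    # I(X1; X2 | U)
I_12U  = H12 + HU - H12U          # I(X1 X2; U)
I_1Ug2 = H12 + H2U - H2 - H12U    # I(X1; U | X2)
I_2Ug1 = H12 + H1U - H1 - H12U    # I(X2; U | X1)

# The identity from the slide, valid for ANY joint distribution:
lhs = I_12
rhs = I_12U - I_1Ug2 - I_2Ug1 + I_12gU
```

Dropping the nonnegative term I(X1; X2|U^n) from the right-hand side gives the inequality used on the next slide.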

SLIDE 26

Refining the Cut-Set Bound

nR ≤ nC1 + nC2 − I(X1^n; X2^n)

  • For any U^n:

I(X1^n; X2^n) ≥ I(X1^n X2^n; U^n) − I(X1^n; U^n|X2^n) − I(X2^n; U^n|X1^n)

11 / 28


SLIDE 32

Refining the Cut-Set Bound (Cont.)

nR ≤ nC1 + nC2 − I(X1^n X2^n; U^n) + I(X1^n; U^n|X2^n) + I(X2^n; U^n|X1^n)

Choose Ui as follows: Yi → pU|Y → Ui

Adding nR ≤ I(X1^n X2^n; Y^n) to the bound above gives

2nR ≤ nC1 + nC2 + I(X1^n X2^n; Y^n|U^n) + I(X1^n; U^n|X2^n) + I(X2^n; U^n|X1^n)

and single-letterizing,

2R ≤ C1 + C2 + I(X1 X2; Y|U) + I(X1; U|X2) + I(X2; U|X1).

12 / 28


SLIDE 36

New Upper Bounds (1)

Theorem (Upper Bound I)
The rate R is achievable only if there exists a joint distribution p(x1, x2) for which the following inequalities hold for every auxiliary channel p(u|x1, x2, y) = p(u|y):

R ≤ C1 + C2
R ≤ C2 + I(X1; Y|X2)
R ≤ C1 + I(X2; Y|X1)
R ≤ I(X1 X2; Y)
2R ≤ C1 + C2 + I(X1 X2; Y|U) + I(X1; U|X2) + I(X2; U|X1)

  • A max-min problem
  • Equivalently, 2R ≤ C1 + C2 + I(X1 X2; Y) − I(X1; X2) + I(X1; X2|U)

13 / 28

SLIDE 37

New Upper Bounds (2)

Theorem (Upper Bound II)
The capacity is bounded from above by

max_{p(x1,x2)}  min_{p(u|x1,x2,y)=p(u|y)}  max_{p(q|x1,x2,y,u)=p(q|x1,x2)}  min { C1 + C2,
    C1 + I(X2; Y|X1 Q),
    C2 + I(X1; Y|X2 Q),
    I(X1 X2; Y|Q),
    C1 + C2 − I(X1; X2|Q) + I(X1; X2|U Q) }

with |Q| ≤ |X1||X2| + 3.

Don’t drop the mutual information term; use the Y-to-U channel structure.

14 / 28


SLIDE 39

New Upper Bounds (2)

Theorem (Upper Bound II)
The capacity is bounded from above by

max_{p(x1,x2)}  min_{p(u|x1,x2,y)=p(u|y)}  max_{p(q|x1,x2,y,u)=p(q|x1,x2)}  min { C1 + C2,
    C1 + I(X2; Y|X1 Q),
    C2 + I(X1; Y|X2 Q),
    I(X1 X2; Y|Q),
    C1 + C2 − I(X1; X2|Q) + I(X1; X2|U Q) }

  • |Q| ≤ |X1||X2| + 3
  • The last term is related to the Hekstra-Willems dependence balance bound and can be written as
    R ≤ C1 + C2 − I(X1 X2; U|Q) + I(X2; U|X1 Q) + I(X1; U|X2 Q)

14 / 28


SLIDE 46

The Gaussian MAC

Y = X1 + X2 + Z, Z ∼ N(0, 1), (1/n) Σ_{i=1}^n E(X1,i²) ≤ P, (1/n) Σ_{i=1}^n E(X2,i²) ≤ P

R ≤ 2C
R ≤ C + log(1 + P(1 − ρ²))/2
R ≤ C + log(1 + P(1 − ρ²))/2
R ≤ log(1 + 2P(1 + ρ))/2
R ≤ C1 + C2 − I(X1 X2; U|Q) + log((1 + N + P(1 − ρ²))/(1 + N))

Choose U = Y + Z_N, Z_N ∼ N(0, N), with N to be optimized (a max-min-max problem).

15 / 28

SLIDE 49

The Gaussian MAC (Cont.)

  • U = Y + Z_N, Z_N ∼ N(0, N)

I(X1 X2; U|Q) = h(U|Q) − h(U|X1 X2)
  ≥ (by the EPI) (1/2) log(2πeN + 2^{2h(Y|Q)}) − (1/2) log(2πe(1 + N))

and R ≤ I(X1 X2; Y|Q) = h(Y|Q) − (1/2) log(2πe), so

R ≤ C1 + C2 − (1/2) log(N + 2^{2R}) − (1/2) log(1 + N) + log(1 + N + P(1 − ρ²))

  • Strictly tighter than [KangLiu’11]

16 / 28
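The bound above is implicit in R, but its right-hand side is decreasing in R, so the largest R satisfying it is a fixed point found by bisection. A sketch assuming symmetric links C1 = C2 = C; the function names and sample parameters are illustrative, not from the talk:

```python
import math

def rhs(R, C, P, rho, N):
    """Right-hand side of the implicit bound (symmetric links C1 = C2 = C),
    logs base 2; decreasing in R."""
    return (2 * C
            - 0.5 * math.log2(N + 2.0 ** (2 * R))
            - 0.5 * math.log2(1 + N)
            + math.log2(1 + N + P * (1 - rho ** 2)))

def solve_bound(C, P, rho, N, lo=0.0, hi=20.0):
    """Largest R with R <= rhs(R): since rhs decreases in R, bisect on
    the increasing function R - rhs(R)."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if mid <= rhs(mid, C, P, rho, N):
            lo = mid
        else:
            hi = mid
    return lo
```

Each N ≥ 0 yields a valid upper bound, so the tightest version minimizes the fixed point over N while the input distribution maximizes over ρ.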

SLIDE 54

The Gaussian MAC (Cont.)

[Plot: Rate R vs. link capacity C (C ≈ 0.4–1.2, R ≈ 0.7–1.2); curves: cut-set bound; lower bound (mixture of two Gaussian dist.); upper bound I; upper bound II; regimes of no, partial, and full cooperation marked]

17 / 28

SLIDE 55

On the Capacity of the Gaussian MAC

Theorem
For a symmetric Gaussian diamond network, the upper bound meets the lower bound for all C such that C ≥ (1/2) log(1 + 4P), or

C ≤ (1/4) log[(1 + 2P(1 + ρ(2)))(1 − ρ(2)²)]

where ρ(2) = √(1 + 1/(4P²)) − 1/(2P).

18 / 28

SLIDE 56

The Optimal Choice of N

  • U = Y + Z_N (motivated by [Ozarow’80, KangLiu’11])
  • (X1, X2): an optimal jointly Gaussian input for the lower bound, with covariance matrix
    [ P     ρ*P ]
    [ ρ*P   P   ]
  • N = P(1 − ρ*²)/ρ* − 1
  • If P(1 − ρ*²)/ρ* − 1 > 0: X1 − U − X2 forms a Markov chain, giving the new upper bound
  • If P(1 − ρ*²)/ρ* − 1 ≤ 0: the cut-set bound

19 / 28
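The Markov-chain claim can be checked with the Gaussian conditional-covariance formula: with this choice of N, Cov(X1, X2 | U) vanishes, and zero conditional covariance for jointly Gaussian variables means conditional independence, i.e. X1 − U − X2. A sketch (the closed form for N is reconstructed from a garbled slide, so treat it as an assumption):

```python
def markov_check(P, rho):
    """Conditional covariance Cov(X1, X2 | U) for jointly Gaussian
    (X1, X2) with variance P, correlation rho, and
    U = X1 + X2 + Z + Z_N where Var(Z) = 1, Var(Z_N) = N,
    using the candidate N = P(1 - rho^2)/rho - 1 (assumed form)."""
    N = P * (1 - rho ** 2) / rho - 1
    var_u = 2 * P * (1 + rho) + 1 + N    # Var(U)
    cov_xu = P * (1 + rho)               # Cov(X_i, U), i = 1, 2
    # Gaussian conditional covariance: Cov(X1,X2) - Cov(X1,U)Cov(X2,U)/Var(U)
    return rho * P - cov_xu ** 2 / var_u, N
```

Note that N ≥ 0 requires P(1 − ρ*²) ≥ ρ*, which matches the regime split on this slide.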


SLIDE 59

The Binary Adder MAC

Y = X1 + X2, X1 = X2 = {0, 1}, Y = {0, 1, 2}

[Figure: auxiliary channel Y → Ỹ → U; Y is mapped to a binary Ỹ (Y = 1 goes to each value with probability 1/2), and U is Ỹ observed through a BSC(α)]

R ≤ C1 + C2
R ≤ C2 + h2(q)
R ≤ C1 + h2(q)
R ≤ 1 + h2(q) − q
R ≤ C1 + C2 − I(X1 X2; U|Q) + 2h2(q/2 ⋆ α) − 2(1 − q)h2(α) − 2q

20 / 28

SLIDE 60

The Interplay in the Upper Bound

I(X1 X2; U|Q) = H(U|Q) − H(U|X1 X2)
  ≥ (by MGL) h2(α ⋆ h2⁻¹(H(Ỹ|Q))) − (1 − q)h2(α) − q

I(X1 X2; Y|Q) = H(Ỹ|Q) + h2(q) − q ≥ R

21 / 28
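Mrs. Gerber's Lemma, as used above, lower-bounds the conditional entropy after a BSC: H(U|Q) ≥ h2(α ⋆ h2⁻¹(H(X|Q))). A minimal numerical illustration (the conditional biases and α are arbitrary illustrative values):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def h2inv(t):
    """Inverse of h2 on [0, 1/2], by bisection."""
    lo, hi = 0.0, 0.5
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if h2(mid) < t:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def star(a, b):
    """Binary convolution a * b = a(1 - b) + (1 - a)b."""
    return a * (1 - b) + (1 - a) * b

# X | Q = q ~ Bernoulli(p_q), each q equally likely; U = X xor BSC(alpha) noise.
alpha = 0.11
pq = [0.05, 0.40]                                  # illustrative conditional biases
H_X_given_Q = sum(h2(p) for p in pq) / len(pq)
H_U_given_Q = sum(h2(star(p, alpha)) for p in pq) / len(pq)
mgl_lower = h2(star(alpha, h2inv(H_X_given_Q)))    # MGL lower bound on H(U|Q)
```

The lemma is exactly the statement that h2(α ⋆ h2⁻¹(t)) is convex in t, which is the convexity property the slides exploit.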

SLIDE 66

The Binary Adder MAC (Cont.)

[Plot: Rate R vs. link capacity C (C ≈ 0.73–0.87, R ≈ 1.46–1.58); curves: cut-set bound; lower bound; upper bound I; upper bound II and capacity. The cut-set bound is tight at both ends of the range; Upper Bound I is tight in an intermediate regime; Upper Bound II with Mrs. Gerber’s lemma is tight where the others are not.]

22 / 28


SLIDE 72

The Interplay in the Upper Bounds

R ≤ I(X1 X2; Y|Q) ≤ H(Y|Q) − H(Y|X1 X2)
R ≤ C1 + C2 − I(X1 X2; U|Q) + I(X2; U|X1 Q) + I(X1; U|X2 Q)
  ≤ C1 + C2 − H(U|Q) − H(U|X1 X2) + H(U|X1 Q) + H(U|X2 Q)

  • Up to now: Entropy Power Inequality, Mrs. Gerber’s Lemma
    1. min { H(U) | H(Y) = t } ≥ f(t)
    2. f(t) is convex in t
  • What we want to do:
    1. min { H(U) − H(U|X1) − H(U|X2) | H(Y) = t } ≥ f(t)
    2. f(t) is convex in t

23 / 28

SLIDE 73

The Binary Adder MAC: Upper Bound

R ≤ 2C
R ≤ C + h2(q)
R ≤ 1 + h2(q) − q
R ≤ 2C − h2( α ⋆ ( q/2 + (1 − q) h2⁻¹( min(1, (R − h2(q))⁺ / (1 − q)) ) ) ) − (1 − q)h2(α) − q + 2h2(α ⋆ q/2)

The RHS is jointly concave (note the signs) in (R, q).

24 / 28

SLIDE 74

Capacity of the Binary Adder MAC

Theorem
The capacity of diamond networks with binary adder MACs is

max_{0 ≤ p ≤ 1/2} min { C1 + C2 − 1 + h2(p),
                        C1 + h2(p),
                        C2 + h2(p),
                        h2(p) + 1 − p }

25 / 28
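The theorem's max-min expression is a one-dimensional optimization over p and is easy to evaluate. A sketch via grid search (the function name and grid resolution are illustrative):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def diamond_capacity(C1, C2, steps=20000):
    """Evaluate the theorem's max-min expression on a grid over p in [0, 1/2]."""
    best = 0.0
    for i in range(steps + 1):
        p = 0.5 * i / steps
        best = max(best, min(C1 + C2 - 1 + h2(p),
                             C1 + h2(p),
                             C2 + h2(p),
                             h2(p) + 1 - p))
    return best
```

For large C1, C2 the binding term is h2(p) + 1 − p, maximized at p = 1/3 with value log2(3), the full-cooperation value; for small C1, C2 the first term binds at p = 1/2, giving C1 + C2.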

SLIDE 75

The Optimal Choice of α

  • Let (X1, X2) be an optimizing doubly symmetric binary pmf with parameter p* for the lower bound
  • α is such that α(1 − α) = ( p* / (2(1 − p*)) )², and it makes X1 − U − X2 a Markov chain

26 / 28

SLIDE 76

Capacity of the Binary Adder MAC

[Plot: Rate R vs. link capacity C (C ≈ 0.73–0.87, R ≈ 1.46–1.58); curves: cut-set bound; lower bound; upper bound I; upper bound II and capacity. The cut-set bound is tight at both ends; Upper Bound I and Upper Bound II with Mrs. Gerber’s lemma are tight in intermediate regimes; Upper Bound II with the generalized Mrs. Gerber’s lemma is tight in the remaining regime.]

27 / 28

SLIDE 77

Summary and Work in Progress

  • Lower and upper bounds on the capacity of a class of diamond networks
  • A new upper bound in the form of a max-min problem
  • Gaussian MACs: improved previous lower and upper bounds; characterized the capacity for interesting ranges of bit-pipe capacities
  • Binary adder MAC: fully characterized the capacity
  • Work in progress: the general class of 2-relay diamond networks; n-relay diamond networks with orthogonal BC components

28 / 28