SLIDE 1

Implicit Communication for Control

Gireeja Ranade

Based on work of Pulkit Grover, Anant Sahai, Se Yong Park

Slides courtesy of Pulkit Grover

SLIDE 2

Decentralized control: “observer-controller” problem

An abstraction: a plant evolves over time while a ring of weak controllers Cw₁, …, Cw₆ each observe it and act on it.

[Figure: an evolving plant surrounded by the six weak controllers Cw₁ through Cw₆.]

SLIDE 3

Decentralized control: a simple example

Two controllers share a state x: C1 is weak (it observes well, but its actions are costly) and C2 is blurry (it acts cheaply, but observes through noise). To perform well they must coordinate.

[Figure: state x acted on by C1 (weak) and C2 (blurry), within the ring of weak controllers Cw₁, …, Cw₆.]

SLIDE 4

Coordination via explicit communication

Add an external channel between the controllers. The weak controller estimates the state x and transmits the estimate; the blurry controller uses the received message to control. This separates estimation from control, and the external channel is a standard point-to-point communication link [Shannon ’48].

[Figure: the system, augmented with an external channel carrying the weak controller’s estimate to the blurry controller.]

SLIDE 5

Coordination via implicit communication?

What if there is no external channel? The controllers can still coordinate through the plant itself: one controller’s action serves as a message that the other reads from its observations. In contrast with explicit communication:

  • 1. channel: the system itself
  • 2. messages: endogenously generated
SLIDE 6

Implicit communication: an example

Consider the two-stage system of [Witsenhausen ’68]. The initial state is x₀ ∼ N(0, σ₀²). The weak controller C1 observes x₀ exactly and applies u₁, giving x₁ = x₀ + u₁. The blurry controller C2 observes x₁ + z, with z ∼ N(0, 1), and applies u₂, giving x₂ = x₁ − u₂. The objective is

    min k² E[u₁²] + E[x₂²].

The plant itself acts as the implicit channel from C1 to C2.

[Figure: block diagram of the two-stage system; the path from C1 through the state to C2 is labeled “Implicit channel”.]

SLIDE 7

Toy implicit communication problem: Witsenhausen’s counterexample

Implicit communication interpretation [Witsenhausen ’68]: the first controller is an encoder E and the second a decoder D, with x₀ the implicit message source. The encoder’s input satisfies a power constraint, E[u₁²] ≤ P, and the decoder’s performance is the estimation error MMSE = E[(x₁ − x̂₁)²]. The control cost k²E[u₁²] + E[x₂²] then takes the form k²P + MMSE.

[Figure: x₀ enters encoder E, which produces u₁; x₁ = x₀ + u₁ passes through noise z; decoder D outputs x̂₁.]
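To make the trade-off concrete (an illustration, not from the slides), the two-stage cost k²E[u₁²] + E[(x₁ − x̂₁)²] can be estimated by Monte Carlo for any pair of strategies. The sketch below compares a zero-input linear scheme with LMMSE decoding against a simple lattice-quantization scheme; the parameter values (k = 0.2, σ₀ = 5, spacing Δ = 6) are assumptions chosen to sit in the regime where nonlinear strategies win.

```python
import random

random.seed(0)

K = 0.2        # cost weight k (assumption, for illustration)
SIGMA0 = 5.0   # std of the initial state x0 (assumption)
DELTA = 6.0    # lattice spacing of the quantizer (assumption)
N = 100_000    # Monte Carlo samples

def cost(encode, decode):
    """Empirical two-stage cost k^2 E[u1^2] + E[(x1 - x1_hat)^2]."""
    total = 0.0
    for _ in range(N):
        x0 = random.gauss(0.0, SIGMA0)
        u1 = encode(x0)
        x1 = x0 + u1
        y = x1 + random.gauss(0.0, 1.0)   # blurry observation, z ~ N(0, 1)
        total += K ** 2 * u1 ** 2 + (x1 - decode(y)) ** 2
    return total / N

# Zero-input linear scheme: u1 = 0, decoder is the LMMSE estimator of x1 from y.
lin = cost(lambda x0: 0.0,
           lambda y: (SIGMA0 ** 2 / (SIGMA0 ** 2 + 1.0)) * y)

# Quantization scheme: u1 snaps x0 to a coarse lattice; the decoder snaps y back.
snap = lambda v: DELTA * round(v / DELTA)
quant = cost(lambda x0: snap(x0) - x0,
             lambda y: snap(y))

print(f"linear ~ {lin:.3f}, quantization ~ {quant:.3f}")
```

With these (assumed) parameters the quantization scheme comes out well below the linear one, illustrating why the counterexample is a counterexample.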

SLIDE 8

A brief history of Witsenhausen’s counterexample

The problem is linear, quadratic, and Gaussian, yet nonlinear strategies can outperform linear ones [Witsenhausen ’68], and they can do so by an unbounded factor [Mitter, Sahai ’99]. Finding the optimal strategy is NP-hard [Papadimitriou, Tsitsiklis ’84]. The best known strategies come from semi-exhaustive search techniques [Baglietto, Parisini, Zoppoli ’01] [Lee, Lau, Ho ’01] [Lee, Marden, Shamma ’09].

SLIDE 9

Understanding Witsenhausen’s counterexample: a deterministic abstraction

Model the signals by their binary expansions [Avestimehr, Diggavi, Tse ’08] [Grover, Sahai ’09]: the state x₀ is a bit string b₁b₂b₃b₄b₅…, and the noise z deterministically wipes out the bits below a fixed level. The encoder E chooses u₁, and the decoder D forms x̂₁ from the bits that survive.

[Figure: binary expansion of x₀ entering encoder E; noise z erases the low-order bits; decoder D outputs x̂₁.]

SLIDE 10

Strategies for the deterministic abstraction

Example: x₀ = 0.01011, with the bits below the second binary place erased by the noise. A small input u₁ can round x₀ down to 0.01 or up to 0.10. The low-order bits are forced to zero by quantization, so the decoder recovers x₁ exactly from the bits it sees.
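A minimal sketch of this bit-level effect (the 5-bit value 0.01011 is from the slide; the noise level and helper names are assumptions): truncation models the deterministic noise, and quantizing x₀ zeroes the hidden bits so the decoder’s view becomes exact.

```python
# Deterministic abstraction (illustrative): the decoder sees only the bits of
# x1 above a fixed noise level; quantization forces the hidden bits to zero.

NOISE_BITS = 2                  # visible binary places (assumption)
STEP = 2.0 ** -NOISE_BITS       # finest resolution the decoder can see

def truncate(v):
    """Decoder's view: bits below the noise level are wiped out."""
    return STEP * int(v / STEP)

def quantize_input(x0):
    """u1 that rounds x0 to the nearest multiple of STEP."""
    return STEP * round(x0 / STEP) - x0

x0 = 0.34375                    # binary 0.01011, as on the slide

naive_err = abs(x0 - truncate(x0))   # u1 = 0: the low-order bits are lost

u1 = quantize_input(x0)              # small input, |u1| = 0.09375
x1 = x0 + u1                         # binary 0.01: hidden bits are now zero
quant_err = abs(x1 - truncate(x1))   # decoder recovers x1 exactly

print(naive_err, quant_err)          # -> 0.09375 0.0
```

All values here are exact binary fractions, so the zero error is exact rather than a floating-point coincidence.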

SLIDE 11

Asymptotically infinite-length extension of Witsenhausen’s counterexample

Hope: laws of large numbers can be used to simplify. Replace the scalar problem

    min k²E[u₁²] + E[(x₁ − x̂₁)²]

by its length-m vector extension, with x₀ᵐ ∼ N(0, σ₀²I) and zᵐ ∼ N(0, I):

    min (k²/m) E[‖u₁ᵐ‖²] + (1/m) E[‖x₁ᵐ − x̂₁ᵐ‖²].

SLIDE 12

A strategy: vector quantization

The state x₀ᵐ concentrates near the sphere of radius √(mσ₀²). The encoder quantizes it to the nearest codeword, spending power P to move the state there; the blurry decoder decodes correctly as long as the noise sphere around the codeword contains no other codeword. When decoding succeeds with high probability the estimation error vanishes, giving the asymptotic upper bound

    C̄ = k²P + MMSE → k²P + 0.

[Figure: codewords on the state sphere, with the noise sphere for the blurry decoder drawn around the transmitted codeword.]
SLIDE 13

Ratio of upper and lower bounds

For the vector problem

    min (k²/m) E[‖u₁ᵐ‖²] + (1/m) E[‖x₁ᵐ − x̂₁ᵐ‖²],

the ratio of the vector-quantization upper bound to the lower bound is below 4.13 for all k and σ₀ [Grover, Sahai ’08].

[Figure: Ratio = (Upper Bound)/(Lower Bound) plotted over log₁₀(σ₀) and log₁₀(k), everywhere < 4.13.]

SLIDE 14

. . . with dirty-paper coding strategy

Replacing vector quantization with a dirty-paper coding strategy tightens the gap: the ratio of upper to lower bound drops below 2.

[Figure: Ratio = (Upper Bound)/(Lower Bound) over log₁₀(k) and log₁₀(σ₀²), everywhere < 2.]

SLIDE 15

Conjectured optimal strategy: dirty-paper coding

The strategies found by numerical search [Baglietto, Parisini, Zoppoli ’01] [Lee, Lau and Ho ’01] resemble dirty-paper coding: the input u₁ quantizes x₀ toward a codebook aligned with the scaled state αx₀, so that x₁ lies near a sphere of radius √(m(P + α²σ₀²)).

[Figure: numerically found map from x₀ to x₁, a slanted staircase around αx₀.]

SLIDE 16

Finite-vector lengths

SLIDE 17

Quantization upper bounds

At finite blocklengths the quantization strategy is analyzed through the geometry of its codebook, via the ratio ζ = r_c/r_p of the covering radius r_c to the packing radius r_p.

[Figure: quantization cells around x₀, with the radii r_p and r_c marked.]
SLIDE 18

Lower bound

Reusing the old (infinite-length) lower bound at short lengths fails: the ratio of the scalar upper bound (quantization/linear) to the vector lower bound diverges to infinity. We need tighter lower bounds for tiny blocklengths [Shannon ’59] [Shannon, Gallager, Berlekamp ’67] [Blahut ’74] [Pinsker ’67] [Sahai ’06] [Sahai, Grover ’07] [Polyanskiy, Poor, Verdu ’08], and we need bounds that work in a distortion setting!

Theorem [Grover, Sahai ’08]:

    C̄_min ≥ inf_{P ≥ 0} k²P + ((√κ(P) − √P)₊)²,  with  κ(P) = σ₀² / ((σ₀ + √P)² + 1).

[Figure: ratio of the scalar upper bound to the vector lower bound over log₁₀(k) and log₁₀(σ₀); it diverges to infinity.]
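As a numeric sanity check (not part of the slides), the scalar lower bound on this slide can be evaluated by a simple grid search over the power P; the values k = 0.2 and σ₀ = 5 below are assumptions for illustration.

```python
import math

K = 0.2        # cost weight k (assumption, for illustration)
SIGMA0 = 5.0   # std of the initial state (assumption)

def kappa(P):
    """kappa(P) = sigma0^2 / ((sigma0 + sqrt(P))^2 + 1)."""
    return SIGMA0 ** 2 / ((SIGMA0 + math.sqrt(P)) ** 2 + 1.0)

def bound_term(P):
    """k^2 P + ((sqrt(kappa(P)) - sqrt(P))_+)^2."""
    gap = max(math.sqrt(kappa(P)) - math.sqrt(P), 0.0)
    return K ** 2 * P + gap ** 2

# Grid search approximating the infimum over P >= 0.
lower = min(bound_term(i / 1000.0) for i in range(30_000))
print(f"scalar lower bound ~ {lower:.4f}")
```

The infimum balances the k²P cost of signaling power against the residual estimation gap (√κ(P) − √P)₊², and is far below the P = 0 value κ(0) = σ₀²/(σ₀² + 1).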

SLIDE 19

Large-deviation technique to tighten the lower bound

The lower bound assumed typical noise, but in reality the noise can behave atypically, and atypical noise behaviors are typical under a different distribution. Tilting the noise to a test variance σ_G² ≥ 1 gives

    C̄_min ≥ inf_{P ≥ 0} k²P + ((√κ(P) − √P)₊)²,  with  κ(P) = σ₀²σ_G² / ((σ₀ + √P)² + σ_G²).

[Figure: the noise sphere around x₁ᵐ; the geometry assumed by the lower bound versus reality.]

SLIDE 20

“Sphere-packing” extension of lower bound

Here P = E[u₁²] and MMSE = E[(x₁ − x̂₁)²] for the scalar system with noise z ∼ N(0, 1).

[Figure: lower bounds on MMSE (log scale, 10⁻⁶ to 10⁻²) versus power P ∈ [1, 3], shown successively for σ_G = 1, 1.25, 2.1, 3.0, 3.9, alongside the scalar lower bound.]

SLIDE 21

Summary

  • Implicit communication promises substantial gains . . .
  • . . . and can be understood using information theory
  • Deterministic abstractions yield useful insights
  • Large-deviation techniques are needed to obtain finite-length results

SLIDE 22

The finite-length lower bound

Theorem. For σ_G² ≥ 1 and L > 0,

    J̄_min(m, k², σ₀²) ≥ inf_{P ≥ 0} k²P + η(P, σ₀², σ_G², L),

where

    η(P, σ₀², σ_G², L) = (σ_Gᵐ / c_m(L)) exp(−mL²(σ_G² − 1)/2) ((√κ₂(P, σ₀², σ_G², L) − √P)₊)²,

    κ₂(P, σ₀², σ_G², L) := σ₀²σ_G² / ( c_m(L)^(2/m) e^(1 − d_m(L)) ((σ₀ + √P)² + d_m(L)σ_G²) ),

    c_m(L) := 1 / Pr(‖Zᵐ‖² ≤ mL²) = (1 − ψ(m, L√m))⁻¹,

    d_m(L) := Pr(‖Zᵐ⁺²‖² ≤ mL²) / Pr(‖Zᵐ‖² ≤ mL²) = (1 − ψ(m+2, L√m)) / (1 − ψ(m, L√m)).
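The constants c_m(L) and d_m(L) are probabilities that a standard Gaussian vector lands in a ball of squared radius mL², so they can be estimated directly by Monte Carlo; the sketch below assumes m = 10 and L = 1.5 for illustration.

```python
import random

random.seed(2)

M = 10       # blocklength m (assumption, for illustration)
L = 1.5      # radius parameter L (assumption)
N = 100_000  # Monte Carlo samples

def prob_in_ball(dim):
    """Estimate Pr(||Z^dim||^2 <= m*L^2) for Z with i.i.d. N(0,1) entries.
    Note the threshold uses m (= M) even for dim = m + 2, per the definitions."""
    hits = 0
    for _ in range(N):
        if sum(random.gauss(0.0, 1.0) ** 2 for _ in range(dim)) <= M * L * L:
            hits += 1
    return hits / N

p_m = prob_in_ball(M)        # Pr(||Z^m||^2 <= m L^2)
p_m2 = prob_in_ball(M + 2)   # Pr(||Z^{m+2}||^2 <= m L^2)

c_m = 1.0 / p_m              # c_m(L) >= 1
d_m = p_m2 / p_m             # d_m(L) <= 1: more dimensions mean a larger norm

print(f"c_m(L) ~ {c_m:.4f}, d_m(L) ~ {d_m:.4f}")
```

Both constants are close to 1 for moderate L, which is why the finite-length bound degrades gracefully toward the asymptotic one as m grows.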