Implicit Communication for Control
Gireeja Ranade, based on the work of Pulkit Grover, Anant Sahai, Se Yong Park
Slides courtesy Pulkit Grover
Decentralized control: the observer-controller problem (an abstraction)
[Figure: an evolving plant observed and acted on by weak, blurry controllers Cw1 ... Cw6]
Coordination
[Figure: controllers C1 and C2 coordinate through the state x; the weak, blurry controllers Cw1 ... Cw6 surround the system]
Separating estimation and control with an external channel [Shannon '48]
With a point-to-point communication link between the controllers, the weak controller performs estimation and the blurry controller performs control.
[Figure: the system with an external channel from the weak controller C2 to the blurry controller C1]
Explicit vs. implicit communication: a message from C2 to C1 can be sent explicitly over the external channel, or implicitly through the plant state x0 itself.
Implicit communication: an example [Witsenhausen '68]

Initial state: x0 ∼ N(0, σ0²).
First stage (C1 observes x0 perfectly): x1 = x0 + u1.
Noisy observation: y = x1 + z, with z ∼ N(0, 1).
Second stage (C2 observes only y): x2 = x1 − u2.
Objective: minimize k² E[u1²] + E[x2²].

Implicit-communication interpretation: the first stage signals through the state itself, with x0 as the implicit message source and the state dynamics as the implicit channel. Writing E[u1²] ≤ P for the signaling power and choosing u2 = x̂1 = E[x1 | y], the cost becomes k²P + MMSE, where MMSE = E[(x1 − x̂1)²].
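As a sanity check on this interpretation, here is a small Monte Carlo sketch (the parameters k = 0.2, σ0 = 5 and the linear first stage are illustrative assumptions, not from the deck) confirming that with u2 = x̂1 the total cost decomposes as k² · (signaling power) + MMSE:

```python
import random

random.seed(0)

# Illustrative parameters (assumed, not from the slides).
k, sigma0, n = 0.2, 5.0, 200_000
lam = 0.9  # linear first-stage gain: x1 = lam * x0

power = mmse = total = 0.0
for _ in range(n):
    x0 = random.gauss(0.0, sigma0)
    u1 = (lam - 1.0) * x0            # first stage: x1 = x0 + u1 = lam * x0
    x1 = x0 + u1
    y = x1 + random.gauss(0.0, 1.0)  # blurry observation y = x1 + z
    # MMSE estimate of x1 from y (exact for this linear/Gaussian strategy)
    x1_hat = (lam**2 * sigma0**2) / (lam**2 * sigma0**2 + 1.0) * y
    u2 = x1_hat                      # second stage: x2 = x1 - u2
    x2 = x1 - u2
    power += u1**2 / n
    mmse += (x1 - x1_hat)**2 / n
    total += (k**2 * u1**2 + x2**2) / n

# Total cost equals k^2 * (signaling power) + (estimation error).
print(total, k**2 * power + mmse)
```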
Linear, Quadratic, Gaussian... and yet:
Nonlinear strategies can outperform linear ones [Witsenhausen '68], by an unbounded factor [Mitter, Sahai '99].
Finding the optimal strategy is NP-hard [Papadimitriou, Tsitsiklis '84].
Semi-exhaustive search techniques [Baglietto, Parisini, Zoppoli '01] [Lee, Lau, Ho '01] [Lee, Marden, Shamma '09].
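To see the gap concretely, here is a Monte Carlo sketch (the parameters k = 0.2, σ0 = 5 are assumptions chosen for illustration): a simple two-point "quantization" first stage with its matched estimator beats every linear strategy by a wide margin:

```python
import math
import random

random.seed(1)

# Illustrative parameters (assumed, not from the slides).
k, sigma0, n = 0.2, 5.0, 50_000
samples = [(random.gauss(0.0, sigma0), random.gauss(0.0, 1.0)) for _ in range(n)]

def cost(first_stage, estimator):
    """Monte Carlo estimate of J = k^2 E[u1^2] + E[(x1 - x1_hat)^2]."""
    j = 0.0
    for x0, z in samples:
        x1 = first_stage(x0)
        x1_hat = estimator(x1 + z)   # second stage sees y = x1 + z
        j += (k**2 * (x1 - x0)**2 + (x1 - x1_hat)**2) / n
    return j

def linear_cost(lam):
    # x1 = lam * x0 with its matched linear MMSE estimator of x1 from y
    g = lam**2 * sigma0**2 / (lam**2 * sigma0**2 + 1.0)
    return cost(lambda x0: lam * x0, lambda y: g * y)

best_linear = min(linear_cost(i / 10) for i in range(16))  # lam in [0, 1.5]

# Two-point "quantization" strategy: x1 = a * sign(x0) carries one bit;
# its matched estimator is E[x1 | y] = a * tanh(a * y).
a = sigma0
two_point = cost(lambda x0: a * math.copysign(1.0, x0),
                 lambda y: a * math.tanh(a * y))

print(best_linear, two_point)  # the nonlinear strategy is far cheaper here
```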
A deterministic abstraction [Avestimehr, Diggavi, Tse '08] [Grover, Sahai '09]
[Figure: x0 written as a bit string b1 b2 b3 b4 b5 passes through encoder E and decoder D to produce x̂1; the noise z obscures the low-order bits]
In the deterministic model, the first-stage input acts on the binary expansion of the state: u1 forces the low-order bits of x0 = 0.01011 to zero, leaving x1 = 0.01 or x1 = 0.10. Quantization!
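A minimal sketch of this bit-level picture (the quantize helper is hypothetical, not from the deck): rounding x0 = 0.01011 (binary, i.e. 0.34375 in decimal) to a coarser binary grid clears the low-order bits using only a small u1:

```python
def quantize(x, frac_bits):
    """Round x to the nearest multiple of 2**-frac_bits,
    forcing all lower-order binary digits to zero."""
    scale = 2 ** frac_bits
    return round(x * scale) / scale

x0 = 0.34375           # binary 0.01011
x1 = quantize(x0, 2)   # binary 0.01  (0.25)
u1 = x1 - x0           # small input that clears the low-order bits
print(x1, quantize(x0, 1), u1)   # 0.25 0.5 -0.09375
```

With 1 surviving fractional bit the same x0 rounds to binary 0.10, matching the two outcomes on the slide.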
11
hope : can use laws of large numbers to simplify
min ⇢k2 m E ⇥ kum
1 k2⇤
+ 1 mE ⇥ kxm
1 b
xm
1 k2⇤
um
1
xm
1
b xm
1
+ +
x1
E D
x0 u1
min
⇥ u2
1
⇤ + E ⇥ (x1 − b x1)2⇤ u2
1
b x1 x1
xm
um
1
xm
1
b xm
1
zm
∼ N(0, I)
∼ N(0, σ2
0I)
xm
/22
Asymptotic upper bound: C̄ = k²P + MMSE.
[Figure: x0^m concentrates near a sphere of radius √(mσ0²); u1 moves it to a quantization point, and the noise sphere around each point must remain resolvable for the blurry controller]
For the vector problem, the quantization-based upper bound and the vector lower bound are within a constant factor [Grover, Sahai '08].
[Plot: ratio of upper bound to lower bound vs log10(k) and log10(σ0); the ratio is < 4.13]
14
−4 −2 2 −2 −1 1 2 1 2 3 4 5 log10(k) log10(σ0
2)
Ratio
Upper Bound Lower Bound < 2
/22
Numerically obtained strategies [Baglietto, Parisini, Zoppoli '01] [Lee, Lau and Ho '01] look like slanted quantization: x1 is roughly a quantization of x0 plus a linear component αx0.
[Figure: x1 vs x0 for these strategies]
[Figure: a quantization strategy for x0 with quantization-point spacing rp and decoding radius rc; define ζ = rc/rp]
[Plot: the ratio of the scalar upper bound (quantization/linear) to the vector lower bound, vs log10(k) and log10(σ0), diverges to infinity: the old lower bound is too loose at blocklength 1.]

We need tighter lower bounds for tiny blocklengths [Shannon '59] [Shannon, Gallager, Berlekamp '67] [Blahut '74] [Pinsker '67] [Sahai '06] [Sahai, Grover '07] [Polyanskiy, Poor, Verdu '08], and in particular bounds that work in a distortion setting!

Theorem [Grover, Sahai '08]:
C̄min ≥ inf_{P ≥ 0} k²P + ((√κ(P) − √P)⁺)², where κ(P) = σ0² / ((σ0 + √P)² + 1).
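This scalar lower bound is easy to evaluate numerically; here is a sketch using a plain grid search over P (the parameters k = 0.2, σ0 = 5 and the grid range [0, 100] are illustrative assumptions):

```python
import math

def lower_bound(k, sigma0):
    """Grid evaluation of inf_{P >= 0} k^2 P + ((sqrt(kappa(P)) - sqrt(P))^+)^2
    with kappa(P) = sigma0^2 / ((sigma0 + sqrt(P))^2 + 1)."""
    best = float("inf")
    for i in range(100_001):
        P = i * 1e-3   # grid over [0, 100]; assumed wide enough here
        kappa = sigma0**2 / ((sigma0 + math.sqrt(P))**2 + 1.0)
        gap = max(math.sqrt(kappa) - math.sqrt(P), 0.0)
        best = min(best, k**2 * P + gap**2)
    return best

print(lower_bound(0.2, 5.0))
```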
Large-deviation technique to tighten the lower bound: noise can behave atypically! Atypical noise behaviors are typical under a different distribution, so evaluate the estimation error under a Gaussian test noise of variance σG² ≥ 1:

C̄min ≥ inf_{P ≥ 0} k²P + ((√κ(P) − √P)⁺)², where κ(P) = σ0² σG² / ((σ0 + √P)² + σG²).

[Figure: the noise sphere around x1^m, including atypically large noise realizations]
[Plot: MMSE = E[(x1 − x̂1)²] vs power P = E[u1²] for test-noise parameters σG = 1, 1.25, 2.1, 3.0, 3.9, together with the resulting scalar lower bound, for the scalar system x1 = x0 + u1, y = x1 + z, z ∼ N(0, 1)]
Conclusions
Implicit communication promises substantial gains . . . and can be understood using information theory.
Deterministic abstractions yield useful insights.
Large-deviation techniques are needed to obtain finite-length results.
Theorem (finite blocklength). For σG² ≥ 1 and L > 0,

J̄min(m, k², σ0²) ≥ inf_{P ≥ 0} k²P + η(P, σ0², σG², L),

where

η(P, σ0², σG², L) = σG^m / (c_m(L) exp(mL²(σG² − 1)/2)) · ((√κ₂(P, σ0², σG², L) − √P)⁺)²,

κ₂(P, σ0², σG², L) := σ0² σG² / (c_m^{2/m}(L) e^{1 − d_m(L)} ((σ0 + √P)² + d_m(L) σG²)),

c_m(L) := 1 / Pr(‖Z^m‖² ≤ mL²) = (1 − ψ(m, L√m))⁻¹,

d_m(L) := Pr(‖Z^{m+2}‖² ≤ mL²) / Pr(‖Z^m‖² ≤ mL²) = (1 − ψ(m+2, L√m)) / (1 − ψ(m, L√m)).
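The constants c_m(L) and d_m(L) are sphere probabilities for i.i.d. Gaussian vectors, so they can be sanity-checked by Monte Carlo; a minimal sketch with arbitrary illustrative values m = 2, L = 1.5:

```python
import random

random.seed(0)

def prob_in_sphere(dim, radius_sq, trials=200_000):
    """Monte Carlo estimate of Pr(||Z^dim||^2 <= radius_sq), Z ~ N(0, I)."""
    hits = 0
    for _ in range(trials):
        if sum(random.gauss(0.0, 1.0)**2 for _ in range(dim)) <= radius_sq:
            hits += 1
    return hits / trials

m, L = 2, 1.5                            # arbitrary illustrative values
p_m = prob_in_sphere(m, m * L**2)
p_m2 = prob_in_sphere(m + 2, m * L**2)   # note: the threshold stays m * L^2
c_m = 1.0 / p_m                          # c_m(L)
d_m = p_m2 / p_m                         # d_m(L), always < 1
print(c_m, d_m)
```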