Dahleh, MTNS, Kyoto 2006
Fundamental Limitation of Noise Cancellation Imposed by Causality, Stability, and Channel Capacity
Munther A. Dahleh
Laboratory for Information and Decision Systems Massachusetts Institute of Technology
Acknowledgment
This work is based on the PhD work of, as well as subsequent collaborations with, Nuno Martins.

N.C. Martins and M.A. Dahleh. Feedback Control of Noisy Channels: Bode-Like Fundamental Limitation of Performance. Revised and resubmitted to IEEE Trans. Automatic Control.
Attenuation with Side Information. Accepted for publication in IEEE Trans. Automatic Control.
Outline
A Simple Network Problem

Setup: a disturbance d is measured by an encoder, transmitted over a channel, and reconstructed by a decoder that produces the cancellation signal u; the residual error is e.

Objective: attenuate the effect of d on e.
Requires: a causal, stabilizing encoder/decoder pair operating over a finite-capacity channel.
IT: Entropy (1)

Let z be a discrete random variable with alphabet Z = {1, ..., M}. Entropy:

H(z) = E[ log₂ 1/p_z(z) ] = −E[ log₂ p_z(z) ]

Assign ⌈log 1/p_z(z)⌉ bits to each z; the expected size becomes E[ ⌈log 1/p_z(z)⌉ ], and dropping the ceiling, −log p_z(z), is valid asymptotically for sequences.
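The coding interpretation above can be sanity-checked numerically; a minimal sketch (function names are ours, not from the talk), using a dyadic distribution for which the ⌈log 1/p⌉-bit assignment meets the entropy exactly:

```python
import math

def entropy(p):
    """H(z) = -sum p(z) * log2 p(z), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def expected_code_length(p):
    """Expected size when each outcome z gets ceil(log2(1/p(z))) bits."""
    return sum(pi * math.ceil(math.log2(1 / pi)) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]      # dyadic distribution on Z = {1, ..., 4}
H = entropy(p)                     # 1.75 bits
avg_len = expected_code_length(p)  # matches H exactly for dyadic p

H_unif = entropy([0.25] * 4)       # log2(4) = 2 bits
```

For non-dyadic distributions the same code-length assignment stays within one bit of H, which is why the statement holds only asymptotically for sequences.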
IT: Entropy (2)

Properties of entropy:

H(z) ≥ 0, with equality iff z is deterministic.
H(z) ≤ log₂ M, with equality iff p_z(z) = 1/M for all z.

Entropy is a measure of randomness: z can be represented with 2^{H(z)} uniformly distributed symbols (H(z) bits).
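A quick numerical check of these properties (illustrative, not from the talk): the entropy of random distributions on an alphabet of size M never exceeds log₂ M, the uniform distribution attains it, and a deterministic outcome gives 0:

```python
import math
import random

def entropy(p):
    """H in bits of a probability vector p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

random.seed(0)
M = 8
max_H = 0.0
for _ in range(1000):
    w = [random.random() for _ in range(M)]
    s = sum(w)
    max_H = max(max_H, entropy([wi / s for wi in w]))  # never exceeds log2(M)

H_uniform = entropy([1.0 / M] * M)            # log2(8) = 3 bits, the maximum
H_point = entropy([1.0] + [0.0] * (M - 1))    # deterministic: 0 bits
```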
IT: Differential Entropy

"Continuous" random variables: assume that w is a random variable with alphabet ℝⁿ and density f_w. Differential entropy or "entropy density":

h(w) = −E[ log f_w(w) ]

If f_Δ is a quantizer of sensitivity Δ, then H( f_Δ(w) ) − n log(1/Δ) → h(w) as Δ → 0: representing w to resolution Δ takes about that many uniformly distributed elements.
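The quantization limit can be illustrated numerically; a sketch assuming a standard Gaussian w (so h(w) = ½ log₂(2πeσ²) ≈ 2.05 bits, n = 1) and a uniform quantizer of width Δ:

```python
import math

sigma = 1.0
h_true = 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)   # ~2.047 bits

def gauss_cdf(x):
    return 0.5 * (1 + math.erf(x / (sigma * math.sqrt(2))))

def quantized_entropy(delta):
    """H (bits) of a zero-mean Gaussian quantized to bins of width delta."""
    n = int(10 * sigma / delta)        # bins covering +/- 10 sigma
    total = 0.0
    for kk in range(-n, n):
        pbin = gauss_cdf((kk + 1) * delta) - gauss_cdf(kk * delta)
        if pbin > 0:
            total -= pbin * math.log2(pbin)
    return total

# H(f_delta(w)) - log2(1/delta) -> h(w) as delta -> 0
approx = [quantized_entropy(d) - math.log2(1 / d) for d in (0.5, 0.1, 0.02)]
```

The successive values of `approx` converge to `h_true` as the quantizer is refined.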
IT: Mutual Information

"Continuous" random variables:

I(w₁, w₂) ≜ lim_{Δ→0} I( f_Δ(w₁), f_Δ(w₂) ) = h(w₁) − h(w₁ | w₂)

"Continuous" and discrete random variables:

I(z, w) = lim_{Δ→0} I( z, f_Δ(w) ) = h(w) − h(w | z)

The function of the quantizer is to extract as "much" information as possible from the continuous random variables.
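The mixed discrete/continuous formula I(z, w) = h(w) − h(w|z) can be evaluated directly; a sketch with an illustrative binary source z ∈ {−1, +1} observed through additive Gaussian noise (our example, not the talk's):

```python
import math

sigma = 0.5                     # noise std; z in {-1, +1} equiprobable

def gauss_pdf(x, mu):
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def f_w(x):
    """Density of w = z + noise: an equal mixture of two Gaussians."""
    return 0.5 * gauss_pdf(x, 1.0) + 0.5 * gauss_pdf(x, -1.0)

def h_w(dx=0.001):
    """Differential entropy of w in bits, by Riemann sum on [-6, 6]."""
    total = 0.0
    for i in range(int(12 / dx)):
        f = f_w(-6.0 + i * dx)
        if f > 1e-300:
            total -= f * math.log2(f) * dx
    return total

h_w_given_z = 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)  # = h(noise)
I = h_w() - h_w_given_z        # I(z, w): between 0 and H(z) = 1 bit
```

Because z carries one bit and the noise is moderate, I lands just below 1 bit.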
IT: Information Rate

Definition in terms of rates. Consider two "continuous" or discrete stochastic processes z^k = ( z(1), ..., z(k) ) and w^k = ( w(1), ..., w(k) ).

(Information rate)  I_∞(z, w) = lim_{k→∞} (1/k) I(z^k, w^k)

(Entropy rate)  h_∞(z) = lim_{k→∞} (1/k) h(z^k)

Maximum reliable bit-rate: used to define capacity.
IT: Properties of Mutual Information

Properties:

I( z, (w, v) ) − I(z, v) = I(z, w | v) = h(z | v) − h(z | w, v)

"On average", given v, how much more information about z can I get from w?

For an asymptotically stationary process z with spectral density S_z:

h_∞(z) ≤ (1/4π) ∫_{−π}^{π} log( 2πe S_z(ω) ) dω

h(z^k) ≤ (1/2) log( (2πe)^k det Cov(z^k) )
Conditional Entropy

H(z₁ | z₂) = H(z₁, z₂) − H(z₂) = −E[ log p_{z₁|z₂}(z₁ | z₂) ]

Mutual information: uncertainty reduction. How much information does z₂ carry about z₁?

I(z₁, z₂) = H(z₁) − H(z₁ | z₂)
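A minimal numerical check of these identities on a made-up joint distribution:

```python
import math

# Made-up joint distribution p(z1, z2) on {0, 1} x {0, 1}
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(dist):
    """Entropy in bits of a dict of probabilities."""
    return -sum(v * math.log2(v) for v in dist.values() if v > 0)

p1 = {a: sum(v for (x, _), v in p.items() if x == a) for a in (0, 1)}  # marginal of z1
p2 = {b: sum(v for (_, y), v in p.items() if y == b) for b in (0, 1)}  # marginal of z2

H1_given_2 = H(p) - H(p2)        # H(z1|z2) = H(z1, z2) - H(z2)
I = H(p1) - H1_given_2           # I(z1, z2) = H(z1) - H(z1|z2)
I_sym = H(p1) + H(p2) - H(p)     # symmetric form; equals I
```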
Information Transmission: Shannon Channel

X → Compression → Z (source distribution P(Z)) → Channel Encoder → Channel P(Y|Z) → Y → Channel Decoder → Decompress → X̂
Bode's Integral Limitation (LTI)

Linear feedback scheme, with sensitivity function S relating the spectra of the disturbance d and the error e:

S_e(ω) = S(ω) S_d(ω)

Assume that d is asymptotically stationary. Basic result:

(1/2π) ∫_{−π}^{π} log S(ω) dω ≥ Σ_{i: |λ_i(A)| > 1} log |λ_i(A)|
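The integral can be verified numerically with an illustrative loop (our example, not from the talk): the open loop L(z) = k/(z − a) with unstable pole a = 2 and gain k = 1.5 gives the stable sensitivity S(z) = (z − 2)/(z − 0.5), and the left-hand side evaluates to log 2:

```python
import math

# Illustrative loop: L(z) = k/(z - a), unstable pole a = 2.
# S(z) = 1/(1 + L(z)) = (z - a)/(z - (a - k)); k = 1.5 puts the
# closed-loop pole at 0.5, so the loop is stable.
a, k = 2.0, 1.5

def log_abs_S(wfreq):
    z = complex(math.cos(wfreq), math.sin(wfreq))
    return math.log(abs((z - a) / (z - (a - k))))

# (1/2pi) * integral of log|S| over [-pi, pi), Riemann sum on a fine grid
N = 20000
integral = sum(log_abs_S(-math.pi + 2 * math.pi * i / N) for i in range(N)) / N

bode_rhs = math.log(a)   # sum of log|lambda| over unstable open-loop poles
```

The positive integral says attenuation at some frequencies must be paid for with amplification elsewhere, in an amount fixed by the unstable pole.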
Bode's Integral Limitation (Extensions)

Extensions: multivariable, time-varying, ... Information-theoretic interpretation. Extensions to classes of non-linear systems, under the assumption of convergence of the Fourier transform (deterministic).

Question: using an arbitrary causal map (around the plant P) from d and e to u, can we beat

(1/2π) ∫_{−π}^{π} log S(ω) dω ≥ Σ_{i: |λ_i(A)| > 1} log |λ_i(A)| ?
Bode's Integral Limitation (Extension 1)

Let u be generated by an arbitrary deterministic, causal (with delay) map, so that e = h(d, u). Then

h_∞(e) = h_∞(d)

Using Property III, h_∞(z) ≤ (1/4π) ∫_{−π}^{π} log( 2πe S_z(ω) ) dω, and taking d Gaussian (so that the bound holds with equality for d):

(1/4π) ∫_{−π}^{π} log S_e(ω) dω ≥ (1/4π) ∫_{−π}^{π} log S_d(ω) dω

and hence, with S_e(ω) = S(ω) S_d(ω),

(1/2π) ∫_{−π}^{π} log S(ω) dω ≥ 0
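The key spectral fact behind this extension can be checked numerically (illustrative filters, ours): if e = h ∗ d with h causal and h(0) = 1, then S_e(ω) = |H(e^{jω})|² S_d(ω), and (1/2π)∫ log|H(e^{jω})| dω is zero for minimum-phase h and positive otherwise, so the log-sensitivity integral cannot be negative:

```python
import math

def mean_log_abs(coeffs, N=20000):
    """(1/2pi) * integral of log|H(e^{jw})| dw, H(z) = sum_k coeffs[k] z^{-k}."""
    total = 0.0
    for i in range(N):
        wfreq = -math.pi + 2 * math.pi * i / N
        z_inv = complex(math.cos(wfreq), -math.sin(wfreq))   # e^{-jw}
        H = sum(c * z_inv ** k for k, c in enumerate(coeffs))
        total += math.log(abs(H))
    return total / N

# Causal filters with h(0) = 1, so the integral is >= log|h(0)| = 0.
min_phase = mean_log_abs([1.0, 0.8])       # zero at -0.8, inside: exactly 0
non_min_phase = mean_log_abs([1.0, 1.5])   # zero at -1.5, outside: log 1.5
```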
Proof of Extension (1)

With u a strictly causal deterministic function of the measurements, u^{k−1} is determined by d^{k−1}, so

I( d(k), u^{k−1} | d^{k−1} ) = h( d(k) | d^{k−1} ) − h( d(k) | d^{k−1}, u^{k−1} ) = 0

Since e(k) − d(k) is determined by u^{k−1}, and u^{k−1} is a function of e^{k−1} (equivalently of d^{k−1}):

h( e(k) | e^{k−1} ) = h( e(k) | u^{k−1}, e^{k−1} ) = h( e(k) | u^{k−1}, d^{k−1} ) = h( d(k) | u^{k−1}, d^{k−1} ) = h( d(k) | d^{k−1} )

Summing over k gives h_∞(e) = h_∞(d).
Limitations Imposed by Stability

Arbitrary causal control over the channel, with initial condition X(0) and error signal e. A necessary condition for asymptotic stability:

I_∞( x(0), e ) ≥ Σ_{unstable} log |λ_i(A)|

equivalently, Rate of Transmission ≥ Σ_{unstable} log |λ_i(A)|.
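The rate condition can be illustrated with a quantized control simulation (a sketch, not the talk's construction): stabilizing x⁺ = a·x + u with a = 2.5 needs more than log₂ 2.5 ≈ 1.32 bits per step, so a 1-bit quantizer fails while a 3-bit quantizer keeps the state bounded:

```python
# Illustrative sketch: stabilizing x+ = a x + u through an R-bit
# quantized measurement requires R > log2(a) bits per step.
a = 2.5   # unstable eigenvalue: log2(2.5) ~ 1.32 bits needed

def simulate(R, x0=0.9, steps=60):
    """Peak |x| under u = -a * quantize(x), R-bit uniform quantizer on [-1, 1]."""
    levels = 2 ** R
    x = x0
    peak = abs(x)
    for _ in range(steps):
        i = min(levels - 1, max(0, int((x + 1) / 2 * levels)))   # saturating bin index
        xq = -1 + (2 * i + 1) / levels                           # bin midpoint
        x = a * (x - xq)
        peak = max(peak, abs(x))
    return peak

peak_1bit = simulate(R=1)   # rate below the bound: state diverges
peak_3bit = simulate(R=3)   # rate above the bound: state stays bounded
```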
Bode's Integral Limitation (Extension 2)

How can Bode be derived from information theory? With an arbitrary causal controller and initial condition X(0):

h_∞(e) ≥ h_∞(d) + I_∞( x(0), e ),    I_∞( x(0), e ) ≥ Σ_{unstable} log |λ_i(A)|

Combining with (1/4π) ∫_{−π}^{π} log( 2πe S_e(ω) ) dω ≥ h_∞(e) and S_e(ω) ≜ S(ω) S_d(ω):

(1/2π) ∫_{−π}^{π} log S(ω) dω ≥ Σ_{unstable} log |λ_i(A)|

(I.T. Bode ⇒ Bode's inequality.)
Bode Integral Limitation: Push-Pop

Comparison with Bode's integral formula, in terms of the disturbance d(t) and the error e(t):

(1/4π) ∫_{−π}^{π} log [ S_e(ω) / S_d(ω) ] dω ≥ Σ_{O.L. unstable poles} log |λ|

log(S_e/S_d) pushed down over some frequencies must pop up elsewhere.
Feedforward Noise Cancellation

Setup: the disturbance d(k) is Gaussian noise shaped by G(z) and reaches the summing junction after a delay of m steps. An encoder observes the disturbance and transmits c(k) over a Gaussian channel with additive noise q(k) and noise variance σ_c² (signals u(k), v(k), r(k) as in the block diagram); a decoder produces the cancellation signal u(k). The channel input c(k) satisfies a power constraint.
Feedforward Noise Cancellation

Over LTI encoder–decoder pairs (the optimizing solution involves G^{−1}):

min_{E,D ∈ LTI} (1/4π) ∫_{−π}^{π} log [ S_u(ω) / S_d(ω) ] dω = −(1/2) log( 1 + σ_c^{−2} )

using h_∞(u) = (1/4π) ∫_{−π}^{π} log( 2πe S_u(ω) ) dω. The right-hand side is −C, where C is the capacity of the Gaussian channel.
Observations (Zang & Iglesias)
Feedforward Noise Cancellation: Early Warning

How much can we attenuate the entropy of e? Same setup as before, with the loop sensitivity 1/(1 + PK) and error e(k). From the previous observation:

inf_{E,D} (1/4π) ∫_{−π}^{π} log [ S_e(ω) / S_d(ω) ] dω = inf_{E,D} (1/4π) ∫_{−π}^{π} log [ S_u(ω) / S_d(ω) ] dω + Σ_{O.L. unstable poles} log |λ|
Feedforward Noise Cancellation: Early Warning

For the Gaussian channel with capacity C:

min_{E,D ∈ LTI} (1/4π) ∫_{−π}^{π} log [ S_u(ω) / S_d(ω) ] dω = −(1/2) log( 1 + σ_c^{−2} ) = −C

The final result is:

inf_{E,D ∈ LTI} (1/4π) ∫_{−π}^{π} log [ S_e(ω) / S_d(ω) ] dω = Σ_{O.L. unstable poles} log |λ| − C
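The right-hand side Σ log|λ| − C can be tabulated for concrete numbers; a small sketch (illustrative values; natural logarithms, so C is in nats):

```python
import math

def gaussian_capacity(sigma_c_sq, power=1.0):
    """C = (1/2) log(1 + P / sigma_c^2), nats per channel use (illustrative P)."""
    return 0.5 * math.log(1.0 + power / sigma_c_sq)

def attenuation_bound(unstable_poles, sigma_c_sq):
    """Best achievable (1/4pi) int log(S_e/S_d) dw: sum log|lambda| minus capacity."""
    return sum(math.log(abs(lam)) for lam in unstable_poles) - gaussian_capacity(sigma_c_sq)

bound_quiet = attenuation_bound([2.0], 0.1)   # low-noise channel: more attenuation possible
bound_noisy = attenuation_bound([2.0], 1.0)   # noisy channel: less attenuation possible
```

As the channel degrades, capacity shrinks and the achievable attenuation shrinks with it.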
Fundamental Limitation: Early Warning

With side information (encoder E and decoder D around the Gaussian channel):

(1/4π) ∫_{−π}^{π} log [ S_e(ω) / S_d(ω) ] dω ≥ Σ_{O.L. unstable poles} log |λ| − C

Comparison with Bode's integral formula (from d(t) to e(t), no channel):

(1/4π) ∫_{−π}^{π} log [ S_e(ω) / S_d(ω) ] dω ≥ Σ_{O.L. unstable poles} log |λ|
Feedforward Noise Cancellation: General Channel

What if we extend the search to non-linear and time-varying E and D, as well as an arbitrary channel? Still, we get:

inf_{E,D} (1/4π) ∫_{−π}^{π} log [ S_e(ω) / S_d(ω) ] dω ≥ Σ_{O.L. unstable poles} log |λ| − C

The proof follows the same entropy-rate and information-rate arguments as before.
Feedforward Noise Cancellation: Most General

Replace the encoder and decoder by an arbitrary system K with two inputs (the m-step-delayed disturbance and the channel output). The bound

inf_{E,K} (1/4π) ∫_{−π}^{π} log [ S_e(ω) / S_d(ω) ] dω ≥ Σ_{P unstable poles} log |λ| − C

still holds.
Feedback Noise Cancellation

Now the channel sits inside the loop: plant P, filter G, white Gaussian channel noise c(k), disturbance d(k), channel input z(k), channel output v(k), and signals w(k), u(k), e(k). How is the channel in the loop affecting what we can do in terms of the sensitivity S, where

S_e(ω) = S(ω) S_d(ω) ?
Feedback Noise Cancellation (Gaussian)

Information flux equality for the linear and time-invariant case (encoder E, decoder D in the loop, white Gaussian channel noise):

I_∞(v; z) = I_∞(w; u) + Σ_{O.L. unstable poles} log |λ|

Proof: by direct computation of the two information rates across the Gaussian channel,

I_∞(v; z) = (1/4π) ∫_{−π}^{π} log[ ( 1 + |EPD(e^{jω})|² ) ( 1 + |EPG(e^{jω})|² σ_c^{−2} ) ] dω

I_∞(w; u) = (1/4π) ∫_{−π}^{π} log( 1 + |EPG(e^{jω})|² σ_c^{−2} ) dω
Feedback Noise Cancellation

What does this mean in terms of the sensitivity S, where S_e = S S_d?

Fact 1 (not obvious):

S_*(ω) ≜ min_D S(ω) = ( 1 + |EPG(e^{jω})|² σ_c^{−2} )^{−1/2}

It follows:

I_∞(w; u) = −(1/2π) ∫_{−π}^{π} log S_*(ω) dω
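The step from Fact 1 to the expression for I_∞(w; u) is a pointwise identity, −log S_*(ω) = ½ log(1 + |EPG(e^{jω})|² σ_c^{−2}), which can be sanity-checked numerically with an arbitrary illustrative response standing in for E·P·G:

```python
import math

sigma_c_sq = 0.5   # channel noise variance (illustrative)

def EPG(wfreq):
    """Illustrative stable frequency response standing in for E*P*G."""
    z = complex(math.cos(wfreq), math.sin(wfreq))
    return 2.0 / (z - 0.5)

N = 4000
lhs = 0.0   # -(1/2pi) int log S_*(w) dw, S_* = (1 + |EPG|^2/sigma_c^2)^(-1/2)
rhs = 0.0   # (1/4pi) int log(1 + |EPG|^2/sigma_c^2) dw  (the I_inf(w; u) integral)
for i in range(N):
    wfreq = -math.pi + 2 * math.pi * i / N
    snr = abs(EPG(wfreq)) ** 2 / sigma_c_sq
    lhs += -math.log((1.0 + snr) ** -0.5) / N
    rhs += 0.5 * math.log(1.0 + snr) / N
```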
Feedback Noise Cancellation

Reminder:

I_∞(v; z) + (1/4π) ∫_{−π}^{π} log S(ω) dω ≥ Σ_{O.L. unstable poles} log |λ|

Bode anyway ... We can strengthen the bound by using:

max{ S(ω), 1 } ≤ ( 1 + |EPG(e^{jω})|² σ_c^{−2} )^{1/2}

After substitutions, we get:

I_∞(v; z) + (1/4π) ∫_{−π}^{π} [ log S(ω) ]⁻ dω ≥ Σ_{O.L. unstable poles} log |λ|

where [x]⁻ ≜ min{ x, 0 }.
Feedback Noise Cancellation

Contrasting Bode with the new limitation ( [x]⁺ ≜ max{x, 0}, [x]⁻ ≜ min{x, 0} ):

Bode:

(1/4π) ∫_{−π}^{π} [ log S(e^{jω}) ]⁺ dω + (1/4π) ∫_{−π}^{π} [ log S(e^{jω}) ]⁻ dω ≥ Σ_{P unstable poles} log |λ|

New inequality:

I_∞(v; z) + (1/4π) ∫_{−π}^{π} [ log S(e^{jω}) ]⁻ dω ≥ Σ_{P unstable poles} log |λ|

In the new inequality, the positive area of log S(ω) is replaced by the information rate I_∞(v; z).
Feedback Noise Cancellation (General Case)

What if we have a general memoryless channel and allow for non-linear, time-varying encoding and decoding?

I_∞(v; z) ≥ I_∞(w; u) + I_∞(X; e)

Stability ⇒ I_∞(X; e) ≥ Σ_{P unstable poles} log |λ|
Feedback Noise Cancellation (General Case)

Substitution gives:

I_∞(v; z) ≥ I_∞(w; u) + Σ_{P unstable poles} log |λ|

And the following remains valid:

I_∞(v; z) + (1/4π) ∫_{−π}^{π} [ log S(ω) ]⁻ dω ≥ Σ_{P unstable poles} log |λ|
Feedback Noise Cancellation: Numerical Example

Plant with an unstable pole, P(z) = ( 1 − 1.5 z^{−1} ) / ( 1 − 10 z^{−1} ), in the loop through z^{−1} P(z); disturbance d(t) = ε(t); control u(t) through the channel; signals c(t), e(t). The bound

I_∞(v; z) − Σ_i max{ log λ_i(A), 0 } ≥ −(1/4π) ∫_{−π}^{π} [ log S(ω) ]⁻ dω

is evaluated as Function 1 and Function 2 (its two sides).
Feedback Noise Cancellation: Numerical Example (continued)

[Plots: Function 1 and Function 2 versus the channel noise variance σ_c², and their difference Function 1 − Function 2.]

I_∞(w; z) − Σ_{unstable} log λ_i(A) ≥ −(1/2π) ∫_{−π}^{π} [ log S(e^{jω}) ]⁻ dω

I_∞(v; z) − Σ_i max{ log λ_i(A), 0 } ≥ −(1/4π) ∫_{−π}^{π} [ log S(ω) ]⁻ dω   (Function 1, Function 2)
Conclusions

Noise cancellation is fundamentally limited by causality, stability, and finite-capacity channels.
Bode-like integral inequalities bound the achievable attenuation by the channel capacity in feedforward and feedback settings.
The achievable attenuation is explicitly bounded by the capacity of the channel.
The results extend to non-linear, time-varying schemes and to a general channel.