
SLIDE 1

Dahleh, MTNS, Kyoto 2006

Fundamental Limitation of Noise Cancellation Imposed by Causality, Stability, and Channel Capacity

Munther A. Dahleh

Laboratory for Information and Decision Systems Massachusetts Institute of Technology

SLIDE 2

Acknowledgment:

This work is based on the PhD work of, and subsequent collaborations with, Nuno Martins.

References:

  • N.C. Martins and M.A. Dahleh. Feedback Control of Noisy Channels: Bode-Like Fundamental Limitation of Performance. Revised and resubmitted to IEEE Trans. Automatic Control.
  • N. Martins, M.A. Dahleh, and J. Doyle. Fundamental Limitations of Disturbance Attenuation with Side Information. Accepted for publication in IEEE Trans. Automatic Control.

SLIDE 3

Outline

  • Information theory and its relations to control theory
  • Limitations imposed by causality
  • Limitations imposed by stability
  • Noise cancellation with noisy measurements
    • Measurements are communicated through a noisy channel
    • Noise reduction over all possible encoders and decoders
  • Feedforward and feedback setups:
    • Gaussian case
    • General case
  • Fundamental “flux” inequality
  • New lower bound
SLIDE 4

A Simple Network Problem

[diagram: disturbance d enters the PLANT; the output y passes through an encoder, a channel, and a decoder to produce the input u; e is the error]

Objective:

  • Stability and disturbance rejection

Requires:

  • Stability: the channel sends a reliable estimate of the state
  • Performance: the channel has to send information about the disturbance

SLIDE 5

Information Theory

SLIDE 6

IT: Entropy (1)

Information theory basic quantities: entropy.

  • Consider a random variable z with alphabet Z = {1, ..., M}.
  • What is the expected size of its binary expansion? E[⌈log2 z⌉].
  • Entropy is the expected size of the most bit-economic representation:

    H(z) = E[−log2 p_z(z)]

  • There is another representation where we assign ⌈−log2 p_z(z)⌉ bits to each z, and the expected size becomes E[⌈−log2 p_z(z)⌉] (valid asymptotically for sequences).
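These two quantities can be checked against each other in a few lines; a minimal sketch in Python, with an illustrative dyadic pmf of my choosing (not from the talk):

```python
import math

def entropy(p):
    """H(z) = E[-log2 p_z(z)], in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def expected_code_length(p):
    """Expected size when each z is assigned ceil(-log2 p_z(z)) bits."""
    return sum(pi * math.ceil(-math.log2(pi)) for pi in p if pi > 0)

# Illustrative dyadic pmf: the ceiling is exact, so the two quantities coincide.
p = [0.5, 0.25, 0.125, 0.125]
H = entropy(p)               # 1.75 bits
L = expected_code_length(p)  # also 1.75 bits; in general H <= L < H + 1
```

For non-dyadic pmfs the ⌈·⌉ assignment pays at most one extra bit per symbol, which is why the bit-economic interpretation of H(z) is exact only asymptotically for sequences.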

SLIDE 7

IT: Entropy (2)

Properties of entropy:

  • H(z) ≥ 0 (measure of randomness)
  • H(z) = log2 M when p_z is uniform; H(z) < log2 M otherwise
  • z can be represented as 2^{H(z)} uniformly distributed symbols
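A quick numeric check of these properties (M and the skewed pmf are arbitrary choices of mine):

```python
import math

def H(p):
    """Entropy in bits: H(z) = E[-log2 p_z(z)]."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

M = 8
uniform = [1 / M] * M                        # p_z uniform on {1, ..., M}
skewed = [0.5] + [0.5 / (M - 1)] * (M - 1)   # any non-uniform pmf

H_uniform = H(uniform)  # log2 M = 3 bits: maximal randomness
H_skewed = H(skewed)    # strictly below log2 M
```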

SLIDE 8

IT: Differential Entropy

“Continuous” random variables. Assume that w is a random variable with alphabet W = ℝ^n.

Differential entropy or “entropy density”:

  h(w) = lim_{Δ→0} [ H(f_Δ(w)) + log2 Δ ]

where f_Δ is a quantizer with sensitivity (bin width) Δ; the quantized variable takes on the order of 2^{h(w)}/Δ approximately uniformly distributed values per unit volume.
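The limit defining h(w) can be observed numerically: quantize a standard Gaussian with bin width Δ and watch H(f_Δ(w)) + log2 Δ approach 0.5·log2(2πe) ≈ 2.05 bits. A sketch (the ±10-standard-deviation truncation is my choice):

```python
import math

def Phi(x):
    """Standard normal CDF via math.erf."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def quantized_entropy(delta, span=10.0):
    """H(f_delta(w)) in bits for w ~ N(0,1), bins of width delta on [-span, span]."""
    H, x = 0.0, -span
    while x < span:
        p = Phi(x + delta) - Phi(x)
        if p > 0:
            H -= p * math.log2(p)
        x += delta
    return H

h_true = 0.5 * math.log2(2 * math.pi * math.e)       # differential entropy of N(0,1)
approx = quantized_entropy(0.02) + math.log2(0.02)   # H(f_delta(w)) + log2(delta)
```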

SLIDE 9

IT: Mutual Information

“Continuous” random variables:

  I(w1, w2) = lim_{Δ→0} I(f_Δ(w1), f_Δ(w2)) = h(w1) − h(w1 | w2)

“Continuous” and discrete random variables:

  I(z, w) = lim_{Δ→0} I(z, f_Δ(w)) = h(w) − h(w | z)

The function of the quantizer is to extract as “much” information as possible from the continuous random variables.
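For jointly Gaussian variables the differences of differential entropies give mutual information in closed form; a sketch (the unit variances and the correlation value are illustrative):

```python
import math

def h_gauss(det_cov, n):
    """Differential entropy (bits) of an n-dim Gaussian: 0.5*log2((2*pi*e)^n * det Cov)."""
    return 0.5 * math.log2((2 * math.pi * math.e) ** n * det_cov)

rho = 0.8                 # correlation of w1, w2 (unit variances)
det_joint = 1 - rho ** 2  # determinant of the 2x2 joint covariance

# I(w1, w2) = h(w1) - h(w1|w2) = h(w1) + h(w2) - h(w1, w2)
I = h_gauss(1.0, 1) + h_gauss(1.0, 1) - h_gauss(det_joint, 2)
I_closed = -0.5 * math.log2(1 - rho ** 2)   # textbook closed form
```

The scale-dependent 2πe terms cancel, which is why mutual information, unlike differential entropy itself, survives the quantization limit intact.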

SLIDE 10

IT: Information Rate

Definition in terms of rates. Consider two “continuous” or discrete stochastic processes:

  z^k = (z(1), ..., z(k)),   w^k = (w(1), ..., w(k))

Information rate:

  I_∞(z, w) = lim_{k→∞} I(z^k, w^k) / k

Entropy rate:

  h_∞(z) = lim_{k→∞} h(z^k) / k

Maximum reliable bit-rate: used to define capacity.
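For a stationary Gaussian process, h_∞(z) = (1/4π) ∫ log2(2πe S_z(ω)) dω. A numerical sketch with an AR(1) process z(k) = a·z(k−1) + n(k) of my choosing, whose entropy rate equals that of its white innovations:

```python
import math

def entropy_rate_spectral(S, N=20000):
    """h_inf(z) = (1/(4*pi)) * integral of log2(2*pi*e*S(w)) over [-pi, pi] (midpoint rule)."""
    dw = 2 * math.pi / N
    total = sum(math.log2(2 * math.pi * math.e * S(-math.pi + (i + 0.5) * dw)) for i in range(N))
    return total * dw / (4 * math.pi)

a, var_n = 0.9, 1.0   # AR(1): z(k) = a*z(k-1) + n(k), n white Gaussian

def S_z(w):
    """Power spectrum of the AR(1) process."""
    return var_n / abs(1 - a * complex(math.cos(w), -math.sin(w))) ** 2

h_rate = entropy_rate_spectral(S_z)
h_innov = 0.5 * math.log2(2 * math.pi * math.e * var_n)  # entropy rate of the innovations
```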

SLIDE 11

IT: Properties of Mutual Information

I (Positivity):

  I(z, w) = I(w, z) ≥ 0

II (Kolmogorov’s formula):

  I(z, (w, v)) − I(z, v) = I(z, w | v) = h(z | v) − h(z | w, v)

  “On average”, given v, how much more information about z can I get from w?

III (Spectral bound):

  h_∞(z) ≤ (1/4π) ∫_{−π}^{π} log(2πe S_z(ω)) dω

  h(z^k) ≤ (1/2) log( (2πe)^k det Cov(z^k) )
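Property III in the scalar case says h(z) ≤ (1/2) log(2πe Var z), with equality for the Gaussian; a one-variance comparison against a uniform variable (the variance value is arbitrary):

```python
import math

var = 2.0
bound = 0.5 * math.log2(2 * math.pi * math.e * var)       # Property III, scalar case, in bits

h_gaussian = 0.5 * math.log2(2 * math.pi * math.e * var)  # Gaussian: bound met with equality
L = math.sqrt(12 * var)        # a uniform on [0, L] has the same variance L^2/12
h_uniform = math.log2(L)       # its differential entropy falls strictly below the bound
```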

SLIDE 12

Conditional Entropy:

  H(z1 | z2) = H(z1, z2) − H(z2) = E[ −log2 p_{z1|z2}(z1 | z2) ]

Mutual information: uncertainty reduction. How much information does z2 carry about z1?

  I(z1, z2) = H(z1) − H(z1 | z2)
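These identities are easy to verify on a small joint pmf (the numbers are illustrative):

```python
import math

def H(ps):
    """Entropy in bits of a pmf given as an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

# joint pmf p(z1, z2) on {0, 1} x {0, 1}
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
p1 = [0.5, 0.5]   # marginal of z1
p2 = [0.6, 0.4]   # marginal of z2

H1, H2, H12 = H(p1), H(p2), H(joint.values())
H1_given_2 = H12 - H2        # H(z1|z2) = H(z1, z2) - H(z2)
I = H1 - H1_given_2          # I(z1, z2): uncertainty reduction about z1 from z2
```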

SLIDE 13

Information Transmission: Shannon

Source X → Compression → Channel Encoder → Z → Channel P(Y|Z) → Y → Channel Decoder → Decompression → X̂

Reliable transmission requires:

  H(X) < sup_{P(Z)} I(Z, Y)
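A minimal numeric reading of this condition, using the familiar AWGN capacity formula for the supremum (the SNR value and the Bernoulli source are illustrative):

```python
import math

def awgn_capacity(snr):
    """sup over inputs of I(Z, Y) for an AWGN channel: 0.5*log2(1 + SNR) bits per use."""
    return 0.5 * math.log2(1 + snr)

def H_bern(p):
    """Entropy of a Bernoulli(p) source, bits per symbol."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

C = awgn_capacity(3.0)            # 1 bit per channel use at SNR = 3
transmissible = H_bern(0.11) < C  # a ~0.5 bit/symbol source fits through the channel
```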

SLIDE 14

Bode Integral Formula: Causality Constraints

SLIDE 15

Bode’s Integral Limitation (LTI)

Linear feedback scheme. Assume that d is asymptotically stationary. Sensitivity function:

  S(ω) = S_e(ω) / S_d(ω)

Basic result:

  (1/2π) ∫_{−π}^{π} log S(ω) dω ≥ Σ_{i: |λ_i(A)| > 1} log λ_i(A)
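The basic result can be checked numerically on a toy LTI loop (the plant pole a = 1.5 and gain k = 1.2 are my illustrative choices, not from the talk); for an LTI loop with stationary d, S(ω) = S_e(ω)/S_d(ω) = |1/(1 + L(e^{jω}))|²:

```python
import cmath
import math

a, k = 1.5, 1.2   # open loop L(z) = k/(z - a): one unstable pole; closed loop stable at z = a - k

def sens(w):
    """Closed-loop sensitivity 1/(1 + L(e^{jw})) = (z - a)/(z - a + k)."""
    z = cmath.exp(1j * w)
    return (z - a) / (z - a + k)

N = 20000
dw = 2 * math.pi / N
# (1/2pi) * integral of log S(w) dw, with S = |sensitivity|^2
lhs = sum(2 * math.log(abs(sens(-math.pi + (i + 0.5) * dw))) for i in range(N)) * dw / (2 * math.pi)
rhs = math.log(a)  # sum of log(lambda_i(A)) over unstable eigenvalues
```

Here lhs evaluates to 2·log 1.5, so the inequality holds with room to spare: pushing the sensitivity down at some frequencies necessarily pops it up at others.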

SLIDE 16

Bode’s Integral Limitation (Extensions)

Extensions: multivariable, time-varying, ...

  • Freudenberg, Middleton, Seron, Braslavsky, Jie Chen, ...
  • Pablo Iglesias, et al.

Information-theoretic interpretation; extensions to classes of nonlinear systems, under the assumption of convergence of the Fourier transform (deterministic):

  • Goncalves and Doyle

Question: using an arbitrary causal map (signals d, e, u, y around the plant P), can we beat

  (1/2π) ∫_{−π}^{π} log S(ω) dω ≥ Σ_{i: |λ_i(A)| > 1} log λ_i(A) ?

SLIDE 17

Bode’s Integral Limitation (Extension 1)

Setup: e = d + u, with u produced by an arbitrary deterministic map, causal with delay. Then:

  h(e^k) = h(d^k),   h_∞(e) = h_∞(d)

By Property III,

  h_∞(z) ≤ (1/4π) ∫_{−π}^{π} log(2πe S_z(ω)) dω

so, for Gaussian d (where the bound is tight),

  (1/4π) ∫_{−π}^{π} log S_e(ω) dω ≥ (1/4π) ∫_{−π}^{π} log S_d(ω) dω

and with S(ω) = S_e(ω)/S_d(ω):

  (1/2π) ∫_{−π}^{π} log S(ω) dω ≥ 0

SLIDE 18

Proof of Extension (1)

Setup: e = d + u, with u an arbitrary deterministic causal-with-delay map, so u^k is determined by d^{k−1}. Proof:

  I(d(k), u^k | d^{k−1}) = 0

  I(d(k), u^k | d^{k−1}) = h(d(k) | d^{k−1}) − h(d(k) | u^k, d^{k−1})

  h(d(k) | u^k, d^{k−1}) = h(e(k) | u^k, d^{k−1}) = h(e(k) | u^k, e^{k−1}) = h(e(k) | e^{k−1})

  ⇒ h(d(k) | d^{k−1}) = h(e(k) | e^{k−1})
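The invertibility at the heart of this proof (the map d^k → e^k is one-to-one when u is causal with delay) can be illustrated by brute-force enumeration; the one-step feedback u(k) = e(k−1) is my example, not from the talk:

```python
import itertools

def run(d_seq):
    """e(k) = d(k) + u(k) with u(k) = e(k-1), a causal-with-delay function of the past."""
    e = []
    for k, dk in enumerate(d_seq):
        u = e[k - 1] if k > 0 else 0
        e.append(dk + u)
    return tuple(e)

k = 6
d_seqs = list(itertools.product([0, 1], repeat=k))
e_seqs = {run(d) for d in d_seqs}
# distinct d^k always yield distinct e^k, so H(e^k) = H(d^k) for any pmf on d^k
```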

SLIDE 19

Limitations Imposed by Stability

[diagram: plant P with initial condition X(0); an arbitrary map acts on the error e]

A necessary condition for asymptotic stability (rate of transmission):

  I_∞(X(0), e) ≥ Σ_{i: unstable} log λ_i(A)
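A hedged numeric reading of the rate condition (the eigenvalues below are made up): taking logs base 2 gives the necessary transmission rate in bits per step.

```python
import math

eigs = [2.0, 1.25, 0.5, -0.3]   # spectrum of A; only |lambda| > 1 contributes
min_rate_bits = sum(math.log2(abs(l)) for l in eigs if abs(l) > 1)
# a channel delivering fewer than min_rate_bits bits per step cannot stabilize the plant
```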

SLIDE 20

Bode’s Integral Limitation (Extension 2)

How can Bode be derived from information theory? Setup: plant P with initial condition X(0), arbitrary causal map, disturbance d, error e; S(ω) ≜ S_e(ω)/S_d(ω).

I.T. inequality:

  h_∞(e) ≥ h_∞(d) + I_∞(X(0), e)

Combining with the spectral bound h_∞(e) ≤ (1/4π) ∫_{−π}^{π} log(2πe S_e(ω)) dω and the stability condition I_∞(X(0), e) ≥ Σ_{i: unstable} log λ_i(A) gives Bode’s inequality:

  (1/2π) ∫_{−π}^{π} log S(ω) dω ≥ Σ_{i: unstable} log λ_i(A)

SLIDE 21

Bode Integral Limitation: Push-Pop

Comparison with Bode’s integral formula (plant P, controller K, disturbance d(t), error e(t)):

  (1/4π) ∫_{−π}^{π} log (S_e(ω)/S_d(ω)) dω ≥ Σ_{O.L. unstable poles} log |p_i|

[figure: log(S_e/S_d) versus ω; attenuation pushed down at some frequencies pops up at others]

SLIDE 22

Feedforward Noise Cancellation

SLIDE 23

Feedforward Noise Cancellation

[diagram: disturbance d(k), filter G(z), m-step delay, encoder E, Gaussian channel with noise c(k), decoder D; signals u(k), v(k), r(k), q(k)]

Problem:

  inf_{E, D} h_∞(u)

over E, D causal, LTI, with the channel input power constraint E v² ≤ 1.

SLIDE 24

Feedforward Noise Cancellation

  • If we search over linear, causal, and time-invariant E, D, we get:

    E* = G^{−1},   D* = z^{−m} G / (1 + σ_c²)

  • Optimal noise cancellation:

    min_{E,D ∈ LTI} (1/4π) ∫_{−π}^{π} log (S_u(ω)/S_d(ω)) dω = −(1/2) log(1 + 1/σ_c²)

  • Observe:

    h_∞(u) = (1/4π) ∫_{−π}^{π} log(2πe S_u(ω)) dω

  • Equivalently:

    h_∞(u) = h_∞(d) − C
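
In numbers (the noise variance is illustrative): the best LTI entropy reduction equals the Gaussian channel capacity under the unit input-power constraint.

```python
import math

def capacity(sigma_c2):
    """C = 0.5*log(1 + 1/sigma_c2) nats: AWGN capacity with unit input power."""
    return 0.5 * math.log(1 + 1 / sigma_c2)

sigma_c2 = 0.25
C = capacity(sigma_c2)
best_integral = -C   # min over LTI E, D of (1/4pi) * int log(S_u/S_d) dw
# equivalently h_inf(u) = h_inf(d) - C: the entropy rate drops by exactly the capacity
```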
SLIDE 25

Observations (Zang & Iglesias)

  • How does entropy change by stable filtering?
  • Consider H LTI, causal, with
  • If z_i are the unstable zeros of H

SLIDE 26

Feedforward Noise Cancellation: Early Warning

[diagram: same feedforward setup (d(k), G(z), m-step delay, encoder E, Gaussian channel with noise c(k), decoder D), with the closed-loop factor 1/(1 + PK) and error e(k)]

How much can we attenuate the entropy of e? From the previous observation:

  inf_{E,D} (1/4π) ∫_{−π}^{π} log (S_e(ω)/S_d(ω)) dω = inf_{E,D} (1/4π) ∫_{−π}^{π} log (S_u(ω)/S_d(ω)) dω + Σ_{O.L. unstable poles} log |p_i|

SLIDE 27

Feedforward Noise Cancellation: Early Warning

[same diagram; channel capacity = C]

Recall:

  min_{E,D ∈ LTI} (1/4π) ∫_{−π}^{π} log (S_u(ω)/S_d(ω)) dω = −(1/2) log(1 + 1/σ_c²) = −C

Final result:

  inf_{E,D ∈ LTI} (1/4π) ∫_{−π}^{π} log (S_e(ω)/S_d(ω)) dω = Σ_{O.L. unstable poles} log |p_i| − C

SLIDE 28

Fundamental Limitation: Early Warning

[same feedforward diagram]

With side information:

  (1/4π) ∫_{−π}^{π} log (S_e(ω)/S_d(ω)) dω ≥ Σ_{O.L. unstable poles} log |p_i| − C

Comparison with Bode’s integral formula (plant P, controller K, d(t), e(t)):

  (1/4π) ∫_{−π}^{π} log (S_e(ω)/S_d(ω)) dω ≥ Σ_{O.L. unstable poles} log |p_i|

[figure: log(S_e/S_d) versus ω]

SLIDE 29

Feedforward Noise Cancellation: General Channel

What if we extend the search to nonlinear and time-varying E and D, as well as an arbitrary channel? Still, we get:

  inf_{E,D} (1/4π) ∫_{−π}^{π} log (S_e(ω)/S_d(ω)) dω ≥ Σ_{O.L. unstable poles} log |p_i| − C

Proof:

  h_∞(u) ≥ h_∞(d | q) = h_∞(d) − I_∞(d; q),   I_∞(d; q) ≤ C

[diagram: same feedforward structure with a general noisy channel]

SLIDE 30

Feedforward Noise Cancellation: Most General

K arbitrary, with two inputs.

[diagram: d(k), G(z), m-step delay, encoder E, noisy channel with noise c(k), plant P, controller K, error e(k); signals u(k), v(k), r(k)]

  inf_{E,K} (1/4π) ∫_{−π}^{π} log (S_e(ω)/S_d(ω)) dω ≥ Σ_{unstable poles of P} log |p_i| − C

SLIDE 31

Feedback Noise Cancellation

SLIDE 32

Feedback Noise Cancellation

[diagram: plant P, encoder E, decoder D, filter G in a feedback loop; white Gaussian channel noise c(k); signals d(k), e(k), w(k), u(k), z(k), v(k)]

How is the channel in the loop affecting what we can do in terms of

  S = S_e / S_d ?

SLIDE 33

Feedback Noise Cancellation (Gaussian)

[same feedback diagram; channel noise c(k) white Gaussian with variance σ_c²]

Information flux equality for the linear and time-invariant case:

  I_∞(v; z) = I_∞(w; u) + Σ_{O.L. unstable poles} log |p_i|

Proof: both information rates admit closed-form spectral expressions in terms of |EPG|, |EPD|, and σ_c²; in particular,

  I_∞(w; u) = (1/4π) ∫_{−π}^{π} log ( 1 + |EPG(e^{jω})|² (1 + σ_c²)^{−1} ) dω

and the difference of the two integrals evaluates to Σ_{O.L. unstable poles} log |p_i|.

SLIDE 34

Feedback Noise Cancellation

What does this mean in terms of the sensitivity S = S_e/S_d?

Fact 1 (not obvious):

  S* ≜ min_D S = ( 1 + |EPG|² (1 + σ_c²)^{−1} )^{−1}

It follows:

  I_∞(w; u) = −(1/4π) ∫_{−π}^{π} log S*(ω) dω
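Fact 1’s consequence can be checked numerically; the transfer function EPG below is an illustrative stable first-order example of my choosing, not from the talk:

```python
import cmath
import math

sigma_c2 = 1.0   # illustrative channel noise variance

def EPG(w):
    """Hypothetical loop transfer function E*P*G evaluated at e^{jw}."""
    z = cmath.exp(1j * w)
    return 2.0 / (z - 0.5)

def S_star(w):
    """S*(w) = (1 + |EPG|^2/(1 + sigma_c^2))^{-1} from Fact 1."""
    return 1.0 / (1.0 + abs(EPG(w)) ** 2 / (1.0 + sigma_c2))

N = 20000
dw = 2 * math.pi / N
grid = [-math.pi + (i + 0.5) * dw for i in range(N)]

I_wu = -sum(math.log(S_star(w)) for w in grid) * dw / (4 * math.pi)
# the direct spectral expression for I_inf(w;u) should agree:
I_direct = sum(math.log(1 + abs(EPG(w)) ** 2 / (1 + sigma_c2)) for w in grid) * dw / (4 * math.pi)
```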

SLIDE 35

Feedback Noise Cancellation

[same feedback diagram; c(k) white Gaussian]

Reminder:

  I_∞(v; z) + (1/4π) ∫_{−π}^{π} log S(ω) dω ≥ Σ_{O.L. unstable poles} log |p_i|

Bode anyway. We can strengthen the bound by using:

  max{ 1, S } ≤ 1 + |EPG|² (1 + σ_c²)^{−1}

After substitutions, we get:

  I_∞(v; z) + (1/4π) ∫_{−π}^{π} [log S(ω)]⁻ dω ≥ Σ_{O.L. unstable poles} log |p_i|

where [x]⁻ = min{x, 0}.

SLIDE 36

Feedback Noise Cancellation: Contrasting Bode with the New Limitation

Bode:

  (1/4π) ∫_{−π}^{π} [log S(e^{jω})]⁺ dω + (1/4π) ∫_{−π}^{π} [log S(e^{jω})]⁻ dω ≥ Σ_{unstable poles of P} log |p_i|

New inequality:

  I_∞(v; z) + (1/4π) ∫_{−π}^{π} [log S(e^{jω})]⁻ dω ≥ Σ_{unstable poles of P} log |p_i|

[figure: log S(ω) versus ω]
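The contrast can be seen numerically by splitting log S into its positive and negative parts for a toy loop (a = 1.5 and k = 1.2 are my illustrative values; S = |sensitivity|²):

```python
import cmath
import math

a, k = 1.5, 1.2   # illustrative open loop k/(z - a) with a stable closed loop

def logS(w):
    """log S(w) with S = |sensitivity|^2 = |(z - a)/(z - a + k)|^2 at z = e^{jw}."""
    z = cmath.exp(1j * w)
    return 2 * math.log(abs((z - a) / (z - a + k)))

N = 20000
dw = 2 * math.pi / N
grid = [-math.pi + (i + 0.5) * dw for i in range(N)]
pos = sum(max(logS(w), 0.0) for w in grid) * dw / (4 * math.pi)   # (1/4pi) int [log S]+
neg = sum(min(logS(w), 0.0) for w in grid) * dw / (4 * math.pi)   # (1/4pi) int [log S]-
# Bode: pos + neg matches the unstable-pole term; the new inequality replaces pos
# by I_inf(v;z), which a finite-capacity channel keeps bounded
```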

SLIDE 37

Feedback Noise Cancellation (General Case)

[feedback diagram: P, E, D, G; signals e(k), d(k), w(k), u(k), z(k), v(k)]

What if we have a general memoryless channel and allow for nonlinear, time-varying encoding and decoding?

  I_∞(v; z) ≥ I_∞(w; u) + I_∞(X; e)

Stability:

  ⇒ I_∞(X; e) ≥ Σ_{unstable poles of P} log |p_i|

SLIDE 38

Feedback Noise Cancellation (General Case)

[same diagram]

Substitution gives:

  I_∞(v; z) ≥ I_∞(w; u) + Σ_{unstable poles of P} log |p_i|

And the following remains valid:

  I_∞(v; z) + (1/4π) ∫_{−π}^{π} [log S(ω)]⁻ dω ≥ Σ_{unstable poles of P} log |p_i|

SLIDE 39

Feedback Noise Cancellation: Numerical Example

Plant (one unstable open-loop pole at 1.5, with delay):

  P(z) = z^{−10} / (1 − 1.5 z^{−1})

[diagram: d(t) = ε(t) and the channel signal c(t) enter the loop around z^{−1} P(z); u(t) is the channel input, e(t) the error]

  I_∞(v; z) ≥ max{ Σ_i log λ_i(A), −(1/4π) ∫_{−π}^{π} [log S(ω)]⁻ dω }

(Function 1 and Function 2 denote the two arguments of the max.)

SLIDE 40

Feedback Noise Cancellation

[figure: Function 1, Function 2, and Function 1 − Function 2 plotted against the channel noise variance σ_c²]

  I_∞(w; z) ≥ Σ_{i: unstable} log λ_i(A) − (1/2π) ∫_{−π}^{π} log |S(e^{jω})| dω

Reminder:

  I_∞(v; z) ≥ max{ Σ_i log λ_i(A), −(1/4π) ∫_{−π}^{π} [log S(ω)]⁻ dω }   (Function 1, Function 2)

SLIDE 41

Conclusions

  • The role of information theory in capturing limitations: causality, stability, and finite-capacity channels.
  • Highlighted the interplay between noise cancellation and channel capacity in feedforward and feedback settings.
  • In feedforward systems, entropy cannot be reduced by more than the capacity of the channel.
  • In feedback systems, attenuation is limited by the capacity of the channel.
  • The information flux inequality is fundamental to the analysis.