

SLIDE 1

Detection of Gauss-Markov Random Field on Nearest-Neighbor Graph

  • A. Anandkumar¹
  • L. Tong¹
  • A. Swami²

¹School of Electrical and Computer Engineering, Cornell University, Ithaca, NY 14853

²Army Research Laboratory, Adelphi, MD 20783

2007 International Conference on Acoustics, Speech and Signal Processing

Supported by the Army Research Laboratory CTA.

A. Anandkumar, L. Tong, A. Swami (Cornell), Detection of GMRF on NNG, ICASSP 2007, 1 / 24

SLIDE 2

Introduction: Distributed Detection

Setup

  • Sensors transmit local decisions
  • Fusion center makes the global decision
  • Classical data model: conditionally IID

Sensor signal field

  • Correlated sensor readings
  • Large coverage area
  • Large number of sensors
  • Arbitrary sensor placement
  • Influence of correlation structure on detection performance

SLIDE 3

Detection of Correlation

Binary hypothesis testing

H1: Correlated data vs. H0: Independent observations

Questions

  • How to model correlation?
  • Is there an analytically tractable performance metric?
  • How does correlation affect performance?
  • How does node density affect performance?
  • New tradeoffs not encountered in the IID scenario

SLIDE 6

Summary of Results

Questions Answered

How to model correlation?

◮ Gauss-Markov random field

Is there an analytically tractable performance metric?

◮ Closed-form detection error exponent for Neyman-Pearson detection

How does correlation affect performance?

◮ Depends on the variance ratio
  ⋆ If the signal under H1 is weak (low variance), correlation helps
  ⋆ If the signal under H1 is strong (high variance), correlation hurts

How does node density affect performance?

◮ Higher node density means more correlation, as edge lengths are reduced

SLIDE 10

Previous Results on Detection Error Exponent

I.I.D. case

  • Closed-form for optimal detector and threshold
  • Error exponent: Stein's lemma

Correlated case

  • Stationary Gaussian process (Donsker & Varadhan, 85)
  • General formulas for the Neyman-Pearson exponent (Chen, 96)
  • Closed-form for Gauss-Markov random process (Sung et al., 06)

Limitations of the closed form

  • Requires causality, valid in the 1-D case
  • Cannot handle random placement of nodes

SLIDE 11

Outline

1. Introduction
2. Gauss-Markov Random Field
3. Statistical Inference
4. Results on Error Exponent

SLIDE 13

Model for Correlated Data: Graphical Model

Figure: linear chain on X(i − 1), X(i), X(i + 1).

X_{i−1} ⊥ X_{i+1} | X_i: a linear graph corresponding to an autoregressive process of order 1.

Temporal signals

  • Conditional independence based on ordering
  • Fixed number of neighbors
  • Causal (random processes)

Figure: graph of the 16 German states; states with common borders are neighbors.

Spatial signals

  • Conditional independence based on an (undirected) dependency graph
  • Variable set of neighbors
  • Possibly acausal

Remark

Dependency graph is NOT related to communication capabilities, but to the correlation structure of data!
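The AR(1) chain above can be checked numerically. A minimal sketch (assuming numpy; the correlation ρ = 0.6 is illustrative): the inverse covariance of an AR(1) process is tridiagonal, which is exactly the statement X_{i−1} ⊥ X_{i+1} | X_i.

```python
import numpy as np

# AR(1) covariance: Sigma[i, j] = rho^|i - j| (illustrative rho, not from the slides)
rho, n = 0.6, 6
idx = np.arange(n)
Sigma = rho ** np.abs(np.subtract.outer(idx, idx))

# The precision matrix is zero outside the tridiagonal band, so only
# chain neighbors are conditionally dependent
A = np.linalg.inv(Sigma)
off_band = A[np.abs(np.subtract.outer(idx, idx)) > 1]
assert np.allclose(off_band, 0, atol=1e-8)
```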

SLIDE 14

Markov Random Field

Definition : MRF with Dependency Graph Gd(V, E)

Y(V) = {Y_i : i ∈ V} is an MRF with dependency graph G_d(V, E) if Y is a Gaussian random field whose PDF satisfies the positivity condition and the Markov property.

Figure: disjoint sets A and B separated by set C in the dependency graph.

Markov Property

  • A, B, C are disjoint; A, B non-empty
  • C separates A and B
  • Y_A ⊥ Y_B | Y_C

SLIDE 17

Likelihood Function of MRF

Hammersley-Clifford Theorem (1971)

For an MRF Y with dependency graph G_d(V, E_d),

    log P(Y; G_d) = −log Z + Σ_{c ∈ C} Ψ_c(Y_c),    Z ≜ ∫ exp( Σ_{c ∈ C} Ψ_c(Y_c) ) dY,

where C is the set of all cliques in G_d and Ψ_c is the clique potential.

Figure: dependency graph.

SLIDE 18

Potential Matrix of GMRF

Potential Matrix

  • The inverse of the covariance matrix of a GMRF
  • Non-zero elements of the potential matrix correspond to graph edges

Figure: dependency graph on nodes 1–8 and the sparsity pattern of its potential matrix (× marks a non-zero element).

Form of the log-likelihood of a zero-mean GMRF with potential matrix A:

    −log P(Y^n; G_d, A) = (1/2) [ n log 2π − log |A| + Σ_{i ∈ V} A(i,i) Y_i² + 2 Σ_{(i,j) ∈ E_d, i<j} A(i,j) Y_i Y_j ]

Acyclic dependency graph: given the covariance matrix, the likelihood has a closed-form expression.
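As a sanity check on the quadratic-form log-likelihood, a sketch (assuming numpy; the 4-node chain graph and edge potential −0.3 are illustrative) compares it against the dense zero-mean Gaussian log-density:

```python
import numpy as np

# Potential matrix A for a 4-node chain dependency graph; off-diagonal
# entries sit exactly on the graph edges
edges = [(0, 1), (1, 2), (2, 3)]
n = 4
A = np.eye(n)
for i, j in edges:
    A[i, j] = A[j, i] = -0.3          # small enough to keep A positive definite

Sigma = np.linalg.inv(A)              # GMRF covariance
rng = np.random.default_rng(0)
Y = rng.multivariate_normal(np.zeros(n), Sigma)

# log P(Y) = (1/2)[ -n log 2π + log|A| - Σ_i A(i,i) Y_i² - 2 Σ_{(i,j), i<j} A(i,j) Y_i Y_j ]
quad = sum(A[i, i] * Y[i] ** 2 for i in range(n)) \
     + 2 * sum(A[i, j] * Y[i] * Y[j] for i, j in edges)
logp = 0.5 * (-n * np.log(2 * np.pi) + np.linalg.slogdet(A)[1] - quad)

# Reference: dense zero-mean Gaussian log-density
ref = -0.5 * (n * np.log(2 * np.pi) + np.linalg.slogdet(Sigma)[1] + Y @ A @ Y)
assert np.isclose(logp, ref)
```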

SLIDE 20

Hypothesis Testing for Independence

H1: GMRF with dependency graph G_d vs. H0: independent observations

Model for Dependency Graph G_d under H1

  • The dependency graph is a proximity graph (edges between nearby points)
  • Simplest proximity graph: the nearest-neighbor graph (NNG)

Definition of Nearest-Neighbor Graph

In the NNG, (i, j) is an edge if i is the nearest neighbor of j or vice versa.

Additional assumptions

  • Random placement of nodes (uniform or Poisson distribution)
  • Correlation function g: a function of spatial distance
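A minimal sketch of the NNG construction (assuming numpy, uniform placement in the unit square, and brute-force distances for clarity):

```python
import numpy as np

def nearest_neighbor_graph(points):
    """Undirected NNG: (i, j) is an edge if i is j's nearest neighbor or vice versa."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)        # a node is not its own neighbor
    nn = d.argmin(axis=1)              # index of each node's nearest neighbor
    return {tuple(sorted((i, int(nn[i])))) for i in range(n)}

rng = np.random.default_rng(1)
pts = rng.uniform(0, 1, size=(10, 2))  # uniform random placement
edges = nearest_neighbor_graph(pts)
assert len(edges) <= 10                # each node contributes at most one new edge
assert all(any(i in e for e in edges) for i in range(10))
```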

SLIDE 21

Optimal Detection

Log Likelihood Ratio (LLR) Detector

    log ( P[Y^n, V; H1] / P[Y^n, V; H0] )  ≷_{H0}^{H1}  τ_n

Neyman-Pearson Detection

Minimize the miss probability P_M = P[Decision = H0 | H1] subject to the false-alarm constraint P_F = P[Decision = H1 | H0] ≤ α.
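As an illustration of the Neyman-Pearson setup (a hypothetical scalar example with i.i.d. Gaussians, not the GMRF model itself; assuming numpy), the threshold τ_n can be set from the LLR's null distribution so that P_F ≤ α:

```python
import numpy as np

def llr(y, s0=1.0, s1=2.0):
    # log-likelihood ratio log(P1/P0) for H1: N(0, s1²) vs H0: N(0, s0²)
    return len(y) * np.log(s0 / s1) + 0.5 * (1 / s0**2 - 1 / s1**2) * np.sum(y**2)

rng = np.random.default_rng(0)
n, alpha, trials = 50, 0.05, 4000
null = np.array([llr(rng.normal(0, 1.0, n)) for _ in range(trials)])
tau = np.quantile(null, 1 - alpha)     # decide H1 when LLR > tau
alt = np.array([llr(rng.normal(0, 2.0, n)) for _ in range(trials)])
p_f = np.mean(null > tau)              # false-alarm rate
p_m = np.mean(alt <= tau)              # miss rate
assert p_f <= alpha + 0.01 and p_m < 0.5
```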

SLIDE 23

Error Exponent D

A closed form for the error probability is not tractable; the error exponent D is used instead:

    P_M ≈ e^{−nD},    log P_M ≈ −nD  (slope −D on a log scale vs. the number of samples n)

Figure: P_M vs. number of samples on linear and log scales.

Sensors are placed in a region with constant node density λ.

SLIDE 26

Our Methodology

Approach

1. Express the LLR as a sum of node and edge functionals of the dependency graph
2. Obtain the error exponent as the limit of the normalized LLR
3. Evaluate the limit using the Law of Large Numbers for graph functionals
4. Use the error exponent for performance analysis

SLIDE 27

Detailed Methodology

LLR as a sum of node and edge functionals of the dependency graph:

    LLR(Y^n, G_d) = n log(σ1/σ0) + (1/2) Σ_{i ∈ V} (1/σ1² − 1/σ0²) Y_i²
        + (1/2) Σ_{(i,j) ∈ E_d, i<j} [ log(1 − g²(R_ij)) + (g²(R_ij)/(1 − g²(R_ij))) (Y_i² + Y_j²)/σ1² − (2g(R_ij)/(1 − g²(R_ij))) Y_i Y_j/σ1² ]

Error exponent through the limit of the LLR:

    D = lim_{n→∞} (1/n) LLR(Y^n; G_d), under H0

  • The LLR is a sum of graph functionals of a marked point process
  • The Y_i are independent under H0
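A direct transcription of this decomposition (a sketch assuming numpy; the exponential correlation g(r) = M e^{−ar} from the numerical-analysis slide is used as a concrete choice, and σ0, σ1, M, a are illustrative):

```python
import numpy as np

def llr_gmrf(Y, pos, edges, s0=1.0, s1=1.5, M=0.5, a=0.5):
    """LLR(Y^n, G_d) = node functional over V plus edge functional over E_d."""
    g = lambda r: M * np.exp(-a * r)
    out = len(Y) * np.log(s1 / s0) \
        + 0.5 * np.sum((1 / s1**2 - 1 / s0**2) * Y**2)      # node terms
    for i, j in edges:                                      # edge terms, i < j
        gr = g(np.linalg.norm(pos[i] - pos[j]))
        out += 0.5 * (np.log(1 - gr**2)
                      + gr**2 / (1 - gr**2) * (Y[i]**2 + Y[j]**2) / s1**2
                      - 2 * gr / (1 - gr**2) * Y[i] * Y[j] / s1**2)
    return out
```

For a single edge this reduces to the exact log-ratio of the bivariate H0 and H1 densities, which gives an easy consistency check.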

SLIDE 31

LLN for graph functionals (Penrose & Yukich, 02)

Pictorial representation of the result: as n → ∞, the normalized sum of edge weights over the graph converges to an expectation of edge functionals at the origin of the Poisson process:

    (1/n) Σ_{e ∈ E} φ(R_e)  →  E [ Σ_{X ∈ P_λ} φ(R_{0,X}) ]

Remarks

The LLN states that the limit is a localized effect around the origin.
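The LLN can be seen empirically; a sketch (assuming numpy, with the per-node NNG edge count as an illustrative graph functional) checks that the normalized functional stabilizes as the region grows with node density λ held constant:

```python
import numpy as np

def nng_edges_per_node(n, lam=1.0, seed=0):
    rng = np.random.default_rng(seed)
    side = np.sqrt(n / lam)                # area n/λ keeps the density constant
    pts = rng.uniform(0, side, size=(n, 2))
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nn = d.argmin(axis=1)
    edges = {tuple(sorted((i, int(nn[i])))) for i in range(n)}
    return len(edges) / n                  # normalized graph functional

small, big = nng_edges_per_node(300), nng_edges_per_node(1200)
assert abs(small - big) < 0.1              # normalized functional converges
```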

SLIDE 32

Result on Error Exponent D

Applying the LLN (Penrose & Yukich, 02):

    D = (1/2) [ E Σ_{X ∈ P_λ} f(g(R_{0,X})) + log K + 1/K − 1 ],

    f(x) ≜ log(1 − x²) + 2x² / (K(1 − x²)),    K ≜ σ1²/σ0²,

where R_{0,X} are the edge lengths at the origin of the NNG of a homogeneous Poisson process of intensity λ.

Closed-form Expression for D

    D = (1/2) [ E f(g(Z1)) − (π/2ω) E f(g(Z2)) + log K + 1/K − 1 ],

where Z1, Z2 are Rayleigh distributed with variances (2πλ)⁻¹ and (2ωλ)⁻¹, and ω ≈ 5.05 is the area of the union of two unit-radius circles with centers a unit distance apart.
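The closed form can be evaluated by Monte Carlo. A sketch (assuming numpy; the exponential correlation g(r) = M e^{−ar} with M = 0.5, a = 0.5, λ = 1, K = 2 are illustrative choices, and the Rayleigh "variance" is read as the squared scale parameter):

```python
import numpy as np

K, lam, M, a = 2.0, 1.0, 0.5, 0.5
# ω: area of the union of two unit-radius circles with centers a unit apart
omega = 2 * np.pi - (2 * np.pi / 3 - np.sqrt(3) / 2)   # ≈ 5.05

def f(x):
    return np.log(1 - x**2) + 2 * x**2 / (K * (1 - x**2))

g = lambda r: M * np.exp(-a * r)
rng = np.random.default_rng(0)
z1 = rng.rayleigh(np.sqrt(1 / (2 * np.pi * lam)), 200_000)
z2 = rng.rayleigh(np.sqrt(1 / (2 * omega * lam)), 200_000)
D = 0.5 * (f(g(z1)).mean() - np.pi / (2 * omega) * f(g(z2)).mean()
           + np.log(K) + 1 / K - 1)
assert D > 0                               # positive error exponent
```

As a → ∞ the correlation vanishes, f(g(·)) → f(0) = 0, and D falls back to the i.i.d. exponent (1/2)(log K + 1/K − 1), the KL divergence between the two Gaussian hypotheses.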

SLIDE 33

Numerical Analysis

Questions

How does correlation affect performance?

◮ Depends on the variance ratio
  ⋆ If the signal under H1 is weak (low variance), correlation helps
  ⋆ If the signal under H1 is strong (high variance), correlation hurts

How does node density affect performance?

◮ Higher node density means more correlation, as edge lengths are reduced

Figure: error exponent D vs. K (in dB) for correlation coefficients a = 0, a = 0.5, and a → ∞ (M = 0.5, λ = 1).

Exponential Correlation Function

    g(r) = M e^{−ar},    a > 0, 0 < M < 1

SLIDE 34

Minimum Energy Routing for Optimal Inference1

Figure: transmission scheme delivering the LLR to the fusion center (legend: edge in NNG, edge in DTG, edge in AG, aggregator, fusion center).

Minimum Energy Routing for Inference

Minimize the total routing energy subject to the LLR being delivered to the fusion center.

Summary of Results

Concept of dependency graph based routing

◮ Exploit correlation to fuse data

Proposed a 2-approximation algorithm

¹A. Anandkumar, L. Tong, A. Swami, “Energy Efficient Routing for Statistical Inference of Markov Random Fields,” Conference on Information Sciences and Systems, March 2007.

SLIDE 35

Conclusion

Summary

  • Derived a closed-form expression for the error exponent of detecting a GMRF with nearest-neighbor dependency
  • Studied the effect of correlation and node density on performance

Outlook

  • Relax assumptions
  • Extend to other dependency models
  • Study the performance vs. routing-energy tradeoff

SLIDE 36

Thank You!
