SLIDE 1

Detection of Gauss Markov Random Fields under Routing Energy Constraint

  • A. Anandkumar1
  • L. Tong1
  • A. Swami2

1School of Electrical and Computer Engineering

Cornell University, Ithaca, NY 14853

2Army Research Laboratory, Adelphi MD 20783

Forty-fifth Annual Allerton Conference on Communication, Control, and Computing

Supported by the Army Research Laboratory CTA.

A. Anandkumar, L. Tong, A. Swami (Cornell). Detection-Energy Tradeoff, Allerton 2007.

SLIDE 2

Detection-Energy Tradeoff

Distributed Detection
  • Quantization rule at sensors; inference rule at fusion center

Classical Routing
  • Generic performance metric; layered architecture

Shortcomings of Classical Detection
  • For sensors in a large field, multi-hop routing is needed
  • For energy-constrained networks, this causes a loss in detection performance

Shortcomings of Classical Routing
  • Only the likelihood ratio is needed at the fusion center for inference, not the raw data

⇒ Tradeoff between Routing and Detection in Wireless Sensor Networks

SLIDE 3

Tradeoff: Optimal Detection under Energy Constraint

Optimal Detection of Binary Hypothesis
  • Neyman-Pearson: minimize miss-detection probability subject to a false-alarm constraint

Large Networks: n → ∞
  • Maximize −(1/n) log PM subject to false-alarm and average routing-energy (Ē) constraints

Optimal Node Density λ∗

  λ∗ = arg max_{λ>0} Dλ  subject to  C̄ ≤ Ē

  Dλ: Neyman-Pearson error exponent
  C̄: average routing energy per node
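To make the constrained optimization concrete, here is a minimal numerical sketch, assuming a made-up unimodal exponent D(λ) and a power-law energy model C̄(λ) (neither taken from the talk); a grid search over feasible densities recovers λ∗:

```python
import numpy as np

# Hypothetical models, for illustration only (not the talk's expressions):
def D(lam):
    return lam * np.exp(-0.2 * lam)      # assumed unimodal exponent, peak at lam = 5

def C(lam):
    return 2.0 * lam ** -0.5 + 0.1       # assumed power-law energy, decreasing in lam

E_bar = 0.9                              # average energy budget
lams = np.linspace(0.01, 50, 5000)
feasible = C(lams) <= E_bar              # energy constraint C(lam) <= E_bar
lam_star = lams[feasible][np.argmax(D(lams[feasible]))]
print(lam_star)  # close to the boundary density 6.25, where the constraint binds
```

Since the assumed D(λ) peaks at an infeasible density, the optimum sits on the energy boundary, mirroring the role the constraint plays in the talk.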

SLIDE 4

Node Deployment Strategies for Optimal Tradeoff

Setup
  • Random: nodes drawn from a uniform or Poisson distribution
  • Constant density λ: n nodes in area n/λ

Factors
  • Signal and energy model
  • Nature of the tradeoff

Implications
  • λ∗ → 0 or ∞: large/small deployment area
  • λ∗ ∈ (0, ∞): careful deployment


SLIDE 9

Example: Same Variances, No Energy Constraint

Detection of Correlation
  • H1: correlated data vs. H0: independent observations

Assumptions
  • Uniform signal field: same variance at every node, under H0 and H1
  • Correlation decays with distance under H1
  • Only way to distinguish H0 and H1: correlation

Intuition: to maximize correlation, minimize inter-node distance.

In this case, λ∗ → ∞. What happens when the variances are different?


SLIDE 12

Results on Optimal Node Density λ∗

Variance Ratio K
  • K is the ratio of variances under the alternative and null hypotheses

[Figures: average energy C̄ vs. density λ; error exponent D vs. λ for K < Kt and for K > K′t; cases with no energy constraint and with feasible energy]


SLIDE 18

Methodology

Detection of Correlation
  • H1: correlated data vs. H0: independent observations

Modeling correlation
  • Gauss-Markov random field
  • Correlation decays with distance; partial correlation at 0
  • Nearest-neighbor dependency

Tractable performance metric
  • Closed-form Neyman-Pearson error exponent (ICASSP 07)

Minimum-energy routing
  • NP-hard (CISS 07); 2-approximation algorithm DFMRF
  • Closed-form average energy; constraint becomes a bound on λ

Optimal node density
  • Analyze the error exponent


SLIDE 23

Previous Results on Detection & Routing

Routing with Aggregation
  • Conditional independence: Intanagonwiwat et al. 00, Krishnamachari et al. 02
  • In-network processing surveys (Giridhar & Kumar 06, Rajagopalan & Varshney 06)

Detection of Correlation
  • Stationary Gaussian process (Donsker & Varadhan 85)
  • General-form exponent (Chen 96)
  • Exponent for a Gauss-Markov process (Sung et al. 06)

Detection-Routing
  • Independent measurements: Yang & Blum 07, Appadwedula et al. 05, Yu & Ephremides 06
  • 1-D Gauss-Markov process: Chernoff routing (Sung et al. 06), a link metric for detection; optimal node density (Chamberland & Veeravalli 06)
  • Markov random field: with 1-bit communication (Kreidl et al. 06)


SLIDE 26

Outline
  1. Introduction
  2. Gauss-Markov Random Field
  3. Minimum Energy Routing
  4. Effect of Node Density on Exponent


SLIDE 28

Model for Correlated Data: Graphical Model

Temporal signals
  • Xi−1 ⊥ Xi+1 | Xi: a linear graph corresponds to an autoregressive process of order 1
  • Conditional independence based on ordering
  • Fixed number of neighbors
  • Causal (random processes)

Spatial signals
  • Conditional independence based on an (undirected) dependency graph
  • Variable set of neighbors
  • May be acausal

[Figure: graph of German states; states with common borders are neighbors]

Remark
  The dependency graph is NOT related to communication capabilities, but to the correlation structure of the data!
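The conditional-independence structure can be seen numerically: for a stationary AR(1) process (a standard example, not specific to the talk), the precision (potential) matrix of its covariance is tridiagonal, i.e. only chain neighbors interact:

```python
import numpy as np

# AR(1) process X_t = a X_{t-1} + W_t: covariance Sigma_ij = a^|i-j| / (1 - a^2).
# The Markov property X_{i-1} _|_ X_{i+1} | X_i appears as a TRIDIAGONAL
# precision (potential) matrix: zeros at all non-adjacent pairs.
a, n = 0.6, 6
Sigma = np.array([[a ** abs(i - j) for j in range(n)] for i in range(n)]) / (1 - a ** 2)
A = np.linalg.inv(Sigma)                       # potential matrix
off = np.abs(np.subtract.outer(np.arange(n), np.arange(n))) > 1
print(np.allclose(A[off], 0.0))                # True: only chain neighbors interact
```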

SLIDE 29

Markov Random Field

Definition: MRF with Dependency Graph Gd(V, E)
  Y(V) = {Yi : i ∈ V} is an MRF with dependency graph Gd(V, E) if its PDF satisfies the positivity condition and the Markov property.

Markov Property
  For disjoint sets A, B, C with A and B non-empty: if C separates A and B in Gd, then YA ⊥ YB | YC.


SLIDE 32

Likelihood Function of MRF

Hammersley-Clifford Theorem (1971)

  For an MRF Y with dependency graph Gd(V, Ed),

    log P(Y; Gd) = −log Z + Σ_{c∈C} Ψc(Yc),   Z ≜ ∫ exp( Σ_{c∈C} Ψc(Yc) ) dY,

  where C is the set of all cliques in Gd, Ψc is the clique potential, and Z is the normalizing constant (partition function).

SLIDE 33

Potential Matrix of GMRF

Potential Matrix
  • Inverse of the covariance matrix of a GMRF
  • Non-zero elements of the potential matrix correspond to graph edges

[Figure: 8-node dependency graph and the sparsity pattern of its potential matrix; × marks a non-zero element]

Form of the log-likelihood of a zero-mean GMRF with potential matrix A:

  −log P(Yn; Gd, A) = ½ [ n log(2π) − log |A| + Ynᵀ A Yn ],
  Ynᵀ A Yn = Σ_{i∈V} A(i, i) Yi² + 2 Σ_{(i,j)∈Ed} A(i, j) Yi Yj

Acyclic Dependency Graph
  • Given the covariance matrix, a closed-form expression for the likelihood
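As a sanity check on the log-likelihood form, the sketch below builds a small chain GMRF with a made-up tridiagonal potential matrix A and verifies that the node/edge decomposition of YᵀAY reproduces the direct Gaussian log-density:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# Tridiagonal potential (precision) matrix of a 4-node chain GMRF (made-up values).
A = (np.diag([2.0, 2.5, 2.5, 2.0])
     + np.diag([-0.8, -0.7, -0.9], 1)
     + np.diag([-0.8, -0.7, -0.9], -1))
assert np.all(np.linalg.eigvalsh(A) > 0)       # valid (positive definite) precision
Y = rng.standard_normal(n)

# -log P via the node/edge decomposition of Y^T A Y
edges = [(0, 1), (1, 2), (2, 3)]
quad = sum(A[i, i] * Y[i] ** 2 for i in range(n)) \
     + 2 * sum(A[i, j] * Y[i] * Y[j] for i, j in edges)
neg_ll = 0.5 * (n * np.log(2 * np.pi) - np.log(np.linalg.det(A)) + quad)

# Direct zero-mean Gaussian negative log-density with covariance A^{-1}
Sigma = np.linalg.inv(A)
direct = 0.5 * (n * np.log(2 * np.pi) + np.log(np.linalg.det(Sigma))
                + Y @ np.linalg.solve(Sigma, Y))
print(np.isclose(neg_ll, direct))  # True
```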

SLIDE 34

Hypothesis Testing for Independence

  H1: GMRF with dependency graph Gd  vs.  H0: IID Gaussian

LLR = Node + Edge Potentials

  LLR(Yn; Gd) = Σ_{i∈V} Φi + Σ_{(i,j)∈Ed} Φi,j

Dependency Graph
  • Proximity graph: nearest-neighbor graph (NNG)
  • Edge (i, j): i is the nearest neighbor of j, or vice versa

Correlation function g
  • Function of the NNG edge length
  • g(0) = M < 1, g(∞) = 0
  • Decreasing and convex in the edge length
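A nearest-neighbor dependency graph of this kind is easy to construct; the sketch below (illustrative, with arbitrary random points) forms the undirected NNG edge set, taking (i, j) as an edge when i is the nearest neighbor of j or vice versa:

```python
import numpy as np

rng = np.random.default_rng(1)
pts = rng.uniform(0, 10, size=(20, 2))         # 20 random sensor locations

d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)                    # exclude self-distances
nn = d.argmin(axis=1)                          # nearest neighbor of each node

# Undirected NNG edge set: (i, j) with i the nearest neighbor of j, or vice versa
edges = {tuple(sorted((i, int(nn[i])))) for i in range(len(pts))}
print(len(edges) <= len(pts))                  # each node adds at most one new edge
```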



SLIDE 39

Minimum Energy Routing for Optimal Inference

Minimum Energy Routing for Inference
  Minimize the total routing energy such that the LLR is delivered to the fusion center.

[Figure: transmission scheme delivering the LLR; legend: edge in NNG, edge in DTG, edge in AG, aggregator, fusion center]

LLR = Node + Edge Potentials

  LLR(Yn; Gd) = Σ_{i∈V} Φi + Σ_{(i,j)∈Ed} Φi,j

DFMRF: data fusion in MRF
  • 2-approximation: C(DFMRF) / C(G∗) ≤ 2

Network and Energy Model
  • Connected UDG, power control
  • Transmission: power-law attenuation


SLIDE 42

Algorithm: LLR Computation using DFMRF

Raw-data transmission phase
  • Transmit raw data over the NNG; compute edge potentials locally

Aggregation phase
  • Init: leaves of the aggregation graph (AG) transmit their local contribution
  • Recursion: once node i has received from all its predecessors, it transmits the sum
  • Stop: fusion center computes its aggregate
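The aggregation phase is a convergecast along a tree: each node forwards its running sum once all predecessors have reported. A minimal sketch on a hypothetical aggregation tree rooted at the fusion center:

```python
# Hypothetical aggregation tree rooted at the fusion center (node 0):
# parent[i] is the next hop of node i toward the fusion center.
parent = {1: 0, 2: 0, 3: 1, 4: 1, 5: 2}
local = {0: 0.5, 1: -1.0, 2: 2.0, 3: 0.25, 4: 1.5, 5: -0.75}  # local LLR terms

children = {}
for c, p in parent.items():
    children.setdefault(p, []).append(c)

def aggregate(i):
    # Recursion: node i sends its local value plus everything received below it.
    return local[i] + sum(aggregate(c) for c in children.get(i, []))

llr = aggregate(0)  # fusion center's aggregate = sum of all contributions
print(llr == sum(local.values()))  # True
```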



SLIDE 48

LLN for Graph Functionals (Penrose & Yukich, 02)

Pictorial representation of the result: as n → ∞, the normalized sum of edge weights converges to an expectation of edge weights at the origin of a Poisson process Pλ:

  (1/n) Σ_{e∈E} Φ(Re) → E[ Σ_{X∈Pλ} Φ(R0,X) ]

Remarks
  • The LLN states that the limit is a localized effect around the origin.
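The flavor of the LLN can be checked by simulation (a rough sketch, not the full Penrose-Yukich setting): the per-node sum of NNG edge lengths in a uniform sample of growing size settles toward a constant depending only on the density:

```python
import numpy as np

def per_node_edge_sum(n, lam=1.0, seed=0):
    # n points at density lam (square of area n/lam); edge weight Phi(R_e) = R_e.
    rng = np.random.default_rng(seed)
    side = np.sqrt(n / lam)
    pts = rng.uniform(0, side, size=(n, 2))
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nn = d.argmin(axis=1)                      # nearest neighbor of each point
    edges = {tuple(sorted((i, int(nn[i])))) for i in range(n)}
    return sum(d[i, j] for i, j in edges) / n

# Normalized edge-weight sums at two sample sizes stay close to each other.
small, large = per_node_edge_sum(200), per_node_edge_sum(2000)
print(small, large)
```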

SLIDE 49

Result on Error Exponent D

Use the LLN to find the error exponent:

  D = lim_{n→∞} (1/n) LLR(Yn; Gd) = lim_{n→∞} (1/n) [ Σ_{i∈V} Φi + Σ_{(i,j)∈Ed} Φi,j ],   Yn ∼ H0

Closed-form D: correlation + IID terms

  D(λ, K; g) = ½ E_λ[ h(λ^{−0.5} Z, K; g) ] + D_IID(K)

Variance Ratio K of Signal Model
  • K is the ratio of mean signal powers under the alternative and null hypotheses

Average energy for DFMRF (transmission + processing energies):

  C̄ = λ^{−ν/2} Ct ce(ν) + Cp

The energy constraint leads to a lower bound on λ:

  C̄ ≤ Ē  ⇒  λ ≥ λE = [ (Ē − Cp)⁺ / (Ct ce(ν)) ]^{−2/ν}
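Inverting the energy model for the bound on λ is a one-liner; the helper below uses the symbols above with made-up constants:

```python
def lambda_E(E_bar, C_p, C_t, c_e, nu):
    # C(lam) = lam**(-nu/2) * C_t * c_e + C_p <= E_bar  =>  lam >= lambda_E
    surplus = max(E_bar - C_p, 0.0)            # (E_bar - C_p)^+
    if surplus == 0.0:
        return float("inf")                    # no density satisfies the budget
    return (surplus / (C_t * c_e)) ** (-2.0 / nu)

lam_E = lambda_E(E_bar=1.0, C_p=0.2, C_t=2.0, c_e=1.0, nu=2.0)
# Check: plugging lam_E back into the energy model meets the budget exactly.
print(abs(lam_E ** -1.0 * 2.0 + 0.2 - 1.0) < 1e-12)  # True
```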


SLIDE 53

Results on Optimal Node Density

Modified Optimization

  λ∗ = arg max_{λ>0} Dλ subject to C̄ ≤ Ē   becomes   λ∗ = arg max_{λ ≥ λE} Dλ

Thresholds in terms of M, the correlation at zero:

  Kt(M) = [−1 / log(1 − M²)] · [2M² / (1 − M²)],   K′t(M) = 2 / (1 − M²)

[Figures: optimal density λ∗ vs. variance ratio K: with no energy constraint, λ∗ shows a threshold at Kt with λ∗ = +∞ on one side; with feasible energy (λ ≥ λE), thresholds at Kt and K′t, with λE appearing and the region between Kt and K′t marked "?"]
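Under one plausible reading of the threshold expression, Kt(M) = [−1/log(1−M²)] · [2M²/(1−M²)] (the grouping of this fraction is my reconstruction; K′t(M) = 2/(1−M²) is as shown on the slide). A quick check of two properties consistent with the slides: both thresholds approach 2 as M → 0, and Kt stays below K′t:

```python
import numpy as np

def K_t(M):
    # Reconstructed threshold: K_t(M) = -1/log(1 - M^2) * 2 M^2 / (1 - M^2)
    return -1.0 / np.log(1.0 - M ** 2) * 2.0 * M ** 2 / (1.0 - M ** 2)

def K_t_prime(M):
    return 2.0 / (1.0 - M ** 2)

print(round(float(K_t(1e-4)), 3), round(float(K_t_prime(1e-4)), 3))  # both -> 2 as M -> 0
print(float(K_t(0.5)) < float(K_t_prime(0.5)))                       # True
```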

SLIDE 54

Idea of Proof: Behavior at λ = ∞

Tight Energy Constraint: Ē → 0
  • The energy constraint is satisfied when λ → ∞
  • Maximum correlation at λ = ∞

At λ = ∞, the contribution from correlation has a threshold:

  Contribution from correlation at λ = ∞:  < 0 for K > Kt(M),  ≥ 0 for K < Kt(M)

[Plots: E[h(Z λ^{−0.5}, K)] vs. λ^{−0.5}: negative at λ = ∞ for K = 2Kt, so correlation contributes negatively; positive at λ = ∞ for K = 2 < Kt, so correlation contributes positively]

SLIDE 55

Conclusion

Summary
  • Characterized the node density λ∗ that maximizes the detection error exponent subject to an average energy constraint
  • The measurement variance ratio is crucial: it determines whether the energy constraint limits detection performance
  • The optimal density displays a threshold behavior; the threshold value was derived analytically and verified with simulations

Outlook
  • Selection of nodes with "useful" data; node and link failures
  • Extension to other dependency models
  • Quantization of measurements
  • Mobility of nodes / coverage area of nodes


SLIDE 57

Thank You!

SLIDE 58

LLR

  LLR(Yn, V) = log [ p(Yn, V; H0) / p(Yn, V; H1) ] = log [ p(Yn; H0) / p(Yn | V; H1) ]
             = ½ [ log ( |Σ1,V| / |σ0² I| ) + Ynᵀ ( Σ1,V⁻¹ − (σ0² I)⁻¹ ) Yn ],

  LLR(Yn; Gd) = Σ_{i∈V} φi(Yi) + Σ_{(i,j)∈Ed} φi,j(Yi, Yj),

  φi,j(Yi, Yj) = ½ log[1 − g²(Rij)] − [ g(Rij) / (1 − g²(Rij)) ] · YiYj / σ1² + [ g²(Rij) / (1 − g²(Rij)) ] · (Yi² + Yj²) / (2σ1²),

  φi(Yi) ≜ log(σ1/σ0) + ½ (1/σ1² − 1/σ0²) Yi²  →  D_IID(K)
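The decomposition can be verified numerically for a small chain GMRF (a hypothetical 3-node example with exponential correlation g(r) = M e^{−r}, one admissible choice: g(0) = M, decreasing and convex): the node-plus-edge potential sum matches the direct log-density ratio.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma0, sigma1, M = 1.0, 1.2, 0.8
R = np.array([1.0, 2.0])             # edge lengths of a 3-node chain (hypothetical)
g01, g12 = M * np.exp(-R)            # assumed correlation function g(r) = M e^{-r}

# Chain GMRF under H1: corr(Y_i, Y_j) = product of edge correlations on the path.
corr = np.array([[1.0, g01, g01 * g12],
                 [g01, 1.0, g12],
                 [g01 * g12, g12, 1.0]])
Sigma1 = sigma1 ** 2 * corr
Y = rng.standard_normal(3)

def logpdf(y, Sigma):
    n = len(y)
    return -0.5 * (n * np.log(2 * np.pi) + np.log(np.linalg.det(Sigma))
                   + y @ np.linalg.solve(Sigma, y))

direct = logpdf(Y, sigma0 ** 2 * np.eye(3)) - logpdf(Y, Sigma1)

phi_i = np.log(sigma1 / sigma0) + 0.5 * (sigma1 ** -2 - sigma0 ** -2) * Y ** 2
def phi_ij(yi, yj, gij):
    return (0.5 * np.log(1 - gij ** 2)
            - gij / (1 - gij ** 2) * yi * yj / sigma1 ** 2
            + gij ** 2 / (1 - gij ** 2) * (yi ** 2 + yj ** 2) / (2 * sigma1 ** 2))

llr = phi_i.sum() + phi_ij(Y[0], Y[1], g01) + phi_ij(Y[1], Y[2], g12)
print(np.isclose(llr, direct))  # True
```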