Inference in hybrid Bayesian networks with mixtures of truncated basis functions (PowerPoint presentation)


SLIDE 1

Inference in hybrid Bayesian networks with mixtures of truncated basis functions

Helge Langseth(1), Thomas D. Nielsen(2), Rafael Rumí(3), Antonio Salmerón(3)

(1) Department of Computer and Information Science, NTNU (Norway)
(2) Department of Computer Science, Aalborg University (Denmark)
(3) Department of Statistics & Applied Mathematics, University of Almería (Spain)

PGM 2012. Granada, 21 September 2012

Outline: MoTBFs · Operations over MoTBFs · Inference · Experiments · Conclusions

SLIDE 2

Introduction

MoTBFs provide a flexible framework for hybrid Bayesian networks:
  • Accurate approximation of known models.
  • Learning from data.
  • Inference?

SLIDE 3

Mixtures of truncated basis functions (MoTBFs)

For inference in Bayesian networks, we initially have only two types of MoTBF potentials: univariate and conditional. Any other potential showing up during inference is the result of operating on these, namely applying marginalisation and combination.

SLIDE 4

Mixtures of truncated basis functions (MoTBFs)

Definition (Univariate MoTBF density). An unconditional MoTBF density over X is

f(x) = \sum_{i=0}^{m-1} a_i \psi_i(x), \quad x \in \Omega_X,

where \Psi = \{\psi_0(x), \ldots, \psi_{m-1}(x)\} is the set of basis functions for X.

Particular cases:
  • MTEs: \Psi = \{1, \exp(-x), \exp(x), \exp(-2x), \exp(2x), \ldots\}.
  • MOPs: \Psi = \{x^i, i = 0, 1, \ldots\}.
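As a quick illustration (not from the talk), a univariate MoTBF with polynomial basis functions (an MOP) can be evaluated directly from its coefficient vector. The coefficients below are made-up values, chosen so that the density integrates to one on [0, 1]:

```python
# Sketch of a univariate MoTBF density f(x) = sum_i a_i * psi_i(x).
# Basis: MOP, psi_i(x) = x**i. Coefficients are illustrative, not from the talk.

def motbf_density(coeffs, x):
    """Evaluate f(x) = sum_i coeffs[i] * x**i (MOP basis)."""
    return sum(a * x**i for i, a in enumerate(coeffs))

# f(x) = 0.5 + x on [0, 1]: its integral is 0.5 + 0.5 = 1, so it is a valid density.
coeffs = [0.5, 1.0]

# Crude midpoint-rule check that f integrates to ~1 over its support [0, 1].
n = 100000
mass = sum(motbf_density(coeffs, (k + 0.5) / n) for k in range(n)) / n
print(round(mass, 4))  # 1.0
```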


SLIDE 6

Mixtures of truncated basis functions (MoTBFs)

Definition (Conditional MoTBF density). Let Y and Z be the discrete and continuous parent variables, and let the domain of Z be divided into k hypercubes \Omega_1, \ldots, \Omega_k. For each value y and each hypercube j = 1, \ldots, k, the conditional MoTBF density of X given Z and Y is

f(x \mid z, y) = \sum_{i=0}^{m-1} a_{i,j} \psi_i(x), \quad z \in \Omega_j.
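One way to store such a conditional density (a sketch only; split points and coefficients are invented) is a table of coefficient vectors indexed by the discrete parent value and the hypercube containing z:

```python
# Conditional MoTBF f(x | z, y): one coefficient vector per (y, hypercube) pair.
# Split points and coefficients are illustrative, not from the talk.
from bisect import bisect_right

splits = [0.0, 0.5, 1.0]           # z-axis split points -> hypercubes [0, 0.5), [0.5, 1]
coeff_table = {                    # (y, hypercube_index) -> MOP coefficients for x
    (0, 0): [1.0],                 # f(x|z,y) = 1 (uniform) here
    (0, 1): [0.5, 1.0],            # f(x|z,y) = 0.5 + x here
    (1, 0): [2.0, -2.0],           # f(x|z,y) = 2 - 2x here
    (1, 1): [1.0],
}

def conditional_density(x, z, y):
    """Look up the hypercube containing z, then evaluate the stored MOP in x."""
    j = min(bisect_right(splits, z) - 1, len(splits) - 2)
    coeffs = coeff_table[(y, j)]
    return sum(a * x**i for i, a in enumerate(coeffs))

print(conditional_density(0.25, 0.75, 0))  # evaluates 0.5 + x at x = 0.25 -> 0.75
```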

SLIDE 7

Mixtures of truncated basis functions (MoTBFs)

Potential advantages of MoTBFs for inference:
  • Univariate MoTBFs do not require domain splitting (unlike the classical approach to MTEs and MOPs).
  • Conditional MoTBFs are piecewise univariate over the head variable. As a consequence, each variable in the BN initially appears explicitly in only one potential.
  • If a variable appears in a potential not as a head variable, it only determines the hypercubes of the conditional density. One can therefore consider a fixed set of possible split points for each variable, regardless of the function in which it appears.
  • There is no need to explicitly store the basis functions.


SLIDE 11

Combination

f(x_1) = \sum_{i=0}^{m-1} a^1_{i,h} \psi_i(x_1) \quad ; \quad f(x_2) = \sum_{i=0}^{m-1} a^2_{i,t} \psi_i(x_2)   (each an MoTBF potential)

f(x_1, x_2) = \left[ \sum_{i=0}^{m-1} a^1_{i,h} \psi_i(x_1) \right] \left[ \sum_{i=0}^{m-1} a^2_{i,t} \psi_i(x_2) \right]   (a factorised MoTBF potential)
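The point of the factorised form can be sketched in code (names and coefficients invented): instead of expanding the product into m² basis terms, the combination simply stores the two factors and multiplies them on evaluation:

```python
# Factorised combination of two univariate MoTBF potentials.
# Coefficients are illustrative; psi_i(x) = x**i (MOP basis) for simplicity.

def make_univariate(coeffs):
    return lambda x: sum(a * x**i for i, a in enumerate(coeffs))

f1 = make_univariate([0.5, 1.0])    # f(x1) = 0.5 + x1
f2 = make_univariate([2.0, -2.0])   # f(x2) = 2 - 2*x2

# Combination: keep the list of factors; do not expand the product of sums.
factorised = [f1, f2]

def evaluate(factors, xs):
    """f(x1, x2, ...) = product of the stored univariate factors."""
    result = 1.0
    for f, x in zip(factors, xs):
        result *= f(x)
    return result

print(evaluate(factorised, (0.5, 0.25)))  # (0.5 + 0.5) * (2 - 0.5) = 1.5
```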


SLIDE 13

Marginalisation

Proposition.

f_{Z \setminus \{Z_j\}}(z_1, \ldots, z_{j-1}, z_{j+1}, \ldots, z_c) = \sum_{l=1}^{r} \left[ \prod_{i \neq j} \sum_{s=0}^{m-1} a^i_{s,\cdot,(h,l)} \psi_s(z_i) \right] \left[ \sum_{s=0}^{m-1} a^j_{s,\cdot,(h,l)} \int_{\Omega^l_{Z_j}} \psi_s(z_j) \, dz_j \right].

Bad news: NOT a factorised MoTBF potential!
Good news: the integrals can be computed off-line, prior to inference, if the split points are fixed.
This kind of potential is called an SP factorised MoTBF potential.
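The "good news" step can be sketched as follows (assuming an MOP basis and made-up split points): with fixed split points, every integral of a basis function over an interval is known before inference starts and can be cached in a lookup table:

```python
# Off-line precomputation of basis-function integrals over fixed intervals.
# MOP basis psi_s(z) = z**s, so the integral over [a, b] has a closed form:
#   int_a^b z**s dz = (b**(s+1) - a**(s+1)) / (s + 1)

splits = [0.0, 0.5, 1.0, 2.0]   # fixed split points (illustrative)
m = 4                           # number of basis functions

# integral_table[(l, s)] = integral of psi_s over the l-th interval
integral_table = {
    (l, s): (splits[l + 1]**(s + 1) - splits[l]**(s + 1)) / (s + 1)
    for l in range(len(splits) - 1)
    for s in range(m)
}

# During inference, marginalising Z_j over interval l only needs lookups:
print(integral_table[(0, 0)])  # length of [0, 0.5] = 0.5
print(integral_table[(2, 1)])  # int_1^2 z dz = 1.5
```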

SLIDE 14

Combination of SP factorised potentials

f_{X_1}(x_1) = \sum_{l=1}^{r_1} f^l_{X_1}(x_1) \quad ; \quad f_{X_2}(x_2) = \sum_{m=1}^{r_2} f^m_{X_2}(x_2)

The combination of f_{X_1} and f_{X_2} is a new potential over the variables X_{12} = X_1 \cup X_2, defined as

f(x_{12}) = \sum_{l=1}^{r_1} \sum_{m=1}^{r_2} f^l_{X_1}(x_{12}^{\downarrow X_1}) f^m_{X_2}(x_{12}^{\downarrow X_2}),

which is again an SP factorised potential.
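A minimal sketch of this operation (all names and numbers invented): represent an SP factorised potential as a list of additive terms, each term a list of univariate factors. Combination is then the cross product of the two term lists, with no multiplication of functions actually carried out:

```python
from itertools import product

# An SP factorised potential: a list of additive terms; each term is a list of
# (variable_name, univariate_function) factors. Functions are illustrative MOPs.

def mop(coeffs):
    return lambda x: sum(a * x**i for i, a in enumerate(coeffs))

fX1 = [[("x1", mop([0.5, 1.0]))],        # one term: 0.5 + x1
       [("x1", mop([1.0]))]]             # plus a second term: 1
fX2 = [[("x2", mop([2.0, -2.0]))]]       # one term: 2 - 2*x2

def combine(f, g):
    """Cross product of term lists: (sum_l f_l)(sum_m g_m) = sum_{l,m} f_l * g_m."""
    return [tf + tg for tf, tg in product(f, g)]

def evaluate(potential, assignment):
    """Sum of terms; each term is the product of its univariate factors."""
    total = 0.0
    for term in potential:
        prod = 1.0
        for var, func in term:
            prod *= func(assignment[var])
        total += prod
    return total

f12 = combine(fX1, fX2)                  # 2 * 1 = 2 terms, still factorised
print(len(f12))                          # 2
print(evaluate(f12, {"x1": 0.5, "x2": 0.25}))  # (1.0 + 1.0) * 1.5 = 3.0
```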

SLIDE 15

Marginalisation of SP factorised potentials

f_Z(z) = \sum_{l=1}^{r} f^l_Z(z)

f_{Z \setminus \{Z_j\}}(z_1, \ldots, z_{j-1}, z_{j+1}, \ldots, z_c) = \sum_{l=1}^{r} f^l_{Z \setminus \{Z_j\}}(z_1, \ldots, z_{j-1}, z_{j+1}, \ldots, z_c)

Again, the result is an SP factorised potential.
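A sketch of this term-wise behaviour (all names and values invented): marginalising Z_j out of an SP factorised potential works term by term, replacing the Z_j factor in each term by its precomputable integral, so the result is again a list of products:

```python
# Term-wise marginalisation of an SP factorised potential.
# Each factor is a (variable, function, integral_over_support) triple; the
# integral is assumed precomputed off-line (values below are illustrative).

fZ = [
    [("z1", lambda x: 0.5 + x, 1.0), ("z2", lambda x: 2 - 2 * x, 1.0)],
    [("z1", lambda x: 1.0, 1.0),     ("z2", lambda x: 3 * x**2, 1.0)],
]

def marginalise(potential, var):
    """Integrate `var` out of every term; its factor becomes a constant.

    Returns a list of (constant, remaining_factors) pairs, one per term.
    """
    result = []
    for term in potential:
        const = 1.0
        kept = []
        for v, func, integral in term:
            if v == var:
                const *= integral      # precomputed off-line lookup
            else:
                kept.append((v, func, integral))
        result.append((const, kept))
    return result

marg = marginalise(fZ, "z2")
print(len(marg))    # still 2 terms: the result is again SP factorised
print(marg[0][0])   # constant from integrating z2 out: 1.0
```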


SLIDE 17

Why are SP factorised potentials of interest?

  • They are closed under marginalisation and combination. Hence, inference algorithms such as Shenoy-Shafer and Variable Elimination can be used.
  • Operations over them are lazy by nature, i.e., handling them actually consists of handling sets of functions (storing, indexing and retrieving them).

SLIDE 18

Classical MTE calculations vs. MoTBFs

Two experiments conducted:

1. MoTBF vs. the classical MTE approach: no splits in head variables; fixed splits in conditionals.
2. Lazy operations on SP factorised potentials vs. classical MTE operations: random split points everywhere.

We use the Variable Elimination algorithm in both cases.


SLIDE 20

Experimental results

[Plot: log(inference time) as a function of the number of variables (5 to 20).]
SLIDE 21

Experimental results

  • Run time is an order of magnitude lower for the new approach.
  • SP factorised potentials (lazy operations) by themselves provide significant improvements.
  • Storage requirements are clearly lower.
  • See the poster for the detailed numbers.


SLIDE 23

Conclusions

  • The MoTBF framework provides important advantages for inference: no split points in head variables, a fixed set of basis functions, and lazy operations.
  • Lazy operations (SP factorised potentials) can be used even under classical MTE calculations.
  • Significant advance w.r.t. previous MTE calculations: savings in space requirements and increased efficiency.

SLIDE 24

Future work

  • Explore the impact (benefit) on efficiency of using fixed split points, and the possible tradeoff with accuracy.
  • Experiments including evidence and incorporating discrete variables.
  • Re-approximate large potentials.
  • Analyse other benefits of using basis functions for inference.