Markov Networks

 Undirected graphical models

 [Figure: undirected graph over Smoking, Cancer, Asthma, Cough]

 Potential functions defined over cliques

     Smoking   Cancer    Φ(S,C)
     False     False     4.5
     False     True      4.5
     True      False     2.7
     True      True      4.5

   P(x) = (1/Z) ∏_c Φ_c(x_c)

   Z = Σ_x ∏_c Φ_c(x_c)
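As a concrete illustration, P(x) can be computed by brute-force enumeration on a toy network with this single clique (a minimal sketch; the variable names and potential values are the ones in the table above):

```python
from itertools import product

# Toy network with the single clique {Smoking, Cancer}; potential values
# come from the table above.
phi_sc = {
    (False, False): 4.5,
    (False, True):  4.5,
    (True,  False): 2.7,
    (True,  True):  4.5,
}

# Partition function Z: sum over all assignments of the product of clique
# potentials (here there is only one clique, so the product is phi_sc itself).
Z = sum(phi_sc[(s, c)] for s, c in product([False, True], repeat=2))

def joint(s, c):
    """P(s, c) = Phi(s, c) / Z."""
    return phi_sc[(s, c)] / Z
```

With more cliques, the product inside both P(x) and Z would run over all clique potentials; the sum over assignments then grows exponentially, which is why exact inference is hard in general.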

Markov Networks

 Undirected graphical models

 Log-linear model:

   P(x) = (1/Z) exp( Σ_i w_i f_i(x) )

 where w_i is the weight of feature f_i.

 Example feature:

   f_1(Smoking, Cancer) = 1 if ¬Smoking ∨ Cancer, 0 otherwise
   w_1 = 1.5
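In the log-linear form, an assignment satisfying the feature contributes exp(w_1 · 1) = e^1.5 ≈ 4.48 to the unnormalized probability, and a violating one contributes exp(0) = 1. A minimal sketch of evaluating P(x) this way, using the feature and weight above:

```python
import math

# Single feature f1(S, C) = 1 if ¬Smoking ∨ Cancer, with weight w1 = 1.5,
# as in the example above.
w1 = 1.5

def f1(smoking, cancer):
    return 1.0 if (not smoking) or cancer else 0.0

def unnormalized(smoking, cancer):
    # exp( sum_i w_i f_i(x) ); here there is a single feature.
    return math.exp(w1 * f1(smoking, cancer))

states = [(s, c) for s in (False, True) for c in (False, True)]
Z = sum(unnormalized(s, c) for s, c in states)

def P(smoking, cancer):
    return unnormalized(smoking, cancer) / Z
```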

Hammersley-Clifford Theorem

If the distribution is strictly positive (P(x) > 0) and the graph encodes its conditional independences, then the distribution is a product of potentials over the cliques of the graph. The converse also holds. ("Markov network = Gibbs distribution")


Markov Nets vs. Bayes Nets

  Property          Bayes Nets            Markov Nets
  Form              Prod. potentials      Prod. potentials
  Potentials        Cond. probabilities   Arbitrary
  Cycles            Forbidden             Allowed
  Partition func.   Z = 1                 Z = ?
  Indep. check      D-separation          Graph separation
  Indep. props.     Some                  Some
  Inference         Convert to Markov     MCMC, BP, etc.

Inference in Markov Networks

 Goal: compute marginals & conditionals of

   P(X) = (1/Z) exp( Σ_i w_i f_i(X) ),   Z = Σ_X exp( Σ_i w_i f_i(X) )

 Exact inference is #P-complete

 Conditioning on the Markov blanket is easy:

   P(x_i | MB(x_i)) = exp( Σ_i w_i f_i(x) ) / [ exp( Σ_i w_i f_i(x[x_i=0]) ) + exp( Σ_i w_i f_i(x[x_i=1]) ) ]

 Gibbs sampling exploits this
MCMC: Gibbs Sampling

  state ← random truth assignment
  for i ← 1 to num-samples do
    for each variable x
      sample x according to P(x | neighbors(x))
      state ← state with new value of x
  P(F) ← fraction of states in which F is true
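The pseudocode above can be sketched in Python for the two-variable log-linear example (the feature, weight, and variable names are carried over from the earlier slide; burn-in is omitted to keep the sketch minimal):

```python
import math
import random

# Log-linear model from the earlier example: one feature,
# f1 = 1 if ¬Smoking ∨ Cancer, with weight 1.5.
VARIABLES = ["Smoking", "Cancer"]
FEATURES = [(lambda x: 1.0 if (not x["Smoking"]) or x["Cancer"] else 0.0, 1.5)]

def score(x):
    # sum_i w_i f_i(x)
    return sum(w * f(x) for f, w in FEATURES)

def gibbs(num_samples, query, seed=0):
    """Estimate P(query) as the fraction of sampled states satisfying it."""
    rng = random.Random(seed)
    state = {v: rng.random() < 0.5 for v in VARIABLES}  # random truth assignment
    hits = 0
    for _ in range(num_samples):
        for v in VARIABLES:
            # Conditioning on the Markov blanket:
            # P(v = 1 | MB(v)) = exp(s1) / (exp(s0) + exp(s1))
            state[v] = False
            s0 = score(state)
            state[v] = True
            s1 = score(state)
            p1 = math.exp(s1) / (math.exp(s0) + math.exp(s1))
            state[v] = rng.random() < p1
        if query(state):
            hits += 1
    return hits / num_samples

# Exact answer for comparison: P(Cancer) = 2e^1.5 / (1 + 3e^1.5) ≈ 0.62
estimate = gibbs(20000, lambda s: s["Cancer"])
```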

Other Inference Methods

 Belief propagation (sum-product)
 Mean field / variational approximations


MAP/MPE Inference

 Goal: Find most likely state of world given evidence:

   arg max_y P(y | x)

 where y is the query and x is the evidence.

MAP Inference Algorithms

 Iterated conditional modes
 Simulated annealing
 Graph cuts
 Belief propagation (max-product)
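As one example, iterated conditional modes can be sketched on the same toy log-linear model: repeatedly set each query variable to the value maximizing the unnormalized score given the rest, which converges to a local maximum of P(y | x) (the feature and variable names are assumptions carried over from the earlier example):

```python
# Iterated conditional modes (ICM) on the toy log-linear model from the
# earlier slides: feature f1 = 1 if ¬Smoking ∨ Cancer, weight 1.5.
FEATURES = [(lambda x: 1.0 if (not x["Smoking"]) or x["Cancer"] else 0.0, 1.5)]

def score(x):
    # sum_i w_i f_i(x); maximizing this maximizes P(x), since Z is constant.
    return sum(w * f(x) for f, w in FEATURES)

def icm(evidence, query_vars):
    """Greedy coordinate ascent; returns a locally optimal MAP assignment."""
    state = dict(evidence)
    for v in query_vars:
        state.setdefault(v, False)   # arbitrary initialization
    changed = True
    while changed:
        changed = False
        for v in query_vars:
            best = max((False, True), key=lambda val: score({**state, v: val}))
            if best != state[v]:
                state[v] = best
                changed = True
    return state

# With evidence Smoking=True, the feature is maximized by Cancer=True.
mode = icm({"Smoking": True}, ["Cancer"])
```

Unlike simulated annealing or graph cuts, ICM can get stuck in local optima; it is used here only because it is the shortest of the listed algorithms to sketch.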