

SLIDE 1

Noisy logical connectives in Bayesian networks

Jirka Vomlel

Institute of Information Theory and Automation Academy of Sciences of the Czech Republic http://www.utia.cz/vomlel

Vienna, 26–28 November 2010

J. Vomlel (ÚTIA AV ČR), Noisy logical connectives in BNs, 26–28 Nov 2010, 1 / 17


SLIDE 4

Outline

• Brief introduction to Causal Probabilistic (CP) logic
• Converting a CP-theory to a Bayesian network
• Efficient probabilistic inference with Bayesian networks



SLIDE 7

A story used throughout the presentation (Meert et al., 2008)

Take a person, named John, who may go to the shop to buy dinner. The probability he does is 20%. He chooses to buy either spaghetti or steak, each with 50% chance. John's girlfriend Mary may also buy dinner. We assume that she cannot contact John during the day. The probability she goes to the shop is 90%. If she goes to the shop, she buys either spaghetti with probability 30% or fish with probability 70%. If John and Mary both buy dinner, it is possible that they both buy spaghetti. If they buy something different, they can choose what they will have for dinner, because two meals have been bought.



SLIDE 12

Necessary notation from predicate logic

• Ground terms will be just constant symbols, e.g. john, mary, spaghetti, steak, fish.
• Predicate symbols have an arity n; e.g. shops/1 and bought/1 are predicate symbols of arity 1.
• If p is a predicate symbol of arity n and a1, . . . , an are ground terms, then p(a1, . . . , an) is a ground atom, e.g. shops(paul), bought(spaghetti).
• The Herbrand base is the set of all ground atoms.
• A Herbrand interpretation is a subset of the Herbrand base, namely the set of ground atoms that are taken to be true.

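The notation above can be made concrete in a few lines; a minimal sketch (my own illustration, not from the slides) that enumerates the Herbrand base of the shopping example. Note that the base is untyped, so atoms such as shops(spaghetti) appear in it as well.

```python
from itertools import product

# Constant symbols (ground terms) and predicate symbols with their arities.
constants = ["john", "mary", "spaghetti", "steak", "fish"]
predicates = {"shops": 1, "bought": 1}

# Herbrand base: every ground atom p(a1, ..., an).
herbrand_base = {
    f"{p}({','.join(args)})"
    for p, arity in predicates.items()
    for args in product(constants, repeat=arity)
}

# A Herbrand interpretation is any subset of the base taken to be true.
interpretation = {"shops(mary)", "bought(fish)"}
assert interpretation <= herbrand_base

print(len(herbrand_base))  # 2 predicates * 5 constants = 10 ground atoms
```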


SLIDE 17

Causal Probabilistic (CP) Logic (Vennekens, 2007)

Definition

A rule is a statement (p1 : α1) ∨ . . . ∨ (pn : αn) ← ϕ, where ϕ is a conjunction of ground atoms or their negations, the pi are ground atoms, and the αi are non-zero probabilities with α1 + . . . + αn ≤ 1. The disjunction (p1 : α1) ∨ . . . ∨ (pn : αn) is the head of the rule and ϕ is the body. Property ϕ causes an event whose effect is that at most one of the properties pi becomes true; for each pi, the probability of it being caused by this event is αi.

Example

(bought(spaghetti) : 0.3) ∨ (bought(fish) : 0.7) ← shops(mary)



SLIDE 19

CP-theory

Definition

A CP-theory is a set of rules.

Example

shops(john) : 0.2 ← ·
shops(mary) : 0.9 ← ·
(bought(spaghetti) : 0.5) ∨ (bought(steak) : 0.5) ← shops(john)
(bought(spaghetti) : 0.3) ∨ (bought(fish) : 0.7) ← shops(mary)

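The example CP-theory can be written down as plain data; the representation below is my own sketch (not from the talk): each rule is a pair (head, body), the head a list of (ground atom, probability) pairs and the body a list of ground atoms that must all hold (empty list = the rule always fires).

```python
# Each rule: (head, body). Head = list of (ground atom, probability);
# body = list of ground atoms; an empty body means the rule always fires.
cp_theory = [
    ([("shops(john)", 0.2)], []),
    ([("shops(mary)", 0.9)], []),
    ([("bought(spaghetti)", 0.5), ("bought(steak)", 0.5)], ["shops(john)"]),
    ([("bought(spaghetti)", 0.3), ("bought(fish)", 0.7)], ["shops(mary)"]),
]

# Sanity check: the head probabilities of each rule sum to at most 1.
for head, body in cp_theory:
    assert 0 < sum(prob for _, prob in head) <= 1
```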


SLIDE 22

Semantics of CP-logic

The semantics of a CP-theory is defined by its execution model. The execution model of a CP-theory is a probabilistic process. Instead of the definition, which is quite technical, we provide an example of an execution model.

SLIDE 23

An execution model of a CP-theory

[Figure: the execution-model tree of the example CP-theory. Each level fixes one choice (does John shop, what John buys, does Mary shop, what Mary buys); edges carry the probabilities 0.2/0.8, 0.5/0.5, 0.9/0.1 and 0.3/0.7, and each leaf lists the set of atoms true in that branch.]


SLIDE 24

Probability of getting to a leaf

[Figure: the same execution tree with one root-to-leaf path highlighted. The probability of reaching a leaf is the product of the edge probabilities along the path, here 0.2 · 0.5 · 0.9 · 0.7 = 0.063.]


SLIDE 25

Marginal probability P(bought(spaghetti))

[Figure: the execution tree with the branches in which bought(spaghetti) holds highlighted. Summing their probabilities within the shops(john) subtree gives 0.2 · (0.5 · (0.9 · (0.3 + 0.7) + 0.1) + 0.5 · 0.9 · 0.3) = 0.127.]

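These tree computations can be checked by brute-force enumeration of the four independent choices; a sketch (my own, not from the talk). It confirms the leaf probability 0.2 · 0.5 · 0.9 · 0.7 = 0.063 and the sub-expression 0.127, which equals the sum over the shops(john) branches in which bought(spaghetti) holds; summing over all branches gives the full marginal.

```python
# Brute-force enumeration of the execution model: four independent
# choices (does John shop, what he buys, does Mary shop, what she buys).
def worlds():
    for john_shops, p1 in [(True, 0.2), (False, 0.8)]:
        for john_buys, p2 in [("spaghetti", 0.5), ("steak", 0.5)]:
            for mary_shops, p3 in [(True, 0.9), (False, 0.1)]:
                for mary_buys, p4 in [("spaghetti", 0.3), ("fish", 0.7)]:
                    bought = set()
                    if john_shops:
                        bought.add(john_buys)
                    if mary_shops:
                        bought.add(mary_buys)
                    yield john_shops, bought, p1 * p2 * p3 * p4

# Leaf probability from the previous slide: 0.2 * 0.5 * 0.9 * 0.7 = 0.063.
assert abs(0.2 * 0.5 * 0.9 * 0.7 - 0.063) < 1e-12

# 0.127 is the part of the marginal coming from worlds where John shops;
# summing over every branch gives P(bought(spaghetti)) itself.
john_part = sum(p for j, b, p in worlds() if j and "spaghetti" in b)
marginal = sum(p for _, b, p in worlds() if "spaghetti" in b)
print(round(john_part, 3), round(marginal, 3))  # 0.127 0.343
```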


SLIDE 30

Construction of the Bayesian network (BN) of a CP-theory (Meert et al., 2008)

1. For every atom in the CP-theory, a Boolean variable is created.
2. For every rule in the CP-theory, a choice variable is created. This variable can take n + 1 values, where n is the number of atoms in the head.
3. Note that the choice variables are unobserved; they are auxiliary nodes that do not correspond to atoms in the domain.
4. If an atom is in the head of a rule, an edge is created from its corresponding choice node towards the atom's node.
5. If a literal is in the body of a rule, an edge is created from the literal's atom node towards the rule's choice node.

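The five steps can be sketched mechanically for the running example; the code below is my own minimal rendering of the construction (choice-variable names C1–C4 chosen to match the figures), not code from the talk.

```python
# Rules of the example CP-theory: (head atoms with probs, body atoms).
rules = [
    ([("shops(john)", 0.2)], []),
    ([("shops(mary)", 0.9)], []),
    ([("bought(spaghetti)", 0.5), ("bought(steak)", 0.5)], ["shops(john)"]),
    ([("bought(spaghetti)", 0.3), ("bought(fish)", 0.7)], ["shops(mary)"]),
]

nodes, edges = set(), set()
for i, (head, body) in enumerate(rules, start=1):
    choice = f"C{i}"              # step 2: one choice node per rule,
    nodes.add(choice)             # with len(head) + 1 possible values
    for atom, _ in head:
        nodes.add(atom)           # step 1: a Boolean node per atom
        edges.add((choice, atom))  # step 4: choice node -> head atom
    for atom in body:
        nodes.add(atom)
        edges.add((atom, choice))  # step 5: body atom -> choice node

print(sorted(edges))
```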


SLIDE 32

The BN of a CP-theory - Example

[Figure: the BN constructed from the example CP-theory, with atom nodes shops(john), shops(mary), bought(spaghetti), bought(steak), bought(fish) and choice nodes C1, C2, C3, C4.]

For a complete specification of the BN (including the numerical parameters in the conditional probability tables), see shopping-example-orig.net in Hugin Light.

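The conditional probability tables live in the Hugin file; the sketch below shows one plausible way such tables are filled for this construction (my reading of the Meert et al. scheme, not the file's actual contents): a choice variable takes head value i with probability αi when its body holds and the extra "no effect" value otherwise, and an atom node is a deterministic OR over its choice parents selecting it.

```python
# Choice variable C4 for the rule
#   (bought(spaghetti):0.3) v (bought(fish):0.7) <- shops(mary).
# Values: 0 = "no effect", 1 = spaghetti, 2 = fish (n + 1 = 3 values).
ALPHAS = {1: 0.3, 2: 0.7}  # head probabilities of the rule

def p_c4(value, shops_mary):
    """CPT entry P(C4 = value | shops(mary))."""
    if not shops_mary:
        # Body false: the rule cannot fire, so "no effect" is certain.
        return 1.0 if value == 0 else 0.0
    if value == 0:
        return 1.0 - sum(ALPHAS.values())  # leftover mass (0 here)
    return ALPHAS[value]

# Atom node bought(spaghetti): deterministically true iff some parent
# choice variable selected it (value 1 in both C3 and C4).
def bought_spaghetti(c3_value, c4_value):
    return c3_value == 1 or c4_value == 1

assert p_c4(1, True) == 0.3 and p_c4(0, False) == 1.0
```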


SLIDE 35

Modified BN of a CP-theory - Example

[Figure: the modified BN. Auxiliary nodes C3.1, C3.2, C4.1 and C4.2 connect the choice nodes C3 and C4 to bought(spaghetti), bought(fish) and bought(steak); this part of the network is deterministic.]

For the modified BN and computations with this BN, see shopping-example.net in Hugin Light. Compare the behavior with evidence bought(spaghetti) and with ¬bought(spaghetti).



SLIDE 37

A part of a BN with the OR function

[Figure: a BN fragment in which symptom(anorexia) depends, through choice nodes C1, C2, . . . , Cn, on disease(cancer), disease(hypervitaminosis D) and disease(anorexia nervosa).]

This is a well-known model, often called noisy-OR.

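The noisy-OR function has a simple closed form: the effect is absent only if every present cause independently fails to produce it. A sketch with made-up causal probabilities (the numbers are illustrative, not from the talk):

```python
# Noisy-OR: each present cause independently produces the effect with
# probability p[cause]; the effect is absent only if all present causes
# fail, so P(Y=1 | x) = 1 - prod over present causes of (1 - p[cause]).
def noisy_or(present, p):
    q = 1.0  # probability that no present cause produces the effect
    for cause in present:
        q *= 1.0 - p[cause]
    return 1.0 - q

# Illustrative (made-up) parameters for the slide's anorexia example.
p = {"cancer": 0.6, "hypervitaminosis D": 0.4, "anorexia nervosa": 0.9}
print(round(noisy_or({"cancer", "anorexia nervosa"}, p), 2))  # 1 - 0.4*0.1 = 0.96
```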


SLIDE 44

Efficient probabilistic inference in BNs containing noisy-OR

Probabilistic inference is the computation of P(Xi | e) for all variables Xi of the BN, where e stands for the evidence collected so far. A standard method for probabilistic inference is the junction tree method, which exploits the global structure of the model, specifically conditional independence among variables. Additionally, the local structure of noisy-OR can be exploited to perform computations more efficiently:

• Parent divorcing method (Olesen et al., 1989).
• Quickscore (Heckerman, 1990).
• Factorization of noisy-MAX (Díez & Galán, 2003).
• Rank-one decomposition (Savický & Vomlel, 2007).

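Why exploiting local structure pays off: an explicit CPT for an effect with n noisy-OR parents has 2^n entries, while the product form needs only the n causal parameters. A toy comparison (assumed parameters, my own illustration):

```python
from itertools import product

p = [0.3, 0.5, 0.2, 0.7]  # causal probabilities for n = 4 parents

def noisy_or(x):
    """P(Y=1 | parent configuration x), computed from the n parameters."""
    q = 1.0
    for xi, pi in zip(x, p):
        if xi:
            q *= 1.0 - pi
    return 1.0 - q

# The explicit CPT tabulates the same function over all 2^n configurations.
cpt = {x: noisy_or(x) for x in product([0, 1], repeat=len(p))}

print(len(cpt), "table entries vs", len(p), "parameters")  # 16 table entries vs 4 parameters
```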


SLIDE 47

Comparison of ROD and PD

Comparison of the size of arithmetic circuits used for probabilistic inference in BN2O networks after the parent divorcing method (horizontal axis) and the rank-one decomposition (vertical axis).

[Figure: scatter plot of the two circuit sizes; both axes use a decadic (base-10) logarithmic scale.]



SLIDE 53

Current and future work

Current work
• Graph triangulation methods for BN2O networks.
• The rank-one decomposition of tables of ℓ-out-of-k functions.

Future work
• Approximate inference based on rank-one decomposition.
• An application in medicine.
