Journal of Artificial Intelligence Research 12 (2000) 219-234    Submitted 5/99; published 5/00

Randomized Algorithms for the Loop Cutset Problem

Ann Becker                                          anyuta@cs.technion.ac.il
Reuven Bar-Yehuda                                   reuven@cs.technion.ac.il
Dan Geiger                                          dang@cs.technion.ac.il
Computer Science Department, Technion, Haifa, 32000, Israel

Abstract

We show how to find a minimum weight loop cutset in a Bayesian network with high probability. Finding such a loop cutset is the first step in the method of conditioning for inference. Our randomized algorithm for finding a loop cutset outputs a minimum loop cutset after O(c·6^k·k·n) steps with probability at least 1 − (1 − 1/6^k)^{c·6^k}, where c > 1 is a constant specified by the user, k is the minimal size of a minimum weight loop cutset, and n is the number of vertices. We also show empirically that a variant of this algorithm often finds a loop cutset that is closer to the minimum weight loop cutset than the ones found by the best deterministic algorithms known.

1. Introduction

The method of conditioning is a well-known inference method for the computation of posterior probabilities in general Bayesian networks (Pearl, 1986, 1988; Suermondt & Cooper, 1990; Peot & Shachter, 1991) as well as for finding MAP values and solving constraint satisfaction problems (Dechter, 1999). This method has two conceptual phases: first, find an optimal or close-to-optimal loop cutset, and then perform a likelihood computation for each instance of the variables in the loop cutset. This method is routinely used by geneticists via several genetic linkage programs (Ott, 1991; Lange, 1997; Becker, Geiger, & Schaffer, 1998). A variant of this method was developed by Lange and Elston (1975).

Finding a minimum weight loop cutset is NP-complete and thus heuristic methods have often been applied to find a reasonable loop cutset (Suermondt & Cooper, 1990). Most methods in the past had no guarantee of performance and performed very badly when presented with an appropriate example. Becker and Geiger (1994, 1996) offered an algorithm that finds a loop cutset for which the logarithm of the state space is guaranteed to be at most a constant factor off the optimal value. An adaptation of these approximation algorithms has been included in version 4.0 of FASTLINK, a popular software for analyzing large pedigrees with a small number of genetic markers (Becker et al., 1998). Similar algorithms in the context of undirected graphs are described by Bafna, Berman, and Fujito (1995) and Fujito (1996).

While approximation algorithms for the loop cutset problem are quite useful, it is still worthwhile to invest in finding a minimum loop cutset rather than an approximation, because the cost of finding such a loop cutset is amortized over the many iterations of the conditioning method. In fact, one may invest an effort of complexity exponential in the size of the loop cutset in finding a minimum weight loop cutset, because the second phase of the conditioning algorithm, which is repeated for many iterations, uses a procedure of such complexity.

© 2000 AI Access Foundation and Morgan Kaufmann Publishers. All rights reserved.
The same considerations apply also to constraint satisfaction problems as well as other problems in which the method of conditioning is useful (Dechter, 1990, 1999).

In this paper we describe several randomized algorithms that compute a loop cutset. As done by Bar-Yehuda, Geiger, Naor, and Roth (1994), our solution is based on a reduction to the weighted feedback vertex set problem. A feedback vertex set (FVS) F is a set of vertices of an undirected graph G = (V, E) such that by removing F from G, along with all the edges incident with F, a set of trees is obtained. The Weighted Feedback Vertex Set (WFVS) problem is to find a feedback vertex set F of a vertex-weighted graph with a weight function w: V → R+, such that Σ_{v∈F} w(v) is minimized. When w(v) ≡ 1, this problem is called the FVS problem. The decision version associated with the FVS problem is known to be NP-complete (Garey & Johnson, 1979, pp. 191-192).

Our randomized algorithm for finding a WFVS, called RepeatedWGuessI, outputs a minimum weight FVS after O(c·6^k·k·n) steps with probability at least 1 − (1 − 1/6^k)^{c·6^k}, where c > 1 is a constant specified by the user, k is the minimal size of a minimum weight FVS, and n is the number of vertices. For unweighted graphs we present an algorithm that finds a minimum FVS of a graph G after O(c·4^k·k·n) steps with probability at least 1 − (1 − 1/4^k)^{c·4^k}. In comparison, several deterministic algorithms for finding a minimum FVS are described in the literature. One has a complexity of O((2k+1)^k·n^2) (Downey & Fellows, 1995b) and others have a complexity of O((17k^4)!·n) (Bodlaender, 1990; Downey & Fellows, 1995a).

A final variant of our randomized algorithms, called WRA, has the best performance because it utilizes information from previous runs. This algorithm is harder to analyze and its investigation is mostly experimental. We show empirically that the actual run time of WRA is comparable to a Modified Greedy Algorithm (MGA), described by Becker and Geiger (1996), which is the best available deterministic algorithm for finding close-to-optimal loop cutsets, and yet, the output of WRA is often closer to the minimum weight loop cutset than the output of MGA.

The rest of the paper is organized as follows. In Section 2 we outline the method of conditioning, explain the related loop cutset problem, and describe the reduction from the loop cutset problem to the WFVS problem. In Section 3 we present three randomized algorithms for the WFVS problem and their analysis. In Section 4 we compare experimentally WRA and MGA with respect to output quality and run time.

2. Background: The Loop Cutset Problem

A short overview of the method of conditioning and definitions related to Bayesian networks are given below. See the book by Pearl (1988) for more details. We then define the loop cutset problem.

Let P(u_1, ..., u_n) be a probability distribution where each variable u_i has a finite set of possible values called the domain of u_i. A directed graph D with no directed cycles is called a Bayesian network of P if there is a 1-1 mapping between {u_1, ..., u_n} and the vertices of D, such that u_i is associated with vertex i and P can be written as follows:

    P(u_1, ..., u_n) = Π_{i=1}^{n} P(u_i | u_{i_1}, ..., u_{i_{j(i)}})            (1)

where i_1, ..., i_{j(i)} are the source vertices of the incoming edges to vertex i in D.
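Equation (1) says the joint distribution factors into one conditional table per vertex, indexed by that vertex's parents. As a quick illustration (the tables below are toy numbers of ours, not from the paper), consider a three-variable chain A → B → C:

```python
# Eq. (1) on a toy chain A -> B -> C:
# P(a, b, c) = P(a) * P(b | a) * P(c | b).  All tables are made up.
P_A = {0: 0.6, 1: 0.4}
P_B_given_A = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.2, (1, 1): 0.8}  # key: (b, a)
P_C_given_B = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.5, (1, 1): 0.5}  # key: (c, b)

def joint(a, b, c):
    """Product of the local conditional probabilities, as in Eq. (1)."""
    return P_A[a] * P_B_given_A[(b, a)] * P_C_given_B[(c, b)]

# Summing over all value assignments recovers 1, as for any distribution.
total = sum(joint(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1))
```

Any directed acyclic graph works the same way; only the parent sets i_1, ..., i_{j(i)} change.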
Suppose now that some variables {v_1, ..., v_l} among {u_1, ..., u_n} are assigned specific values {v̄_1, ..., v̄_l} respectively. The updating problem is to compute the probability P(u_i | v_1 = v̄_1, ..., v_l = v̄_l) for i = 1, ..., n.

A trail in a Bayesian network is a subgraph whose underlying graph is a simple path. A vertex b is called a sink with respect to a trail t if there exist two consecutive edges a → b and b ← c on t. A trail t is active by a set of vertices Z if (1) every sink with respect to t either is in Z or has a descendant in Z, and (2) every other vertex along t is outside Z. Otherwise, the trail is said to be blocked (d-separated) by Z.

Verma and Pearl proved that if D is a Bayesian network of P(u_1, ..., u_n) and all trails between a vertex in {r_1, ..., r_l} and a vertex in {s_1, ..., s_k} are blocked by {t_1, ..., t_m}, then the corresponding sets of variables {u_{r_1}, ..., u_{r_l}} and {u_{s_1}, ..., u_{s_k}} are independent conditioned on {u_{t_1}, ..., u_{t_m}} (Verma & Pearl, 1988). Furthermore, Geiger and Pearl proved that this result cannot be enhanced (Geiger & Pearl, 1990). Both results were presented and extended by Geiger, Verma, and Pearl (1990).

Using the close relationship between blocked trails and conditional independence, Kim and Pearl developed an algorithm update-tree that solves the updating problem on Bayesian networks in which every two vertices are connected with at most one trail (Kim & Pearl, 1983). Pearl then solved the updating problem on any Bayesian network as follows (Pearl, 1986). First, a set of vertices S is selected such that any two vertices in the network are connected by at most one active trail in S ∪ Z, where Z is any subset of vertices. Then, update-tree is applied once for each combination of value assignments to the variables corresponding to S, and, finally, the results are combined. This algorithm is called the method of conditioning and its complexity grows exponentially with the size of S. The set S is called a loop cutset. Note that when the domain size of the variables varies, then update-tree is called a number of times equal to the product of the domain sizes of the variables whose corresponding vertices participate in the loop cutset. If we take the logarithm of the domain size (number of values) as the weight of a vertex, then finding a loop cutset such that the sum of its vertices' weights is minimum optimizes Pearl's updating algorithm in the case where the domain sizes may vary.

We now give an alternative definition for a loop cutset S and then provide a probabilistic algorithm for finding it. This definition is borrowed from a paper by Bar-Yehuda et al. (1994). The underlying graph G of a directed graph D is the undirected graph formed by ignoring the directions of the edges in D. A cycle in G is a path whose two terminal vertices coincide. A loop in D is a subgraph of D whose underlying graph is a cycle. A vertex v is a sink with respect to a loop Γ if the two edges adjacent to v in Γ are directed into v. Every loop must contain at least one vertex that is not a sink with respect to that loop. Each vertex that is not a sink with respect to a loop Γ is called an allowed vertex with respect to Γ. A loop cutset of a directed graph D is a set of vertices that contains at least one allowed vertex with respect to each loop in D. The weight of a set of vertices X is denoted by w(X) and is equal to Σ_{v∈X} w(v), where w(x) = log(|x|) and |x| is the size of the domain associated with vertex x. A minimum weight loop cutset of a weighted directed graph D is a loop cutset F* of D for which w(F*) is minimum over all loop cutsets of D. The Loop Cutset Problem is defined as finding a minimum weight loop cutset of a given weighted directed graph D.
The approach we take is to reduce the loop cutset problem to the weighted feedback vertex set problem, as done by Bar-Yehuda et al. (1994). We now define the weighted feedback vertex set problem and then the reduction.

Let G = (V, E) be an undirected graph, and let w: V → R+ be a weight function on the vertices of G. A feedback vertex set of G is a subset of vertices F ⊆ V such that each cycle in G passes through at least one vertex in F. In other words, a feedback vertex set F is a set of vertices of G such that by removing F from G, along with all the edges incident with F, we obtain a set of trees (i.e., a forest). The weight of a set of vertices X is denoted (as before) by w(X) and is equal to Σ_{v∈X} w(v). A minimum feedback vertex set of a weighted graph G with a weight function w is a feedback vertex set F* of G for which w(F*) is minimum over all feedback vertex sets of G. The Weighted Feedback Vertex Set (WFVS) Problem is defined as finding a minimum feedback vertex set of a given weighted graph G having a weight function w.

The reduction is as follows. Given a weighted directed graph (D, w) (e.g., a Bayesian network), we define the splitting weighted undirected graph D_s with a weight function w_s as follows. Split each vertex v in D into two vertices v_in and v_out in D_s such that all incoming edges to v in D become undirected edges incident with v_in in D_s, and all outgoing edges from v in D become undirected edges incident with v_out in D_s. In addition, connect v_in and v_out in D_s by an undirected edge. Now set w_s(v_in) = ∞ and w_s(v_out) = w(v). For a set of vertices X in D_s, we define Ψ(X) as the set obtained by replacing each vertex v_in or v_out in X by the respective vertex v in D from which these vertices originated. Note that if X is a cycle in D_s, then Ψ(X) is a loop in D, and if Y is a loop in D, then Ψ^{-1}(Y) = ∪_{v∈Y} Ψ^{-1}(v) is a cycle in D_s, where

    Ψ^{-1}(v) = v_in               if v is a sink on Y,
                v_out              if v is a source on Y,
                {v_in, v_out}      otherwise.

(A vertex v is a source with respect to a loop Y if the two edges adjacent to v in Y originate from v.) This mapping between loops in D and cycles in D_s is one-to-one and onto. Our algorithm can now be easily stated.

ALGORITHM LoopCutset
Input: A Bayesian network D
Output: A loop cutset of D

1. Construct the splitting graph D_s with weight function w_s;
2. Find a feedback vertex set F for (D_s, w_s) using the Weighted Randomized Algorithm (WRA);
3. Output Ψ(F).

It is immediately seen that if WRA (developed in later sections) outputs a feedback vertex set F of D_s whose weight is minimum with high probability, then Ψ(F) is a loop cutset of D with minimum weight with the same probability. This observation holds due to the one-to-one and onto correspondence between loops in D and cycles in D_s and because WRA never chooses a vertex that has an infinite weight.
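The splitting construction is mechanical, and a short sketch may help; the code below is our illustration (names such as split_graph and psi are ours, not the paper's). Each vertex v yields v_in with infinite weight and v_out with weight w(v), and Ψ simply strips the in/out tags:

```python
import math

def split_graph(directed_edges, weights):
    """Build the splitting graph D_s: return (undirected edges, weights)."""
    ws, edges = {}, []
    for v, w in weights.items():
        ws[(v, "in")] = math.inf             # v_in can never enter a finite-weight FVS
        ws[(v, "out")] = w
        edges.append(((v, "in"), (v, "out")))
    for u, v in directed_edges:              # u -> v becomes the edge (u_out, v_in)
        edges.append(((u, "out"), (v, "in")))
    return edges, ws

def psi(vertex_set):
    """The mapping Psi: collapse v_in / v_out back to the original vertex v."""
    return {v for v, _tag in vertex_set}
```

For a directed graph with n vertices and m edges, D_s has 2n vertices and n + m edges; a directed 3-cycle a → b → c → a, for instance, splits into an undirected 6-cycle.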
3. Algorithms for the WFVS Problem

Recall that a feedback vertex set of G is a subset of vertices F ⊆ V such that each cycle in G passes through at least one vertex in F. In Section 3.1 we address the problem of finding a FVS with a minimum number of vertices, and in Sections 3.2 and 3.3 we address the problem of finding a FVS with a minimum weight. Throughout, we allow G to have parallel edges. If two vertices u and v have parallel edges between them, then every FVS of G includes either u, v, or both.

3.1 The Basic Algorithms

In this section we present a randomized algorithm for the FVS problem. First we introduce some additional terminology and notation. Let G = (V, E) be an undirected graph. The degree of a vertex v in G, denoted by d(v), is the number of vertices adjacent to v. A self-loop is an edge with two endpoints at the same vertex. A leaf is a vertex with degree less or equal 1, a linkpoint is a vertex with degree 2, and a branchpoint is a vertex with degree strictly higher than 2. The cardinality of a set X is denoted by |X|. A graph is called rich if every vertex is a branchpoint and it has no self-loops.

Given a graph G, by repeatedly removing all leaves, and bypassing with an edge every linkpoint, a graph G' is obtained such that the size of a minimum FVS in G and in G' are equal and every minimum FVS of G' is a minimum FVS of G. Since every vertex involved in a self-loop belongs to every FVS, we can transform G' to a rich graph G_r by adding the vertices involved in self-loops to the output of the algorithm.

Our algorithm is based on the observation that if we pick an edge at random from a rich graph, there is a probability of at least 1/2 that at least one endpoint of the edge belongs to any given FVS F. A precise formulation of this claim is given by Lemma 1, whose proof is given implicitly by Voss (1968, Lemma 4).

Lemma 1 Let G = (V, E) be a rich graph, F be a feedback vertex set of G, and X = V \ F. Let E_X denote the set of edges in E whose endpoints are all vertices in X, and E_{F,X} denote the set of edges in G that connect vertices in F with vertices in X. Then, |E_X| < |E_{F,X}|.

Proof. The graph obtained by deleting a feedback vertex set F of a graph G(V, E) is a forest with vertices X = V \ F. Hence, |E_X| < |X|. However, each vertex in X is a branchpoint in G, and so,

    3|X| ≤ Σ_{v∈X} d(v) = |E_{F,X}| + 2|E_X|.

Thus, |E_X| < |E_{F,X}|. □

Lemma 1 implies that when picking an edge at random from a rich graph, it is at least as likely to pick an edge in E_{F,X} as an edge in E_X. Consequently, selecting a vertex at random from a randomly selected edge has a probability of at least 1/4 to belong to a minimum FVS. This idea yields a simple algorithm to find a FVS.
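As a concrete check of Lemma 1 (our example, not the paper's), take G = K4, the complete graph on four vertices: it is rich, and F = {0, 1} is a feedback vertex set because deleting it leaves only the single edge (2, 3):

```python
from itertools import combinations

edges = list(combinations(range(4), 2))                   # the six edges of K4
F, X = {0, 1}, {2, 3}

E_X  = [e for e in edges if set(e) <= X]                  # both endpoints in X
E_FX = [e for e in edges if set(e) & F and set(e) & X]    # one endpoint in each

# Lemma 1: |E_X| < |E_{F,X}|.  Here 1 < 4, so a uniformly random edge
# touches F with probability 5/6 >= 1/2, and a random endpoint of a
# random edge lands in F with probability >= 1/4.
touching_F = [e for e in edges if set(e) & F]
```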
ALGORITHM SingleGuess(G, j)
Input: An undirected graph G and an integer j > 0.
Output: A feedback vertex set F of size ≤ j, or "Fail" otherwise.

For i = 1, ..., j:
1. Reduce G_{i-1} to a rich graph G_i while placing self-loop vertices in F.
2. If G_i is the empty graph, Return F.
3. Pick an edge e = (u, v) at random from E_i.
4. Pick a vertex v_i at random from {u, v}.
5. F ← F ∪ {v_i}
6. V ← V \ {v_i}
Return "Fail"

Due to Lemma 1, when SingleGuess(G, j) terminates with a FVS of size j, there is a probability of at least 1/4^j that the output is a minimum FVS. Note that steps 3 and 4 in SingleGuess determine a vertex v by first selecting an arbitrary edge and then selecting an arbitrary endpoint of this edge. An equivalent way of achieving the same selection rule is to choose a vertex with probability proportional to its degree:

    p(v) = d(v) / Σ_{u∈V} d(u) = d(v) / (2|E|)

To see the equivalence of these two selection methods, define δ(v) to be the set of edges whose one endpoint is v, and note that for graphs without self-loops,

    p(v) = Σ_{e∈δ(v)} p(v|e) · p(e) = (1/2) Σ_{e∈δ(v)} p(e) = d(v) / (2|E|)

This equivalent phrasing of the selection criterion is easier to extend to the weighted case and will be used in the following sections.

An algorithm for finding a minimum FVS with high probability, which we call RepeatedGuess, can now be described as follows. Start with j = 1. Repeat SingleGuess c·4^j times, where c > 1 is a parameter defined by the user. If in one of the iterations a FVS of size ≤ j is found, then output this FVS; otherwise, increase j by one and continue.

ALGORITHM RepeatedGuess(G, c)
Input: An undirected graph G and a constant c > 1.
Output: A feedback vertex set F.

For j = 1, ..., |V|:
    Repeat c·4^j times:
        1. F ← SingleGuess(G, j)
        2. If F is not "Fail" then Return F
    End {Repeat}
End {For}
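The two procedures above can be rendered as a short program. The sketch below is our interpretation, not the authors' implementation: graphs are plain lists of vertex-pair tuples (so parallel edges survive the reduction), and None stands in for "Fail".

```python
import random
from collections import Counter

def reduce_rich(edges, fvs):
    """Repeatedly move self-loop vertices into fvs, delete leaves, and
    bypass linkpoints, until the graph is rich or empty."""
    edges = list(edges)
    while True:
        loops = {u for u, v in edges if u == v}
        if loops:                                 # self-loop vertices join every FVS
            fvs |= loops
            edges = [e for e in edges if loops.isdisjoint(e)]
            continue
        deg = Counter()
        for u, v in edges:
            deg[u] += 1
            deg[v] += 1
        leaves = {v for v, d in deg.items() if d <= 1}
        if leaves:
            edges = [e for e in edges if leaves.isdisjoint(e)]
            continue
        link = next((v for v, d in deg.items() if d == 2), None)
        if link is None:                          # every vertex is a branchpoint: rich
            return edges
        inc = [e for e in edges if link in e]
        (a, b), (c, d) = inc
        n1, n2 = (b if a == link else a), (d if c == link else c)
        edges = [e for e in edges if link not in e] + [(n1, n2)]  # bypass linkpoint

def single_guess(edges, j):
    """Return a FVS of size <= j, or None in place of "Fail"."""
    fvs = set()
    for _ in range(j):
        edges = reduce_rich(edges, fvs)
        if not edges:
            return fvs
        u, v = random.choice(edges)               # random edge, then random
        fvs.add(random.choice((u, v)))            # endpoint: hits a FVS with
        pick = max(fvs & {u, v}, default=u, key=lambda x: x in fvs)  # prob >= 1/4
        edges = [e for e in edges if not (set(e) & fvs)]
    return fvs if not reduce_rich(edges, fvs) else None

def repeated_guess(edges, c=2):
    """Try c * 4**j guesses for j = 1, 2, ... as in RepeatedGuess."""
    n = len({v for e in edges for v in e})
    for j in range(1, n + 1):
        for _ in range(int(c * 4 ** j)):
            f = single_guess(edges, j)
            if f is not None:
                return f
```

On K4, for example, no single vertex breaks every cycle, so j = 1 guesses fail and the search succeeds at j = 2, the size of a minimum FVS.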
The main claims about these algorithms are given by the following theorem.

Theorem 2 Let G be an undirected graph and c > 1 be a constant. Then, SingleGuess(G, k) outputs a FVS whose expected size is no more than 4k, and RepeatedGuess(G, c) outputs, after O(c·4^k·k·n) steps, a minimum FVS with probability at least 1 − (1 − 1/4^k)^{c·4^k}, where k is the size of a minimum FVS and n is the number of vertices.

The claims about the probability of success and the number of steps follow immediately from the fact that the probability of success of SingleGuess(G, j) is at least (1/4)^j and that, in case of success, O(c·4^j) iterations are performed, each taking O(j·n) steps. The result follows from the fact that Σ_{j=1}^{k} j·4^j is of order O(k·4^k). The proof about the expected size of a single guess is presented in the next section.

Theorem 2 shows that each guess produces a FVS which, on the average, is not too far from the minimum, and that after enough iterations, the algorithm converges to the minimum with high probability. In the weighted case, discussed next, we managed to achieve each of these two guarantees in separate algorithms, but we were unable to achieve both guarantees in a single algorithm.

3.2 The Weighted Algorithms

We now turn to the weighted FVS problem (WFVS) of size k, which is to find a feedback vertex set F of a vertex-weighted graph (G, w), w: V → R+, of size less or equal k such that w(F) is minimized. Note that for the weighted FVS problem we cannot replace each linkpoint v with an edge, because if v has weight lighter than its branchpoint neighbors then v can participate in a minimum weight FVS of size ≤ k.

A graph is called branchy if it has no endpoints, no self-loops, and, in addition, each linkpoint is connected only to branchpoints (Bar-Yehuda, Geiger, Naor, & Roth, 1994). Given a graph G, by repeatedly removing all leaves, and bypassing with an edge every linkpoint that has a neighbor with equal or lighter weight, a graph G' is obtained such that the weight of a minimum weight FVS (of size ≤ k) in G and in G' are equal and every minimum WFVS of G' is a minimum WFVS of G. Since every vertex with a self-loop belongs to every FVS, we can transform G' to a branchy graph without self-loops by adding the vertices involved in self-loops to the output of the algorithm.

To address the WFVS problem we offer two slight modifications to the algorithm SingleGuess presented in the previous section. The first algorithm, which we call SingleWGuessI, is identical to SingleGuess except that in each iteration we make a reduction to a branchy graph instead of a reduction to a rich graph. It chooses a vertex with probability proportional to the degree using p(v) = d(v) / Σ_{u∈V} d(u). Note that this probability does not take the weight of a vertex into account. A second algorithm, which we call SingleWGuessII, chooses a vertex with probability proportional to the ratio of its degree over its weight,

    p(v) = (d(v)/w(v)) / Σ_{u∈V} (d(u)/w(u)).            (2)
ALGORITHM SingleWGuessI(G, j)
Input: An undirected weighted graph G and an integer j > 0.
Output: A feedback vertex set F of size ≤ j, or "Fail" otherwise.

For i = 1, ..., j:
1. Reduce G_{i-1} to a branchy graph G_i(V_i, E_i) while placing self-loop vertices in F.
2. If G_i is the empty graph, Return F.
3. Pick a vertex v_i ∈ V_i at random with probability p_i(v) = d_i(v) / Σ_{u∈V_i} d_i(u).
4. F ← F ∪ {v_i}
5. V ← V \ {v_i}
Return "Fail"

The second algorithm uses Eq. 2 for computing p(v) in Line 3.

These two algorithms have remarkably different guarantees of performance. Version I guarantees that the probability of choosing a vertex that belongs to any given FVS is larger than 1/6; however, the expected weight of a FVS produced by Version I cannot be bounded by a constant times the weight of a minimum WFVS. Version II guarantees that the expected weight of its output is bounded by 6 times the weight of a minimum WFVS; however, the probability of converging to a minimum after any fixed number of iterations can be arbitrarily small. We first demonstrate the negative claims via an example. The positive claims are phrased more precisely in Theorem 3 and proven thereafter.

Consider the graph shown in Figure 1 with three vertices a, b and c, and corresponding weights w(a) = 6, w(b) = 3ε and w(c) = 3m, with three parallel edges between a and b, and three parallel edges between a and c. The minimum WFVS F* with size 1 consists of vertex a. According to Version II, the probability of choosing vertex a is (Eq. 2):

    p(a) = ε / (ε(1 + 1/m) + 1)

So if ε is arbitrarily small and m is sufficiently large, then the probability of choosing vertex a is arbitrarily small. Thus, the probability of choosing a vertex from some F* by the criterion d(v)/w(v), as done by Version II, can be arbitrarily small. If, on the other hand, Version I is used, then the probability of choosing a, b, or c is 1/2, 1/4, 1/4, respectively. Thus, the expected weight of the first vertex to be chosen is (3/4)·(ε + m + 4), while the weight of a minimum WFVS is 6. Consequently, if m is sufficiently large, the expected weight of a WFVS found by Version I can be arbitrarily larger than that of a minimum WFVS.

The algorithm for repeated guesses, which we call RepeatedWGuessI(G, c, j), is as follows: repeat SingleWGuessI(G, j) c·6^j times, where j is the minimal number of vertices of a minimum weight FVS we seek. If no FVS is found of size ≤ j, the algorithm outputs that the size of a minimum WFVS is larger than j with high probability; otherwise, it outputs the lightest FVS of size less or equal j among those explored. The following theorem summarizes the main claims.
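The Figure 1 example is easy to verify numerically. The short check below (our code) computes both selection rules for the weights used in the text and confirms the closed form for p(a) under Version II:

```python
eps, m = 1e-4, 1e4                      # a small epsilon and a large m
deg = {"a": 6, "b": 3, "c": 3}          # three parallel edges to b and to c
w = {"a": 6.0, "b": 3 * eps, "c": 3 * m}

# Version I: probability proportional to degree -> 1/2, 1/4, 1/4.
total_d = sum(deg.values())
p1 = {v: deg[v] / total_d for v in deg}

# Version II (Eq. 2): probability proportional to degree / weight.
ratio = {v: deg[v] / w[v] for v in deg}
total_r = sum(ratio.values())
p2 = {v: ratio[v] / total_r for v in ratio}

# Closed form from the text: p(a) = eps / (eps*(1 + 1/m) + 1).
closed = eps / (eps * (1 + 1 / m) + 1)

# Expected weight of the first vertex under Version I: (3/4)*(eps + m + 4).
expected_w1 = sum(w[v] * p1[v] for v in w)
```

With these values Version II almost never picks a (the unique minimum WFVS), while Version I picks a half the time but pays an expected weight dominated by the heavy vertex c.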
Figure 1: A graph whose minimum WFVS is F* = {a}; w(a) = 6, w(b) = 3ε, w(c) = 3m.

Theorem 3 Let G be a weighted undirected graph and c > 1 be a constant.
a) The algorithm RepeatedWGuessI(G, c, k) outputs after O(c·6^k·k·n) steps a minimum WFVS with probability at least 1 − (1 − 1/6^k)^{c·6^k}, where k is the minimal size of a minimum weight FVS of G and n is the number of vertices.
b) The algorithm SingleWGuessII(G, k) outputs a feedback vertex set whose expected weight is no more than six times the weight of a minimum weight FVS.

The proof of each part requires a preliminary lemma.

Lemma 4 Let G = (V, E) be a branchy graph, F be a feedback vertex set of G, and X = V \ F. Let E_X denote the set of edges in E whose endpoints are all vertices in X, and E_{F,X} denote the set of edges in G that connect vertices in F with vertices in X. Then, |E_X| < 2|E_{F,X}|.

Proof. Let X_b be the set of branchpoints in X. We replace every linkpoint in X by an edge between its neighbors, and denote the resulting set of edges between vertices in X_b by E^b_{X_b} and between vertices in X_b and F by E^b_{F,X_b}. The proof of Lemma 1 shows that

    |E^b_{X_b}| < |E^b_{F,X_b}|.

Since every linkpoint in X has both neighbors in the set X_b ∪ F, the following holds:

    |E_X| ≤ 2|E^b_{X_b}|    and    |E_{F,X}| = |E^b_{F,X_b}|.

Hence, |E_X| < 2|E_{F,X}|. □

An immediate consequence of Lemma 4 is that the probability of randomly choosing an edge that has at least one endpoint that belongs to a FVS is greater or equal 1/3. Thus, selecting a vertex at random from a randomly selected edge has a probability of at least 1/6 to belong to a FVS. Consequently, if the algorithm terminates after c·6^k iterations, with a WFVS of size k, there is a probability of at least 1 − (1 − 1/6^k)^{c·6^k} that the output is a minimum WFVS of size at most k. This proves part (a) of Theorem 3.

Note that since k is not known in advance, we use RepeatedWGuessI(G, c, j) with increasing values of j until a FVS is found, say when j = J. When such a set is found it is still possible that there exists a WFVS with more than J vertices that has a smaller weight than the one found. This happens when k > J. However, among the WFVSs of size at most J, the algorithm finds one with minimum weight with high probability.

The second part requires the following lemma.
Lemma 5 Let G be a branchy graph and F be a FVS of G. Then,

    Σ_{v∈V} d(v) ≤ 6 Σ_{v∈F} d(v).

Proof. Denote by d_Y(v) the number of edges between a vertex v and a set of vertices Y. Then,

    Σ_{v∈V} d(v) = Σ_{v∈X} d(v) + Σ_{v∈F} d(v) = Σ_{v∈X} d_X(v) + Σ_{v∈X} d_F(v) + Σ_{v∈F} d(v).

Due to Lemma 4,

    Σ_{v∈X} d_X(v) = 2|E_X| < 4|E_{F,X}| = 4 Σ_{v∈X} d_F(v).            (3)

Consequently, since Σ_{v∈X} d_F(v) = |E_{F,X}| ≤ Σ_{v∈F} d(v),

    Σ_{v∈V} d(v) < 4 Σ_{v∈X} d_F(v) + Σ_{v∈X} d_F(v) + Σ_{v∈F} d(v) ≤ 6 Σ_{v∈F} d(v)

as claimed. □

We can now prove part (b) of Theorem 3 by analyzing SingleWGuessII(G, k). Recall that V_i is the set of vertices in graph G_i in iteration i, d_i(v) is the degree of vertex v in G_i, and v_i is the vertex chosen in iteration i. Furthermore, recall that p_i(v) is the probability to choose vertex v in iteration i. The expected weight E_i(w(v)) = Σ_{v∈V_i} w(v)·p_i(v) of a chosen vertex in iteration i is denoted by a_i. Thus, due to the linearity of the expectation operator, E(w(F)) = Σ_{i=1}^{k} a_i, assuming |F| = k. We define a normalization constant for iteration i as follows:

    α_i = [ Σ_{u∈V_i} d_i(u)/w(u) ]^{-1}

Then,

    p_i(v) = α_i · d_i(v)/w(v)

and

    a_i = Σ_{v∈V_i} w(v) · (d_i(v)/w(v)) · α_i = α_i · Σ_{v∈V_i} d_i(v).

Let F* be a minimum FVS of G and F*_i be a minimum weight FVS of the graph G_i. The expected weight E_i(w(v) | v ∈ F*_i) of a vertex chosen from F*_i in iteration i is denoted by b_i. We have,

    b_i = Σ_{v∈F*_i} w(v)·p_i(v) = α_i · Σ_{v∈F*_i} d_i(v).

By Lemma 5, a_i/b_i ≤ 6 for every i.
slide-11
SLIDE 11 Randomized Algorithms f
  • r
the Loop Cutset Pr
  • blem
Recall that by definition $F^*_2$ is the minimum FVS in the branchy graph $G_2$ obtained from $G_1 \setminus \{v_1\}$. We get

$$E(w(F^*)) \;\ge\; E_1(w(v)\mid v\in F^*_1) + E(w(F^*_2)),$$

because the right-hand side is the expected weight of the output $F$ assuming the algorithm finds a minimum FVS on $G_2$ and just needs to select one additional vertex, while the left-hand side is the unrestricted expectation: if $v_1 \in F^*_1$ then $F^*_1 \setminus \{v_1\}$ is an FVS of $G_2$, and otherwise $F^*_1$ itself is. By repeating this argument we get

$$E(w(F^*)) \;\ge\; b_1 + E(w(F^*_2)) \;\ge\; \sum_{i=1}^{k} b_i.$$

Using $\sum_i a_i / \sum_i b_i \le \max_i a_i/b_i \le 6$, we obtain $E(w(F)) \le 6\,E(w(F^*))$. Hence, $E(w(F)) \le 6\,w(F^*)$, as claimed. $\Box$

The proof that SingleGuess(G, k) outputs an FVS whose expected size is no more than $4k$ (Theorem 2), where $k$ is the size of a minimum FVS, is analogous to the proof of Theorem 3 in the following sense. We assign a weight of 1 to all vertices and replace the reference to Lemma 5 with a reference to the following claim: if $F$ is an FVS of a rich graph $G$, then $\sum_{v\in V} d(v) \le 4 \sum_{v\in F} d(v)$. The proof of this claim is identical to the proof of Lemma 5 except that instead of using Lemma 4 we use Lemma 1.

3.3 The Practical Algorithm

In previous sections we presented several algorithms for finding a minimum FVS with high probability. The description of these algorithms was geared towards analysis, rather than as a prescription for a programmer. In particular, the number of iterations used within RepeatedWGuessI(G, c, k) is not changed when the algorithm is run with $j < k$. This feature allowed us to regard each call to SingleWGuessI(G, j) made by RepeatedWGuessI as an independent process. Furthermore, there is a small probability of a very long run even when the size of the minimum FVS is small. We now slightly modify RepeatedWGuessI to obtain an algorithm, termed WRA, which does not suffer from these deficiencies.

The new algorithm works as follows. Repeat SingleWGuessI(G, |V|) for $\min(\mathrm{Max},\, c\,6^{w(F)})$ iterations, where $w(F)$ is the weight of the lightest WFVS found so far and Max is a specified constant determining the maximum number of iterations of SingleWGuessI.

ALGORITHM WRA(G, c, Max)
Input: An undirected weighted graph G(V, E) and constants Max and c > 1.
Output: A feedback vertex set F.

  F ← SingleWGuessI(G, |V|)
  M ← min(Max, c·6^{w(F)}); i ← 1
  While i ≤ M do
    1. F′ ← SingleWGuessI(G, |V|)
    2. If w(F′) < w(F) then
       F ← F′; M ← min(Max, c·6^{w(F)})
    3. i ← i + 1
  End {While}
  Return F

  |V|   |E|   values   size    MGA   WRA   Eq.
  15     25   2-6      3-6      12    81    7
  15     25   2-8      3-6       7    89    4
  15     25   2-10     3-6       6    90    4
  25     55   2-6      7-12      3    95    2
  25     55   2-8      7-12      3    97    0
  25     55   2-10     7-12      0   100    0
  55    125   2-10    17-22      0   100    0
  Total:                        31   652   17

Figure 2: Number of graphs in which MGA or WRA yield a smaller loop cutset. The last column records the number of graphs for which the two algorithms produced loop cutsets of the same weight. Each line in the table is based on 100 graphs.

Theorem 6 If Max $\ge c\,6^k$, where $k$ is the minimal size of a minimum WFVS of an undirected weighted graph $G$, then WRA(G, c, Max) outputs a minimum WFVS of $G$ with probability at least $1 - (1 - 6^{-k})^{c\,6^k}$.

The proof is an immediate corollary of Theorem 3. The choice of Max and c depends on the application. A decision-theoretic approach for selecting such values for any-time algorithms is discussed by Breese and Horvitz (1990).

4. Experimental Results

The experiments compared the outputs of WRA vis-a-vis a greedy algorithm GA and a modified greedy algorithm MGA (Becker & Geiger, 1996), based on randomly generated graphs and on some real graphs contributed by the Hugin group (www.hugin.com). The random graphs are divided into three sets: graphs with 15 vertices and 25 edges, where the number of values associated with each vertex is randomly chosen between 2 and 6, between 2 and 8, or between 2 and 10; graphs with 25 vertices and 55 edges, with the number of values chosen from the same three ranges; and graphs with 55 vertices and 125 edges, where the number of values associated with each vertex is randomly chosen between 2 and 10. Each instance of the three classes is based on 100 random graphs generated as described by Suermondt and Cooper (1990). The total number of random graphs we used is 700.

The results are summarized in the table of Figure 2. WRA is run with Max = 300 and c = 1. The two algorithms, MGA and WRA, output loop cutsets of the same size in only 17 graphs, and when the algorithms disagree, WRA performed better than MGA in 95% of these graphs. The actual run time of WRA(G, 1, 300) is about 300 times slower than that of GA (or MGA) on G. On the largest random graph we used, it took 4.5 minutes. Most of the time is spent in the last improvement of WRA. Considerable run time can be saved by letting Max = 5. For all 700 graphs, WRA(G, 1, 5) has already obtained a better loop cutset than MGA. The largest improvement, with Max = 300, was from a weight of 58.0 (log base 2 scale) to a weight of 35.9. The improvements in this case were obtained in iterations 1, 2, 36, 83, and 189, with respective weights of 46.7, 38.8, 37.5, 37.3, and 35.9, and respective sizes of 22, 18, 17, 18, and 17 nodes. On average, after 300 iterations, the improvement for the larger 100 graphs was from a weight of 52 to 39 and from size 22 to 20; the improvement for the smaller 600 graphs was from a weight of 15 to 12.2 and from size 9 to 6.7.

  Name     |V|   |E|   |F*|     GA     MGA     WRA
  Water     32   123    16    40.7    42.7    29.5
  Mildew    35    80    14    48.1    40.5    39.3
  Barley    48   126    20    72.1    76.3    57.3
  Munin1   189   366    59   159.4   167.5   122.6

Figure 3: Log size (base 2) of the loop cutsets found by GA, MGA, and WRA.

The second experiment compared GA, MGA, and WRA
on four real Bayesian networks, showing that WRA outperformed both GA and MGA after a single call to SingleWGuessI. The weight of the output continued to decrease logarithmically with the number of iterations. We report the results with Max = 1000 and c = 1. Run time was between 3 minutes for Water and 15 minutes for Munin1 on a Pentium 133 with 32M RAM.

5. Discussion

Our randomized algorithm, WRA, has been incorporated into the popular genetic software FASTLINK 4.1 by Alejandro Schäffer, who develops and maintains this software at the National Institutes of Health. WRA replaced previous approximation algorithms for finding an FVS because, with a small Max value, it already matched or improved on FASTLINK 4.0 on most datasets examined. The datasets used for comparison are described by Becker et al. (1998). The main characteristic of these datasets is that they were all collected by geneticists, they have a small number of loops, and they have a large number of values at each node (tens to hundreds, depending on the genetic analysis). For such networks the method of conditioning is widely used by geneticists.

The leading inference algorithm for Bayesian networks, however, is the clique-tree algorithm (Lauritzen & Spiegelhalter, 1988), which has been further developed in several papers (Jensen, Lauritzen, & Olesen, 1990a; Jensen, Olesen, & Andersen, 1990b). For the networks presented in Figure 3, conditioning is not a feasible method, while the clique-tree algorithm can be, and is being, used to compute posterior probabilities in these networks. Furthermore, it has been shown that the weight of the largest clique is bounded by the weight of the loop cutset union the largest parent set of a vertex in a Bayesian network, implying that the clique-tree algorithm is always superior in time performance to the conditioning algorithm (Shachter, Andersen, & Szolovits, 1994). The two methods, however, can be combined to strike a balance between time and space requirements, as done within the bucket elimination framework (Dechter, 1999).

The algorithmic ideas behind the randomized algorithms presented herein can also be applied to constructing good clique trees, and initial experiments confirm that an improvement over deterministic algorithms is often obtained.
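Concretely, this kind of randomized greedy step replaces a deterministic best-candidate choice with sampling in proportion to a score. A minimal sketch (with hypothetical candidate names and scores, not the authors' implementation):

```python
import random

def randomized_greedy_pick(candidates, score):
    """Sample one candidate with probability proportional to score(c),
    rather than deterministically taking the best-scoring one."""
    weights = [score(c) for c in candidates]
    return random.choices(candidates, weights=weights)[0]

# Hypothetical example: candidate cliques with made-up relative scores;
# a smaller clique would be given a larger score.
cliques = ["C1", "C2", "C3"]
scores = {"C1": 0.5, "C2": 0.3, "C3": 0.2}
chosen = randomized_greedy_pick(cliques, scores.get)
assert chosen in cliques
```

Because every candidate retains a chance proportional to its score, repeated runs can escape the single fixed output that a deterministic greedy choice would always produce.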
The idea is that instead of greedily selecting the smallest clique when constructing a clique tree, one would randomly select the next clique according to the relative weights of the candidate cliques. It remains to develop the theory behind random choices of clique trees before a solid assessment can be presented. Currently, there is no algorithm for finding a clique tree such that its size is guaranteed to be close to optimal with high probability.

Horvitz et al. (1989) show that the method of conditioning can be useful for approximate inference. In particular, they show how to rank the instances of a loop cutset according to their prior probabilities, assuming all variables in the cutset are marginally independent. The conditioning algorithm can then be run according to this ranking, and the answer to a query can be given as an interval that shrinks towards the exact solution as more instances of the loop cutset are considered (Horvitz, Suermondt, & Cooper, 1989; Horvitz, 1990). Applying this idea without making independence assumptions is described by Darwiche (1994). So if the maximal clique is too large to store, one can still perform approximate inference using the conditioning algorithm.

Acknowledgment

We thank Seffi Naor for fruitful discussions. Part of this work was done while the third author was on sabbatical at Microsoft Research. A variant of this work was presented at the Fifteenth Conference on Uncertainty in Artificial Intelligence, July 1999, Sweden.

References

Bafna, V., Berman, P., & Fujito, T. (1995). Constant ratio approximations
of the weighted feedback vertex set problem for undirected graphs. In Proceedings of the Sixth Annual Symposium on Algorithms and Computation (ISAAC '95), pp. 142-151.

Bar-Yehuda, R., Geiger, D., Naor, J., & Roth, R. (1994). Approximation algorithms for the feedback vertex set problems with applications to constraint satisfaction and Bayesian inference. In Proceedings of the 5th Annual ACM-SIAM Symposium on Discrete Algorithms, pp. 344-354.

Becker, A., & Geiger, D. (1994). Approximation algorithms for the loop cutset problem. In Proceedings of the 10th Conference on Uncertainty in Artificial Intelligence, pp. 60-68.

Becker, A., & Geiger, D. (1996). Optimization of Pearl's method of conditioning and greedy-like approximation algorithms for the feedback vertex set problem. Artificial Intelligence, 83, 167-188.

Becker, A., Geiger, D., & Schäffer, A. (1998). Automatic selection of loop breakers for genetic linkage analysis. Human Heredity, 48, 47-60.

Bodlaender, H. (1990). On disjoint cycles. International Journal of Foundations of Computer Science (IJFCS), 5, 59-68.

Breese, J., & Horvitz, E. (1990). Ideal reformulation of belief networks. In Proceedings of the 6th Conference on Uncertainty in Artificial Intelligence, pp. 64-72.

Darwiche, A. (1994). Bounded conditioning: A method for the approximate updating of causal networks. Research note, Rockwell Science Center.

Dechter, R. (1990). Enhancement schemes for constraint processing: Backjumping, learning, and cutset decomposition. Artificial Intelligence, 41, 273-312.

Dechter, R. (1999). Bucket elimination: A unifying framework for structure-driven inference. Artificial Intelligence, to appear.

Downey, R., & Fellows, M. (1995a). Fixed-parameter tractability and completeness I: Basic results. SIAM Journal on Computing, 24(4), 873-921.

Downey, R., & Fellows, M. (1995b). Parameterized computational feasibility. In Clote, P., & Remmel, J. (Eds.), Feasible Mathematics II, pp. 219-244. Birkhäuser, Boston.

Fujito, T. (1996). A note on approximation of the vertex cover and feedback vertex set problems: Unified approach. Information Processing Letters, 59, 59-63.

Garey, M., & Johnson, D. (1979). Computers and Intractability: A Guide to the Theory of NP-Completeness. W. H. Freeman, San Francisco, California.

Geiger, D., & Pearl, J. (1990). On the logic of causal models. In Uncertainty in Artificial Intelligence 4, pp. 3-14. North-Holland, New York.

Geiger, D., Verma, T., & Pearl, J. (1990). Identifying independence in Bayesian networks. Networks, 20, 507-534.

Horvitz, E. J. (1990). Computation and action under bounded resources. Ph.D. dissertation, Stanford University.

Horvitz, E. J., Suermondt, H. J., & Cooper, G. F. (1989). Bounded conditioning: Flexible inference for decisions under scarce resources. In Proceedings of the 5th Conference on Uncertainty in Artificial Intelligence, pp. 182-193. Morgan Kaufmann.

Jensen, F., Lauritzen, S. L., & Olesen, K. (1990a). Bayesian updating in causal probabilistic networks by local computations. Computational Statistics Quarterly, 4, 269-282.

Jensen, F., Olesen, K., & Andersen, S. (1990b). An algebra of Bayesian belief universes for knowledge-based systems. Networks, 20, 637-659.

Kim, H., & Pearl, J. (1983). A computational model for combined causal and diagnostic reasoning in inference systems. In Proceedings of the Eighth International Joint Conference on Artificial Intelligence (IJCAI-83), pp. 190-193.

Lange, K. (1997). Mathematical and Statistical Methods for Genetic Analysis. Springer.

Lange, K., & Elston, R. (1975). Extensions to pedigree analysis. I. Likelihood calculation for simple and complex pedigrees. Human Heredity, 25, 95-105.

Lauritzen, S., & Spiegelhalter, D. (1988). Local computations with probabilities on graphical structures and their application to expert systems (with discussion). Journal of the Royal Statistical Society, Series B, 50, 157-224.

Ott, J. (1991). Analysis of Human Genetic Linkage (revised edition). The Johns Hopkins University Press.

Pearl, J. (1986). Fusion, propagation and structuring in belief networks. Artificial Intelligence, 29, 241-288.

Pearl, J. (1988). Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann, San Mateo, California.

Peot, M., & Shachter, R. (1991). Fusion and propagation with multiple observations in belief networks. Artificial Intelligence, 48, 299-318.

Shachter, R., Andersen, S., & Szolovits, P. (1994). Global conditioning for probabilistic inference in belief networks. In Proceedings of the Tenth Conference on Uncertainty in Artificial Intelligence, pp. 514-522. Morgan Kaufmann.

Suermondt, H., & Cooper, G. (1990). Probabilistic inference in multiply connected belief networks using loop cutsets. International Journal of Approximate Reasoning, 4, 283-306.

Verma, T., & Pearl, J. (1988). Causal networks: Semantics and expressiveness. In Proceedings of the 4th Workshop on Uncertainty in Artificial Intelligence, pp. 352-359.

Voss, H. (1968). Some properties of graphs containing k independent circuits. In Proceedings of Colloquium Tihany, pp. 321-334. Academic Press, New York.