

1. CptS 570 – Machine Learning
School of EECS, Washington State University

2.  Relational data
 Logic-based representation
 Graph-based representation
 Propositionalization
 Inductive Logic Programming (ILP)
 Graph-based relational learning
 Applications

3.  So far, training data have been propositional
◦ Each instance represents one entity and its features

Person
  ID   First Name   Last Name   Age   Income
  P1   John         Doe         30    120,000
  P2   Jane         Doe         29    140,000
  P3   Robert       Smith       45    280,000
  …    …            …           …     …

 Learned hypotheses have also been propositional
◦ If Income > 250,000 Then Rich

4.  Entities may be related to each other

Married
  Person1   Person2
  P1        P2
  P3        P7
  …         …

 Learned hypotheses should allow relations
◦ If Income(Person1,Income1) and Income(Person2,Income2) and Married(Person1,Person2) and (Income1+Income2)>250,000 Then RichCouple(Person1,Person2)
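
To make the relational check concrete, here is a minimal Python sketch (not from the slides); the income and married structures and the rich_couple helper are illustrative names.

```python
# Minimal sketch (not from the slides): checking the relational
# RichCouple rule against small in-memory relations.
income = {"P1": 120_000, "P2": 140_000, "P3": 280_000}   # Income(Person, Amount)
married = {("P1", "P2"), ("P2", "P1"), ("P3", "P7")}      # Married(Person1, Person2)

def rich_couple(p1, p2):
    """If Income(P1,I1) and Income(P2,I2) and Married(P1,P2)
    and I1 + I2 > 250,000 Then RichCouple(P1,P2)."""
    return ((p1, p2) in married
            and p1 in income and p2 in income
            and income[p1] + income[p2] > 250_000)

print(rich_couple("P1", "P2"))  # True: 120,000 + 140,000 > 250,000
```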

5.  Logic-based representation
 Data
◦ Person(ID,FirstName,LastName,Income)
 Person(P1,John,Doe,120,000)
◦ Married(Person1,Person2)
 Married(P1,P2), Married(P2,P1)
 Hypotheses
◦ If Person(ID1,FirstName1,LastName1,Income1) and Person(ID2,FirstName2,LastName2,Income2) and Married(ID1,ID2) and (Income1+Income2)>250,000 Then RichCouple(ID1,ID2)

6.  Graph-based representation
 Data
[Slide figure: the Person and Married relations drawn as a graph. Person nodes P1 (First John, Last Doe, Age 30, Income 120000) and P2 (First Jane, Last Doe, Age 29, Income 140000) are joined by a Married edge; a further Person node (First Robert, Last Smith, Age 45, Income 280000) has its own Married edge.]

7.  Graph-based representation
 Hypotheses
[Slide figure: the RichCouple rule drawn as a graph pattern. Two Person nodes joined by a Married edge have Income values X and Y; X and Y are operands of a + node with result Z; Z and 250000 are operands of a > node with result true.]

8.  Logical rule
◦ Instance consists of relations
 E.g., Person(), Married(), …
◦ Check if rule is matched by new instance
◦ Unification (NP-complete)
 Graphical rule
◦ Instance consists of a graph
◦ Check if rule matches a subgraph of instance
◦ Subgraph isomorphism (NP-complete)
 Many polynomial-time specializations exist (e.g., Horn clauses, trees)
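
The matching step for logical rules can be sketched as a backtracking search for a substitution that satisfies every literal of the rule body; the match helper below is an illustrative name, and its worst case is exponential, in line with the NP-completeness noted above.

```python
# Minimal sketch (assumed encoding, not from the slides): naive
# backtracking search for a substitution that matches every literal of
# a rule body against a set of ground facts.
def match(body, facts, theta=None):
    """body: list of tuples like ("married", "X", "Y") where uppercase
    strings are variables; facts: set of ground tuples. Returns a
    substitution (dict) making every body literal a fact, or None."""
    theta = dict(theta or {})
    if not body:
        return theta
    pred, *args = body[0]
    for fact in facts:
        if fact[0] != pred or len(fact) != len(body[0]):
            continue
        trial, ok = dict(theta), True
        for a, v in zip(args, fact[1:]):
            if a[0].isupper():                   # variable argument
                if trial.setdefault(a, v) != v:  # must bind consistently
                    ok = False
                    break
            elif a != v:                         # constant must match exactly
                ok = False
                break
        if ok:
            result = match(body[1:], facts, trial)
            if result is not None:
                return result
    return None                                  # backtrack

facts = {("married", "p1", "p2"), ("person", "p1"), ("person", "p2")}
print(match([("person", "X"), ("married", "X", "Y")], facts))
# {'X': 'p1', 'Y': 'p2'}
```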

9.  Create new single table combining all relations

  FirstName1  LastName1  Age1  Income1  FirstName2  LastName2  Age2  Income2  Married
  John        Doe        30    120,000  Jane        Doe        29    140,000  Yes
  Jane        Doe        29    140,000  Robert      Smith      45    280,000  No
  Robert      Smith      45    280,000  Jane        Doe        29    140,000  No
  …           …          …     …        …           …          …     …        …

 Apply propositional learner
 Number of fields in new table can grow exponentially
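
A minimal Python sketch (not from the slides) of this propositionalization: pair up Person tuples and attach a Married field to each pair.

```python
# Minimal sketch (not from the slides): flattening the Person and
# Married relations into one propositional table over person pairs.
persons = [
    ("P1", "John", "Doe", 30, 120_000),
    ("P2", "Jane", "Doe", 29, 140_000),
    ("P3", "Robert", "Smith", 45, 280_000),
]
married = {("P1", "P2"), ("P2", "P1")}

rows = []
for id1, fn1, ln1, age1, inc1 in persons:
    for id2, fn2, ln2, age2, inc2 in persons:
        if id1 == id2:
            continue
        rows.append((fn1, ln1, age1, inc1, fn2, ln2, age2, inc2,
                     "Yes" if (id1, id2) in married else "No"))

print(rows[0])
# ('John', 'Doe', 30, 120000, 'Jane', 'Doe', 29, 140000, 'Yes')
# Each row is one propositional instance; with more relations the number
# of columns (and of candidate tuples) grows combinatorially.
```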

10.  Terminology
◦ Relations are predicates (e.g., person, married)
◦ Predicate p(a1,a2,…,an) has arguments a1, a2, …, an
◦ Arguments can be constants (e.g., sally) or variables (e.g., X, Income1)
◦ A predicate is ground if it contains no variables

11.  Terminology
◦ A literal is a predicate or its negation
◦ A clause is a disjunction of literals
◦ A Horn clause has at most one positive literal
◦ A definite clause has exactly one positive literal: (a ⋀ b → c) ≡ (~a ⋁ ~b ⋁ c)
 Adopt Prolog syntax
◦ Predicates and constants are lowercase
◦ Variables are uppercase
◦ E.g., married(X,Y), person(p1,john,doe,30,120000)
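
A quick brute-force check of the stated equivalence, assuming nothing beyond the truth tables:

```python
# Minimal sketch (not from the slides): verify that the definite clause
# (a and b -> c) is equivalent to (~a or ~b or c) on all assignments.
from itertools import product

assert all(((not (a and b)) or c) == ((not a) or (not b) or c)
           for a, b, c in product([False, True], repeat=3))
```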

12.  Given
◦ Training examples (x_t, r_t) ∈ X
 x_t is a set of facts (ground predicates)
 r_t is a ground predicate
◦ Background knowledge B
 B is a set of predicates and rules (definite clauses)
 Find hypothesis h such that
◦ (∀(x_t, r_t) ∈ X) B ⋀ h ⋀ x_t ⊢ r_t
◦ where ⊢ means entails (can deduce)

13.  Example
◦ Learn concept of child(X,Y): Y is the child of X
◦ r_t: child(bob,sharon)
◦ x_t: male(bob), female(sharon), father(sharon,bob)
◦ B: parent(U,V) ← father(U,V)
◦ h1: child(X,Y) ← father(Y,X)
◦ h2: child(X,Y) ← parent(Y,X)
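
A minimal sketch (not from the slides) of why both hypotheses cover this example: apply B for one step of forward chaining, then test each rule body.

```python
# Minimal sketch (not from the slides): both h1 and h2 entail
# child(bob, sharon) from the facts x_t together with B.
facts = {("male", "bob"), ("female", "sharon"), ("father", "sharon", "bob")}

# Apply B: parent(U,V) <- father(U,V), one step of forward chaining.
facts |= {("parent", f[1], f[2]) for f in facts if f[0] == "father"}

# h1: child(X,Y) <- father(Y,X);  h2: child(X,Y) <- parent(Y,X)
x, y = "bob", "sharon"
print(("father", y, x) in facts)   # True: h1 covers the example
print(("parent", y, x) in facts)   # True: h2 covers it via B
```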

14.  First-Order Inductive Learner (FOIL)
 Learns Horn clauses
 Set covering algorithm
◦ Seeks rules covering subsets of positive examples
 Each new rule generalizes the learned hypothesis
 Each conjunct added to a rule specializes the rule

15. [Slide figure not transcribed.]
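
A minimal Python sketch of FOIL's sequential-covering outer loop, assuming a learn_one_rule helper (an illustrative name, not FOIL's actual code) that grows a single clause greedily as described on the next slides:

```python
# Minimal sketch (assumed helper names, not from the slides) of FOIL's
# outer set-covering loop: learn one rule at a time, remove the
# positives it covers, and repeat until all positives are covered.
def foil(positives, negatives, learn_one_rule):
    """learn_one_rule(pos, neg) is assumed to grow one Horn-clause body
    greedily (see the candidate literals and FoilGain below) and return
    (rule, covered_positives)."""
    rules, remaining = [], set(positives)
    while remaining:
        rule, covered = learn_one_rule(remaining, negatives)
        if not covered:            # no progress; avoid an infinite loop
            break
        rules.append(rule)         # adding a rule generalizes the hypothesis
        remaining -= covered       # uncovered positives drive the next rule
    return rules
```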

16.  Learning rule p(X1,X2,…,Xk) ← L1 ⋀ … ⋀ Ln
 Candidate specializations add new literal of form:
◦ Q(V1,…,Vr), where at least one of the Vi must already exist as a variable in the rule
◦ Equal(Xj,Xk), where Xj and Xk are variables present in the rule
◦ The negation of either of the above forms of literals
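
A sketch of enumerating the Q(V1,…,Vr) candidates and their negations (Equal literals omitted); candidate_literals and the V-numbering scheme are assumptions for illustration, not FOIL's actual implementation:

```python
# Minimal sketch (not from the slides): candidate literals for one
# predicate. Each candidate must reuse at least one variable already in
# the rule; the remaining argument slots may introduce new variables.
from itertools import product

def candidate_literals(pred, arity, rule_vars):
    new_vars = [f"V{i}" for i in range(arity)]      # fresh names (assumed scheme)
    for args in product(sorted(rule_vars) + new_vars, repeat=arity):
        if any(a in rule_vars for a in args):        # must reuse an old variable
            yield (pred,) + args                     # positive literal
            yield ("not", pred) + args               # and its negation

for lit in candidate_literals("linkedto", 2, {"A", "B"}):
    print(lit)   # e.g. ('linkedto', 'A', 'V0'), ('not', 'linkedto', 'A', 'B'), ...
```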

17.  FoilGain(L,R) = t × ( log2( p1 / (p1 + n1) ) − log2( p0 / (p0 + n0) ) )
 L is the candidate literal to add to rule R
 p0 = number of positive bindings of R
 n0 = number of negative bindings of R
 p1 = number of positive bindings of R+L
 n1 = number of negative bindings of R+L
 t = number of positive bindings of R still covered by R+L
 Note: −log2(p0/(p0+n0)) = number of bits needed to indicate the class of a positive binding of R
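
The formula translates directly into code; foil_gain below is an illustrative sketch, with binding counts taken from the first step of the slide-20 trace (36 of 81 bindings are positive; adding linkedto(A,B) keeps 12 positives and 0 negatives).

```python
# Minimal sketch (not from the slides): FoilGain computed from the
# binding counts defined above.
from math import log2

def foil_gain(p0, n0, p1, n1, t):
    """Information gained about positive bindings when literal L is
    added to rule R; t positive bindings of R remain covered by R+L."""
    if p1 == 0:          # candidate covers no positives: no gain (assumed guard)
        return 0.0
    return t * (log2(p1 / (p1 + n1)) - log2(p0 / (p0 + n0)))

print(foil_gain(36, 45, 12, 0, 12))   # ~14.04 bits of gain for linkedto(A,B)
```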

18.  Target concept
◦ canReach(X,Y): true if there is a directed path from X to Y
 Examples
◦ Pairs of nodes for which a path exists (e.g., <1,5>)
◦ Graph described by literals
 E.g., linkedTo(0,1), linkedTo(0,8)
 Hypothesis space
◦ Horn clauses using predicates linkedTo and canReach

19. X: 0, 1, 2, 3, 4, 5, 6, 7, 8

*linkedto(X,Y):
  (0,1), (0,2), (1,2), (2,3), (3,4), (3,8), (4,5), (4,8), (5,6), (6,7), (6,8), (7,8)

canreach(X,Y):
  (0,1), (0,2), (0,3), (0,4), (0,5), (0,6), (0,7), (0,8),
  (1,2), (1,3), (1,4), (1,5), (1,6), (1,7), (1,8),
  (2,3), (2,4), (2,5), (2,6), (2,7), (2,8),
  (3,4), (3,5), (3,6), (3,7), (3,8),
  (4,5), (4,6), (4,7), (4,8),
  (5,6), (5,7), (5,8),
  (6,7), (6,8),
  (7,8)

20. FOIL 6.4 [January 1996]
--------
Relation canreach
Relation *linkedto
----------
canreach:
State (36/81, 91.4 bits available)
Save clause ending with linkedto(A,B) (cover 12, accuracy 100%)
Save linkedto(C,B) (36,72 value 6.0)
Best literal linkedto(A,B) (4.6 bits)
Clause 0: canreach(A,B) :- linkedto(A,B).
…

21. State (24/69, 81.4 bits available)
Save clause ending with not(linkedto(C,A)) (cover 6, accuracy 85%)
Save linkedto(C,B) (24,60 value 4.8)
Save not(linkedto(B,A)) (24,57 value 6.5)
Best literal not(linkedto(C,A)) (4.6 bits)
State (6/7, 33.5 bits available)
Save clause ending with A<>B (cover 6, accuracy 100%)
Best literal A<>B (2.0 bits)
Clause 1: canreach(A,B) :- not(linkedto(C,A)), A<>B.
State (18/63, 71.5 bits available)
Save not(linkedto(B,A)) (18,51 value 5.4)
Best literal linkedto(C,B) (4.6 bits)
…

22. State (27/73 [18/54], 66.9 bits available)
Save clause ending with canreach(A,C) (cover 18, accuracy 100%)
Best literal canreach(A,C) (4.2 bits)
Clause 2: canreach(A,B) :- linkedto(C,B), canreach(A,C).
Delete clause canreach(A,B) :- not(linkedto(C,A)), A<>B.
canreach(A,B) :- linkedto(A,B).
canreach(A,B) :- linkedto(C,B), canreach(A,C).
Time 0.0 secs
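
The two surviving clauses can be checked against the slide-19 data with a short fixpoint computation (a sketch, not from the slides):

```python
# Minimal sketch (not from the slides): the two learned clauses compute
# exactly the 36 positive canreach pairs on the slide-19 graph.
linkedto = {(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (3, 8),
            (4, 5), (4, 8), (5, 6), (6, 7), (6, 8), (7, 8)}

# Clause 0: canreach(A,B) :- linkedto(A,B).
canreach = set(linkedto)
# Clause 2: canreach(A,B) :- linkedto(C,B), canreach(A,C). Iterate to fixpoint.
changed = True
while changed:
    new = {(a, b) for c, b in linkedto for a, c2 in canreach if c2 == c}
    changed = not new <= canreach
    canreach |= new

print(len(canreach))                      # 36, matching slide 19
print(all(a < b for a, b in canreach))    # True: every pair goes "uphill"
```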

23.  Resolution rule
 Given initial clauses C1 and C2, find a literal L from clause C1 such that ¬L occurs in clause C2
 Form the resolvent C by including all literals from C1 and C2, except for L and ¬L
◦ C = (C1 − {L}) ∪ (C2 − {¬L})
◦ where ∪ denotes set union, and “−” denotes set difference
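
A minimal propositional-resolution sketch, with clauses as frozensets of string literals and "~" marking negation (an illustrative encoding, not from the slides):

```python
# Minimal sketch (not from the slides): propositional resolution.
def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """Return every resolvent C = (C1 - {L}) | (C2 - {~L})."""
    return [(c1 - {lit}) | (c2 - {negate(lit)})
            for lit in c1 if negate(lit) in c2]

# (~a or ~b or c) resolved with (a) on literal ~a gives (~b or c).
print(resolve(frozenset({"~a", "~b", "c"}), frozenset({"a"})))
# [frozenset({'~b', 'c'})]  (member order may vary)
```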

24. [Slide figure not transcribed.]

25.  Inverting resolution: propositional case
 Given initial clauses C1 and C, find a literal L that occurs in clause C1 but not in clause C
 Form the second clause C2 by including the following literals
◦ C2 = (C − (C1 − {L})) ∪ {¬L}
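
The inversion formula also translates directly; inverse_resolve is an illustrative sketch using the same clause encoding as the resolution example above:

```python
# Minimal sketch (not from the slides): inverting propositional
# resolution, recovering a parent clause C2 from resolvent C and C1.
def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def inverse_resolve(c, c1, lit):
    """C2 = (C - (C1 - {L})) | {~L} for a literal L in C1 but not in C."""
    assert lit in c1 and lit not in c
    return (c - (c1 - {lit})) | {negate(lit)}

# Recover {a} from resolvent {~b, c} and parent {~a, ~b, c} using L = ~a.
print(inverse_resolve(frozenset({"~b", "c"}),
                      frozenset({"~a", "~b", "c"}), "~a"))
# frozenset({'a'})
```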

26.  First-order resolution
◦ Find a literal L1 from clause C1, a literal L2 from clause C2, and a substitution θ such that L1θ = ¬L2θ
◦ Form the resolvent C by including all literals from C1θ and C2θ, except for L1θ and ¬L2θ
◦ C = (C1 − {L1})θ ∪ (C2 − {L2})θ
 Inverting first-order resolution
◦ C2 = (C − (C1 − {L1})θ1)θ2⁻¹ ∪ {¬L1θ1θ2⁻¹}
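
A sketch of one first-order resolution step with θ supplied explicitly (unification itself is omitted); the literal encoding (negated?, predicate, args) is an assumption for illustration:

```python
# Minimal sketch (not from the slides): one first-order resolution step
# with an explicitly supplied substitution theta.
def subst(lit, theta):
    neg, pred, args = lit
    return (neg, pred, tuple(theta.get(a, a) for a in args))

def resolve_fo(c1, l1, c2, l2, theta):
    """C = (C1 - {L1})theta | (C2 - {L2})theta, assuming L1.theta = ~L2.theta."""
    return ({subst(l, theta) for l in c1 - {l1}} |
            {subst(l, theta) for l in c2 - {l2}})

# B's rule parent(U,V) <- father(U,V) as a clause, resolved with the
# fact father(sharon,bob) under theta = {U: sharon, V: bob}:
c1 = {(False, "parent", ("U", "V")), (True, "father", ("U", "V"))}  # True = negated
c2 = {(False, "father", ("sharon", "bob"))}
print(resolve_fo(c1, (True, "father", ("U", "V")), c2,
                 (False, "father", ("sharon", "bob")),
                 {"U": "sharon", "V": "bob"}))
# {(False, 'parent', ('sharon', 'bob'))}  i.e. parent(sharon,bob)
```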

27. [Slide figure not transcribed.]

28.  Reduce combinatorial explosion by generating the most specific acceptable h
 User specifies H by stating predicates, functions, and forms of arguments allowed for each
 Progol uses a set covering algorithm
◦ For each <x_t, r_t>, find the most specific hypothesis h_t s.t. B ∧ h_t ∧ x_t ⊢ r_t
 Actually, considers only k-step entailment
◦ Conduct general-to-specific search bounded by the specific hypothesis h_t, choosing the hypothesis with minimum description length
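
A heavily simplified sketch (not Progol's actual procedure) of forming a most specific clause for one example: saturate the facts with one step of B, then turn constants into variables. The general-to-specific search would then drop body literals, bounded below by this clause.

```python
# Minimal sketch (not Progol's actual algorithm): a most specific
# clause h_t for one example. Head = the example r_t; body = the facts
# x_t plus what B derives from them; shared constants become variables.
def most_specific_clause(head, facts, apply_background):
    body = set(facts) | apply_background(facts)      # one-step saturation
    constants = {a for lit in {head} | body for a in lit[1:]}
    theta = {c: f"V{i}" for i, c in enumerate(sorted(constants))}
    gen = lambda lit: (lit[0],) + tuple(theta[a] for a in lit[1:])
    return gen(head), {gen(lit) for lit in body}

# Slide-13 example: B adds parent(sharon,bob) from father(sharon,bob).
x_t = {("male", "bob"), ("female", "sharon"), ("father", "sharon", "bob")}
B = lambda fs: {("parent", f[1], f[2]) for f in fs if f[0] == "father"}
head, body = most_specific_clause(("child", "bob", "sharon"), x_t, B)
print(head, "<-", body)
# ('child', 'V0', 'V1') <- {('father','V1','V0'), ('parent','V1','V0'),
#                           ('male','V0'), ('female','V1')}
# Dropping all but the parent literal recovers h2: child(X,Y) <- parent(Y,X).
```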
