Larry Holder, School of EECS, Washington State University. Artificial Intelligence (PowerPoint PPT Presentation)

SLIDE 1

Larry Holder School of EECS Washington State University

Artificial Intelligence 1

SLIDE 2

} Knowledge base
  • TELL agent about the environment
} Knowledge representation
  • First-order logic
  • Many others…
} Reasoning via inference
  • ASK agent how to achieve goal based on current knowledge

SLIDE 3


function KB-AGENT(percept) returns an action
  persistent: KB, a knowledge base
              t, a counter, initially 0, indicating time
  TELL(KB, MAKE-PERCEPT-SENTENCE(percept, t))
  action ← ASK(KB, MAKE-ACTION-QUERY(t))
  TELL(KB, MAKE-ACTION-SENTENCE(action, t))
  t ← t + 1
  return action
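The KB-AGENT pseudocode above can be sketched in Python. This is a minimal illustration, not the book's implementation: the knowledge base is just a set of sentence strings, and `ask` is a stub standing in for real inference.

```python
# Minimal sketch of the KB-AGENT loop. The sentence-building helpers and
# the stubbed ask() are assumptions for illustration only.

def make_percept_sentence(percept, t):
    return f"Percept({percept},{t})"

def make_action_query(t):
    return f"Action(?,{t})"

def make_action_sentence(action, t):
    return f"Action({action},{t})"

class KBAgent:
    def __init__(self):
        self.kb = set()   # knowledge base: a set of sentences
        self.t = 0        # time counter, initially 0

    def tell(self, sentence):
        self.kb.add(sentence)

    def ask(self, query):
        # Placeholder: a real agent would run inference over self.kb here.
        return "GoForward"

    def agent(self, percept):
        self.tell(make_percept_sentence(percept, self.t))
        action = self.ask(make_action_query(self.t))
        self.tell(make_action_sentence(action, self.t))
        self.t += 1
        return action
```

Each call records the percept, queries for an action, records the chosen action, and advances time, exactly mirroring the pseudocode.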

SLIDE 4

} Goals
  • Visit safe locations
  • Grab gold if present
  • If have gold or no more safe, unvisited locations, then move to [1,1] and Climb

SLIDE 5


[Figure: 4x4 wumpus world grid]

SLIDE 6


[Figure: 4x4 wumpus world grid]

SLIDE 7


[Figure: 4x4 wumpus world grid]

SLIDE 8

} Percept1 = [None,None,None,None,None]
  • [1,2] and [2,1] safe
} Action = GoForward
} Percept2 = [None,Breeze,None,None,None]
} Either [2,2] or [3,1] or both has a pit
} Execute TurnLeft, TurnLeft, GoForward, TurnRight, GoForward

SLIDE 9

} Percept7 = [Stench,None,None,None,None]
  • Wumpus in [1,3]
  • No pit in [2,2] (safe), so pit in [3,1]
} Could Shoot, but <TurnRight,GoForward> to [2,2]
} Percept9 = [None,None,None,None,None]
  • [3,2] and [2,3] are safe
} <TurnLeft,GoForward> to [2,3]
} Percept11 = [Stench,Breeze,Glitter,None,None]
} Grab gold, head home, and Climb (score: 1000 – 17 = 983)

SLIDE 10

} A knowledge base (KB) consists of “sentences”
} Syntax specifies a well-formed sentence
} Semantics specifies the meaning of a sentence
} Example
  • Syntax: Wumpus(2,2)
  • Semantics: Wumpus is in location (2,2)

SLIDE 11

} Logical inference is the process of inferring that one sentence is true from others
} Inference should be sound or truth-preserving
  • Everything inferred is true
} Inference should be complete
  • Everything that is true can be inferred

SLIDE 12

} Propositional logic assumes world consists of facts that are either true, false or unknown
  • E.g., Wumpus(1,3) ⇒ Stench(1,2)
} First-order logic assumes world consists of facts, objects and relations that are either true, false or unknown
  • E.g., Wumpus(x,y) ∧ Adjacent(x,y,w,z) ⇒ Stench(w,z)
} Temporal logic = FOL where facts hold at particular times
  • E.g., Before(Action(Shoot), Percept(Scream))
} Higher-order logic assumes world includes first-order relations as objects
  • E.g., Know( [ Wumpus(x,y) ∧ Adjacent(x,y,w,z) ⇒ Stench(w,z) ] )
} Probabilistic logic = propositional logic with a degree of belief for each fact
  • E.g., P(Wumpus(1,3)) = 0.067

SLIDE 13

} Or, First-Order Predicate Calculus (FOPC)
} Borrowing from elements of natural language
  • Objects: nouns, noun phrases (e.g., wumpus, pit)
  • Relations: verbs, verb phrases (e.g., shoot)
    – Properties: adjectives (e.g., smelly)
    – Functions: map input to single output (e.g., location(wumpus))

SLIDE 14


Sentence → AtomicSentence | ComplexSentence
AtomicSentence → Predicate | Predicate(Term,...) | Term = Term
ComplexSentence → (Sentence) | [Sentence] | ¬Sentence
                | Sentence ∧ Sentence | Sentence ∨ Sentence
                | Sentence ⇒ Sentence | Sentence ⇔ Sentence
                | Quantifier Variable,... Sentence
Term → Function(Term,...) | Constant | Variable
Quantifier → ∀ | ∃
Constant → A | B | Wumpus | 1 | 2 | ...
Variable → a | x | s | ...
Predicate → True | False | Adjacent | At | Alive | ...
Function → Location | RightOf | ...
Operator Precedence: ¬, =, ∧, ∨, ⇒, ⇔

SLIDE 15

} Not (¬) is a negation
} Literal is either an atomic sentence (positive literal) or a negated atomic sentence (negative literal)
  • ¬Breeze(1,1), Breeze(2,1)
} And (∧) is a conjunction; its parts are conjuncts
} Or (∨) is a disjunction; its parts are disjuncts
} Implies (⇒) is an implication
  • Pit(2,2) ⇒ Breeze(1,2)
  • Lefthand side is the antecedent or premise
  • Righthand side is the consequent or conclusion
} If and only if (⇔) is a biconditional

SLIDE 16

} How to determine the truth value (true or false) of every sentence
} True is always true
} False is always false
} Truth values of every other sentence must be specified directly or inferred
  • E.g., Wumpus(2,2) is true, Wumpus(3,3) is false

SLIDE 17

} Semantics for complex sentences

  • ¬P is true iff P is false
  • P ∧ Q is true iff both P and Q are true
  • P ∨ Q is true iff either P or Q is true
  • P ⇒ Q is true unless P is true and Q is false
  • P ⇔ Q is true iff P and Q are both true or both false


Truth Table

P     Q     | ¬P    | P ∧ Q | P ∨ Q | P ⇒ Q | P ⇔ Q
false false | true  | false | false | true  | true
false true  | true  | false | true  | true  | false
true  false | false | false | true  | false | false
true  true  | false | true  | true  | true  | true
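The semantics above can be checked mechanically. A short Python sketch that regenerates the truth table rows from the connective definitions (the lambda encodings of ⇒ and ⇔ are the only assumptions):

```python
from itertools import product

# Connectives defined exactly as on the slide:
# P ⇒ Q is true unless P is true and Q is false; P ⇔ Q iff same value.
implies = lambda p, q: (not p) or q
iff     = lambda p, q: p == q

rows = []
for p, q in product([False, True], repeat=2):
    # (P, Q, ¬P, P∧Q, P∨Q, P⇒Q, P⇔Q)
    rows.append((p, q, not p, p and q, p or q, implies(p, q), iff(p, q)))
```

Printing `rows` reproduces the four lines of the truth table in the same order.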

SLIDE 18

} Constant symbols stand for objects
} Predicate symbols stand for relations
} Function symbols stand for functions
} R&N convention: Above symbols begin with uppercase letters
  • E.g., Wumpus, Adjacent, RightOf
} Arity is the number of arguments to a predicate or function
  • E.g., Adjacent(loc1, loc2), RightOf(location)

SLIDE 19

} Terms represent objects with constants, variables or functions
} Note: Functions do not return an object, but represent that object
  • E.g., Action(GoForward,t) ∧ Orientation(Agent, Right, t) ∧ At(Agent, loc, t) ⇒ At(Agent, RightOf(loc), t+1)
} R&N convention: variables begin with lowercase letters

SLIDE 20

} Express properties of collections of objects
} Universal quantification (∀)
  • A statement is true for all objects represented by quantified variables
  • E.g., ∀x,y At(Wumpus,x,y) ⇒ Stench(x+1,y)
  • Same as ∀x,y At(Wumpus,x,y) ∧ Stench(x+1,y) ?
  • Same as ∀x,y ¬At(Wumpus,x,y) ∨ Stench(x+1,y) ?
} ∀x P(x) ≡ P(A) ∧ P(B) ∧ P(Wumpus) ∧ ...

SLIDE 21

} Existential quantification (∃)
  • There exists at least one set of objects, represented by quantified variables, for which a statement is true
  • E.g., ∃w,x,y At(w,x,y) ∧ Wumpus(w)
  • Same as ∃w,x,y At(w,x,y) ⇒ Wumpus(w) ?
} ∃x P(x) ≡ P(A) ∨ P(B) ∨ P(Wumpus) ∨ ...

SLIDE 22

} Nested quantifiers
} ∀x ∀y same as ∀y ∀x same as ∀x,y
} ∃x ∃y same as ∃y ∃x same as ∃x,y
} ∃x ∀y same as ∀y ∃x ?
  • ∃x ∀y Likes(x,y) ?
  • ∀y ∃x Likes(x,y) ?
  • ∀x ∃y Likes(x,y) ?
  • ∃y ∀x Likes(x,y) ?
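Over a finite domain, the quantifier-order question can be answered by brute force with `any`/`all`. The domain and the `likes` relation below are invented for illustration; they show a model where ∀y ∃x Likes(x,y) holds but ∃x ∀y Likes(x,y) does not, so the two orders are not equivalent.

```python
# Hypothetical finite domain and relation (assumptions, not from the slides).
people = ["A", "B", "C"]
likes = {("A", "A"), ("A", "B"), ("B", "C"), ("C", "A")}

# ∃x ∀y Likes(x,y): some one person likes everybody.
exists_forall = any(all((x, y) in likes for y in people) for x in people)

# ∀y ∃x Likes(x,y): everybody is liked by somebody (possibly different x's).
forall_exists = all(any((x, y) in likes for x in people) for y in people)
```

Here no single person likes all three, yet each person is liked by someone, so the first formula is false and the second is true.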

SLIDE 23

} Negation and quantifiers
} ∃x P(x) ≡ ¬∀x ¬P(x)
  • “If P is true for some x, then P can’t be false for all x”
} ∀x P(x) ≡ ¬∃x ¬P(x)
  • “If P is true for all x, then there can’t be an x for which P is false.”
} ∀x ¬P(x) ≡ ¬∃x P(x)
  • “If P is false for all x, then there can’t be an x for which P is true.”
} ¬∀x P(x) ≡ ∃x ¬P(x)
  • “If P is not true for all x, then there must be an x for which P is false.”

SLIDE 24

} Equality symbol (Term1 = Term2) means Term1 and Term2 refer to the same object
  • E.g., RightOf(Location(1,1)) = Location(2,1)
} Useful for constraining two terms to be different
} E.g., Sibling
  • Sibling(x,y) ⇔ Parent(p,x) ∧ Parent(p,y)
  • Sibling(x,y) ⇔ Parent(p,x) ∧ Parent(p,y) ∧ ¬(x = y)
  • ∀x,y Sibling(x,y) ⇔ ∃p Parent(p,x) ∧ Parent(p,y) ∧ ¬(x = y)

SLIDE 25

} Closed-world assumption
  • Atomic sentences not known to be true are assumed false
} Unique-names assumption
  • Every constant symbol refers to a distinct object
} Domain closure
  • If not named by a constant symbol, then doesn’t exist

SLIDE 26

} TELL (KB, α)
  • TELL (KB, Percept([st,br,Glitter,bu,sc],5))
} ASK (KB, β)
  • ASK (KB, ∃a Action(a,5))
  • I.e., does KB entail any particular actions at time 5?
  • Answer: Yes, {a/Grab} ← substitution (binding list)
} ASKVARS (KB, α)
  • Returns answers (variable bindings) that make α true
  • Or, use Answer literal (later)
  • ASK (KB, ∃a Action(a,5) ∧ Answer(a))

SLIDE 27

} Percepts
  • Percept(p,t) = predicate that is true if percept p observed at time t
  • Percept is a list of five terms
  • E.g., Percept([Stench,Breeze,Glitter,None,None],5)
} Actions
  • GoForward, TurnLeft, TurnRight, Grab, Shoot, Climb
} ASKVARS (∃a BestAction(a,5)) → {a/Grab}

SLIDE 28

} “Perception”

  • ∀ t,s,g,m,c Percept([s,Breeze,g,m,c],t) ⇒ Breeze(t)
  • ∀ t,s,b,m,c Percept([s,b,Glitter,m,c],t) ⇒ Glitter(t)

} Reflex agent

  • ∀ t Glitter(t) ⇒ BestAction(Grab,t)

SLIDE 29

} Location list term [x,y] (e.g., [1,2])
  • Pit(s) or Pit([x,y])
  • At(Wumpus,[x,y],t)
  • At(Agent,[1,1],1)
} Definition of Breezy(s), where s is a location
  • ∀s Breezy(s) ⇔ ∃r Adjacent(s,r) ∧ Pit(r)
} Definition of Adjacent
  • ∀x,y,a,b Adjacent([x,y],[a,b]) ⇔ (x=a ∧ (y=b-1 ∨ y=b+1)) ∨ (y=b ∧ (x=a-1 ∨ x=a+1))

SLIDE 30

} Movement
} Wumpus never moves
  • ∀t At(Wumpus,[1,3],t)
} Nothing can be in two places at once
  • ∀x,s1,s2,t At(x,s1,t) ∧ At(x,s2,t) ⇒ s1=s2
} Successor-state axioms for each action
  • Describes what’s true before and after action
  • ∀t HaveArrow(t+1) ⇔ (HaveArrow(t) ∧ ¬Action(Shoot,t))
  • ∀t HaveGold(t+1) ⇔ (HaveGold(t) ∨ (Glitter(t) ∧ Action(Grab,t)))
  • ...

SLIDE 31

} How do we express that there is only one wumpus?
  • ∀x,y (Wumpus(x,y) ⇒ ¬(∃w,z Wumpus(w,z) ∧ (¬(w = x) ∨ ¬(z=y))))
} Only one arrow?
} Only one gold?
} At least one pit?

SLIDE 32

SLIDE 33

SLIDE 34

} Now that we have FOL, how can we perform sound, complete and efficient inference?
} Approaches
  • Generalized modus ponens
  • Forward and backward chaining
  • Resolution
} State of the art

SLIDE 35

} Carefully…


Monty Python and the Holy Grail (1975)

SLIDE 36

} Notation
} Modus Ponens
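The rule itself was a figure on the original slide and did not survive extraction; the standard statement (as in Russell & Norvig) is:

```latex
\frac{\alpha \Rightarrow \beta, \qquad \alpha}{\beta}
\quad \text{(Modus Ponens)}
```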

SLIDE 37

} Substitution (binding) θ = {x/y}
  • Replace all occurrences of x with y
  • E.g., α = At(Wumpus,s,t), θ = {s/[1,3], t/5}
    – α θ = At(Wumpus,[1,3],5)
} Generalized Modus Ponens
  • where SUBST(θ,pi’) = SUBST(θ,pi) for all i
} Find θ via unification
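The Generalized Modus Ponens rule referred to above was also a figure on the slide; its standard form is:

```latex
\frac{p_1',\; p_2',\; \ldots,\; p_n', \qquad (p_1 \land p_2 \land \cdots \land p_n \Rightarrow q)}
     {\mathrm{SUBST}(\theta, q)}
\qquad \text{where } \mathrm{SUBST}(\theta, p_i') = \mathrm{SUBST}(\theta, p_i) \text{ for all } i
```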

SLIDE 38

} Example
} ∀s,r Pit(s) ∧ Adjacent(s,r) ⇒ Breeze(r)
} Pit([3,1]), Adjacent([3,1],[2,1])
} p1 = Pit(s), p2 = Adjacent(s,r), q = Breeze(r)
} p1’ = Pit([3,1]), p2’ = Adjacent([3,1],[2,1])
} θ = {s/[3,1], r/[2,1]}
} SUBST(θ,q) = Breeze([2,1])

SLIDE 39

} Unification determines if two sentences match given some substitution (unifier)
} UNIFY(p,q) = θ where SUBST(θ,p) = SUBST(θ,q)
} Examples
  • UNIFY (At(Wumpus,s,t), At(Wumpus,[1,3],5)) = {s/[1,3], t/5}
  • UNIFY (At(Wumpus,s,t), At(Wumpus,r,5)) = {s/r, t/5}
  • UNIFY (At(Wumpus,s,t), At(Wumpus,AgentLoc(t),5)) = {s/AgentLoc(t), t/5} = {s/AgentLoc(5), t/5}
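The unification procedure sketched above fits in a few lines of Python. This is an illustrative sketch, not the book's algorithm verbatim: terms are nested tuples such as `("At", "Wumpus", "s", "t")`, variables are lowercase strings, and (as slide 41 notes some systems do) the occur check is omitted.

```python
# Compact unifier over tuple-encoded terms (representation is an assumption).
def is_var(x):
    return isinstance(x, str) and x[:1].islower()

def substitute(theta, x):
    """Apply substitution theta to term x, following variable chains."""
    if is_var(x):
        return substitute(theta, theta[x]) if x in theta else x
    if isinstance(x, tuple):
        return tuple(substitute(theta, xi) for xi in x)
    return x

def unify(x, y, theta=None):
    """Return a substitution making x and y identical, or None on failure.
    No occur check, so e.g. P(x) vs P(P(x)) would loop; see slide 41."""
    if theta is None:
        theta = {}
    x, y = substitute(theta, x), substitute(theta, y)
    if x == y:
        return theta
    if is_var(x):
        return {**theta, x: y}
    if is_var(y):
        return {**theta, y: x}
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):
            theta = unify(xi, yi, theta)
            if theta is None:
                return None
        return theta
    return None  # constants or structures clash
```

With this encoding, the slide's first example becomes `unify(("At","Wumpus","s","t"), ("At","Wumpus",("Loc",1,3),5))`, yielding the binding list for `s` and `t`.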

SLIDE 40

} Standardize variables apart
  • Use unique variable names in each sentence
  • UNIFY (At(x,[1,3],t), At(Wumpus,x,t)) = failure
  • UNIFY (At(x17,[1,3],t), At(Wumpus,x21,5)) = {x17/Wumpus, x21/[1,3], t/5}

SLIDE 41

} Occur check
  • When matching variable and term, check if variable occurs in term
  • If so, failure; e.g., P(x) does not unify with P(P(x))
  • Makes UNIFY quadratic in size of expression
  • Some inference systems omit occur check

SLIDE 42

Term 1                    | Term 2                       | Substitution (or fail)
Glitter(x,y)              | Glitter(3,3)                 |
Adjacent(x,2,2,y)         | Adjacent(2,y,x,3)            |
At(w,x,y)                 | At(Wumpus,u,3)               |
At(Agent,x,Row(Wumpus))   | At(z,3,Row(w))               |
At(w,Column(w),y)         | At(Agent,Column(Wumpus),3)   |

SLIDE 43

} Start with atomic sentences in KB
} Apply Modus Ponens where possible to infer new atomic sentences
} Continue until goal is proven or no new inferences can be made
} Assume first-order definite clauses for now
  • Disjunction of literals with exactly one positive literal
  • E.g., ∀x,y ¬A(x) ∨ ¬B(y) ∨ C(x,y) ≡ ∀x,y A(x) ∧ B(y) ⇒ C(x,y)

SLIDE 44

} The law says that it is a crime for an American to sell weapons to hostile nations.
} The country Nono, an enemy of America, has some missiles, and all of its missiles were sold to it by Colonel West, who is American.
} Prove that Col. West is a criminal

SLIDE 45

} “… it is a crime for an American to sell weapons to hostile nations.”
  • R1: ∀x,y,z American(x) ∧ Weapon(y) ∧ Sells(x,y,z) ∧ Hostile(z) ⇒ Criminal(x)
} “… Nono, an enemy of America, …”
  • R2: Enemy(Nono,America)
} “… Nono … has some missiles”
  • ∃x Owns(Nono,x) ∧ Missile(x)
  • R3: Owns(Nono,M1)
  • R4: Missile(M1)

} Existential Instantiation

SLIDE 46

} “… all of its missiles were sold to it by Colonel West”
  • R5: ∀x Missile(x) ∧ Owns(Nono,x) ⇒ Sells(West,x,Nono)
} “… Colonel West, who is American.”
  • R6: American(West)
} A few more rules…
  • R7: ∀x Missile(x) ⇒ Weapon(x)
  • R8: ∀x Enemy(x,America) ⇒ Hostile(x)

SLIDE 47


Forward-chaining proof (bottom-up):
  Facts: American(West) [R6], Missile(M1) [R4], Owns(Nono,M1) [R3], Enemy(Nono,America) [R2]
  R7 with {x1/M1}: Weapon(M1)
  R8 with {x3/Nono}: Hostile(Nono)
  R5 with {x2/M1}: Sells(West,M1,Nono)
  R1 with {x4/West, y1/M1, z1/Nono}: Criminal(West)
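The derivation above can be reproduced with a tiny ground forward-chainer. This sketch side-steps unification by listing the rules already instantiated (an assumption for brevity); each rule is a set of premise atoms plus a conclusion, and the loop is repeated Modus Ponens until nothing new is inferred.

```python
# Ground forward chaining over definite clauses (pre-instantiated for brevity).
def forward_chain(facts, rules, goal):
    """Return True if goal is derivable from facts via the rules."""
    facts = set(facts)
    changed = True
    while changed and goal not in facts:
        changed = False
        for premises, conclusion in rules:
            # Modus Ponens: all premises known => add the conclusion.
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return goal in facts

# The crime example, instantiated with {x/M1}, {x/Nono}, etc. by hand.
rules = [
    (frozenset({"Missile(M1)"}), "Weapon(M1)"),                     # R7
    (frozenset({"Enemy(Nono,America)"}), "Hostile(Nono)"),          # R8
    (frozenset({"Missile(M1)", "Owns(Nono,M1)"}),
     "Sells(West,M1,Nono)"),                                        # R5
    (frozenset({"American(West)", "Weapon(M1)",
                "Sells(West,M1,Nono)", "Hostile(Nono)"}),
     "Criminal(West)"),                                             # R1
]
facts = ["American(West)", "Missile(M1)", "Owns(Nono,M1)",
         "Enemy(Nono,America)"]
```

Running `forward_chain(facts, rules, "Criminal(West)")` follows exactly the bottom-up proof shown above.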

SLIDE 48

} Sound?
} Complete?
} Efficient?
  • Matching all rules against all known facts
    – Conjunct ordering
    – R5: ∀x Missile(x) ∧ Owns(Nono,x) ⇒ Sells(West,x,Nono)
  • Recheck every rule on every iteration
    – Every new fact inferred on iteration t must be derived from at least one new fact inferred on iteration t-1
    – Incremental forward chaining
  • Irrelevant facts (e.g., Enemy(Wumpus,America))

SLIDE 49

} Work backwards from the goal
} For rules concluding goal, add premises as new goals
} Continue until all open goals supported by known facts
} Again, assume first-order definite clauses for now

SLIDE 50


Backward-chaining proof (top-down):
  Goal Criminal(West) via R1 with {x1/West}:
    American(West) (fact R6)
    Weapon(y) via R7 with {x2/y}: Missile(y), solved by R4 with {y/M1}
    Sells(West,M1,z) via R5 with {z/Nono}: Missile(M1) [R4], Owns(Nono,M1) [R3]
    Hostile(Nono) via R8: Enemy(Nono,America) (fact R2)

SLIDE 51

} Sound?
} Complete?
} Efficient?
  • Matching all rules against all open goals
    – More constraints
    – R5: ∀x Missile(x) ∧ Owns(Nono,x) ⇒ Sells(West,x,Nono)
  • Recheck every rule on every iteration
    – Yes, but only those whose consequent unifies with an open goal
  • Irrelevant facts (e.g., Enemy(Wumpus,America))
    – Excluded
} Logic programming in Prolog

SLIDE 52

} l's and m's are literals
} A clause is a disjunction of literals
} Resolution takes two clauses and infers a new clause

SLIDE 53

} Resolution using refutation (proof by contradiction) is sound and complete
  • KB = {¬A(x) ∨ B(x), A(Wumpus)}
  • Prove: B(Wumpus)
  • Add negated goal to KB: ¬B(Wumpus)
  • Search for contradiction using resolution
    – If result ever empty clause, then proven
  • Resolve original clauses: B(Wumpus), θ = {x/Wumpus}
  • Resolve B(Wumpus) and ¬B(Wumpus): {}
} Convert FOL to clausal form (CNF)
} Efficient? Resolution strategies
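The refutation above can be mechanized for the ground case. In this sketch the substitution {x/Wumpus} is applied by hand so no unification is needed (an assumption for brevity); literals are strings with `-` marking negation, and a clause is a frozenset of literals.

```python
# Ground resolution refutation of the slide's example.
def negate(lit):
    return lit[1:] if lit.startswith("-") else "-" + lit

def resolve(c1, c2):
    """All resolvents of two clauses (cancel one complementary pair)."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return out

def refutes(kb, goal):
    """Add the negated goal and saturate; True if the empty clause appears."""
    clauses = set(kb) | {frozenset({negate(goal)})}
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a == b:
                    continue
                for r in resolve(a, b):
                    if not r:
                        return True   # empty clause: contradiction found
                    new.add(frozenset(r))
        if new <= clauses:
            return False              # saturated without contradiction
        clauses |= new

# KB with {x/Wumpus} already applied: {¬A(Wumpus) ∨ B(Wumpus), A(Wumpus)}
kb = [frozenset({"-A(Wumpus)", "B(Wumpus)"}), frozenset({"A(Wumpus)"})]
```

`refutes(kb, "B(Wumpus)")` first derives B(Wumpus) from the two KB clauses, then resolves it against the negated goal ¬B(Wumpus) to reach the empty clause, mirroring the steps on the slide.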

SLIDE 54

} Conjunctive Normal Form (CNF)
  • Conjunction of clauses
  • Each clause is a disjunction of literals
  • Variables assumed to be universally quantified
} Example
  • ∀x,y,z American(x) ∧ Weapon(y) ∧ Sells(x,y,z) ∧ Hostile(z) ⇒ Criminal(x)
  • ¬American(x) ∨ ¬Weapon(y) ∨ ¬Sells(x,y,z) ∨ ¬Hostile(z) ∨ Criminal(x)

SLIDE 55

} Step 1: Eliminate implications ⇒
  • From: ∀x A(x) ∧ B(x) ⇒ C(x)
  • To: ∀x ¬A(x) ∨ ¬B(x) ∨ C(x)
} Step 2: Move ¬ inwards
  • ¬∀x A(x) becomes ∃x ¬A(x)
  • ¬∃x A(x) becomes ∀x ¬A(x)
} Step 3: Standardize variables
  • From: (∀x A(x)) ∧ (∀x B(x))
  • To: (∀x1 A(x1)) ∧ (∀x2 B(x2))

SLIDE 56

} Step 4: Skolemize (Skolemization)
  • Eliminate existential quantifiers by replacing them with a new constant or function
  • Skolem constant, Skolem function
  • Arguments of the Skolem function are all the universally quantified variables in whose scope the existential quantifier appears
  • From: ∃x P(x), To: P(F1)
  • From: ∀x,y ∃z P(x,y,z)
  • To: ∀x,y P(x,y,F1(x,y))


Thoralf Skolem (1887-1963) Norwegian mathematician

SLIDE 57

} Step 5: Drop universal quantifiers
  • All remaining variables universally quantified
  • So, just drop the ∀x,y,…
} Step 6: Distribute ∨ over ∧
  • From: (A(x) ∧ B(x)) ∨ C(x)
  • To: (A(x) ∨ C(x)) ∧ (B(x) ∨ C(x))
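Step 6 is a purely structural rewrite, so it is easy to express as code. In this sketch an expression is either an atom (a string) or a 3-tuple `("and", l, r)` / `("or", l, r)`; that tree encoding, and the restriction to binary ∧/∨, are assumptions for illustration.

```python
# Distribute ∨ over ∧ on a tiny expression tree (encoding is an assumption).
def distribute(e):
    if isinstance(e, str):
        return e                       # atom: nothing to do
    op, l, r = e[0], distribute(e[1]), distribute(e[2])
    if op == "or":
        # (A ∧ B) ∨ C  →  (A ∨ C) ∧ (B ∨ C), applied recursively
        if isinstance(l, tuple) and l[0] == "and":
            return ("and", distribute(("or", l[1], r)),
                           distribute(("or", l[2], r)))
        if isinstance(r, tuple) and r[0] == "and":
            return ("and", distribute(("or", l, r[1])),
                           distribute(("or", l, r[2])))
    return (op, l, r)

# The slide's example: (A(x) ∧ B(x)) ∨ C(x)
expr = ("or", ("and", "A(x)", "B(x)"), "C(x)")
```

Applying `distribute(expr)` yields the tree for (A(x) ∨ C(x)) ∧ (B(x) ∨ C(x)), matching the "To:" line above.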

SLIDE 58

} What is a brick?
  • A brick is on something that is not a pyramid
  • There is nothing that a brick is on and that is on the brick as well
  • There is nothing that is not a brick and also is the same thing as a brick.


∀x [Brick(x) ⇒ (∃y [On(x,y) ∧ ¬Pyramid(y)] ∧ ¬∃y [On(x,y) ∧ On(y,x)] ∧ ∀y [¬Brick(y) ⇒ ¬Equal(x,y)])]

SLIDE 59

} Step 1: Eliminate implications


∀x [¬Brick(x) ∨ (∃y [On(x,y) ∧ ¬Pyramid(y)] ∧ ¬∃y [On(x,y) ∧ On(y,x)] ∧ ∀y [¬¬Brick(y) ∨ ¬Equal(x,y)])]

SLIDE 60

} Step 2: Move ¬ inwards


∀x [¬Brick(x) ∨ (∃y [On(x,y) ∧ ¬Pyramid(y)] ∧ ∀y ¬[On(x,y) ∧ On(y,x)] ∧ ∀y [Brick(y) ∨ ¬Equal(x,y)])]

∀x [¬Brick(x) ∨ (∃y [On(x,y) ∧ ¬Pyramid(y)] ∧ ∀y [¬On(x,y) ∨ ¬On(y,x)] ∧ ∀y [Brick(y) ∨ ¬Equal(x,y)])]

SLIDE 61

} Step 3: Standardize variables


∀x [¬Brick(x) ∨ (∃y [On(x,y) ∧ ¬Pyramid(y)] ∧ ∀a [¬On(x,a) ∨ ¬On(a,x)] ∧ ∀b [Brick(b) ∨ ¬Equal(x,b)])]

SLIDE 62

} Step 4: Skolemization
} Step 5: Drop universal quantifiers


∀x [¬Brick(x) ∨ ([On(x,F(x)) ∧ ¬Pyramid(F(x))] ∧ ∀a [¬On(x,a) ∨ ¬On(a,x)] ∧ ∀b [Brick(b) ∨ ¬Equal(x,b)])]

¬Brick(x) ∨ ([On(x,F(x)) ∧ ¬Pyramid(F(x))] ∧ [¬On(x,a) ∨ ¬On(a,x)] ∧ [Brick(b) ∨ ¬Equal(x,b)])

SLIDE 63

} Step 6: Distribute ∨ over ∧


(¬Brick(x) ∨ On(x,F(x))) ∧ (¬Brick(x) ∨ ¬Pyramid(F(x))) ∧ (¬Brick(x) ∨ ¬On(x,a) ∨ ¬On(a,x)) ∧ (¬Brick(x) ∨ Brick(b) ∨ ¬Equal(x,b))

SLIDE 64

} CNF
  • ¬American(x) ∨ ¬Weapon(y) ∨ ¬Sells(x,y,z) ∨ ¬Hostile(z) ∨ Criminal(x)
  • ¬Missile(x) ∨ ¬Owns(Nono,x) ∨ Sells(West,x,Nono)
  • ¬Missile(x) ∨ Weapon(x)
  • ¬Enemy(x,America) ∨ Hostile(x)
  • Enemy(Nono,America)
  • Owns(Nono,M1)
  • Missile(M1)
  • American(West)
} Prove: Criminal(West)
  • Add ¬Criminal(West) to KB and derive empty clause

SLIDE 65

SLIDE 66

} Prove: ∃c Criminal(c)
  • Add ¬∃c Criminal(c) to KB
  • I.e., add ¬Criminal(c) to KB
} Generated clauses
  • ¬American(c) ∨ ¬Weapon(y) ∨ ¬Sells(c,y,z) ∨ ¬Hostile(z)
  • ¬Weapon(y) ∨ ¬Sells(West,y,z) ∨ ¬Hostile(z)
  • ¬Missile(y) ∨ ¬Sells(West,y,z) ∨ ¬Hostile(z)
  • ¬Sells(West,M1,z) ∨ ¬Hostile(z)
  • ¬Missile(M1) ∨ ¬Owns(Nono,M1) ∨ ¬Hostile(Nono)
  • ¬Owns(Nono,M1) ∨ ¬Hostile(Nono)
  • ¬Hostile(Nono)
  • ¬Enemy(Nono,America)

SLIDE 67

} Add clause with negated goal and answer literal to KB
} Search for clause containing only answer literal
} Prove: ∃x,y,z Goal(x,y,z)
} Add (¬Goal(x,y,z) ∨ Answer(x,y,z)) to KB
} Final clause Answer(x,y,z) will have variables bound to answers

SLIDE 68

} Prove: ∃c Criminal(c) and retrieve c
  • Add [¬Criminal(c) ∨ Answer(c)] to KB
} Generated clauses
  • ¬American(c) ∨ ¬Weapon(y) ∨ ¬Sells(c,y,z) ∨ ¬Hostile(z) ∨ Answer(c)
  • ¬Weapon(y) ∨ ¬Sells(West,y,z) ∨ ¬Hostile(z) ∨ Answer(West)
  • ¬Missile(y) ∨ ¬Sells(West,y,z) ∨ ¬Hostile(z) ∨ Answer(West)
  • ¬Sells(West,M1,z) ∨ ¬Hostile(z) ∨ Answer(West)
  • ¬Missile(M1) ∨ ¬Owns(Nono,M1) ∨ ¬Hostile(Nono) ∨ Answer(West)
  • ¬Owns(Nono,M1) ∨ ¬Hostile(Nono) ∨ Answer(West)
  • ¬Hostile(Nono) ∨ Answer(West)
  • ¬Enemy(Nono,America) ∨ Answer(West)
  • Answer(West)

SLIDE 69

} Willy is a wumpus. Swiper is not a wumpus, and Swiper is not a dog. If something is neither a wumpus nor a dog, then it is a fox. Foxes jump over wumpuses.
} Prove that Swiper jumps over Willy.

SLIDE 70

SLIDE 71

SLIDE 72

} Vampire (vprover.github.io)
} iProver (www.cs.man.ac.uk/~korovink/iprover)
} Conference on Automated Deduction (CADE) ATP System Competition (CASC)
  • tptp.cs.miami.edu/CASC

} Applications

  • Mathematical theorem proving
  • Hardware and software synthesis and verification
  • Reasoning

SLIDE 73

} Knowledge-based (logical) agent
} First-order logic
} Inference
  • Resolution proof by refutation
