ARTIFICIAL INTELLIGENCE
Russell & Norvig Chapter 7. Logical Agents
Knowledge bases
Knowledge base = set of sentences in a formal language
Declarative approach to building an agent (or other system): Tell it what it needs to know
Then it can Ask itself what to do; answers should follow from the KB
Agents can be viewed at the knowledge level, i.e., what they know, regardless of how implemented
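The Tell/Ask interface above can be sketched in a few lines. This is an illustrative sketch only; the class and method names (`KnowledgeBase`, `tell`, `ask`) are not the textbook's pseudocode, and the entailment check is supplied by the caller.

```python
# Minimal sketch of a knowledge-based agent's interface (illustrative names).
class KnowledgeBase:
    def __init__(self):
        self.sentences = []          # KB = set of sentences in a formal language

    def tell(self, sentence):
        """Declarative approach: tell the agent what it needs to know."""
        self.sentences.append(sentence)

    def ask(self, query, entails):
        """Answers should follow from the KB, via some entailment check."""
        return entails(self.sentences, query)

# Usage with a trivial entailment check (query is literally in the KB):
kb = KnowledgeBase()
kb.tell("B11 <=> (P12 | P21)")
print(kb.ask("B11 <=> (P12 | P21)", lambda kb_, q: q in kb_))   # True
```

Any sound inference procedure can be plugged in as the `entails` argument; the examples later in these notes implement real ones.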
A key feature of logical agents: from the sentences in the KB, new conclusions may be drawn.
Syntax defines the sentences in the language; semantics defines their truth in each possible world, providing the connection between sentences and the real world.
A model assigns true/false to every proposition symbol in the knowledge base; for n symbols, the 2^n models can be enumerated exhaustively.
¬P is true iff P is false
P ∧ Q is true iff P is true and Q is true
P ∨ Q is true iff P is true or Q is true
P ⇒ Q is true iff P is false or Q is true
P ⇔ Q is true iff P ⇒ Q is true and Q ⇒ P is true
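These truth-table definitions translate directly into code. A small sketch (function names are my own, chosen to mirror the connectives):

```python
# Truth-table semantics of the five connectives, written directly in Python.
def NOT(p):        return not p
def AND(p, q):     return p and q
def OR(p, q):      return p or q
def IMPLIES(p, q): return (not p) or q           # P ⇒ Q is true iff P is false or Q is true
def IFF(p, q):     return IMPLIES(p, q) and IMPLIES(q, p)

# Enumerate all four truth assignments for P, Q:
for P in (True, False):
    for Q in (True, False):
        print(P, Q, IMPLIES(P, Q), IFF(P, Q))
```

Note that `IMPLIES(False, Q)` is true for either value of Q: a false premise makes the implication vacuously true.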
Two sentences are logically equivalent iff they are true in the same models: α ≡ β iff α ╞ β and β ╞ α
Let Pi,j be true if there is a pit in [i, j]. Let Bi,j be true if there is a breeze in [i, j].
¬P1,1 (no pit in the start square)
¬B1,1 (percept: no breeze in [1,1])
B2,1 (percept: breeze in [2,1])
"A square is breezy iff a neighboring square has a pit":
B1,1 ⇔ (P1,2 ∨ P2,1)
B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)
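What does this little KB entail about the pits? A brute-force model-enumeration sketch (the helper names `kb_true` and `entails` are mine; the breeze symbols are fixed by the percepts, so only the pit symbols are enumerated):

```python
from itertools import product

# Symbols: the pit propositions mentioned in the KB.
symbols = ["P11", "P12", "P21", "P22", "P31"]

def iff(p, q):
    return p == q

def kb_true(m):
    b11, b21 = False, True                       # percepts: ¬B1,1 and B2,1
    return (not m["P11"]                         # no pit in the start square
            and iff(b11, m["P12"] or m["P21"])
            and iff(b21, m["P11"] or m["P22"] or m["P31"]))

def entails(query):
    """KB ╞ query iff query holds in every model of the KB."""
    return all(query(dict(zip(symbols, vals)))
               for vals in product([True, False], repeat=len(symbols))
               if kb_true(dict(zip(symbols, vals))))

print(entails(lambda m: not m["P12"]))            # True: no pit in [1,2]
print(entails(lambda m: m["P22"] or m["P31"]))    # True: a pit in [2,2] or [3,1]
print(entails(lambda m: m["P22"]))                # False: not determined by the KB
```

The last query shows the difference between entailment and mere consistency: P2,2 is true in some models of the KB but not all, so the KB does not entail it.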
A sentence is valid if it is true in all models,
e.g., True, A ∨ ¬A, A ⇒ A, (A ∧ (A ⇒ B)) ⇒ B
A sentence is satisfiable if it is true in some model,
e.g., A ∨ B, C
A sentence is unsatisfiable if it is true in no models,
e.g., A ∧ ¬A
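The three categories can be checked mechanically by enumerating all truth assignments. A small sketch, with sentences represented as Python functions of a model (the `classify` helper is my own):

```python
from itertools import product

def classify(sentence, symbols):
    """Classify a propositional sentence by enumerating all models."""
    values = [sentence(dict(zip(symbols, vals)))
              for vals in product([True, False], repeat=len(symbols))]
    if all(values):
        return "valid"            # true in all models
    if any(values):
        return "satisfiable"      # true in some model
    return "unsatisfiable"        # true in no models

print(classify(lambda m: m["A"] or not m["A"], ["A"]))    # valid
print(classify(lambda m: m["A"] or m["B"], ["A", "B"]))   # satisfiable
print(classify(lambda m: m["A"] and not m["A"], ["A"]))   # unsatisfiable
```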
Satisfiability is connected to inference via the following:
KB ╞ α if and only if (KB ∧ ¬α) is unsatisfiable
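This connection gives proof by refutation: to decide entailment, check that no model makes the KB true and α false. A sketch using brute-force satisfiability checking (helper names are mine):

```python
from itertools import product

def satisfiable(sentence, symbols):
    """True iff some truth assignment makes the sentence true."""
    return any(sentence(dict(zip(symbols, vals)))
               for vals in product([True, False], repeat=len(symbols)))

def entails(kb, alpha, symbols):
    # KB ╞ α iff (KB ∧ ¬α) is unsatisfiable
    return not satisfiable(lambda m: kb(m) and not alpha(m), symbols)

# Example: {A, A ⇒ B} ╞ B
kb = lambda m: m["A"] and ((not m["A"]) or m["B"])
print(entails(kb, lambda m: m["B"], ["A", "B"]))   # True
```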
Soundness: an inference procedure i is sound if whenever KB ├i α, it is also true that KB ╞ α
Completeness: i is complete if whenever KB ╞ α, it is also true that KB ├i α
Preview: first-order logic is expressive enough to say almost anything of interest, and for it there exists a sound and complete inference procedure.
That is, the procedure will answer any question whose answer follows from what is known by the KB.
Does KB ╞ α? Enumerate all models (truth assignments of all its symbols), and check that α is true in every model in which KB is true.
For n symbols there are 2^n models, so this check takes time O(2^n).
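The enumeration can be done as a depth-first recursion over the symbols, one truth value at a time. A sketch of truth-table entailment under that scheme (sound and complete, but exponential in n):

```python
# Depth-first enumeration of all 2^n models.  Sentences are Python
# functions of a model (a dict mapping symbol names to True/False).
def tt_entails(kb, alpha, symbols, model=None):
    model = model or {}
    if not symbols:
        # A complete model: alpha must hold wherever the KB holds.
        return alpha(model) if kb(model) else True
    first, rest = symbols[0], symbols[1:]
    return (tt_entails(kb, alpha, rest, {**model, first: True}) and
            tt_entails(kb, alpha, rest, {**model, first: False}))

kb    = lambda m: m["P"] and (not m["P"] or m["Q"])   # {P, P ⇒ Q}
alpha = lambda m: m["Q"]
print(tt_entails(kb, alpha, ["P", "Q"]))   # True
```

Each recursive call splits on one symbol, so the recursion tree has exactly 2^n leaves, matching the O(2^n) bound above.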
Inference rules generate new sentences (conclusions) from the premises in the KB
Can use inference rules as operators in a standard search algorithm
e.g., min-conflicts-like hill-climbing algorithms
An inference rule is written with its premises above a line and its conclusion below, e.g., Modus Ponens: from α ⇒ β and α, infer β
α: “The weather is dry” β: “The weather is rainy” γ: “I carry an umbrella”
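A single resolution step on sentences like these can be sketched with clauses as sets of string literals, where "~x" denotes ¬x (representation and the `resolve` helper are my own). From "dry or rainy" (α ∨ β) and "not rainy, or I carry an umbrella" (¬β ∨ γ), resolution on the complementary pair rainy/¬rainy yields "dry, or I carry an umbrella" (α ∨ γ):

```python
def resolve(c1, c2):
    """All clauses obtainable by resolving c1 with c2 on one literal pair."""
    resolvents = []
    for lit in c1:
        neg = lit[1:] if lit.startswith("~") else "~" + lit
        if neg in c2:
            resolvents.append((c1 - {lit}) | (c2 - {neg}))
    return resolvents

dry_or_rainy        = {"dry", "rainy"}           # α ∨ β
rain_needs_umbrella = {"~rainy", "umbrella"}     # ¬β ∨ γ
print(resolve(dry_or_rainy, rain_needs_umbrella))   # resolvent {dry, umbrella}, i.e. α ∨ γ
```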
The resolution procedure works by resolving pairs of clauses that contain complementary literals and adding resulting clauses to the list. The process continues until one of two things happens:
There are no new clauses that can be added, in which case KB does not entail α; or
Two clauses resolve to yield the empty clause, in which case we have a contradiction and KB ╞ α.
This rests on the earlier connection: α ╞ β if and only if α ∧ ¬β is unsatisfiable.
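The full refutation loop can be sketched as follows, again with clauses as sets of string literals and "~x" for ¬x (the `pl_resolution` helper name is mine; this is the standard saturation loop, not an optimized implementation):

```python
from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """All resolvents of clauses c1 and c2 (frozensets of literals)."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return out

def pl_resolution(clauses):
    """True iff the clause set is unsatisfiable."""
    clauses = set(clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for resolvent in resolve(c1, c2):
                if not resolvent:          # empty clause: contradiction
                    return True
                new.add(resolvent)
        if new <= clauses:                 # no new clauses: satisfiable
            return False
        clauses |= new

# KB = {P, ¬P ∨ Q} in clause form; to test KB ╞ Q, add the negated query ~Q:
kb = [frozenset({"P"}), frozenset({"~P", "Q"})]
print(pl_resolution(kb + [frozenset({"~Q"})]))   # True: empty clause found, KB ╞ Q
print(pl_resolution(kb + [frozenset({"Q"})]))    # False: KB does not entail ¬Q
```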
In the worst case, checking propositional entailment takes time exponential in the number of symbols (the problem is co-NP-complete).