
Modeling and reasoning in propositional calculus

CUGS: Logic II

Lecture 01

Slide 1 of 36

The problem

Consider the following rules specifying robot’s behavior:

1. if the surface is not dry then slow down or turn on the safe mode
2. if the speed is not reduced then keep the signal turned on
3. if the signal is on and the surface is not dry then do not turn on the safe mode
4. if the surface is not dry then slow down.

Slide 2 of 36

The problem

The task is to verify whether:
(i) the conjunction of (1), (2) and (3) implies (4);
(ii) the conjunction of (1), (2) and (4) implies (3).

Slide 3 of 36

Modeling

A good model is a more or less simplified description of reality. It should allow one to derive conclusions valid in the modeled reality. Usually the goal is to use formal tools as elementary as possible to specify the model at the required level of simplification.

Example: models of a car
  • a driver’s point of view: e.g., steering wheel, gears, starter, light switches, etc.
  • a designer’s point of view: e.g., models of aerodynamic flows, models of materials’ strength, etc.
  • a dealer’s point of view: shape, color, price, etc.

Slide 4 of 36

Typical environments of intelligent systems

[Figure: intelligent systems K1, K2, . . . , Kn access REALITY through perception, language, and logic.]

Slide 5 of 36

Quantitative and symbolic reasoning

Quantitative reasoning

  • algorithmic methods
  • analytical/numerical methods
  • probabilistic/statistical methods
  • fuzzy logic.

Symbolic reasoning

  • classical logic and logic programming
  • three- and many-valued logics
  • modal and temporal logics
  • nonmonotonic reasoning
  • approximate reasoning

Slide 6 of 36

From sensors to higher level reasoning

[Figure: a layered pipeline — sensors and cameras deliver noisy, incomplete data to quantitative databases; quantitative reasoning feeds qualitative databases holding incomplete, inconsistent data, on which qualitative reasoning operates.]

Slide 7 of 36

What are logics?

Logic (approximately) is a tool allowing us to perceive (specify and reason about) the world through the truth levels of properties (formulas) expressed in a given language with well-defined syntax and semantics.

Logic ≈ language + semantics/models (the semantic view)
      ≈ language + deduction (the syntactic view)

Slide 8 of 36

Modeling in logics

  • fix a formal language (dictionary, grammar)
  • fix methods of correct reasoning
  • specify properties of the investigated reality in the chosen language, obtaining a model
  • test the model by reasoning about properties of the reality
  • investigate the reality solely through the level of truth of the expressed properties.

Slide 9 of 36

Fixing the language

The language is adjusted to a given application area. For example,

1. talking about politics we use concepts like “political party”, “prime minister”, “parliament”, “program”, etc.
2. talking about computer science we use concepts like “software”, “database”, “program”, etc.

We may have different vocabularies, although some names can be the same while having different meanings.

Slide 10 of 36

Semantical presentation of logics

Semantical presentation depends on choosing models and attaching an interpretation of formulas in models. If A is a formula and M is a model then we write M ⊨ A to indicate that A is true in M, and M ⊭ A to indicate that A is not true in M. If S is a set of formulas then M ⊨ S denotes the fact that for all A ∈ S, M ⊨ A. We say that a formula A is a consequence of a set of formulas S if for any model M we have that M ⊨ S implies M ⊨ A.

Slide 11 of 36

Example

Assume that in a model M we have three objects: o1, a red car; o2, a brown car; and o3, a red bicycle. Assume further that in our language we have propositions: car, bicycle, red, brown. Let proposition car be t for o1 and o2, bicycle be t for o3, red be t for o1 and o3, and brown be t for o2.

Then:
  • M ⊨ red or brown
  • M ⊭ if car then brown (o1 is a car but is not brown).

Slide 12 of 36

Logical language

A logical language is defined by fixing logical connectives, operators, dictionaries, and syntax rules stating how to form formulas. Logical connectives and operators have a fixed meaning. Dictionaries reflect concepts of a given application area and are flexible.

Elements of a logical language
  • logical constants: true, false, denoting logical values t, f; sometimes also others, e.g., unknown, inconsistent
  • logical (propositional) variables (letters, atoms), representing logical unknowns, e.g., p, q
  • relation symbols, representing relations, e.g., =, ≤

Slide 13 of 36

Logical language

Elements of a logical language – continued
  • individual constants (constants), representing objects, e.g., 0, 1, John
  • individual variables, representing objects, e.g., x, y, m, n
  • function symbols, representing functions, e.g., +, ∗, father()
  • propositional connectives and operators, allowing one to create more complex formulas from simpler ones
    – examples of connectives: “and”, “or”, “implies”
    – examples of operators: “for all”, “exists”, “knows”, “always”
  • auxiliary symbols, making notation easier to read
    – examples: “(”, “)”, “[”, “]”.

Slide 14 of 36

Logical language

Why “function/relation symbols” rather than “functions/relations”? In natural language, names are not the objects they denote. In logics, symbols correspond to names: a function/relation symbol is not a function/relation, but a name. In contrast to natural language, in logic a symbol denotes a unique object.

Slide 15 of 36

BNF notation

BNF notation allows us to define the syntax of languages. There are two forms of rules:

Rule                    Meaning
S ::= S1 . . . Sn       symbol S may be replaced by the sequence S1 . . . Sn
S ::= S1 | . . . | Sn   symbol S may be replaced by one of S1, . . . , Sn

Slide 16 of 36

Propositional calculus

Propositional calculus investigates the validity of complex sentences on the basis of truth values of sub-sentences. Let P denote the set of propositional variables. Truth values: t, f.

Formulas:
fml ::= P | ¬fml | fml ∨ fml | fml ∧ fml | fml → fml | fml ≡ fml | (fml) | [fml]

Convention: Brackets are used to make the notation unambiguous. To simplify notation we often omit brackets, assuming that the order of precedence from high to low is: ¬, ∧, ∨, ≡, →. For example,

A ∨ ¬B ∧ C → ¬D ≡ E ∨ F abbreviates [A ∨ ((¬B) ∧ C)] → [(¬D) ≡ (E ∨ F)].

Slide 17 of 36

Propositional calculus

Examples of propositional formulas
  • brake pedal pressed → slow down
  • engine on ∧ gear on ∧ gas pedal pressed → motion
  • ¬gear on → ¬motion ∨ slow down
  • . . .

Slide 18 of 36

Towards solving the initial problem (see slide 2)

Atomic sentences

  • dry – standing for “the surface is dry”
  • slow – standing for “slow down” (i.e., “reduce speed”)
  • safe – standing for “turn on safe mode”
  • sig – standing for “signal on”.

Translation of the considered sentences
1. (¬dry) → (slow ∨ safe)
2. (¬slow) → sig
3. (sig ∧ ¬dry) → ¬safe
4. (¬dry) → slow.

Slide 19 of 36

Solving the initial problem

The first task is to verify:
{[(¬dry) → (slow ∨ safe)] ∧ [(¬slow) → sig] ∧ [(sig ∧ ¬dry) → ¬safe]} → [(¬dry) → slow]

The second task is to verify:
{[(¬dry) → (slow ∨ safe)] ∧ [(¬slow) → sig] ∧ [(¬dry) → slow]} → [(sig ∧ ¬dry) → ¬safe]

Slide 20 of 36

The first semantics: assign truth values to sentences

Truth tables

Truth tables provide us with a semantics of logical connectives. A truth table consists of columns representing the arguments of a connective and one (last) column representing the logical value of the sentence built from the arguments using the connective.

Truth table for negation
A  ¬A
f  t
t  f

Slide 21 of 36

Truth tables: conjunction, disjunction, implication, equivalence

Truth table for the connectives
A  B  A ∧ B  A ∨ B  A → B  A ≡ B
f  f    f      f      t      t
f  t    f      t      t      f
t  f    f      t      f      f
t  t    t      t      t      t

Slide 22 of 36

Some definitions

  • A tautology is a formula which is t for all possible assignments of truth values to atomic sentences.
  • A formula is satisfiable if it is t for at least one such assignment.
  • A formula which is f for all such assignments is called a counter-tautology.

Slide 23 of 36
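These definitions translate directly into a brute-force classifier; a small sketch (the `classify` helper and its return strings are mine):

```python
from itertools import product

def classify(formula, nvars):
    """Classify a boolean function by enumerating all assignments."""
    values = [formula(*v) for v in product([False, True], repeat=nvars)]
    if all(values):
        return "tautology"
    if any(values):
        return "satisfiable, not a tautology"
    return "counter-tautology"

print(classify(lambda p: p or not p, 1))        # tautology
print(classify(lambda p, q: p and not q, 2))    # satisfiable, not a tautology
print(classify(lambda p: p and not p, 1))       # counter-tautology
```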

What is correct reasoning?

Correct reasoning is based on correct (sound) arguments. A correct (sound) argument is one in which anyone who accepts its premises should also accept its conclusions. To see whether an argument is correct, one does not judge whether there are good reasons for accepting the premises, but whether a person who accepted the premises, for whatever reasons, good or bad, ought also to accept the conclusion.

Slide 24 of 36

Examples

Examples of correct arguments
  • if x is a parent of y, and y is a parent of z, then x is a grandparent of z
  • if A and B is true, then A is true.

Examples of incorrect arguments
  • if A implies B, then B implies A
  • if A or B is true, then A is true.

Slide 25 of 36

Proof systems

Proof systems allow us to formalize correct reasoning. There are many ways to define them. For example:

  • analytic tableaux (Beth, Smullyan)
  • Gentzen-like proof systems (natural deduction)
  • Hilbert-like systems
  • resolution (Robinson)

Slide 26 of 36

Some meta-properties

A meta-property is a property of the logic rather than of the reality the logic describes. There are two important meta-properties relating the syntactical and semantical approaches, namely soundness (also called correctness) and completeness of a proof system wrt a given semantics.

Assume a logic is given by its semantics S and by a proof system P. Then we say that:
  • proof system P is sound (correct) wrt the semantics S iff every property that can be proved using P is true under semantics S,
  • proof system P is complete wrt the semantics S iff every property that is true under semantics S can be proved using P.

Slide 27 of 36

Propositional calculus: tableaux

A literal is an atom or the negation of an atom. For any formula A, {A, ¬A} is a complementary pair of formulas: A is the complement of ¬A, and ¬A is the complement of A.

Semantic tableau for a formula A
A semantic tableau T is a tree with each node labeled with a set of formulas, where T represents A in such a way that A is equivalent to the disjunction of the formulas appearing in all leaves, assuming that the sets of formulas labeling the leaves are interpreted as conjunctions of their members.

Slide 28 of 36

Example

The tableau:

          A
          ↓
        B, C
     ↙        ↘
D, E, F, G     E, H, I, J, K

represents
(D ∧ E ∧ F ∧ G)            – the first branch
∨ (E ∧ H ∧ I ∧ J ∧ K)      – the second branch

Slide 29 of 36

α-formulas and β-formulas

α             α1         α2
¬¬A1          A1
A1 ∧ A2       A1         A2
¬(A1 ∨ A2)    ¬A1        ¬A2
¬(A1 → A2)    A1         ¬A2
A1 ≡ A2       A1 → A2    A2 → A1

β             β1           β2
¬(B1 ∧ B2)    ¬B1          ¬B2
B1 ∨ B2       B1           B2
B1 → B2       ¬B1          B2
¬(B1 ≡ B2)    ¬(B1 → B2)   ¬(B2 → B1)

Slide 30 of 36

Closed and open leaves

A leaf is called closed if it contains a complementary pair of literals. If a leaf consists of literals only and contains no complementary pair of literals then we call it open. A tableau is completed if all its leaves are open or closed.

Slide 31 of 36

Construction of a semantic tableau for a formula C

Initially, T consists of a single node labeled with {C}. If T is completed then no further construction is possible. Otherwise choose a leaf, say l, labeled with a set S containing a non-literal, choose from S a formula D which is not a literal, and:
  • if D is an α-formula then create a successor of l and label it with (S − {D}) ∪ {α1, α2}
  • if D is a β-formula then create two new successors of l, the first one labeled with (S − {D}) ∪ {β1} and the second one labeled with (S − {D}) ∪ {β2}.

Slide 32 of 36
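The construction above can be sketched as a recursive satisfiability test. The tuple representation of formulas and the helper names are my own; for brevity the sketch covers ¬, ∧, ∨, → (the ≡ rows of the α/β tables are omitted):

```python
# Formulas as nested tuples: ('atom', name), ('not', f),
# ('and', a, b), ('or', a, b), ('imp', a, b).

def is_literal(f):
    return f[0] == 'atom' or (f[0] == 'not' and f[1][0] == 'atom')

def expand(f):
    """Return ('alpha', parts) or ('beta', parts) per the α/β tables."""
    op = f[0]
    if op == 'and':
        return 'alpha', [f[1], f[2]]
    if op == 'or':
        return 'beta', [f[1], f[2]]
    if op == 'imp':
        return 'beta', [('not', f[1]), f[2]]
    g = f[1]                      # op == 'not'
    if g[0] == 'not':
        return 'alpha', [g[1]]                              # ¬¬A
    if g[0] == 'and':
        return 'beta', [('not', g[1]), ('not', g[2])]       # ¬(A ∧ B)
    if g[0] == 'or':
        return 'alpha', [('not', g[1]), ('not', g[2])]      # ¬(A ∨ B)
    return 'alpha', [g[1], ('not', g[2])]                   # ¬(A → B)

def closed(lits):
    return any(('not', l) in lits for l in lits)

def satisfiable(label):
    """Expand one branch at a time; True iff some leaf is open."""
    lits = {f for f in label if is_literal(f)}
    rest = [f for f in label if not is_literal(f)]
    if closed(lits):
        return False
    if not rest:
        return True    # open leaf: literals only, no complementary pair
    d, others = rest[0], rest[1:]
    kind, parts = expand(d)
    if kind == 'alpha':
        return satisfiable(set(others) | lits | set(parts))
    return any(satisfiable(set(others) | lits | {p}) for p in parts)

# (¬p ∨ ¬q) ∧ p ∧ q from the lecture's example: both leaves close.
p, q = ('atom', 'p'), ('atom', 'q')
f = ('and', ('or', ('not', p), ('not', q)), ('and', p, q))
print(satisfiable({f}))   # False
```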

Example

(¬p ∨ ¬q) ∧ p ∧ q
        ↓
(¬p ∨ ¬q), p ∧ q
        ↓
(¬p ∨ ¬q), p, q
    ↙        ↘
¬p, p, q     ¬q, p, q

Slide 33 of 36

Important properties

Soundness and completeness
The construction of a tableau always terminates and leads to a completed tableau. A completed tableau is closed iff all its leaves are closed. Let T be a completed tableau for a formula A. Then A is unsatisfiable iff T is closed.

Proving with semantic tableaux
To prove that a formula A is a tautology we construct a completed tableau for its negation ¬A. If the tableau is closed then A is a tautology (its negation is not satisfiable). Otherwise A is not a tautology.

Slide 34 of 36

Example

To prove [(p → q) → p] → p we construct a closed tableau for its negation:

[(p → q) → p] ∧ ¬p
         ↓
(p → q) → p, ¬p
    ↙           ↘
¬(p → q), ¬p     p, ¬p (closed)
    ↓
p, ¬q, ¬p (closed)

Slide 35 of 36

Solving the problem

Tableau for the negation of the formula formalizing the first task (see slide 20):

[¬dry → (slow ∨ safe)] ∧ [¬slow → sig] ∧ [(sig ∧ ¬dry) → ¬safe] ∧ ¬dry ∧ ¬slow
         ↓
¬dry → (slow ∨ safe), ¬slow → sig, (sig ∧ ¬dry) → ¬safe, ¬dry, ¬slow
    ↙           ↘
dry, . . .       slow ∨ safe, . . .
(closed)            ↙           ↘
               slow, . . .      safe, . . .
               (closed)         (left as an exercise)

Slide 36 of 36

Modeling and reasoning in propositional calculus

CUGS: Logic II

Lecture 02

Slide 1 of 32

The problem

Consider the following relationships among concepts:
(1) Students are humans enrolled in university courses.
(2) A student can be a male or a female.
(3) Good students are active and have high grades.
(4) Good students are students.

Express the resulting concept relationships (taxonomy).

Slide 2 of 32

Expressing concept relationships

Set-based semantics for propositional formulas

Let U be a set of objects and P be a set of propositional variables. The set-based semantics is defined by means of a mapping ·^U assigning to each propositional variable a subset of U (intuitively, the set of objects satisfying a given proposition).

The semantics is extended to all formulas as follows:
(¬A)^U = U − A^U
(A ∧ B)^U = A^U ∩ B^U
(A ∨ B)^U = A^U ∪ B^U
(A → B)^U holds iff A^U ⊆ B^U
(A ≡ B)^U holds iff A^U = B^U.

Slide 3 of 32
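This semantics maps directly onto Python set operations; a sketch over a toy universe (the universe U and the extensions in `ext` are illustrative, not from the lecture):

```python
# Each proposition denotes a subset of U; connectives become set operations.
U = {1, 2, 3, 4, 5, 6}
ext = {'even': {2, 4, 6}, 'small': {1, 2, 3}}

def NOT(a): return U - a              # (¬A)^U = U − A^U
def AND(a, b): return a & b           # (A ∧ B)^U = A^U ∩ B^U
def OR(a, b): return a | b            # (A ∨ B)^U = A^U ∪ B^U
def IMPLIES(a, b): return a <= b      # (A → B)^U holds iff A^U ⊆ B^U

print(OR(ext['even'], ext['small']))                          # {1, 2, 3, 4, 6}
print(IMPLIES(AND(ext['even'], ext['small']), ext['small']))  # True
```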

Expressing exemplary concept relationships

  • (1): human ∧ enr → stud
  • (2): stud → male ∨ female
  • (3): active ∧ highG → goodStud
  • (4): goodStud → stud.

Where is the problem?

  • Do the above relationships express the whole knowledge provided by (1)–(4)?
  • Consider a (propositional) database, where knowledge is expressed by means of propositional formulas and is built around concepts and their taxonomies. How would we query such a database?

Slide 4 of 32

Querying propositional databases

The method
Let ∆ be a propositional database. We are interested in checking whether a given formula A is a consequence of ∆, i.e., whether ∆ → A. We first negate ∆ → A, which results in ∆ ∧ ¬A. Therefore, we add ¬A to ∆ and try to show that this leads to a contradiction (meaning that ∆ ∪ {¬A} is unsatisfiable).

Remarks
  • ∆ is unchanged! We only modify A, which is relatively much less complex.
  • We need a good representation of formulas in ∆ and good algorithms for checking unsatisfiability.

Slide 5 of 32

NNF: Negation Normal Form

Recall that a literal is a formula of the form A or ¬A, where A is an atomic formula. A literal of the form A is called positive, and one of the form ¬A is called negative. We say that a formula A is in negation normal form, abbreviated NNF, iff it contains no connectives other than ∧, ∨, ¬, and the negation sign ¬ appears in literals only.

Examples
1. (p ∨ ¬q ∧ s) ∨ ¬t is in NNF
2. (p ∨ ¬¬q ∧ s) ∨ ¬t is not in NNF
3. r ∧ ¬q ∧ ¬s ∧ t is in NNF
4. r ∧ ¬q ∧ ¬[¬s ∧ t] is not in NNF.

Slide 6 of 32

Transforming formulas into NNF

Any propositional formula can be equivalently transformed into NNF by replacing subformulas according to the table below, until NNF is obtained.

Rule   Subformula   Replaced by
1      A ≡ B        (¬A ∨ B) ∧ (A ∨ ¬B)
2      A → B        ¬A ∨ B
3      ¬¬A          A
4      ¬(A ∨ B)     ¬A ∧ ¬B
5      ¬(A ∧ B)     ¬A ∨ ¬B

Slide 7 of 32
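Rules 1–5 translate directly into a recursive transformer; a sketch using a tuple representation of formulas of my own choosing:

```python
# Formulas as nested tuples: ('atom', n), ('not', f), ('and', a, b),
# ('or', a, b), ('imp', a, b), ('equiv', a, b).
def nnf(f):
    op = f[0]
    if op == 'atom':
        return f
    if op == 'equiv':                              # rule 1
        a, b = f[1], f[2]
        return nnf(('and', ('or', ('not', a), b), ('or', a, ('not', b))))
    if op == 'imp':                                # rule 2
        return nnf(('or', ('not', f[1]), f[2]))
    if op in ('and', 'or'):
        return (op, nnf(f[1]), nnf(f[2]))
    g = f[1]                                       # op == 'not'
    if g[0] == 'atom':
        return f
    if g[0] == 'not':                              # rule 3
        return nnf(g[1])
    if g[0] == 'or':                               # rule 4
        return ('and', nnf(('not', g[1])), nnf(('not', g[2])))
    if g[0] == 'and':                              # rule 5
        return ('or', nnf(('not', g[1])), nnf(('not', g[2])))
    return nnf(('not', nnf(g)))                    # ¬(A → B), ¬(A ≡ B)

p, q, r = ('atom', 'p'), ('atom', 'q'), ('atom', 'r')
# The lecture's example: ¬[(¬p ∧ r) → ¬(q ∨ r)]
f = ('not', ('imp', ('and', ('not', p), r), ('not', ('or', q, r))))
print(nnf(f))   # (¬p ∧ r) ∧ (q ∨ r)
```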

Transforming formulas into NNF

Example
¬[(¬p ∧ r) → (¬(q ∨ r))]
(rule 2) ⇒ ¬[¬(¬p ∧ r) ∨ (¬(q ∨ r))]
(rule 4) ⇒ ¬(¬(¬p ∧ r)) ∧ ¬(¬(q ∨ r))
(rule 3) ⇒ (¬p ∧ r) ∧ ¬(¬(q ∨ r))
(rule 3) ⇒ (¬p ∧ r) ∧ (q ∨ r).

Slide 8 of 32

CNF: Conjunctive Normal Form

  • A clause is any formula of the form A1 ∨ A2 ∨ . . . ∨ Ak, where k ≥ 1 and A1, A2, . . . , Ak are literals.
  • A Horn clause is a clause in which at most one literal is positive.
  • A formula A is in conjunctive normal form, abbreviated CNF, if it is a conjunction of clauses. It is in clausal form if it is a set of clauses (considered to be an implicit conjunction of clauses).

Slide 9 of 32

Examples

1. p ∨ q ∨ ¬r is a clause but not a Horn clause
2. ¬p ∨ q ∨ ¬r as well as ¬p ∨ ¬q ∨ ¬r are Horn clauses
3. (p ∨ q ∨ t) ∧ (s ∨ ¬t) ∧ (¬p ∨ ¬s ∨ ¬t) is in CNF
4. {(p ∨ q ∨ t), (s ∨ ¬t), (¬p ∨ ¬s ∨ ¬t)} is in clausal form
5. (p ∧ q ∨ t) ∧ (s ∨ ¬t) ∧ (¬p ∨ ¬s ∨ ¬t) is not in CNF
6. {(p ∧ q ∨ t), (s ∨ ¬t), (¬p ∨ ¬s ∨ ¬t)} is not in clausal form.

Slide 10 of 32

Why are Horn clauses important?

Any Horn clause can be transformed into the form of an implication:
[A1 ∧ A2 ∧ . . . ∧ Al] → B,
where all A1, A2, . . . , Al are positive literals and B is either a positive literal or f. Such implications are frequently used in everyday reasoning, expert systems, deductive database queries, etc.

Examples
rain → take umbrella
(snow ∧ ice) → cold weather
(starterProblem ∧ cold weather) → chargeBattery.

Slide 11 of 32
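Reasoning with such implications can be sketched as forward chaining; the rule base below encodes the slide's weather examples (atom names abbreviated by me):

```python
# Horn implications as (body, head) pairs: body true => head true.
rules = [({'rain'}, 'take_umbrella'),
         ({'snow', 'ice'}, 'cold_weather'),
         ({'starter_problem', 'cold_weather'}, 'charge_battery')]

def forward_chain(facts, rules):
    """Repeatedly fire rules whose bodies are satisfied, to a fixpoint."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if body <= facts and head not in facts:
                facts.add(head)
                changed = True
    return facts

derived = forward_chain({'snow', 'ice', 'starter_problem'}, rules)
print(derived)
```

Starting from snow, ice and a starter problem, the chain derives cold_weather and then charge_battery, but not take_umbrella.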

Non-Horn clauses

In the case of more than one positive literal, a clause is equivalent to:
[A1 ∧ A2 ∧ . . . ∧ Al] → [B1 ∨ B2 ∨ . . . ∨ Bn],
where all literals A1, . . . , Al, B1, . . . , Bn are positive. The disjunction on the right-hand side of the implication causes serious complexity problems.

Slide 12 of 32

Transforming formulas into CNF

Any propositional formula can be equivalently transformed into CNF:
1. Transform the formula into NNF.
2. Replace subformulas according to the table below, until CNF is obtained.

Rule   Subformula     Replaced by
6      (A ∧ B) ∨ C    (A ∨ C) ∧ (B ∨ C)
7      C ∨ (A ∧ B)    (C ∨ A) ∧ (C ∨ B)

Slide 13 of 32
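Rules 6–7 can be sketched as a recursive distribution, assuming the input is already in NNF (the tuple representation is mine):

```python
# NNF formulas as nested tuples; literals are ('atom', n) or ('not', atom).
def cnf(f):
    op = f[0]
    if op == 'and':
        return ('and', cnf(f[1]), cnf(f[2]))
    if op == 'or':
        a, b = cnf(f[1]), cnf(f[2])
        if a[0] == 'and':                       # rule 6: (A ∧ B) ∨ C
            return ('and', cnf(('or', a[1], b)), cnf(('or', a[2], b)))
        if b[0] == 'and':                       # rule 7: C ∨ (A ∧ B)
            return ('and', cnf(('or', a, b[1])), cnf(('or', a, b[2])))
        return ('or', a, b)
    return f                                    # literal

p, q, r = ('atom', 'p'), ('atom', 'q'), ('atom', 'r')
# (p ∧ q) ∨ r  ⇒  (p ∨ r) ∧ (q ∨ r)
print(cnf(('or', ('and', p, q), r)))
```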

Example

(¬fastFood ∧ ¬restaurant) ∨ (walk ∧ park)
(rule 6) ⇒ (¬fastFood ∨ (walk ∧ park)) ∧ (¬restaurant ∨ (walk ∧ park))
(rule 7) ⇒ (¬fastFood ∨ walk) ∧ (¬fastFood ∨ park) ∧ (¬restaurant ∨ (walk ∧ park))
(rule 7) ⇒ (¬fastFood ∨ walk) ∧ (¬fastFood ∨ park) ∧ (¬restaurant ∨ walk) ∧ (¬restaurant ∨ park)

We then have the following conjunction of implications:
fastFood → walk
fastFood → park
restaurant → walk
restaurant → park.

Slide 14 of 32

Applications of CNF

  • rule-based reasoning and expert systems
  • logic programming and deductive databases
  • tautology checking.

A formula in CNF can easily be tested for validity, since in this case we have the following simple criterion: if each clause contains a literal together with its negation, then the formula is a tautology; otherwise it is not a tautology.

Examples
1. (p ∨ q ∨ ¬p) ∧ (p ∨ ¬q ∨ r ∨ q) is a tautology
2. (p ∨ q ∨ ¬p) ∧ (p ∨ ¬q ∨ r) is not a tautology.

Slide 15 of 32
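The criterion fits in a few lines; here clauses are lists of string literals with '~' marking negation, a representation of my choosing:

```python
def neg(lit):
    """Complement of a literal: 'p' <-> '~p'."""
    return lit[1:] if lit.startswith('~') else '~' + lit

def clause_valid(clause):
    return any(neg(l) in clause for l in clause)

def cnf_tautology(clauses):
    """A CNF is a tautology iff every clause contains some l and ¬l."""
    return all(clause_valid(c) for c in clauses)

print(cnf_tautology([['p', 'q', '~p'], ['p', '~q', 'r', 'q']]))  # True
print(cnf_tautology([['p', 'q', '~p'], ['p', '~q', 'r']]))       # False
```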

Resolution rule for propositional calculus

The resolution method was introduced by Robinson (1965) and is considered one of the most powerful automated proving techniques. The resolution rule, denoted by (res), is formulated as follows:

α ∨ L,  ¬L ∨ β
――――――――――
    α ∨ β

where L is a literal and α, β are clauses. The position of L and ¬L in the clauses does not matter. The empty clause is equivalent to f.

Slide 16 of 32

Remarks

The resolution rule reflects the transitivity of implication: it can be formulated equivalently as

(¬α) → L,  L → β
――――――――――
    (¬α) → β

Presenting α, β as clauses, the resolution rule takes the form:

L1 ∨ . . . ∨ Lk ∨ L,  ¬L ∨ M1 ∨ . . . ∨ Ml
――――――――――――――――――――
L1 ∨ . . . ∨ Lk ∨ M1 ∨ . . . ∨ Ml

Slide 17 of 32

Examples

John sleeps ∨ John works,  John sleeps ∨ John doesn’t work
――――――――――――――――――――――――
John sleeps ∨ John sleeps

P ∨ Q,  ¬Q ∨ S ∨ T
――――――――――
P ∨ S ∨ T

¬Q ∨ P,  S ∨ T ∨ Q
――――――――――
P ∨ S ∨ T

Slide 18 of 32

Factorization rule for propositional calculus

Factorization rule, denoted by (fctr): remove from a clause all repetitions of literals.

Example
P ∨ P ∨ ¬Q ∨ P ∨ ¬Q
――――――――――――
P ∨ ¬Q

The resolution rule preserves satisfiability, while factorization preserves equivalence.

Slide 19 of 32

Resolution method

The method, where formula A is to be proved:
1. Transform ¬A into conjunctive normal form.
2. Try to derive the empty clause f by applying resolution (res) and factorization (fctr):
  • if the empty clause is obtained, then A is a tautology,
  • if the empty clause cannot be obtained no matter how (res) and (fctr) are applied, then conclude that A is not a tautology.

Soundness and completeness
The resolution method is sound: deriving f from ¬A proves that A is a tautology. It is complete: if a formula A is a tautology, then there is a derivation of f using the resolution method.

Slide 20 of 32

Example

Prove that the formula [(A ∨ B) ∧ (¬A)] → B is a tautology:
1. Negate: (A ∨ B) ∧ (¬A) ∧ ¬B – this formula is in CNF.
2. Derive the empty clause f:

1. A ∨ B    the first clause
2. ¬A       the second clause
3. ¬B       the third clause
4. B        (res): 1, 2
5. f        (res): 3, 4

Slide 21 of 32
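Such derivations can be automated by saturating the clause set under (res); a sketch using frozensets of string literals, which makes factorization implicit (the representation is mine):

```python
from itertools import combinations

def neg(lit):
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolvents(c1, c2):
    """All clauses obtainable from c1, c2 by one (res) step."""
    out = []
    for lit in c1:
        if neg(lit) in c2:
            out.append((c1 - {lit}) | (c2 - {neg(lit)}))
    return out

def refutable(clauses):
    """Saturate under (res); True iff the empty clause f is derived."""
    clauses = {frozenset(c) for c in clauses}
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolvents(c1, c2):
                if not r:
                    return True
                new.add(frozenset(r))
        if new <= clauses:
            return False
        clauses |= new

# Negation of [(A ∨ B) ∧ ¬A] → B, as on the slide: {A ∨ B, ¬A, ¬B}.
print(refutable([{'A', 'B'}, {'~A'}, {'~B'}]))   # True
# Clausal form of the negated second robot task: not refutable.
print(refutable([{'dry', 'slow', 'safe'}, {'slow', 'sig'},
                 {'dry', 'slow'}, {'sig'}, {'safe'}, {'~dry'}]))   # False
```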

Solving the problem from the previous lecture

Negation of the formula formalizing the first task from the previous lecture, with each conjunct converted to clausal form:

[(¬dry) → (slow ∨ safe)]     becomes  dry ∨ slow ∨ safe
∧ [(¬slow) → sig]            becomes  slow ∨ sig
∧ [(sig ∧ ¬dry) → ¬safe]     becomes  ¬sig ∨ dry ∨ ¬safe
∧ ¬dry                       becomes  ¬dry
∧ ¬slow                      becomes  ¬slow

Slide 22 of 32

Example continued

Proof of the first implication
1.  dry ∨ slow ∨ safe
2.  slow ∨ sig
3.  ¬sig ∨ dry ∨ ¬safe
4.  ¬dry
5.  ¬slow
6.  ¬sig ∨ ¬safe     (res): 3, 4
7.  slow ∨ ¬safe     (res): 2, 6
8.  slow ∨ safe      (res): 1, 4
9.  slow ∨ slow      (res): 7, 8
10. slow             (fctr): 9
11. f                (res): 5, 10

Slide 23 of 32

Example continued

The second task
{[(¬dry) → (slow ∨ safe)] ∧ [(¬slow) → sig] ∧ [(¬dry) → slow]} → [(sig ∧ ¬dry) → ¬safe]

Negated, with each conjunct converted to clausal form:
[(¬dry) → (slow ∨ safe)]    becomes  dry ∨ slow ∨ safe
∧ [(¬slow) → sig]           becomes  slow ∨ sig
∧ [(¬dry) → slow]           becomes  dry ∨ slow
∧ [sig ∧ ¬dry ∧ safe]       becomes  sig, ¬dry, safe

No matter how (res) and (fctr) are applied, f cannot be derived.

Slide 24 of 32

Testing the initial taxonomy

Database of formulas {(1), (2), (3), (4)}
human ∧ enr → stud           i.e.  ¬human ∨ ¬enr ∨ stud
stud → male ∨ female         i.e.  ¬stud ∨ male ∨ female
active ∧ highG → goodStud    i.e.  ¬active ∨ ¬highG ∨ goodStud
goodStud → stud              i.e.  ¬goodStud ∨ stud.

Exemplary queries
  • Does the database imply that good students are males or females?
  • Does the database imply that good students are humans?

Slide 25 of 32

Checking the first query

goodStud → male ∨ female (we add its negation goodStud ∧ ¬male ∧ ¬female)
1.  ¬human ∨ ¬enr ∨ stud
2.  ¬stud ∨ male ∨ female
3.  ¬active ∨ ¬highG ∨ goodStud
4.  ¬goodStud ∨ stud
5.  goodStud
6.  ¬male
7.  ¬female
8.  stud             (res): 4, 5
9.  male ∨ female    (res): 2, 8
10. female           (res): 6, 9
11. f                (res): 7, 10

Slide 26 of 32

Checking the second query

goodStud → human (we add its negation goodStud ∧ ¬human)
1. ¬human ∨ ¬enr ∨ stud
2. ¬stud ∨ male ∨ female
3. ¬active ∨ ¬highG ∨ goodStud
4. ¬goodStud ∨ stud
5. goodStud
6. ¬human
7. stud              (res): 4, 5
8. male ∨ female     (res): 2, 7

What happened? No sequence of resolution steps yields f, so we do not get the expected result. This means that the taxonomy expressed in the considered database is too weak. We should “close” the rules.

Slide 27 of 32

Correcting the taxonomy

We should extend the taxonomy by the formulas:
  • stud → human ∧ enr
  • goodStud → active ∧ highG.

(We do not have knowledge about other conditions for being a student or a good student. We reduce the lack of knowledge by the Closed World Assumption, which will be clarified during a lecture on deductive databases. This is a basic nonmonotonic technique, applied in databases, including SQL databases.)

Slide 28 of 32

The SAT problem

Definition of SAT
Given a propositional formula, check whether the formula is satisfiable, i.e., whether there is an assignment of truth values to propositions making the formula true.

Complexity
  • SAT is NPTime-complete (Cook, 1971), even for formulas in CNF and in 3-CNF (where each clause contains at most three literals).
  • There are some classes of formulas for which SAT is in PTime, for example 2-CNF and conjunctions of Horn clauses.

Slide 29 of 32
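For conjunctions of Horn clauses, the polynomial-time bound is witnessed by the classical marking (least-model) algorithm; a sketch using a (body, head) clause representation of my own, with head None standing for f:

```python
# Horn clauses as (body, head): body atoms imply head; head None means f.
def horn_sat(clauses):
    """Build the least set of forced-true atoms; unsatisfiable iff f fires."""
    true_atoms = set()
    changed = True
    while changed:
        changed = False
        for body, head in clauses:
            if set(body) <= true_atoms:
                if head is None:
                    return False      # body forced true, but head is f
                if head not in true_atoms:
                    true_atoms.add(head)
                    changed = True
    return True

# p, p → q, q → r is satisfiable; adding r → f makes it unsatisfiable.
base = [([], 'p'), (['p'], 'q'), (['q'], 'r')]
print(horn_sat(base))                      # True
print(horn_sat(base + [(['r'], None)]))    # False
```

Each pass either adds an atom or stops, so the running time is polynomial in the size of the clause set.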

GSAT and WalkSAT

The purpose
GSAT (Selman, Levesque and Mitchell, 1992) and WalkSAT (McAllester, Selman and Kautz, 1997) are local search algorithms for checking satisfiability of propositional formulas in CNF.

The idea
Both algorithms start by assigning a random value to each variable. If the assignment satisfies all clauses, the algorithm terminates, returning the assignment. Otherwise, a variable is flipped and the above is repeated until all the clauses are satisfied. WalkSAT and GSAT differ in the methods used to select which variable to flip. Both algorithms may restart with a new random assignment if no solution has been found for too long, as a way of escaping local minima in the number of unsatisfied clauses.

Slide 30 of 32

GSAT and WalkSAT continued

  • GSAT makes the change which minimizes the number of unsatisfied clauses in the new assignment.
  • WalkSAT first picks a clause which is unsatisfied by the current assignment, then flips a variable within that clause. The clause is generally picked at random among the unsatisfied clauses. The variable is picked so that the fewest previously satisfied clauses become unsatisfied, with some probability of picking one of the variables at random.
  • When picking a variable guessed to be optimal, WalkSAT has to do less calculation than GSAT because it considers fewer possibilities.

Slide 31 of 32

GSAT pseudocode

GSAT_Solver() {
    for i = 1 to MAX_TRIES {
        V = a randomly generated assignment
        for j = 1 to MAX_FLIPS {
            if (no unsatisfied clauses exist)
                return V
            else {
                x = find variable in V with max score
                flip the assignment of x
            }
        }
    }
    return 'No satisfying assignment found'
}
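The pseudocode can be fleshed out into a runnable sketch; the clause representation (lists of (variable, negated) pairs) and the tiny example formula are mine:

```python
import random

def num_satisfied(clauses, assign):
    """A clause is satisfied if some literal (v, negated) holds."""
    return sum(any(assign[v] != neg for v, neg in c) for c in clauses)

def gsat(clauses, variables, max_tries=50, max_flips=100, seed=0):
    rng = random.Random(seed)
    for _ in range(max_tries):
        assign = {v: rng.random() < 0.5 for v in variables}
        for _ in range(max_flips):
            if num_satisfied(clauses, assign) == len(clauses):
                return assign
            # score of flipping v = number of clauses satisfied afterwards
            def score(v):
                assign[v] = not assign[v]
                s = num_satisfied(clauses, assign)
                assign[v] = not assign[v]
                return s
            best = max(variables, key=score)   # GSAT: greedy best flip
            assign[best] = not assign[best]
    return None

# (p ∨ q) ∧ (¬p ∨ q) ∧ (p ∨ ¬q): satisfied exactly by p = q = True.
clauses = [[('p', False), ('q', False)],
           [('p', True), ('q', False)],
           [('p', False), ('q', True)]]
model = gsat(clauses, ['p', 'q'])
print(model)
```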

Slide 32 of 32

Quantifiers and deductive databases

CUGS: Logic II

Lecture 03

Slide 1 of 37

The problem

Given a database of direct train and bus connections, find places connected directly or indirectly (changes of buses and trains might be necessary).

  • Can you express the query using SELECT of standard SQL?
  • How to express database queries in logic?
  • What is the complexity of querying in logics?
  • What are the advantages of expressing queries in logics?

Slide 2 of 37

Syntax of predicate calculus

Let x stand for an individual variable, a for a constant symbol, and p for a relation symbol.

argument ::= x | a
argument list ::= argument | argument, argument list
atomic formula ::= p | p(argument list)
formula ::= atomic formula | ¬formula | formula ∨ formula | ∀x formula | ∃x formula

Slide 3 of 37

The scope of quantifiers

The scope of a quantifier is the portion of a formula where it binds its variables. If an occurrence of a variable x is in the scope of a quantifier over x, then that occurrence is bound by the quantifier. Free variables are occurrences of variables which are not bound by any quantifier.

Lecture 03 Slide 4 of 37

slide-19
SLIDE 19

Quantifiers and deductive databases Lecture 03

Examples

∀x [R(x) ∨ ∃y (R(x) → S(x, y))]
  – the scope of ∃y is R(x) → S(x, y); the scope of ∀x is the whole formula.

∀x [R(x) ∨ ∃y ∃x (R(x) → T(x, y, z))]
  – the scope of ∃x is R(x) → T(x, y, z); the scope of ∃y is ∃x (R(x) → T(x, y, z)); the scope of ∀x is the whole formula; z is free.

Lecture 03 Slide 5 of 37 Quantifiers and deductive databases Lecture 03

Semantics of predicate calculus

Interpretations Let U be a set of formulas such that {p1, . . . , pm} are all relation symbols and {a1, . . . , ak} are all constant symbols appearing in U. An interpretation is a triple ⟨D, {R1, . . . , Rm}, {d1, . . . , dk}⟩, where

  • D is a non-empty domain
  • Ri is an assignment of a relation on D to pi
  • di ∈ D is an assignment of an element of D to ai.

Lecture 03 Slide 6 of 37 Quantifiers and deductive databases Lecture 03

Semantics of predicate calculus

Example Interpretations for the formula ∀x[p(a, x)] can be: ⟨N, {≤}, {0}⟩, ⟨N, {≥}, {3}⟩, ⟨N, {>}, {1}⟩, where N is the set of natural numbers. In the first case ∀x[p(a, x)] is interpreted as ∀x[≤(0, x)] (or, in a bit more readable infix notation, ∀x[0 ≤ x]). Assignments Let V be the set of variables. An assignment σI : V → D is a function which maps every variable to an element of the domain of I. By σI[xi ← di] we denote the assignment the same as σI except that xi is mapped to di.

Lecture 03 Slide 7 of 37 Quantifiers and deductive databases Lecture 03

Semantics of predicate calculus

Semantics Let A be a formula, I an interpretation and σI an assignment. The value of A under σI, denoted by vσI(A) is defined by

  • let A = p(c1, . . . , cn) be an atomic formula, where each ci is

either a variable xi or a constant ai. vσI(A) = t iff (d1, . . . , dn) ∈ R, where R is the relation assigned by I to p and di ∈ D is assigned to ci either by I (if ci is a constant) or by σI (if ci is a variable)

  • vσI(¬A) = t iff vσI(A) = f
  • vσI(A1 ∨ A2) = t iff vσI(A1) = t or vσI(A2) = t
  • vσI(∀x A) = t iff vσI[x←d](A) = t for all d ∈ D
  • vσI(∃x A) = t iff vσI[x←d](A) = t for some d ∈ D.
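The five semantic clauses translate directly into a recursive evaluator. A minimal Python sketch, assuming a tuple encoding of formulas ('atom', 'not', 'or', 'forall', 'exists') that is a choice of this sketch, not notation from the lecture:

```python
# Formulas are tuples: ('atom', p, args), ('not', A), ('or', A, B),
# ('forall', x, A), ('exists', x, A).
def value(A, rel, const, dom, sigma):
    kind = A[0]
    if kind == 'atom':
        _, p, args = A
        # each argument is looked up in the interpretation (constants)
        # or in the assignment sigma (variables)
        tup = tuple(const[a] if a in const else sigma[a] for a in args)
        return tup in rel[p]
    if kind == 'not':
        return not value(A[1], rel, const, dom, sigma)
    if kind == 'or':
        return (value(A[1], rel, const, dom, sigma)
                or value(A[2], rel, const, dom, sigma))
    if kind == 'forall':
        _, x, B = A
        # sigma[x <- d] for all d in the domain
        return all(value(B, rel, const, dom, {**sigma, x: d}) for d in dom)
    if kind == 'exists':
        _, x, B = A
        return any(value(B, rel, const, dom, {**sigma, x: d}) for d in dom)
    raise ValueError(kind)
```

With the interpretation ⟨N, {≤}, {0}⟩ restricted to a finite prefix of N, the formula ∀x[p(a, x)] from the earlier example evaluates to t, and with a interpreted as 3 it evaluates to f.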

Lecture 03 Slide 8 of 37

slide-20
SLIDE 20

Quantifiers and deductive databases Lecture 03

A hierarchy of logics

It is important that variables under quantifiers represent individuals. In fact, we have the following hierarchy of logics:

0 zero-order logic (propositional), where we allow neither variables representing domain elements nor quantifiers

1 first-order logic, where we allow variables representing domain elements and quantifiers over domain elements (in this case quantifiers are called first-order quantifiers)

2 second-order logic, where we additionally allow variables representing sets of domain elements and quantifiers over sets of elements (so-called second-order quantifiers)

3 third-order logic, where we additionally allow variables representing sets of sets of domain elements and quantifiers over sets of sets of elements (so-called third-order quantifiers)

4 etc.

Lecture 03 Slide 9 of 37 Quantifiers and deductive databases Lecture 03

Higher-order formulas

Examples

1 “To like is to be pleased with”: likes → pleased

– a formula of the zero-order logic

2 “John likes everybody”: ∀x likes(John, x)

– a formula of the first-order logic

3 “John has no relationships with anybody”: ¬∃R∃x R(John, x)

– a formula of the second-order logic

4 “No matter how the word good is understood, John has no good relationships with anybody”: ∀Good ¬∃R∃x [Good(R) ∧ R(John, x)] – a formula of the third-order logic.

Lecture 03 Slide 10 of 37 Quantifiers and deductive databases Lecture 03

What is a database?

A database is any finite collection of related data. Examples

1 a figure illustrating dependencies between shapes on a plane
2 a phone book
3 a library
4 an electronic mailbox
5 a personnel database of a department
6 an accounting database
7 a flight reservation database
8 . . .

Lecture 03 Slide 11 of 37 Quantifiers and deductive databases Lecture 03

A closer look at databases

Database                      Representation in logic
an atomic data item           an element (number, letter, etc.)
                              of a domain (universe)
a record                      a fact = a tuple of atomic data items
a table                       a relation = a collection of “compatible” records
a relational database         an extensional database
= a collection of tables      = a collection of relations
+ integrity constraints       + formulas

Lecture 03 Slide 12 of 37

slide-21
SLIDE 21

Quantifiers and deductive databases Lecture 03

A closer look at databases

Database                      Representation in logic
a deductive database          a deductive database
= relational database         = extensional database
+ deduction mechanism         + intensional database (rules)
a database query              a formula
= a linguistic tool for selecting pieces of information

Integrity constraints An integrity constraint of a database is a constraint that has to be satisfied by any instance of the database.

Lecture 03 Slide 13 of 37 Quantifiers and deductive databases Lecture 03

Example: a mobile phone book

Database                          Representation in logic
atomic data:                      domains:
  first&last name, phone number     Names, Numbers
a record:                         a fact:
  {Eve Jones, 0701 334567}          PN(′Eve Jones′, 0701 334567)
a table:                          a relation:
  {Eve Jones, 0701 334567}          PN(′Eve Jones′, 0701 334567)
  {John Smith, 0701 334555}         PN(′John Smith′, 0701 334555)
  . . .                             . . .

Lecture 03 Slide 14 of 37 Quantifiers and deductive databases Lecture 03

Example: a mobile phone book – continued

Database                          Representation in logic
a relational database:            the extensional database:
  the above table + constraints,    the above relation +
  e.g., “different persons have     ∀x, y ∈ Names ∀n, m ∈ Numbers
  different phone numbers”            [(PN(x, n) ∧ PN(y, m) ∧ x ≠ y) → n ≠ m]
a database query:                 a formula:
  “find phone numbers               PN(′Eve Jones′, X)
  of Eve Jones”

Lecture 03 Slide 15 of 37 Quantifiers and deductive databases Lecture 03

The architecture of deductive databases

[Diagram: the extensional database, the intensional databases (rules) and the integrity constraints are all connected through the deduction mechanisms, which answer the queries.]

Lecture 03 Slide 16 of 37

slide-22
SLIDE 22

Quantifiers and deductive databases Lecture 03

Common assumptions

If c1, . . . , cn are all constant symbols appearing in the database, then it is assumed that:

  • UNA (Unique Names Axiom): for all 1 ≤ i ≠ j ≤ n we have that ci ≠ cj

  • DCA (Domain Closure Axiom): ∀x [x = c1 ∨ . . . ∨ x = cn]

  • CWA (Closed World Assumption): if R(ā1), . . . , R(āk) are all facts about R stored in the database then we have: ∀x̄ [R(x̄) → (x̄ = ā1 ∨ . . . ∨ x̄ = āk)].

Lecture 03 Slide 17 of 37 Quantifiers and deductive databases Lecture 03

First-order logic as a query language

Given a formula A(x1, . . . , xk) with x1, . . . , xk being all free variables in A, and a database ∆, the query selects all tuples a1, . . . , ak such that

  • a1, . . . , ak ∈ dom(∆)
  • A(a1, . . . , ak) is true in ∆ (considered as an interpretation).

Lecture 03 Slide 18 of 37 Quantifiers and deductive databases Lecture 03

Complexity of queries

Two ways to measure complexity of queries

  • data complexity, i.e., the complexity of evaluating a fixed

query for variable database inputs,

  • expression complexity, i.e., the complexity of evaluating, on

a fixed database instance, the various queries specifiable in a given query language. Data complexity is typically more relevant, so we use the term complexity to refer to data complexity.

Lecture 03 Slide 19 of 37 Quantifiers and deductive databases Lecture 03

Complexity of queries

Query recognition problem For a query Q we have the following query recognition problem: given a database D and a tuple ū, determine whether ū belongs to the answer Q(D). Complexity classes of queries For each Turing time or space complexity class C, one can define the corresponding complexity class of queries, denoted by QC. The class of queries QC consists of all queries whose recognition problem is in C. Examples: QPTime, QPSpace, QNPTime, . . .

Lecture 03 Slide 20 of 37

slide-23
SLIDE 23

Quantifiers and deductive databases Lecture 03

Data complexity of first-order queries

Fact Data complexity of first-order queries is in QPTime. In fact, it is also in QLogSpace. Assumptions

  • Dom is the database domain
  • Q1z1 . . . Qrzr A(z1, . . . , zr, x1, . . . , xk) is a first-order query

with quantifier-free A, Q1, . . . , Qr ∈ {∀, ∃} and z1, . . . , zr, x1, . . . , xk being all variables in A.

Lecture 03 Slide 21 of 37 Quantifiers and deductive databases Lecture 03

Data complexity of first-order queries

Naive algorithm (suffices to show that data complexity of first-order queries is in QPTime — how to show that it is in QLogSpace?)

1 compute the set R of (r + k)-tuples satisfying A(z1, . . . , zr, x1, . . . , xk)

2 for i = r, . . . , 1 do

  • if Qi = ∀ then
    R := {⟨b1, . . . , bk+i−1⟩ | for all x ∈ Dom, ⟨x, b1, . . . , bk+i−1⟩ ∈ R}

  • otherwise (if Qi = ∃)
    R := {⟨b1, . . . , bk+i−1⟩ | there is x ∈ Dom such that ⟨x, b1, . . . , bk+i−1⟩ ∈ R}

3 return R as the result.
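A direct Python rendering of the naive algorithm; in this sketch tuples are kept as (x1, …, xk, z1, …, zr), so each elimination step drops the last (innermost) quantified position — a layout choice of the sketch, not of the slide.

```python
from itertools import product

def eval_prefix_query(dom, quants, matrix, k):
    # quants: the quantifiers Q1..Qr for z1..zr, each 'forall' or 'exists'
    # matrix: the quantifier-free A, called as matrix(x1..xk, z1..zr)
    # k: the number of free variables x1..xk
    r = len(quants)
    # step 1: all (k + r)-tuples satisfying the quantifier-free matrix
    R = {t for t in product(dom, repeat=k + r) if matrix(*t)}
    # step 2: eliminate quantified positions for i = r, ..., 1
    for i in range(r - 1, -1, -1):
        if quants[i] == 'forall':
            R = {t for t in product(dom, repeat=k + i)
                 if all(t + (d,) in R for d in dom)}
        else:
            R = {t for t in product(dom, repeat=k + i)
                 if any(t + (d,) in R for d in dom)}
    return R  # step 3: the answer relation over x1..xk
```

For instance, over Dom = {0, 1, 2} with matrix x ≤ z, the query ∀z A selects only x = 0, while ∃z A selects the whole domain.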

Lecture 03 Slide 22 of 37 Quantifiers and deductive databases Lecture 03

Datalog

Extensional databases, integrity constraints and queries are present in traditional database management systems. Deductive databases offer means to store deductive rules (intensional databases) and provide a deduction machinery. The basic language for deductive databases is Datalog. Facts in Datalog Facts in Datalog are represented in the form of relations name(arg1, . . . , argk), where name is a name of a relation and arg1, . . . , argk are constants. Examples

1 address(John, ′Gloucester Road 12′)
2 likes(John, Marc).

Lecture 03 Slide 23 of 37 Quantifiers and deductive databases Lecture 03

Atomic queries about facts

Atomic queries are of the form name(arg1, . . . , argk), where arg1, . . . , argk are constants or variables; ‘_’ is also a variable (a nameless, “do-not-care” variable). Examples

1 likes(John, Marc) – does John like Marc?
2 likes(X, Marc) – who likes Marc? (compute the X’s satisfying likes(X, Marc))
3 likes(John, X) – whom does John like?
4 likes(X, Y ) – compute all pairs X, Y such that likes(X, Y ) holds.

Lecture 03 Slide 24 of 37

slide-24
SLIDE 24

Quantifiers and deductive databases Lecture 03

Rules in Datalog

Rules in Datalog are expressed in the form of (a syntactic variant of) Horn clauses:

R(Z̄) :− R1(Z̄1), . . . , Rk(Z̄k)

where Z̄, Z̄1, . . . , Z̄k are vectors of variable or constant symbols such that any variable appearing on the lefthand side of ‘:−’ (called the head of the rule) appears also on the righthand side of the rule (called the body of the rule).

Lecture 03 Slide 25 of 37 Quantifiers and deductive databases Lecture 03

Rules in Datalog

Semantics The intended meaning of the rule is that [R1(Z̄1) ∧ . . . ∧ Rk(Z̄k)] → R(Z̄), where all variables that appear both in the rule’s head and body are universally quantified, while those appearing only in the rule’s body are existentially quantified. Example Rule: R(X, c) :− Q(X, Z), S(Z, X) denotes the implication: ∀X {∃Z [Q(X, Z) ∧ S(Z, X)] → R(X, c)}.

Lecture 03 Slide 26 of 37 Quantifiers and deductive databases Lecture 03

Datalog queries

A Datalog query is a conjunction of literals (conjunction is denoted by comma). Assumptions UNA, DCA and CWA allow one to formally define the meaning of queries as minimal relations which make queries logical consequences of facts and rules stored in the database. Intuitively,

  • if a query does not have free variables, then the answer is either t or f, depending on the database contents

  • if a query contains variables, then the result consists of all assignments of values to variables making the query t.

Examples

1 R(X,a) returns all X’s satisfying R(X, a)
2 R(X,Y), S(X) returns all pairs X, Y satisfying the conjunction R(X, Y ) ∧ S(X).

Lecture 03 Slide 27 of 37 Quantifiers and deductive databases Lecture 03

Example: authors database

The scenario Consider a database containing information about book authors. It is reasonable to store authors in one relation, say A, and book titles in another relation, say T. The intended meaning of relations is the following:

  • A(N, Id) means that the author whose name is N has identification number Id

  • T(Id, Ttl) means that the author whose identification number

is Id is a (co-) author of the book entitled Ttl.

Lecture 03 Slide 28 of 37

slide-25
SLIDE 25

Quantifiers and deductive databases Lecture 03

Example: authors database – continued

The scenario Now, for example, one can define a new relation Co(N1, N2), meaning that authors N1 and N2 co-authored a book:

Co(N1,N2) :- A(N1,Id1), T(Id1,B), A(N2,Id2), T(Id2,B), N1 ≠ N2.

Note that the above rule reflects the intended meaning provided that authors have unique Id numbers, i.e., that the following integrity constraint holds: ∀n, m, i, j {[(A(n, i) ∧ A(n, j)) → i = j] ∧ [(A(n, i) ∧ A(m, i)) → n = m]}
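Evaluating the Co rule amounts to a join over A and T. In the Python sketch below the book titles and the third author are invented sample data (Widom and Ullman appear on the next slide).

```python
# A(N, Id) and T(Id, Ttl) as sets of tuples; the titles are invented.
A = {('Widom', 1), ('Ullman', 2), ('Garcia-Molina', 3)}
T = {(1, 'Database Systems'), (2, 'Database Systems'), (2, 'Compilers')}

# Co(N1,N2) :- A(N1,Id1), T(Id1,B), A(N2,Id2), T(Id2,B), N1 =/= N2.
# Shared rule variables become equality conditions of the join.
Co = {(n1, n2)
      for (n1, id1) in A for (i1, b1) in T
      for (n2, id2) in A for (i2, b2) in T
      if id1 == i1 and id2 == i2 and b1 == b2 and n1 != n2}
```

As the rule is symmetric in N1 and N2, the computed relation contains each co-authorship in both orders.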

Lecture 03 Slide 29 of 37 Quantifiers and deductive databases Lecture 03

Example: authors database – continued

Querying the author database In the authors database:

  • Co(‘Widom’,‘Ullman’) returns t iff the database contains a

book co-authored by Widom and Ullman

  • Co(N,‘Ullman’) returns a unary relation consisting of all co-authors of Ullman.

First-order logic as a query language In fact, the query language can easily be made more general by allowing all first-order formulas to be database queries. For example, the query ∀X, Y [T(X, Y ) → ∃Z(T(X, Z) ∧ Z ≠ Y )] returns t iff any person in the database is an author of at least two books.

Lecture 03 Slide 30 of 37 Quantifiers and deductive databases Lecture 03

Solving the initial problem: train and bus connections

Assume we have relations:

  • bus(P1, P2) meaning that there is a direct bus connection between places P1 and P2

  • train(P1, P2) meaning that there is a direct train connection between places P1 and P2.

We define a new relation connected(P1, P2) meaning that places P1 and P2 are connected by train/bus connections, perhaps indirectly (changes of busses and trains might be necessary):

connected(P1, P2) ≡def [bus(P1, P2) ∨ train(P1, P2) ∨ ∃Q(connected(P1, Q) ∧ connected(Q, P2))].

Lecture 03 Slide 31 of 37 Quantifiers and deductive databases Lecture 03

Example: train and bus connections – continued

Rules connected(P1,P2):- bus(P1,P2). connected(P1,P2):- train(P1,P2). connected(P1,P2):- connected(P1,Q),connected(Q,P2).
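A naive bottom-up evaluation of these three rules — keep adding derivable tuples until nothing new appears — can be sketched in Python as follows (the toy data in the usage below is invented):

```python
def connected(bus, train):
    # base rules: every direct bus or train connection is connected
    R = set(bus) | set(train)
    changed = True
    while changed:  # recursive rule, iterated to a fixpoint
        changed = False
        for (p1, q1) in list(R):
            for (q2, p2) in list(R):
                # connected(P1,P2) :- connected(P1,Q), connected(Q,P2).
                if q1 == q2 and (p1, p2) not in R:
                    R.add((p1, p2))
                    changed = True
    return R
```

For example, with a bus from a to b and a train from b to c, the pair (a, c) is derived while (c, a) is not, since the connections are directed.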

Lecture 03 Slide 32 of 37

slide-26
SLIDE 26

Quantifiers and deductive databases Lecture 03

Example

The scenario Design a Datalog database for storing information about events. Each event is characterized by its name, date and duration. In addition, for each event the database contains information about a number of related events. Formulate in logic queries selecting:

1 all events later than 2006-02-03, whose duration is greater than 10 and which are related to an event named “project evaluation”

2 all events related (maybe indirectly) to a given event (a given event E is indirectly related to an event F if there are events E1, . . . , Ek with k ≥ 1 such that E is related to E1, E1 is related to E2, . . ., Ek is related to F).

Lecture 03 Slide 33 of 37 Quantifiers and deductive databases Lecture 03

Example – continued

Relations

  • Event(N,D,U) – meaning that an event named N occurred on date D and its duration was U

  • Related(N1,N2) – meaning that events N1 and N2 are related to each other.

Integrity constraints

  • Related(N1, N2) → Related(N2, N1)
  • ∀N∀D∀U∀D′∀U′ [(Event(N, D, U) ∧ Event(N, D′, U′)) → (D = D′ ∧ U = U′)]

Lecture 03 Slide 34 of 37 Quantifiers and deductive databases Lecture 03

Example – continued

Queries

1 Event(N,D,U), D>‘2006-02-03’, U>10, Related(N,‘project evaluation’).

2 We need a new relation “related directly or indirectly”:

diRelated(N1,N2) :- Related(N1,N2).
diRelated(N1,N2) :- Related(N1,N3), diRelated(N3,N2).

The required query: diRelated(N1,given), where given is the name of the given event.

Lecture 03 Slide 35 of 37 Quantifiers and deductive databases Lecture 03

Data and expression complexity of Datalog queries

Data complexity Data complexity of Datalog queries is in QPTime (and so also in QPSpace). Expression complexity Expression complexity of first-order and Datalog queries is in QExpTime.

Lecture 03 Slide 36 of 37

slide-27
SLIDE 27

Quantifiers and deductive databases Lecture 03

Data complexity of some other formalisms

Second-order queries

  • Data complexity of existential second-order queries is QNPTime (Fagin 1974).

  • Data complexity of second-order queries is in QPSpace (Stockmeyer 1977 – in fact he proved that it corresponds to the polynomial-time hierarchy).

Remark To prove Fagin’s theorem encode a problem known to be NPTime-complete. Hint: use 3-colorability of a graph.

Lecture 03 Slide 37 of 37 Deductive databases: negation, fixpoint logics

CUGS: Logic II

Lecture 04

Lecture 04 Slide 1 of 39 Deductive databases: negation, fixpoint logics Lecture 04

The problem

  • Given a database of direct train and bus connections, find places connected directly or indirectly by train but not connected (directly or indirectly) by bus.

  • Given a database of family parent-child relationships, check whether person A is not an ancestor of person B.

The problem is to express the above queries and other queries where negation is involved.

Lecture 04 Slide 2 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Rules in Datalog (a reminder)

Rules in Datalog: R(Z̄) :− R1(Z̄1), . . . , Rk(Z̄k) do not allow the use of negation. Remark Datalog is insufficient when negation is involved.

Lecture 04 Slide 3 of 39

slide-28
SLIDE 28

Deductive databases: negation, fixpoint logics Lecture 04

Datalog¬ and Datalog¬¬

Datalog¬ Rules: R(Z̄) :− ±R1(Z̄1), . . . , ±Rk(Z̄k), where ± is either negation or the empty symbol. Datalog¬¬ Rules: ±R(Z̄) :− ±R1(Z̄1), . . . , ±Rk(Z̄k), where ± is either negation or the empty symbol.

Lecture 04 Slide 4 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Problems with negation in heads of rules

Example ¬safe(X):− hot(X). safe(X):− guarded(X). hot(a). guarded(a). We have a contradiction, which trivializes reasoning in classical first-order logic.

Lecture 04 Slide 5 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Problems with negation in heads of rules

Complexity R(X):− ¬G(X), ¬B(X). G(X):− ¬R(X), ¬B(X). B(X):− ¬R(X), ¬G(X). ¬R(X):− R(Y ), E(X, Y ). ¬G(X):− G(Y ), E(X, Y ). ¬B(X):− B(Y ), E(X, Y ). Observe that the above rules express 3-colorability of a graph, an NPTime-complete problem.

Lecture 04 Slide 6 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Problems with negation in bodies of rules

Potential nondeterminism Observe that: [(A ∧ ¬B) → C] ≡ [A → (B ∨ C)]. Semantics? p :− ¬p, ¬r. The result is p (obtained due to CWA). On the other hand, the body of the rule is no longer true, so the conclusion p is no longer supported. Why has the simpler rule p :− ¬p. not been used instead?

Lecture 04 Slide 7 of 39

slide-29
SLIDE 29

Deductive databases: negation, fixpoint logics Lecture 04

Semipositive Datalog

Definition Semipositive Datalog allows negation in literals in bodies of rules, assuming that the negated literals refer to relations of the extensional database. Remark Negation is “artificial”: by CWA one can add to the extensional database the complements of relations corresponding to negated literals. On the other hand, one cannot express negation without extending the extensional database in a similar way.

Lecture 04 Slide 8 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Example

Let G be a binary relation stored in the extensional part of a database. The rules: T(X, Y ) :− ¬G(X, Y ). T(X, Y ) :− ¬G(X, Z), T(Z, Y ). compute the transitive closure of the complement of G. Remark The above semipositive program is expressible neither in first-order logic nor in Datalog.

Lecture 04 Slide 9 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Stratified Datalog¬

Intuition

  • Generalize semipositive programs in the following way: the program is supposed to be divided into strata, where the first stratum consists of the extensional database, and negation can be applied in a given stratum to literals corresponding to relations fully defined in earlier strata.

  • When a given stratum is considered, all earlier strata can be considered as the extensional database, as in the case of semipositive programs.

Lecture 04 Slide 10 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Stratified Datalog¬

Definition A stratification of a Datalog¬ program P is a sequence of Datalog¬ programs P1, . . . , Pn such that there is a mapping σ from the set of predicates of P to {1, . . . , n} satisfying:

  • P1, . . . , Pn is a partition of P
  • for each predicate R, all the rules in P defining R are in Pσ(R)

(i.e., in the same program of the partition)

  • if R(u):− . . . R′(v) . . . is a rule in P and R′ is an intensional

relation then σ(R′) ≤ σ(R)

  • if R(u):− . . . ¬R′(v) . . . is a rule in P and R′ is an intensional

relation then σ(R′) < σ(R).

Lecture 04 Slide 11 of 39

slide-30
SLIDE 30

Deductive databases: negation, fixpoint logics Lecture 04

Stratified Datalog¬

Definition

  • Given a stratification P1, . . . , Pn of P, each Pi is called a stratum.

  • A Datalog¬ program P is stratifiable iff there is a stratification of P.

Fact Checking whether a program is stratifiable can be done in PTime.

Lecture 04 Slide 12 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Solving the initial problem: train connections without bus connections

Assume we have relations:

  • bus(P1, P2) meaning that there is a direct bus connection

between places P1 and P2

  • train(P1, P2) meaning that there is a direct train connection

between places P1 and P2. We will first define new relations connectedB(P1, P2) and connectedT(P1, P2) meaning that places P1 and P2 are connected by busses (respectively, trains) perhaps indirectly and then use them in the final query.

Lecture 04 Slide 13 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Solving the initial problem – continued

Rules connectedB(P1, P2):− bus(P1, P2). connectedB(P1, P2):− bus(P1, Q), connectedB(Q, P2). connectedT(P1, P2):− train(P1, P2). connectedT(P1, P2):− train(P1, Q), connectedT(Q, P2). query(P1, P2):− connectedT(P1, P2), ¬connectedB(P1, P2).
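The stratification shows up directly in a bottom-up evaluation: the two closure relations form the first intensional stratum, and only then is the negated literal of the final rule applied. A Python sketch with invented relation contents:

```python
def reach(direct):
    # transitive closure of a set of (P1, P2) pairs, iterated to a fixpoint
    R = set(direct)
    while True:
        new = {(a, d) for (a, b) in R for (c, d) in R if b == c} - R
        if not new:
            return R
        R |= new

bus = {('x', 'y')}
train = {('x', 'y'), ('y', 'z')}
# stratum 1: connectedB and connectedT; stratum 2: the rule with negation,
# computed as a set difference once stratum 1 is complete
query = reach(train) - reach(bus)
```

Here x and y are also bus-connected, so only the pairs reachable by train alone, (y, z) and (x, z), survive the negation.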

Lecture 04 Slide 14 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Data complexity of stratified Datalog¬

Stratified Datalog¬ queries

  • Data complexity of stratified Datalog¬ queries is QPTime.
  • Assuming that the database domain is linearly ordered,

Datalog¬ captures QPTime (i.e., every PTime-computable query can be expressed as a Datalog¬ query).

Lecture 04 Slide 15 of 39

slide-31
SLIDE 31

Deductive databases: negation, fixpoint logics Lecture 04

What are Fixpoints?

Given a mapping f : 2^Dom → 2^Dom, where 2^Dom denotes the powerset of Dom, i.e., the set of all subsets of Dom:

  • by a fixpoint of f, if it exists, we understand A ⊆ Dom such that A = f(A)

  • by the least and the greatest fixpoints of f, if they exist, we mean respectively fixpoints A, B ⊆ Dom such that for all fixpoints C of f, we have that A ⊆ C and C ⊆ B

  • the least fixpoint of f, if it exists, is denoted by Lfp X.f(X) and the greatest fixpoint of f, if it exists, is denoted by Gfp X.f(X).

Lecture 04 Slide 16 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Examples

1 Let f(A) def= −A. Then f has no fixpoint — why?

2 Let f(A) def= ∅ for A = ∅, and −A otherwise. Then the least and the greatest fixpoint of f is ∅.

3 Let f : 2^ω → 2^ω be defined by f(A) def= A for A ⊊ ω, and ∅ for A = ω. Then the least fixpoint of f is ∅ and the greatest fixpoint of f does not exist (why?).

Lecture 04 Slide 17 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Monotone mappings

Given a set Dom and a mapping f : 2^Dom → 2^Dom, we say that f is monotone provided that for any A, B ⊆ Dom, A ⊆ B implies f(A) ⊆ f(B). Examples

1 f(A) def= A is monotone

2 f(A) def= B ∩ A is monotone

3 let A ⊆ ω. Then f(A) def= A ∪ {min(ω − A)} is monotone.

Lecture 04 Slide 18 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Further examples

1 none of the mappings considered in the examples given in slide 17 is monotone — why?

2 recall that every classical first-order formula represents a set of tuples satisfying the formula (elements of tuples correspond to free variables). We then have:

  • f(X(x, y)) def≡ x = y ∨ ∃z[R(x, z) ∧ X(z, y)] — is monotone

  • f(X(x, y)) def≡ x = y ∨ ∃z[R(x, z) ∧ ¬X(z, y)] — is not monotone

Lecture 04 Slide 19 of 39

slide-32
SLIDE 32

Deductive databases: negation, fixpoint logics Lecture 04

Knaster-Tarski Theorem

Theorem [Knaster-Tarski] Let f : 2^Dom → 2^Dom be a monotone mapping. Then the least and the greatest fixpoints of f exist. Remark In what follows we shall prove the theorem in the case of finite domains only. The general case is not difficult, but we do not need it here. As a side effect of the simplified proof we obtain the Kleene characterization of fixpoints (in the general case valid for continuous mappings).

Lecture 04 Slide 20 of 39 Deductive databases: negation, fixpoint logics Lecture 04

The case of finite domains

Proof of Knaster-Tarski theorem Let Dom be a finite domain and let f be monotone. We first prove that ⋃_{i∈ω} f^i(∅) is a fixpoint. Under our assumptions:

∅ ⊆ f(∅) — since ∅ is the least element of 2^Dom
f(∅) ⊆ f(f(∅)) — by monotonicity of f
. . .
f^k(∅) ⊆ f^{k+1}(∅) for any k ∈ ω.

Dom is finite. For any k ∈ ω, either f^k(∅) = f^{k+1}(∅) or f^{k+1}(∅) contains at least one element which is not in f^k(∅). By finiteness of Dom, this can happen at most a finite number of times. Therefore ⋃_{i∈ω} f^i(∅) is a fixpoint of f.
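The proof is constructive: iterate f from ∅ until nothing changes. A minimal sketch (the particular f in the usage below is an invented monotone mapping over {0, …, 5}):

```python
def lfp(f):
    # Iterate f from the empty set. Over a finite domain this reaches
    # some f^k(emptyset) = f^(k+1)(emptyset), which by the argument
    # above is the least fixpoint, provided f is monotone.
    A = frozenset()
    while True:
        B = frozenset(f(A))
        if B == A:
            return A
        A = B
```

For instance, for the monotone f(A) = {0} ∪ {x + 1 | x ∈ A, x + 1 ≤ 3}, the iteration ∅, {0}, {0,1}, {0,1,2}, {0,1,2,3} stabilizes at {0,1,2,3}.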

Lecture 04 Slide 21 of 39 Deductive databases: negation, fixpoint logics Lecture 04

The case of finite domains

Proof of Knaster-Tarski theorem – continued Let us prove that ⋃_{i∈ω} f^i(∅) is the least fixpoint. Let B be another fixpoint of f, i.e., f(B) = B. Now:

∅ ⊆ B
f(∅) ⊆ f(B) = B — by monotonicity of f
. . .
f^k(∅) ⊆ f^k(B) = B for any k ∈ ω.

Therefore ⋃_{k∈ω} f^k(∅) ⊆ B.

Lecture 04 Slide 22 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Fixpoints defined by means of first-order formulas

Set-theoretical concept        Representation in logic
a set A                        characteristic formula A(x): A(x) = t iff x ∈ A
the empty set ∅                f
the whole domain Dom           t
inclusion ⊆                    implication →
equality =                     equivalence ≡
complement −                   negation ¬
intersection ∩                 conjunction ∧
union ∪                        disjunction ∨

Lecture 04 Slide 23 of 39

slide-33
SLIDE 33

Deductive databases: negation, fixpoint logics Lecture 04

Examples

1 Let f(P) def= (P ∪ Q). Then the corresponding definition in the classical first-order logic is f(P(x)) def≡ (P(x) ∨ Q(x)). The least and the greatest fixpoints of f are:

  • the least fixpoint: f(f) ≡ (f ∨ Q(x)) ≡ Q(x), f²(f) ≡ f(f(f)) ≡ f(Q(x)) ≡ (Q(x) ∨ Q(x)) ≡ Q(x) (i.e., Q(x) is the least fixpoint)

  • the greatest fixpoint: f(t) ≡ (t ∨ Q(x)) ≡ t (i.e., t is the greatest fixpoint).

2 Let f(P(x)) def≡ R(x, x) ∨ ∃y[R(x, y) ∧ P(y)] be a mapping over first-order formulas ordered by implication — what are the least and the greatest fixpoints of f?

Lecture 04 Slide 24 of 39 Deductive databases: negation, fixpoint logics Lecture 04

The duality of the least and greatest fixpoints

Least and greatest fixpoints are dual to each other. For every function f : 2^Dom → 2^Dom we define the dual fixpoint operator f^d(X) = −f(−X). If f is monotone then f^d is also monotone and

Lfp X.f(X) = −Gfp X.f^d(X)
Gfp X.f(X) = −Lfp X.f^d(X).

Therefore:

Gfp X.f(X) = −Lfp X.f^d(X) = −⋃_{i∈ω} (f^d)^i(∅) = −⋃_{i∈ω} −f^i(−∅) = ⋂_{i∈ω} f^i(Dom)

Lecture 04 Slide 25 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Examples

1 Let f(X(x)) def≡ R(x, x) ∨ ∃y[R(x, y) ∧ X(y)]. Then
f^d(X(x)) def≡ ¬(R(x, x) ∨ ∃y[R(x, y) ∧ ¬X(y)]) ≡ ¬R(x, x) ∧ ∀y[¬R(x, y) ∨ X(y)]

2 Let f(X) def= R ∨ X. Then f^d(X) def= ¬f(¬X) = ¬R ∧ X. Now:

  • Lfp X.f^d(X) = f
  • Gfp X.f^d(X) = ¬R
  • Lfp X.f(X) = R = ¬Gfp X.f^d(X)
  • Gfp X.f(X) = t = ¬Lfp X.f^d(X).
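For finite domains the duality can be checked mechanically. The sketch below rereads the slide's f(X) = R ∨ X as the set mapping f(X) = R ∪ X over an invented four-element domain:

```python
Dom = frozenset(range(4))
R = frozenset({1, 2})

def f(X):
    return R | X                 # f(X) = R ∪ X

def fd(X):
    return Dom - f(Dom - X)      # f^d(X) = −f(−X) = −R ∩ X

def lfp(g):
    # least fixpoint: iterate from the empty set
    A = frozenset()
    while g(A) != A:
        A = frozenset(g(A))
    return A

def gfp(g):
    # greatest fixpoint: iterate from the whole domain
    A = Dom
    while g(A) != A:
        A = frozenset(g(A))
    return A
```

With t and f read as Dom and ∅, the four equalities of the second example, and both duality laws, hold on this domain.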

Lecture 04 Slide 26 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Positive and monotone formulas

  • A literal is either an atomic formula (then it is also called the

positive literal) or its negation (then it is also called the negative literal).

  • A classical first-order formula is in negation normal form if the negation sign ¬ appears in literals only (if at all) and it contains no other connectives than ∧, ∨, ¬.

  • A formula A is positive wrt a given atomic formula S iff S

appears in A only in the form of a positive literal.

  • A formula A is negative wrt a given atomic formula S iff S

appears in A only in the form of a negative literal.

  • A formula A(S) is monotone wrt an atomic formula S iff for

any formulas B, C, if B → C is true then A(B) → A(C) is true, too.

Lecture 04 Slide 27 of 39

slide-34
SLIDE 34

Deductive databases: negation, fixpoint logics Lecture 04

Examples

  • P(x) ∨ ¬R(x, y, z) ∧ ¬S(x, y) is positive wrt P and negative

wrt R and S

  • P(x) ∨ [R(x, y, z) ∧ ¬P(y)] is neither positive wrt P nor

negative wrt P

  • P(x) ∧ ¬P(x) is monotone wrt P, but not positive wrt P.

Lemma Any formula of the classical propositional or first-order logic positive wrt a given atomic formula S is also monotone wrt S.

Lecture 04 Slide 28 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Fixpoint calculus (fixpoint logic)

Definition By the monotone fixpoint logic we mean an extension of classical first-order logic, obtained by also allowing fixpoint formulas of the form Lfp X(x̄).A(X) and Gfp X(x̄).A(X), where A(X) is a monotone first-order or fixpoint formula. Important Testing first-order formulas for monotonicity is undecidable. Therefore the fixpoint logic much more frequently applied in practice is the positive fixpoint logic.

Lecture 04 Slide 29 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Positive fixpoint logic

Definition By the positive fixpoint logic (or fixpoint logic, for short) we mean an extension of the classical first-order logic, obtained by also allowing fixpoint formulas of the form Lfp X(x̄).A(X) and Gfp X(x̄).A(X), where A(X) is a positive first-order or fixpoint formula.

Lecture 04 Slide 30 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Semantics of fixpoint calculus

The semantics of Lfp X.f(X) and Gfp X.f(X) is the least and the greatest fixpoint of f(X), i.e., the least and the greatest relation X such that X ≡ f(X). Since f is assumed monotone or positive w.r.t. X, such fixpoints exist. More precisely Let a relational structure be fixed. Any valuation v of free variables can be extended to a valuation assigning Boolean values to fixpoint formulas as follows: v(Lfp X(x̄).A(X)) = the least (w.r.t. ⊆) relation S such that S(x̄) ≡ v(A(S)), and v(Gfp X(x̄).A(X)) = the greatest (w.r.t. ⊆) relation S such that S(x̄) ≡ v(A(S)).

Lecture 04 Slide 31 of 39

slide-35
SLIDE 35

Deductive databases: negation, fixpoint logics Lecture 04

Examples

  • The transitive closure of a binary relation R can be defined by the following fixpoint formula:
    Tc[R(x, y)] ≡ Lfp X(x, y).[R(x, y) ∨ ∃z(R(x, z) ∧ X(z, y))].
  • The transitive closure of the complement of R is now trivial:
    Tc′[R(x, y)] ≡ Lfp X(x, y).[¬R(x, y) ∨ ∃z(¬R(x, z) ∧ X(z, y))].
    Note that the formula under Lfp X(x, y) is positive w.r.t. X. Negations applied to predicates not bound by fixpoint operators are allowed.
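The least fixpoint defining Tc can be computed by naive iteration from the empty relation; a minimal Python sketch, with the relation represented as a set of pairs (all names are illustrative):

```python
def transitive_closure(R):
    """Naive least-fixpoint iteration for
    Tc[R] = Lfp X.[R(x, y) or exists z (R(x, z) and X(z, y))]."""
    X = set()                   # start from the empty relation (bottom)
    while True:
        step = set(R) | {(x, y) for (x, z) in R for (w, y) in X if w == z}
        if step == X:           # fixpoint reached
            return X
        X = step

print(transitive_closure({(1, 2), (2, 3)}))
```

Because the operator is monotone, the iterates only grow, so the loop terminates on any finite relation.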

Lecture 04 Slide 32 of 39 Deductive databases: negation, fixpoint logics Lecture 04

The initial problem revisited: train connections without bus connections

Recall that we are given relations bus(P1, P2) and train(P1, P2). Again we first define relations connectedB(P1, P2) and connectedT(P1, P2):
connectedB(P1, P2) ≡ Lfp X(P1, P2).[bus(P1, P2) ∨ ∃P(bus(P1, P) ∧ X(P, P2))]
connectedT(P1, P2) ≡ Lfp X(P1, P2).[train(P1, P2) ∨ ∃P(train(P1, P) ∧ X(P, P2))]
Query: connectedT(P1, P2) ∧ ¬connectedB(P1, P2).
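With a closure operation in hand, the query itself is just a set difference; a hypothetical sketch with toy data (the bus and train relations below are invented for illustration):

```python
def closure(R):
    """Least-fixpoint iteration computing the transitive closure of R."""
    X = set()
    while True:
        step = set(R) | {(x, y) for (x, z) in R for (w, y) in X if w == z}
        if step == X:
            return X
        X = step

# toy data, purely illustrative
bus = {("a", "b")}
train = {("a", "b"), ("b", "c")}

# connectedT(P1, P2) and not connectedB(P1, P2)
answer = closure(train) - closure(bus)
print(answer)
```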

Lecture 04 Slide 33 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Example

Assume we are given a unary relation Wise and a binary relation Colleague defined on the set Persons, and suppose we want to calculate the relation Wisest as the greatest relation satisfying the following constraint, meaning that the Wisest are those who are wise and have only wisest colleagues: ∀x [Wisest(x) → (Wise(x) ∧ ∀y(Colleague(x, y) → Wisest(y)))]. The Wisest relation is defined by the fixpoint formula Gfp X(x).[Wise(x) ∧ ∀y(Colleague(x, y) → X(y))], since we are interested in calculating the greatest such relation.
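A greatest fixpoint is computed dually: iterate downward from the full set of persons. A minimal sketch with invented toy data:

```python
def wisest(persons, wise, colleague):
    """Greatest fixpoint of X(x) = Wise(x) and every colleague of x is in X,
    computed by shrinking from the top element (all persons)."""
    X = set(persons)
    while True:
        step = {x for x in X
                if x in wise and all(y in X for (p, y) in colleague if p == x)}
        if step == X:
            return X
        X = step

# toy data: a and b are wise and are each other's colleagues; c is not wise
print(wisest({"a", "b", "c"}, {"a", "b"}, {("a", "b"), ("b", "a"), ("c", "a")}))
```

Starting from the top rather than the bottom is what distinguishes Gfp from Lfp computationally.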

Lecture 04 Slide 34 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Example

Winning strategy
Consider a game with states a, b, .... The game is between two players. The possible moves of the game are stored in a binary relation M. A tuple M(a, b) indicates that when in state a, one can choose to move to state b. A player loses if he or she is in a state from which there are no moves. The goal is to compute the set of winning states (i.e., the set of states such that there exists a winning strategy for a player in this state). These are defined by a unary predicate W. The winning states are then defined by:
W(x) ≡ Lfp X(x).{∃y[M(x, y) ∧ ∀z(M(y, z) → X(z))]}.
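The least-fixpoint definition of W translates directly into an iterative computation; a minimal Python sketch over a toy game (states and moves are invented for illustration):

```python
def winning(states, M):
    """Least fixpoint of
    W(x) = exists y [M(x, y) and forall z (M(y, z) -> W(z))]:
    x is winning if some move leads to a state all of whose moves
    land back in already-winning states."""
    W = set()
    while True:
        step = {x for x in states
                if any((x, y) in M
                       and all(z in W for (p, z) in M if p == y)
                       for y in states)}
        if step == W:
            return W
        W = step

# toy game: 1 -> 2 -> 3, no moves from 3; only 2 can win (move to dead-end 3)
print(winning({1, 2, 3}, {(1, 2), (2, 3)}))
```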

Lecture 04 Slide 35 of 39

slide-36
SLIDE 36

Deductive databases: negation, fixpoint logics Lecture 04

Complexity of fixpoint calculus

1 Checking whether a fixpoint formula is satisfiable or whether it is a tautology are not partially computable problems.

2 Given a fixpoint formula, checking its satisfiability or validity over a given finite domain relational structure is in PTime w.r.t. the size of the structure.

3 Data complexity of fixpoint logics (positive as well as monotone) is in QPTime.

4 Assuming that the database domain is linearly ordered, fixpoint logics (positive as well as monotone) capture QPTime.

Lecture 04 Slide 36 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Proof of 2 (sketch)

In order to see that satisfiability or validity over a given finite domain relational structure is in PTime, assume that a given fixpoint formula defines a k-argument relation R(x1, . . . , xk) ≡ Lfp X(x1, . . . , xk).f(X, x1, . . . , xk) and that the domain contains n elements. It follows that there are at most n^k tuples in R. As indicated in slide 21, R(x1, . . . , xk) ≡ ⋁_{i∈ω} f^i(f, x1, . . . , xk), where the sequence ⟨f^i(f, x1, . . . , xk) | i ∈ ω⟩ is non-decreasing. The number of iterations needed to compute the fixpoint is not greater than n^k (each iteration either stabilizes at the least fixpoint, or adds at least one tuple). Each iteration (by an inductive argument) requires at most polynomial time to calculate its result.

Lecture 04 Slide 37 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Translating Datalog into fixpoint queries

Assume that the following Datalog rules are all the rules with R in their heads, where we assume that Z̄ consists of variables only:
R(Z̄) :− A1(R, Z̄1)
. . .
R(Z̄) :− Ap(R, Z̄p)
The rules can equivalently be expressed by a single formula:
∃Ȳ [A1(R, Z̄1) ∨ . . . ∨ Ap(R, Z̄p)] → R(Z̄),
where Ȳ consists of all variables appearing in Z̄1, . . . , Z̄p and not appearing in Z̄.

Lecture 04 Slide 38 of 39 Deductive databases: negation, fixpoint logics Lecture 04

Translating Datalog into fixpoint queries

Now R can be defined by the following fixpoint:
R(Z̄) ≡ Lfp X(Z̄).∃Ȳ [A1(X, Z̄1) ∨ . . . ∨ Ap(X, Z̄p)].

Remark Let us emphasize that the full semantics of Datalog is best presented by the use of simultaneous fixpoints. This subject is beyond the scope of this lecture (but see the literature).

Lecture 04 Slide 39 of 39

slide-37
SLIDE 37

Reasoning in first-order logic

CUGS: Logic II

Lecture 05

Lecture 05 Slide 1 of 29 Reasoning in first-order logic Lecture 05

The problem

How to prove the validity of first-order formulas? For example, how can one prove:

  • ∀x[p(x) ∧ q(x)] → [∀x p(x) ∧ ∀x q(x)]
  • that ∀x P(a, x, x) ∧ ¬∃x∃y∃z[P(x, y, z) ∧ ¬P(f(x), y, f(z))] implies ∃z P(f(a), z, f(f(a))).

Most importantly, can such proofs be done automatically?

Lecture 05 Slide 2 of 29 Reasoning in first-order logic Lecture 05

Complexity of reasoning in first-order logic

  • The problem of checking whether a given first-order formula is a tautology is uncomputable (Church 1936, Turing 1937) but it is partially computable (Gödel 1929).
  • The problem of checking whether a given first-order formula is satisfiable is not partially computable (Church 1936).

Lecture 05 Slide 3 of 29 Reasoning in first-order logic Lecture 05

Tableaux

γ and δ formulas

γ            γ(a)      δ            δ(a)
∀x A(x)      A(a)      ∃x A(x)      A(a)
¬∃x A(x)     ¬A(a)     ¬∀x A(x)     ¬A(a)

Semantic tableau A semantic tableau for a formula A is a tree T, each node of which is labeled with a set of formulas. The tableau construction extends the construction given for propositional calculus.

Lecture 05 Slide 4 of 29

slide-38
SLIDE 38

Reasoning in first-order logic Lecture 05

Tableaux (for formulas without function symbols)

Construction of a semantic tableau for formula A
Initially T consists of a single node labeled with {A}. Until possible, choose an unmarked leaf l labeled with U(l) and apply:

  • if U(l) contains a pair of complementary literals {p(a1, . . . , ak), ¬p(a1, . . . , ak)} then mark the leaf closed
  • if U(l) does not contain a pair of complementary literals and contains non-literals, choose a non-literal A ∈ U(l):
    • if A is an α or β formula, apply the rules provided for propositional calculus
    • if A is a γ formula, create a child node l′ for l and label l′ with U(l′) = U(l) ∪ {γ(a)}, where a is a constant, preferably one appearing in U(l)
    • if A is a δ formula, create a child node l′ for l and label l′ with U(l′) = (U(l) − {A}) ∪ {δ(a)}, where a is a constant not appearing in U(l).

Lecture 05 Slide 5 of 29 Reasoning in first-order logic Lecture 05

Soundness and completeness

Closed and open branches
A branch in a tableau is closed if it terminates in a leaf marked closed. Otherwise the branch is open.
Soundness
Let A be a formula and let T be a tableau for A. If all branches of T are closed then A is unsatisfiable (equivalent to f).
Completeness
Let A be a valid formula (equivalent to t). Then there is a tableau T for ¬A with all branches closed.

Lecture 05 Slide 6 of 29 Reasoning in first-order logic Lecture 05

Solving the first problem

Proof of ∀x[p(x) ∧ q(x)] → [∀x p(x) ∧ ∀x q(x)]
Negate the formula and build a tableau for ∀x[p(x) ∧ q(x)] ∧ [¬∀x p(x) ∨ ¬∀x q(x)]:
Root: ∀x[p(x) ∧ q(x)], ¬∀x p(x) ∨ ¬∀x q(x)
Left branch: ∀x[p(x) ∧ q(x)], ¬∀x p(x) ↓ ∀x[p(x) ∧ q(x)], ¬p(a) ↓ p(a) ∧ q(a), ¬p(a) ↓ p(a), q(a), ¬p(a) (closed)
Right branch: ∀x[p(x) ∧ q(x)], ¬∀x q(x) ↓ ∀x[p(x) ∧ q(x)], ¬q(b) ↓ p(b) ∧ q(b), ¬q(b) ↓ p(b), q(b), ¬q(b) (closed)
Both branches are closed, so the negation is unsatisfiable and the original formula is valid.

Lecture 05 Slide 7 of 29 Reasoning in first-order logic Lecture 05

PNF: Prenex Normal Form

We say A is in prenex normal form, abbreviated Pnf, if all its quantifiers (if any) are in its prefix, i.e., it has the form:
Q1x1 Q2x2 . . . Qnxn [A(x1, x2, . . . , xn)],
where n ≥ 0, Q1, Q2, . . . , Qn are quantifiers (∀, ∃) and A is quantifier-free.
Examples

1 A(x) ∧ B(x, y) ∧ ¬C(y) is in Pnf
2 ∀x∃y [A(x) ∧ B(x, y) ∧ ¬C(y)] is in Pnf
3 A(x) ∧ ∀x [B(x, y) ∧ ¬C(y)] as well as ∀x [A(x) ∧ B(x, y) ∧ ¬∃yC(y)] are not in Pnf.

Lecture 05 Slide 8 of 29

slide-39
SLIDE 39

Reasoning in first-order logic Lecture 05

Transforming formulas into PNF

Any predicate formula can be equivalently transformed into Pnf:

1 Transform the formula into Nnf
2 Replace subformulas according to the table below, until Pnf is obtained, where Q denotes any quantifier, ∀ or ∃.

Rule  Subformula                            Replaced by
12    Qx A(x), A(x) without variable z      Qz A(z)
13    ∀x A(x) ∧ ∀x B(x)                     ∀x [A(x) ∧ B(x)]
14    ∃x A(x) ∨ ∃x B(x)                     ∃x [A(x) ∨ B(x)]
15    A ∨ Qx B, where A contains no x       Qx (A ∨ B)
16    A ∧ Qx B, where A contains no x       Qx (A ∧ B)

Lecture 05 Slide 9 of 29 Reasoning in first-order logic Lecture 05

Example

Transforming a formula into Pnf
A(z) ∨ ∀x [B(x, u) ∧ ∃y C(x, y) ∧ ∀z D(z)]
(15) ←→ ∀x {A(z) ∨ [B(x, u) ∧ ∃y C(x, y) ∧ ∀z D(z)]}
(16) ←→ ∀x {A(z) ∨ ∃y [B(x, u) ∧ C(x, y) ∧ ∀z D(z)]}
(15) ←→ ∀x∃y {A(z) ∨ [B(x, u) ∧ C(x, y) ∧ ∀z D(z)]}
(16) ←→ ∀x∃y {A(z) ∨ ∀z [B(x, u) ∧ C(x, y) ∧ D(z)]}
(12) ←→ ∀x∃y {A(z) ∨ ∀t [B(x, u) ∧ C(x, y) ∧ D(t)]}
(15) ←→ ∀x∃y∀t {A(z) ∨ [B(x, u) ∧ C(x, y) ∧ D(t)]}.

Lecture 05 Slide 10 of 29 Reasoning in first-order logic Lecture 05

Functions and terms

Terms
Let F be a set of function symbols. The following grammar defines terms. The rule for atomic formulas is modified to take a term list as its argument.
term ::= x   for any x ∈ V
term ::= a   for any a ∈ A
term ::= f(term list)   for any f ∈ F
term list ::= term
term list ::= term, term list
atomic formula ::= p(term list)   for any p ∈ P.
Examples
Examples of terms: a, x, f(a, x), f(g(x), y), g(f(a, g(b))).
Examples of atomic formulas: p(a, b), p(g(f(a, g(b)), f(a, x))).

Lecture 05 Slide 11 of 29 Reasoning in first-order logic Lecture 05

Semantics

Interpretations
Let U be a set of formulas such that {p1, . . . , pm} are all the relation symbols, {f1, . . . , fl} are all the function symbols and {a1, . . . , ak} are all the constant symbols appearing in U. An interpretation is a tuple ⟨D, {R1, . . . , Rm}, {F1, . . . , Fl}, {d1, . . . , dk}⟩, where

  • D is a non-empty domain
  • Ri is an assignment of a relation on D to pi
  • Fi is an assignment of a function on D to fi
  • di ∈ D is an assignment of an element of D to ai.

Lecture 05 Slide 12 of 29

slide-40
SLIDE 40

Reasoning in first-order logic Lecture 05

Semantics

Let t be a term, I an interpretation and σI an assignment. The value of t under σI, denoted by vσI(t) is defined by

1 vσI(ai) = di 2 vσI(x) = σI(x), where x ∈ V 3 vσI(fi(t1, . . . , tn)) = Fi(vσI(t1), . . . , vσI(tn)).

For atomic formulas:

  • vσI(pi(t1, . . . , tn)) = t iff (vσI(t1), . . . , vσI(tn)) ∈ Ri.
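The three clauses above read directly as a recursive evaluator; a small sketch where compound terms are tuples (f, t1, . . . , tn), variables and constants are strings, and, as a simplification, one dictionary sigma plays the role of both σI and the constant assignments di (all names are illustrative):

```python
def value(t, F, sigma):
    """v(t): look up variables/constants in sigma, otherwise apply the
    interpreting function F[f] to the values of the arguments."""
    if isinstance(t, str):        # variable or constant symbol
        return sigma[t]
    f, *args = t
    return F[f](*(value(a, F, sigma) for a in args))

# interpret f as addition; sigma assigns 1 to constant a and 2 to variable x
F = {"f": lambda a, b: a + b}
sigma = {"x": 2, "a": 1}
print(value(("f", "a", "x"), F, sigma))   # v(f(a, x)) = 1 + 2, prints 3
```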

Lecture 05 Slide 13 of 29 Reasoning in first-order logic Lecture 05

Skolem normal form

A formula is in the Skolem normal form iff it is in Pnf and contains no existential quantifiers.
Transforming formulas into the Skolem normal form
Eliminate existential quantifiers from left to right:

1 when we have ∃xA(x), remove ∃x and replace x in A by a new constant symbol
2 when we have ∀x1 . . . ∀xk∃xA(x1, . . . , xk, x), remove ∃x and replace x in A by a term f(x1, . . . , xk), where f is a new function symbol.

Lecture 05 Slide 14 of 29 Reasoning in first-order logic Lecture 05

Skolem normal form

Example

1 The Skolem normal form of ∃x∀y∀z∃u[R(x, y, z, u)] is ∀y∀z[R(a, y, z, f(y, z))]
2 The Skolem normal form of ∃x∀y∃z∀u∃w[R(x, y, z, u, w)] is ∀y∀u[R(a, y, g(y), u, h(y, u))].
Important
A formula is satisfiable iff its Skolem normal form is satisfiable.
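The two elimination rules can be sketched as a small routine that walks a quantifier prefix left to right and builds a Skolem term for each existential variable; the Skolem names sk1, sk2, ... are invented here:

```python
def skolemize_prefix(prefix):
    """prefix: list of ('A', var) / ('E', var) pairs, read left to right.
    Returns the remaining universal variables and a substitution mapping
    each existential variable to its Skolem constant or term."""
    universals, subst, n = [], {}, 0
    for q, v in prefix:
        if q == "A":
            universals.append(v)
        else:
            n += 1
            sk = f"sk{n}"
            # a constant if no universals precede, else a term over them
            subst[v] = sk if not universals else (sk, *universals)
    return universals, subst

# the slide's second example: exists x forall y exists z forall u exists w
print(skolemize_prefix([("E", "x"), ("A", "y"), ("E", "z"),
                        ("A", "u"), ("E", "w")]))
```

This reproduces the slide's pattern: x becomes a constant, z a term over y, w a term over y and u.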

Lecture 05 Slide 15 of 29 Reasoning in first-order logic Lecture 05

Generalizing resolution to predicate calculus

Consider the following two formulas: ∀x [rare(x) → expensive(x)] ∀y [expensive(y) → guarded(y)] Clause form: ¬rare(x) ∨ expensive(x) ¬expensive(y) ∨ guarded(y) Can we apply (res)? NO! Because expensive(x) and expensive(y) are not the same. On the other hand, ∀y could equivalently be replaced by ∀x and resolution could be applied.

Lecture 05 Slide 16 of 29

slide-41
SLIDE 41

Reasoning in first-order logic Lecture 05

Generalizing resolution to predicate calculus

Consider: ¬rare(x) ∨ expensive(x) rare(VermeerPainting) We would like to conclude expensive(VermeerPainting). But again (res) cannot be applied. We need a general method to solve such problems. It is offered by unification.

Lecture 05 Slide 17 of 29 Reasoning in first-order logic Lecture 05

Unification

Given two expressions, unification consists in substituting variables by expressions so that both input expressions become identical. If this is possible, the given expressions are said to be unifiable. Examples

1 To unify expressions father(x) and father(mother(John)) it suffices to substitute x by the expression mother(John).
2 To unify expressions (x + f(y)) and (√2 ∗ z + f(3)), it suffices to substitute x by √2 ∗ z and y by 3.
3 Expressions father(x) and mother(father(John)) cannot be unified.

Lecture 05 Slide 18 of 29 Reasoning in first-order logic Lecture 05

Expression trees

Expressions can be represented as trees. For example, the expressions f(g(x, h(y))) and f(g(f(v), z)) are represented as trees rooted at f with a single child g; in the first tree g has children x and h(y), in the second g has children f(v) and z.

In order to unify expressions we have to detect nonidentical sub-trees and try to make them identical.

Lecture 05 Slide 19 of 29 Reasoning in first-order logic Lecture 05

The unification algorithm

Input: expressions e, e′
Output: a substitution of variables which makes e and e′ identical, or information that such a substitution does not exist.

1 traverse the trees corresponding to expressions e, e′
2 if the trees are identical then stop
3 let t and t′ be subtrees that have to be identical, but are not:
  • if t and t′ are function symbols/constants then conclude that the substitution does not exist and stop
  • otherwise t or t′ is a variable; let t be a variable, then substitute all occurrences of t by the expression represented by t′, assuming that t does not occur in t′ (if it does occur then conclude that the substitution does not exist and stop)
4 change the trees according to the substitution determined in the previous step and repeat from step 2.
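The algorithm can be sketched compactly in a recursive style. Here variables are strings starting with '?', compound terms are tuples (f, args...), and a substitution is returned as a dictionary (or None on failure); the representation is an assumption made for illustration:

```python
def is_var(t):
    return isinstance(t, str) and t.startswith("?")

def walk(t, s):
    """Apply substitution s to term t."""
    if is_var(t) and t in s:
        return walk(s[t], s)
    if isinstance(t, tuple):
        return (t[0],) + tuple(walk(a, s) for a in t[1:])
    return t

def occurs(v, t):
    """Occur check: does variable v appear inside t?"""
    if v == t:
        return True
    return isinstance(t, tuple) and any(occurs(v, a) for a in t[1:])

def unify(t1, t2, s=None):
    """Return a most general unifier of t1 and t2, or None."""
    s = {} if s is None else s
    t1, t2 = walk(t1, s), walk(t2, s)
    if t1 == t2:
        return s
    if is_var(t1):
        return None if occurs(t1, t2) else {**s, t1: t2}
    if is_var(t2):
        return unify(t2, t1, s)
    if isinstance(t1, tuple) and isinstance(t2, tuple) \
            and t1[0] == t2[0] and len(t1) == len(t2):
        for a, b in zip(t1[1:], t2[1:]):
            s = unify(a, b, s)
            if s is None:
                return None
        return s
    return None   # distinct function symbols/constants

print(unify(("father", "?x"), ("father", ("mother", "John"))))
```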

Lecture 05 Slide 20 of 29

slide-42
SLIDE 42

Reasoning in first-order logic Lecture 05

Resolution rule for the predicate calculus

Resolution rule, denoted by (res), is formulated for first-order clauses as follows:

L1(t̄1) ∨ . . . ∨ Lk−1(t̄k−1) ∨ Lk(t̄k),    ¬Lk(t̄′k) ∨ M1(s̄1) ∨ . . . ∨ Ml(s̄l)
L1(t̄′1) ∨ . . . ∨ Lk−1(t̄′k−1) ∨ M1(s̄′1) ∨ . . . ∨ Ml(s̄′l),

where:

  • L1, . . . , Lk, M1, . . . , Ml are literals
  • t̄k and t̄′k are unifiable
  • primed expressions are obtained from non-primed expressions by applying the substitution unifying t̄k and t̄′k.

Lecture 05 Slide 21 of 29 Reasoning in first-order logic Lecture 05

Examples

  • (res) with x = John, y = Mary:

¬parent(x, y) ∨ x = father(y) ∨ x = mother(y) parent(John, Mary) John = father(Mary) ∨ John = mother(Mary)

  • (res) with y = x:

¬rare(x) ∨ expensive(x), ¬expensive(y) ∨ guarded(y) ¬rare(x) ∨ guarded(x)

  • (res) with x = VermeerPainting:

¬rare(x) ∨ expensive(x), rare(VermeerPainting) expensive(VermeerPainting)
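The third example, resolving against a ground fact, can be traced in code. Since one premise is ground, one-sided matching suffices in place of full unification; the representation (signs "+"/"-", '?'-variables, terms as tuples) is an assumption made for illustration:

```python
def match(pat, ground, s=None):
    """Bind '?'-variables in pat so that it equals the ground term."""
    s = {} if s is None else s
    if isinstance(pat, str) and pat.startswith("?"):
        if pat in s:
            return s if s[pat] == ground else None
        return {**s, pat: ground}
    if isinstance(pat, tuple) and isinstance(ground, tuple) \
            and pat[0] == ground[0] and len(pat) == len(ground):
        for a, b in zip(pat[1:], ground[1:]):
            s = match(a, b, s)
            if s is None:
                return None
        return s
    return s if pat == ground else None

def subst(t, s):
    """Apply a substitution to a term."""
    if isinstance(t, tuple):
        return (t[0],) + tuple(subst(a, s) for a in t[1:])
    return s.get(t, t)

# clause: not rare(x) or expensive(x); fact: rare(VermeerPainting)
clause = [("-", ("rare", "?x")), ("+", ("expensive", "?x"))]
fact = ("rare", "VermeerPainting")
s = match(("rare", "?x"), fact)
# drop the resolved-upon literal and apply the unifier to the rest
resolvent = [(sign, subst(a, s)) for sign, a in clause
             if (sign, a) != ("-", ("rare", "?x"))]
print(resolvent)   # the conclusion expensive(VermeerPainting)
```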

Lecture 05 Slide 22 of 29 Reasoning in first-order logic Lecture 05

Factorization rule for the predicate calculus

The rule Factorization rule, denoted by (fctr): Unify some terms in a clause and remove from the clause all repetitions of literals. Examples

1 (fctr) with x = Jack, y = mother(Eve):

parent(x, y) ∨ parent(Jack, mother(Eve)) parent(Jack, mother(Eve))

2 (fctr) with z = x, u = y:

P(x, y) ∨ S(y, z, u) ∨ P(z, u) P(x, y) ∨ S(y, x, y)

Lecture 05 Slide 23 of 29 Reasoning in first-order logic Lecture 05

Example of proof by resolution

Consider the following formulas, where med stands for "medicine student" (recall that clauses are assumed to be implicitly universally quantified):
∆ = { [med(x) ∧ attends(x, y)] → [likes(x, y) ∨ obliged(x, y)],
      likes(x, math) → ¬med(x),
      med(John),
      attends(John, math) }
Do the above formulas imply that John is obliged to take math?

Lecture 05 Slide 24 of 29

slide-43
SLIDE 43

Reasoning in first-order logic Lecture 05

Example of proof by resolution

We consider implication: ∆ → obliged(John, math) To apply the resolution method we negate this formula and obtain ∆ ∧ ¬obliged(John, math), i.e., [med(x) ∧ attends(x, y)] → [likes(x, y) ∨ obliged(x, y)] likes(x, math) → ¬med(x) med(John) attends(John, math) ¬obliged(John, math).

Lecture 05 Slide 25 of 29 Reasoning in first-order logic Lecture 05

Example of proof by resolution

Proof

1 ¬med(x) ∨ ¬attends(x, y) ∨ likes(x, y) ∨ obliged(x, y)
2 ¬likes(x, math) ∨ ¬med(x)
3 med(John)
4 attends(John, math)
5 ¬obliged(John, math)
6 ¬med(x) ∨ ¬attends(x, math) ∨ ¬med(x) ∨ obliged(x, math)   (res): 1, 2 with y = math
7 ¬med(x) ∨ ¬attends(x, math) ∨ obliged(x, math)   (fctr): 6
8 ¬attends(John, math) ∨ obliged(John, math)   (res): 3, 7 with x = John
9 obliged(John, math)   (res): 4, 8
10 f   (res): 5, 9.

Lecture 05 Slide 26 of 29 Reasoning in first-order logic Lecture 05

Solving the second problem

Prove that ∀xP(a, x, x) ∧ ¬∃x∃y∃z[P(x, y, z) ∧ ¬P(f (x), y, f (z))] implies ∃zP(f (a), z, f (f (a))). We start with negating the above implication, transforming it to the clausal form and then try to derive f.

1 P(a, x′, x′)   (for clarity x is renamed)
2 ¬P(x, y, z) ∨ P(f(x), y, f(z))
3 ¬P(f(a), z, f(f(a)))
4 P(f(a), x′, f(x′))   (res): 1, 2 with x = a, y = x′, z = x′
5 f   (res): 3, 4 with x′ = z = f(a)

Lecture 05 Slide 27 of 29 Reasoning in first-order logic Lecture 05

Example

Prove that ∀x∀y[P(x, y) ∨ P(a, y)] implies ∃x∀yP(x, y) We start with negating the above implication, transforming it to the clausal form and then try to derive f.

1 P(x, y) ∨ P(a, y)
2 ¬P(x′, g(x′))   (g is a Skolem function and x is renamed)
3 P(a, y)   (fctr): 1 with x = a
4 f   (res): 2, 3 with x′ = a and y = g(a).

Lecture 05 Slide 28 of 29

slide-44
SLIDE 44

Reasoning in first-order logic Lecture 05

Solving the third problem

Automated theorem proving?
Both resolution and tableaux are complete and can be implemented. However, one should remember that the problem is only partially computable, i.e., if a formula is a tautology then sooner or later its proof will be found. If it is not a tautology, such algorithms can loop forever.

Lecture 05 Slide 29 of 29 Foundations of Semantic Web

CUGS: Logic II

Lecture 06

Lecture 06 Slide 1 of 31 Foundations of Semantic Web Lecture 06

The problem

Consider “personal agents” on the Semantic Web. Typical tasks:

  • receive some tasks and preferences from a person
  • seek information from Web sources, communicate with web services and other agents
  • compare information about user requirements and preferences, make certain choices and give answers to the user.

For such tasks one needs ontologies and taxonomies. The question is to find a decidable formalism allowing such agents to act.

Lecture 06 Slide 2 of 31 Foundations of Semantic Web Lecture 06

Taxonomies and ontologies

Taxonomies A taxonomy is a hierarchy of concepts, much like class inheritance in object-oriented design. For example,

  • Human is a subconcept of Mammal
  • Male, Female are subconcepts of Human, etc.

Ontologies An ontology is a taxonomy enriched by additional constraints, like relationships between concepts. For example, Owns can be a relationship between humans and things.

Lecture 06 Slide 3 of 31

slide-45
SLIDE 45

Foundations of Semantic Web Lecture 06

Ontology languages

Ontology languages allow users to write explicit, formal conceptualizations of domain models. The main requirements are:

  • a well-defined syntax
  • efficient reasoning support
  • a formal semantics
  • sufficient expressive power
  • convenience of expression.

Example RDF, RDFS, various versions of OWL (Full, DL, Lite).

Lecture 06 Slide 4 of 31 Foundations of Semantic Web Lecture 06

Reasoning about knowledge in ontology languages

Typical problems

  • Concept membership – does an object belong to a concept?
  • Concept inclusion – is a concept a subconcept of another concept?
  • Equality of concepts – do two concepts consist of the same objects?
  • Consistency – is the ontology specification consistent?
  • Classification – to which concept does a given object belong?

Lecture 06 Slide 5 of 31 Foundations of Semantic Web Lecture 06

RDFS

RDFS versus Datalog

RDFS                   Datalog
Statement(a, P, b)     P(a, b).
type(a, C)             C(a).
C subClassOf D         D(X) :− C(X).
P subPropertyOf Q      Q(X, Y) :− P(X, Y).
domain(P, C)           C(X) :− P(X, Y).
range(P, C)            C(Y) :− P(X, Y).

Lecture 06 Slide 6 of 31 Foundations of Semantic Web Lecture 06

OWL - exemplary constructs

OWL versus Datalog

OWL                      Datalog
C sameClassAs D          C(X) :− D(X).  D(X) :− C(X).
P samePropertyAs Q       P(X, Y) :− Q(X, Y).  Q(X, Y) :− P(X, Y).
transitiveProperty(P)    P(X, Z) :− P(X, Y), P(Y, Z).
inverseProperty(P, Q)    Q(X, Y) :− P(Y, X).  P(X, Y) :− Q(Y, X).

Lecture 06 Slide 7 of 31

slide-46
SLIDE 46

Foundations of Semantic Web Lecture 06

OWL - exemplary constructs

OWL versus Datalog

OWL                          Datalog
(C1 ∩ C2) subClassOf D       D(X) :− C1(X), C2(X).
C subClassOf (D1 ∩ D2)       D1(X) :− C(X).  D2(X) :− C(X).
(C1 ∪ C2) subClassOf D       D(X) :− C1(X).  D(X) :− C2(X).

However
For C subClassOf (D1 ∪ D2) a translation into Datalog is not possible! The same holds for some other constructs.

Lecture 06 Slide 8 of 31 Foundations of Semantic Web Lecture 06

Description Logics

  • Description logics (DLs) are a family of knowledge representation languages which can be used to represent the terminological knowledge of an application domain in a structured and formally well-understood way.
  • DLs describe a domain in terms of concepts (classes), roles (relationships) and individuals.
  • DLs, as decidable fragments of first-order logic, have well-defined formal semantics.
  • DLs are used as the main formal background for a number of languages used in the semantic web technology, including ontology engineering, reasoning with ontology-based markup (meta-data), service description and discovery.

Lecture 06 Slide 9 of 31 Foundations of Semantic Web Lecture 06

DL Architecture

  • Knowledge base
    • TBox (Terminological Box – constraints / rules)
      Parent = Human ⊓ ∃hasChild.Human
      Father = Parent ⊓ Male
      Mother = Parent ⊓ Female
    • ABox (Assertion Box – data)
      John : Father
      Mary : Mother
      hasChild(John, Jack)
  • Inference system.
  • Interface.

Lecture 06 Slide 10 of 31 Foundations of Semantic Web Lecture 06

The Description Logic ALC

Notation
We use A and B to denote concept names, R and S for role names, and a and b for individual names. C and D are used to denote arbitrary concepts.
Concepts
Concepts in ALC are formed using the following BNF grammar:
C, D ::= ⊤ | ⊥ | A | ¬C | C ⊓ D | C ⊔ D | ∀R.C | ∃R.C
TBox, ABox

  • A TBox is a finite set of axioms of the form C ⊑ D or C ≐ D.
  • An ABox is a finite set of assertions of the form a : C (concept assertion) or R(a, b) (role assertion).
  • A knowledge base is a pair consisting of a TBox and an ABox.

Lecture 06 Slide 11 of 31

slide-47
SLIDE 47

Foundations of Semantic Web Lecture 06

Semantics of ALC

Let I be an interpretation (like in first-order logic).

A        AI ⊆ UI                           primitive concept
R        RI ⊆ UI × UI                      primitive role
⊤        UI                                top
⊥        ∅                                 bottom
¬C       UI − CI                           complement
C ⊓ D    CI ∩ DI                           conjunction
C ⊔ D    CI ∪ DI                           disjunction
∀R.C     {x | ∀y[R(x, y) → CI(y)]}         universal quantification
∃R.C     {x | ∃y[R(x, y) ∧ CI(y)]}         existential quantification
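Over a finite interpretation the two quantifier constructors are directly executable; a toy model-checking sketch (concepts as sets, roles as sets of pairs; all names are illustrative):

```python
def forall_R(domain, R, C):
    """(forall R.C)^I = {x | every R-successor of x is in C}."""
    return {x for x in domain if all(y in C for (p, y) in R if p == x)}

def exists_R(domain, R, C):
    """(exists R.C)^I = {x | some R-successor of x is in C}."""
    return {x for x in domain if any((x, y) in R for y in C)}

domain = {1, 2, 3}
R = {(1, 2), (2, 3)}   # role extension
C = {2}                # concept extension
print(forall_R(domain, R, C), exists_R(domain, R, C))
```

Note that 3, having no R-successors, satisfies ∀R.C vacuously, exactly as in the first-order reading.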

Lecture 06 Slide 12 of 31 Foundations of Semantic Web Lecture 06

Reasoning Problems in ALC

Satisfiability Sat, Consistency Cons
One of the basic inference problems in DLs, which we denote by Sat, is to check satisfiability of a knowledge base. Other inference problems in DLs are usually reducible to this problem. For example, the problem of checking consistency of a concept w.r.t. a TBox (further denoted by Cons) is linearly reducible to Sat.
Complexity
Both problems, Sat and Cons, in the basic description logic ALC are EXPTIME-complete. Although the EXPTIME upper bound for the satisfiability problem in ALC has long been known, implemented tableau provers for description logics usually have non-optimal complexity 2EXPTIME.

Lecture 06 Slide 13 of 31 Foundations of Semantic Web Lecture 06

Example

Is a given ABox satisfiable? I.e., does it have a model? For example, consider the ABox A consisting of John : ∃hasChild.Male, John : ∃hasChild.Female, John : ∀hasChild.Happy, hasChild(John, Peter), Peter : Male ⊓ Doctor. Is A satisfiable? Is A ∪ {Peter : ¬Happy} satisfiable?

Lecture 06 Slide 14 of 31 Foundations of Semantic Web Lecture 06

Applications

  • Is the knowledge base specified by an ABox consistent?
  • Does John have a happy daughter? I.e., does A |= John : ∃hasChild.(Female ⊓ Happy) hold? I.e., is A ∪ {John : ∀hasChild.(¬Female ⊔ ¬Happy)} unsatisfiable?
  • An inclusion axiom C ⊑ D is valid (in every interpretation) iff the ABox {x : C, x : ¬D} is unsatisfiable.

Lecture 06 Slide 15 of 31

slide-48
SLIDE 48

Foundations of Semantic Web Lecture 06

OWL - exemplary constructs

OWL versus ALC and SHI

OWL                          ALC               SHI
C sameClassAs D              C ≐ D
P samePropertyAs Q           P ≐ Q
transitiveProperty(P)        not in ALC        P+ ⊑ P
inverseProperty(P, Q)        not in ALC        P ⊑ Q−
(C1 ∩ C2) subClassOf D       C1 ⊓ C2 ⊑ D
C subClassOf (D1 ∩ D2)       C ⊑ D1 ⊓ D2
(C1 ∪ C2) subClassOf D       C1 ⊔ C2 ⊑ D
C subClassOf (D1 ∪ D2)       C ⊑ D1 ⊔ D2

Lecture 06 Slide 16 of 31 Foundations of Semantic Web Lecture 06

Tableau calculus for ALC

Rules for checking satisfiability of a knowledge base
Start with a set S consisting of the ALC formulas appearing in the ABox and TBox. Transform them into negation normal form (like in first-order logic). Expand the proof tree according to the rules:
⊓: if x : C ⊓ D is in S and x : C, x : D are not both in S, then create a successor node containing S ∪ {x : C, x : D}.
⊔: if x : C ⊔ D is in S and neither x : C nor x : D is in S, then create a successor node containing S ∪ {x : E}, where E is either C or D (nondeterministic choice).

Lecture 06 Slide 17 of 31 Foundations of Semantic Web Lecture 06

Tableau calculus for ALC

Rules for checking satisfiability of a knowledge base
∀: if x : ∀R.C is in S, R(x, y) is in S and y : C is not in S, then create a successor node containing S ∪ {y : C}.
∃: if x : ∃R.C is in S and there is no z such that both R(x, z) and z : C are in S, then create a successor node containing S ∪ {R(x, y), y : C}, where y is a new variable.
Theorem
The tableau calculus for ALC is sound and complete.

Lecture 06 Slide 18 of 31 Foundations of Semantic Web Lecture 06

Tableau calculus for ALC

Convention
To simplify notation we list only the formulas which are added to S (so each node n consists of all formulas from the root up to n).
Example: ∀Child.Male ⊓ ∃Child.¬Male
x : ∀Child.Male ⊓ ∃Child.¬Male
x : ∀Child.Male            (⊓-rule)
x : ∃Child.¬Male           (⊓-rule)
Child(x, y), y : ¬Male     (∃-rule)
y : Male                   (∀-rule)
The tableau is closed, so our formula is not satisfiable.

Lecture 06 Slide 19 of 31

slide-49
SLIDE 49

Foundations of Semantic Web Lecture 06

Tableau calculus for ALC

Example: ∀Child.Male ⊓ ∃Child.Male
x : ∀Child.Male ⊓ ∃Child.Male
x : ∀Child.Male            (⊓-rule)
x : ∃Child.Male            (⊓-rule)
Child(x, y), y : Male      (∃-rule)
y : Male                   (∀-rule)
The tableau is completed (nothing new can be added), so our formula is satisfiable (interpretation: the universe consists of two objects x, y and we set Child(x, y), y : Male).

Lecture 06 Slide 20 of 31 Foundations of Semantic Web Lecture 06

Tableau calculus for ALC

Example
Check the satisfiability of the ABox:
john : Parent ⊓ ∀Child.Male
mary : ¬Male
Child(john, mary).
A tableau:
john : Parent ⊓ ∀Child.Male, mary : ¬Male, Child(john, mary)
john : Parent, john : ∀Child.Male    (⊓-rule)
mary : Male                          (∀-rule)
The tableau is closed, so our ABox is not satisfiable.

Lecture 06 Slide 21 of 31 Foundations of Semantic Web Lecture 06

Tableau calculus for ALC

TBoxes Just use the negation normal form:

  • C ⊑ D is equivalent to ¬C ⊔ D
  • C = D is equivalent to (¬C ⊔ D) ⊓ (¬D ⊔ C).

Example Human ⊑ ∃Parent.Human is to be replaced by ¬Human ⊔ ∃Parent.Human.

Lecture 06 Slide 22 of 31 Foundations of Semantic Web Lecture 06

A description logics landscape

Constructors for concepts

Constructor name     Syntax    Semantics
concept name         A         AI ⊆ ∆I
top                  ⊤         ∆I
bottom               ⊥         ∅
complement (C)       ¬C        ∆I − CI
intersection         E ⊓ F     EI ∩ FI
union (U)            E ⊔ F     EI ∪ FI
univ. quant.         ∀R.E      {x | {y | RI(x, y)} ⊆ EI}
exist. quant. (E)    ∃R.E      {x | {y | RI(x, y)} ∩ EI ≠ ∅}
cardinality (N)      ≥ nR      {x | #{y | RI(x, y)} ≥ n}
                     ≤ nR      {x | #{y | RI(x, y)} ≤ n}
qual. card. (Q)      ≥ nR.C    {x | #{y | RI(x, y) ∧ CI(y)} ≥ n}
                     ≤ nR.C    {x | #{y | RI(x, y) ∧ CI(y)} ≤ n}.

Lecture 06 Slide 23 of 31

slide-50
SLIDE 50

Foundations of Semantic Web Lecture 06

The description logics landscape

Basic description logic AL (Schmidt-Schauss and Smolka 1991)
Various description languages are distinguished by the constructors that allow one to specify complex concepts and roles.
The basic language AL
In AL one can use atomic concepts, top, bottom, negation applied to atomic concepts, intersection, universal quantification and existential quantification of the form ∃R.⊤.
Some other description logics
The other languages of this family are extensions of AL. They are denoted by indicating the particular constructs that are allowed. For example, ALUEC allows one to use union (indicated by U), full existential quantification (indicated by E) and unrestricted complement (indicated by C), while in ALUC full existential quantification is excluded.

Lecture 06 Slide 24 of 31 Foundations of Semantic Web Lecture 06

Logic SHI (tranSitive, Hierarchy, Inverse)

Inverse and transitive roles
Let R be a role. The inverse role for R is R−1 =def {(y, x) | R(x, y)}. R is transitive if R ◦ R ⊆ R, where R ◦ R =def {(x, y) | ∃z[R(x, z) ∧ R(z, y)]}.
SHI
SHI is an extension of ALC, where we additionally allow inverse roles and axioms of the form:
R ◦ R ⊆ R
R ⊑ S
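For finite role extensions both definitions can be checked directly; a minimal sketch (roles as sets of pairs; the hasDescendant data is a toy assumption):

```python
def inverse(R):
    """R^{-1} = {(y, x) | R(x, y)}."""
    return {(y, x) for (x, y) in R}

def is_transitive(R):
    """R is transitive iff R o R is a subset of R."""
    compose = {(x, w) for (x, z) in R for (p, w) in R if p == z}
    return compose <= R

hasDescendant = {("a", "b"), ("b", "c"), ("a", "c")}
print(inverse({("a", "b")}), is_transitive(hasDescendant))
```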

Lecture 06 Slide 25 of 31 Foundations of Semantic Web Lecture 06

Logic SHI

Example Role axioms: hasChild ⊑ hasDescendant hasDescendant ◦ hasDescendant ⊑ hasDescendant hasParent ⊑ hasChild−1 hasChild−1 ⊑ hasParent Exemplary TBox: happyAncestor ⊑ ∀hasDescendant.Happy

Lecture 06 Slide 26 of 31 Foundations of Semantic Web Lecture 06

Logic SHIQ

Simple roles A role R is simple if:

  • R ◦ R ⊆ R does not appear among role axioms
  • there is no S ◦ S ⊆ S among role axioms such that

S ⊑ R is also among role axioms. SHIQ SHIQ is an extension of SHI, where we additionally allow cardinality constraints (Q) applied to simple roles only. Examples

  • ≥ 2hasChild.Male
  • ≤ 1hasParent.Female

Lecture 06 Slide 27 of 31

slide-51
SLIDE 51

Foundations of Semantic Web Lecture 06

The description logics landscape

Complexity of reasoning

Logic    |= C ⊑ D     |= a : C
AL       P            P
ALE      NP           PSPACE
ALC      PSPACE       PSPACE
ALCO     PSPACE       PSPACE
SHIQ     EXPTIME      EXPTIME

Lecture 06 Slide 28 of 31 Foundations of Semantic Web Lecture 06

Inference of taxonomies

A scenario Consider the following TBox: Woman = Person ⊓ Female Man = Person ⊓ ¬Woman Mother = Woman ⊓ ∃hasChild.Person Father = Man ⊓ ∃hasChild.Person Parent = Mother ⊔ Father. What taxonomy is entailed by this TBox? Formally, we need to check whether subconcept properties hold. These are expressed by C ⊑ D, e.g., Father ⊑ ∃hasChild.Person.

Lecture 06 Slide 29 of 31 Foundations of Semantic Web Lecture 06

Inference of taxonomies

(Diagram: the entailed taxonomy with nodes Person, Female, Woman, ¬Woman, Man, ∃hasChild.Person, Father, Mother, and Parent = Father ⊔ Mother.)

Lecture 06 Slide 30 of 31 Foundations of Semantic Web Lecture 06

Description logics reasoners and tools

Some of the most popular reasoners dealing with OWL and Description Logics:

  • FaCT++ – a free open-source C++-based reasoner
  • RacerPro – a commercial (free trials and research licenses are available) Lisp-based reasoner
  • MSPASS – a free open-source C reasoner for various description logics.

More general tool
Try Protégé (free, rich functionality related to Semantic Web).

Lecture 06 Slide 31 of 31

slide-52
SLIDE 52

Modal logics

CUGS: Logic II

Lecture 07

Lecture 07 Slide 1 of 31 Modal logics Lecture 07

The problem

How to formalize, e.g., knowledge or obligation? Can we use classical logic to express Known or Obliged? For example, how do we express sentences like “it is known that people having income are obliged to pay taxes” and obtain sound (and intuitive) reasoning with such sentences?

Lecture 07 Slide 2 of 31 Modal logics Lecture 07

Extensional versus intensional relations

Extensional and intensional relations
By an extensional relation we mean any relation which allows us to replace its arguments by equal elements. Intensional relations are those that are not extensional.

Extensional and intensional operators
By an extensional operator we mean any operator which allows us to replace its arguments by equivalent elements. Intensional operators are those that are not extensional.

Lecture 07 Slide 3 of 31 Modal logics Lecture 07

Examples

Extensional relations

1 All relations in traditional mathematics are extensional, e.g., ≤, ≥, “to be parallel”, “to be perpendicular”, etc.
2 Many relations in natural language are extensional, e.g., “likes”, “has”, “betterThan”, etc. For example, “likes” is extensional (why?)

Extensional operators

1 All connectives in propositional calculus are extensional.
2 Quantifiers ∀, ∃ are extensional.

Lecture 07 Slide 4 of 31

slide-53
SLIDE 53

Modal logics Lecture 07

Paradox of Eubulides (4th century BC)

Electra paradox
Consider the sentence “Electra knows that Orestes is her brother.” The sentence contains the relation “P knows Q” with the meaning “P knows that Q is her brother”. It appears to be an intensional relation, as the following paradox shows, where MM is a masked man, so that Electra is unable to recognize whether MM is or is not Orestes:

  Knows(Electra, Orestes) ∧ Orestes = MM ∧    (1)
  ¬Knows(Electra, MM).                        (2)

Using classical logic, from (1) we deduce Knows(Electra, MM) – a contradiction with (2). In consequence, “Knows” is not extensional.

Lecture 07 Slide 5 of 31 Modal logics Lecture 07

Examples of intensional operators

The following operators are intensional (why?):

  • “P believes in Q”
  • “it is necessary that P holds”
  • “P is obligatory”
  • “P is allowed”

Give other examples of intensional operators.

Lecture 07 Slide 6 of 31 Modal logics Lecture 07

Propositional modal logics

Modalities
Intensional relations give rise to so-called modal operators (modalities, in short). Logics allowing such operators are called modal logics and sometimes intensional logics.

Modalities
In the simplest case one deals with a single one-argument modality □ and its dual ♦, i.e.,

  ♦A =def ¬□¬A

Lecture 07 Slide 7 of 31 Modal logics Lecture 07

Example

To understand modalities we give them meaningful names. Suppose that the intended meaning of □A is “A is sure”. Then its dual, ♦A, is defined to be:

  ♦A =def ¬□¬A ≡ ¬“¬A is sure” ≡ “¬A is not sure” ≡ “A is doubtful”

Lecture 07 Slide 8 of 31

slide-54
SLIDE 54

Modal logics Lecture 07

Readings of modalities

Modalities □ and ♦ have many possible readings, dependent on a particular application. The following table summarizes the most frequent readings of modalities.

  possible reading of □A           possible reading of ♦A
  A is necessary                   A is possible
  A is obligatory                  A is allowed
  always A                         sometimes A
  in the next time moment A        in the next time moment A
  A is known                       A is believed
  all program states satisfy A     some program state satisfies A
  A is provable                    ¬A is not provable

Lecture 07 Slide 9 of 31 Modal logics Lecture 07

Types of modal logics

Types of modal logics reflect the possible readings of modalities. The following table summarizes the types of logics corresponding to the possible readings of modalities.

  Reading of modalities                        Type of modal logic
  necessary, possible                          alethic logics
  obligatory, allowed                          deontic logics
  always, sometimes, in the next time moment   temporal logics
  known, believed                              epistemic logics
  all (some) states of a program satisfy       logics of programs
  (un)provable                                 provability logics

Lecture 07 Slide 10 of 31 Modal logics Lecture 07

Application areas

  • temporal reasoning
  • understanding (fragments of) natural language
  • commonsense reasoning
  • specification and verification of software
  • multiagent systems
  • understanding law and legal reasoning
  • etc.

Lecture 07 Slide 11 of 31 Modal logics Lecture 07

How to choose/design a logic for a particular application

Two approaches
Recall (see Lecture I) that there are two approaches to defining logics: the semantical and the syntactical approach. In consequence, there are two approaches to choosing/defining modal logics suitable for a given application:

  • the syntactical approach, which depends on providing a set of axioms describing the desired properties of modal operators
  • the semantical approach, which starts with suitable models and then develops a syntactic characterization of modalities.

Lecture 07 Slide 12 of 31

slide-55
SLIDE 55

Modal logics Lecture 07

Syntactical approach

Example: knowledge operator
Suppose we are interested in formalizing a knowledge operator. One can postulate, for instance, the following properties:

1 □A → ♦A — if A is known then A is believed
2 □A → A — if A is known then it actually holds
3 A → ♦A — if A holds then it is believed.

Important questions

  • are the properties consistent?
  • do the properties express all the desired phenomena?
  • is a property a consequence of another property?

Usually answers to such questions are given by semantical investigations.

Lecture 07 Slide 13 of 31 Modal logics Lecture 07

Semantical approach

Example: temporal reasoning
Consider a modal logic where □A means “always A” and ♦A means “sometimes A”. To define the meaning of □ and ♦ we first have to decide what the time structure is, e.g.,

  • consisting of points or intervals?
  • continuous or discrete?
  • linear or branching?
  • is there an earliest moment (starting point) or an unbounded past?
  • is there a latest moment (end point) or an unbounded future?

Lecture 07 Slide 14 of 31 Modal logics Lecture 07

Semantical approach

Example continued
Assume that time is discrete and linear, with time points labeled by integers, as below:

  ... −2 −1 0 1 2 ...

Now “always A” and “sometimes A” can, more or less, be defined as follows:

  “always A”    =def ∀i [A is satisfied in time point i]
  “sometimes A” =def ∃i [A is satisfied in time point i]

Lecture 07 Slide 15 of 31 Modal logics Lecture 07

Lemmon’s classification of modal logics

Lemmon introduced a widely accepted classification of modal logics. The basis for Lemmon’s classification is the logic K. K is defined to extend classical propositional logic by adding the rules:

1 rule (K): □(A → B) ⊢ □A → □B
2 rule (Gen): A ⊢ □A

Other normal modal logics are defined by additional axioms expressing the desired properties of modalities.

Lecture 07 Slide 16 of 31

slide-56
SLIDE 56

Modal logics Lecture 07

Some well-known axioms

  D   =def  □A → ♦A
  T   =def  □A → A
  4   =def  □A → □□A
  E   =def  ♦A → □♦A
  B   =def  A → □♦A
  M   =def  □♦A → ♦□A
  G   =def  ♦□A → □♦A
  H   =def  (♦A ∧ ♦B) → [♦(A ∧ B) ∨ ♦(A ∧ ♦B) ∨ ♦(B ∧ ♦A)]
  Grz =def  □(□(A → □A) → A) → A
  Dum =def  □(□(A → □A) → A) → (♦□A → A)
  W   =def  □(□A → A) → □A.

Lecture 07 Slide 17 of 31 Modal logics Lecture 07

Some well-known axioms

  • D comes from deontic
  • T is a traditional name of the axiom (after Feys)
  • 4 is characteristic for the logic S4 of Lewis
  • E comes from Euclidean (this axiom is often denoted by 5)
  • B comes after Brouwer
  • M comes after McKinsey
  • G comes after Geach
  • H comes after Hintikka
  • Grz comes after Grzegorczyk
  • Dum comes after Dummett
  • W comes from reverse well-founded (AKA Löb axiom).

Lecture 07 Slide 18 of 31 Modal logics Lecture 07

Lemmon’s classification of modal logics

In Lemmon’s classification, KX0...Xm denotes the logic extending K in which the formulas X0, ..., Xm are accepted as axioms.

Well-known modal logics

  KT = T = logic of Gödel/Feys/Von Wright
  KT4 = S4
  KT4B = KT4E = S5
  K4E = K45
  KD = deontic T
  KD4 = deontic S4
  KD4B = deontic S5

Lecture 07 Slide 19 of 31 Modal logics Lecture 07

Lemmon’s classification of modal logics

Well-known modal logics

  KTB = Brouwer logic
  KT4M = S4.1
  KT4G = S4.2
  KT4H = S4.3
  KT4Dum = D = Prior logic
  KT4Grz = KGrz = Grzegorczyk logic
  K4W = KW = Löb logic.

Lecture 07 Slide 20 of 31

slide-57
SLIDE 57

Modal logics Lecture 07

Kripke semantics for modal logics

There are many semantics for modal logics. Here we shall follow the Kripke-like style of defining semantics for modal logics (in fact, a similar semantics was defined a couple of years earlier by Kanger; then, in the same year, similar semantics were given by Hintikka and also Guillaume).

Possible worlds
The basis for Kripke semantics is a universe of possible worlds. Formulas are evaluated in the so-called actual world. Other worlds, alternatives to the actual world, reflect possible situations.

Lecture 07 Slide 21 of 31 Modal logics Lecture 07

Kripke semantics for modal logics

Semantics of modalities
Modalities □ and ♦ have the following intuitive meaning:

  • formula □A holds in a given world w if A holds in all worlds alternative to w
  • formula ♦A holds in a given world w if A holds in some world alternative to w.

[Diagram: a world satisfying □A, with all its successor worlds satisfying A; a world satisfying ♦A, with at least one successor world satisfying A.]

Lecture 07 Slide 22 of 31 Modal logics Lecture 07

Example

Consider the situation where one tosses a coin twice. A Kripke structure describing all possible situations:

[Diagram: a tree rooted at S; S branches to H and T, and each of these branches to H and T again.]

  H = “head”, T = “tail”, S = “start”

1 does ♦T hold?
2 does □H hold?
3 does ♦♦H hold?

Lecture 07 Slide 23 of 31 Modal logics Lecture 07

Example

Consider:

[Diagram: a tree rooted at S; S branches to two H worlds, each of which branches to H and T.]

Again:

1 does ♦T hold?
2 does □H hold?
3 does ♦♦H hold?

Lecture 07 Slide 24 of 31

slide-58
SLIDE 58

Modal logics Lecture 07

Formal definition of Kripke semantics

Kripke structures

  • By a Kripke frame we mean any relational structure of the form ⟨W, R⟩, where W is any set and R is a binary relation on W, called an accessibility relation. Elements of W are called worlds.
  • By a Kripke structure we mean a triple ⟨K, w, v⟩, where
      • K = ⟨W, R⟩ is a Kripke frame
      • w ∈ W is the actual world
      • v is a mapping, v : F × W → {t, f}, assigning truth values to propositional variables in worlds

Lecture 07 Slide 25 of 31 Modal logics Lecture 07

Formal definition of Kripke semantics

Semantics

  • the semantics of the classical connectives is as usual, where the valuation of propositional variables is given by the actual world of a given Kripke structure
  • ⟨K, w, v⟩ ⊨M □α iff for any w′ such that R(w, w′) holds, we have ⟨K, w′, v⟩ ⊨M α, where R is the relation of the Kripke frame K (i.e., ∀w′[R(w, w′) → α(w′)])
  • ⟨K, w, v⟩ ⊨M ♦α iff there is w′ such that R(w, w′) and ⟨K, w′, v⟩ ⊨M α, where R is the relation of the Kripke frame K (i.e., ∃w′[R(w, w′) ∧ α(w′)]).

Remark
Notice a strong similarity of description logics to modal logics(!)
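The clauses above can be turned into a small model checker. The sketch below (Python; the tuple encoding of formulas and the world names are my own illustration) evaluates □ and ♦ on the coin-tossing structure from the earlier example:

```python
# A minimal Kripke-model checker for the coin tossed twice:
# S -> H1, T1; H1 -> HH, HT; T1 -> TH, TT (names are illustrative).

def holds(formula, w, R, v):
    """Evaluate a modal formula at world w.
    Formulas: ('var', p) | ('not', f) | ('and', f, g) | ('box', f) | ('dia', f)."""
    op = formula[0]
    if op == 'var':
        return formula[1] in v[w]
    if op == 'not':
        return not holds(formula[1], w, R, v)
    if op == 'and':
        return holds(formula[1], w, R, v) and holds(formula[2], w, R, v)
    if op == 'box':   # true iff the subformula holds in ALL successors of w
        return all(holds(formula[1], u, R, v) for u in R.get(w, []))
    if op == 'dia':   # true iff the subformula holds in SOME successor of w
        return any(holds(formula[1], u, R, v) for u in R.get(w, []))
    raise ValueError(op)

R = {'S': ['H1', 'T1'], 'H1': ['HH', 'HT'], 'T1': ['TH', 'TT']}
v = {'S': set(), 'H1': {'H'}, 'T1': {'T'},
     'HH': {'H'}, 'HT': {'T'}, 'TH': {'H'}, 'TT': {'T'}}

print(holds(('dia', ('var', 'T')), 'S', R, v))           # ♦T at S: True
print(holds(('box', ('var', 'H')), 'S', R, v))           # □H at S: False
print(holds(('dia', ('dia', ('var', 'H'))), 'S', R, v))  # ♦♦H at S: True
```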

Lecture 07 Slide 26 of 31 Modal logics Lecture 07

Correspondences in Kripke semantics

What Kripke frames validate modal axioms?

  Formula    Property of R
  D          ∀x∃y[R(x, y)]  (seriality)
  T          ∀x[R(x, x)]  (reflexivity)
  4          ∀x, y, z[(R(x, y) ∧ R(y, z)) → R(x, z)]  (transitivity)
  E          ∀x, y, z[(R(x, y) ∧ R(x, z)) → R(y, z)]  (Euclidicity)
  B          ∀x, y[R(x, y) → R(y, x)]  (symmetry)
  G          ∀x, y, z[(R(x, y) ∧ R(x, z)) → ∃w(R(y, w) ∧ R(z, w))]  (directedness)
  ♦α → □α    ∀x, y, z[(R(x, y) ∧ R(x, z)) → y = z]  (R is a partial function)
  ♦α ≡ □α    ∀x∃!y[R(x, y)]  (R is a function)
  □□α → □α   ∀x, y[R(x, y) → ∃z[R(x, z) ∧ R(z, y)]]  (density)
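One row of the table can be checked empirically: on a reflexive frame, axiom T (□A → A) holds at every world under every valuation. A sketch (the concrete three-world frame is an arbitrary illustration):

```python
# Check that reflexivity validates T on a small frame by brute force:
# enumerate all valuations of a single variable A over the worlds.
from itertools import product

W = ['w1', 'w2', 'w3']
R = {('w1', 'w1'), ('w2', 'w2'), ('w3', 'w3'),   # reflexive loops
     ('w1', 'w2'), ('w2', 'w3')}

def box(A_true, w):
    # □A at w: A holds in every R-successor of w
    return all(u in A_true for (x, u) in R if x == w)

ok = True
for bits in product([False, True], repeat=len(W)):   # all valuations of A
    A_true = {w for w, b in zip(W, bits) if b}
    for w in W:                                      # T must hold everywhere
        if box(A_true, w) and w not in A_true:
            ok = False
print(ok)  # True: on a reflexive frame, □A → A never fails
```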

Lecture 07 Slide 27 of 31 Modal logics Lecture 07

How to specify models?

1 Decide what the worlds are, what they correspond to and what their contents are.
2 Decide where the accessibility relation comes from (what the “arrows” reflect).

Example
Assume we want to investigate where one can be 5 minutes from now. Then worlds describe possible places and arrows correspond to possibilities of moving in 5 minutes.

Lecture 07 Slide 28 of 31

slide-59
SLIDE 59

Modal logics Lecture 07

Exercises

1 Design a logic for formalizing reasoning about perception, where □A is intended to mean that “an event satisfying formula A is being observed”.
2 Propose a deontic logic (for the “obligatory” – “allowed” modalities).
3 Characterize a commonsense implication.

Lecture 07 Slide 29 of 31 Modal logics Lecture 07

Multimodal logics

In many applications one needs to introduce many modalities. In such a case we have multimodal logics.

Examples
What modalities appear in the following sentences:

1 “When it rains one ought to take an umbrella”.
2 “It is necessary that anyone should be allowed to have the right to vote”.
3 “If you know what I believe, then you might be wiser than myself”.

Lecture 07 Slide 30 of 31 Modal logics Lecture 07

Syntax and semantics

We extend the language by modalities □1, . . . , □k and their duals ♦1, . . . , ♦k. The semantics is given by Kripke structures with accessibility relations R1, . . . , Rk providing the semantics of the modalities.

“It is known that people having income are obliged to pay taxes”
This sentence translates into

  Known1(hasIncome → Obligatory2(paysTaxes))

having the semantics (with x as the actual world):

  ∀y[R1(x, y) → (hasIncome(y) → ∀z[R2(y, z) → paysTaxes(z)])].

Lecture 07 Slide 31 of 31 Many-valued logics

CUGS: Logic II

Lecture 08

Lecture 08 Slide 1 of 28

slide-60
SLIDE 60

Many-valued logics Lecture 08

The problem

1 Incomplete knowledge: “the temperature in Paris is now 12◦C” – t or f?
  — it is t or f, but do we know the value? If not, which one is to be accepted in practical reasoning?
2 Inaccurate knowledge: “my eyes are grey” – t or f?
  — If uncertain, one can assume neither t nor f.
3 Nonsense: “Colorless green ideas sleep furiously” – t or f?
  — Classical logic would have to say either t or f, but maybe we find it nonsense (as Noam Chomsky did)?
4 Meaning: “Åitä kaï ellåt” — is this expression t or f (or maybe meaningless)?

Lecture 08 Slide 2 of 28 Many-valued logics Lecture 08

3-valued logic of Łukasiewicz Ł3 (1920)

Truth values and connectives

  • Logical values: t, f, n (“neutral”).
  • Connectives: ¬, →, ↔, ∨, ∧.

Examples

1 Voting is basically three-valued:

t (I do support), f (I do not support) or n (I abstain).

2 Do I support a given political party? 3 Do I like certain types of cars, certain sounds, tastes or certain

colors?

Lecture 08 Slide 3 of 28 Many-valued logics Lecture 08

Truth tables for L3

The semantics of many-valued logics is usually given by truth tables. However, reasoning is usually supported by proof systems.

Negation (the same in most three-valued logics)

   A | f  n  t
  ¬A | t  n  f

Lecture 08 Slide 4 of 28 Many-valued logics Lecture 08

Truth tables for L3

Disjunction and conjunction

  ∨ | f  n  t        ∧ | f  n  t
  f | f  n  t        f | f  f  f
  n | n  n  t        n | f  n  n
  t | t  t  t        t | f  n  t

Implication and equivalence

  → | f  n  t        ↔ | f  n  t
  f | t  t  t        f | t  n  f
  n | n  t  t        n | n  t  n
  t | f  n  t        t | f  n  t
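The four tables follow from simple closed forms once truth values are encoded numerically (a standard presentation; the encoding f = 0, n = 1/2, t = 1 is an assumption of this sketch):

```python
# Ł3 connectives via the numeric encoding f = 0, n = 1/2, t = 1.
f, n, t = 0.0, 0.5, 1.0

def neg(a):     return 1 - a
def lor(a, b):  return max(a, b)
def land(a, b): return min(a, b)
def imp(a, b):  return min(1, 1 - a + b)   # Łukasiewicz implication
def iff(a, b):  return 1 - abs(a - b)

# Spot-check the tables: n → n = t, t → n = n, f ↔ n = n.
print(imp(n, n), imp(t, n), iff(f, n))   # 1.0 0.5 0.5
```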

Lecture 08 Slide 5 of 28

slide-61
SLIDE 61

Many-valued logics Lecture 08

Reasoning in L3

Examples

1 Suppose that John neither likes nor dislikes orange juice and likes apple juice. What is the truth value of the sentences:

  1 “John likes orange juice or apple juice”?
  2 “John likes orange juice and apple juice”?

2 Is the following sentence true: “John likes orange juice and apple juice, therefore he likes orange juice”?

Lecture 08 Slide 6 of 28 Many-valued logics Lecture 08

Reasoning in L3

Can neutrality model indefiniteness?
Consider the sentence: “Eve will vote for party XYZ provided that the party will not change its current policy”. This sentence can be formulated as the implication

  noChange → EveWillVote.    (1)

Suppose that at the moment it is undetermined whether the sentences noChange and EveWillVote are t or f. Using Ł3 we would have to assume that the value of implication (1) is t. However, it should stay undetermined, since noChange might eventually turn out t while, for some reason, EveWillVote might turn out f.

Lecture 08 Slide 7 of 28 Many-valued logics Lecture 08

Wajsberg’s axiomatization of Ł3

  • axioms:
    1 A → (B → A)
    2 (A → B) → ((B → C) → (A → C))
    3 (¬B → ¬A) → (A → B)
    4 ((A → ¬A) → A) → A
  • rule: A, A → B ⊢ B.

Theorem
Wajsberg’s axiomatization is sound and complete for Ł3.

Lecture 08 Slide 8 of 28 Many-valued logics Lecture 08

Kripke-like semantics for L3

Definition Kripke structure for L3 is a pair {0, 1, 2}, ≤, where ≤ is the standard ordering on numbers 0, 1, 2. Assume that the consequence relation is defined on atoms (i.e., the meaning of x A is given when A is a propositional variable) and that always satisfies condition: x A and x ≤ y implies y A. Semantics of L3 is then defined by:

  • x ¬A iff 2 − x A
  • x A ∨ B iff x A or x B
  • x A ∧ B iff x A and x B.

Lecture 08 Slide 9 of 28

slide-62
SLIDE 62

Many-valued logics Lecture 08

Ł3: some facts

Fact
  • ¬(A ∧ B) = (¬A ∨ ¬B)
  • ¬(A ∨ B) = (¬A ∧ ¬B)
  • (A → B) = (¬B → ¬A)
  • A ∨ B = (A → B) → B
  • A ∧ B = ¬(¬A ∨ ¬B)

The following do not hold:
  • (A ∨ ¬A) = t
  • (A ∧ ¬A) = f
  • (A → B) = (¬A ∨ B)

Lecture 08 Slide 10 of 28 Many-valued logics Lecture 08

Three-valued logic of Kleene K3 (strong Kleene logic, 1952)

Truth values and connectives

  • Logical values: t, f, in (“indefinite”).
  • Connectives: ¬, →, ↔, ∨, ∧.

Examples

1 In the reals, what is the truth value of the formula √−2 = 1?
2 Some programs loop forever. What are the properties of their final states?

3 What is the value of formula brother(Null, John), when

Null represents a missing value in a database table?

4 A robot asked about a certain situation might temporarily

answer in (not yet determined).

Lecture 08 Slide 11 of 28 Many-valued logics Lecture 08

Truth tables for K3

Disjunction and conjunction

  ∨  | f   in  t        ∧  | f  in  t
  f  | f   in  t        f  | f  f   f
  in | in  in  t        in | f  in  in
  t  | t   t   t        t  | f  in  t

Implication and equivalence

  →  | f   in  t        ↔  | f   in  t
  f  | t   t   t        f  | t   in  f
  in | in  in  t        in | in  in  in
  t  | f   in  t        t  | f   in  t
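Under the same numeric encoding as for Ł3 (f = 0, in = 1/2, t = 1, an assumption of this sketch), the only table where strong Kleene departs from Łukasiewicz is implication:

```python
# K3 vs Ł3 implication; in K3, a → b is defined as ¬a ∨ b,
# so in → in = in, whereas Ł3 makes it t.
def k3_imp(a, b):
    return max(1 - a, b)        # Kleene: ¬a ∨ b

def l3_imp(a, b):
    return min(1, 1 - a + b)    # Łukasiewicz

print(k3_imp(0.5, 0.5), l3_imp(0.5, 0.5))   # 0.5 1 -- the two logics disagree here
```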

Lecture 08 Slide 12 of 28 Many-valued logics Lecture 08

Three-valued logic of Bočvar (1939)

Truth values and connectives

  • Logical values: t, f, ns (“nonsense”).
  • Connectives: ¬, →, ↔, ∨, ∧.

Remark
In natural language ns often means f. This meaning should not be mixed with the concept of ns as understood here.

Lecture 08 Slide 13 of 28

slide-63
SLIDE 63

Many-valued logics Lecture 08

Truth tables for the logic of Bočvar

Disjunction and conjunction (ns is “infectious”)

  ∨  | f   ns  t        ∧  | f   ns  t
  f  | f   ns  t        f  | f   ns  f
  ns | ns  ns  ns       ns | ns  ns  ns
  t  | t   ns  t        t  | f   ns  t

Implication and equivalence

  →  | f   ns  t        ↔  | f   ns  t
  f  | t   ns  t        f  | t   ns  f
  ns | ns  ns  ns       ns | ns  ns  ns
  t  | f   ns  t        t  | f   ns  t

Lecture 08 Slide 14 of 28 Many-valued logics Lecture 08

Three-valued logic of Halldén (1949)

Truth values and connectives

  • Logical values: t, f, m (“meaningless”).
  • Connectives: ¬, ⊢, ∧ (⊢ α stands for “α is meaningful”).

Other connectives

  • α ∨ β =def ¬(¬α ∧ ¬β)
  • α → β =def ¬α ∨ β
  • α ↔ β =def (α → β) ∧ (β → α)
  • −α =def ¬⊢α  (−α stands for “α is meaningless”).

Lecture 08 Slide 15 of 28 Many-valued logics Lecture 08

Three-valued logic of Halldén

Examples

1 Many sentences are meaningful only in a certain context, e.g.,

what is the meaning of “He did it.”? We know that somebody has done something, but who, when, what?

2 Classical tautologies often provide no “real” meaning, e.g.,

“If whenever you do not eat then you eat, then you eat.”

3 Sometimes a query is meaningless, when it refers to

concepts/tables which are not understood by a database/web service.

Lecture 08 Slide 16 of 28 Many-valued logics Lecture 08

Truth tables for the logic of Hallden

Meaningfulness

   α | f  m  t
  ⊢α | t  f  t

Conjunction

  ∧ | f  m  t
  f | f  m  f
  m | m  m  m
  t | f  m  t

Lecture 08 Slide 17 of 28

slide-64
SLIDE 64

Many-valued logics Lecture 08

External negation

The negation considered so far is called the internal negation. The external negation ∼ is defined by the table:

   A | f  u  t
  ∼A | t  t  f

Example
Consider “There is a car to the right” (abbreviated further by P). Assume that we know that P is not t. Thus P is f or u, i.e., f ∨ u, which usually is u (at least in all the logics we have considered). According to the CWA, when we query the database about ∼P in the given circumstances, we receive the answer t. Observe that actually P is u, thus ∼P = ∼u = t, as it should be.

Lecture 08 Slide 18 of 28 Many-valued logics Lecture 08

Many-valued logics

Three-valued logics can be generalized to deal with more truth values. Systems of many-valued logics are usually defined by fixing:

  • the set of truth values
  • the truth degree functions which interpret the propositional connectives
  • the semantical interpretation of the quantifiers
  • the designated truth values, which act as substitutes for the truth value t (a formula is considered valid under some interpretation I iff it has a designated truth value under I).

Lecture 08 Slide 19 of 28 Many-valued logics Lecture 08

Many-valued logics

Examples of well-known many-valued logics

1 Logics of Łukasiewicz (1920).
2 Logics of Post (1921).
3 Logics of Gödel (1932).
4 Logics of Kleene (1952).
5 Fuzzy logic of Zadeh (1965).
6 Logic of Belnap (1977).

Lecture 08 Slide 20 of 28 Many-valued logics Lecture 08

Many-valued logic of Łukasiewicz and Tarski (1930)

Łukasiewicz and Tarski generalized Ł3 by allowing valuations to take any value in [0, 1]. Let PV be the set of propositional variables. An Ł-valuation is a mapping e : PV → [0, 1]. It is extended to all formulas:

  • e(¬A) =def 1 − e(A)
  • e(A ∨ B) =def max[e(A), e(B)]
  • e(A ∧ B) =def min[e(A), e(B)]
  • e(A → B) =def 1 when e(A) ≤ e(B); 1 − e(A) + e(B) otherwise
  • e(A ≡ B) =def 1 when e(A) = e(B); 1 − e(B) + e(A) when e(A) < e(B); 1 − e(A) + e(B) otherwise.
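The clauses above can be written as a recursive evaluator (a sketch; the tuple encoding of formulas is my own illustration):

```python
# Evaluator for the [0,1]-valued Łukasiewicz-Tarski semantics.
def e(formula, val):
    op = formula[0]
    if op == 'var':  return val[formula[1]]
    if op == 'not':  return 1 - e(formula[1], val)
    if op == 'or':   return max(e(formula[1], val), e(formula[2], val))
    if op == 'and':  return min(e(formula[1], val), e(formula[2], val))
    if op == 'imp':
        a, b = e(formula[1], val), e(formula[2], val)
        return 1 if a <= b else 1 - a + b
    if op == 'equ':
        a, b = e(formula[1], val), e(formula[2], val)
        return 1 - abs(a - b)   # collapses the three cases of the definition
    raise ValueError(op)

val = {'p': 0.3, 'q': 0.8}
print(e(('imp', ('var', 'p'), ('var', 'q')), val))  # 1 (since 0.3 <= 0.8)
print(e(('imp', ('var', 'q'), ('var', 'p')), val))  # 1 - 0.8 + 0.3, i.e. 0.5 up to float rounding
```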

Lecture 08 Slide 21 of 28

slide-65
SLIDE 65

Many-valued logics Lecture 08

Many-valued logic of Łukasiewicz and Tarski (1930)

Logics Łn, Łℵ0 and Łℵ

  • a formula A is a tautology of Łn iff e(A) = 1 for every Ł-valuation e : PV → {m/(n − 1) : 0 ≤ m ≤ n − 1}
  • a formula A is a tautology of Łℵ0 iff e(A) = 1 for every Ł-valuation e : PV → {r ∈ [0, 1] : r is a rational number}
  • a formula A is a tautology of Łℵ iff e(A) = 1 for every Ł-valuation e.

Lecture 08 Slide 22 of 28 Many-valued logics Lecture 08

Many-valued logic of Łukasiewicz and Tarski (1930)

Fact
1 A ∨ B ≡ (A → B) → B
2 A ∧ B ≡ ¬(¬A ∨ ¬B).

Theorem
1 The set of tautologies of Ł2 is that of classical propositional logic.
2 Łn+1 ≠ Łn.
3 Łℵ0 = Łℵ ⊆ Łn, for any n.

Lecture 08 Slide 23 of 28 Many-valued logics Lecture 08

Many-valued logic of Gödel (1932)

Logics of Gödel, Gn and G∞
Defined by taking either a finite set of reals within [0, 1],

  Wn =def {m/(n − 1) : 0 ≤ m ≤ n − 1},

or the whole interval [0, 1] as the truth degree set. The degree 1 is the only designated truth degree.

Lecture 08 Slide 24 of 28 Many-valued logics Lecture 08

Many-valued logic of Gödel

Definition
Let PV be the set of propositional variables. Any valuation e : PV → Wn or e : PV → [0, 1] is extended to all formulas by setting:

  • e(¬A) =def 1 when e(A) = 0; 0 otherwise
  • e(A ∨ B) =def max[e(A), e(B)]
  • e(A ∧ B) =def min[e(A), e(B)]
  • e(A → B) =def 1 when e(A) ≤ e(B); e(B) otherwise
  • e(A ≡ B) =def 1 when e(A) = e(B); min{e(A), e(B)} otherwise.
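For comparison with the Łukasiewicz clauses, the Gödel connectives can be sketched directly (the function names are illustrative):

```python
# Goedel connectives on [0,1]; negation is crisp, implication "drops" to e(B).
def g_neg(a):    return 1.0 if a == 0 else 0.0
def g_imp(a, b): return 1.0 if a <= b else b
def g_equ(a, b): return 1.0 if a == b else min(a, b)

# Unlike Lukasiewicz, Goedel implication ignores HOW FAR e(A) exceeds e(B):
print(g_imp(0.9, 0.3), g_imp(0.4, 0.3))   # 0.3 0.3
print(g_imp(0.3, 0.3))                    # 1.0
```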

Lecture 08 Slide 25 of 28

slide-66
SLIDE 66

Many-valued logics Lecture 08

Many-valued logics versus modal logics

Remarks

  • The idea of Łukasiewicz was that three-valued logics can substitute modal logics.
  • In general, one cannot do that (for example, S5 cannot be characterized by any finitely many-valued logic).
  • There are many-valued modal logics: worlds may assign to propositions more than two truth values. Also, accessibility relations may be many-valued.

Lecture 08 Slide 26 of 28 Many-valued logics Lecture 08

Many-valued logics versus modal logics

The idea of Łukasiewicz
Łukasiewicz considered the following table appropriate for characterizing possibility and necessity:

  A | ♦A | □A (= ¬♦¬A)
  f | f  | f
  n | t  | f
  t | t  | t

Lecture 08 Slide 27 of 28 Many-valued logics Lecture 08

Many-valued modal logics

The idea

1 many-valued worlds, reflecting “problematic” information as to propositions
2 many-valued accessibility relation, reflecting “problematic” information as to accessibility of worlds from a given world.

Example
Assume that a situation is analyzed in time intervals of 5 minutes. The accessibility relation reflects what may happen in 5 minutes from the point of view of a given world.

  • the location of a given object in 5 minutes from “now” might not be determined
  • it might be unsure whether one can reach a particular world from the actual world within 5 minutes.

Lecture 08 Slide 28 of 28 Fuzzy reasoning

CUGS: Logic II

Lecture 09

Lecture 09 Slide 1 of 37

slide-67
SLIDE 67

Fuzzy reasoning Lecture 09

The problem

How to express vague concepts and reason with them? For example, consider the following rule supplied by a user/domain expert: If the water level in a tank is high then the tank’s valve is to be opened widely. How can it be used in controllers/software systems?

Lecture 09 Slide 2 of 37 Fuzzy reasoning Lecture 09

Fuzzy logics

The idea
In the classical sense, set membership is two-valued (an element belongs or does not belong to a set). We can then consider any set, say A, via its membership function A : Dom → {t, f}. Recall that Łukasiewicz and Tarski (1930) and later Gödel (1932) proposed to generalize the membership function to be A : Dom → [0, 1]. Intuitively, A provides the degree to which elements belong to the set represented by A. In 1965 this idea was rediscovered and further developed by Zadeh, who gave it the name “fuzzy sets”.

Lecture 09 Slide 3 of 37 Fuzzy reasoning Lecture 09

Example

Denote by highSpeed the set consisting of high speeds of a car on a highway. Then, e.g.,

  • highSpeed(3 km/h) = 0
  • highSpeed(110 km/h) = 0.72
  • highSpeed(160 km/h) = 1

What would be an intuitive classical definition (with a two-valued membership function) of highSpeed?

Lecture 09 Slide 4 of 37 Fuzzy reasoning Lecture 09

Operations and inclusion of fuzzy sets

Complement, union, intersection, inclusion
Let A and B be fuzzy sets (given by their membership functions). The membership function is extended to operations on sets and to set inclusion as follows (other definitions are possible, but those below are the most common):

  • for all x ∈ Dom, (−A)(x) =def 1 − A(x)
  • for all x ∈ Dom, (A ∪ B)(x) =def max{A(x), B(x)}
  • for all x ∈ Dom, (A ∩ B)(x) =def min{A(x), B(x)}
  • A ⊆ B =def for all x ∈ Dom, A(x) ≤ B(x).

Lecture 09 Slide 5 of 37

slide-68
SLIDE 68

Fuzzy reasoning Lecture 09

Example

Consider sets of young persons and tall persons, which are subsets of {Anne, Eve, Jack, Mary, Steve}:

  person   Anne  Eve  Jack  Mary  Steve
  young    0.6   0.5  0.7   0.4   0.9
  tall     0.4   0.7  0.6   0.7   0.8

Then:

  person        Anne  Eve  Jack  Mary  Steve
  young ∪ tall  0.6   0.7  0.7   0.7   0.9
  young ∩ tall  0.4   0.5  0.6   0.4   0.8
  −tall         0.6   0.3  0.4   0.3   0.2
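The tables above can be reproduced by representing fuzzy sets as dictionaries from elements to degrees (a sketch; the function names are illustrative):

```python
# Fuzzy sets as dicts; operations follow the max/min/complement definitions.
young = {'Anne': 0.6, 'Eve': 0.5, 'Jack': 0.7, 'Mary': 0.4, 'Steve': 0.9}
tall  = {'Anne': 0.4, 'Eve': 0.7, 'Jack': 0.6, 'Mary': 0.7, 'Steve': 0.8}

def f_union(A, B):     return {x: max(A[x], B[x]) for x in A}
def f_intersect(A, B): return {x: min(A[x], B[x]) for x in A}
def f_complement(A):   return {x: 1 - A[x] for x in A}
def f_subset(A, B):    return all(A[x] <= B[x] for x in A)

print(f_union(young, tall)['Eve'])                 # 0.7
print(f_intersect(young, tall)['Anne'])            # 0.4
print(round(f_complement(tall)['Steve'], 2))       # 0.2
print(f_subset(young, tall))                       # False (Anne: 0.6 > 0.4)
```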

Lecture 09 Slide 6 of 37 Fuzzy reasoning Lecture 09

Example

Suppose we have a choice of five meals in a restaurant. Assume further that the fuzzy sets “tasty” and “expensive” are given by

  meal       1    2    3    4    5
  tasty      0.3  0.8  0.7  0.4  0.8
  expensive  0.2  0.6  0.9  0.6  0.9

Then we have (notice some drawbacks in the results):

  meal                       1    2    3    4    5
  inexpensive (−expensive)   0.8  0.4  0.1  0.4  0.1
  tasty and (∩) inexpensive  0.3  0.4  0.1  0.4  0.1

Lecture 09 Slide 7 of 37 Fuzzy reasoning Lecture 09

Example – continued

Consider an alternative definition intersection:

  • for all x ∈ Dom, (A ∩ B)(x) def

= A(x) ∗ B(x) With the new definitions we have: meal 1 2 3 4 5 tasty and (∩) inexpensive 0.24 0.32 0.07 0.16 0.08 which is much better. Remark The choice of definitions is then application-dependent. The definition of union, fitting the new definition of intersection may be:

  • for all x ∈ Dom, (A ∪ B)(x) def

= A(x) + B(x) − A(x) ∗ B(x).
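The product-based intersection and its matching (probabilistic-sum) union can be checked against the meals data (a sketch):

```python
# Product t-norm intersection and probabilistic-sum union on the meals example.
tasty     = {1: 0.3, 2: 0.8, 3: 0.7, 4: 0.4, 5: 0.8}
expensive = {1: 0.2, 2: 0.6, 3: 0.9, 4: 0.6, 5: 0.9}

def prod_intersect(A, B): return {x: A[x] * B[x] for x in A}
def prod_union(A, B):     return {x: A[x] + B[x] - A[x] * B[x] for x in A}

inexpensive = {x: 1 - expensive[x] for x in expensive}
good_deal = prod_intersect(tasty, inexpensive)
print(round(good_deal[1], 2), round(good_deal[3], 2))   # 0.24 0.07
```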

Lecture 09 Slide 8 of 37 Fuzzy reasoning Lecture 09

Linguistic variables

Linguistic variables take verbal values such as “hot”, “moderately hot”, “cold”. The expressions “temperature is hot”, “temperature is cold” are called fuzzy propositions and are assigned values reflecting their truth degrees.

Example

  IF temperature IS very cold THEN stop fan
  IF temperature IS cold THEN turn down fan
  IF temperature IS normal THEN maintain level
  IF temperature IS hot THEN speed up fan.

Lecture 09 Slide 9 of 37

slide-69
SLIDE 69

Fuzzy reasoning Lecture 09

Linguistic modifiers

Definition
A linguistic modifier is an operation that modifies the meaning of a term. From the perspective of fuzzy sets, linguistic modifiers are operations on fuzzy sets.

Examples
“very”, “extremely”, “slightly”, ...

Lecture 09 Slide 10 of 37 Fuzzy reasoning Lecture 09

Representing modifiers

“Very”

The modifier “very”, e.g., transforms the statement “Jack is young” to “Jack is very young.” The modifier “very” is usually defined by

  very A(x) =def A(x)².

If young(Jack) = 0.7 then:

  • very young(Jack) = 0.7² = 0.49
  • very very young(Jack) = 0.49² = 0.2401.

Lecture 09 Slide 11 of 37 Fuzzy reasoning Lecture 09

Representing modifiers

“Extremely”, “slightly”

  • The modifier “extremely” can be defined by extremely A(x) =def A(x)³.
    E.g., extremely young(Jack) = 0.7³ = 0.343.
  • The modifier “slightly” can be defined by slightly A(x) =def √A(x).
    E.g., slightly young(Jack) = √0.7 ≈ 0.837.
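The three modifiers are simple pointwise transforms of a membership degree (a sketch reproducing the numbers above; results are rounded to sidestep float noise):

```python
# "very", "extremely" and "slightly" as pointwise transforms.
import math

def very(a):      return a ** 2
def extremely(a): return a ** 3
def slightly(a):  return math.sqrt(a)

young_jack = 0.7
print(round(very(young_jack), 2))          # 0.49
print(round(very(very(young_jack)), 4))    # 0.2401
print(round(extremely(young_jack), 3))     # 0.343
print(round(slightly(young_jack), 3))      # 0.837
```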

Lecture 09 Slide 12 of 37 Fuzzy reasoning Lecture 09

Fuzzy reasoning

The idea
In fuzzy reasoning we are interested in the truth degree of complex sentences on the basis of the truth degrees of atomic sentences. Any formula can be viewed as a set of objects satisfying the formula. As in classical logic, conjunction and disjunction reflect intersection and union on sets. The choice of implication is less obvious. One common choice is to accept Gödel’s implication. However, in applications like fuzzy controllers one frequently uses other definitions of implication.

Lecture 09 Slide 13 of 37

slide-70
SLIDE 70

Fuzzy reasoning Lecture 09

Examples

1 The formula tall(x), representing the set of tall persons, has the truth value equal to the fuzzy membership of x in the (fuzzy) set tall. E.g., tall(John) might be 0.7 (if John ∈ tall to the degree 0.7).
2 Consider the formula walk(x) ∨ cinema(x) and suppose that walk(Eve) = 0.6 and cinema(Eve) = 0.5. Then the truth degree of walk(Eve) ∨ cinema(Eve) is max{0.6, 0.5} = 0.6.

Lecture 09 Slide 14 of 37 Fuzzy reasoning Lecture 09

Solving the initial problem (partially)

Recall the rule: “If the water level in a tank is high then the tank’s valve is opened widely”. Assume that if the tank is filled in p% then the water level is high to degree p/100, and that the same characteristic applies to the term “widely”. The sentence has the pattern A → B, where

  • A stands for “the water level in the tank is high”
  • B stands for “the tank’s valve is opened widely”.

Lecture 09 Slide 15 of 37 Fuzzy reasoning Lecture 09

Solving the initial problem (partially)

Suppose that:

  • the tank is filled in 60% (i.e., A holds to the degree 0.6)
  • the valve is opened in 50% (i.e., B holds to the degree 0.5).

Then, assuming Gödel’s implication,

  e(A → B) =def 1 when e(A) ≤ e(B); e(B) otherwise,

A → B is true to the degree 0.5. In the case when

  • the tank is filled in 50% (i.e., A holds to the degree 0.5)
  • the valve is opened in 60% (i.e., B holds to the degree 0.6),

then (with Gödel’s implication) A → B is true to the degree 1.0.
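Both scenarios can be checked with Gödel’s implication written as a function:

```python
# Goedel implication applied to the two tank scenarios above.
def godel_imp(a, b):
    return 1.0 if a <= b else b

print(godel_imp(0.6, 0.5))   # tank 60% full, valve 50% open -> rule true to 0.5
print(godel_imp(0.5, 0.6))   # tank 50% full, valve 60% open -> rule true to 1.0
```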

Lecture 09 Slide 16 of 37 Fuzzy reasoning Lecture 09

Some problems

  • How to choose operations on fuzzy sets?
  • Fuzzification: how to provide definitions of input concepts?
  • How to provide rules and interpret them?
  • Defuzzification: how to interpret the result of reasoning?

Lecture 09 Slide 17 of 37


Fuzzy reasoning Lecture 09

Fuzzification

Definition Fuzzification is the process of changing a real scalar value into a fuzzy value. This is achieved with different types of fuzzifiers. Fuzzification of a real-valued variable is based on intuition, experience and analysis of the set of rules and conditions associated with the input data variables. There is no fixed set of procedures for fuzzification, but there are typical techniques.

Lecture 09 Slide 18 of 37 Fuzzy reasoning Lecture 09

Fuzzification techniques

Triangular fuzzifiers (support [L, R], membership rising linearly to 1 at the center C):

f(x) def=
  0                           when x < L or x > R
  1 − |C − x| / ((R − L)/2)   when L ≤ x ≤ R.
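A minimal sketch of the triangular fuzzifier, with C taken as the midpoint of [L, R] (as the normalizer (R − L)/2 in the formula presumes); the example values are hypothetical:

```python
# Triangular fuzzifier: 0 outside [L, R], peak 1 at the center C = (L + R)/2.
def triangular(x, L, R):
    C = (L + R) / 2.0
    if x < L or x > R:
        return 0.0
    return 1.0 - abs(C - x) / ((R - L) / 2.0)

# Membership of "close to 20" modeled with L = 10, R = 30 (so C = 20):
print(triangular(20, 10, 30))  # 1.0 at the center
print(triangular(15, 10, 30))  # 0.5 halfway down the slope
print(triangular(35, 10, 30))  # 0.0 outside the support
```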

Lecture 09 Slide 19 of 37 Fuzzy reasoning Lecture 09

Fuzzification techniques - triangular fuzzifiers

Example Consider a PID (Proportional-Integral-Derivative) controller. A PID controller calculates an “error” value as the difference between a measured process variable and a desired value. The controller attempts to minimize the error by adjusting the process control inputs. PID controllers are based on a model involving integrals and derivatives. On the other hand, we have basically three rules here:

If INPUT is as required, then DO NOTHING. If INPUT is TOO LOW, then INCREASE. If INPUT is TOO HIGH, then DECREASE.

Lecture 09 Slide 20 of 37 Fuzzy reasoning Lecture 09

Fuzzification techniques - triangular fuzzifiers

Example – continued Controllers are used in industry to regulate temperature, pressure, flow rate, chemical composition, speed and practically every other variable for which a measurement and a regulator exists. Typical examples:

  • automobile cruise control
  • water level control.

Here we deal with the concept of being close to a given fixed value and triangular fuzzifiers are well-justified.

Lecture 09 Slide 21 of 37


Fuzzy reasoning Lecture 09

Fuzzification techniques

Trapezoidal fuzzifiers (support [L, R], plateau of width W centered at C):

f(x) def=
  0                         when x < L or x > R
  (x − L) / (C − W/2 − L)   when L ≤ x ≤ C − W/2
  1                         when C − W/2 < x < C + W/2
  (R − x) / (R − C − W/2)   when C + W/2 ≤ x ≤ R.
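The four cases translate directly into a sketch; the parameters shown are those used for "middle age" a few slides later:

```python
# Trapezoidal fuzzifier: support [L, R], plateau of width W centered at C
# (rising on [L, C - W/2], equal to 1 on the plateau, falling on [C + W/2, R]).
def trapezoidal(x, L, R, C, W):
    lo, hi = C - W / 2.0, C + W / 2.0   # plateau endpoints
    if x < L or x > R:
        return 0.0
    if x <= lo:
        return (x - L) / (lo - L)       # rising edge
    if x < hi:
        return 1.0                      # plateau
    return (R - x) / (R - hi)           # falling edge

# The "middle age" fuzzifier used later: L = 36, R = 50, C = 43, W = 6.
print(trapezoidal(39, 36, 50, 43, 6))  # 0.75 (age 39, on the rising edge)
print(trapezoidal(42, 36, 50, 43, 6))  # 1.0  (inside the plateau [40, 46])
```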

Lecture 09 Slide 22 of 37 Fuzzy reasoning Lecture 09

Fuzzification techniques - trapezoidal fuzzifiers

Examples

  • the concept of “middle age”
  • the concept of “medium price”
  • approximations of computationally complex functions (like probability distributions).

Example “John is in his middle age.” “People in middle age are often in midlife crisis.” “Midlife crisis often causes dramatic self-doubt.”

Lecture 09 Slide 23 of 37 Fuzzy reasoning Lecture 09

Fuzzy rules

Fuzzy systems are built to replace human experts with software agents using fuzzy logic, with rules expressed in the form of Horn clauses: IF list of fuzzy propositions THEN fuzzy proposition. Computational interpretation Assign to the consequent the value which results from applying the conjunction operator to the values assigned to the antecedents.

Lecture 09 Slide 24 of 37 Fuzzy reasoning Lecture 09

Fuzzy rules – example

Example “People in middle age are often in midlife crisis.”: (1) mc(X) :– often(ma(X)). Note the presence of the linguistic modifier “often”. One has to reduce the value of the resulting mc, e.g., by multiplying it by 0.9 (this should reflect statistical data). “Midlife crisis often causes dramatic self-doubt”: (2) dsd(X) :– often′(mc(X)).

Lecture 09 Slide 25 of 37


Fuzzy reasoning Lecture 09

Fuzzy rules – example continued

Assume that:

  • people in age [0, 36] are definitely not in their middle age, and neither are people in age [50, +∞)

  • people in age [40, 46] are surely in their middle age
  • and for the others “being in middle age” is vague.

This gives rise to the trapezoidal fuzzifier with L = 36, R = 50, C = 43, W = 6.

Lecture 09 Slide 26 of 37 Fuzzy reasoning Lecture 09

Fuzzy rules – example continued

If John is 39 years old, often(X) def= 0.9 ∗ X and often′(X) def= 0.8 ∗ X (80% of those in midlife crisis face dramatic self-doubt), then:

  • ma(John) = (x − L) / (C − W/2 − L) = (39 − 36) / (43 − 6/2 − 36) = 3/4
  • mc(John) = 0.9 ∗ ma(John) = 0.675
  • dsd(John) = 0.8 ∗ mc(John) = 0.8 ∗ 0.675 = 0.54.

The truth degree of John being in dramatic self-doubt is then 0.54. Question What if there are several rules providing different values?
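The whole chain of rules can be replayed end to end; this is a sketch reusing the trapezoidal "middle age" membership, with the modifiers 0.9 and 0.8 assumed as in the example.

```python
# Trapezoidal membership for "middle age" with the slide's parameters
# L = 36, R = 50, C = 43, W = 6 (plateau [40, 46]).
def middle_age(x, L=36.0, R=50.0, C=43.0, W=6.0):
    lo, hi = C - W / 2.0, C + W / 2.0
    if x < L or x > R:
        return 0.0
    if x <= lo:
        return (x - L) / (lo - L)
    return 1.0 if x < hi else (R - x) / (R - hi)

ma = middle_age(39)   # John, aged 39: 0.75
mc = 0.9 * ma         # rule (1): often(ma(John)) = 0.9 * 0.75 = 0.675
dsd = 0.8 * mc        # rule (2): often'(mc(John)) = 0.8 * 0.675 = 0.54
print(ma, mc, dsd)
```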

Lecture 09 Slide 27 of 37 Fuzzy reasoning Lecture 09

Defuzzification techniques

Definition Defuzzification is the process of calculating a single value on the basis of possibly many different values delivered by rules. Remark Observe that there might be rules delivering such values implicitly, when their conclusions are dependent on one another. For example, rules designed to decide how much pressure to apply might result in “decrease pressure” (0.3), “maintain pressure” (0.4), “increase pressure” (0.5). Defuzzification transforms this result into a single number indicating what action to perform. The simplest but least useful defuzzification method is to choose the highest membership. The problem is that one loses information, while the result may be used in further reasoning.

Lecture 09 Slide 28 of 37 Fuzzy reasoning Lecture 09

Defuzzification: weighted average

Definition The output value V is given by:

V def= ( ∑_{i=1}^{n} w_i ∗ V_i ) / ( ∑_{i=1}^{n} w_i ),   (3)

where n is the number of rules and, for 1 ≤ i ≤ n, w_i is the weight of the i-th rule and V_i is a value delivered by the i-th rule.

Lecture 09 Slide 29 of 37


Fuzzy reasoning Lecture 09

Example

Consider the example, where outcomes from rules have been:

  • 0.3 for “decrease pressure”
  • 0.4 for “maintain pressure”
  • 0.5 for “increase pressure”.

Note that one first has to aggregate the outcomes so that they will concern the value of a single attribute/property. We achieve the aggregation by observing that all rules contribute to the value of the attribute “pressure”.

Lecture 09 Slide 30 of 37 Fuzzy reasoning Lecture 09

Example continued

Assume that the current pressure is 4.0 atm, “decrease pressure” reduces pressure by 0.2 atm and “increase pressure” increases it by 0.4 atm. So:

  • pressure 4.0 − 0.2 = 3.8 atm is supported with strength 0.3
  • pressure 4.0 atm is supported with strength 0.4
  • pressure 4.4 atm is supported with strength 0.5.

Weights for rules are provided on the basis of their importance and strength. Assuming weights 0.35, 0.40, 0.25, by (3) the value of pressure to be achieved is

(0.35 ∗ 3.8 atm + 0.40 ∗ 4.0 atm + 0.25 ∗ 4.4 atm) / (0.35 + 0.40 + 0.25) = 4.03 atm,

meaning that pressure is to be maintained (any decrease/increase would result in a worse approximation of the desired value).
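The weighted-average computation of formula (3), applied to the pressure example:

```python
# Weighted-average defuzzification, formula (3): V = sum(w_i * V_i) / sum(w_i).
def weighted_average(weights, values):
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# The pressure example: supported values 3.8, 4.0, 4.4 atm
# with weights 0.35, 0.40, 0.25.
v = weighted_average([0.35, 0.40, 0.25], [3.8, 4.0, 4.4])
print(round(v, 2))  # 4.03 (atm)
```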

Lecture 09 Slide 31 of 37 Fuzzy reasoning Lecture 09

Monoidal t-norm logics: a more general picture

Motivation In fuzzy reasoning one uses logical connectives with substantially different semantics. The question is how they are related to each other.

T-norm (triangular norm) - the idea A t-norm generalizes intersection in a lattice and conjunction in logic. The name triangular norm reflects its use in probabilistic metric spaces, where t-norms generalize the triangle inequality of ordinary metric spaces.

Lecture 09 Slide 32 of 37 Fuzzy reasoning Lecture 09

T-norms

Definition A t-norm is a function T : [0, 1] × [0, 1] − → [0, 1] which satisfies the following properties:

  • commutativity: T(x, y) = T(y, x)
  • monotonicity: x ≤ z and y ≤ u imply T(x, y) ≤ T(z, u)
  • associativity: T(x, T(y, z)) = T(T(x, y), z)
  • 1 acts as identity element: T(x, 1) = x.

Continuity A t-norm is called continuous if it is continuous as a function, in the usual sense (similarly for left- and right-continuity).

Lecture 09 Slide 33 of 37


Fuzzy reasoning Lecture 09

T-norms: examples

  • Łukasiewicz t-norm: ⊤Luk(x, y) = max{0, x + y − 1}
  • Minimum t-norm: ⊤min(x, y) = min{x, y}, also called the Gödel t-norm
  • Product t-norm: ⊤prod(x, y) = x ∗ y
  • Nilpotent minimum: ⊤nM(x, y) = min{x, y} if x + y > 1, and 0 otherwise.
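The four t-norms are easy to state in code, and the t-norm axioms from the previous slide can be checked by brute force on a grid of sample points (a sketch, not a proof):

```python
# The four t-norms from the slide.
def t_luk(x, y):  return max(0.0, x + y - 1.0)               # Łukasiewicz
def t_min(x, y):  return min(x, y)                           # minimum (Gödel)
def t_prod(x, y): return x * y                               # product
def t_nm(x, y):   return min(x, y) if x + y > 1.0 else 0.0   # nilpotent minimum

# Brute-force check of the t-norm axioms on the grid {0.0, 0.1, ..., 1.0}.
grid = [i / 10.0 for i in range(11)]
for T in (t_luk, t_min, t_prod, t_nm):
    for x in grid:
        assert abs(T(x, 1.0) - x) < 1e-9              # 1 is the identity
        for y in grid:
            assert abs(T(x, y) - T(y, x)) < 1e-9      # commutativity
            for z in grid:
                assert abs(T(x, T(y, z)) - T(T(x, y), z)) < 1e-9  # associativity
print("t-norm axioms hold on the sample grid")
```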

Lecture 09 Slide 34 of 37 Fuzzy reasoning Lecture 09

Implications

Implications are residua For any left-continuous t-norm ⊤, there is a unique binary operation ⇒ on [0, 1] such that for all x, y, z ∈ [0, 1]:

⊤(z, x) ≤ y if and only if z ≤ (x ⇒ y).

This operation is called the residuum of the t-norm.

Basic properties of residua If ⇒ is the residuum of a left-continuous t-norm ⊤, then
  • (x ⇒ y) = sup{z | ⊤(z, x) ≤ y}
  • (x ⇒ y) = 1 if and only if x ≤ y
  • (1 ⇒ y) = y.

Lecture 09 Slide 35 of 37 Fuzzy reasoning Lecture 09

T-conorms

T-conorms: the idea T-conorms (also called S-norms) are dual to t-norms under the operation which assigns 1 − x to x (corresponding to fuzzy negation).

Definition Given a t-norm ⊤, the complementary conorm is defined by ⊥(x, y) = 1 − ⊤(1 − x, 1 − y). (This generalizes De Morgan’s laws.)

Some properties Any t-conorm is commutative, monotonic and associative, with 0 as the identity element (⊥(x, 0) = x).

Lecture 09 Slide 36 of 37 Fuzzy reasoning Lecture 09

The summary

In the following table residua are given for x > y (recall that for x ≤ y, any residuum is 1).

t-norm(x, y)                      conorm(x, y)                      residuum(x, y)
max{0, x + y − 1}                 min{x + y, 1}                     1 − x + y       (Łukasiewicz)
min{x, y}                         max{x, y}                         y               (Gödel)
x ∗ y                             x + y − x ∗ y                     y/x             (Goguen)
min{x, y} if x + y > 1, else 0    max{x, y} if x + y < 1, else 1    max{1 − x, y}   (nilpotent minimum)

Lecture 09 Slide 37 of 37


Temporal reasoning

CUGS: Logic II

Lecture 10

Lecture 10 Slide 1 of 36 Temporal reasoning Lecture 10

The problem

1 How to express and formally verify properties of reactive programs, like
  • “repeating a request will force a response”
  • “permanently holding a request will force a response”
  • “two processes will never enter a critical region”?
2 How to express natural language expressions involving tenses?
3 How to model actions and change?

Lecture 10 Slide 2 of 36 Temporal reasoning Lecture 10

Modal approach to temporal reasoning

Introduced by Prior (1957) under the name tense logic.

Interpretation of □, ♦ In temporal logics:
  • □A is interpreted as “A always holds” (denoted by Prior as G A)
  • ♦A is interpreted as “A eventually holds” (denoted by Prior as F A).

Temporal past operators Prior considered also the operators:
– H “it has always been the case that ...”
– P “it has at some time been the case that ...”.

Lecture 10 Slide 3 of 36 Temporal reasoning Lecture 10

Modal approach to temporal reasoning

Examples

  • G A → F A (“what will always be, will be”)
  • F A → FF A (“if it will be the case that A, it will be – in between – that it will be”)
  • □(failure → alarm)
  • □(requestPrint → ♦print).

Lecture 10 Slide 4 of 36


Temporal reasoning Lecture 10

Semantics

Time structures Time structures are Kripke structures, where the accessibility relation reflects the time order <. Interpretation of tense operators

  • P A is true at timepoint t iff A is true at some timepoint t′

such that t′ < t

  • F A is true at timepoint t iff A is true at some timepoint t′

such that t < t′

  • H A is true at timepoint t iff A is true at all timepoints t′

such that t′ < t

  • G A is true at timepoint t iff A is true at all timepoints t′

such that t < t′.
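Over a finite, linearly ordered approximation of a time structure the four clauses can be sketched directly (the timeline and the atoms below are hypothetical):

```python
# Tense operators over a finite, linearly ordered set of timepoints 0..n-1.
# timeline[t] is the set of atomic sentences true at timepoint t.
def P(timeline, t, atom):  # true at some earlier timepoint
    return any(atom in timeline[u] for u in range(t))

def F(timeline, t, atom):  # true at some later timepoint
    return any(atom in timeline[u] for u in range(t + 1, len(timeline)))

def H(timeline, t, atom):  # true at all earlier timepoints
    return all(atom in timeline[u] for u in range(t))

def G(timeline, t, atom):  # true at all later timepoints
    return all(atom in timeline[u] for u in range(t + 1, len(timeline)))

timeline = [{"rain"}, {"rain"}, set(), {"sun"}]
print(P(timeline, 2, "rain"))  # True: it rained at timepoints 0 and 1
print(G(timeline, 2, "sun"))   # True: "sun" holds at every later point (only 3)
print(F(timeline, 3, "sun"))   # False: there is no timepoint after 3
```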

Lecture 10 Slide 5 of 36 Temporal reasoning Lecture 10

Minimal tense logic

Axioms

  • axioms of classical propositional calculus
  • A → HF A

– “what is, has always been going to be”

  • A → GP A

– “what is, will always have been”

  • H (A → B) → (H A → H B)

– “whatever has always followed from what always has been, always has been”

  • G (A → B) → (G A → G B)

– “whatever will always follow from what always will be, always will be”

Lecture 10 Slide 6 of 36 Temporal reasoning Lecture 10

Minimal tense logic

Rules

  • modus ponens
  • from ⊢ A infer ⊢ H A
  • from ⊢ A infer ⊢ G A

Theorem Minimal tense logic is sound and complete over all structures of time.

Lecture 10 Slide 7 of 36 Temporal reasoning Lecture 10

Binary temporal operators

Kamp 1968

  • A since B – “A has been true since a time when B was true”
  • A until B – “A will be true until a time when B is true”.

Expressiveness

  • It is possible to define the unary tense operators in terms of

since and until:

  • P B ≡ t since B
  • F B ≡ t until B
  • One can express all first-order temporal properties on

continuous, strictly linear time structures (which is not true for the unary operators).

Lecture 10 Slide 8 of 36


Temporal reasoning Lecture 10

Temporal properties of programs

Invariance (safety) properties

  • □(p → □q) (all states reached by a program after a state satisfying p will satisfy q)
  • □(atFirst → p) → □(atEnd → q) (partial correctness w.r.t. conditions p and q, where the propositional variable atFirst is true only at the beginning of the specified program, while atEnd is true only when the program reaches its terminal state)
  • □(¬q ∨ ¬p) (the program cannot enter critical regions p and q simultaneously – mutual exclusion).

Lecture 10 Slide 9 of 36 Temporal reasoning Lecture 10

Temporal properties of programs

Eventuality properties

  • □(p → ♦q) (there is a program state satisfying q reached by the program after a state satisfying p)
  • □(atFirst → p) → ♦(atEnd ∧ q) (total correctness w.r.t. conditions p and q)
  • □♦p → ♦q (repeating a request p will force a response q)
  • ♦□p → ♦q (permanently holding a request p will force a response q).

Lecture 10 Slide 10 of 36 Temporal reasoning Lecture 10

Models of time

Interpretations revisited We characterize time by interpretations S, ρ, where

  • S = {s1, . . . , sn} is a set of states (worlds)
  • ρ is a binary relation on S; the intended meaning of ρ(s, s′) is that s′ is a “future” of s. We often write s′ ∈ ρ(s) to indicate that ρ(s, s′) holds.

Remark Time is usually characterized semantically. Axioms reflect the required properties. We assume linear and discrete time with states “labeled” by natural numbers and ρ being the standard order ≤ on natural numbers.

Lecture 10 Slide 11 of 36 Temporal reasoning Lecture 10

Operator “nexttime”

Assume we have a binary relation τ on states. The intended meaning of s′ ∈ τ(s) is that the state s′ occurs in the next time moment. The nexttime operator, denoted by ○, is defined by

s, τ ⊨ ○A iff there is s′ ∈ τ(s) such that s′, τ ⊨ A.

We define ρ to be the reflexive and transitive closure of τ. In such a case we have: ⊨ □p → p and ⊨ ○p → ♦p.
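With linear, discrete time a computation can be represented as an ultimately periodic sequence of states: a finite prefix followed by an endlessly repeated loop. The sketch below evaluates nexttime, always and eventually for atomic arguments on such a sequence (the prefix/loop representation and the finite inspection horizon are assumptions that suffice for atoms, since the loop repeats):

```python
# An ultimately periodic state sequence: prefix + endlessly repeated loop.
# Each state is a set of atoms true in it.
def state(prefix, loop, i):
    return prefix[i] if i < len(prefix) else loop[(i - len(prefix)) % len(loop)]

def nexttime(prefix, loop, i, atom):          # ○atom at position i
    return atom in state(prefix, loop, i + 1)

def always(prefix, loop, i, atom):            # □atom at position i
    horizon = len(prefix) + 2 * len(loop)     # covers prefix + full loop unfolding
    return all(atom in state(prefix, loop, j) for j in range(i, horizon))

def eventually(prefix, loop, i, atom):        # ♦atom at position i
    horizon = len(prefix) + 2 * len(loop)
    return any(atom in state(prefix, loop, j) for j in range(i, horizon))

prefix = [{"init"}]
loop = [{"p"}, {"p", "q"}]
print(nexttime(prefix, loop, 0, "p"))    # True: state 1 is {"p"}
print(always(prefix, loop, 1, "p"))      # True: p holds from state 1 on
print(eventually(prefix, loop, 0, "q"))  # True: q holds at state 2
```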

Lecture 10 Slide 12 of 36


Temporal reasoning Lecture 10

Models of time

Examples of correspondences between axioms and time structure

  • ⊢ □A → A – relation ρ is reflexive (recall that we now consider ≤ rather than <)
  • ⊢ □A → □□A – relation ρ is transitive
  • ⊢ ○A ≡ ¬○¬A – relation ρ is linear.

Note that the presence of ○ makes time discrete.

Lecture 10 Slide 13 of 36 Temporal reasoning Lecture 10

Sound and complete Hilbert-like proof system L for linear time propositional temporal logic

Axioms

1 any substitution instance of a valid propositional formula (PC)
2 ⊢ □(A → B) → (□A → □B) – distribution of □ over →
3 ⊢ ○(A → B) → (○A → ○B) – distribution of ○ over →
4 ⊢ □A → (A ∧ ○A ∧ ○□A) – expansion of □
5 ⊢ □(A → ○A) → (A → □A) – induction
6 ⊢ ○A ≡ ¬○¬A – linearity.

Rules Modus ponens: from ⊢ A and ⊢ A → B infer ⊢ B; generalization: from ⊢ A infer ⊢ □A.

Lecture 10 Slide 14 of 36 Temporal reasoning Lecture 10

Some derived rules

Generalization for ○: from ⊢ A → B infer ⊢ ○A → ○B. Generalization for □: from ⊢ A → B infer ⊢ □A → □B. Induction: from ⊢ A → ○A infer ⊢ A → □A.

Lecture 10 Slide 15 of 36 Temporal reasoning Lecture 10

Examples of formal proofs

⊢ p → ♦p
1 ⊢ □¬p → ¬p        Expansion, PC
2 ⊢ ¬¬p → ¬□¬p      PC, 1
3 ⊢ p → ¬□¬p        PC, 2
4 ⊢ p → ♦p          3, Definition

⊢ ○p → ♦p
1 ⊢ □¬p → ○¬p       Expansion, PC
2 ⊢ ¬○¬p → ¬□¬p     PC, 1
3 ⊢ ○p → ¬□¬p       Linearity, 2
4 ⊢ ○p → ♦p         3, Definition

Lecture 10 Slide 16 of 36


Temporal reasoning Lecture 10

Examples of formal proofs

⊢ ○(p ∧ q) ≡ (○p ∧ ○q)
1 ⊢ ○(p ∧ q) → ○p                     PC, Generalization
2 ⊢ ○(p ∧ q) → ○q                     PC, Generalization
3 ⊢ ○(p ∧ q) → (○p ∧ ○q)              PC, 1, 2
4 ⊢ ○(p → ¬q) → (○p → ○¬q)            distribution of ○ over →
5 ⊢ ¬(○p → ○¬q) → ¬○(p → ¬q)          PC, 4
6 ⊢ ¬○p ∨ ○¬q ∨ ¬○(p → ¬q)            PC, 5
7 ⊢ ¬○p ∨ ¬○q ∨ ○¬(p → ¬q)            6, linearity
8 ⊢ ¬(○p ∧ ○q) ∨ ○(p ∧ q)             PC, 7
9 ⊢ (○p ∧ ○q) → ○(p ∧ q)              PC, 8
10 ⊢ ○(p ∧ q) ≡ (○p ∧ ○q)             PC, 3, 9

Lecture 10 Slide 17 of 36 Temporal reasoning Lecture 10

Semantic tableaux

α-, β- and X-rules We add the following rules to the tableau rules for classical propositional calculus (recall that α-rules have a conjunctive flavor and β-rules have a disjunctive flavor):

α      α1    α2         β      β1    β2         X      X1
□A     A     ○□A        ♦A     A     ○♦A        ○A     A
¬♦A    ¬A    ○¬♦A       ¬□A    ¬A    ○¬□A       ¬○A    ¬A

The status of X-rules X-rules are used to build the contents of the next node.

Lecture 10 Slide 18 of 36 Temporal reasoning Lecture 10

Semantic tableaux

Building semantic tableaux

  • For classical propositional connectives and temporal α- and

β-formulas the construction is traditional.

  • If a leaf l consists only of literals and formulas of the form:
    ○A1, . . . , ○Am, ¬○Am+1, . . . , ¬○An, then
    • if A1, . . . , Am, ¬Am+1, . . . , ¬An is the contents of a node l′ which is an ancestor of l then make l′ a successor of l
    • otherwise create a new successor node containing the formulas: A1, . . . , Am, ¬Am+1, . . . , ¬An (observe that we do not include literals which have not been in the scope of ○).

Lecture 10 Slide 19 of 36 Temporal reasoning Lecture 10

A problem

Consider ¬□p. The tableau for □p loops:

□p
↓
p, ○□p
↓
□p (cycle)

Remark Note that ¬□p is not a tautology.

Lecture 10 Slide 20 of 36


Temporal reasoning Lecture 10

Discussion

Remark

  • Using the tableau from slide 20 one can construct a model for □p by transforming the cycle into an infinite branch. If such a loop appears, one can stop the algorithm and answer that the input formula is not a tautology.
  • To make this principle work we need to really make the branch a model also when formulas of the form ♦p (or ¬□p) appear in the cycle. Namely, whenever such a formula appears in a node, there should be a “future” node containing p (respectively, ¬p).

Lecture 10 Slide 21 of 36 Temporal reasoning Lecture 10

Completed and closed tableaux

Definition

  • A semantic tableau is completed if:
    – all its leaves contain a formula and its negation, or
    – tableau rules do not create new nodes.
  • A path is called fulfilling if:
    • whenever a node n contains ♦A then there is a node n′ reachable from n along the path and containing A
    • whenever a node n contains ¬□A then there is a node n′ reachable from n along the path and containing ¬A.

Lecture 10 Slide 22 of 36 Temporal reasoning Lecture 10

Semantic tableaux: soundness and completeness

Definition A semantic tableau is closed if all its leaves contain a formula and its negation. Soundness and completeness A formula A is unsatisfiable iff there is a closed tableau for A with all cycles unfulfilling.

Lecture 10 Slide 23 of 36 Temporal reasoning Lecture 10

Semantic tableaux: example

○(p ∧ q) → (○p ∧ ○q)

○(p ∧ q) ∧ ¬(○p ∧ ○q)
↓
○(p ∧ q), ¬(○p ∧ ○q)
ւ                      ց
○(p ∧ q), ¬○p          ○(p ∧ q), ¬○q
↓                      ↓
p ∧ q, ¬p              p ∧ q, ¬q
↓                      ↓
p, q, ¬p               p, q, ¬q

Lecture 10 Slide 24 of 36


Temporal reasoning Lecture 10

Semantic tableaux: example

□(p ∧ q) → □p

□(p ∧ q) ∧ ¬□p
↓
□(p ∧ q), ¬□p
↓
p ∧ q, ○□(p ∧ q), ¬□p
↓
p, q, ○□(p ∧ q), ¬□p
ւ                          ց
p, q, ○□(p ∧ q), ¬p        p, q, ○□(p ∧ q), ○¬□p
                           ↓
                           □(p ∧ q), ¬□p (cycle)

Lecture 10 Slide 25 of 36 Temporal reasoning Lecture 10

CTL: Computation Tree Logic – Clarke, Emerson 1981

The idea

  • Branching time (tree-like structures).
  • Statements about all or some paths starting in a state.

Syntax Extends the syntax of linear time temporal logic (with ○, □, ♦, until) by allowing the use of (note syntactic restrictions):

  • ∀φ – all paths satisfy φ
  • ∃φ – there is a path satisfying φ:

φ ::= f | t | ¬φ | φ ∨ φ | φ ∧ φ | φ → φ | φ ≡ φ | ∀○φ | ∃○φ | ∀□φ | ∃□φ | ∀♦φ | ∃♦φ | ∀ φ until φ | ∃ φ until φ
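CTL formulas can be checked on an explicit state graph by fixpoint computation; a minimal sketch of the base existential operators EX (∃○), EG (∃□) and EU (∃ until), from which the others are definable. The 3-state Kripke structure and its valuations below are hypothetical:

```python
# Minimal CTL model checking by fixpoint computation on an explicit
# Kripke structure; succ[s] is the set of successors of state s,
# and propositions are given as sets of states.
def EX(states, succ, S):                 # states with some successor in S
    return {s for s in states if succ[s] & S}

def EG(states, succ, S):                 # greatest fixpoint: Z = S ∩ EX(Z)
    Z = set(S)
    while True:
        Z2 = Z & EX(states, succ, Z)
        if Z2 == Z:
            return Z
        Z = Z2

def EU(states, succ, S1, S2):            # least fixpoint: Z = S2 ∪ (S1 ∩ EX(Z))
    Z = set(S2)
    while True:
        Z2 = Z | (S1 & EX(states, succ, Z))
        if Z2 == Z:
            return Z
        Z = Z2

# Hypothetical 3-state system: 0 -> {0, 1}, 1 -> {2}, 2 -> {2}.
states = {0, 1, 2}
succ = {0: {0, 1}, 1: {2}, 2: {2}}
p = {0, 1}                               # states satisfying p
q = {2}                                  # states satisfying q
print(EU(states, succ, p, q))            # E(p until q): {0, 1, 2}
print(EG(states, succ, p))               # EG p: {0} (the self-loop keeps p forever)
```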

Lecture 10 Slide 26 of 36 Temporal reasoning Lecture 10

CTL: Computation Tree Logic

Examples Let lc mean “I like chocolate” and wo mean “it’s warm outside”:

  • ∀lc – “I will like chocolate from now on, no matter what

happens”

  • ∃♦lc – “it’s possible I’ll eventually like chocolate sometime”
  • ∀♦∃lc – “it’s always possible (∀♦) that I will start liking

chocolate forever”

  • ∀ lc until wo – “from now until it’s warm outside, I will like

chocolate”.

Lecture 10 Slide 27 of 36 Temporal reasoning Lecture 10

CTL*

Syntax No syntactic restrictions as to mixing temporal operators. For example, ∃♦(p until q) is not expressible in CTL, but is expressible in CTL*.

Complexity of the satisfiability problem

Logic    Complexity
LTL      PSpace-complete
CTL      ExpTime-complete
CTL*     2ExpTime-complete

Lecture 10 Slide 28 of 36


Temporal reasoning Lecture 10

Temporal reasoning in classical first-order logic

The idea Extend each time-dependent proposition or predicate with an extra argument representing time with a chosen granularity. For example,

born(12-04-1980, John)
lecture(10:15, logic, Monday)
run 100m(08:10:00, 9:61).

Next, consider the first-order language with a binary predicate < denoting the temporal ordering relation and a constant now denoting the current moment.

Lecture 10 Slide 29 of 36 Temporal reasoning Lecture 10

Temporal reasoning in classical first-order logic

Examples

  • Definitions of unary temporal operators:

P A def≡ ∃t[t < now ∧ A(t)]
F A def≡ ∃t[now < t ∧ A(t)]
H A def≡ ∀t[t < now → A(t)]
G A def≡ ∀t[now < t → A(t)].

  • P town(Mark, London) is equivalent to ∃t[t < now ∧ town(t, Mark, London)].

Lecture 10 Slide 30 of 36 Temporal reasoning Lecture 10

Reification

The idea Use statements of the form:

  • Holds(t, A) meaning “A is true at timepoint t”
  • Holds(t, t′, A) meaning “A is true in the interval between timepoints t and t′”.

Examples

  • Holds(12, safe)
  • Holds(13, 18, danger).

Lecture 10 Slide 31 of 36 Temporal reasoning Lecture 10

State and event-type reification – Allen 1984

Modeling states and events

  • Propositions reporting states (such as “the situation is safe”) have homogeneous temporal incidence, in that they must hold over any subinterval of an interval over which they hold (e.g., if the situation is safe from 1 o’clock to 6 o’clock then it is safe from 1 to 2, from 2 to 3, and so on).
  • Propositions reporting events (such as “John walks from place A to place B”) have inhomogeneous temporal incidence: such a proposition is not necessarily true in any proper subinterval of an interval over which it is true (e.g., if John walks from A to B over the interval from 1 o’clock to 1:15, then it is not the case that he walks from A to B from 1 to 1:05 – rather, over that interval he walks from A to some C ≠ B).

Lecture 10 Slide 32 of 36


Temporal reasoning Lecture 10

State and event-type reification

Modeling states and events

  • Use Holds to model states.
  • Use Occurs to model events.

In this form, the inference does not require any additional logical apparatus over and above standard first-order predicate logic.

Example In the following formulas, pairs (t, t′) denote time intervals in the obvious way:

Holds(Asleep(Mary), (1pm, 6pm))
Occurs(Walk to(John, Home, Park), (1pm, 1.15pm))

Lecture 10 Slide 33 of 36 Temporal reasoning Lecture 10

State and event-type reification

Some axioms The homogeneity of states and inhomogeneity of events is assured by axioms such as: ∀s∀i∀i′[Holds(s, i) ∧ In(i′, i) → Holds(s, i′)] ∀e∀i∀i′[(Occurs(e, i) ∧ In(i′, i)) → ¬Occurs(e, i′)], where In expresses the proper subinterval relation.

Lecture 10 Slide 34 of 36 Temporal reasoning Lecture 10

Event-token reification – Davidson 1967

The problem Give a formal justification of the validity of such inferences as: “John saw Mary in London on Tuesday. Therefore, John saw Mary on Tuesday.” The idea The key idea is that each event-forming predicate is endowed with an extra argument ranging over event-tokens, that is, particular dated occurrences.

Lecture 10 Slide 35 of 36 Temporal reasoning Lecture 10

Event-token reification

Example The inference from the previous slide is expressed as:

∃e[See(e, John, Mary) ∧ Place(e, London) ∧ Time(e, Tuesday)]

therefore,

∃e[See(e, John, Mary) ∧ Time(e, Tuesday)].

Lecture 10 Slide 36 of 36