Getting started with Isabelle/Isar. Makarius Wenzel, TU München.



slide-1
SLIDE 1

Getting started with Isabelle/Isar

Makarius Wenzel, TU München, August 2007

  • 1. Foundations: logical framework
  • 2. Forward reasoning: proof context
  • 3. Backward reasoning: proof state
slide-2
SLIDE 2

PRELUDE: Notions of proof

slide-3
SLIDE 3

Isabelle tactic scripts

lemma A
  apply (rule_tac x = x in allE)
  apply simp_all
  apply blast?
  apply (subgoal_tac B)
  apply auto?
oops

Problems:

  • machine instructions
  • dependent on hidden goal state
  • goal accumulates local parameters, local premises, conclusions
  • not modular, not scalable
  • hard to maintain, hard to re-use (derivative work!)

Notions of proof 2

slide-4
SLIDE 4

Mathematical vernacular

[Davey and Priestley, 1990, pages 93–94] The Knaster-Tarski Fixpoint Theorem. Let L be a complete lattice and f : L → L an order-preserving map. Then ⋀{x ∈ L | f(x) ≤ x} is a fixpoint of f.

Proof. Let H = {x ∈ L | f(x) ≤ x} and a = ⋀H. For all x ∈ H we have a ≤ x, so f(a) ≤ f(x) ≤ x. Thus f(a) is a lower bound of H, whence f(a) ≤ a. We now use this inequality to prove the reverse one (!) and thereby complete the proof that a is a fixpoint. Since f is order-preserving, f(f(a)) ≤ f(a). This says f(a) ∈ H, so a ≤ f(a).

Notions of proof 3

slide-5
SLIDE 5

Isabelle/Isar proof text

theorem Knaster-Tarski:
  assumes mono: ⋀x y. x ≤ y =⇒ f x ≤ f y
  shows f (⋂{x. f x ≤ x}) = ⋂{x. f x ≤ x}  (is f ?a = ?a)
proof -
  have *: f ?a ≤ ?a  (is - ≤ ⋂?H)
  proof
    fix x assume x ∈ ?H
    then have ?a ≤ x ..
    also from x ∈ ?H have f … ≤ x ..
    moreover note mono
    finally show f ?a ≤ x .
  qed
  also have ?a ≤ f ?a
  proof
    from mono and * have f (f ?a) ≤ f ?a .
    then show f ?a ∈ ?H ..
  qed
  finally show f ?a = ?a .
qed

Notions of proof 4

slide-6
SLIDE 6

Isabelle/Pure proof term

Knaster-Tarski ≡
  λH: -.
    order-antisym · - · - ·
      (Inter-greatest · - · - ·
        (λX Ha: -.
          order-subst2 · - · - · ?f · - · (Inter-lower · - · - · Ha) ·
            (iffD1 · - · - · (mem-Collect-eq · - · (λx. ?f x ≤ x)) · Ha) · H)) ·
      (Inter-lower · - · - ·
        (iffD2 · - · - · (mem-Collect-eq · - · (λa. ?f a ≤ a)) ·
          (H · ?f (⋂{x. ?f x ≤ x}) · ⋂{x. ?f x ≤ x} ·
            (Inter-greatest · - · - ·
              (λX Ha: -.
                order-subst2 · - · - · ?f · - · (Inter-lower · - · - · Ha) ·
                  (iffD1 · - · - · (mem-Collect-eq · - · (λx. ?f x ≤ x)) · Ha) · H)))))

Notions of proof 5

slide-7
SLIDE 7

PART I: Foundations

slide-8
SLIDE 8

The Pure framework

slide-9
SLIDE 9

Pure syntax and primitive rules

⇒                           function type constructor
⋀  :: (α ⇒ prop) ⇒ prop     universal quantifier
=⇒ :: prop ⇒ prop ⇒ prop    implication

  [x :: α]
     ⋮
  b(x) :: β
───────────────── (⇒I)
λx. b(x) :: α ⇒ β

b :: α ⇒ β    a :: α
──────────────────── (⇒E)
     b(a) :: β

  [x]
   ⋮
  B(x)
───────── (⋀I)
⋀x. B(x)

⋀x. B(x)
───────── (⋀E)
  B(a)

  [A]
   ⋮
   B
─────── (=⇒I)
A =⇒ B

A =⇒ B    A
─────────── (=⇒E)
     B

Foundations: The Pure framework 8

slide-10
SLIDE 10

Equality

≡ :: α ⇒ α ⇒ prop    equality
Axioms for t ≡ u: α, β, η, refl, subst, ext, iff
Unification: solving equations modulo αβη

  • Huet: full higher-order unification (infinitary enumeration!)
  • Miller: higher-order patterns (unique result)
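A sketch (not on the original slide) contrasting the two situations:

```isabelle
(* Miller patterns: the schematic ?f is applied to distinct bound
   variables only, so the solution is unique:
     λx y. ?f x y ≡ λx y. x + y   yields   ?f = λx y. x + y *)

(* Full higher-order unification: ?f is applied to a compound term,
   so several unifiers exist:
     ?f (a + b) ≡ a + b   admits   ?f = λz. z   and   ?f = λz. a + b *)
```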

Foundations: The Pure framework 9

slide-11
SLIDE 11

Hereditary Harrop Formulas

Define the following sets:

x   variables
A   atomic formulae (without =⇒/⋀)
⋀x∗. A∗ =⇒ A            Horn Clauses
H ≝ ⋀x∗. H∗ =⇒ A        Hereditary Harrop Formulas (HHF)

Conventions for results:

  • outermost quantification ⋀x. B x is rephrased via schematic variables B ?x
  • the equivalence (A =⇒ (⋀x. B x)) ≡ (⋀x. A =⇒ B x) produces canonical HHF

Foundations: The Pure framework 10

slide-12
SLIDE 12

Rules everywhere

slide-13
SLIDE 13

Natural Deduction rules

Examples:

A    B
──────
A ∧ B

as a rule:  A =⇒ B =⇒ A ∧ B

[A]
 ⋮
 B
──────
A ⟶ B

as a rule:  (A =⇒ B) =⇒ A ⟶ B

        [n] [P n]
           ⋮
P 0     P (Suc n)
─────────────────
      P n

as a rule:  P 0 =⇒ (⋀n. P n =⇒ P (Suc n)) =⇒ P n

Foundations: Rules everywhere 12

slide-14
SLIDE 14

Representing goals

Protective marker:  # :: prop ⇒ prop,  # ≡ λA :: prop. A
Initialization:  C =⇒ #C  (init)
General situation: subgoals imply main goal:  B1 =⇒ … =⇒ Bn =⇒ #C
Finalization:  from #C derive C  (finish)
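To illustrate (a sketch not on the slide), the life cycle of the trivial goal A ⟶ A under these rules:

```
(A ⟶ A) =⇒ #(A ⟶ A)     (init)
(A =⇒ A) =⇒ #(A ⟶ A)    (resolution with rule impI)
#(A ⟶ A)                 (assumption solves the subgoal A =⇒ A)
A ⟶ A                    (finish)
```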

Foundations: Rules everywhere 13

slide-15
SLIDE 15

Rule composition

A =⇒ B    B′ =⇒ C    Bθ = B′θ
────────────────────────────── (compose)
          Aθ =⇒ Cθ

        A =⇒ B
───────────────────── (=⇒-lift)
(H =⇒ A) =⇒ (H =⇒ B)

          A a =⇒ B a
─────────────────────────────── (⋀-lift)
(⋀x. A (a x)) =⇒ (⋀x. B (a x))

Foundations: Rules everywhere 14

slide-16
SLIDE 16

Higher-order resolution (back-chaining)

rule:          A a =⇒ B a
goal:          (⋀x. H x =⇒ B′ x) =⇒ C
goal unifier:  (λx. B (a x))θ = B′θ
───────────────────────────────────── (resolution)
(⋀x. H x =⇒ A (a x))θ =⇒ Cθ

goal:          (⋀x. H x =⇒ A x) =⇒ C
assm unifier:  Aθ = Hᵢθ   (for some Hᵢ)
───────────────────────────────────── (assumption)
Cθ

Both inferences are omnipresent in Isabelle/Isar:

  • resolution: e.g. OF attribute, rule method, also command
  • assumption: e.g. assumption method, implicit proof ending
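As a small illustration (not part of the original slides), both inferences at work in a single Isar proof: the rule method resolves the goal against conjI, and each terminal step is closed by resolution with a local fact:

```isabelle
lemma
  assumes a: A and b: B
  shows A ∧ B
proof (rule conjI)    (* resolution: back-chaining with conjI *)
  show A by (rule a)  (* resolution with the local fact a *)
  show B by (rule b)
qed
```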

Foundations: Rules everywhere 15

slide-17
SLIDE 17

The Isar proof language

slide-18
SLIDE 18

Isar primitives

fix x1 … xn :: τ               universal parameters
assm ≪inference≫ a: A1 … An    generic assumptions
then                           indicate forward chaining of facts
have b: B1 … Bn                local claim
show b: B1 … Bn                local claim, result refines goal
using b1 … bn                  indicate use of facts
unfolding b1 … bn              unfold definitional equations
proof method?                  structured refinement
qed method?                    structured ending
{                              open block
}                              close block
next                           switch block
let pat = t                    term abbreviation (matching)
note c = b1 … bn               reconsidered facts
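A minimal sketch (not from the slides) combining several of these primitives: fix, assume, from, show, and the implicit rule step "..":

```isabelle
lemma ⋀x. A x =⇒ B x =⇒ B x ∧ A x
proof -
  fix x                            (* universal parameter *)
  assume a: A x and b: B x         (* generic assumptions *)
  from b and a show B x ∧ A x ..   (* use facts; ".." refines by rule conjI *)
qed
```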

Foundations: The Isar proof language 17

slide-19
SLIDE 19

Derived elements (1)

by method1 method2  =  proof method1 qed method2
..                  =  by rule
.                   =  by this
from b              =  note b then
with b              =  from b and this
assume              =  assm ≪discharge#≫
def x ≡ t           =  fix x  assm ≪expand≫ x ≡ t

Γ ∪ A ⊢ C
───────────── (discharge#)
Γ ⊢ #A =⇒ C

Γ ∪ x ≡ t ⊢ C t
─────────────── (expand)
Γ ⊢ C x

Foundations: The Isar proof language 18

slide-20
SLIDE 20

Derived elements (2): calculations

also₀       =  note calculation = this
alsoₙ₊₁     =  note calculation = trans [OF calculation this]
finally     =  also from calculation
moreover    =  note calculation = calculation this
ultimately  =  moreover from calculation

Example:

have a = b sorry
also have … = c sorry
also have … = d sorry
finally have a = d .

Note: the term “…” abbreviates the argument of the last statement

Foundations: The Isar proof language 19

slide-21
SLIDE 21

Derived elements (3): forward elimination

obtain x where B x  ⟨proof⟩  =
  have reduction: ⋀thesis. (⋀x. B x =⇒ thesis) =⇒ thesis  ⟨proof⟩
  fix x  assm ≪eliminate reduction≫ B x

Γ ⊢ ⋀thesis. (⋀x. B x =⇒ thesis) =⇒ thesis    Γ ∪ B y ⊢ C
─────────────────────────────────────────────────────────── (eliminate)
Γ ⊢ C

Examples:

assume ∃x. B x
then obtain x where B x ..

assume A ∧ B
then obtain A and B ..
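A slightly larger sketch (with hypothetical predicates P and Q, not from the slides) showing obtain inside a structured proof:

```isabelle
lemma
  assumes ex: ∃x. P x ∧ Q x
  shows ∃x. Q x
proof -
  from ex obtain x where P x and q: Q x by blast
  from q show ?thesis ..   (* implicit rule step: exI *)
qed
```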

Foundations: The Isar proof language 20

slide-22
SLIDE 22

Isar proof context elements

{ fix x have B x sorry }           note ⋀x. B x
{ assume A have B sorry }          note A =⇒ B
{ def x ≡ a have B x sorry }       note B a
{ obtain x where B x sorry
  have C sorry }                   note C

Foundations: The Isar proof language 21

slide-23
SLIDE 23

Isar statements

slide-24
SLIDE 24

Statement context and conclusion

statement        ≡  context-element∗ conclusion
context-element  ≡  fixes var and …
                 |  defines var ≡ term and …
                 |  assumes name: prop and …
conclusion       ≡  shows prop and …

Example: r = ⊢ ⋀x y. A x =⇒ B y =⇒ C x y

theorem r:
  fixes x and y
  assumes A x and B y
  shows C x y
proof -
  from A x and B y show C x y sorry
qed

Foundations: Isar statements 23

slide-25
SLIDE 25

Forward conclusions

Derived conclusion:

obtains x where B x …  =
  fixes thesis
  assumes ⋀x. B x =⇒ thesis and …
  shows thesis

Example: r = ⊢ P =⇒ (⋀x y. A x =⇒ B y =⇒ thesis) =⇒ thesis

theorem r:
  assumes P
  obtains x and y where A x and B y
proof -
  from P have A u and B v sorry
  then show thesis ..
qed

Foundations: Isar statements 24

slide-26
SLIDE 26

Example: Natural Deduction rules

conjI:      assumes A and B shows A ∧ B
conjE:      assumes A ∧ B obtains A and B
disjI1:     assumes A shows A ∨ B
disjI2:     assumes B shows A ∨ B
disjE:      assumes A ∨ B obtains A | B
impI:       assumes A =⇒ B shows A ⟶ B
impE:       assumes A ⟶ B and A obtains B
allI:       assumes ⋀x. B x shows ∀x. B x
allE:       assumes ∀x. B x obtains B a
exI:        assumes B a shows ∃x. B x
exE:        assumes ∃x. B x obtains x where B x
classical:  obtains ¬ thesis
Peirce:     obtains thesis =⇒ A

Foundations: Isar statements 25

slide-27
SLIDE 27

PART II: Forward Reasoning

slide-28
SLIDE 28

No Goals!

slide-29
SLIDE 29

Atomic proofs

Omitted proofs:      sorry
Automated proofs:    by simp    by blast    by auto
Single-step proofs:  by rule ≡ ..    by this ≡ .    by assumption

Forward Reasoning: No Goals! 28

slide-30
SLIDE 30

Analyzing atomic proofs

General atomic proof:     by (initial-method) (terminal-method)
Structured expansion:     proof (initial-method) qed (terminal-method)
Tactical transformation:  apply (initial-method) apply (terminal-method) apply (assumption+)? done
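For instance (an illustrative sketch, not on the slide), one atomic induction proof in all three forms:

```isabelle
lemma n + 0 = (n::nat)
  by (induct n) simp_all          (* general atomic proof *)

lemma n + 0 = (n::nat)
proof (induct n) qed simp_all     (* structured expansion *)

lemma n + 0 = (n::nat)
  apply (induct n)
  apply simp_all
  done                            (* tactical transformation *)
```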

Forward Reasoning: No Goals! 29

slide-31
SLIDE 31

Vacuous proofs

Idea: open a logical playground inside the proof of a trivial claim.

lemma True
proof
  { fix x
    assume A x
    have B x sorry }
qed

→ Isar proof body as mathematical notepad

Forward Reasoning: No Goals! 30

slide-32
SLIDE 32

Facts

slide-33
SLIDE 33

Producing facts

By assumption (“lambda”):

assume a: A

By proof (“let”):

have b: B sorry

By abbreviation (“let”):

note c = facts

Forward Reasoning: Facts 32

slide-34
SLIDE 34

Referencing facts

By explicit name:

assume a: A note a

By implicit name:

assume A note this

By proposition (modulo αβη-unification):

assume A note A assume Vx. B x note B a

Forward Reasoning: Facts 33

slide-35
SLIDE 35

Composing facts

Instantiation:

assume ⋀x. P x
note this                 — P ?x
note this [of a]          — P a
note this [where x = a]   — P a

Composition / backchaining:

assume 1: A
assume 2: A =⇒ B
note 2 [OF 1]   — B
assume 3: B =⇒ C
note 3 [OF 2]   — A =⇒ C

Forward Reasoning: Facts 34

slide-36
SLIDE 36

Immediate operations

Symmetric results:

assume x = y
note this [symmetric]   — y = x

Ad-hoc simplification (take care!):

assume P ([] @ xs)
note this [simplified]   — P xs

Forward Reasoning: Facts 35

slide-37
SLIDE 37

Primitive transitive reasoning

assume 1: a = b
assume 2: b = c
assume 3: c = d

note 1                   — a = b
note trans [OF this 2]   — a = c
note trans [OF this 3]   — a = d

→ Isar calculations organize this nicely

Forward Reasoning: Facts 36

slide-38
SLIDE 38

Structured calculations

slide-39
SLIDE 39

Isar calculations

Example fragment:

assume a = b
also          — calculation: a = b
assume b = c
also          — calculation: a = c
assume c = d
also          — calculation: a = d

Finished calculation:

assume a = b
also
assume b = c
also
assume c = d
finally have a = d .

Forward Reasoning: Structured calculations 38

slide-40
SLIDE 40

Terms abbreviations

  • (is pattern) binds schematic variables by higher-order matching
  • ?thesis abbreviates the original goal
  • “…” abbreviates the argument of the last statement

Typical calculational proof:

lemma a = d (is ?lhs = ?rhs)
proof -
  have ?lhs = b sorry
  also have … = c sorry
  also have … = ?rhs sorry
  finally show ?thesis .
qed

Forward Reasoning: Structured calculations 39

slide-41
SLIDE 41

Common calculational patterns

Mixed transitivity:

assume a = b
also
assume b < c
also
assume c ≤ d
finally have a < d .

Substitution (take care!):

assume P a b c
also
assume a = a′
also
assume b = b′
also
assume c = c′
finally have P a′ b′ c′ .

Forward Reasoning: Structured calculations 40

slide-42
SLIDE 42

Degenerate calculations

Arranging facts (without rule compositions):

assume A
moreover assume B
moreover assume C
moreover assume D
ultimately have something sorry

Forward rules:

assume A
moreover assume B
ultimately have A ∧ B ..

Forward Reasoning: Structured calculations 41

slide-43
SLIDE 43

Structured automated reasoning

Idea: not-quite degenerate calculations involving local blocks, with ultimate big-bang integration.

fix x y :: nat
{ assume x < y have something sorry }
moreover
{ assume x = y have something sorry }
moreover
{ assume y < x have something sorry }
ultimately have something by arith

Forward Reasoning: Structured calculations 42

slide-44
SLIDE 44

PART III: Backward reasoning

slide-45
SLIDE 45

Canonical proof decomposition

Isar derives assumption + resolution steps from the text:

have ⋀x. A x =⇒ B x
proof -
  fix x
  assume A x   — (assumption)
  show B x     — (resolution)
    sorry
qed

have ⋀x. H1 =⇒ A (a x) =⇒ H2 =⇒ B (a x)
proof -
  fix y
  assume A y   — (assumption)
  show B y     — (resolution)
    sorry
qed

Backward reasoning: Structured calculations 44

slide-46
SLIDE 46

Forward reasoning

assume r: A1 =⇒ A2 =⇒ A3 =⇒ A4 =⇒ B

have 1: A1 sorry
have 2: A2 sorry
have 3: A3 sorry
have 4: A4 sorry
from 1 2 3 4 have B by (rule r)

Backward reasoning: Structured calculations 45

slide-47
SLIDE 47

Backward reasoning

assume r: A1 =⇒ A2 =⇒ A3 =⇒ A4 =⇒ B

have B
proof (rule r)
  show A1 sorry
  show A2 sorry
  show A3 sorry
  show A4 sorry
qed

Backward reasoning: Structured calculations 46

slide-48
SLIDE 48

Mixed forward/backward reasoning (1)

assume r: A1 =⇒ A2 =⇒ A3 =⇒ A4 =⇒ B

have 1: A1 sorry
have 2: A2 sorry
from 1 and 2 have B
proof (rule r)
  show A3 sorry
  show A4 sorry
qed

Backward reasoning: Structured calculations 47

slide-49
SLIDE 49

Mixed forward/backward reasoning (2)

assume r: A1 =⇒ A2 =⇒ (⋀x. H x =⇒ A3 x) =⇒ A4 =⇒ B

have 1: A1 sorry
have 2: A2 sorry
from 1 and 2 have B
proof (rule r)
  fix x
  assume H x
  show A3 x sorry
next
  show A4 sorry
qed

Backward reasoning: Structured calculations 48

slide-50
SLIDE 50

Flexible proof composition

Valid transformations:

  • rename / permute parameters (fix)
  • permute assumptions (assume)
  • permute goals (show)
  • share common parts of contexts (next not mandatory)
  • peephole optimizations (then vs. from vs. with etc.)
  • establish generalized claim (fix / assume / show)

Caveats:

  • logical scoping of contexts must be observed: an assume cannot be moved into a corresponding show body
  • single-step methods demand strict order of facts
  • complex rule premises are better solved in backward manner

Backward reasoning: Structured calculations 49

slide-51
SLIDE 51

Induction outlines

slide-52
SLIDE 52

Natural induction

Canonical proof outline (stemming from induction rule):

theorem
  fixes n :: nat
  shows P n
proof (induct n)
  case 0
  show ?case sorry    — P 0
next
  case (Suc n)        — P n
  show ?case sorry    — P (Suc n)
qed

Backward reasoning: Induction outlines 51

slide-53
SLIDE 53

Example: induction and calculation (1)

theorem
  fixes n :: nat
  shows 2 ∗ (∑i=0..n. i) = n ∗ (n + 1)
  sorry

Backward reasoning: Induction outlines 52

slide-54
SLIDE 54

Example: induction and calculation (2)

theorem
  fixes n :: nat
  shows 2 ∗ (∑i=0..n. i) = n ∗ (n + 1)
proof (induct n)
  case 0
  show ?case sorry
next
  case (Suc n)
  show ?case sorry
qed

Backward reasoning: Induction outlines 53

slide-55
SLIDE 55

Example: induction and calculation (3)

theorem
  fixes n :: nat
  shows 2 ∗ (∑i=0..n. i) = n ∗ (n + 1)
proof (induct n)
  case 0
  have 2 ∗ (∑i=0..0. i) = (0::nat) by simp
  also have (0::nat) = 0 ∗ (0 + 1) by simp
  finally show 2 ∗ ∑{0..0} = (0::nat) ∗ (0 + 1) .
next
  case (Suc n)
  show ?case sorry
qed

Backward reasoning: Induction outlines 54

slide-56
SLIDE 56

Example: induction and calculation (4)

theorem
  fixes n :: nat
  shows 2 ∗ (∑i=0..n. i) = n ∗ (n + 1)
proof (induct n)
  case 0
  have 2 ∗ (∑i=0..0. i) = (0::nat) by simp
  also have (0::nat) = 0 ∗ (0 + 1) by simp
  finally show 2 ∗ ∑{0..0} = (0::nat) ∗ (0 + 1) .
next
  case (Suc n)
  have 2 ∗ (∑i=0..Suc n. i) = 2 ∗ (∑i=0..n. i) + 2 ∗ (n + 1) by simp
  also have 2 ∗ (∑i=0..n. i) = n ∗ (n + 1) by (rule Suc.hyps)
  also have n ∗ (n + 1) + 2 ∗ (n + 1) = Suc n ∗ (Suc n + 1) by simp
  finally show 2 ∗ ∑{0..Suc n} = Suc n ∗ (Suc n + 1) .
qed

Backward reasoning: Induction outlines 55

slide-57
SLIDE 57

Natural induction: compact version

theorem
  fixes n :: nat
  shows P n
proof (induct n)
  case 0
  show ?case sorry
next
  case (Suc n)
  show ?case sorry
qed

Backward reasoning: Induction outlines 56

slide-58
SLIDE 58

Natural induction: explicit version

theorem
  fixes n :: nat
  shows P n
proof (induct n)
  show P 0 sorry
next
  fix n
  assume P n
  show P (Suc n) sorry
qed

→ More elementary, but less scalable

Backward reasoning: Induction outlines 57

slide-59
SLIDE 59

Compound induction statements

theorem
  fixes n :: nat
  shows ⋀x::′a. A n x =⇒ P n x
proof (induct n)
  case 0
  note 0.hyps       —
  note 0.prems      — A 0 x
  show ?case sorry  — P 0 x
next
  case (Suc n)
  note Suc.hyps     — A n ?x =⇒ P n ?x
  note Suc.prems    — A (Suc n) x
  show ?case sorry  — P (Suc n) x
qed

Backward reasoning: Induction outlines 58

slide-60
SLIDE 60

In-situ goal reformulation

theorem
  fixes n :: nat
  shows ⋀x::′a. A n x =⇒ P n x
proof (induct n)
  oops

theorem
  fixes n :: nat
  fixes x :: ′a
  assumes a: A n x
  shows P n x
  using a                        — method argument
proof (induct n arbitrary: x)    — method argument
  oops

Backward reasoning: Induction outlines 59

slide-61
SLIDE 61

Inductive sets: definition

Example: balanced words over alphabet {A, B}

datatype alpha = A | B

consts S :: alpha list set
inductive S
  intros
    S1: [] ∈ S
    S2: w ∈ S =⇒ [A] @ w @ [B] ∈ S
    S3: v ∈ S =⇒ w ∈ S =⇒ v @ w ∈ S

lemma example: [A, B] ∈ S
proof -
  have [] ∈ S by (rule S1)
  then have [A] @ [] @ [B] ∈ S by (rule S2)
  then show ?thesis by simp
qed

Backward reasoning: Induction outlines 60

slide-62
SLIDE 62

Inductive sets: proof

lemma
  assumes w ∈ S
  shows P w
  using w ∈ S
proof induct
  case S1
  show ?case sorry    — P []
next
  case (S2 w)         — w ∈ S,  P w
  show ?case sorry    — P ([A] @ w @ [B])
next
  case (S3 v w)       — v ∈ S,  P v,  w ∈ S,  P w
  show ?case sorry    — P (v @ w)
qed

Backward reasoning: Induction outlines 61

slide-63
SLIDE 63

Try it yourself!

http://isabelle.in.tum.de/Isar/Bertinoro/