Let’s Unify Theories of Programming

He Jifeng
Software Engineering Institute, East China Normal University, Shanghai, China

Syllabus

  • A simple theory
  • Relational programming
  • Link with wp-calculus
  • Design calculus
  • Implementation
  • Design capturing
  • Linking theories and Galois connection
  • Algebra of programs
  • An operational model
  • Probabilistic programming
  • Link semantic models

Theories of Programming

A theory of programming relates each program to the specification of what it is intended to achieve. A unifying theory is applicable to a general paradigm of computing, supporting the classification of many programming languages as correct instances of the paradigm.

A Diversity of Presentation

  • 1. A denotational method maps each notation to a value in a mathematical domain. It is a link between specifications and programs.
  • 2. An algebraic method states when two differently written programs mean the same thing. It is widely used for program optimisation and verification.
  • 3. An operational presentation describes how a program can be executed by an abstract machine. It supports compiler design, complexity analysis and program testing.

Unit 1: A Simple Theory

Ingredients

(1) Alphabet: concepts related to the real world.
(2) Signature: primitives and combinators.
(3) Laws: a set of equations that are provable.

Alphabet

Free variables are used to stand for possible measurements taken from the experiment.

  • 1. initial state {x, y, . . . , z}.
  • 2. final state {x′, y′, . . . , z′}.
  • 3. status of programs {ok, st}.
  • 5. interaction {tr}.
  • 6. synchronisation {ref}.
  • 7. control {l, start, finish}

Observation and Specification

An observation consists of a set of equations ascribing particular constant values to each variable:

x = 4 ∧ . . . ∧ z = false

A specification gives a set of allowable observations. For example,

x′ = x + 1 ∧ y′ = y ∧ z′ = z

describes the effect of executing the assignment x := x + 1, where x, y and z are the global program variables.

Conjunction

In general, a specification is written as a conjunction of a set of predicates

S1 ∧ S2 ∧ . . .

For example, the operation swap(x, y) can be specified by the predicate

(x′ = y) ∧ (y′ = x)

A sorting program can be described by

bag(A′) = bag(A) ∧ ¬∃i, j ∈ domain(A) • (i < j ∧ A′[i] > A′[j])
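
A specification in this sense is just a predicate over the initial and final values of the variables. As a rough Python sketch (not from the lecture; the state representation and names are mine), the swap specification can be written and tested against observations directly:

    # A specification is a predicate over (initial state, final state).
    # States are dicts mapping variable names to values.

    def swap_spec(s, s1):
        # (x' = y) and (y' = x)
        return s1["x"] == s["y"] and s1["y"] == s["x"]

    # An observation ascribes constant values to every variable.
    before = {"x": 4, "y": 7}
    after  = {"x": 7, "y": 4}

    assert swap_spec(before, after)        # an allowed observation
    assert not swap_spec(before, before)   # disallowed: nothing was swapped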

Laws

Conjunction is idempotent, symmetric and associative.

(∧-1) P ∧ P = P
(∧-2) P ∧ Q = Q ∧ P
(∧-3) P ∧ (Q ∧ R) = (P ∧ Q) ∧ R

∧ has true as its unit and false as its zero.

(∧-4) P ∧ true = P
(∧-5) P ∧ false = false

Water Tank

t: the time since the start of the apparatus
vt: the volume of liquid in the tank at time t
int, outt: the total amount of liquid poured in and drained out up to time t
at, bt: the settings of the input valve and the output valve at time t

The water tank is governed by the following laws:

(1) int + v0 = outt + vt
(2) 0 ≤ at ≤ amax, 0 ≤ bt ≤ bmax
(3) d(in)/dt = k × a, d(out)/dt = k × b + δ × v

Correctness

Let P and S have the same alphabet {x, . . . , z}. P satisfies S if

∀x, . . . , z • (P(x, . . . , z) ⇒ S(x, . . . , z))

We write this as [P ⇒ S] (or P ⊒ S). For example

(x′ = x + 1 ∧ y′ = y + 2) ⊒ (x′ ≥ x ∧ y′ ≥ y)

Theorem ⊒ is a pre-order.
(1) P ⊒ P
(2) If P ⊒ Q and Q ⊒ R, then P ⊒ R
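
Over a finite state space, [P ⇒ S] can be checked by brute-force enumeration. A minimal sketch (my own illustration; the finite value range is an assumption made only so the check terminates):

    from itertools import product

    VALUES = range(-2, 3)   # a small finite range, for illustration only

    def P(x, y, x1, y1):    # x' = x + 1  and  y' = y + 2
        return x1 == x + 1 and y1 == y + 2

    def S(x, y, x1, y1):    # x' >= x  and  y' >= y
        return x1 >= x and y1 >= y

    def refines(p, s):
        """P ⊒ S, i.e. [P ⇒ S], checked over all observations."""
        return all(s(*obs) for obs in product(VALUES, repeat=4) if p(*obs))

    assert refines(P, S)
    assert not refines(S, P)   # the reverse implication fails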

Refinement supports correct design

  • Stepwise Refinement

((D ⊒ S) ∧ (P ⊒ D)) implies (P ⊒ S)

  • Piecewise Refinement

((P ⊒ S) ∧ (Q ⊒ S)) implies ((P ∨ Q) ⊒ S)

  • Divide-and-Conquer

(((S1 ∧ S2) ⊒ S) ∧ (P1 ⊒ S1) ∧ (P2 ⊒ S2)) implies (P1 ∧ P2 ⊒ S)

  • Component Replacement

(P ⊒ S) implies (F(P) ⊒ F(S))

Reusable Components

Let R(x, y) be a specification, and Q(y) an existing component. We wish to deliver a product of the form P(x) ∧ Q(y), and use R/Q to denote the weakest such component P(x).

Theorem (P ∧ Q) ⊒ R iff P ⊒ R/Q

(/-1) (R1 ∧ R2)/Q = (R1/Q) ∧ (R2/Q)
(/-2) R/(Q1 ∧ Q2) = (R/Q1)/Q2
(/-3) true/Q = true
(/-4) R/false = true

Linking Models

Let S(a) be a specification, and P(c) a product. P is an implementation of S with respect to an interpretation A(c, a) if

(∃c • P(c) ∧ A(c, a)) ⊒ S(a)

We denote this fact by P ⊒A S.

Theorem If P ⊒A1 Q and Q ⊒A2 S, then P ⊒A1◦A2 S, where

A1 ◦ A2 =df ∃b • A1(c, b) ∧ A2(b, a)

Unit 2: A Theory of Relational Programming

Relations

A relation is a triple (IN, OUT, P) where

  • IN and OUT are disjoint sets of names, and
  • P is a predicate whose free variables are drawn from the set IN ∪ OUT

We define inαP =df IN and outαP =df OUT.

Signature of Relational Programming

Primitives

  • 1. assignment: x := e and empty: skip
  • 2. top: ⊤ and bottom: ⊥

Combinators

  • 1. conditional: P ⊳ b ⊲ Q
  • 2. composition: P; Q
  • 3. nondeterminism: P ⊓ Q
  • 4. recursion: µX • F(X)

Testable Conditions

Any non-trivial program requires a facility to select between alternative actions in accordance with the truth or falsity of some testable condition b.

The restriction that b contains no dashed variables ensures that it can be tested before starting either of the actions.

Conditional

If P and Q are predicates describing two programs with the same alphabet, then the conditional P ⊳ b ⊲ Q describes a program which behaves like P if b is true initially, or like Q otherwise.

α(P ⊳ b ⊲ Q) =df αP
P ⊳ b ⊲ Q =df (b ∧ P) ∨ (¬b ∧ Q)

Laws of Conditional

Conditional choice is idempotent, skew-symmetric and associative.

(cond-1) P ⊳ b ⊲ P = P
(cond-2) P ⊳ b ⊲ Q = Q ⊳ ¬b ⊲ P
(cond-3) (P ⊳ b ⊲ Q) ⊳ c ⊲ R = P ⊳ (b ∧ c) ⊲ (Q ⊳ c ⊲ R)

Conditional choice distributes over conditional, and has ⊳ true ⊲ as its unit.

(cond-4) P ⊳ b ⊲ (Q ⊳ c ⊲ R) = (P ⊳ b ⊲ Q) ⊳ c ⊲ (P ⊳ b ⊲ R)
(cond-5) P ⊳ true ⊲ Q = P

Additional Law

(cond-6) P ⊳ b ⊲ (Q ⊳ b ⊲ R) = P ⊳ b ⊲ R

Proof
LHS
  {(cond-2)}
= (Q ⊳ b ⊲ R) ⊳ ¬b ⊲ P
  {(cond-3)}
= Q ⊳ false ⊲ (R ⊳ ¬b ⊲ P)
  {(cond-2, 5)}
= RHS

Additional Law

(cond-7) P ⊳ b ⊲ (P ⊳ c ⊲ Q) = P ⊳ (b ∨ c) ⊲ Q

Proof
LHS
  {(cond-2)}
= (Q ⊳ ¬c ⊲ P) ⊳ ¬b ⊲ P
  {(cond-3, 1)}
= Q ⊳ (¬c ∧ ¬b) ⊲ P
  {(cond-2)}
= RHS

Composition

The most characteristic combinator of a sequential language is sequential composition, often denoted by a semicolon. If P and Q are predicates describing two programs, their composition P; Q describes a program which may be executed by first executing P; when P terminates, Q is started. The final state of P is passed on as the initial state of Q, but this is only an intermediate state of (P; Q), and cannot be directly observed.

Composition

Let outαP = inαQ (say {v}). Define

P(v′); Q(v) =df ∃m • (P(m) ∧ Q(m))
inα(P; Q) =df inαP
outα(P; Q) =df outαQ

The bound variables m record the intermediate values of the program variables v, and so represent the intermediate state as control passes from P to Q.
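
For a finite state space, the relational reading of these combinators can be prototyped directly: a program is a set of (initial, final) state pairs, and composition hides the intermediate state m. A rough Python sketch (my own illustration, not part of the lecture; it covers composition together with the conditional and ⊓ introduced above):

    # A program over a finite state space is a set of (before, after) pairs.

    def compose(P, Q):
        """P; Q = ∃m • P(v, m) ∧ Q(m, v')"""
        return {(s, u) for (s, m1) in P for (m2, u) in Q if m1 == m2}

    def conditional(P, b, Q):
        """P ⊳ b ⊲ Q, with b a predicate on the initial state."""
        return {(s, t) for (s, t) in P if b(s)} | {(s, t) for (s, t) in Q if not b(s)}

    def choice(P, Q):
        """P ⊓ Q = P ∨ Q"""
        return P | Q

    # Example: states are single integers, programs increment or double them.
    STATES = range(8)
    inc = {(s, s + 1) for s in STATES if s + 1 in STATES}
    dbl = {(s, 2 * s) for s in STATES if 2 * s in STATES}

    print(sorted(compose(inc, dbl)))                       # (s, 2*(s+1))
    print(sorted(conditional(inc, lambda s: s < 4, dbl)))
    print(len(choice(inc, dbl)))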

Laws of Composition

Composition is associative.

(;-1) P; (Q; R) = (P; Q); R

Composition distributes leftward over conditional.

(;-2) (P ⊳ b ⊲ Q); R = (P; R) ⊳ b ⊲ (Q; R)

Assignment

Assignment is the basic action in all procedural programming languages. Every assignment should technically be marked by its alphabet A, which is needed to define an important part of its meaning – that all the variables not mentioned on the left-hand side remain unchanged. Let A = {x, . . . , z}, and FreeVar(e) ⊆ A.

x :=A e =df (x′ = e ∧ . . . ∧ z′ = z)
α(x :=A e) =df A ∪ {v′ | v ∈ A}

Define

skipA =df (x′ = x ∧ . . . ∧ z′ = z)

Laws of Assignment

(asgn-1) x := e = x, y := e, y

The order of the listing is immaterial.

(asgn-2) (x, y, z := e, f, g) = (y, x, z := f, e, g)

Evaluation of an expression or a condition uses the value most recently assigned to its variables.

(asgn-3) (x := e ; x := f(x)) = x := f(e)
(asgn-4) x := e ; (P ⊳ b(x) ⊲ Q) = (x := e; P) ⊳ b(e) ⊲ (x := e; Q)

skip is the unit of composition.

(asgn-5) P; skip = P = skip; P

Non-determinism

Let P and Q be predicates describing the behaviour of programs with the same alphabet. The notation P ⊓ Q stands for a program which executes either P or Q, but with no indication of which one will be chosen.

P ⊓ Q =df P ∨ Q
α(P ⊓ Q) =df αP

Nondeterministic choice may easily be implemented by arbitrary selection of either of the operands, and the selection may be made at any time, either before or after the program is compiled, or even after it starts execution.

Laws of ⊓

⊓ is symmetric, associative and idempotent.

(⊓-1) P ⊓ Q = Q ⊓ P
(⊓-2) P ⊓ (Q ⊓ R) = (P ⊓ Q) ⊓ R
(⊓-3) P ⊓ P = P

Theorem (Link between ⊒ and ⊓) (P ⊒ Q) iff (P ⊓ Q) = Q

Additional Law

⊓ distributes through itself.

(⊓-4) P ⊓ (Q ⊓ R) = (P ⊓ Q) ⊓ (P ⊓ R)

Proof
RHS
  {(⊓-1, 2)}
= (P ⊓ P) ⊓ (Q ⊓ R)
  {(⊓-3)}
= LHS

Distributivity

All programming combinators defined so far distribute through ⊓.

(⊓-5) P ⊳ b ⊲ (Q ⊓ R) = (P ⊳ b ⊲ Q) ⊓ (P ⊳ b ⊲ R)
(⊓-6) (P ⊓ Q); R = (P; R) ⊓ (Q; R)
(⊓-7) P; (Q ⊓ R) = (P; Q) ⊓ (P; R)
(⊓-8) P ⊓ (Q ⊳ b ⊲ R) = (P ⊓ Q) ⊳ b ⊲ (P ⊓ R)

Corollary All combinators are monotonic in all arguments.

Lower and Upper Bounds

Let S be a set of relations with the same alphabet. ⊓S denotes the greatest lower bound of the relations of S, and is defined by

(bound-1) (⊓S) ⊒ R iff P ⊒ R for all P ∈ S

The least upper bound of the relations of S is defined by

(bound-2) (⊔S) ⊑ R iff P ⊑ R for all P ∈ S

Top and Bottom

The top element of the lattice is

⊤ =df ⊓ {} = false

The bottom one is

⊥ =df ⊔ {} = true

Distributivity and Disjunctivity

(bound-3) (⊔S) ⊓ Q = ⊔ {X ⊓ Q | X ∈ S}
(bound-4) (⊓S) ⊔ Q = ⊓ {X ⊔ Q | X ∈ S}

Composition is universally disjunctive.

(bound-5) (⊓S) ; Q = ⊓ {X; Q | X ∈ S}
(bound-6) P; (⊓S) = ⊓ {P; X | X ∈ S}

Tarski's Fixed Point Theorem

If F is a monotonic mapping on a complete lattice (C, ⊒), then the solutions of the equation X = F(X) form a complete lattice, where

µF =df ⊓ {P | P ⊒ F(P)}

is the weakest solution, and

νF =df ⊔ {P | P ⊑ F(P)}

is the strongest solution. In particular µX • X = true = ⊥ and νX • X = false = ⊤.
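
On a finite powerset lattice the extreme fixed points of a monotonic function can be computed by simple iteration from the bottom and top elements (a finite special case of the theorem). A small Python sketch (my own toy example; the universe and F are assumptions, and the lattice is ordered by ⊆ rather than by the refinement ordering used above):

    UNIVERSE = frozenset(range(6))

    def F(X):
        # keep n only if its predecessor is also kept (0 has no predecessor);
        # monotonic with respect to set inclusion
        return frozenset(n for n in X if n == 0 or n - 1 in X)

    def lfp(F, bottom=frozenset()):
        X = bottom
        while F(X) != X:
            X = F(X)
        return X

    def gfp(F, top=UNIVERSE):
        X = top
        while F(X) != X:
            X = F(X)
        return X

    print(sorted(lfp(F)))   # []              : least fixed point under ⊆
    print(sorted(gfp(F)))   # [0, 1, 2, 3, 4, 5]: greatest fixed point under ⊆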

Laws of Fixed Points

(µ-1) (P ⊒ F(P)) iff P ⊒ µF
(µ-2) F(µF) = µF
(ν-1) (P ⊑ F(P)) iff P ⊑ νF
(ν-2) F(νF) = νF

Proof of µ-2

∀Y • (Y ⊒ F(Y)) ⇒ (Y ⊒ µF)
implies ∀Y • (Y ⊒ F(Y)) ⇒ (F(Y) ⊒ F(µF))
implies ∀Y • (Y ⊒ F(Y)) ⇒ (Y ⊒ F(µF))
implies ⊓{Y | Y ⊒ F(Y)} ⊒ F(µF)
equiv µF ⊒ F(µF)
implies (µF ⊒ F(µF)) ∧ (F(µF) ⊒ F^2(µF))
implies (µF ⊒ F(µF)) ∧ (F(µF) ⊒ µF)
equiv µF = F(µF)

Unit 3: Link with wp-Calculus

Precondition and Postcondition

A condition p has been defined as a predicate not containing dashed variables. It describes the values of the global variables of a program Q before its execution starts. Such a condition is therefore called a precondition of the program.

If r is a condition, let r′ be the result of placing a dash on all its variables. As a result, r′ describes the values of the variables of a program Q when it terminates. Such a condition is therefore called a postcondition of the program.

Hoare Triple

The overall specification of a program can often be formalised as a simple implication p ⇒ r′. Define

p{Q}r =df Q ⊒ (p ⇒ r′)
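
With the finite relational model, p{Q}r amounts to: every Q-transition from a state satisfying p ends in a state satisfying r. A quick Python sketch (illustrative only; the example program and predicates are mine):

    def hoare(p, Q, r):
        """p{Q}r = Q ⊒ (p ⇒ r'), for Q given as a set of (before, after) pairs."""
        return all(r(t) for (s, t) in Q if p(s))

    # Example: Q increments x; a state is just the value of x.
    Q = {(x, x + 1) for x in range(10)}
    assert hoare(lambda x: x >= 0, Q, lambda x1: x1 >= 1)
    assert not hoare(lambda x: x >= 0, Q, lambda x1: x1 >= 2)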

A Model for Hoare Logic

(1) If p{Q}r and p{Q}s, then p{Q}(r ∧ s)
(2) If p{Q}r and q{Q}r, then (p ∨ q){Q}r
(3) If p{Q}r, then (p ∧ q){Q}(r ∨ s)
(4) r(e){x := e}r(x)
(5) If (p ∧ b){Q1}r and (p ∧ ¬b){Q2}r, then p{Q1 ⊳ b ⊲ Q2}r
(6) If p{Q1}s and s{Q2}r, then p{Q1; Q2}r
(7) If p{Q1}r and p{Q2}r, then p{Q1 ⊓ Q2}r
(8) If (b ∧ c){Q}c, then c{νX • (Q; X) ⊳ b ⊲ skip}(¬b ∧ c)

Proof of (8)

Let Y =df (c ⇒ ¬b′ ∧ c′). By (ν-1) it is sufficient to prove

((Q; Y) ⊳ b ⊲ skip) ⇒ Y

Assume the antecedent:

((Q; Y) ⊳ b ⊲ skip) ∧ c
= ((b ∧ c ∧ Q); Y) ∨ (¬b ∧ c ∧ skip)
  {(6)}
⇒ ¬b′ ∧ c′

Condition vs Annotation

Conditions can play a vital role in explaining the meaning of a program if they are included at appropriate points as an integral part of the program documentation. Such a condition is asserted or expected to be true at the point at which it is written. It is known as a Floyd assertion; formally it is defined to have no effect if it is true, but to cause failure if it is false.

Assertion and Assumption

c⊥ =df skip ⊳ c ⊲ ⊥   (assertion)
c⊤ =df skip ⊳ c ⊲ ⊤   (assumption)

Theorem
(1) b⊥; c⊥ = (b ∧ c)⊥
(2) b⊥ ⊓ c⊥ = (b ∧ c)⊥
(3) c⊥ ⊳ b ⊲ d⊥ = (c ⊳ b ⊲ d)⊥
(4) b⊥; b⊤ = b⊥

Correctness

The definition of correctness can be rewritten in a number of different ways:

[Q ⇒ (p ⇒ r′)]
= [p ⇒ (Q ⇒ r′)]
= [p ⇒ (∀v′ • Q ⇒ r′)]
= [p ⇒ ¬∃v′ • (Q ∧ ¬r′)]
= [p ⇒ ¬(Q; ¬r)]

This last reformulation answers the following question: what is the weakest precondition under which execution of Q is guaranteed to achieve the condition r′?

Weakest Precondition

Q wp r =df ¬(Q; ¬r)

(wp-1) (x := e) wp r(x) = r(e)
(wp-2) (P; Q) wp r = P wp (Q wp r)
(wp-3) (P ⊳ b ⊲ Q) wp r = (P wp r) ⊳ b ⊲ (Q wp r)
(wp-4) (P ⊓ Q) wp r = (P wp r) ∧ (Q wp r)
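
In the finite relational model, Q wp r = ¬(Q; ¬r) is the set of initial states from which no Q-step can reach a state outside r. A sketch (my own illustration; note that states with no Q-transition at all satisfy the wp vacuously, exactly as ¬(Q; ¬r) prescribes):

    def wp(Q, r, states):
        """Q wp r = ¬(Q; ¬r): no Q-step from s may land outside r."""
        return {s for s in states if all(r(t) for (s0, t) in Q if s0 == s)}

    STATES = range(6)
    Q = {(x, x + 1) for x in STATES if x + 1 in STATES}   # x := x + 1, undefined at 5
    print(sorted(wp(Q, lambda x: x >= 3, STATES)))        # [2, 3, 4, 5]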

Healthiness Conditions

Theorem
(1) If [r ⇒ s], then [Q wp r ⇒ Q wp s]
(2) If [Q ⇒ S], then [S wp r ⇒ Q wp r]
(3) Q wp (r ∧ s) = (Q wp r) ∧ (Q wp s)
(4) Q wp false = false, provided that Q; true = true

Right Inverse of Composition

Define S/Q =df Q wp S

Theorem (P; Q) ⊒ S iff P ⊒ S/Q

Theorem
(1) S/(Q1; Q2) = (S/Q2)/Q1
(2) S/(Q1 ⊳ b ⊲ Q2) = (S/Q1) ⊳ b ⊲ (S/Q2)
(3) S/(Q1 ⊓ Q2) = (S/Q1) ⊔ (S/Q2)
(4) (S1 ∧ S2)/Q = (S1/Q) ⊔ (S2/Q)

Left Inverse of Composition

Let P and S be predicates describing two programs. The notation P\S denotes the weakest postspecification of P with respect to S, and is defined by

(P; Q) ⊒ S iff Q ⊒ P\S

Theorem (P\S)/Q = P\(S/Q)

Unit 4: Design Calculus

Incomplete Observation

Consider the program

⊥ ; (x, y, z := 1, 2, 3)

In any normal implementation it would fail to terminate, and so be equal to ⊥. Unfortunately, our theory gives

⊥; (x′ = 1 ∧ y′ = 2 ∧ z′ = 3) = (x′ = 1 ∧ y′ = 2 ∧ z′ = 3)

This is the same as if the prior non-terminating program had been omitted.

Required Laws

What we need are the following two laws:

⊥; P = ⊥   (left zero law)
P; ⊥ = ⊥   (right zero law)

Initiation and Termination

We add two logical variables to the alphabet:

  • ok records the observation that the program has been started.
  • ok′ records the observation that the program has terminated.

Remark: ok and ok′ are not global variables held in the store of any program, and it is assumed that they will never be mentioned in any expression or assignment of the program text.

Design

A design D is a relation which can be expressed as

(ok ∧ P) ⇒ (ok′ ∧ Q)

We write D = (P ⊢ Q), where the predicates P and Q contain no ok or ok′, and

(1) P is an assumption which the designer can rely on when D is initiated.
(2) Q is the commitment which must be true when D terminates.

⊥ can be rewritten as ⊥ = true = (false ⊢ true)

Refinement

Theorem [(P1 ⊢ Q1) ⇒ (P2 ⊢ Q2)] iff [P2 ⇒ P1] and [(P2 ∧ Q1) ⇒ Q2]
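
The theorem reduces refinement of designs to two predicate-level proof obligations, which are easy to prototype over a finite state space. A sketch (my own illustration; the value range and the two example designs are assumptions):

    from itertools import product

    VALUES = range(4)

    def design_refines(P1, Q1, P2, Q2):
        """(P1 ⊢ Q1) ⊒ (P2 ⊢ Q2)  iff  [P2 ⇒ P1]  and  [(P2 ∧ Q1) ⇒ Q2]."""
        obs = list(product(VALUES, repeat=2))       # (x, x') pairs
        pre_ok  = all(P1(x) for (x, _) in obs if P2(x))
        post_ok = all(Q2(x, x1) for (x, x1) in obs if P2(x) and Q1(x, x1))
        return pre_ok and post_ok

    # (x > 0 ⊢ x' = x + 1) refines the vaguer design (x > 1 ⊢ x' > x)
    assert design_refines(lambda x: x > 0, lambda x, x1: x1 == x + 1,
                          lambda x: x > 1, lambda x, x1: x1 > x)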

Proof

LHS
≡ [D1[true, false/ok, ok′] ⇒ D2[true, false/ok, ok′]]
  ∧ [D1[true, true/ok, ok′] ⇒ D2[true, true/ok, ok′]]
  ∧ [D1[false/ok] ⇒ D2[false/ok]]
≡ [¬P1 ⇒ ¬P2] ∧ [(P1 ⇒ Q1) ⇒ (P2 ⇒ Q2)]
≡ [P2 ⇒ P1] ∧ [((P1 ⇒ Q1) ∧ P2) ⇒ Q2]
≡ RHS

Equivalence

(D1 ≡ D2) =df (D1 ⊒ D2) and (D2 ⊒ D1)

Theorem
(1) (P ⊢ Q) ≡ (P ⊢ (P ∧ Q))
(2) (P ⊢ Q) ≡ (P ⊢ (P ⇒ Q))
(3) (P ⊢ Q) ≡ (P ⊢ R) iff [(P ∧ Q) ⇒ R] and [R ⇒ (P ⇒ Q)]
(4) (false ⊢ false) ≡ (false ⊢ true)

Nondeterminism

Theorem (P1 ⊢ Q1) ⊓ (P2 ⊢ Q2) = (P1 ∧ P2) ⊢ (Q1 ∨ Q2)

Proof
LHS
= (P1 ⊢ Q1) ∨ (P2 ⊢ Q2)
= ¬ok ∨ ¬P1 ∨ (ok′ ∧ Q1) ∨ ¬ok ∨ ¬P2 ∨ (ok′ ∧ Q2)
= ¬ok ∨ ¬(P1 ∧ P2) ∨ (ok′ ∧ (Q1 ∨ Q2))
= RHS

Conditional and Composition

Theorem (P1 ⊢ Q1) ⊳ b ⊲ (P2 ⊢ Q2) = (P1 ⊳ b ⊲ P2) ⊢ (Q1 ⊳ b ⊲ Q2)

Theorem (P1 ⊢ Q1); (P2 ⊢ Q2) = (¬(¬P1; true) ∧ ¬(Q1; ¬P2)) ⊢ (Q1; Q2)

Proof

LHS
= (D1[false/ok′]; D2[false/ok]) ∨ (D1[true/ok′]; D2[true/ok])
= ((¬ok ∨ ¬P1); true) ∨ ((¬ok ∨ ¬P1 ∨ Q1); D2[true/ok])
= ¬ok ∨ ((¬P1); true) ∨ (Q1; (¬P2 ∨ (ok′ ∧ Q2)))
= RHS

Left Zero Law

⊥; (P ⊢ Q) = ⊥

Proof
⊥; (P ⊢ Q)
= (false ⊢ false); (P ⊢ Q)
= (¬(true; true) ∧ ¬(false; ¬P)) ⊢ (false; Q)
= false ⊢ false
= ⊥

Left Unit Law

skip; D = D, where skip =df true ⊢ (x′ = x ∧ . . . ∧ z′ = z)

Proof
skip; (P ⊢ Q)
= (¬((¬true); true) ∧ ¬((x′ = x ∧ . . . ∧ z′ = z); ¬P)) ⊢ ((x′ = x ∧ . . . ∧ z′ = z); Q)
= P ⊢ Q

Complete Lattice of Designs

Theorem
(1) ⊓i (Pi ⊢ Qi) = (∧i Pi) ⊢ (∨i Qi)
(2) ⊔i (Pi ⊢ Qi) = (∨i Pi) ⊢ (∧i (Pi ⇒ Qi))

The top design is defined by

⊤D =df true ⊢ false = ¬ok

It is a left zero of sequential composition in the design calculus: ⊤D; D = ⊤D

Assignment Revisited

If the evaluation of expression e always yields a value, then

(x := e) =df true ⊢ (x′ = e ∧ . . . ∧ z′ = z)

Well-definedness of Expressions

Let WD(e) be a predicate which is true in just those circumstances in which e can be successfully evaluated:

WD(x) = true
WD(e1 + e2) = WD(e1) ∧ WD(e2)
WD(e1/e2) = WD(e1) ∧ WD(e2) ∧ (e2 ≠ 0)

Define

(x := e) =df WD(e) ⊢ (x′ = e ∧ . . . ∧ z′ = z)
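
WD is a straightforward structural recursion over the syntax of expressions. A minimal sketch (illustrative only; the tuple-based expression representation and the explicit state argument are mine):

    # Expressions as nested tuples: ("var", name), ("const", n),
    # ("+", e1, e2), ("/", e1, e2).

    def eval_exp(e, state):
        tag = e[0]
        if tag == "var":
            return state[e[1]]
        if tag == "const":
            return e[1]
        _, e1, e2 = e
        a, b = eval_exp(e1, state), eval_exp(e2, state)
        return a + b if tag == "+" else a // b

    def WD(e, state):
        """True iff e can be successfully evaluated in the given state."""
        tag = e[0]
        if tag in ("var", "const"):
            return True
        _, e1, e2 = e
        ok = WD(e1, state) and WD(e2, state)
        if tag == "/":
            return ok and eval_exp(e2, state) != 0
        return ok

    assert WD(("/", ("var", "x"), ("const", 2)), {"x": 6})
    assert not WD(("/", ("var", "x"), ("const", 0)), {"x": 6})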

Algebraic Laws of Assignment

We have to re-establish the following laws:

(asgn-1) x := e = x, y := e, y
(asgn-2) (x, y, z := e, f, g) = (y, x, z := f, e, g)
(asgn-3) (x := e ; x := f(x)) = x := f(e)
(asgn-4) x := e ; (P ⊳ b(x) ⊲ Q) = (x := e; P) ⊳ b(e) ⊲ (x := e; Q)
(asgn-5) P; skip = P = skip; P

Variable Declaration

To introduce a new variable x we use the declaration var x, which permits the variable x to be used in the portion of the program that follows it. The complementary operation (called undeclaration) takes the form end x and terminates the region of permitted use of x. The portion of program Q in which a variable x may be used is called its scope; it is bracketed on the left and on the right by the declaration and the undeclaration:

var x; Q; end x

Definition of Declaration

Let A be an alphabet which includes x and x′. Then

var x =df ∃x • skipA
end x =df ∃x′ • skipA
α(var x) =df A \ {x}
α(end x) =df A \ {x′}

Note that the alphabet constraints forbid the redeclaration of a variable within its own scope. For example, var x; var x is disallowed because x′ ∈ OUTα(var x) but x ∉ INα(var x).

Declaration vs Existential Quantifier

Declaration and undeclaration act exactly like existential quantification over their scope:

var x ; Q = ∃x • Q
Q ; end x = ∃x′ • Q

As a result, the algebraic laws for declaration closely match those for existential quantification.

Algebraic Laws

Both declaration and undeclaration are commutative.

(var-1) (var x; var y) = (var y; var x)
(var-2) (end x; end y) = (end y; end x)
(var-3) (var x; end y) = (end y; var x), provided that x and y are distinct.
(var-4) If x is not free in b, then
        var x; (P ⊳ b ⊲ Q) = (var x; P) ⊳ b ⊲ (var x; Q)
        end x; (P ⊳ b ⊲ Q) = (end x; P) ⊳ b ⊲ (end x; Q)

Composition of Declaration and Undeclaration

var x followed by end x has no effect whatsoever.

(var-5) var x; end x = skip

The sequential composition of end x with var x has no effect whenever it is followed by an update of x that does not rely on the previous value of x.

(var-6) (end x; var x; x := e) = (x := e), provided that x does not occur in e.

Assignment to a variable just before the end of its scope is irrelevant.

(var-7) (x := e; end x) = end x

Example: Compilation of Assignments

Many computers can execute only the simplest assignments, with just two or three operands, and most of these operands must be selected from among the available machine registers (such as a, b, c).

effect(load x) = a, b := x, a
effect(store x) = x, a := a, b
effect(add) = a := a + b
effect(subtract) = a := a − b
effect(multiply) = a := a × b

Machine Code for (x := y × z + w)

var a, b
a, b := y, a      load y
a, b := z, a      load z
a := a × b        multiply
a, b := w, a      load w
a := a + b        add
x, a := a, b      store x

Unit 5: Implementation

Machine Language

  • Alphabet: Machine state
  • Operations: Instruction set

Machine State

  • 1. A code memory m can be modelled as a function from addresses to machine instructions, m : rom → instruction. Once machine code is stored in m, any update of m is forbidden.
  • 2. A data memory M maps addresses to integers, M : ram → INT. The state of M can be updated by executing a machine instruction.

Remark

In practice, both m and M are part of the storage provided by the machine. It is the responsibility of the designer of a compiler to ensure that the storage is properly divided to accommodate both the target code and the data of a source program.

Machine State (Cont’d)

  • 3. A program pointer P : rom always points to the current instruction. By updating the contents of P the computer can choose to execute a specific branch of the machine code.
  • 4. A stack of data registers A, B, C : INT is seen as an extension of the data memory.

State Update

Machine instructions are defined as assignments that update the state of the data memory and the stack of registers.

ldc(w) has a constant w as its operand, and its execution pushes w into the stack of registers:

ldc(w) =df A, B, C, P := w, A, B, P + 1

ldl(w) pushes the contents of M[w] to the stack of registers:

ldl(w) =df A, B, C, P := M[w], A, B, P + 1

State Update (Cont'd)

stl(w) sends the contents of A back into the data memory at location w:

stl(w) =df M[w], P := A, P + 1

swap(R1, R2) swaps the contents of the data registers R1 and R2:

swap(R1, R2) =df R1, R2 := R2, R1

Evaluation of Expressions

add =df A, P := B + A, P + 1
sub =df A, P := B − A, P + 1
mul =df A, P := B × A, P + 1
divi =df A, P := A/B, P + 1
or =df (A, P := 0, P + 1) ⊳ A = 0 ∧ B = 0 ⊲ (A, P := 1, P + 1)
and =df (A, P := 1, P + 1) ⊳ A = 1 ∧ B = 1 ⊲ (A, P := 0, P + 1)

Evaluation of Expressions (Cont'd)

not =df (A, P := 0, P + 1) ⊳ A = 1 ⊲ (A, P := 1, P + 1)
eqc(w) =df (A, P := 1, P + 1) ⊳ A = w ⊲ (A, P := 0, P + 1)
gt =df (A, P := 1, P + 1) ⊳ B > A ⊲ (A, P := 0, P + 1)

Machine Programs

Suppose a machine program is stored in the memory m between locations s and f. Its behaviour is the combined effect of running the sequence of machine instructions:

I(s, f, m) =df P := s ; (s ≤ P < f) ∗ m[P] ; (P = f)⊥
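
The behaviour I(s, f, m) is directly executable: set P to s, run instructions while s ≤ P < f, and check P = f on exit. A Python sketch (illustrative only; it implements just the handful of instructions defined above, with the machine state passed explicitly) running machine code for x := y × z + w with symbol table x→0, y→1, z→2, w→3:

    # Machine state: data memory M (a list), registers A, B, C, program pointer P.

    def ldl(w):   # A, B, C, P := M[w], A, B, P + 1
        def step(st):
            st["A"], st["B"], st["C"], st["P"] = st["M"][w], st["A"], st["B"], st["P"] + 1
        return step

    def stl(w):   # M[w], P := A, P + 1
        def step(st):
            st["M"][w], st["P"] = st["A"], st["P"] + 1
        return step

    def add(st):  # A, P := B + A, P + 1
        st["A"], st["P"] = st["B"] + st["A"], st["P"] + 1

    def mul(st):  # A, P := B * A, P + 1
        st["A"], st["P"] = st["B"] * st["A"], st["P"] + 1

    def I(s, f, m, st):
        """P := s ; (s <= P < f) * m[P] ; assert P = f"""
        st["P"] = s
        while s <= st["P"] < f:
            m[st["P"]](st)
        assert st["P"] == f

    m = {0: ldl(1), 1: ldl(2), 2: mul, 3: ldl(3), 4: add, 5: stl(0)}
    st = {"M": [0, 5, 7, 4], "A": 0, "B": 0, "C": 0, "P": 0}
    I(0, 6, m, st)
    assert st["M"][0] == 5 * 7 + 4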

Symbol Table

At compile time there are a number of items to which storage must be allocated in order to execute the target code. In our case we assume that the target code always resides in m, and the program variables are stored in the data memory M. We use Ψ to denote a symbol table which maps each program variable to an address of the data memory M. Because no location can be allocated to two identifiers, Ψ has to be injective.

Linking Program State with Machine State

Define

Ψ̂ =df (var x, .., z ; x, .., z := M[Ψx], ..., M[Ψz] ; end M, A, B, C)
Ψ̂⁻¹ =df (var M, A, B, C ; M[Ψx], ..., M[Ψz] := x, .., z ; end x, .., z)

Valid Implementation

A program state can be recovered after it is converted to a machine state and then retrieved back.

Theorem
(1) Ψ̂⁻¹; Ψ̂ = skip
(2) Ψ̂ ; Ψ̂⁻¹ ⊑ skip

A machine program stored in m is regarded as a realisation of Q if

Ψ̂ ; Q ⊑ I(s, f, m) ; Ψ̂⁻¹

Weakest Specification of Valid Implementation

For any source program Q, we define a machine program which acts like Q except that it operates on the data memory M:

[Ψ](Q) =df Ψ̂ ; Q ; Ψ̂⁻¹

Theorem (Ψ̂ ; Q) ⊑ (I(s, f, m) ; Ψ̂⁻¹) iff [Ψ](Q) ⊑ I(s, f, m)

Proof

(Ψ̂ ; Q) ⊑ (I(s, f, m); Ψ̂)
  {; is monotonic}
⇒ (Ψ̂; Q; Ψ̂⁻¹) ⊑ (I(s, f, m); Ψ̂; Ψ̂⁻¹)
  {Ψ̂; Ψ̂⁻¹ ⊑ skip}
⇒ [Ψ](Q) ⊑ I(s, f, m)
  {; is monotonic}
⇒ ([Ψ](Q); Ψ̂) ⊑ (I(s, f, m) ; Ψ̂)
  {Ψ̂⁻¹; Ψ̂ = skip}
⇒ (Ψ̂ ; Q) ⊑ (I(s, f, m); Ψ̂)

Implementation of Assignment

Define eΨ =df e[M[Ψx]/x, .., M[Ψz]/z]

Theorem [Ψ](x := e) ⊑ M[Ψx] := eΨ

Proof

LHS
  {end M ; x := e = x := e; end M}
= var x, .., z ; x, .., z := M[Ψx], .., M[Ψz] ; x := e ; end M, A, B, C ; Ψ̂⁻¹
  {merge assignments}
⊑ var x, .., z ; x, .., z := eΨ, .., M[Ψz] ; M[Ψx], .., M[Ψz] := x, .., z ; end x, .., z
  {merge assignments}
⊑ var x, .., z ; x, .., z := eΨ, .., M[Ψz] ; end x, .., z ; M[Ψx], .., M[Ψz] := eΨ, .., M[Ψz]
  {var x; x := e; end x = skip}
= M[Ψx] := eΨ

Implementation of Sequential Composition

Theorem [Ψ](Q; R) = [Ψ](Q); [Ψ](R)

Proof
LHS
  {def of [Ψ]}
= Ψ̂; Q; R; Ψ̂⁻¹
  {Ψ̂⁻¹; Ψ̂ = skip}
= Ψ̂; Q; Ψ̂⁻¹; Ψ̂; R; Ψ̂⁻¹
  {def of [Ψ]}
= RHS

Implementation of Conditional

Theorem [Ψ](Q ⊳ b ⊲ R) = [Ψ](Q) ⊳ bΨ ⊲ [Ψ](R)

Proof
LHS
  {(Q ⊳ b ⊲ R); W = (Q; W) ⊳ b ⊲ (R; W)}
= Ψ̂; ((Q; Ψ̂⁻¹) ⊳ b ⊲ (R; Ψ̂⁻¹))
  {end w; (Q ⊳ b ⊲ R) = (end w; Q) ⊳ b ⊲ (end w; R),
   var w; (Q ⊳ b ⊲ R) = (var w; Q) ⊳ b ⊲ (var w; R)}
= (var x, .., z ; x, .., z := M[Ψx], .., M[Ψz]; end M, A, B, C, P; Q ; Ψ̂⁻¹)
  ⊳ bΨ ⊲
  (var x, .., z ; x, .., z := M[Ψx], .., M[Ψz]; end M, A, B, C, P; R ; Ψ̂⁻¹)
= RHS

Implementation of Iteration

Theorem [Ψ](b ∗ Q) ⊑ bΨ ∗ [Ψ](Q)

Proof
(Ψ̂⁻¹ ; RHS; Ψ̂)
  {(Q ⊳ b ⊲ R); W = (Q; W) ⊳ b ⊲ (R; W)}
= Ψ̂⁻¹; (([Ψ](Q); RHS; Ψ̂) ⊳ bΨ ⊲ Ψ̂)
  {x := e; (Q ⊳ b ⊲ R) = (x := e; Q) ⊳ b[e/x] ⊲ (x := e; R)}
= (Ψ̂⁻¹; [Ψ](Q); RHS; Ψ̂) ⊳ b ⊲ (Ψ̂⁻¹; Ψ̂)
  {Ψ̂⁻¹; Ψ̂ = skip}
= (Q; (Ψ̂⁻¹; RHS; Ψ̂)) ⊳ b ⊲ skip

which implies (Ψ̂; RHS; Ψ̂⁻¹ ⊒ b ∗ Q) ⇒ (RHS ⊒ Ψ̂⁻¹; (b ∗ Q); Ψ̂)

Empty Machine Program

The empty machine program is the unit of sequential composition.

Theorem I(s, s, m); I(s, f, m) = I(s, f, m) = I(s, f, m); I(f, f, m)

Proof
I(s, s, m)
  {def of I}
= P := s ; (s ≤ P < s) ∗ m[P] ; (P = s)⊥
  {false ∗ Q = skip}
= P := s ; (P = s)⊥
  {Q ⊳ true ⊲ R = Q}
= P := s

Entry of Machine Code

Theorem If s ≤ j < f, then I(j, f, m) ⊑ P := j ; (s ≤ P < f) ∗ m[P] ; (P = f)⊥

Proof
RHS
  {(b ∨ c) ∗ Q = b ∗ Q ; (b ∨ c) ∗ Q}
= P := j ; (j ≤ P < f) ∗ m[P] ; (s ≤ P < f) ∗ m[P] ; (P = f)⊥
  {(P = f)⊥ = (P = f)⊥; (P := f)}
⊒ I(j, f, m) ; (P := f) ; (s ≤ P < f) ∗ m[P] ; (P = f)⊥
  {false ∗ Q = skip}
= I(j, f, m); P := f; (P = f)⊥
  {b⊥; b⊥ = b⊥}
= LHS

Elimination of Machine Code

Theorem If s < f and m[s] = jump(f − s − 1), then I(s, f, m) = I(f, f, m)

Proof
LHS
  {unfolding of b ∗ Q}
= P := s ; m[s] ; (s ≤ P < f) ∗ m[P] ; (P = f)⊥
  {def of jump(w)}
= P := s ; P := P + f − s ; (s ≤ P < f) ∗ m[P] ; (P = f)⊥
  {false ∗ Q = skip}
= P := f ; (P = f)⊥
  {false ∗ Q = skip}
= P := f; (f ≤ P < f) ∗ m[P]; (P = f)⊥

Void Initial State

If a machine program contains an instruction jumping backwards to its start address, then it does not matter whether the execution starts at its first instruction or at that jump instruction.

Theorem If s ≤ j < f, and m[j] = jump(s − j − 1), then

(P := j ; (s ≤ P < f) ∗ m[P]) = (P := s ; (s ≤ P < f) ∗ m[P])

Proof
LHS
  {unfolding iteration}
= P := j ; m[j] ; (s ≤ P < f) ∗ m[P]
  {def of jump(w)}
= P := j ; P := P + s − j ; (s ≤ P < f) ∗ m[P]
  {merge assignments}
= RHS

Concatenation

The concatenation of two machine programs is a refinement of their sequential composition, since it is easy for the former to materialise its normal exit commitment.

Theorem If s ≤ j ≤ f, then I(s, f, m) ⊒ I(s, j, m); I(j, f, m)

Proof

RHS
  {def of I}
= P := s; (s ≤ P < j) ∗ m[P] ; (P = j)⊥; P := j; (j ≤ P < f) ∗ m[P]; (P = f)⊥
  {Entry of machine code}
⊑ P := s; (s ≤ P < j) ∗ m[P] ; (P = j)⊥; P := j; (s ≤ P < f) ∗ m[P]; (P = f)⊥
  {(P = j)⊥ = (P = j)⊥; (P := j)}
⊑ P := s ; (s ≤ P < j) ∗ m[P] ; skip ; (s ≤ P < f) ∗ m[P] ; (P = f)⊥
  {b ∗ Q ; (b ∨ c) ∗ Q = (b ∨ c) ∗ Q}
= P := s ; (s ≤ P < f) ∗ m[P] ; (P = f)⊥
  {def of I}
= LHS

Conditional Jump vs Conditional

A conditional jump can be used to choose a branch of the machine code according to the initial state of the register A.

Theorem If s < j − 1 < f and m[s] = tjump(j − s − 1) and m[j − 1] = jump(f − j − 1), then

I(s, f, m) ⊒ (I(j, f, m) ⊳ A = 0 ⊲ (I(s + 1, j − 1, m); I(f, f, m)))

Proof

LHS
  {unfolding iteration}
= P := s ; m[s] ; (s ≤ P < f) ∗ m[P] ; (P = f)⊥
  {def of tjump}
= (P := j ; (s ≤ P < f) ∗ m[P] ; (P = f)⊥) ⊳ A = 0 ⊲ (P := s + 1; (s ≤ P < f) ∗ m[P] ; (P = f)⊥)
  {Entry of machine code}
⊒ I(j, f, m) ⊳ A = 0 ⊲ (P := s + 1 ; (s ≤ P < j − 1) ∗ m[P] ; (s ≤ P < f) ∗ m[P] ; (P = f)⊥)
  {(P = k)⊥ = (P = k)⊥; P := k}
⊒ I(j, f, m) ⊳ A = 0 ⊲ (P := s + 1 ; (s ≤ P < j − 1) ∗ m[P] ; (P = j − 1)⊥ ; P := j − 1 ; (s ≤ P < f) ∗ m[P] ; (P = f)⊥)
  {Void initial state}
⊒ RHS

Backward Jump vs Iteration

Theorem Assume that s < j < f − 1. If I(s, j, m) ⊒ (A := 1 ⊳ b ⊲ A := 0) and m[j] = tjump(f − j − 1) and m[f − 1] = jump(s − f), then

I(s, f, m) ⊒ µX • ((A := 1 ; I(j + 1, f − 1, m) ; X) ⊳ b ⊲ (A, P := 0, f))

Proof

LHS
  {b ∗ Q ; (b ∨ c) ∗ Q = (b ∨ c) ∗ Q}
⊒ I(s, j, m) ; (P := j) ; (s ≤ P < f) ∗ m[P] ; (P = f)⊥
  {I(s, j, m) ⊒ (A := 1 ⊳ b ⊲ A := 0)}
⊒ I(s, j, m) ; (I(f, f, m) ⊳ A = 0 ⊲ (I(j + 1, f − 1, m); (P := f); (s ≤ P < f) ∗ m[P] ; (P = f)⊥))
  {sequential composition}
= ((A := 1) ⊳ b ⊲ (A := 0)) ; (I(f, f, m) ⊳ A = 0 ⊲ (I(j + 1, f − 1, m); I(s, f, m)))
  {; – ⊳ ⊲ distributivity}
= (A := 1; I(j + 1, f − 1, m); I(s, f, m)) ⊳ b ⊲ (A, P := 0, f)

Unit 6: Design Capturing

Healthiness Condition 1

R makes no prediction on the final values of variables until the program has started.

H1   R = (ok ⇒ R)

Theorem H1 (Algebraic Representation) R satisfies H1 iff R satisfies the left zero and left unit laws.

Proof of Theorem H1 (If)

R = skip; R
= (¬ok ∨ skip); R
= (¬ok; R) ∨ (skip; R)
= (¬ok; true; R) ∨ (skip; R)
= (¬ok; true) ∨ R
= ¬ok ∨ R

Proof of Theorem H1 (Only if)

true; R
= true; (¬ok ∨ R)
= (true; ¬ok) ∨ (true; R)
= true

Healthiness Condition 2

Non-termination is something that is never wanted.

H2   [R[false/ok′] ⇒ R[true/ok′]]

Theorem H2 (Syntactical Representation) A predicate satisfies the healthiness conditions H1 and H2 iff it is a design.

Proof of Theorem H2 (Only if)

R
= ¬ok ∨ R
= ¬ok ∨ (¬ok′ ∧ R[false/ok′]) ∨ (ok′ ∧ R[true/ok′])
= ¬ok ∨ (¬ok′ ∧ R[false/ok′]) ∨ (ok′ ∧ (R[false/ok′] ∨ R[true/ok′]))
= ¬ok ∨ R[false/ok′] ∨ (ok′ ∧ R[true/ok′])
= ¬R[false/ok′] ⊢ R[true/ok′]

Healthiness Condition 3

If R fails to terminate, then its behaviour is unpredictable.

H3   R = R; skip

Theorem H3 (P ⊢ Q) satisfies H3 iff P = ∀v′ • P

Proof of Theorem H3

(P ⊢ Q); skip
= (P ⊢ Q); (true ⊢ (v′ = v))
= ¬(¬P; true) ⊢ Q

which implies that P ⊢ Q satisfies H3 iff P = ¬(¬P; true) = ∀v′ • P.

An assumption P satisfying the above condition is called a precondition: it refers only to the initial values of the variables.

Healthiness Condition 4

If the precondition of a program is satisfied, the program is required to deliver final values for the program variables.

H4   R; true = true

Theorem H4 b ⊢ Q satisfies H4 iff [b ⇒ (Q; true)]

Proof of Theorem H4

(b ⊢ Q); true
= (b ∧ ¬(Q; true)) ⊢ (Q; true)

which implies that b ⊢ Q satisfies H4 iff b ∧ ¬(Q; true) = false, or equivalently [b ⇒ (Q; true)].

Closure of Healthy Predicates

Theorem If both P and Q satisfy Hi, then so do P; Q, P ⊓ Q and P ⊳ b ⊲ Q.

Proof of Closure Theorem

Let P and Q be in H1. We show that P; Q satisfies the left zero and left unit laws.

true; (P; Q) = (true; P); Q = true; Q = true
skip; (P; Q) = (skip; P); Q = P; Q

Mapping Relations to Designs

A mapping M from relations to designs is a retraction if it satisfies the following conditions:

  • Monotonic: P1 ⊒ P2 implies M(P1) ⊒ M(P2)
  • Idempotent: M(M(P)) = M(P)
  • Weakening: P ⊒ M(P)

We are going to show that each healthiness condition can be characterised by a corresponding retraction.

H1-healthy Predicates

Define H1(P) =df (ok ⇒ P)

Theorem H1 is monotonic, idempotent and weakening: P ⊒ H1(P) for all P.

Theorem P satisfies the healthiness condition H1 iff P = H1(P)
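
Over a finite set of observations on {ok, x, ok′, x′}, the mapping H1 can be applied directly and its retraction properties checked by enumeration. A small sketch (illustrative only; the observation space and example predicate are mine):

    from itertools import product

    XS = range(2)
    OBS = set(product([False, True], XS, [False, True], XS))   # (ok, x, ok', x')

    def H1(P):
        """H1(P) = (ok ⇒ P): add every observation whose ok is false."""
        return P | {o for o in OBS if not o[0]}

    # An arbitrary predicate: "the program was started and x' = x + 1 mod 2"
    P = {o for o in OBS if o[0] and o[3] == (o[1] + 1) % 2}

    assert H1(H1(P)) == H1(P)   # idempotent
    assert P <= H1(P)           # weakening: P ⇒ H1(P), i.e. P ⊆ H1(P) as sets
    assert P != H1(P)           # P itself is not H1-healthy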

Homomorphism

Theorem H1(P op Q) ⊒ (H1(P) op H1(Q))

Proof
H1(P op Q)
⊒ H1(H1(P) op H1(Q))
= H1(P) op H1(Q)

Closure of Recursion

Theorem µX • F(X) satisfies the healthiness condition H1, where F is formed by programming combinators.

Proof
H1(µX • F(X))
= H1(F(µX • F(X)))
⊒ F(H1(µX • F(X)))

which implies that H1(µX • F(X)) ⊒ µX • F(X). Because H1 is weakening, one has µX • F(X) ⊒ H1(µX • F(X)).

H2-healthy Predicates

Define H2(P) =df P; ((ok ⇒ ok′) ∧ (x′ = x ∧ . . . ∧ z′ = z))

Theorem
(1) H2 is monotonic, idempotent and weakening.
(2) P satisfies H2 iff P = H2(P)
(3) H2(P op Q) ⊒ (H2(P) op H2(Q))
(4) µX • F(X) satisfies H2, where F is formed by programming combinators.

H3-healthy Predicates

Define H3(P) =df (P; skip)

Theorem
(1) H3 is monotonic, idempotent and weakening.
(2) P satisfies H3 iff P = H3(P)
(3) H3(P op Q) ⊒ (H3(P) op H3(Q))
(4) µX • F(X) satisfies H3, where F is formed by programming combinators.

H4-healthy Predicates

Define H4(P) =df ((P; true) ⇒ P)

Theorem
(1) H4 is idempotent and weakening.
(2) P satisfies H4 iff P = H4(P)
(3) H4(P op Q) ⊒ (H4(P) op H4(Q))
(4) µX • F(X) satisfies H4, where F is formed by programming combinators.

Unit 7: Linking Theories

Subset Theories

If T is a subset of S, there is always a link from T to S:

R =df λX ∈ T • X

A more interesting mapping is to be sought in the other direction: it is an endofunction L : S → S which ranges over all the members of T:

range(L) = T

Examples

idS =df λX ∈ S • X
orP =df λX ∈ S • (P ∨ X)
preP =df λX ∈ S • (P; X)
postQ =df λX ∈ S • (X; Q)

Endofunctions

Theorem (range of an idempotent) If L is idempotent, then X = L(X) iff X lies in the range of L.

Theorem (monotonic endofunctions) If both L1 and L2 are monotonic, so is their composition L1 ◦ L2.

Theorem (weakening endofunctions) If both L1 and L2 are monotonic and weakening, so is their composition L1 ◦ L2.

Closure of Idempotents

Theorem (idempotent endofunctions) If L1 and L2 are idempotent and satisfy L1 ◦ L2 = L2 ◦ L1, then L1 ◦ L2 is an idempotent.

Examples
(1) Ψ =df (H1 ◦ H2 ◦ H3) is an idempotent.
(2) Ψ ◦ H4 is also idempotent.

Link and Recursion

L is a link if it is idempotent and weakening. Let S be a complete lattice, and T ⊆ S also a complete lattice. Assume that F is a monotonic mapping on S, and closed in T.

Theorem
(1) If L : S → S is a link and satisfies L ◦ F ⊒ F ◦ L, then µS X • F(X) = µT X • F(X)
(2) If L is monotonic and idempotent, and satisfies L ◦ F ⊒ F ◦ L, then L(µS X • F(X)) = µT X • F(X)

Proof of (1)

L(µS X • F(X))
= L(F(µS X • F(X)))
⊒ F(L(µS X • F(X)))

which, together with L(µS X • F(X)) ∈ T, implies that

µT X • F(X) ⊑ L(µS X • F(X)) ⊑ µS X • F(X)

Proof of (2)

µT X • F(X) ⊒ µS X • F(X)
⇒ L(µT X • F(X)) ⊒ L(µS X • F(X))
⇒ µT X • F(X) ⊒ L(µS X • F(X))

µS X • F(X) = F(µS X • F(X))
⇒ L(µS X • F(X)) = L(F(µS X • F(X)))
⇒ L(µS X • F(X)) ⊒ F(L(µS X • F(X)))
⇒ L(µS X • F(X)) ⊒ µT X • F(X)

Goal of a Subset Theory

A frequent goal in the definition of a subset theory is to ensure the validity on that subset of certain additional algebraic laws. For example, the desired law may be of the form

X = L(X)

Now if L happens to be idempotent, the goal is simply achieved: just take the link L itself and define T =df range(L).

Goal of a Subset Theory

As an additional advantage, it is frequently found that other useful laws become true in T. For example, let

L =df H1 = λX • (ok ⇒ X)

In the subset T =df range(L), both the left zero law and the left unit law are valid.

Unit 8: Galois Connection

Galois Connection

Let (S, ⊒S) and (T, ⊒T) be complete lattices. L : S → T and R : T → S form a Galois connection if, for all X ∈ S and all Y ∈ T,

(L(X) ⊒T Y) iff (X ⊒S R(Y))
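
A standard concrete instance that is easy to check exhaustively is the pair formed by direct image and inverse image of a function between powerset lattices (ordered here by ⊆; in the lecture's setting the ordering is ⊒). A sketch (my own illustration; the sets S, T and the function f are assumptions):

    from itertools import combinations

    S = frozenset(range(4))
    T = frozenset("abc")
    f = {0: "a", 1: "a", 2: "b", 3: "c"}        # a function S -> T

    def subsets(u):
        return [frozenset(c) for r in range(len(u) + 1) for c in combinations(u, r)]

    def L(X):                                    # direct image
        return frozenset(f[x] for x in X)

    def R(Y):                                    # inverse image
        return frozenset(x for x in S if f[x] in Y)

    # Galois connection: L(X) ⊆ Y  iff  X ⊆ R(Y), for all X ⊆ S and Y ⊆ T
    assert all((L(X) <= Y) == (X <= R(Y)) for X in subsets(S) for Y in subsets(T))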

Examples

(1) (P ∧ X) ⊒ Y iff X ⊒ (P ⇒ Y)
(2) (weakest postspecification) (P; X) ⊒ Y iff X ⊒ ∀m • (P(m, v) ⇒ Y(m, v′))
(3) (weakest prespecification) (X; Q) ⊒ Y iff X ⊒ ∀m • (Q(v′, m) ⇒ Y(v, m))

Galois Connection and Inverse

Theorem If L and R are monotonic, then they form a Galois connection iff (L ◦ R) ⊒T idT and (R ◦ L) ⊑S idS

Theorem If (Li, Ri) for i = 1, 2 are Galois connections, then L1 = L2 iff R1 = R2

Proof

Assume that L1 = L2.

X ⊒ R1(Y)
≡ L1(X) ⊒ Y
≡ L2(X) ⊒ Y
≡ X ⊒ R2(Y)

Disjunctivity and Conjunctivity

L : S → T is universally disjunctive if for all sets U

L(⊓S U) = ⊓T {L(X) | X ∈ U}

L is disjunctive if the above equation holds for all nonempty sets U.

Examples
(1) λX • (P; X) is fully disjunctive.
(2) λX • (X; Q) is fully disjunctive.
(3) λX • (P ∨ X) is disjunctive, but not fully disjunctive.

Galois Connection vs Disjunctivity

Theorem There exists R such that (L, R) is a Galois connection iff L is universally disjunctive.

Proof (⇒)

L(⊓S U) ⊒T Y
≡ (⊓S U) ⊒S R(Y)
≡ ∀X ∈ U • (X ⊒S R(Y))
≡ ∀X ∈ U • (L(X) ⊒T Y)
≡ ⊓T {L(X) | X ∈ U} ⊒T Y

Proof (cont'd)

Let R(Q) =df ⊓S {X | L(X) ⊒T Q}

(⇐)
L(P) ⊒T Q
≡ P ∈ {X | L(X) ⊒T Q}
⇒ P ⊒S R(Q)
⇒ L(P) ⊒T L(R(Q))
≡ L(P) ⊒T ⊓T {L(X) | L(X) ⊒T Q}
⇒ L(P) ⊒T Q

Simulation and Galois Connection

Let D and U be designs satisfying

(D; U) ⊒ skip and skip ⊒ (U; D)

In this case, D is called a simulation and U a co-simulation.

Theorem If (D, U) is a pair of simulations, then

L(X) =df (D; X; U) and R(Y) =df (U; Y; D)

form a Galois connection over designs.

Galois Connection and Recursion

Theorem Let F and G be monotonic, and let (L, R) be a Galois connection. If R ◦ F = G ◦ R, then

R(µX • F(X)) = µX • G(X)

Unit 9: Algebraic Representation

The Notations of the Language

<program> ::= true
            | skip
            | <variable list> := <exp list>
            | <program> ⊳ bool-exp ⊲ <program>
            | <program> ; <program>
            | <program> ⊓ <program>
            | <recursive identifier>
            | µ <recursive identifier> • <program>

Assignment Normal Form

A total assignment is the first of the normal forms:

x, y, . . . , z := e, f, . . . , g

L1 (x, y, . . . := e, f, . . .) = (x, y, . . . , z := e, f, . . . , z)
L2 (x, . . . , z, y, . . . := e, . . . , g, f, . . .) = (x, . . . , y, z, . . . := e, . . . , f, g, . . .)
L3 (v := g ; v := f(v)) = (v := f(g))
L4 (v := f) ⊳ b ⊲ (v := g) = (v := (f ⊳ b ⊲ g))
L5 (v := f) = (v := g) iff [f = g]

Non-deterministic Normal Form

(v := f) ⊓ . . . ⊓ (v := h)

Let A and B be finite sets of total assignments.

L1 (⊓A) ⊓ (⊓B) = ⊓ (A ∪ B)
L2 (⊓A) ⊳ b ⊲ (⊓B) = ⊓ {v := g ⊳ b ⊲ h | (v := g) ∈ A ∧ (v := h) ∈ B}
L3 (⊓A); (⊓B) = ⊓ {(v := g(h)) | (v := h) ∈ A ∧ (v := g) ∈ B}
L4 (⊓A) ⊒AL R iff ∀(v := e) ∈ A • ((v := e) ⊒AL R)
L5 (v := e) ⊒AL (v := g ⊓ . . . ⊓ v := h) iff [e ∈ {g, . . . , h}]

Non-termination Normal Form

b ∨ (⊓A)

L1 (b ∨ P) ⊓ (c ∨ Q) = (b ∨ c) ∨ (P ⊓ Q)
L2 (b ∨ P) ⊳ d ⊲ (c ∨ Q) = (b ⊳ d ⊲ c) ∨ (P ⊳ d ⊲ Q)
L3 (b ∨ P); (c ∨ Q) = (b ∨ {c(e) | (v := e) ∈ P}) ∨ (P; Q)
L4 (b ∨ P) ⊒AL (c ∨ Q) iff [b ⇒ c] ∧ (P ⊒AL (c ∨ Q))
L5 (v := e) ⊒AL (c ∨ (v := g ⊓ . . . ⊓ v := h)) iff [c ∨ e ∈ {g, . . . , h}]

Infinite Normal Form

S = {Si | i ∈ N}, where each Si+1 is stronger than its predecessor (Si+1 ⊒ Si), for i ∈ N.

L1 (⊔S) ⊓ P = ⊔i (Si ⊓ P)
L2 (⊔S) ⊳ b ⊲ P = ⊔i (Si ⊳ b ⊲ P)
L3 (⊔S); P = ⊔i (Si; P)
L4 P; (⊔S) = ⊔i (P; Si)

Continuity

L5 (⊔S) ⊓ (⊔T) = ⊔i (Si ⊓ Ti)
L6 (⊔S) ⊳ b ⊲ (⊔T) = ⊔i (Si ⊳ b ⊲ Ti)
L7 (⊔S) ; (⊔T) = ⊔i (Si ; Ti)
L8 µX • (⊔i Si(X)) = ⊔i (µX • Si(X))
L9 µX • F(X) = ⊔i (F^i(true)), where F^0(X) =df true and F^(n+1)(X) =df F(F^n(X))

Computability

How do we compute the limit ⊔(bn ∨ Pn)? The execution depends on the property (bn ∨ Pn) = (bn ∨ Pn+k).

Let P = (v := e1 ⊓ . . . ⊓ v := en) and Q = (v := f1 ⊓ . . . ⊓ v := fm).

(b ∨ P) ≤ (c ∨ Q) =df [c ⇒ b] and [¬b ⇒ ({e1, . . . , en} = {f1, . . . , fm})]

Strong Monotonicity

Theorem If X ≤ Y then F(X) ≤ F(Y), for all programming operators F.

Theorem {F^n(true) | n ∈ N} is a ≤-descending chain.

L1 ⊔S ⊑AL ⊔T iff Sn ⊑AL (⊔T) for all n
L2 Let P = (v := e1 ⊓ . . . ⊓ v := em) and Qn = (v := f^n_1 ⊓ . . . ⊓ v := f^n_k).
   (b ∨ P) ⊑AL ⊔(cn ∨ Qn) iff [(∧k ck) ⇒ b] and [cn ∨ b ∨ ({f^n_1, . . . , f^n_k} ⊆ {e1, . . . , em})]

Completeness

A reduction to normal form gives a method of testing the truth of any proposed implication between any pair of programs: reduce both of them to normal form, and test whether the inequation satisfies the conditions of the previous two laws. The converse conclusion can also be drawn.

Theorem [P ⇒ Q] iff (P ⊒AL Q)

Denotational Semantics

If j is a list of constants, (state := j); P(state, state′) describes all possible observations of the final state of an execution of P that starts in the initial state j. The implication

[(state := k) ⇒ (state := j); P]

means that the state k is among the possible final states, i.e. P(j, k).

[P ⇒ Q] iff [t ⇒ (s; P)] implies [t ⇒ (s; Q)] for all constant assignments s and t.

Defining Equations

R(true) = {(j, k) | j, k ∈ STATE}
R(v := e(v)) = {((ok, v), (ok′, v′)) | ok ⇒ (ok′ ∧ v′ = e(v))}
R(P; Q) = {(j, k) | ∃m • (j, m) ∈ R(P) ∧ (m, k) ∈ R(Q)}
R(P ⊳ b ⊲ Q) = {(j, k) | ((state := j); b) ∧ (j, k) ∈ R(P) ∨ ((state := j); ¬b) ∧ (j, k) ∈ R(Q)}
R(P ⊓ Q) = {(j, k) | (j, k) ∈ R(P) ∨ (j, k) ∈ R(Q)}
R(µX • F(X)) = {(j, k) | ∀n ∈ N • (j, k) ∈ R(F^n(true))}

From Algebraic to Denotational Presentation

Define A(P) =df ¬d(v) ⊢ Q(v, v′)

where d(j) = true iff (v := j; P) = true is provable, and Q(j, k) = true iff (v := j; P) ⊑AL (v := k) is provable.

Theorem A satisfies the defining equations of R.

Unit 10: An Operational Model

Operational Semantics

An operational semantics suggests a complete set of individual steps which may be taken in the execution:

m → m′

where m and m′ are of the form (s, P):

(1) s is a text, defining the data state as a constant assignment to all variables.
(2) P is a program text, representing the rest of the program that remains to be executed. When this is II, there is no more program to be executed.

Algebraic Specification of a Step

((s, P) → (t, Q)) =df ((s; P) ⊑AL (t; Q))

(1) (s, v := e) → ((v := s; e), II)
(2) (s, II; R) → (s, R)
(3) If (s, P) → (t, Q), then (s, P; R) → (t, Q; R)
(4) (s, P ⊓ Q) → (s, P) and (s, P ⊓ Q) → (s, Q)
(5) (s, P ⊳ b ⊲ Q) → (s, P) whenever s; b, and (s, P ⊳ b ⊲ Q) → (s, Q) whenever s; ¬b
(6) (s, µX • F(X)) → (s, F(µX • F(X)))
(7) (s, true) → (s, true)
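
These transition rules can be animated directly. A minimal Python sketch (illustrative only; the program representation is mine, recursion is omitted, and the deterministic driver simply follows the first available transition, so nondeterministic choice is resolved by its first branch):

    # Configurations are (state, program); II is the empty program.
    # Programs: ("assign", x, f), ("seq", P, Q), ("cond", b, P, Q), ("choice", P, Q), II.
    II = ("II",)

    def steps(s, P):
        """All configurations reachable from (s, P) in one transition."""
        tag = P[0]
        if tag == "assign":                     # (s, v := e) -> ((v := s;e), II)
            _, x, f = P
            return [({**s, x: f(s)}, II)]
        if tag == "seq":
            _, P1, P2 = P
            if P1 == II:                        # (s, II; R) -> (s, R)
                return [(s, P2)]
            return [(t, ("seq", Q1, P2)) for (t, Q1) in steps(s, P1)]
        if tag == "cond":                       # choose a branch on the current state
            _, b, P1, P2 = P
            return [(s, P1 if b(s) else P2)]
        if tag == "choice":                     # both resolutions are possible
            _, P1, P2 = P
            return [(s, P1), (s, P2)]
        return []                               # II: no more program to execute

    def run(s, P):
        """Follow the first available transition until II is reached."""
        while P != II:
            s, P = steps(s, P)[0]
        return s

    prog = ("seq", ("assign", "x", lambda s: s["x"] + 1),
                   ("cond", lambda s: s["x"] > 1,
                            ("assign", "y", lambda s: 10),
                            ("assign", "y", lambda s: 20)))
    print(run({"x": 1, "y": 0}, prog))   # {'x': 2, 'y': 10}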

Correctness

There are two kinds of error which may occur in the design of an operational semantics.

(1) There may be too few transitions; as a result, the set of steps is incomplete. For example, omitting the rule (s, P ⊓ Q) → (s, Q) eliminates non-determinism from the language.
(2) There may be too many transitions. For example, (s, P) → (s, P) is entirely consistent, since it expresses reflexivity of ⊑. However, such a rule can lead to an infinite execution.

The Reflexive Transitive Closure

Define →* as the reflexive transitive closure of →, and define

→^1 =df →
→^(n+1) =df (→^1 ; →^n)

A machine state (s, P) is divergent if it can lead to an infinite execution:

(s, P)↑ =df ∀n • ∃t, Q • (s, P) →^n (t, Q)

Ordering Relation

Define a relation ⊑STATE over states:

(s, P) ⊑STATE (t, Q) =df (s, P)↑ ∨ ∀u • ((t, Q) →* (u, II)) ⇒ ((s, P) →* (u, II))

Define a relation ⊑OP over program texts by

P ⊑OP Q =df ∀s • (s, P) ⊑STATE (s, Q)

Theorem ⊑OP is a pre-order.

Define P ∼ Q =df (P ⊑OP Q) and (Q ⊑OP P)

From Operational to Algebraic Semantics

Lemma 1 (s, P)↑ iff (s; P) = true is algebraically provable.

Lemma 2 (s, P)↑ or (s, P) →* (t, II) iff (s; P) ⊑AL t is algebraically provable.

Theorem P ∼ Q iff P = Q is algebraically provable.

From Operational to Denotational Semantics

Define O(P) =df ¬d(v) ⊢ Q(v, v′)

where

d(j) =df (v := j, P)↑
Q(j, k) =df (v := j, P) →* (v := k, II)

Theorem O satisfies the defining equations of R.
