
Let's Unify Theories of Programming (He Jifeng, Software Engineering)

Let's Unify Theories of Programming. He Jifeng, Software Engineering Institute, East China Normal University, Shanghai, China.

Syllabus: a simple theory; relational programming; link with the wp-calculus.


  1. Non-determinism. If P and Q are predicates describing the behaviour of programs with the same alphabet, the notation P ⊓ Q stands for a program that executes either P or Q, but with no indication of which one will be chosen. P ⊓ Q =df P ∨ Q, and α(P ⊓ Q) =df αP. Nondeterministic choice may easily be implemented by arbitrary selection of either of the operands, and the selection may be made at any time: before or after the program is compiled, or even after it starts execution.

  2. Laws of ⊓. ⊓ is symmetric, associative and idempotent: (⊓-1) P ⊓ Q = Q ⊓ P; (⊓-2) P ⊓ (Q ⊓ R) = (P ⊓ Q) ⊓ R; (⊓-3) P ⊓ P = P. Theorem (link between ⊒ and ⊓): (P ⊒ Q) iff (P ⊓ Q) = Q.
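As an aside (not part of the original slides), the definition and the link theorem can be animated in a toy model where a program with a fixed alphabet is the set of (initial, final) state pairs its predicate admits; all names below are illustrative.

```python
# Toy relational model (illustrative): a program is a set of (before, after) pairs.
Rel = frozenset

def choice(p, q):
    """P ⊓ Q: executed as either P or Q, so its predicate is P ∨ Q."""
    return p | q

def refines(p, q):
    """P ⊒ Q, i.e. [P ⇒ Q]: every behaviour admitted by P is admitted by Q."""
    return p <= q

P = Rel({(0, 1)})
Q = Rel({(0, 1), (0, 2)})

assert choice(P, Q) == choice(Q, P)            # ⊓-1: symmetry
assert choice(P, P) == P                       # ⊓-3: idempotence
assert refines(P, Q) == (choice(P, Q) == Q)    # link: (P ⊒ Q) iff (P ⊓ Q) = Q
```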

  3. Additional law. ⊓ distributes through itself: (⊓-4) P ⊓ (Q ⊓ R) = (P ⊓ Q) ⊓ (P ⊓ R). Proof: RHS = (P ⊓ P) ⊓ (Q ⊓ R) {by (⊓-1, 2)} = LHS {by (⊓-3)}.

  4. Distributivity. All programming combinators defined so far distribute through ⊓: (⊓-5) P ⊳ b ⊲ (Q ⊓ R) = (P ⊳ b ⊲ Q) ⊓ (P ⊳ b ⊲ R); (⊓-6) (P ⊓ Q); R = (P; R) ⊓ (Q; R); (⊓-7) P; (Q ⊓ R) = (P; Q) ⊓ (P; R); (⊓-8) P ⊓ (Q ⊳ b ⊲ R) = (P ⊓ Q) ⊳ b ⊲ (P ⊓ R). Corollary: all combinators are monotonic in all arguments.

  5. Lower and upper bounds. Let S be a set of relations with the same alphabet. ⊓S denotes the greatest lower bound of the relations in S, defined by (bound-1) (⊓S) ⊒ R iff P ⊒ R for all P ∈ S. The least upper bound of the relations in S is defined by (bound-2) (⊔S) ⊑ R iff P ⊑ R for all P ∈ S.

  6. Top and bottom. The top element of the lattice is ⊤ =df ⊓{} = false. The bottom element is ⊥ =df ⊔{} = true.

  7. Distributivity and disjunctivity. (bound-3) (⊔S) ⊓ Q = ⊔{X ⊓ Q | X ∈ S}; (bound-4) (⊓S) ⊔ Q = ⊓{X ⊔ Q | X ∈ S}. Composition is universally disjunctive: (bound-5) (⊓S); Q = ⊓{X; Q | X ∈ S}; (bound-6) P; (⊓S) = ⊓{P; X | X ∈ S}.

  8. Tarski's fixed-point theorem. If F is a monotonic mapping on a complete lattice (C, ⊒), then the solutions of the equation X = F(X) form a complete lattice, where µF =df ⊓{P | P ⊒ F(P)} is the weakest solution and νF =df ⊔{P | P ⊑ F(P)} is the strongest solution. In particular, µ X • X = true = ⊥ and ν X • X = false = ⊤.
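The two formulas can be checked by brute force on a small instance. The sketch below (illustrative only; the state space, F and all names are assumptions, not the slides') enumerates the complete lattice of relations over a two-point state space, where ⊥ = true is the full relation, ⊤ = false is the empty one, ⊓ is disjunction (union) and P ⊒ Q means [P ⇒ Q] (containment).

```python
from itertools import chain, combinations

# All relations over states {0, 1}: the complete lattice of the slides.
pairs = [(a, b) for a in (0, 1) for b in (0, 1)]
lattice = [frozenset(c) for c in chain.from_iterable(
    combinations(pairs, n) for n in range(len(pairs) + 1))]

def compose(r, s):
    """Relational composition R ; S."""
    return frozenset((a, c) for (a, b) in r for (b2, c) in s if b == b2)

def F(x):
    """A monotonic F: body of a never-terminating recursion X = (0→0); X."""
    return compose(frozenset({(0, 0)}), x)

# µF = ⊓{P | P ⊒ F(P)}  (here: union of all P with P ⊆ F(P))
muF = frozenset().union(*(p for p in lattice if p <= F(p)))
# νF = ⊔{P | P ⊑ F(P)}  (here: intersection of all P with F(P) ⊆ P)
nuF = frozenset(pairs).intersection(*(p for p in lattice if F(p) <= p))

assert F(muF) == muF and F(nuF) == nuF     # both are indeed fixed points
assert nuF == frozenset()                  # strongest fixed point = false = ⊤
```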

  9. Laws of fixed points. (µ-1) P ⊒ µF if P ⊒ F(P); (µ-2) F(µF) = µF; (ν-1) P ⊑ νF if P ⊑ F(P); (ν-4) F(νF) = νF.

  10. Proof of (µ-2).
     ∀Y • (Y ⊒ F(Y)) ⇒ (Y ⊒ µF)
     implies  ∀Y • (Y ⊒ F(Y)) ⇒ (F(Y) ⊒ F(µF))
     implies  ∀Y • (Y ⊒ F(Y)) ⇒ (Y ⊒ F(µF))
     implies  ⊓{Y | Y ⊒ F(Y)} ⊒ F(µF)
     equiv    µF ⊒ F(µF)
     implies  (µF ⊒ F(µF)) ∧ (F(µF) ⊒ F²(µF))
     implies  (µF ⊒ F(µF)) ∧ (F(µF) ⊒ µF)
     equiv    µF = F(µF)

  11. Unit 3: Link with wp-Calculus.

  12. Precondition and postcondition. A condition p has been defined as a predicate not containing dashed variables. It describes the values of the global variables of a program Q before its execution starts; such a condition is therefore called a precondition of the program. If r is a condition, let r′ be the result of placing a dash on all its variables. As a result, r′ describes the values of the variables of a program Q when it terminates; such a condition is therefore called a postcondition of the program.

  13. Hoare triple. The overall specification of a program can often be formalised as a simple implication p ⇒ r′. Define p{Q}r =df Q ⊒ (p ⇒ r′).
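In the same toy relational model (a sketch, not the slides' notation; the example program and names are assumptions), the triple p{Q}r is a simple universally quantified check over the pairs admitted by Q:

```python
# p{Q}r = Q ⊒ (p ⇒ r′): whenever Q runs from a state satisfying p,
# the final state satisfies r.
def hoare(p, Q, r):
    return all(r(v2) for (v, v2) in Q if p(v))

# x := (x + 1) mod 3 over states 0..2, as a relation
Q = frozenset((v, (v + 1) % 3) for v in range(3))
assert hoare(lambda v: v == 0, Q, lambda v2: v2 == 1)       # {x=0} x:=x+1 {x=1}
assert not hoare(lambda v: True, Q, lambda v2: v2 == 1)      # too strong a postcondition
```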

  14. A model for Hoare logic.
     (1) If p{Q}r and p{Q}s, then p{Q}(r ∧ s).
     (2) If p{Q}r and q{Q}r, then (p ∨ q){Q}r.
     (3) If p{Q}r, then (p ∧ q){Q}(r ∨ s).
     (4) r(e){x := e}r(x).
     (5) If (p ∧ b){Q1}r and (p ∧ ¬b){Q2}r, then p{Q1 ⊳ b ⊲ Q2}r.
     (6) If p{Q1}s and s{Q2}r, then p{Q1; Q2}r.
     (7) If p{Q1}r and p{Q2}r, then p{Q1 ⊓ Q2}r.
     (8) If (b ∧ c){Q}c, then c{νX • (Q; X) ⊳ b ⊲ skip}(¬b ∧ c).

  15. Proof of (8). Let Y =df (c ⇒ ¬b′ ∧ c′). By (ν-1) it is sufficient to prove (Q; Y) ⊳ b ⊲ skip ⇒ Y. Assume the antecedents; then
     ((Q; Y) ⊳ b ⊲ skip) ∧ c
     = (b ∧ c ∧ Q); Y ∨ (¬b ∧ c ∧ skip)
     ⇒ ¬b′ ∧ c′        {by (6)}

  16. Condition vs annotation. Conditions can play a vital role in explaining the meaning of a program if they are included at appropriate points as an integral part of the program documentation. Such a condition is asserted or expected to be true at the point at which it is written. It is known as a Floyd assertion; formally it is defined to have no effect if it is true, but to cause failure if it is false.

  17. Assertion and assumption. c⊥ =df skip ⊳ c ⊲ ⊥ (assertion); c⊤ =df skip ⊳ c ⊲ ⊤ (assumption). Theorem: (1) b⊥; c⊥ = (b ∧ c)⊥; (2) b⊥ ⊓ c⊥ = (b ∧ c)⊥; (3) c⊥ ⊳ b ⊲ d⊥ = (c ⊳ b ⊲ d)⊥; (4) b⊥; b⊤ = b⊥.
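A sketch of the assertion c⊥ in the toy relational model (⊥ modelled as true, the full relation; state space and names are assumptions), with law (1) checked by brute force over three states:

```python
S = range(3)
full = frozenset((v, v2) for v in S for v2 in S)            # ⊥ = true

def assertion(c):
    """c⊥ = skip ⊳ c ⊲ ⊥: skip where c holds, chaos where it does not."""
    return frozenset((v, v2) for (v, v2) in full if v == v2 or not c(v))

def seq(p, q):
    """Sequential composition P ; Q as relational composition."""
    return frozenset((a, c) for (a, b) in p for (b2, c) in q if b == b2)

b = lambda v: v > 0
c = lambda v: v < 2
# Law (1): b⊥ ; c⊥ = (b ∧ c)⊥
assert seq(assertion(b), assertion(c)) == assertion(lambda v: b(v) and c(v))
```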

  18. Correctness. The definition of correctness can be rewritten in a number of different ways:
     [Q ⇒ (p ⇒ r′)]
     = [p ⇒ (Q ⇒ r′)]
     = [p ⇒ (∀v′ • Q ⇒ r′)]
     = [p ⇒ ¬∃v′ • (Q ∧ ¬r′)]
     = [p ⇒ ¬(Q; ¬r)]
     This last reformulation gives an answer to the following question: what is the weakest precondition under which execution of Q is guaranteed to achieve the condition r′?

  19. Weakest precondition. Q wp r =df ¬(Q; ¬r). (wp-1) (x := e) wp r(x) = r(e); (wp-2) (P; Q) wp r = P wp (Q wp r); (wp-3) (P ⊳ b ⊲ Q) wp r = (P wp r) ⊳ b ⊲ (Q wp r); (wp-4) (P ⊓ Q) wp r = (P wp r) ∧ (Q wp r).
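Over a finite state space the definition Q wp r = ¬(Q; ¬r) can be computed directly. A minimal sketch (illustrative names and example, not from the slides) that also checks (wp-1) for one assignment:

```python
# Q wp r: the set of initial states from which Q cannot reach a state violating r.
def wp(Q, r, states):
    return frozenset(v for v in states
                     if not any(v == a and not r(b) for (a, b) in Q))

states = range(3)
Q = frozenset((v, (v + 1) % 3) for v in states)             # x := (x + 1) mod 3
# (wp-1): (x := e) wp r(x) = r(e); here r(e) holds exactly when x = 0.
assert wp(Q, lambda v: v == 1, states) == frozenset({0})
```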

  20. Healthiness conditions. Theorem: (1) if [r ⇒ s] then [Q wp r ⇒ Q wp s]; (2) if [Q ⇒ S], then [S wp r ⇒ Q wp r]; (3) Q wp (r ∧ s) = (Q wp r) ∧ (Q wp s); (4) Q wp false = false, provided that Q; true = true.

  21. Right inverse of composition. Define S/Q =df Q wp S. Theorem: (P; Q) ⊒ S iff P ⊒ S/Q. Theorem: (1) S/(Q1; Q2) = (S/Q2)/Q1; (2) S/(Q1 ⊳ b ⊲ Q2) = (S/Q1) ⊳ b ⊲ (S/Q2); (3) S/(Q1 ⊓ Q2) = (S/Q1) ⊔ (S/Q2); (4) (S1 ∧ S2)/Q = (S1/Q) ⊔ (S2/Q).

  22. Left inverse of composition. Let P and S be predicates describing two programs. The notation P\S denotes the weakest postspecification of P with respect to S, and is defined by (P; Q) ⊒ S iff Q ⊒ P\S. Theorem: (P\S)/Q = P\(S/Q).

  23. Unit 4: Design Calculus.

  24. Incomplete observation. Consider the program ⊥; (x, y, z := 1, 2, 3). In any normal implementation it would fail to terminate, and so be equal to ⊥. Unfortunately, our theory gives ⊥; (x′ = 1 ∧ y′ = 2 ∧ z′ = 3) = (x′ = 1 ∧ y′ = 2 ∧ z′ = 3). This is the same as if the prior non-terminating program had been omitted.

  25. Required laws. What we need are the following two laws: ⊥; P = ⊥ (left zero law) and P; ⊥ = ⊥ (right zero law).

  26. Initiation and termination. We add two logical variables to the alphabet: ok records the observation that the program has been started, and ok′ records the observation that the program has terminated. Remark: ok and ok′ are not global variables held in the store of any program, and it is assumed that they will never be mentioned in any expression or assignment of the program text.

  27. Design. A design D is a relation which can be expressed as (ok ∧ P) ⇒ (ok′ ∧ Q). We write D = (P ⊢ Q), where the predicates P and Q contain no ok or ok′, and (1) P is an assumption which the designer can rely on when D is initiated; (2) Q is the commitment which must be true when D terminates. ⊥ can be rewritten as ⊥ = true = (false ⊢ true).
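A sketch (illustrative, not the slides' notation) of a design as a predicate over explicit ok, ok′ booleans, confirming that false ⊢ true places no constraint at all (it equals true, i.e. ⊥), and showing a terminating increment as a design:

```python
# A design P ⊢ Q as the predicate (ok ∧ P) ⇒ (ok′ ∧ Q).
def design(P, Q):
    return lambda ok, v, ok2, v2: (not (ok and P(v))) or (ok2 and Q(v, v2))

# ⊥ = (false ⊢ true): with an unsatisfiable assumption, every observation is allowed.
bottom = design(lambda v: False, lambda v, v2: True)
assert all(bottom(ok, 0, ok2, 1) for ok in (False, True) for ok2 in (False, True))

# x := x + 1 as a design: assumes nothing, commits to termination and v′ = v + 1.
incr = design(lambda v: True, lambda v, v2: v2 == v + 1)
assert incr(True, 3, True, 4) and not incr(True, 3, False, 4) and not incr(True, 3, True, 5)
```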

  28. Refinement. Theorem: [(P1 ⊢ Q1) ⇒ (P2 ⊢ Q2)] iff [P2 ⇒ P1] and [(P2 ∧ Q1) ⇒ Q2].

  29. Proof.
     LHS
     ≡ [D1[true, false/ok, ok′] ⇒ D2[true, false/ok, ok′]] ∧ [D1[true, true/ok, ok′] ⇒ D2[true, true/ok, ok′]] ∧ [D1[false/ok] ⇒ D2[false/ok]]
     ≡ [¬P1 ⇒ ¬P2] ∧ [(P1 ⇒ Q1) ⇒ (P2 ⇒ Q2)]
     ≡ [P2 ⇒ P1] ∧ [((P1 ⇒ Q1) ∧ P2) ⇒ Q2]
     ≡ RHS

  30. Equivalence. (D1 ≡ D2) =df (D1 ⊒ D2) and (D2 ⊒ D1). Theorem: (1) (P ⊢ Q) ≡ (P ⊢ (P ∧ Q)); (2) (P ⊢ Q) ≡ (P ⊢ (P ⇒ Q)); (3) (P ⊢ Q) ≡ (P ⊢ R) iff [(P ∧ Q) ⇒ R] and [R ⇒ (P ⇒ Q)]; (4) (false ⊢ false) ≡ (false ⊢ true).

  31. Non-determinism. Theorem: (P1 ⊢ Q1) ⊓ (P2 ⊢ Q2) = (P1 ∧ P2) ⊢ (Q1 ∨ Q2). Proof:
     LHS
     = (P1 ⊢ Q1) ∨ (P2 ⊢ Q2)
     = ¬ok ∨ ¬P1 ∨ (ok′ ∧ Q1) ∨ ¬ok ∨ ¬P2 ∨ (ok′ ∧ Q2)
     = ¬ok ∨ ¬(P1 ∧ P2) ∨ (ok′ ∧ (Q1 ∨ Q2))
     = RHS

  32. Conditional and composition. Theorem: (P1 ⊢ Q1) ⊳ b ⊲ (P2 ⊢ Q2) = (P1 ⊳ b ⊲ P2) ⊢ (Q1 ⊳ b ⊲ Q2). Theorem: (P1 ⊢ Q1); (P2 ⊢ Q2) = (¬(¬P1; true) ∧ ¬(Q1; ¬P2)) ⊢ (Q1; Q2).

  33. Proof.
     LHS
     = D1[false/ok′]; D2[false/ok] ∨ D1[true/ok′]; D2[true/ok]
     = (¬ok ∨ ¬P1); true ∨ (¬ok ∨ ¬P1 ∨ Q1); D2[true/ok]
     = ¬ok ∨ (¬P1); true ∨ Q1; (¬P2 ∨ (ok′ ∧ Q2))
     = RHS

  34. Left zero law. ⊥; (P ⊢ Q) = ⊥. Proof:
     ⊥; (P ⊢ Q)
     = (false ⊢ false); (P ⊢ Q)
     = (¬(true; true) ∧ ¬(false; ¬P)) ⊢ (false; Q)
     = (false ⊢ false)
     = ⊥

  35. Left unit law. skip; D = D, where skip =df true ⊢ (x′ = x ∧ . . . ∧ z′ = z). Proof:
     skip; (P ⊢ Q)
     = (¬((¬true); true) ∧ ¬((x′ = x ∧ . . . ∧ z′ = z); ¬P)) ⊢ ((x′ = x ∧ . . . ∧ z′ = z); Q)
     = P ⊢ Q

  36. Complete lattice of designs. Theorem: (1) ⊓i (Pi ⊢ Qi) = (∧i Pi) ⊢ (∨i Qi); (2) ⊔i (Pi ⊢ Qi) = (∨i Pi) ⊢ ∧i (Pi ⇒ Qi). The top design is defined by ⊤D =df (true ⊢ false) = ¬ok. It is a left zero of sequential composition in the design calculus: ⊤D; D = ⊤D.

  37. Assignment revisited. If the evaluation of expression e always yields a value, then (x := e) =df true ⊢ (x′ = e ∧ . . . ∧ z′ = z).

  38. Well-definedness of expressions. Let WD(e) be a predicate which is true in just those circumstances in which e can be successfully evaluated: WD(x) = true; WD(e1 + e2) = WD(e1) ∧ WD(e2); WD(e1/e2) = WD(e1) ∧ WD(e2) ∧ (e2 ≠ 0). Define (x := e) =df WD(e) ⊢ (x′ = e ∧ . . . ∧ z′ = z).
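A sketch of WD as a recursion over a tiny hypothetical expression AST (tuple representation, helper names and the example are assumptions, not from the slides):

```python
def WD(e):
    """Well-definedness predicate of an expression, as a function of the state."""
    kind = e[0]
    if kind in ("var", "const"):
        return lambda env: True
    if kind == "+":
        return lambda env: WD(e[1])(env) and WD(e[2])(env)
    if kind == "/":
        return lambda env: (WD(e[1])(env) and WD(e[2])(env)
                            and eval_expr(e[2], env) != 0)
    raise ValueError(kind)

def eval_expr(e, env):
    return {"var": lambda: env[e[1]], "const": lambda: e[1],
            "+": lambda: eval_expr(e[1], env) + eval_expr(e[2], env),
            "/": lambda: eval_expr(e[1], env) // eval_expr(e[2], env)}[e[0]]()

e = ("/", ("var", "x"), ("var", "y"))
assert WD(e)({"x": 4, "y": 2}) and not WD(e)({"x": 4, "y": 0})
```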

  39. Algebraic laws of assignment. We have to re-establish the following laws: (asgn-1) x := e = (x, y := e, y); (asgn-2) (x, y, z := e, f, g) = (y, x, z := f, e, g); (asgn-3) (x := e; x := f(x)) = (x := f(e)); (asgn-4) x := e; (P ⊳ b(x) ⊲ Q) = (x := e; P) ⊳ b(e) ⊲ (x := e; Q); (asgn-5) P; skip = P = skip; P.

  40. Variable declaration. To introduce a new variable x we use the form of declaration var x, which permits the variable x to be used in the portion of the program that follows it. The complementary operation (called undeclaration) takes the form end x and terminates the region of permitted use of x. The portion of program Q in which a variable x may be used is called its scope; it is bracketed on the left and on the right by the declaration and the undeclaration: var x; Q; end x.

  41. Definition of declaration. Let A be an alphabet which includes x and x′. Then var x =df ∃x • skip_A; end x =df ∃x′ • skip_A; α(var x) =df A \ {x}; α(end x) =df A \ {x′}. Note that the alphabet constraints forbid the redeclaration of a variable within its own scope. For example, var x; var x is disallowed because x′ ∈ OUTα(var x) but x ∉ INα(var x).

  42. Declaration vs existential quantifier. Declaration and undeclaration act exactly like existential quantification over their scope: var x; Q = ∃x • Q and Q; end x = ∃x′ • Q. As a result, the algebraic laws for declaration closely match those for existential quantification.

  43. Algebraic laws. Both declaration and undeclaration are commutative: (var-1) (var x; var y) = (var y; var x); (var-2) (end x; end y) = (end y; end x); (var-3) (var x; end y) = (end y; var x), provided that x and y are distinct. (var-4) If x is not free in b, then var x; (P ⊳ b ⊲ Q) = (var x; P) ⊳ b ⊲ (var x; Q) and end x; (P ⊳ b ⊲ Q) = (end x; P) ⊳ b ⊲ (end x; Q).

  44. Composition of declaration and undeclaration. var x followed by end x has no effect whatsoever: (var-5) var x; end x = skip. The sequential composition of end x with var x has no effect whenever it is followed by an update of x that does not rely on the previous value of x: (var-6) (end x; var x := e) = (x := e), provided that x does not occur in e. Assignment to a variable just before the end of its scope is irrelevant: (var-7) (x := e; end x) = end x.

  45. Example: compilation of assignments. Many computers can execute only the simplest assignments, with just two or three operands, and most of these operands must be selected from among the available machine registers (such as a, b, c). effect(load x) = (a, b := x, a); effect(store x) = (x, a := a, b); effect(add) = (a := a + b); effect(subtract) = (a := a − b); effect(multiply) = (a := a × b).

  46. Machine code for (x := y × z + w):
     a, b := y, a        load y
     a, b := z, a        load z
     a := a × b          multiply
     a, b := w, a        load w
     a := a + b          add
     x, a := a, b        store x
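Replaying this instruction sequence with the effects given on the previous slide makes a quick sanity check (a sketch only; the Python state representation and names are assumptions, not part of the slides):

```python
# effect(load x) = a, b := x, a ; effect(store x) = x, a := a, b ; etc.
def load(s, v):  s["a"], s["b"] = s[v], s["a"]
def store(s, v): s[v], s["a"] = s["a"], s["b"]
def add(s):      s["a"] = s["a"] + s["b"]
def multiply(s): s["a"] = s["a"] * s["b"]

s = {"x": 0, "y": 2, "z": 3, "w": 4, "a": None, "b": None}
for step in (lambda: load(s, "y"), lambda: load(s, "z"), lambda: multiply(s),
             lambda: load(s, "w"), lambda: add(s), lambda: store(s, "x")):
    step()
assert s["x"] == s["y"] * s["z"] + s["w"]     # x := y × z + w achieved
```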

  47. Unit 5: Implementation.

  48. Machine language. • Alphabet: machine state. • Operations: instruction set.

  49. Machine state. 1. A code memory m can be modelled as a function from addresses to machine instructions, m : rom → instruction. Once machine code is stored in m, any update of m is forbidden. 2. A data memory M maps addresses to integers, M : ram → INT. The state of M can be updated by executing a machine instruction.

  50. Remark. In practice, both m and M are part of the storage provided by the machine. It is the responsibility of the designer of a compiler to ensure that the storage is properly divided to accommodate both the target code and the data of a source program.

  51. Machine state (cont'd). 3. A program pointer P : rom always points to the current instruction. By updating the contents of P the computer can choose to execute a specific branch of the machine code. 4. A stack of data registers A, B, C : INT is seen as an extension of the data memory.

  52. State update. Machine instructions are defined as assignments that update the state of the data memory and the stack of registers. ldc(w) has a constant w as its operand, and its execution pushes w onto the stack of registers: ldc(w) =df (A, B, C, P := w, A, B, P + 1). ldl(w) pushes the contents of M[w] onto the stack of registers: ldl(w) =df (A, B, C, P := M[w], A, B, P + 1).

  53. State update (cont'd). stl(w) sends the contents of A back to the data memory location w: stl(w) =df (M[w], P := A, P + 1). swap(R1, R2) swaps the contents of the data registers R1 and R2: swap(R1, R2) =df (R1, R2 := R2, R1).

  54. Evaluation of expressions.
     add =df (A, P := B + A, P + 1)
     sub =df (A, P := B − A, P + 1)
     mul =df (A, P := B × A, P + 1)
     divi =df (A, P := A/B, P + 1)
     or =df (A, P := 0, P + 1) ⊳ A = 0 ∧ B = 0 ⊲ (A, P := 1, P + 1)
     and =df (A, P := 1, P + 1) ⊳ A = 1 ∧ B = 1 ⊲ (A, P := 0, P + 1)

  55. Evaluation of expressions (cont'd).
     not =df (A, P := 0, P + 1) ⊳ A = 1 ⊲ (A, P := 1, P + 1)
     eqc(w) =df (A, P := 1, P + 1) ⊳ A = w ⊲ (A, P := 0, P + 1)
     gt =df (A, P := 1, P + 1) ⊳ B > A ⊲ (A, P := 0, P + 1)

  56. Machine programs. Suppose a machine program is stored in the memory m between locations s and f. Its behaviour is the combined effect of running the sequence of machine instructions: I(s, f, m) =df (P := s; (s ≤ P < f) ∗ m[P]; (P = f)⊥).
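A sketch of I(s, f, m) as an interpreter loop over the instruction semantics of the previous slides (the Python state representation, the addresses and the allocation of x, y, z, w to cells 0..3 are assumptions made for the demo):

```python
# Instructions as closures over a state dict holding A, B, C, P and data memory M.
def ldl(w):                      # A, B, C, P := M[w], A, B, P + 1
    return lambda s: s.update(A=s["M"][w], B=s["A"], C=s["B"], P=s["P"] + 1)

def stl(w):                      # M[w], P := A, P + 1
    def run(s): s["M"][w], s["P"] = s["A"], s["P"] + 1
    return run

def add(s): s["A"], s["P"] = s["B"] + s["A"], s["P"] + 1    # A, P := B + A, P + 1
def mul(s): s["A"], s["P"] = s["B"] * s["A"], s["P"] + 1    # A, P := B × A, P + 1

def I(start, fin, m, s):
    """P := s; (s ≤ P < f) ∗ m[P]; (P = f)⊥ — the final assertion modelled as an error."""
    s["P"] = start
    while start <= s["P"] < fin:
        m[s["P"]](s)
    assert s["P"] == fin, "(P = f)⊥ violated: abnormal exit"
    return s

# target code for x := y × z + w, with x, y, z, w held in data cells 0..3
code = {0: ldl(1), 1: ldl(2), 2: mul, 3: ldl(3), 4: add, 5: stl(0)}
s = I(0, 6, code, {"A": 0, "B": 0, "C": 0, "P": 0, "M": {0: 0, 1: 2, 2: 3, 3: 4}})
assert s["M"][0] == 2 * 3 + 4
```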

  57. Symbol table. At compile time there are a number of items to which storage must be allocated in order to execute the target code. In our case we assume that the target code always resides in m, and the program variables are stored in the data memory M. We use Ψ to denote a symbol table which maps each program variable to an address of the data memory M. Because no location can be allocated to two identifiers, Ψ has to be injective.

  58. Linking program state with machine state. Define
     Ψ̂ =df (var x, .., z; x, .., z := M[Ψx], ..., M[Ψz]; end M, A, B, C)
     Ψ̂⁻¹ =df (var M, A, B, C; M[Ψx], ..., M[Ψz] := x, .., z; end x, .., z)

  59. Valid implementation. A program state can be recovered after it is converted to a machine state and then retrieved back. Theorem: (1) Ψ̂⁻¹; Ψ̂ = skip; (2) Ψ̂; Ψ̂⁻¹ ⊑ skip. A machine program stored in m is regarded as a realisation of Q if (Ψ̂; Q) ⊑ (I(s, f, m); Ψ̂).

  60. Weakest specification of valid implementation. For any source program Q, we define a machine program which acts like Q except that it operates on the data memory M: [Ψ](Q) =df Ψ̂; Q; Ψ̂⁻¹. Theorem: (Ψ̂; Q) ⊑ (I(s, f, m); Ψ̂) iff [Ψ](Q) ⊑ I(s, f, m).

  61. Proof.
     (Ψ̂; Q) ⊑ (I(s, f, m); Ψ̂)
     ⟹ (Ψ̂; Q; Ψ̂⁻¹) ⊑ (I(s, f, m); Ψ̂; Ψ̂⁻¹)        {; is monotonic}
     ⟹ [Ψ](Q) ⊑ I(s, f, m)                            {Ψ̂; Ψ̂⁻¹ ⊑ skip}
     ⟹ ([Ψ](Q); Ψ̂) ⊑ (I(s, f, m); Ψ̂)                {; is monotonic}
     ⟹ (Ψ̂; Q) ⊑ (I(s, f, m); Ψ̂)                      {Ψ̂⁻¹; Ψ̂ = skip}

  62. Implementation of assignment. Define eΨ =df e[M[Ψx]/x, .., M[Ψz]/z]. Theorem: [Ψ](x := e) ⊑ (M[Ψx] := eΨ).
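A sketch of the substitution eΨ (regex-based and purely illustrative; the allocation Ψ, the variable set and the use of eval are assumptions), showing that M[Ψx] := eΨ manipulates only the data memory:

```python
import re

Psi = {"x": 0, "y": 1, "z": 2}            # hypothetical symbol table

def subst(e):
    """eΨ: replace each program variable in e by the memory cell allocated to it."""
    return re.sub(r"\b([xyz])\b", lambda m: f"M[{Psi[m.group(1)]}]", e)

assert subst("y * z + x") == "M[1] * M[2] + M[0]"

M = {0: 4, 1: 2, 2: 3}                    # x, y, z live in cells 0, 1, 2
M[Psi["x"]] = eval(subst("y * z + x"), {"M": M})   # implements x := y * z + x
assert M[0] == 2 * 3 + 4
```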

  63. Proof.
     LHS
     = var x, .., z; x, .., z := M[Ψx], .., M[Ψz]; x := e; end M, A, B, C; Ψ̂⁻¹            {end M; x := e = x := e; end M}
     ⊑ var x, .., z := eΨ, .., M[Ψz]; M[Ψx], .., M[Ψz] := x, .., z; end x, .., z            {merge assignment}
     ⊑ var x, .., z := eΨ, .., M[Ψz]; end x, .., z; M[Ψx], .., M[Ψz] := eΨ, .., M[Ψz]       {merge assignment}
     = M[Ψx] := eΨ                                                                            {var x := e; end x = skip}

  64. Implementation of sequential composition. Theorem: [Ψ](Q; R) = [Ψ](Q); [Ψ](R). Proof:
     LHS
     = Ψ̂; Q; R; Ψ̂⁻¹                          {def of [Ψ]}
     = Ψ̂; Q; Ψ̂⁻¹; Ψ̂; R; Ψ̂⁻¹                {Ψ̂⁻¹; Ψ̂ = skip}
     = RHS                                      {def of [Ψ]}

  65. Implementation of conditional. Theorem: [Ψ](Q ⊳ b ⊲ R) = [Ψ](Q) ⊳ bΨ ⊲ [Ψ](R). Proof:
     LHS
     = Ψ̂; ((Q; Ψ̂⁻¹) ⊳ b ⊲ (R; Ψ̂⁻¹))          {(Q ⊳ b ⊲ R); W = (Q; W) ⊳ b ⊲ (R; W)}
     = var x, .., z; ((x, .., z := M[Ψx], .., M[Ψz]; end M, A, B, C, P; Q; Ψ̂⁻¹) ⊳ bΨ ⊲ (x, .., z := M[Ψx], .., M[Ψz]; end M, A, B, C, P; R; Ψ̂⁻¹))        {end w; (Q ⊳ b ⊲ R) = (end w; Q) ⊳ b ⊲ (end w; R)}
     = RHS                                        {var w; (Q ⊳ b ⊲ R) = (var w; Q) ⊳ b ⊲ (var w; R)}

  66. Implementation of iteration. Theorem: [Ψ](b ∗ Q) ⊑ bΨ ∗ [Ψ](Q). Proof:
     Ψ̂⁻¹; RHS; Ψ̂
     = Ψ̂⁻¹; (([Ψ](Q); RHS; Ψ̂) ⊳ bΨ ⊲ Ψ̂)              {(Q ⊳ b ⊲ R); W = (Q; W) ⊳ b ⊲ (R; W)}
     = (Ψ̂⁻¹; [Ψ](Q); RHS; Ψ̂) ⊳ b ⊲ (Ψ̂⁻¹; Ψ̂)          {x := e; (Q ⊳ b ⊲ R) = (x := e; Q) ⊳ b[e/x] ⊲ (x := e; R)}
     = (Q; (Ψ̂⁻¹; RHS; Ψ̂)) ⊳ b ⊲ skip                    {Ψ̂⁻¹; Ψ̂ = skip}
     which implies (Ψ̂⁻¹; RHS; Ψ̂) ⊒ (b ∗ Q), and hence RHS ⊒ (Ψ̂; (b ∗ Q); Ψ̂⁻¹) = LHS.

  67. Empty machine program. The empty machine program is the unit of sequential composition. Theorem: I(s, s, m); I(s, f, m) = I(s, f, m) = I(s, f, m); I(f, f, m). Proof:
     I(s, s, m)
     = P := s; (s ≤ P < s) ∗ m[P]; (P = s)⊥        {def of I}
     = P := s; (P = s)⊥                              {false ∗ Q = skip}
     = P := s                                        {Q ⊳ true ⊲ R = Q}

  68. Entry of machine code. Theorem: If s ≤ j < f, then I(j, f, m) ⊑ P := j; (s ≤ P < f) ∗ m[P]; (P = f)⊥. Proof:
     RHS
     = P := j; (j ≤ P < f) ∗ m[P]; (s ≤ P < f) ∗ m[P]; (P = f)⊥        {(b ∨ c) ∗ Q = b ∗ Q; (b ∨ c) ∗ Q}
     ⊒ I(j, f, m); (P := f); (s ≤ P < f) ∗ m[P]; (P = f)⊥               {(P = f)⊥ = (P = f)⊥; (P := f)}
     = I(j, f, m); P := f; (P = f)⊥                                      {false ∗ Q = skip}
     = LHS                                                               {b⊥; b⊥ = b⊥}

  69. Elimination of machine code. Theorem: If s < f and m[s] = jump(f − s − 1), then I(s, f, m) = I(f, f, m). Proof:
     LHS
     = P := s; m[s]; (s ≤ P < f) ∗ m[P]; (P = f)⊥                {unfolding of b ∗ Q}
     = P := s; P := P + f − s; (s ≤ P < f) ∗ m[P]; (P = f)⊥       {def of jump(w)}
     = P := f; (P = f)⊥                                            {false ∗ Q = skip}
     = P := f; (f ≤ P < f) ∗ m[P]; (P = f)⊥                       {false ∗ Q = skip}
     = RHS

  70. Void initial state. If a machine program contains an instruction jumping backwards to its start address, then it does not matter whether the execution starts at its first instruction or at that jump instruction. Theorem: If s ≤ j < f and m[j] = jump(s − j − 1), then (P := j; (s ≤ P < f) ∗ m[P]) = (P := s; (s ≤ P < f) ∗ m[P]). Proof:
     LHS
     = P := j; m[j]; (s ≤ P < f) ∗ m[P]                   {unfolding iteration}
     = P := j; P := P + s − j; (s ≤ P < f) ∗ m[P]          {def of jump(w)}
     = RHS                                                  {merge assignment}

  71. Concatenation. The concatenation of two machine programs is a refinement of their sequential composition, since it is easy for the former to materialise its normal exit commitment. Theorem: If s ≤ j ≤ f, then I(s, f, m) ⊒ I(s, j, m); I(j, f, m).

  72. Proof.
     RHS
     = P := s; (s ≤ P < j) ∗ m[P]; (P = j)⊥; P := j; (j ≤ P < f) ∗ m[P]; (P = f)⊥        {def of I}
     ⊑ P := s; (s ≤ P < j) ∗ m[P]; (P = j)⊥; P := j; (s ≤ P < f) ∗ m[P]; (P = f)⊥        {entry of machine code}
     ⊑ P := s; (s ≤ P < j) ∗ m[P]; skip; (s ≤ P < f) ∗ m[P]; (P = f)⊥                     {(P = j)⊥ = (P = j)⊥; (P := j)}
     = P := s; (s ≤ P < f) ∗ m[P]; (P = f)⊥                                                {b ∗ Q; (b ∨ c) ∗ Q = (b ∨ c) ∗ Q}
     = LHS                                                                                  {def of I}
