SLIDE 1

Coiterative Synchronous Semantics

Marc Pouzet

Ecole normale supérieure Paris Marc.Pouzet@ens.fr

2019

1 / 107

SLIDE 2

Today

  • The denotational semantics we gave in the previous lesson hides resource aspects. It cannot answer questions like "how much memory and time does it take to compute the n-th element of a stream?"
  • It is good for reasoning about program equivalence, but not for proving properties about boundedness of execution time and memory.
  • Today: we give a more operational interpretation of a stream and a stream function.
  • It highlights the way sequential code can be generated.
  • We first identify synchronous stream functions with length-preserving functions.
  • Then, we consider non-length-preserving ones by introducing an explicit absent value.

Read the associated notes and the technical report [2]. Also read the beautiful technical report [6].

2 / 107

SLIDE 3

The language kernel

d ::= let f = e | let node f x = e | d d
e ::= c | x | (e, e) | f e | run f e | pre_c(e) | e fby e
    | let x = e in e | let rec x = e in e
    | if e then e else e | present e do e else e | reset e every e

  • f e is the application of a combinatorial function.
  • run f e is the application of a node.
  • pre_c(e) is the unit delay initialised with the constant c.
  • e1 -> e2 is a shortcut for if pre_true(false) then e1 else e2.

3 / 107

SLIDE 4

Static Typing

4 / 107

SLIDE 5

Typing rules

The type language is a subset of that of ML [4]. We voluntarily restrict it to be first-order.¹

σ ::= ∀α1, ..., αn.gt | gt
gt ::= t →ᵏ t | t
t ::= t × t | (t, ..., t) c | α
k ::= 0 | 1

  • t1 →ᵏ t2 with k ∈ {0, 1} is the type of a function; k is its sort. 0 means that the function is combinatorial;
  • 1 means that the function is stateful;
  • t1 × t2 is the product type;
  • (t1, ..., tn) c is a parameterized type name with arguments t1, ..., tn.

Historical note: kinds were introduced in Lucid Synchrone [7] in version 2 (2000); they are used in the type system of Scade 6 [3].

¹The general case will be explained in the next lesson.

5 / 107

SLIDE 6

Combinatorial vs Stateful Expressions

A stream function f is combinatorial if the value of f(x) at instant n only depends on its input at instant n, that is, there exists g such that:

∀n ∈ N. (f(x))ₙ = g(xₙ)

Otherwise, f is sequential or stateful. An expression e is combinatorial if it only composes combinatorial functions; it is stateful otherwise. Besides computing types, the type system associates a sort k ∈ {0, 1} to every expression.

6 / 107

SLIDE 7

Typing consists in asserting the following typing judgment:

G, H ⊢ᵏ e : t

where:

G ::= [x : σ; ...; x : σ]
H ::= [x : t; ...; x : t]

The predicate G, H ⊢ᵏ e : t means that the expression e has kind k and type t in a global environment G and local environment H.

7 / 107

SLIDE 8

Initial Environment

. fby . ::= ∀α. α × α →¹ α
pre .   ::= ∀α. α →¹ α
. -> .  ::= ∀α. α × α →¹ α
fst(.)  ::= ∀α1, α2. α1 × α2 →⁰ α1
snd(.)  ::= ∀α1, α2. α1 × α2 →⁰ α2
(+)     ::= int × int →⁰ int

  • . fby ., pre . and . -> . are stateful, hence the type constructor . →¹ .;
  • fst(.) and snd(.) are combinatorial;
  • imported primitives, e.g., +, that are applied point-wise are combinatorial.

8 / 107

SLIDE 9

Generalisation and Instantiation

A type gt can be generalised into a type scheme σ. Conversely, a type scheme can be instantiated into a type.

  • Generalisation:

Gen(H)(gt) = ∀α1, ..., αn.gt where {α1, ..., αn} = FV(gt) − FV(H)

  • Instantiation:

gt[t1/α1, ..., tn/αn] ∈ Inst(∀α1, ..., αn.gt)

9 / 107

SLIDE 10

(var)
x ∈ Dom(H)
─────────────────
G | H ⊢ᵏ x : H(x)

(global)
x ∉ Dom(H)   t ∈ Inst(G(x))
───────────────────────────
G | H ⊢ᵏ x : t

(pair)
G | H ⊢ᵏ e1 : t1   G | H ⊢ᵏ e2 : t2
───────────────────────────────────
G | H ⊢ᵏ (e1, e2) : t1 × t2

(app)
t →ᵏ′ t′ ∈ Inst(G(f))   f ∉ Dom(H)   G | H ⊢ᵏ e : t   k′ ≤ k
────────────────────────────────────────────────────────────
G | H ⊢ᵏ f e : t′

10 / 107

SLIDE 11

Local definitions of streams

(let)
G | H ⊢ᵏ e1 : t1   G | H, x : t1 ⊢ᵏ e2 : t2
───────────────────────────────────────────
G | H ⊢ᵏ let x = e1 in e2 : t2

(letrec)
G | H, x : t1 ⊢ᵏ e1 : t1   G | H, x : t1 ⊢ᵏ e2 : t2
───────────────────────────────────────────────────
G | H ⊢ᵏ let rec x = e1 in e2 : t2

11 / 107

SLIDE 12

Control structures

(if)
G | H ⊢ᵏ e1 : bool   G | H ⊢ᵏ e2 : t   G | H ⊢ᵏ e3 : t
──────────────────────────────────────────────────────
G | H ⊢ᵏ if e1 then e2 else e3 : t

(present)
G | H ⊢ᵏ e1 : bool   G | H ⊢ᵏ e2 : t   G | H ⊢ᵏ e3 : t
──────────────────────────────────────────────────────
G | H ⊢ᵏ present e1 do e2 else e3 : t

(reset)
G | H ⊢ᵏ e1 : t1   G | H ⊢ᵏ e2 : bool
─────────────────────────────────────
G | H ⊢ᵏ reset e1 every e2 : t1

12 / 107

SLIDE 13

Global definitions

The generalisation is a particular case of that of [4] because all functions are defined globally and cannot be nested.

(letfun)
G | [x : t1] ⊢⁰ e : t2
──────────────────────────────────────────────
G ⊢ let f x = e : G + [Gen([])(t1 →⁰ t2)/f]

(letnode)
G | [x : t1] ⊢¹ e : t2
──────────────────────────────────────────────────
G ⊢ let node f x = e : G + [Gen([])(t1 →¹ t2)/f]

(letdef)
G | [] ⊢⁰ e : t
─────────────────────────────────
G ⊢ let f = e : G + [Gen([])(t)/f]

13 / 107

SLIDE 14

A few examples (in Zelus)

E.g., the following functions (written in Zelus) are well typed.²

let node from(x) = let rec f = x fby (f + 1) in f
let incr x = x + 1

On the contrary, the following is rejected.

let from(x) = let rec f = x fby (f + 1) in f

> let rec f = x fby (f + 1) in f
>             ^^^^^^^^^^^^^
> Type error: this is a stateful discrete expression and is expected to be combinatorial.

²The second form asks incr to be a combinatorial function, i.e., to have a type of the form . →⁰ ..

14 / 107

SLIDE 15

Semantics

15 / 107

SLIDE 16

We give a semantics to well-typed expressions and definitions only. To simplify the presentation, we consider the same language but where every expression/sub-expression is annotated with its kind and type. eₜᵏ stands for the expression e whose type is t and kind is k.

16 / 107

SLIDE 17

Stream processes

A stream process producing values in the set T is a pair made of a step function of type S → T × S and an initial state in S:

CoStream(T, S) = CoF(S → T × S, S)

Given a process CoF(f, s), Nth(CoF(f, s))(n) returns the n-th element of the corresponding stream:

Nth(CoF(f, s))(0) = let v, s′ = f s in v
Nth(CoF(f, s))(n) = let v, s′ = f s in Nth(CoF(f, s′))(n − 1)

Two stream processes CoF(f, s) and CoF(f′, s′) are equivalent iff they compute the same streams, that is:

∀n ∈ N. Nth(CoF(f, s))(n) = Nth(CoF(f′, s′))(n)

17 / 107
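The pair CoF(step, init) and the Nth function above can be sketched directly in Python; the tuple encoding and the helper names (cof, nth) are mine, not the lecture's:

```python
# A stream process CoF(f, s): a step function f : S -> T x S plus an initial state s.
def cof(step, init):
    return (step, init)

def nth(proc, n):
    """Nth(CoF(f, s))(n): advance the state n times, then return the produced value."""
    step, state = proc
    for _ in range(n):
        _, state = step(state)   # discard the value, keep the new state
    v, _ = step(state)
    return v

# The stream of naturals as a process: output the current state, increment it.
nat = cof(lambda s: (s, s + 1), 0)
```

For instance, nth(nat, 5) returns 5: computing the n-th element costs n steps and one integer of state, which is exactly the resource question the denotational semantics could not answer.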

SLIDE 18

Synchronous Stream Processes

A stream function should be a value from:

CoStream(T, S) → CoStream(T′, S′)

We consider a particular class of stream functions that we call synchronous stream functions. A synchronous stream function from inputs of type T to outputs of type T′ is a pair made of a step function and an initial state:

SFun(T, T′, S) = CoP(S → T → T′ × S, S)

That is, it only needs the current value of its input in order to compute the current value of its output. Remark that s : CoStream(T, S) can be represented by a value of the set SFun(Unit, T, S), with Unit the set with a single element ().

18 / 107

SLIDE 19

Equivalence between streams

Two stream processes CoF(f, s0) and CoF(f′, s′0) are equal iff

∀n ∈ N. (oₙ, sₙ₊₁) = f sₙ ⇔ (oₙ, s′ₙ₊₁) = f′ s′ₙ

Two streams with different concrete implementations, with different states, can be equal. This equivalence is a particular case of bisimulation equivalence. Consider a binary relation R between states: R(sf, sg) means that sf and sg are related by R. Two streams CoF(f, sfi) and CoF(g, sgi) are bisimilar by R (or R is a bisimulation) if the following holds:

  • 1. Initial states are in relation, that is, R(sfi, sgi).
  • 2. For all sf, sg, there exist v, sf′, sg′ such that:

R(sf, sg) ⇒ ((v, sf′) = f sf) ∧ ((v, sg′) = g sg) ∧ R(sf′, sg′)

19 / 107
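Extensional equality can at least be tested on a finite prefix; a bounded check in Python (a sketch with helper names of mine, not a proof of bisimilarity):

```python
def nth(proc, n):
    # n-th element of a process (step, state), following the slide's Nth
    step, state = proc
    for _ in range(n):
        _, state = step(state)
    return step(state)[0]

def agree_upto(p, q, n):
    """True iff the two processes produce the same first n elements."""
    return all(nth(p, k) == nth(q, k) for k in range(n))

# Two representations of the constant stream 3 with different states:
p = (lambda s: (1 + 2, s), ((), ()))
q = (lambda s: (3, s), ())
```

agree_upto(p, q, 10) holds; a genuine proof of equivalence exhibits a bisimulation R instead of testing finitely many instants.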

SLIDE 20

Equivalence (Stream Functions)

The relation is lifted to synchronous stream functions. Two stream functions CoP(f, sfi) and CoP(g, sgi) are bisimilar if there exists R such that the following holds:

  • 1. Initial states are in relation: R(sfi, sgi).
  • 2. For all sf, sg, x, there exist v, sf′, sg′ such that:

R(sf, sg) ⇒ ((v, sf′) = f sf x) ∧ ((v, sg′) = g sg x) ∧ R(sf′, sg′)

20 / 107

SLIDE 21

Application

Bisimilarity is useful to prove inductively the correctness of source-to-source transformations. E.g., (+)(1, 2) and 3, which do not have the same representation, are equivalent. The former can be implemented as CoF(λ(s1, s2).((+)(1, 2), (s1, s2)), ((), ())) or CoF(λs.((+)(1, 2), s), ()) or even CoF(λs.(3, s), ()). Yet, they compute the same sequences. Exercise: once you have seen the semantics of the language kernel, prove that 0 fby (x + y) and (0 fby x) + (0 fby y) are equivalent.

21 / 107

SLIDE 22

Combinatorial functions (again)

In the kernel language, it is only possible to define a stateful stream function:

let node f x = e

The kernel can be extended to declare a function that must be combinatorial:

let f x = e

In that case, e must be typable with kind 0. The language has two forms of application: f e and run f e. It would be possible to have only one, e.g., to write run f e or simply f e; the type system can infer the sort k′ of the application.

22 / 107

SLIDE 23

It would be possible to consider that all stream functions are implemented as stateful functions. When imported, a function f : T1 → T2 becomes a value:

CoP(λ(s, v).(f(v), s), ()) : SFun(T1, T2, Unit)

A function f is combinatorial if it is equivalent to a function of type SFun(T1, T2, Unit). To highlight the fact that a combinatorial function can be implemented more simply than a stateful function, we prefer to keep the two forms of application.³

³In Scade 6 and Zelus, there is a single form of application.

23 / 107

SLIDE 24

Fixpoint

Consider a synchronous stream function f : S → T → T × S. Write fix(f) : S → T × S for the smallest fix-point of f. That is, given an initial state s : S, we want fix(f) to be a solution of the following equation:

X(s) = let v, s′ = X(s) in f s v

i.e.:

fix(f)(s) = v, s′ such that v, s′ = f s v

This fix-point can be implemented with recursion on values, for example in Haskell:

fix(f) = λs.let rec v, s′ = f s v in v, s′

The value v is defined recursively.

24 / 107

SLIDE 25

Justification of its existence

In order to apply the Kleene theorem that states the existence of a smallest fix-point, all functions must be total. Given a set of values V, V⊥ = V + {⊥} extends V with an element ⊥ which represents an undefined value. ≤ is the flat order with ⊥ as the minimum element (that is, ∀x ∈ V, ⊥ < x).⁴ Suppose that f : V1 → V2 is an external function, e.g., +, −, not, that is total. We implicitly lift it into f : V1⊥ → V2⊥ so that f(⊥) = ⊥.

For the set of streams, the bottom element has the trivial step function λs.(⊥, s) and the bottom initial state ⊥. Call ⊥stream, or simply ⊥, this bottom stream CoF(λs.(⊥, s), ⊥). It corresponds to a stream process that is stuck: given an input state, it never returns a value.

⁴(x < y) stands for (x ≤ y) ∧ (x ≠ y).

25 / 107

SLIDE 26

Bounded Fixpoint:

The bounded iteration fix(f)(n) is defined by:

fix(f)(0)(s) = ⊥, s
fix(f)(n)(s) = let v, s′ = fix(f)(n − 1)(s) in f s v

26 / 107
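The bounded iteration can be written directly in Python, representing ⊥ by None (an encoding choice of mine):

```python
BOT = None  # stands for the undefined value ⊥

def bounded_fix(f, n):
    """fix(f)(n): iterate the feedback function f : S -> T -> (T, S) n times,
    starting from ⊥ and re-reading the same input state s at each round."""
    def run(s):
        v, s2 = BOT, s
        for _ in range(n):
            v, s2 = f(s)(v)
        return v, s2
    return run

# The equation x = if false then x else 0 converges after one round:
f = lambda s: lambda v: ((v if False else 0), s)
```

bounded_fix(f, 1)(()) already yields (0, ()), while an equation such as x = x (f = lambda s: lambda v: (v, s)) stays at ⊥ for every n.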

SLIDE 27

Semantics

Let V1, ..., Vn be sets of values (e.g., integers, booleans). We define the set of instantaneous values V as the smallest set which verifies the fix-point equation:

V = V1 + ... + Vn + (V × V)

An environment ρ is a function from a set of names to a set of values. Values are either elements of V, combinatorial functions (V → V) or stateful functions (SFun(V, V, S)):

V + (V → V) + SFun(V, V, S)

If ρ(f) = CoP(p, s), we write ρ(f)_S = p for its step function and ρ(f)_I = s for its initial state.

27 / 107

SLIDE 28

The semantics of an expression e is:

[[e]]ρ = CoF(f, s)  where f = [[e]]^State_ρ and s = [[e]]^Init_ρ

We use two auxiliary functions. If e is an expression and ρ an environment which associates a value to a variable name:

  • [[e]]^Init_ρ is the initial state of the transition function associated to e;
  • [[e]]^State_ρ is the step function.

28 / 107

SLIDE 29

[[pre_c(e)]]^Init_ρ = (c, [[e]]^Init_ρ)
[[pre_c(e)]]^State_ρ = λ(m, s). let v, s′ = [[e]]^State_ρ(s) in m, (v, s′)

[[f e]]^Init_ρ = [[e]]^Init_ρ
[[f e]]^State_ρ = λs. let v, s′ = [[e]]^State_ρ(s) in f(v), s′

[[x]]^Init_ρ = ()
[[x]]^State_ρ = λs.(ρ(x), s)

[[c]]^Init_ρ = ()
[[c]]^State_ρ = λs.(c, s)

[[(e1, e2)]]^Init_ρ = ([[e1]]^Init_ρ, [[e2]]^Init_ρ)
[[(e1, e2)]]^State_ρ = λ(s1, s2). let v1, s1′ = [[e1]]^State_ρ(s1) in
                                  let v2, s2′ = [[e2]]^State_ρ(s2) in
                                  (v1, v2), (s1′, s2′)

29 / 107
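These equations translate to small process constructors. A Python sketch of constants, point-wise application, and the initialised delay fby (the encoding and helper names are mine):

```python
def const(c):
    # [[c]]: initial state (), step returns c and the unchanged state
    return (lambda s: (c, s), ())

def lift(f, proc):
    # [[f e]] for an imported combinatorial f: run e's step, apply f point-wise
    step_e, init_e = proc
    def step(s):
        v, s2 = step_e(s)
        return f(v), s2
    return (step, init_e)

def fby(v0, proc):
    # [[v0 fby e]]: state (m, s); output the memory m, store e's value for next time
    step_e, init_e = proc
    def step(state):
        m, s = state
        v, s2 = step_e(s)
        return m, (v, s2)
    return (step, (v0, init_e))

def first_n(proc, n):
    step, state = proc
    out = []
    for _ in range(n):
        v, state = step(state)
        out.append(v)
    return out
```

For example, first_n(fby(0, lift(lambda v: v + 1, const(41))), 3) produces [0, 42, 42].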

SLIDE 30

[[run f e]]^Init_ρ = (ρ(f)_I, [[e]]^Init_ρ)
[[run f e]]^State_ρ = λ(m, s). let v, s′ = [[e]]^State_ρ(s) in
                               let r, m′ = ρ(f)_S m v in
                               r, (m′, s′)

[[let node f x = e]]_ρ = ρ + [CoP(p, s)/f]
  such that s = [[e]]^Init_ρ and p = λ(s, v).[[e]]^State_{ρ+[v/x]}(s)
30 / 107

SLIDE 31

Fixpoint

[[let rec x = e in e′]]^Init_ρ = ([[e]]^Init_ρ, [[e′]]^Init_ρ)
[[let rec x = e in e′]]^State_ρ =
  λ(s, s′). let v, ns = fix(λ(s, v).[[e]]^State_{ρ+[v/x]}(s))(s) in
            let v′, ns′ = [[e′]]^State_{ρ+[v/x]}(s′) in
            v′, (ns, ns′)

Using recursion on values, it corresponds to:

[[let rec x = e in e′]]^State_ρ =
  λ(s, s′). let rec v, ns = [[e]]^State_{ρ+[v/x]}(s) in
            let v′, ns′ = [[e′]]^State_{ρ+[v/x]}(s′) in
            v′, (ns, ns′)

Note that v is recursively defined.

31 / 107

SLIDE 32

Control structure

Note the difference between the two: the conditional "if/then/else" always executes its three arguments whereas "present" does not:

[[if e then e1 else e2]]^Init_ρ = ([[e]]^Init_ρ, [[e1]]^Init_ρ, [[e2]]^Init_ρ)
[[if e then e1 else e2]]^State_ρ =
  λ(s, s1, s2). let v, s′ = [[e]]^State_ρ(s) in
                let v1, s1′ = [[e1]]^State_ρ(s1) in
                let v2, s2′ = [[e2]]^State_ρ(s2) in
                (if v then v1 else v2), (s′, s1′, s2′)

[[present e do e1 else e2]]^Init_ρ = ([[e]]^Init_ρ, [[e1]]^Init_ρ, [[e2]]^Init_ρ)
[[present e do e1 else e2]]^State_ρ =
  λ(s, s1, s2). let v, s′ = [[e]]^State_ρ(s) in
                if v then let v1, s1′ = [[e1]]^State_ρ(s1) in v1, (s′, s1′, s2)
                     else let v2, s2′ = [[e2]]^State_ρ(s2) in v2, (s′, s1, s2′)

32 / 107

SLIDE 33

Modular Reset

Reset a computation when a boolean condition is true.

[[reset e1 every e2]]^Init_ρ = ([[e1]]^Init_ρ, [[e1]]^Init_ρ, [[e2]]^Init_ρ)
[[reset e1 every e2]]^State_ρ =
  λ(si, s1, s2). let v2, s2′ = [[e2]]^State_ρ(s2) in
                 let v1, s1′ = [[e1]]^State_ρ(if v2 then si else s1) in
                 v1, (si, s1′, s2′)

This definition duplicates the initial state. An alternative is:

[[reset e1 every e2]]^Init_ρ = ([[e1]]^Init_ρ, [[e2]]^Init_ρ)
[[reset e1 every e2]]^State_ρ =
  λ(s1, s2). let v2, s2′ = [[e2]]^State_ρ(s2) in
             let s1 = if v2 then [[e1]]^Init_ρ else s1 in
             let v1, s1′ = [[e1]]^State_ρ(s1) in
             v1, (s1′, s2′)

33 / 107
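The second, non-duplicating definition can be sketched in Python: a maker function plays the role of [[e1]]^Init so a fresh initial state is available at every reset (the encoding and names are mine):

```python
def reset(make_e1, e2):
    """reset e1 every e2: restart e1 from its initial state whenever e2 outputs True.
    make_e1() returns a fresh (step, init) pair; e2 is a (step, init) process."""
    step1, init1 = make_e1()
    step2, init2 = e2
    def step(state):
        s1, s2 = state
        v2, s2 = step2(s2)
        if v2:
            s1 = make_e1()[1]  # re-install e1's initial state
        v1, s1 = step1(s1)
        return v1, (s1, s2)
    return (step, (init1, init2))

def first_n(proc, n):
    step, state = proc
    out = []
    for _ in range(n):
        v, state = step(state)
        out.append(v)
    return out

counter = lambda: (lambda s: (s, s + 1), 0)   # 0, 1, 2, ...
tick2 = (lambda t: (t == 2, t + 1), 0)        # True exactly at the third instant
```

first_n(reset(counter, tick2), 4) gives [0, 1, 0, 1]: the counter is restarted at instant 2.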

SLIDE 34

Fix-point for mutually recursive streams

Consider:

let node sincos(x) = (sin, cos) where rec sin = int(0.0, cos) and cos = int(1.0, -. sin)

The fix-point construction used in the kernel language is able to deal with mutually recursive definitions, encoding them as:

sincos = (int(0.0, snd sincos), int(1.0, -. fst sincos))

34 / 107

SLIDE 35

Encoding mutually recursive streams

A set of mutually recursive streams:

e ::= let rec x = e and ... and x = e in e

is interpreted as a single recursive definition such that:

let rec x1 = e1 and ... and xn = en in e

means:

let rec x = (e1, (e2, (..., en)))[e′1/x1, ..., e′n/xn] in e[e′1/x1, ..., e′n/xn]

with:

e′1 = fst(x)
e′2 = fst(snd(x))
...
e′n = snd^{n−1}(x)

That is, if the n variables x1, ..., xn are streams of type CoStream(Ti, Si) with i ∈ [1..n], fix(.) is applied to a function of type S → T1 × ... × Tn → (T1 × ... × Tn) × S with S = S1 × (... × Sn). All streams progress synchronously.

35 / 107

SLIDE 36

Where are the bottom values?

From the semantics we have given, some equations have the constant bottom stream as minimal fix-point. E.g.:

let node f(x) = o where rec o = o

Indeed:

fix(λ(s, v).[[o]]^State_{ρ+[v/o]}(s)) = fix(λ(s, v).(v, s)) = λs.(⊥, s)

Another example is:

let node f(z) = (x, y) where rec x = y and y = x

Indeed:

fix(λ(s, v).[[(snd(v), fst(v))]]^State_{ρ+[v/x]}(s))
  = fix(λ(s, v).((snd(v), fst(v)), s)) = λs.((⊥, ⊥), s)

We are interested in finding sufficient conditions to ensure that the output is not bottom or, even more, that it does not contain bottom.

36 / 107

SLIDE 37

Def-use chains

In terms of def-use chains of variables, based on the occurrence of variables in expressions, there is a cyclic dependence in both examples:

x depends on y, which depends on x

The following definition does not define a bottom stream (provided that inputs are non-bottom streams):⁵

let node euler_forward(h, x0, xprime) = x
  where rec x = x0 fby (x +. h *. xprime)

⁵We suppose that all imported functions are total.

37 / 107

slide-38
SLIDE 38

Break the dependence cycles with a unit delay

The graphical argument we used (whether the dependence graph between variables is cyclic or not) can be adapted to take the unit delay into account. Say that pre_c(e) does not depend on the variables in e; hence, a variable x defined by an equation x = e only depends on the variables of e which do not appear on the right of a unit delay. The dependence relation between variables in euler_forward is then acyclic.

38 / 107

slide-39
SLIDE 39

Is that enough?

The dependence graph is a rough abstraction of the dependence relation. It does not take into account the actual values of expressions; it over-approximates instantaneous dependences. Some sets of equations whose associated dependence graph is cyclic do define non-bottom streams. E.g.:

let node f(y) = x where rec x = if false then x else 0

The conditional only needs the current value of its first (or second) branch when the condition is true (or false). This program is rejected by Lustre compilers.

39 / 107

slide-40
SLIDE 40

Mutually recursive definitions

The notion of dependence is subtle. All functions below are such that if x is non-bottom, the outputs z and t are non-bottom.

let node good1(x) = (z, t) where rec z = t and t = 0 fby z
let node good2(x) = (z, t) where rec (z, t) = (t, 0 fby z)
let node good3(x) = (fst r, snd r) where rec r = (snd r, 0 fby (fst r))
let node pair(r) = (snd r, 0 fby (fst r))
let node good4(x) = r where rec r = pair(r)

Do we want to accept all of them or not? What is the criterion to accept them? The next lesson will be devoted to the precise definition of what is a dependence and its exploitation to generate sequential code.

40 / 107

slide-41
SLIDE 41

The following is a classical example that is "constructively causal" but is also rejected by Lustre compilers.

let node mux(c, x, y) = present c then x else y
let node constructive(c, x) = y where
  rec x1 = mux(c, x, y2)
  and x2 = mux(c, y1, x)
  and y1 = f(x1)
  and y2 = g(x2)
  and y = mux(c, y2, y1)

If we look at the def-use chains of variables, there is a cycle in the dependence graph:

  • x1 depends on c, x and y2;
  • x2 depends on c, y1 and x;
  • y1 depends on x1; y2 depends on x2;
  • y depends on c, y2 and y1.

By transitivity, y2 depends on y2 and y1 depends on y1.

41 / 107

slide-42
SLIDE 42

Yet, if c and x are non-bottom streams, the fix-point that defines (x1, x2, y1, y2, y) is a non-bottom stream. It can be proved to be equivalent to:

let node constructive(c, x) = y where
  rec y = mux(c, g(f(x)), f(g(x)))

Question: can you prove it? How? In terms of an implementation as a circuit, the cyclic version has a single occurrence of f and g whereas the second has two copies of each. A cyclic combinatorial circuit can be exponentially smaller than its acyclic counterpart.⁶ The causality analysis ensures that an expression does not produce bottom and can be translated into an expression with no fix-point.

⁶See notes for references.

42 / 107

SLIDE 43

The following example (written in Zelus) also defines a node whose output is non-bottom:

let node composition(c1, c2, y) = (x, z, t, r) where
  rec present c1 then do x = y + 1 and z = t + 1 done
                 else do x = 1 and z = 2 done
  and present c2 then do t = x + 1 and r = z + 2 done
                 else do t = 1 and r = 2 done

It can be interpreted as the following program in the language kernel:

let node composition(c1, c2, y) = (x, z, t, r) where
  rec (x, z) = present c1 then (y + 1, t + 1) else (1, 2)
  and (t, r) = present c2 then (x + 1, z + 2) else (1, 2)

43 / 107

SLIDE 44

Is it causal?

Supposing that c1, c2 and y are non-bottom values, taking e.g. true for c1 and c2, and starting with x0 = ⊥, z0 = ⊥, t0 = ⊥ and r0 = ⊥, the fix-point is the limit of the sequence:

xₙ = y + 1 ∧ zₙ = tₙ₋₁ + 1 ∧ tₙ = xₙ₋₁ + 1 ∧ rₙ = zₙ₋₁ + 2

and is obtained after 4 iterations. This program is causal: if inputs are non-bottom values, all outputs are non-bottom values, and this is the case for all computations of it.

44 / 107
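The Kleene iteration for this example can be replayed in Python, with None standing for ⊥ (an encoding of mine; only the equations come from the slide):

```python
BOT = None  # ⊥

def step_once(c1, c2, y, prev):
    """One round of the fixpoint for the equations of `composition`,
    reading the previous approximation prev = (x, z, t, r)."""
    x0, z0, t0, r0 = prev
    x = y + 1 if c1 else 1
    z = (BOT if t0 is BOT else t0 + 1) if c1 else 2
    t = (BOT if x0 is BOT else x0 + 1) if c2 else 1
    r = (BOT if z0 is BOT else z0 + 2) if c2 else 2
    return (x, z, t, r)

approx = (BOT, BOT, BOT, BOT)
history = []
for _ in range(5):
    approx = step_once(True, True, 0, approx)
    history.append(approx)
```

With c1 = c2 = true and y = 0, the approximations are (1, ⊥, ⊥, ⊥), (1, ⊥, 2, ⊥), (1, 3, 2, ⊥), (1, 3, 2, 5), and the fourth round is already the fixpoint: the fifth repeats it.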

SLIDE 45

The impact of static code generation

Nonetheless, if we want to generate statically scheduled sequential code, the control structure must be duplicated: (1) test c1 to compute x; (2) test c2 to compute t; (3) test c1 again to compute z; (4) test c2 again to compute r.

let node composition(c1, c2, y) = (x, z, t, r) where
  rec present c1 then do x = y + 1 done else do x = 1 done
  and present c2 then do t = x + 1 done else do t = 1 done
  and present c1 then do z = t + 1 done else do z = 2 done
  and present c2 then do r = z + 2 done else do r = 2 done

Accepting programs with intertwined dependences has an impact on code size and efficiency. It is possible to over-constrain the causality analysis and force control structures to be atomic (outputs all depend on all inputs).

45 / 107

SLIDE 46

Removing Recursion

Yet, the semantics we have given computes a step function which must be evaluated lazily. Is this really progress w.r.t. the co-inductive semantics? Some recursive equations can be translated into non-recursive definitions. Consider the stream equation:

let rec nat = 0 fby (nat + 1) in nat

Can we get rid of recursion in this definition? Surely we can, since it can be compiled into a finite state machine corresponding to the co-iterative process:

nat = CoF(λs.(s, s + 1), 0)

46 / 107

SLIDE 47

First: let us unfold the semantics

Consider the recursive equation:

rec nat = (0 fby nat) + 1

Let us try to compute the solution of this equation manually by unfolding the definition of the semantics. For x = CoF(f, s), where f is a transition function of type f : S → X × S and s : S is the initial state, we write x.step for f and x.init for s. The bottom stream, to start with, is:

x0 = CoF(λs.(⊥, s), ⊥)

47 / 107

SLIDE 48

The equation that defines nat can be rewritten as let rec nat = f(nat) in nat with let node f x = (0 fby x) + 1. The semantics of f is:

f = CoP(fs, s0) = CoP(λ(s, x).(s + 1, x), 0)

Solving nat = f(nat) amounts to finding a stream X such that:

X(s) = let v, s′ = X(s) in fs s v

48 / 107

SLIDE 49

Let us proceed iteratively by unfolding the definition of the semantics. We have:

x1.step = λs.let v, s′ = x0.step s in fs s v = λs.fs s ⊥ = λs.(s + 1, ⊥)
x1.init = 0
x2.step = λs.let v, s′ = x1.step s in fs s v = λs.let v = s + 1 in (s + 1, v) = λs.(s + 1, s + 1)
x2.init = 0
x3.step = λs.let v, s′ = x2.step s in fs s v = λs.let v = s + 1 in (s + 1, v) = λs.(s + 1, s + 1)
x3.init = 0

We have reached the fix-point CoF(λs.(s + 1, s + 1), 0) in three steps.

49 / 107
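The same unfolding can be checked mechanically; a small Python sketch (BOT stands for ⊥, and fs is the step function of f from the previous slide):

```python
BOT = None  # ⊥

def fs(s, v):
    # step of let node f x = (0 fby x) + 1, in state/value form:
    # output s + 1, next state is the current input v
    return s + 1, v

x_step = lambda s: (BOT, s)  # x0.step: the bottom stream
steps = []
for _ in range(3):
    # x_{n+1}.step = λs. let v, _ = x_n.step s in fs s v
    x_step = (lambda prev: lambda s: fs(s, prev(s)[0]))(x_step)
    steps.append(x_step)
```

steps[1] and steps[2] agree: the fixed point λs.(s + 1, s + 1) is reached after three unfoldings.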

SLIDE 50

Syntactically Guarded Stream Equations

We now give a simple, syntactic condition under which the semantics of mutually recursive stream equations does not need any fix-point. Consider a node f : CoStream(T, S) → CoStream(T′, S′) whose semantics is (ft, st) with ft : S′ → T → T′ × S′ and st : S′. The semantics of an equation y = f(y) is:⁷

[[let rec y = f(y) in y]]^Init_ρ = st
[[let rec y = f(y) in y]]^State_ρ = λs.let rec v, s′ = ft s v in v, s′

The recursion on time (a stream recursion) is transformed into a recursion on the instant.

⁷We reason up to bisimulation, that is, independently of the actual representation of the internal state.

50 / 107

SLIDE 51

Two cases can happen:

  • We deal with a 0-order expression (a stream expression or a product of 0-order expressions); then:
    • either the first element of the pair ft s v, that is v, depends on v and we have an unbounded recursion: the program contains a causality loop;
    • or it does not, and the evaluation succeeds.
  • The expression is a higher-order one and its boundedness depends on semantic conditions to be checked in each case.

For example, the following equation:

let rec nat = nat + 1 in nat

is not causal since nat depends instantaneously on itself and its evaluation has an unbounded recursion.

51 / 107

SLIDE 52

When the program does not contain any causality loop, the recursive evaluation of the pair v, s′ can be split into two non-recursive ones. This case appears, for example, when every stream recursion appears on the right of a unit delay pre. A synchronous compiler takes advantage of this in order to produce non-recursive code like the co-iterative nat expression given above. Yet, if we are interested in defining an interpreter only, the co-iterative semantics can be used for that purpose.

52 / 107

SLIDE 53

For example, consider the equation x = f(v fby x). Its semantics is:

[[let rec x = f(v fby x) in x]]^Init_ρ = (v, st)
[[let rec x = f(v fby x) in x]]^State_ρ(m, s) = let rec v′, s′ = ft s m in v′, (v′, s′)

But this time, the recursion is no longer necessary, that is:

[[let rec x = f(v fby x) in x]]^State_ρ(m, s) = let v′, s′ = ft s m in v′, (v′, s′)

53 / 107

SLIDE 54

Putting Mutually Recursive Equations in Normal Form

Consider:

let rec sin = 0.0 fby (sin +. h *. cos)
and cos = 1.0 -> ((0.0 fby cos) +. h *. sin)
in sin, cos

Rewrite it into:

let rec sin = 0.0 fby sin_next
and pre_cos = 0.0 fby cos
and sin_next = sin +. h *. cos
and cos = 1.0 -> (pre_cos +. h *. sin)
in sin, cos

All the unit delays are un-nested; their argument is a variable. Gather the equations on delays at the top; statically schedule the other equations according to read/write variables.

54 / 107

SLIDE 55

The transition function is:

λ(m1, m2, m3). let sin = m1 in
               let pre_cos = m2 in
               let cos = if m3 then 1.0 else pre_cos +. h *. sin in
               let sin_next = sin +. h *. cos in
               (sin, cos), (sin_next, cos, false)

and the initial state is (0.0, 0.0, true). There is no more recursion in the transition function.

55 / 107
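The scheduled transition function runs eagerly, with no recursion; a direct Python transcription (the concrete value of h is an arbitrary choice of mine):

```python
H = 0.1  # integration step h; any concrete value works for the sketch

def step(state):
    # state = (sin, 0.0 fby cos, first-instant flag), as on the slide
    m1, m2, m3 = state
    sin = m1
    pre_cos = m2
    cos = 1.0 if m3 else pre_cos + H * sin
    sin_next = sin + H * cos
    return (sin, cos), (sin_next, cos, False)

init = (0.0, 0.0, True)

# Run two instants:
(o1, s1) = step(init)
(o2, s2) = step(s1)
```

The first output is (0.0, 1.0); the second is (0.1, 1.01) up to floating-point rounding.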

SLIDE 56

The Semantics for Normalised Equations

Consider a set of mutually recursive equations that can be put in the following form:

let rec x1 = v1 fby nx1 and ... and xn = vn fby nxn
and p1 = e1 and ... and pk = ek in e

where ∀i, j. (i < j) ⇒ Var(ei) ∩ Var(pj) = ∅, and where Var(p) and Var(e) are the sets of variable names appearing in p and e.

56 / 107

SLIDE 57

Its transition function is:

λ(x1, ..., xn, s1, ..., sk, s).
  let p1, s1′ = [[e1]]^State_ρ(s1) in
  ...
  let pk, sk′ = [[ek]]^State_ρ(sk) in
  let r, s′ = [[e]]^State_ρ(s) in
  r, (nx1, ..., nxn, s1′, ..., sk′, s′)

with initial state (v1, ..., vn, s1, ..., sk, s) where [[ei]]^Init_ρ = si and [[e]]^Init_ρ = s.

When a set of mutually recursive streams can be put in the above form, its transition function does not need a fix-point: it can be statically scheduled into a function that can be evaluated eagerly. This removal of recursion is the basis of the statically scheduled code generation performed by a synchronous language compiler.

Exercise: prove that the new semantics for the let/rec operation is correct, that is, it produces the same stream as the original semantics.

57 / 107

SLIDE 58

Extension of the language kernel: local variables, mutually recursive definitions, hierarchical automata

58 / 107

SLIDE 59

The language kernel we have considered is similar to Lustre.

  • It is first-order like Lustre, but adds type polymorphism, a reset and an elementary control structure (present/then/else) to execute a block conditionally.
  • All functions are length-preserving: there is no when/merge or current operation.

We consider now an extended language that incorporates programming constructs that exist in Scade 6.

59 / 107

SLIDE 60

Mutually Recursive Equations

Equations are extended with local definitions:

E ::= p = e | E and E | local v in E
v ::= x | x init e | x default e
p ::= x | (p, ..., p)

Expressions are extended with a construct to access the last value of a stream:

e ::= ... | last x

60 / 107

SLIDE 61

Environment

The construct local x in E declares x to be local in E. The construct local x init e in E declares x to be local and the last computed value of x to be initialized with the value of e. The construct local x default e in E declares x to be local and the default value of x to be the value of e, at instants where no definition of x is given.

61 / 107

SLIDE 62

Conditionals over Equations

If e is an expression whose type is a sum type t = C1 | ... | Cn:

  • match e with Ci1 → E1 | ... | Cin → En activates the equations Ej such that ij is the first index for which e = Cij, with 1 ≤ i1, ..., in ≤ n.
  • present e do E1 else E2 activates the equations E1 or E2 according to the boolean value of e.

E ::= ... | match e with C → E ... C → E | present e do E1 else E2

62 / 107

SLIDE 63

Hierarchical Automata

An automaton describes a system with several modes and transitions between them. Such an automaton is characterized by:

  • a finite set of states;
  • in every state, a set of equations with variables that are possibly local to the state;
  • a set (possibly empty) of "weak transitions" (keyword until) which define the active state for the next reaction;
  • a set (possibly empty) of "strong transitions" (keyword unless) which define the active set of equations for the current reaction;
  • transitions can be by "reset" or by "history".

Remark: contrary to Scade 6 and Lucid Synchrone, weak and strong transitions cannot be mixed inside an automaton.

63 / 107

SLIDE 64

The syntax is extended in the following way.

E ::= ... | automaton S → u wt | ... | S → u wt
          | automaton S → u st | ... | S → u st
u ::= local v in u | do E
st ::= unless t | ε
wt ::= until t | ε
t ::= e then S et | e continue S et
et ::= else t et | ε

64 / 107

SLIDE 65

Examples in Zelus

type t = Incr | Decr | Idle

let f(c) =
  local o init 0 do
    match c with
    | Idle -> (* o keeps its previous value, i.e., o = last o *) do done
    | Incr -> do o = last o + 1 done
    | Decr -> do o = last o - 1 done
  in o

65 / 107

SLIDE 66

Examples in Zelus

let node controller(auto, error, input) = output where
  rec automaton
      | Manual -> do output = input unless auto then Auto
      | Auto -> do output = run pid(p, i, d, error) unless (not auto) then Manual

let node await(a) = go where
  rec automaton
      | Await -> do go = false unless a then Go
      | Go -> do go = true done

let node abro(a, b, r) = go where
  rec reset
        automaton
        | Await -> do go = false unless (run await(a) && run await(b)) then Go
        | Go -> do go = true done
      every r

66 / 107

SLIDE 67

Typing constraints

67 / 107

SLIDE 68
  • G | H ⊢ᵏ E : H′ defines the typing of an equation E in an environment H, with kind k, producing an environment H′.
  • G | H | L ⊢ᵏ hw : H′ defines the typing of the body hw of a state of an automaton with weak preemption. L is the set of states of the automaton; H is the typing environment; k is the kind; hw is the body associated to a state S; H′ is the returned environment.
  • A similar predicate G | H | L ⊢ᵏ hs : H′ applies for the body of an automaton with strong preemption.
  • G | H | L ⊢ᵏ t and G | H | L ⊢ᵏ ts define the typing of a transition.

To make the notation lighter, we omit the global environment in the rules.

68 / 107

slide-69
SLIDE 69

Operations on typing environments

If H1 and H2 are two environments, H1 ∗ H2 joins the two, and H1 ∗ ... ∗ Hn = (...(H1 ∗ H2) ∗ ...) ∗ Hn:

(H1 ∗ H2)(x) = H1(x) if H1(x) = H2(x)
(H1 ∗ H2)(x) = H1(x) if x ∈ Dom(H1) ∧ x ∉ Dom(H2)
(H1 ∗ H2)(x) = H2(x) if x ∈ Dom(H2) ∧ x ∉ Dom(H1)

H1 + H2 is the union of the two and is defined only when their domains do not intersect.

If H is a typing environment such that H = H′ + [x : t], then H\x = H′.
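A minimal executable reading of these two operations, with Python dicts as typing environments (the names `join` and `disjoint_union` are ours; where the slide leaves ∗ undefined on a conflict, we raise an error):

```python
def join(h1, h2):
    # H1 * H2: defined on the union of the domains; on a shared
    # name, the two environments must agree on the type
    out = dict(h1)
    for x, t in h2.items():
        if x in out and out[x] != t:
            raise TypeError(f"conflicting types for {x}")
        out[x] = t
    return out

def disjoint_union(h1, h2):
    # H1 + H2: only defined when the two domains do not intersect
    if set(h1) & set(h2):
        raise TypeError("domains intersect")
    return {**h1, **h2}
```

`join` is the operation used to combine the environments produced by the branches of a match or automaton; `disjoint_union` combines parallel equations.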

69 / 107

slide-70
SLIDE 70

(A-weak)
  ∀i, j ∈ [1..n]   H ⊢k ui : Hi   H, Hi | {S1, ..., Sn} ⊢k wti   (i ≠ j) ⇒ (Si ≠ Sj)
  ──────────────────────────────────────────────────────────────
  H ⊢1 automaton (Si → ui wti)i∈[1..n] : H1 ∗ ... ∗ Hn

(A-strong)
  ∀i, j ∈ [1..n]   H ⊢k ui : Hi   H | {S1, ..., Sn} ⊢k sti   (i ≠ j) ⇒ (Si ≠ Sj)
  ──────────────────────────────────────────────────────────────
  H ⊢1 automaton (Si → ui sti)i∈[1..n] : H1 ∗ ... ∗ Hn

(Eq)
  H ⊢k p : t   H ⊢k e : t   H(xi) = ti   Vars(p) = {x1, ..., xn}
  ──────────────────────────────────────────────────────────────
  H ⊢k p = e : [t1/x1, ..., tn/xn]

70 / 107

slide-71
SLIDE 71

(Local)
  H ⊢k v : H0   H, H0 ⊢k E : H′
  ──────────────────────────────
  H ⊢k local v in E : H′\x

(And)
  H ⊢k E1 : H1   H ⊢k E2 : H2
  ──────────────────────────────
  H ⊢k E1 and E2 : H1 + H2

(Last)
  ──────────────────────────────
  H, x : t ⊢1 last x : t

(Match)
  H ⊢k e : t   t = C1 | ... | Cn   ∀i ∈ [1..n]  H ⊢k Ei : Hi
  ──────────────────────────────
  H ⊢k match e with (Ci → Ei)i∈[1..n] : H1 ∗ ... ∗ Hn

71 / 107

slide-72
SLIDE 72

(Local-u)
  H ⊢k v : H0   H, H0 ⊢k u : H′
  ──────────────────────────────
  H ⊢k local v in u : H′\x

(Do-u)
  H ⊢k E : H′
  ──────────────────────────────
  H ⊢k do E : H′

(Until)
  H | L ⊢k t
  ──────────────────────────────
  H | L ⊢k until t

(Unless)
  H | L ⊢k t
  ──────────────────────────────
  H | L ⊢k unless t

(Epsilon)
  ──────────────────────────────
  H | L ⊢k ε

(Else)
  H | L ⊢k t   H | L ⊢k et
  ──────────────────────────────
  H | L ⊢k else t et

72 / 107

slide-73
SLIDE 73

(ResetTransition)
  H ⊢k e : bool   S ∈ L
  ──────────────────────────────
  H | L ⊢k e then S

(ContinueTransition)
  H ⊢k e : bool   S ∈ L
  ──────────────────────────────
  H | L ⊢k e continue S

(varpat)
  ──────────────────────────────
  H ⊢k x : [t/x]

(init)
  H ⊢k e : t
  ──────────────────────────────
  H ⊢1 x init e : [t/x]

(default)
  H ⊢k e : t
  ──────────────────────────────
  H ⊢k x default e : [t/x]

73 / 107

slide-74
SLIDE 74

Semantics

74 / 107

slide-75
SLIDE 75

Environment

The environment is complemented to possibly associate a default or initial value to a variable:

ρ ::= ρ + [v/x] | ρ + [v/default x] | ρ + [v/last x] | []

If ρ and ρ′ are two environments, we write ρ by ρ′ for the completion of ρ with default or initial values from ρ′. This operation, used to define the value of a variable, is:

ρ by [] = ρ
ρ by (ρ′ + [v/default x]) = (ρ + [v/x]) by ρ′
ρ by (ρ′ + [v/last x]) = (ρ + [v/x]) by ρ′
ρ by (ρ′ + [v/x]) = ρ by ρ′

If p is a pattern and v is a value, [v|p] builds the environment obtained by matching v against p:

[v|x] = [v/x]
[(v1, v2)|(p1, p2)] = [v1|p1] + [v2|p2]

75 / 107

slide-76
SLIDE 76

If E is an equation and ρ an environment, [[E]]^Init_ρ is the initial state and [[E]]^State_ρ is the step function. The semantics of an equation E is:

[[E]]_ρ = [[E]]^Init_ρ, [[E]]^State_ρ

[[p = e]]^Init_ρ = [[e]]^Init_ρ
[[p = e]]^State_ρ = λs. let v, s = [[e]]^State_ρ (s) in [v|p], s

[[E1 and E2]]^Init_ρ = ([[E1]]^Init_ρ, [[E2]]^Init_ρ)
[[E1 and E2]]^State_ρ = λ(s1, s2). let ρ1, s1 = [[E1]]^State_ρ (s1) in
                                   let ρ2, s2 = [[E2]]^State_ρ (s2) in
                                   ρ1 + ρ2, (s1, s2)
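These two equations transcribe almost literally into code. A sketch in Python (the lecture suggests Haskell or OCaml; the helper names `eq` and `and_` are ours, and right-hand sides are restricted to combinatorial, state-free expressions):

```python
# An equation E is interpreted as a pair (init, step) where
# step(rho, s) returns (env, s'): the environment of the variables
# it defines, and the new internal state.

def eq(name, f):
    # p = e, restricted to a variable pattern p = name and a
    # combinatorial expression e computed by f(rho)
    return None, lambda rho, s: ({name: f(rho)}, s)

def and_(e1, e2):
    # E1 and E2: run both step functions, merge the produced
    # (disjoint) environments, pair the two states
    (i1, t1), (i2, t2) = e1, e2
    def step(rho, s):
        s1, s2 = s
        env1, s1 = t1(rho, s1)
        env2, s2 = t2(rho, s2)
        return {**env1, **env2}, (s1, s2)
    return (i1, i2), step

# x = 1 and y = x + 1; rho must already carry the instantaneous
# value of x (computing rho itself is the job of the fixpoint
# in the rule for 'local' on the next slide)
init, step = and_(eq('x', lambda rho: 1),
                  eq('y', lambda rho: rho['x'] + 1))
env, _ = step({'x': 1}, init)
```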

76 / 107

slide-77
SLIDE 77

Notation: if ρ = ρ′ + [v/x], then ρ\x = ρ′.

[[local x in E]]^Init_ρ = [[E]]^Init_ρ
[[local x in E]]^State_ρ (s) = let ρ′, s = fix (λρ′, s. [[E]]^State_{ρ+ρ′} (s)) (s) in ρ′\x, s

[[local x default v in E]]^Init_ρ = [[E]]^Init_ρ
[[local x init v in E]]^Init_ρ = (v, [[E]]^Init_ρ)

[[local x default v in E]]^State_ρ (s) = let ρ′, s = fix (λρ′, s. [[E]]^State_{ρ+ρ′+[v/default x]} (s)) in ρ′\x, s
[[local x init v in E]]^State_ρ (w, s) = let ρ′, s = fix (λρ′, s. [[E]]^State_{ρ+ρ′+[w/last x]} (s)) in ρ′\x, (ρ′(x), s)
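The fix used above is a fixpoint on the instant, not on time. For a causal program it can be computed by chaotic iteration: re-run the equations until the environment stabilises. A minimal Python sketch of that idea (the function name `fix_env` is ours; an equation signals a not-yet-computable variable by raising KeyError):

```python
def fix_env(equations):
    # equations: list of (name, fn); fn reads the environment and
    # raises KeyError when a variable it depends on is still absent
    env = {}
    changed = True
    while changed:
        changed = False
        for name, fn in equations:
            try:
                v = fn(env)
            except KeyError:
                continue  # not computable yet at this round
            if env.get(name) != v:
                env[name] = v
                changed = True
    return env

# x and z depend instantaneously on y; the textual order of the
# equations does not matter
env = fix_env([('z', lambda e: e['x'] * 2),
               ('x', lambda e: e['y'] + 1),
               ('y', lambda e: 1)])
```

For a causal program the iteration reaches the least fixpoint in a bounded number of rounds; an instantaneous cycle simply leaves the cyclic variables undefined here, where the real semantics yields bottom.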

77 / 107

slide-78
SLIDE 78

Semantics for conditionals

The semantics of a conditional must consider the case where one branch defines a value for a variable x but the other branch does not. We take the following convention:

  • If a variable x is declared with a default value v, then a missing equation for x in a branch means that x = v in that branch.
  • Otherwise, x = last x, that is, x keeps its previous value.
  • If x is declared with an initial value for last x, then x has a definition in every branch. Otherwise, there is a potential initialisation issue which has to be checked by other means.

78 / 107

slide-79
SLIDE 79

Semantics for Conditionals

[[present e do E1 else E2]]^Init_ρ = ([[e]]^Init_ρ, [[E1]]^Init_ρ, [[E2]]^Init_ρ)

[[present e do E1 else E2]]^State_ρ (s, s1, s2) =
  let v, s = [[e]]^State_ρ (s) in
  if v then let ρ1, s1 = [[E1]]^State_ρ (s1) in ρ1 by ρ[N\N1], (s, s1, s2)
       else let ρ2, s2 = [[E2]]^State_ρ (s2) in ρ2 by ρ[N\N2], (s, s1, s2)
  where N = N1 ∪ N2 and N1 = Def(E1) and N2 = Def(E2)

[[match e with (Ci → Ei)i∈[1..n]]]^Init_ρ = ([[e]]^Init_ρ, [[E1]]^Init_ρ, ..., [[En]]^Init_ρ)
79 / 107

slide-80
SLIDE 80

The Transition Function:

[[match e with (Ci → Ei)i∈[1..n]]]^State_ρ (s, s1, ..., sn) =
  let v, s = [[e]]^State_ρ (s) in
  match v with (Ci → let ρi, si = [[Ei]]^State_ρ (si) in ρi by ρ[N\Ni], (s, s1, ..., sn))i∈[1..n]
  where N = ∪i∈[1..n] Ni and Ni = Def(Ei)

The Last Computed Value:

[[last x]]^Init_ρ = ()
[[last x]]^State_ρ = λs. ρ(last x), s

80 / 107

slide-81
SLIDE 81

Initial state of the transition function

[[ε]]^Init_ρ = ()
[[e then S et]]^Init_ρ = ([[e]]^Init_ρ, [[et]]^Init_ρ)
[[e continue S et]]^Init_ρ = ([[e]]^Init_ρ, [[et]]^Init_ρ)
[[else t et]]^Init_ρ = ([[t]]^Init_ρ, [[et]]^Init_ρ)
[[unless t]]^Init_ρ = [[t]]^Init_ρ
[[until t]]^Init_ρ = [[t]]^Init_ρ

[[automaton (Si → ui wti)i∈[1..n]]]^Init_ρ =
  let (si = [[ui]]^Init_ρ)i∈[1..n] in
  let (s′i = [[wti]]^Init_ρ)i∈[1..n] in
  (S1, false, (s1, ..., sn), (s′1, ..., s′n))

[[automaton (Si → ui sti)i∈[1..n]]]^Init_ρ =
  let (si = [[ui]]^Init_ρ)i∈[1..n] in
  let (s′i = [[sti]]^Init_ρ)i∈[1..n] in
  (S1, false, (s1, ..., sn), (s′1, ..., s′n))

81 / 107

slide-82
SLIDE 82

Transition functions

Given a transition t, the name ps of a state of the automaton and a value pr for the reset condition, [[t]]^{ps,pr}_ρ (s′) returns a new state name, a new reset condition and a new state.

[[ε]]^{s,r}_ρ (s′) = (s, r), s′

[[e then S et]]^{s,r}_ρ (s1, s2) =
  let s1 = if r then [[e]]^Init_ρ else s1 in
  let v, s1 = [[e]]^State_ρ (s1) in
  let (s, r), s2 = [[et]]^{s,r}_ρ (s2) in
  if v then (S, true), (s1, s2) else (s, r), (s1, s2)

[[e continue S et]]^{s,r}_ρ (s1, s2) =
  let s1 = if r then [[e]]^Init_ρ else s1 in
  let v, s1 = [[e]]^State_ρ (s1) in
  let (s, r), s2 = [[et]]^{s,r}_ρ (s2) in
  if v then (S, false), (s1, s2) else (s, r), (s1, s2)

82 / 107

slide-83
SLIDE 83

Automata

[[automaton (Si → ui wti)i∈[1..n]]]^State_ρ (ps, pr, s, s′) =
  let ρ, (ns, nr, s, s′) = [[(Si → ui wti)i∈[1..n]]]^{ps,pr}_ρ (s, s′) in
  ρ, (ns, nr, s, s′)

[[automaton (Si → ui sti)i∈[1..n]]]^State_ρ (ps, pr, s, s′) =
  let ρ, (ns, nr, s, s′) = [[(Si → ui sti)i∈[1..n]]]^{ps,pr}_ρ (s, s′) in
  ρ, (ns, nr, s, s′)

83 / 107

slide-84
SLIDE 84
  • [[u]]^pr_ρ (s) resets u when pr is true, that is:

[[u]]^pr_ρ (s) = [[u]]^State_ρ (if pr then [[u]]^Init_ρ else s)

[[(Si → ui wti)i∈[1..n]]]^{ps,pr}_ρ ((s1, ..., sn), (s′1, ..., s′n)) =
  match ps with
  (Si → let ρ, si = [[ui]]^pr_ρ (si) in
        let (ns, nr), s′i = [[wti]]^{ps,pr}_ρ (s′i) in
        ρ, (ns, nr, (s1, ..., sn), (s′1, ..., s′n)))i∈[1..n]

[[(Si → ui sti)i∈[1..n]]]^{ps,pr}_ρ ((s1, ..., sn), (s′1, ..., s′n)) =
  let (cs, cr), (s′1, ..., s′n) =
    match ps with
    (Si → let (cs, cr), s′i = [[sti]]^{ps,pr}_ρ (s′i) in (cs, cr), (s′1, ..., s′n))i∈[1..n] in
  match cs with
  (Si → let ρ, si = [[ui]]^cr_ρ (si) in
        ρ, (cs, cr, (s1, ..., sn), (s′1, ..., s′n)))i∈[1..n]

84 / 107

slide-85
SLIDE 85

Interpretation

  • The transition function associated with the automaton construct is executed in an initial state.
  • This state is of the form (ps, pr, s, s′). ps is the current state of the automaton; it is initialised with the initial state of the automaton. pr is the reset status; it is initialised with the value false. s is the state used to execute the bodies of the states; s′ is the state used to execute the transitions.
  • For an automaton with weak transitions, the body is executed, then the transitions.
  • For an automaton with strong transitions, the transitions of the current state are executed first. This determines the active state. Then, the corresponding body is executed.

85 / 107

slide-86
SLIDE 86

Exercises and questions

  • Define the semantics of e1 fby e2.
  • Based on the previous definitions, write an interpreter in Haskell or OCaml.
  • Define a small language to represent the transition functions. Rewrite your interpreter so that it produces a term from this language.

86 / 107

slide-87
SLIDE 87

Non length-preserving stream functions

87 / 107

slide-88
SLIDE 88

Non length-preserving stream functions

The stream functions we have considered are length preserving: to produce one output, their step function needs only one input. This is what allowed us to implement a stream function of type:

CStream(T, S) → CStream(T′, S′)

by a value of type:

(S′ → T → T′ × S′) × S′

Hence, it is not possible to represent non length preserving functions like the function even which removes one element out of two of the input stream. In Haskell, with (:) the cons operation on lists: 8

even (x : (x′ : xs)) = x : (even xs)

The destructor hd, tl of the input has to be applied twice in the transition function of the result.

8 See notes of the previous class. 88 / 107

slide-89
SLIDE 89

This would also be the case of filter-like functions such as when, defined as:

(x : xs) when (true : cs) = x : (xs when cs)
(x : xs) when (false : cs) = xs when cs
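On finite list prefixes, both non length-preserving functions are one-liners in Python (a direct transcription of the Haskell equations; the names `even_s` and `when` are ours):

```python
def even_s(xs):
    # keep one element out of two: even (x : x' : xs) = x : even xs
    return xs[::2]

def when(xs, cs):
    # (x : xs) when (c : cs): keep x iff the condition c is true
    return [x for x, c in zip(xs, cs) if c]
```

Both shorten their input: five inputs may give three or fewer outputs, which is exactly what the one-input/one-output step function cannot express.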

89 / 107

slide-90
SLIDE 90

Complementing Streams with Absent Values

An obvious idea to overcome the problem and turn these functions into synchronous ones is to consider the functor F:

FT(S) = S + (T × S)

with the two value constructors (injective functions):

S : S → FT S and P : T × S → FT S

where P stands for “present” and S for “silent”. The set of streams is now:

CLStream(T, S) = (S → S + (T × S)) × S

Given (t, s) : CLStream(T, S), at each step the process can either be silent, that is, only update its state and return the next state without outputting a value, or output a value and return the next state.

90 / 107

slide-91
SLIDE 91

Then a transition function for even could be:

even (CoF(t, i)) =
  CoF (λ(e, s).
         match t s with
         | S(sx) → S(e, sx)
         | P(vx, sx) → if e then P(vx, (false, sx)) else S(true, sx),
       (true, i))

where e is a boolean state component telling whether the current step is an even one or not. However, the question is now: does this functor still define streams? An answer to this question is as follows.
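A sketch of this transition function in Python, encoding S(s) as ('S', s) and P(v, s) as ('P', v, s); the tag encoding and the helper names `even_co`, `nat` and `take` are our representation choices:

```python
def even_co(stream):
    # stream = (step, init) with step(s) -> ('S', s') | ('P', v, s')
    step, init = stream
    def step_even(state):
        e, s = state              # e: is the current step an even one?
        r = step(s)
        if r[0] == 'S':
            return ('S', (e, r[1]))
        _, v, s2 = r
        if e:
            return ('P', v, (False, s2))   # keep this value
        return ('S', (True, s2))           # skip this value
    return step_even, (True, init)

def nat():
    # the stream 0, 1, 2, ... as a (step, init) pair
    return (lambda n: ('P', n, n + 1)), 0

def take(stream, n):
    # run the process until n present values have been produced
    # (loops forever on a stream that stops producing)
    step, s = stream
    out = []
    while len(out) < n:
        r = step(s)
        if r[0] == 'P':
            out.append(r[1])
            s = r[2]
        else:
            s = r[1]
    return out
```

For example, `take(even_co(nat()), 3)` runs the process six steps, three of them silent, and returns `[0, 2, 4]`.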

91 / 107

slide-92
SLIDE 92

The co-algebra of clocked streams

Theorem (Co-algebra of clocked streams)

The terminal co-algebra associated to the functor FT(S) = S + (T × S) has:

  • as ground set, the set of streams of values in Value(T) = 1 + T, the set T complemented with an empty value, where 1 = {()}, with the value constructors E : Value(T) and V : T → Value(T):

Stream(T) = (Value(T))^N

  • and as destructor:

dest(v : vs) = match v with E → S(vs) | V(v′) → P(v′, vs)

92 / 107

slide-93
SLIDE 93

Proof:

Given a transition function tx : S → FT S, let us denote by next(.) the next-state function and by next(., .) its iteration:

next(s) = match tx s with S(s′) → s′ | P(v, s′) → s′
next(n, s) = if n = 0 then s else next(n − 1, next(s))

Any function run which makes the following diagram commute:

S ────────run───────→ (Value(T))^N
│                           │
tx                         dest
↓                           ↓
FT S ────FT(run)────→ FT((Value(T))^N)

93 / 107

slide-94
SLIDE 94

yields:

dest(run(s)) = match run(s)(0) with
               | E → S(λn. run(s)(n + 1))
               | V(v) → P(v, λn. run(s)(n + 1))
             = match tx s with
               | S(s′) → S(run(s′))
               | P(v, s′) → P(v, run(s′))

that is:

run(s)(0) = match tx s with S(s′) → E | P(v, s′) → V(v)
run(s)(n + 1) = match tx s with S(s′) → run(s′)(n) | P(v, s′) → run(s′)(n)
              = run(next(s))(n)

This uniquely defines run as:

run(s)(n) = match tx (next(n, s)) with S(s′) → E | P(v, s′) → V(v)
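The unique morphism run can be written down directly; a Python sketch under the same ('S', ·)/('P', ·, ·) encoding, with the empty value E rendered as None (names ours):

```python
def next_state(tx, s):
    # the next-state function: drop the produced value, if any
    r = tx(s)
    return r[1] if r[0] == 'S' else r[2]

def run(tx, s, n):
    # run(s)(n): the n-th element of the resulting clocked stream
    for _ in range(n):
        s = next_state(tx, s)
    r = tx(s)
    return None if r[0] == 'S' else r[1]

# a transition function over an integer state, silent on odd states
tx = lambda s: ('P', s, s + 1) if s % 2 == 0 else ('S', s + 1)
```

Here `[run(tx, 0, n) for n in range(4)]` gives the stream prefix 0, E, 2, E: the silent moves of the process become explicit empty values in the stream.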

94 / 107

slide-95
SLIDE 95

Definition (Clocks)

The clock of a clocked stream s : (1 + T)^N is the boolean stream:

clock(s) = λn. match s(n) with E → false | V(v) → true

Note that clocks are just ordinary streams, i.e. without E elements. Yet this result also shows that we can assimilate clocked streams with ordinary streams over values complemented with an “empty” value 9. This allows us to easily reuse the results developed previously for length preserving streams. We thus adopt this point of view in the sequel, by taking:

Value(T) = E + V(T)
CLStream(T, S) = CStream(Value(T), S)

9 This quite obvious result has been used and rediscovered many times since the pioneering work of F. Boussinot [1]. Yet, the above proof may bring some insight about the need for “empty” values.

95 / 107

slide-96
SLIDE 96

New Definitions for Primitives

We can now revisit our previously defined operators as well as create new ones. When defining binary operators, like extend, we now face the following problem: what to do if one argument yields a value while the other one does not?

At least three possibilities are open: (1) store the value in a state variable implementing a FIFO queue, until it matches an incoming value of the other argument; (2) generate an execution error; (3) statically reject this situation. As an extension of what is done for Lustre [5] we choose the third solution and write:

(CoF(tf, if)) (CoF(te, ie)) =
  CoF (λ(sf, se).
         match (tf sf), (te se) with
         | (E, sf′), (E, se′) → E, (sf′, se′)
         | (V(vf), sf′), (V(ve), se′) → V(vf ve), (sf′, se′),
       (if, ie))

96 / 107

slide-97
SLIDE 97

This definition is valid under the condition that the clocks of the two arguments are the same. Otherwise, the program raises an execution error (a pattern-matching failure). The purpose of the clock calculus is to statically ensure that such errors cannot occur. Once an expression has passed the analysis, clock information is used to remove the dynamic test of presence/absence.

97 / 107

slide-98
SLIDE 98

Primitive Functions

When: the co-iterative definition of the filter is as follows, assuming its two arguments share the same clock:

(CoF(tx, ix)) when (CoF(tc, ic)) =
  CoF (λ(sx, sc).
         match (tx sx), (tc sc) with
         | (E, sx′), (E, sc′) → E, (sx′, sc′)
         | (V(vx), sx′), (V(true), sc′) → V(vx), (sx′, sc′)
         | (V(vx), sx′), (V(false), sc′) → E, (sx′, sc′),
       (ix, ic))

The clock of the result depends on the boolean condition.

98 / 107

slide-99
SLIDE 99

If the clock of the two arguments is CoF(tcl, icl), the clock of the result is CoF(tcl, icl) on CoF(tc, ic):

CoF(tcl, icl) on CoF(tc, ic) =
  CoF (λ(scl, sc).
         match tcl scl with
         | false, scl′ → let E, sc′ = tc sc in false, (scl′, sc′)
         | true, scl′ → let V(vc), sc′ = tc sc in vc, (scl′, sc′),
       (icl, ic))

Note that, according to this definition, a clock is an ordinary stream which makes no “silent” move.

99 / 107

slide-100
SLIDE 100

Merge: the converse of when, whose abstract definition is:

merge (true : cs) (x : xs) ys = x : merge cs xs ys
merge (false : cs) xs (y : ys) = y : merge cs xs ys

and whose co-iterative definition is:

merge (CoF(tc, ic)) (CoF(tx, ix)) (CoF(ty, iy)) =
  CoF (λ(sc, sx, sy).
         match (tc sc), (tx sx), (ty sy) with
         | (E, sc′), (E, sx′), (E, sy′) → E, (sc′, sx′, sy′)
         | (V(true), sc′), (V(vx), sx′), (E, sy′) → V(vx), (sc′, sx′, sy′)
         | (V(false), sc′), (E, sx′), (V(vy), sy′) → V(vy), (sc′, sx′, sy′),
       (ic, ix, iy))

This definition raises an execution error unless, when the condition is true, the true branch produces a value and the false branch produces none, and conversely, when the condition is false, the false branch produces a value and the true branch produces none.
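A Python sketch of merge under the same encoding (names ours); any combination of presence and absence other than the three listed patterns is a clock error:

```python
def merge(c, x, y):
    # deterministic merge: when the condition is present and true, the
    # first branch must be present and the second absent; dually for
    # false; when the condition is absent, both branches must be absent
    (tc, ic), (tx, ix), (ty, iy) = c, x, y
    def step(state):
        sc, sx, sy = state
        rc, rx, ry = tc(sc), tx(sx), ty(sy)
        if rc[0] == 'S' and rx[0] == 'S' and ry[0] == 'S':
            return ('S', (rc[1], rx[1], ry[1]))
        if rc[0] == 'P' and rc[1] and rx[0] == 'P' and ry[0] == 'S':
            return ('P', rx[1], (rc[2], rx[2], ry[1]))
        if rc[0] == 'P' and not rc[1] and rx[0] == 'S' and ry[0] == 'P':
            return ('P', ry[1], (rc[2], rx[1], ry[2]))
        raise RuntimeError("clock error")
    return step, (ic, ix, iy)
```

With an alternating condition, a branch present on even steps and a branch present on odd steps, the merged stream is present at every step.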

100 / 107

slide-101
SLIDE 101

Constant: this operator is polymorphic in the sense that it may produce a value or not, depending on its environment. For this reason, const takes an extra argument giving its clock. We write it:

ClConst(v)(CoF(tcl, icl)) =
  CoF (λscl.
         match tcl scl with
         | true, s → V(v), s
         | false, s → E, s,
       icl)

The clock plays an essential role since this is the way to give a deterministic operational semantics to the generator const. The clock calculus can infer a clock for the constant so that it does not have to be explicitly passed.

101 / 107

slide-102
SLIDE 102

The unit delay: we can wonder whether the previous definition of pre extends naturally to programs which do not preserve length. Indeed, we could simply write:

pre(v)(CoF(t, i)) =
  CoF (λ(pre, s).
         match t s with
         | E, s′ → E, (pre, s′)
         | V(v′), s′ → V(pre), (v′, s′),
       (v, i))

Unfortunately, this definition cannot be combined with recursion in a satisfactory way. Running the co-iterative process:

fix (λx. pre(0)(x))

implementing the stream equation x = pre(0)(x) leads to a deadlock (corresponding to a “stack overflow” in Haskell).

102 / 107

slide-103
SLIDE 103

This is due to the fact that the input of pre is connected to its output and pre emits a value iff its input emits a value. The deadlock can be eliminated by adding an extra argument to pre, an input clock, that controls the production. The new definition becomes:

pre(v)(CoF(tcl, icl))(CoF(t, i)) =
  CoF (λ(pre, s, scl).
         match tcl scl with
         | false, scl′ → E, (let E, s′ = t s in (pre, s′, scl′))
         | true, scl′ → V(pre), (let V(v′), s′ = t s in (v′, s′, scl′)),
       (v, i, icl))

This time, programs are deadlock free when recursions are guarded by a pre. This new pre is satisfactory provided it is possible to build a system that infers the clock; this will be considered later. The definitions for application, abstraction and recursion remain unchanged.
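A Python sketch of the clocked pre (names ours). In this eager transcription the laziness argument disappears, but the clocked behaviour is visible: at a clock tick the output is the stored previous value, independently of the current input value:

```python
def pre(v0, clock, stream):
    # pre initialised with v0; 'clock' is an ordinary boolean stream
    # (it makes no silent move) driving when pre produces a value
    (tcl, icl), (t, i) = clock, stream
    def step(state):
        prev, s, scl = state
        b, scl = tcl(scl)
        r = t(s)
        if not b:
            assert r[0] == 'S', "clock error: input present off the clock"
            return ('S', (prev, r[1], scl))
        assert r[0] == 'P', "clock error: input absent on the clock"
        # emit the stored value, store the current input for next time
        return ('P', prev, (r[1], r[2], scl))
    return step, (v0, i, icl)
```

On the always-true clock, pre(0) applied to 0, 1, 2, ... produces 0, 0, 1, ...: the output at each tick was known before the input was inspected, which is what breaks the instantaneous cycle in x = pre(0)(x).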

103 / 107

slide-104
SLIDE 104

To conclude

This co-iterative semantics interprets a stream as a process made of an initial state and a step function. A length preserving function only needs its current input to produce its current output. A fix-point on time is replaced by a fix-point on the instant; if equations are syntactically guarded by unit delays, no fix-point is needed at all. In this case, the step function can be expressed in a simple language with a let/in construct and a call-by-value semantics. Non length preserving functions are treated by complementing instantaneous values with an explicit “absent” value. Non-synchrony means that an input is expected to be present but is absent (or the converse). Programs must fulfil static clocking constraints (the clock calculus).

104 / 107

slide-105
SLIDE 105

Next lesson

Define a causality analysis which gives sufficient conditions ensuring that streams never get stuck, i.e., the output does not contain any bottom element. Define it as a type system which expresses instantaneous dependences only. Show that for programs that have passed the analysis, it is possible to generate statically scheduled code. Extend the language with higher-order features, making it closer to a typed lambda calculus of streams and stream functions. Give examples of that extension in Zelus.

105 / 107

slide-106
SLIDE 106

References I

[1] F. Boussinot. Réseaux de processus réactifs. Technical Report 1588, INRIA Sophia-Antipolis, January 1992.

[2] Paul Caspi and Marc Pouzet. A Co-iterative Characterization of Synchronous Stream Functions. In Coalgebraic Methods in Computer Science (CMCS’98), Electronic Notes in Theoretical Computer Science, March 1998. Extended version available as VERIMAG tech. report no. 97-07 at www.di.ens.fr/∼pouzet/bib/bib.html.

[3] Jean-Louis Colaço, Bruno Pagano, and Marc Pouzet. Scade 6: A Formal Language for Embedded Critical Software Development. In Eleventh International Symposium on Theoretical Aspects of Software Engineering (TASE), Sophia Antipolis, France, September 13-15, 2017.

[4] L. Damas and R. Milner. Principal type-schemes for functional programs. In Conference on Principles of Programming Languages, 1982.

[5] N. Halbwachs, P. Caspi, P. Raymond, and D. Pilaud. The synchronous dataflow programming language Lustre. Proceedings of the IEEE, 79(9):1305–1320, September 1991.

[6] Christine Paulin-Mohring. Circuits as streams in Coq, verification of a sequential multiplier. Technical report, Laboratoire de l’Informatique du Parallélisme, September 1995. Available at http://www.ens-lyon.fr:80/LIP/lip/publis/.

106 / 107

slide-107
SLIDE 107

References II

[7] Marc Pouzet. Lucid Synchrone, version 3. Tutorial and reference manual. Université Paris-Sud, LRI, April 2006. Distribution available at: https://www.di.ens.fr/~pouzet/lucid-synchrone/.

107 / 107