SLIDE 1

String-diagram semantics for functional languages with data-flow

Steven Cheung & Dan Ghica & Koko Moruya University of Birmingham 5th Jan 2018

SLIDE 2

GoI machine

[Mackie ’95] [Danos & Regnier ’99]

  • Geometry of interaction [Girard ’89]
  • Token passing machine
  • a λ-term → a graph (proof-net)
  • evaluation = specific path travelled by the token
SLIDE 3

GoI machine

(λx.x + x) (1 + 1)

@ D + λ + C ! ! ! 1 1

SLIDE 4

GoI machine

(λx.x + x) (1 + 1)

@ D + λ + C ! ! ! 1 1

Contraction Abstraction Constants box structure Dereliction Binary addition Application

SLIDES 5–51

GoI machine

@ D + λ + C ! ! ! 1 1

Token-passing animation: the token travels through the fixed graph, and its stack evolves frame by frame as

? : □ → R : ? : □ → ? : □ → ? : ? : □ → L : ? : ? : □ → ? : ? : □ → ? : ? : ? : □ → 1 : ? : ? : □ → ? : 1 : ? : ? : □ → 1 : 1 : ? : ? : □ → 2 : ? : □ → L : 2 : ? : □ → 2 : ? : □ → ? : 2 : ? : □ → L : ? : 2 : ? : □ → ? : 2 : ? : □ → ? : ? : 2 : ? : □ → 1 : ? : 2 : ? : □ → ? : 1 : ? : 2 : ? : □ → 1 : 1 : ? : 2 : ? : □ → 2 : 2 : ? : □ → L : 2 : 2 : ? : □ → 2 : 2 : ? : □ → 4 : □ → R : 4 : □ → 4 : □

SLIDE 52

GoI machine

[Mackie ’95] [Danos & Regnier ’99]

  • Fixed graph ⇒ constant space
  • Compilation to circuit [Ghica ’07]
  • ✔ call-by-name, ? call-by-value
SLIDE 53

Dynamic GoI machine

[Muroya & Ghica ’17]

  • Framework for defining FPL semantics
  • Combine token passing & graph rewriting
  • different interleaving = different strategy
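The interleaving of traversal and local rewriting on the slides that follow can be sketched, very roughly, as small-step reduction over a term graph. The code below is a hedged illustration only (not the DGoIM itself): terms are plain tuples, `step` performs one local rewrite, and the printed trace mirrors the slides' reduction (λx.x + x)(1 + 1) ↦ (λx.x + x) 2 ↦ 2 + 2 ↦ 4. All names and the representation are made up for this sketch.

```python
def subst(body, v, val):
    """Capture-avoiding-enough substitution for this closed example."""
    tag = body[0]
    if tag == 'num':
        return body
    if tag == 'var':
        return val if body[1] == v else body
    if tag == 'add':
        return ('add', subst(body[1], v, val), subst(body[2], v, val))
    if tag == 'lam':
        return body if body[1] == v else ('lam', body[1], subst(body[2], v, val))
    return ('app', subst(body[1], v, val), subst(body[2], v, val))

def step(t):
    """One local rewrite (call-by-value, argument first); None if normal form."""
    tag = t[0]
    if tag == 'add':
        _, a, b = t
        if a[0] == 'num' and b[0] == 'num':
            return ('num', a[1] + b[1])
        s = step(a)
        if s is not None:
            return ('add', s, b)
        s = step(b)
        return ('add', a, s) if s is not None else None
    if tag == 'app':
        _, f, x = t
        s = step(x)
        if s is not None:
            return ('app', f, s)
        if f[0] == 'lam':
            return subst(f[2], f[1], x)
        s = step(f)
        return ('app', s, x) if s is not None else None
    return None

# (λx.x + x) (1 + 1)
term = ('app', ('lam', 'x', ('add', ('var', 'x'), ('var', 'x'))),
        ('add', ('num', 1), ('num', 1)))
while True:
    nxt = step(term)
    if nxt is None:
        break
    term = nxt
print(term)  # ('num', 4)
```

Reducing the argument first corresponds to one interleaving (call-by-value); a different traversal order would give a different strategy, as the slide says.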
SLIDE 54

Dynamic GoI machine

[Muroya & Ghica ’17]

(λx.x + x) (1 + 1)

@ D + λ + C ! ! ! 1 1

SLIDES 55–87

Dynamic GoI machine

Animation: token passing interleaved with graph rewriting.

(λx.x + x) (1 + 1)
  ↦ (λx.x + x) 2
  ↦ x + x [x ↦ 2]
  ↦ x + 2 [x ↦ 2]
  ↦ 2 + 2 [x ↦ 2]
  ↦ 4 [x ↦ 2]

Graph snapshots after each rewrite:

@ D + λ + C ! ! ! 1 1
@ D ! λ + C ! 2
@ ! λ 2 + C
+ 2 ! C
+ 2 ! C ! 2
+ 2 ! ! 2
! 4

SLIDE 88

DGoIM & data-flow model

  • GoI graph ~ data-flow graph
  • preserve data-flow by suppressing certain rewrites

SLIDE 90

DGoIM & data-flow model

+ 2 ¡ Ɔ

SLIDES 91–103

DGoIM & data-flow model

+ 2 ¡ Ɔ

Token-passing animation over the preserved data-flow graph; the token's stack evolves frame by frame as

? : □ → ? : ? : □ → 2 : ? : □ → ? : 2 : ? : □ → 2 : 2 : ? : □ → 4 : □

SLIDE 104

DGoIM & data-flow model

  • Natural and intuitive
  • Transitions are local
  • makes implementation more tractable
  • Examples:
  • PL for Machine learning
  • PL for Self-adjusting computation
SLIDE 105

Machine learning

Direct mode: given an input x0, compute m(x0), where m(x) = W × x + b.

Training mode: given data = {(xi, yi)}, learn the trained parameters W, b.
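The two modes can be rendered in plain Python as a sketch (hypothetical code with no TensorFlow dependency; it reuses the model and training data from the TensorFlow slides that follow):

```python
def model(W, b, x):
    return W * x + b                      # m(x) = W*x + b

def train(xs, ys, W=1.0, b=0.0, rate=0.01, steps=2000):
    """Gradient descent on the summed squared loss, as on the training slide."""
    for _ in range(steps):
        dW = sum(2 * (model(W, b, x) - y) * x for x, y in zip(xs, ys))
        db = sum(2 * (model(W, b, x) - y) for x, y in zip(xs, ys))
        W -= rate * dW
        b -= rate * db
    return W, b

print(model(1.0, 0.0, 7))                 # direct mode: 7.0
W, b = train([1, 2, 3, 4], [0, -1, -2, -3])
print(W, b)                               # training mode: close to -1.0 and 1.0
```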

SLIDE 106

TensorFlow

import tensorflow as tf

# Model parameters
W = tf.Variable([1], dtype=tf.float32)
b = tf.Variable([0], dtype=tf.float32)

# Model input and output
x = tf.placeholder(tf.float32)
linear_model = W * x + b
y = tf.placeholder(tf.float32)

SLIDE 107

TensorFlow

import tensorflow as tf

# Model parameters
W = tf.Variable([1], dtype=tf.float32)
b = tf.Variable([0], dtype=tf.float32)

# Model input and output
x = tf.placeholder(tf.float32)
linear_model = W * x + b
y = tf.placeholder(tf.float32)

sess = tf.Session()
sess.run(tf.global_variables_initializer())
print(sess.run(linear_model, {x: 7}))  # output 7

Direct mode

SLIDE 108

TensorFlow

import tensorflow as tf

# Model parameters
W = tf.Variable([1], dtype=tf.float32)
b = tf.Variable([0], dtype=tf.float32)

# Model input and output
x = tf.placeholder(tf.float32)
linear_model = W * x + b
y = tf.placeholder(tf.float32)

# loss
loss = tf.reduce_sum(tf.square(linear_model - y))

# optimizer
optimizer = tf.train.GradientDescentOptimizer(0.01)
train = optimizer.minimize(loss)

# training data
x_train = [1, 2, 3, 4]
y_train = [0, -1, -2, -3]

# training loop
init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)  # reset values to wrong
for i in range(1000):
    sess.run(train, {x: x_train, y: y_train})

# evaluate training accuracy
curr_W, curr_b, curr_loss = sess.run([W, b, loss], {x: x_train, y: y_train})
print("W: %s b: %s loss: %s" % (curr_W, curr_b, curr_loss))

Training mode

SLIDE 109

DSL for machine learning

  • Goals:
  • Proper functional language for machine learning
  • Support both direct and training mode seamlessly
SLIDE 110

DSL for machine learning

let loss f p = … in
let grad_desc f p loss rate = … in
let linear_model x = {1} * x + {0} in
let output = linear_model 7 in
let f@p = linear_model in
let trained_model = f (grad_desc f p loss rate) in
…

SLIDE 111

DSL for machine learning

let loss f p = … in
let grad_desc f p loss rate = … in
let linear_model x = {1} * x + {0} in
let output = linear_model 7 in
let f@p = linear_model in
let trained_model = f (grad_desc f p loss rate) in
…

λ ? D + * ᗡ ᗡ D ¿ ¿ ! ¡ ¡ 1

Parameters

SLIDE 112

DSL for machine learning

let loss f p = … in
let grad_desc f p loss rate = … in
let linear_model x = {1} * x + {0} in
let output = linear_model 7 in
let f@p = linear_model in
let trained_model = f (grad_desc f p loss rate) in
…

Direct mode

SLIDE 113

DSL for machine learning

let loss f p = … in
let grad_desc f p loss rate = … in
let linear_model x = {1} * x + {0} in
let output = linear_model 7 in
let f@p = linear_model in
let trained_model = f (grad_desc f p loss rate) in
…

Decoupling: let f@p = linear_model in …, where f ≜ λp. λx. p[0] * x + p[1] and p ≜ [1, 0]
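A hypothetical Python sketch of this decoupling: a model with its parameters {1}, {0} baked in is split into a parameter-abstracted function f and a parameter list p, so that f(p) rebuilds the original model (the `decouple` helper is invented for the sketch):

```python
linear_model = lambda x: 1 * x + 0        # parameters embedded: {1} and {0}

def decouple():
    f = lambda p: (lambda x: p[0] * x + p[1])   # f = λp. λx. p[0]*x + p[1]
    p = [1, 0]                                  # p = [1, 0]
    return f, p

f, p = decouple()
print(f(p)(7))        # direct mode unchanged: 7
print(f([-1, 1])(3))  # training mode: apply f to updated parameters → -2
```

Applying f to the parameters an optimiser returns gives the trained model, without touching the model's code.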

SLIDE 114

DSL for machine learning

let f@p = linear_model in …
f ≜ λp. λx. p[0] * x + p[1]
p ≜ [1, 0]

Decoupling (graph before and after):

A ! λ ? D + * ᗡ ᗡ D ¿ ¿ ! ¿ ¿ ¡ ¡ 1

! ! D ! [1,0] λ D P2 ! λ ? D + * D D D ? ? ! ? ?

SLIDE 115

DSL for machine learning

let loss f p = … in
let grad_desc f p loss rate = … in
let linear_model x = {1} * x + {0} in
let output = linear_model 7 in
let f@p = linear_model in
let trained_model = f (grad_desc f p loss rate) in
…

Training mode

SLIDE 116

Parameters management

Confidence interval (without decoupling)

let linear W b x = W * x + b in
let lower p = linear (nth 0 p) (nth 1 p) in
let upper p = linear (nth 0 p) (nth 2 p) in
let ci p = (lower [(nth 0 p);(nth 1 p)], upper [(nth 0 p);(nth 2 p)]) in
let newP = optimise_ci ci [1;1;2] … in
let newCi = ci newP in
…

SLIDE 117

Parameters management

Confidence interval, without decoupling:

let linear W b x = W * x + b in
let lower p = linear (nth 0 p) (nth 1 p) in
let upper p = linear (nth 0 p) (nth 2 p) in
let ci p = (lower [(nth 0 p);(nth 1 p)], upper [(nth 0 p);(nth 2 p)]) in
let newP = optimise_ci ci [1;1;2] … in
let newCi = ci newP in
…

With decoupling:

let linear W b x = W * x + b in
let a = {1} in
let lower = linear a {1} in
let upper = linear a {2} in
let ci = (lower, upper) in
let f@p = ci in
let newP = optimise_ci f p … in
let newCi = f newP in
…
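A hypothetical Python rendering of the decoupled version: the shared slope parameter a = {1} is packed once into the parameter list, so both interval bounds move together when the optimiser returns new parameters (`make_ci` is an invented helper standing in for the decoupled f):

```python
def linear(W, b):
    return lambda x: W * x + b

def make_ci(p):                  # p packs [a, b_lower, b_upper]
    lower = linear(p[0], p[1])   # both bounds share the slope p[0]
    upper = linear(p[0], p[2])
    return lambda x: (lower(x), upper(x))

ci = make_ci([1, 1, 2])
print(ci(10))                    # (11, 12)
new_ci = make_ci([2, 1, 2])      # e.g. the optimiser returned a new slope
print(new_ci(10))                # (21, 22): both bounds updated in sync
```

Without decoupling, the slope would have to be copied into each bound's parameter list and kept consistent by hand, as in the first version above.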

SLIDE 118

DSL for machine learning

  • Proved soundness (including garbage collection)
  • Looking into equational theory (beta, eta, etc)
  • Implementation:
  • https://cwtsteven.github.io/GoI-TF-Visualiser/
  • as an extension to OCaml (work in progress)
SLIDE 119

Self-adjusting Computation

[Acar ’05]

  • Concerns the input–output relationship of a program
  • Efficient updates
  • using dependency graph and memoization
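The dependency-graph idea can be sketched in Python (a much-simplified, hypothetical illustration, not Acar's algorithm: each cell records its dependents, setting a cell marks everything downstream dirty, and reading recomputes only dirty cells):

```python
class Cell:
    def __init__(self, fn, deps=()):
        self.fn, self.deps = fn, list(deps)
        self.dependents, self.dirty, self.value = [], True, None
        for d in self.deps:
            d.dependents.append(self)

    def get(self):
        if self.dirty:                    # recompute only when stale
            self.value = self.fn(*[d.get() for d in self.deps])
            self.dirty = False
        return self.value

    def set(self, v):
        self.fn = lambda: v
        stack = [self]
        while stack:                      # mark transitive dependents dirty
            c = stack.pop()
            c.dirty = True
            stack.extend(c.dependents)

x = Cell(lambda: 1)
y = Cell(lambda: 2)
m = Cell(lambda a: a + 1, [x])
n = Cell(lambda a: a + 1, [y])
z = Cell(lambda a, b: a + b, [m, n])
print(z.get())   # 5
x.set(3)
print(z.get())   # 7 (only x, m, z recomputed; y and n untouched)
```

This is exactly the spreadsheet behaviour shown on the next slides.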
SLIDES 120–123

Self-adjusting Computation

[Acar ’05]

  • Think of spreadsheets

let x = 1 in
let y = 2 in
let m = x + 1 in
let n = y + 1 in
let z = m + n in
set x to 3;
stabilise();

Dependency-graph animation: initially x = 1, y = 2, m = x + 1 = 2, n = y + 1 = 3, z = m + n = 5. After set x to 3 and stabilise(), only the cells downstream of x are recomputed: m = 4 and z = 7, while y and n are untouched. The final frame flags circular dependence as the problematic case.

SLIDE 124

DSL for SAC

  • Allow circular dependence
  • Break down stabilisation into step by step propagation
SLIDE 125

DSL for SAC

let x = {1} in
let y = {x + 2} in
set x to y + 3;
let b = prop in
let b = prop in
y

M 1 ! M 1 + ! 2 3   (cells x and y; legend: Value, Dependency)

SLIDES 126–139

DSL for SAC (token-passing animation)

let x = {1} in
let y = {x + 2} in
set x to y + 3;
let b = prop in
let b = prop in
y

C M 1 + + ! 2 M 3 ! 3   (cells x and y)

Token-stack traces through the frames:

x: ? : □ → ? : ? : □ → 3 : ? : □ → ? : 3 : ? : □ → 3 : 3 : ? : □ → 6 : □
y: ? : □ → ? : ? : □ → 2 : ? : □ → ? : 2 : ? : □ → 1 : 2 : ? : □ → 3 : □

After propagation (b = true) the graph holds x = 6 (C M 6 + + ! 2 M 3 ! 3) and then y = 8 (C M 6 + + ! 2 M 8 ! 3).

SLIDE 140

Comparison

  • Simulate stabilise()
  • let rec stabilise = if prop then stabilise else ()
  • Synchronous languages
  • Create data-flow equations
  • X(n) = Y(n-1) + Z(n)
  • Lustre: X = pre(Y) + Z
  • Ours: let x = {y} + z
  • prop = explicit clock tick
  • set x to … = alter data-flow equations
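The Lustre-style data-flow equation above can be read as a stream computation; a hypothetical Python sketch (here pre is initialised with 0, whereas real Lustre uses nil and the -> operator for the first tick):

```python
def pre(stream, init=0):
    """Delay a stream by one tick: pre(Y)(n) = Y(n-1)."""
    prev = init
    for v in stream:
        yield prev
        prev = v

Y = [1, 2, 3, 4]
Z = [10, 10, 10, 10]
# X(n) = Y(n-1) + Z(n), i.e. Lustre's X = pre(Y) + Z
X = [y + z for y, z in zip(pre(Y), Z)]
print(X)   # [10, 11, 12, 13]
```

In the DSL of the previous slides, {y} plays the role of pre(y), and each prop is an explicit clock tick.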
SLIDE 141

Comparison

  • FRP vs Self-adjusting computation
  • Synchronous languages vs Ours
SLIDE 142

DSL for SAC

  • Proving soundness and other equational laws
  • Implementation:
  • https://cwtsteven.github.io/GoI-SAC-Visualiser/
  • as an extension to OCaml (work in progress)
SLIDE 143

Conclusion

  • Example PLs:
  • PL for machine learning
  • PL for self-adjusting computation
  • GoI-style semantics
  • natural and intuitive for data flow models