String-diagram semantics for functional languages with data-flow
Steven Cheung, Dan Ghica & Koko Muroya (University of Birmingham), 5th Jan 2018
Geometry of Interaction [Girard ’89]; GoI machine [Mackie ’95] [Danos & Regnier ’99]
(λx.x + x) (1 + 1)

The GoI machine evaluates this term by routing a single token through a fixed graph; the graph itself is never rewritten. The graph is built from application (@), abstraction (λ), dereliction (D), contraction (C), binary addition (+), constants (1), and the !-box structure.

The token carries a data stack (□ is the empty stack, ? a query, L/R routing directions). Over the run its stack evolves through, in order:

? : □
R : ? : □
? : □
? : ? : □
L : ? : ? : □
? : ? : □
? : ? : ? : □
1 : ? : ? : □
? : 1 : ? : ? : □
1 : 1 : ? : ? : □
2 : ? : □
L : 2 : ? : □
2 : ? : □
? : 2 : ? : □
L : ? : 2 : ? : □
? : 2 : ? : □
? : ? : 2 : ? : □
1 : ? : 2 : ? : □
? : 1 : ? : 2 : ? : □
1 : 1 : ? : 2 : ? : □
2 : 2 : ? : □
L : 2 : 2 : ? : □
2 : 2 : ? : □
4 : □
R : 4 : □
4 : □

The token exits with 4 : □. Note that the argument 1 + 1 is re-evaluated once for each of the two uses of x before the machine returns 4.
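As a sanity check on the trace's final answer, a minimal environment-based evaluator for the example term (a Python sketch of ordinary call-by-value evaluation, not of the GoI machine; all names are ours) also yields 4:

```python
# Minimal call-by-value evaluator for the example term
# (lambda x. x + x)(1 + 1), given as a nested-tuple AST.
# A sketch for checking the trace's answer, not a GoI machine.

def evaluate(term, env=None):
    """Evaluate a tiny lambda-term AST under an environment."""
    env = env or {}
    tag = term[0]
    if tag == "num":
        return term[1]
    if tag == "var":
        return env[term[1]]
    if tag == "add":
        return evaluate(term[1], env) + evaluate(term[2], env)
    if tag == "app":
        _, fun, arg = term
        _, param, body = fun            # fun is ("lam", param, body)
        value = evaluate(arg, env)      # evaluate the argument once
        return evaluate(body, {**env, param: value})
    raise ValueError(f"unknown tag: {tag}")

term = ("app",
        ("lam", "x", ("add", ("var", "x"), ("var", "x"))),
        ("add", ("num", 1), ("num", 1)))
print(evaluate(term))  # 4
```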
Dynamic GoI machine [Muroya & Ghica ’17]
In the dynamic GoI machine the token instead triggers graph rewriting: the graph reduces as the token passes, mirroring the term reductions

(λx.x + x) (1 + 1)
↦ (λx.x + x) 2
↦ x + x [x ↦ 2]
↦ x + 2 [x ↦ 2]
↦ 2 + 2 [x ↦ 2]
↦ 4 [x ↦ 2]

First the subgraph for 1 + 1 collapses to a shared box ! 2; β-reduction then eliminates the @/λ pair; the contraction C feeds the shared ! 2 to both uses of x; finally the addition fires, leaving ! 4. Each subterm is now evaluated at most once.
Running the token on the intermediate graph for x + x [x ↦ 2] (an addition node whose two inputs share the box ! 2 through a contraction C), the token stack evolves

? : □ → ? : ? : □ → 2 : ? : □ → ? : 2 : ? : □ → 2 : 2 : ? : □ → 4 : □

and the machine again returns 4.
Direct mode and training mode
A linear model m(x) = W × x + b has trained parameters W, b, learned from data = {(xᵢ, yᵢ)}. In direct mode the trained model is simply applied to a fresh input x₀ to produce the prediction m(x₀).
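Direct mode can be sketched in a few lines of plain Python, using the initial parameters from the slides (W = 1, b = 0; the name m is ours):

```python
# Direct mode: apply the model m(x) = W * x + b directly to an input.
# Initial parameters from the slides: W = 1, b = 0.
W, b = 1.0, 0.0

def m(x):
    return W * x + b

print(m(7))  # 7.0
```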
import tensorflow as tf
# Model parameters
W = tf.Variable([1], dtype=tf.float32)
b = tf.Variable([0], dtype=tf.float32)
# Model input and output
x = tf.placeholder(tf.float32)
linear_model = W * x + b
y = tf.placeholder(tf.float32)

sess = tf.Session()
sess.run(tf.global_variables_initializer())  # variables must be initialised
print(sess.run(linear_model, {x: 7}))  # output: [7.]

Direct mode
import tensorflow as tf
# Model parameters
W = tf.Variable([1], dtype=tf.float32)
b = tf.Variable([0], dtype=tf.float32)
# Model input and output
x = tf.placeholder(tf.float32)
linear_model = W * x + b
y = tf.placeholder(tf.float32)
# loss
loss = tf.reduce_sum(tf.square(linear_model - y))
# optimizer
optimizer = tf.train.GradientDescentOptimizer(0.01)
train = optimizer.minimize(loss)
# training data
x_train = [1, 2, 3, 4]
y_train = [0, -1, -2, -3]
# training loop
init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)  # reset values to wrong
for i in range(1000):
    sess.run(train, {x: x_train, y: y_train})
# evaluate training accuracy
curr_W, curr_b, curr_loss = sess.run([W, b, loss], {x: x_train, y: y_train})
print("W: %s b: %s loss: %s" % (curr_W, curr_b, curr_loss))

Training mode
let loss f p = … in
let grad_desc f p loss rate = … in
let linear_model x = {1} * x + {0} in
let output = linear_model 7 in
let f@p = linear_model in
let trained_model = f (grad_desc f p loss rate) in …
[Figure: string diagram of linear_model, with the boxed parameters {1} and {0} highlighted as parameter nodes.]
Direct mode: output evaluates to 7 with the initial parameters. Decoupling the model from its parameters with let f@p = linear_model in … splits it into a parameter-abstracted function and its current parameters:

f ≜ λp. λx. p[0] * x + p[1]
p ≜ [1, 0]
Decoupling

[Figure: the graph of linear_model is rewritten so that the parameter boxes are extracted into the list [1, 0], paired with a λ-abstracted graph that takes the parameters as an argument.]
Training mode
Confidence interval
Without decoupling

let linear W b x = W * x + b in
let lower p = linear (nth 0 p) (nth 1 p) in
let upper p = linear (nth 0 p) (nth 2 p) in
let ci p = (lower [(nth 0 p);(nth 1 p)], upper [(nth 0 p);(nth 2 p)]) in
let newP = optimise_ci ci [1;1;2] … in
let newCi = ci newP in …
With decoupling
let linear W b x = W * x + b in
let a = {1} in
let lower = linear a {1} in
let upper = linear a {2} in
let ci = (lower, upper) in
let f@p = ci in
let newP = optimise_ci f p … in
let newCi = f newP in …
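The point of f@p here can be sketched in plain Python (hypothetical names; optimise_ci is elided, so we supply updated parameters by hand): both bounds are produced by one function f of a single parameter vector, so an optimiser can update the shared slope p[0] exactly once.

```python
# Decoupled confidence interval: one function f over one parameter
# vector p = [slope, lower intercept, upper intercept]; both bounds
# share the slope p[0]. optimise_ci from the slide is elided.

def linear(W, b, x):
    return W * x + b

def f(p):
    lower = lambda x: linear(p[0], p[1], x)
    upper = lambda x: linear(p[0], p[2], x)
    return lower, upper

lower, upper = f([1, 1, 2])
print(lower(10), upper(10))   # 11 12

# After an (elided) optimiser step produces new parameters:
lower, upper = f([2, 0, 1])
print(lower(10), upper(10))   # 20 21
```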
Self-adjusting computation [Acar ’05]

let x = 1 in
let y = 2 in
let m = x + 1 in
let n = y + 1 in
let z = m + n in
set x to 3; stabilise();

The program records a dependency graph. Initially x = 1, y = 2, m = 2, n = 3, z = 5. After set x to 3, stabilise() re-runs only the computations reachable from the change: m becomes 4 and z becomes 7, while y and n are untouched. One hazard for change propagation, flagged on the slide: circular dependence.
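The stabilise() behaviour can be sketched with an explicit dependency graph (a toy Python version for this one example, not Acar's library; all names are ours):

```python
# Toy self-adjusting computation: cells record which cells read them;
# stabilise() recomputes only nodes downstream of changed inputs.

class Cell:
    def __init__(self, value, compute=None, deps=()):
        self.value, self.compute, self.deps = value, compute, deps
        self.readers = []
        for d in deps:
            d.readers.append(self)

dirty = []

def set_cell(cell, value):
    cell.value = value
    dirty.append(cell)

def stabilise():
    while dirty:
        cell = dirty.pop(0)
        for reader in cell.readers:
            new = reader.compute(*(d.value for d in reader.deps))
            if new != reader.value:      # propagate only real changes
                reader.value = new
                dirty.append(reader)

x = Cell(1)
y = Cell(2)
m = Cell(2, lambda v: v + 1, (x,))       # m = x + 1
n = Cell(3, lambda v: v + 1, (y,))       # n = y + 1
z = Cell(5, lambda a, b: a + b, (m, n))  # z = m + n

set_cell(x, 3)
stabilise()
print(m.value, n.value, z.value)  # 4 3 7
```

Note that n is never recomputed: the change to x only reaches m and z. (This naive worklist would loop forever on a circular dependence, the hazard flagged above.)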
Cells in the graph-rewriting machine:

let x = {1} in
let y = {x + 2} in
set x to y + 3;
let b = prop in
let b = prop in
y

[Figure: each cell is a node (M) in the graph holding its current value together with its recorded dependency; x and y are cells.] Initially x = 1 and y = x + 2 = 3. Token passing drives the updates: set x to y + 3 reads y (giving 3) and overwrites x's cell with 6; each prop re-runs the recorded dependencies, so the first prop updates y's cell to x + 2 = 8 (with b = true) and the second finds nothing left to change. The final graph holds x = 6 and y = 8, and the program returns y = 8.
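The intended behaviour of {…}, set, and prop can be sketched in Python (a hypothetical encoding for this one program, not the talk's graph-rewriting semantics; all names are ours):

```python
# Sketch of the cell language: {e} creates a cell, `set` overwrites a
# cell's value, and prop re-runs the dependencies recorded at creation.

cells = {}
deps = []   # (cell name, recompute thunk) recorded at creation

def new_cell(name, value, recompute=None):
    cells[name] = value
    if recompute:
        deps.append((name, recompute))

def prop():
    changed = False
    for name, recompute in deps:
        new = recompute()
        if new != cells[name]:
            cells[name] = new
            changed = True
    return changed

new_cell("x", 1)                                        # let x = {1}
new_cell("y", cells["x"] + 2, lambda: cells["x"] + 2)   # let y = {x + 2}
cells["x"] = cells["y"] + 3      # set x to y + 3  ->  x = 6
b = prop()                       # first prop: y = x + 2 = 8, b = True
b = prop()                       # second prop: nothing changes
print(cells["x"], cells["y"])    # 6 8
```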