String-diagram semantics for functional languages with data-flow


SLIDE 1

String-diagram semantics for functional languages with data-flow

Steven Cheung, Dan Ghica & Koko Muroya
University of Birmingham
9 September 2017

SLIDE 2

GoI machine

[Mackie ’95] [Danos & Regnier ’99]

  • Geometry of interaction [Girard ’89]
  • a λ-term → a graph (proof-net)
  • evaluation = specific path in the fixed graph
  • ✔ call-by-name, ? call-by-value
SLIDE 3

Dynamic GoI machine

[Muroya & Ghica ’17]

  • A framework for defining semantics of functional programming languages
  • Combines token passing & graph rewriting
  • Different interleavings = different evaluation strategies
SLIDE 4

Dynamic GoI machine

[Muroya & Ghica ’17]

[string diagram of the machine graph]

(λx.x + x) (1 + 1)

SLIDE 5

Dynamic GoI machine

[Muroya & Ghica ’17]

[string diagram of the machine graph]

(λx.x + x) (1 + 1)

SLIDE 6

Dynamic GoI machine

[Muroya & Ghica ’17]

[string diagrams of the machine graph, before and after the rewrite]

(λx.x + x) (1 + 1) → (λx.x + x) 2

SLIDE 7

Dynamic GoI machine

[Muroya & Ghica ’17]

[string diagram of the machine graph]

(λx.x + x) (1 + 1) → (λx.x + x) 2

SLIDE 8

Dynamic GoI machine

[Muroya & Ghica ’17]

[string diagrams of the machine graph, before and after the rewrite]

(λx.x + x) (1 + 1) → (λx.x + x) 2 → x + x [x ↦ 2]

SLIDE 9

Dynamic GoI machine

[Muroya & Ghica ’17]

[string diagram of the machine graph]

(λx.x + x) (1 + 1) → (λx.x + x) 2 → x + x [x ↦ 2]

SLIDE 10

Dynamic GoI machine

[Muroya & Ghica ’17]

[string diagrams of the machine graph, before and after the rewrite]

(λx.x + x) (1 + 1) → (λx.x + x) 2 → x + x [x ↦ 2] → x + 2 [x ↦ 2]

SLIDE 11

Dynamic GoI machine

[Muroya & Ghica ’17]

[string diagram of the machine graph]

(λx.x + x) (1 + 1) → (λx.x + x) 2 → x + x [x ↦ 2] → x + 2 [x ↦ 2] → 2 + 2

SLIDE 12

Dynamic GoI machine

[Muroya & Ghica ’17]

[string diagrams of the machine graph, before and after the rewrite]

(λx.x + x) (1 + 1) → (λx.x + x) 2 → x + x [x ↦ 2] → x + 2 [x ↦ 2] → 2 + 2 → 4

SLIDE 13

Dynamic GoI machine

[Muroya & Ghica ’17]

[string diagram of the machine graph]

(λx.x + x) (1 + 1)
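For comparison, the reduction sequence traced on the slides above can be reproduced at the level of terms. Below is a minimal plain-Python sketch of a left-to-right call-by-value reducer for this small fragment (numbers, addition, λ, application); it only illustrates the evaluation order the machine follows, not the graph machine itself, and all helper names (subst, step, show) are ours. Its printed trace is (λx.(x + x)) (1 + 1), (λx.(x + x)) 2, (2 + 2), 4; the explicit-substitution steps x + x [x ↦ 2] and x + 2 [x ↦ 2] shown on the slides have no separate counterpart in this term-level view.

# Minimal left-to-right call-by-value reducer for the lambda-plus-addition
# fragment used on the slides. Terms are tuples:
#   ('var', name) | ('num', n) | ('lam', name, body) | ('app', f, a) | ('add', l, r)
# Illustrative term-level sketch only, not the DGoIM graph machine.

def is_value(t):
    return t[0] in ('num', 'lam')

def subst(t, x, v):
    """Substitute value v for variable x (the example has no shadowing,
    so full capture-avoiding substitution is not needed here)."""
    tag = t[0]
    if tag == 'var':
        return v if t[1] == x else t
    if tag == 'num':
        return t
    if tag == 'lam':
        return t if t[1] == x else ('lam', t[1], subst(t[2], x, v))
    if tag == 'app':
        return ('app', subst(t[1], x, v), subst(t[2], x, v))
    if tag == 'add':
        return ('add', subst(t[1], x, v), subst(t[2], x, v))

def step(t):
    """One small step, reducing the leftmost-innermost redex (call-by-value)."""
    tag = t[0]
    if tag == 'add':
        l, r = t[1], t[2]
        if not is_value(l):
            return ('add', step(l), r)
        if not is_value(r):
            return ('add', l, step(r))
        return ('num', l[1] + r[1])
    if tag == 'app':
        f, a = t[1], t[2]
        if not is_value(f):
            return ('app', step(f), a)
        if not is_value(a):
            return ('app', f, step(a))
        return subst(f[2], f[1], a)          # beta reduction
    raise ValueError('stuck term: %r' % (t,))

def show(t):
    tag = t[0]
    if tag == 'num': return str(t[1])
    if tag == 'var': return t[1]
    if tag == 'lam': return '(λ%s.%s)' % (t[1], show(t[2]))
    if tag == 'app': return '%s %s' % (show(t[1]), show(t[2]))
    if tag == 'add': return '(%s + %s)' % (show(t[1]), show(t[2]))

# (λx. x + x) (1 + 1)
term = ('app', ('lam', 'x', ('add', ('var', 'x'), ('var', 'x'))),
               ('add', ('num', 1), ('num', 1)))
while not is_value(term):
    print(show(term))
    term = step(term)
print(show(term))   # trace ends in 4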

SLIDE 14

DGoIM & data-flow model

  • Machine graph ~ data-flow graph
  • GoI-style semantics:
    • natural & intuitive
    • makes the complex operational semantics tractable
  • Examples:
    • a PL for machine learning
    • a PL for self-adjusting computation
SLIDE 15

Machine learning

  • m(x) = W * x + b
  • init: W = 1, b = 0
  • tune the values of W and b to minimise the loss function (a plain-Python sketch follows below)
  • TensorFlow
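Before the TensorFlow version on the next slides, the tuning step can be written out directly. The sketch below is a plain-Python illustration, not code from the talk: gradient descent on W and b for m(x) = W * x + b with the squared-error loss, starting from W = 1 and b = 0; the training data and the learning rate 0.01 are taken from the TensorFlow listing that follows.

# Gradient descent for m(x) = W*x + b with squared-error loss.
# Data and hyperparameters mirror the TensorFlow listing on the next slide;
# the implementation itself is only an illustrative sketch.
x_train = [1, 2, 3, 4]
y_train = [0, -1, -2, -3]
W, b = 1.0, 0.0            # initial parameters, as on the slide
rate = 0.01                # learning rate

for _ in range(1000):
    # loss = sum_i (W*x_i + b - y_i)^2; its gradients in W and b are:
    dW = sum(2 * (W * x + b - y) * x for x, y in zip(x_train, y_train))
    db = sum(2 * (W * x + b - y)     for x, y in zip(x_train, y_train))
    W -= rate * dW
    b -= rate * db

loss = sum((W * x + b - y) ** 2 for x, y in zip(x_train, y_train))
print("W: %s b: %s loss: %s" % (W, b, loss))   # converges towards W = -1, b = 1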
SLIDE 16

TensorFlow

import tensorflow as tf
# Model parameters
W = tf.Variable([1], dtype=tf.float32)
b = tf.Variable([0], dtype=tf.float32)
# Model input and output
x = tf.placeholder(tf.float32)
linear_model = W * x + b
y = tf.placeholder(tf.float32)
# loss
loss = tf.reduce_sum(tf.square(linear_model - y))  # sum of the squares
# optimizer
optimizer = tf.train.GradientDescentOptimizer(0.01)
train = optimizer.minimize(loss)
# training data
x_train = [1, 2, 3, 4]
y_train = [0, -1, -2, -3]
# training loop
init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)  # reset values to wrong
for i in range(1000):
    sess.run(train, {x: x_train, y: y_train})
# evaluate training accuracy
curr_W, curr_b, curr_loss = sess.run([W, b, loss], {x: x_train, y: y_train})
print("W: %s b: %s loss: %s" % (curr_W, curr_b, curr_loss))

SLIDE 17

TensorFlow

(Same TensorFlow listing as SLIDE 16.)

SLIDE 18

TensorFlow

(Same TensorFlow listing as SLIDE 16.)

SLIDE 19

TensorFlow

(Same TensorFlow listing as SLIDE 16.)

SLIDE 20

DSL for machine learning

let loss f p = … in
let grad_desc f p loss rate = … in
let linear_model x = {1} * x + {0} in
let f@p = linear_model in
let linear_model’ = f (grad_desc f p loss rate) in
…

SLIDE 21

DSL for machine learning

(Same listing as SLIDE 20.)

SLIDE 22

DSL for machine learning

let f@p = linear_model in …

f ≜ λp. λx. p[0] * x + p[1]
p ≜ [1, 0]

“Abductive” decoupling

let loss f p = … in
let grad_desc f p loss rate = … in
let linear_model x = {1} * x + {0} in
let f@p = linear_model in
let linear_model’ = f (grad_desc f p loss rate) in
…

SLIDE 23

DSL for machine learning

(Same content as SLIDE 22.)
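Read as ordinary higher-order code, the decoupled form above (f ≜ λp. λx. p[0] * x + p[1], p ≜ [1, 0]) can be sketched in plain Python. The grad_desc and loss below are hypothetical stand-ins for the elided DSL definitions, and the {…} provisional constants and the @-binding have no direct Python counterpart; the sketch only illustrates the intended behaviour of linear_model’ = f (grad_desc f p loss rate).

# Plain-Python reading of the decoupled model: a closed function f over a
# parameter vector p, as on the slide (f ≜ λp. λx. p[0]*x + p[1], p ≜ [1, 0]).
# grad_desc and loss are hypothetical stand-ins for the elided DSL definitions.
f = lambda p: lambda x: p[0] * x + p[1]
p = [1.0, 0.0]

x_train = [1, 2, 3, 4]
y_train = [0, -1, -2, -3]

def loss(f, p):
    m = f(p)
    return sum((m(x) - y) ** 2 for x, y in zip(x_train, y_train))

def grad_desc(f, p, loss, rate, steps=1000, h=1e-6):
    """Numerically minimise loss(f, p) over p using finite-difference gradients."""
    p = list(p)
    for _ in range(steps):
        grad = []
        for i in range(len(p)):
            bumped = list(p)
            bumped[i] += h
            grad.append((loss(f, bumped) - loss(f, p)) / h)
        p = [pi - rate * gi for pi, gi in zip(p, grad)]
    return p

rate = 0.01
linear_model_tuned = f(grad_desc(f, p, loss, rate))   # plays the role of linear_model’
print(linear_model_tuned(2.0))                        # roughly -1 for the data above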

SLIDE 24

Abductive decoupling

let f@p = λx. {1} * x + {0} in f p

[string diagram of the machine graph for the term above]

SLIDE 25

Abductive decoupling

[string diagrams of the machine graph, before and after a rewrite]

SLIDE 26

Abductive decoupling

[string diagram of the machine graph]

SLIDE 27

Abductive decoupling

[string diagram of the machine graph]

SLIDE 28

Abductive decoupling

[string diagrams of the machine graph, before and after the decoupling rewrite]

SLIDE 29

Abductive decoupling

[string diagrams of the machine graph, before and after the decoupling rewrite]

f ≜ λp. λx. p[0] * x + p[1]
p ≜ [1, 0]

SLIDE 30

Self-adjusting Computation

[Acar ’05]

  • Spreadsheet meets functional programming
  • Adjust the output with minimal re-computation

let x = 1, y = 2, m = x + 1, n = y + 1 in m + n

[data-flow graph: x = 1, y = 2, m = x + 1 = 2, n = y + 1 = 3, z = m + n = 5]

SLIDE 31

Self-adjusting Computation

[Acar ’05]

  • Spreadsheet meets functional programming
  • Adjust the output with minimal re-computation

let x = 3, y = 2, m = x + 1, n = y + 1 in m + n

[data-flow graph: x = 3, y = 2, m = 4, n = 3, z = 7]

SLIDE 32

Self-adjusting Computation

[Acar ’05]

  • Spreadsheet meets functional programming
  • Adjust the output with minimal re-computation
  • ↓ re-computation = ↑ performance
  • Dynamic dependency graph + memoisation

let x = 3, y = 2, m = x + 1, n = y + 1 in m + n

[data-flow graph: x = 3, y = 2, m = 4, n = 3, z = 7]
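A minimal sketch of the idea behind these slides, in plain Python and with our own hypothetical Cell/propagate API rather than Acar's library: each cell records its inputs and its computation, and changing an input followed by propagation recomputes only the cells that depend on it.

# Tiny self-adjusting-computation sketch (illustrative only): a dependency
# graph of cells where changing an input and propagating recomputes only the
# affected cells.
class Cell:
    def __init__(self, value=None, compute=None, inputs=()):
        self.compute = compute        # None for input cells
        self.inputs = list(inputs)
        self.dependents = []
        for c in self.inputs:
            c.dependents.append(self)
        self.value = value if compute is None else compute(*[c.value for c in self.inputs])

    def set(self, value):             # change an input cell
        self.value = value
        _dirty.update(self.dependents)

_dirty = set()

def propagate():                      # recompute only cells reachable from changes
    while _dirty:
        cell = _dirty.pop()
        new = cell.compute(*[c.value for c in cell.inputs])
        if new != cell.value:
            cell.value = new
            _dirty.update(cell.dependents)

# The example from the slides: x = 1, y = 2, m = x + 1, n = y + 1, z = m + n.
x = Cell(1)
y = Cell(2)
m = Cell(compute=lambda a: a + 1, inputs=[x])
n = Cell(compute=lambda a: a + 1, inputs=[y])
z = Cell(compute=lambda a, b: a + b, inputs=[m, n])
print(z.value)      # 5

x.set(3)            # as on SLIDE 31: only m and z are recomputed
propagate()
print(z.value)      # 7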

SLIDE 33

DGoIM & SAC

[string diagram: initial machine graph for the program below]

let x = {1} in let y = x + 2 in (set x to 3); prop; y

SLIDES 34–68

DGoIM & SAC

let x = {1} in let y = x + 2 in (set x to 3); prop; y

[string diagrams: successive execution steps of the program above on the DGoIM; the figures (omitted here) show the machine graph being rewritten as x is set to 3, the change is propagated, and y is re-read, ending with the value 5]

SLIDE 69

Conclusion

  • Soundness of execution & garbage collection, equational laws (beta, eta)
  • Difficult to reason about using alternative styles of semantics
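For reference, beta and eta here are the usual laws, written with the substitution notation used earlier in the deck:

β:  (λx. M) N = M[x ↦ N]
η:  λx. (M x) = M   (provided x does not occur free in M)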
SLIDE 70

Visualisers

  • for machine learning: http://www.cs.bham.ac.uk/~drg/goa/visualiser/
  • for self-adjusting computation: https://cwtsteven.github.io/GoI-SAC-Visualiser/