Propagators
Edward Kmett Yow! LambdaJam 2016
Semilattices
Commutative: a ∨ b = b ∨ a
Associative: (a ∨ b) ∨ c = a ∨ (b ∨ c)
Idempotent: a ∨ a = a
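These three laws can be written down as a Haskell class with executable property checks. This is a sketch; the class and function names here are illustrative, not from a particular library.

```haskell
import qualified Data.Set as Set

-- A join-semilattice: a type with a commutative, associative,
-- idempotent binary join.
class JoinSemilattice a where
  (\/) :: a -> a -> a

-- Sets under union are the canonical example.
instance Ord a => JoinSemilattice (Set.Set a) where
  (\/) = Set.union

-- The three laws as testable properties:
lawCommutative :: (Eq a, JoinSemilattice a) => a -> a -> Bool
lawCommutative a b = (a \/ b) == (b \/ a)

lawAssociative :: (Eq a, JoinSemilattice a) => a -> a -> a -> Bool
lawAssociative a b c = ((a \/ b) \/ c) == (a \/ (b \/ c))

lawIdempotent :: (Eq a, JoinSemilattice a) => a -> Bool
lawIdempotent a = (a \/ a) == a
```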
[Diagram: a lattice of concepts — ⟘ at the bottom; boy, girl, man, woman, child, adult, male, female in between; ⟙ at the top]
Every function in Haskell is monotone
A propagator is a monotone function between join-semilattices. A propagator network is a hypergraph with propagators for hyperedges and join-semilattices for nodes.
When a cell's value increases, schedule the propagators that lead out of that node to fire.
When a propagator fires, compute new values from its source cells, then join them with the values in the targets.
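This fire-and-join loop can be sketched as a minimal single-threaded network over IORefs. All names here are illustrative; a real implementation would need concurrency and fairness.

```haskell
import Data.IORef

-- A cell holds a semilattice value plus its outgoing propagators.
data Cell a = Cell { value :: IORef a, outgoing :: IORef [IO ()] }

newCell :: a -> IO (Cell a)
newCell a = Cell <$> newIORef a <*> newIORef []

-- Join new information into a cell; fire its propagators only if
-- the value actually increased (redundant firings are no-ops).
writeCell :: Eq a => (a -> a -> a) -> Cell a -> a -> IO ()
writeCell join c a = do
  old <- readIORef (value c)
  let new = join old a
  if new == old
    then pure ()
    else do
      writeIORef (value c) new
      readIORef (outgoing c) >>= sequence_

-- Attach a unary monotone propagator from source to target.
link :: Eq b => (b -> b -> b) -> (a -> b) -> Cell a -> Cell b -> IO ()
link join f src dst = do
  let fire = readIORef (value src) >>= writeCell join dst . f
  modifyIORef (outgoing src) (fire :)
  fire
```

For example, with `max` as the join on Int, writing 5 into a cell linked through `(*2)` drives the target up to 10.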
Goal: propagation terminates and yields a deterministic result, regardless of scheduling strategy, redundant firings, or evaluation order.
Given a₁ ⩽ a₂ ⩽ a₃ ⩽ …, there exists n such that aₙ = aₙ₊₁ = aₙ₊₂ = …
Every strictly ascending sequence of elements terminates.
If every cell satisfies the ascending chain condition, then naïve propagation terminates and yields a deterministic result, regardless of scheduling strategy, redundant firings, or evaluation order.
Lets you:
Par computations yield monotone effects
Problem: Par does not play nice.
readIVar :: IVar s a -> Par s a
but we want
readIVar :: IVar s a -> a
Par doesn't mix well with lazy IO, or even laziness in general.
Solution: Build Par with MVars, not a fancy work-stealing deque.
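A sketch of the MVar-based idea: a write-once IVar where reading simply blocks until the value arrives. The names (newIVar, putIVar, readIVar) are illustrative, not the talk's actual implementation.

```haskell
import Control.Concurrent (forkIO)
import Control.Concurrent.MVar

newtype IVar a = IVar (MVar a)

newIVar :: IO (IVar a)
newIVar = IVar <$> newEmptyMVar

-- Write-once semantics: a second put is an error.
putIVar :: IVar a -> a -> IO ()
putIVar (IVar m) a = do
  won <- tryPutMVar m a
  if won then pure () else error "IVar written twice"

-- readMVar blocks until the value is present and leaves it in place,
-- so every reader sees the same single value.
readIVar :: IVar a -> IO a
readIVar (IVar m) = readMVar m
```

A reader started before the writer just blocks until the put, which is exactly the monotone, deterministic behavior the network needs.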
Problem: Too much overhead! Solution: We can build capability-local MVars by hacking up custom primitives and get most of the performance of the unthreaded runtime system while threaded.
Problem: Too many listeners get awoken upon every update! Solution: Decompose them into smaller LVars and finer-grained tasks.
Solution: a smaller set of core primitives and more threads, reducing contention:
publish-subscribe
Each cell maintains a queue of listeners.
We just need to build a whole new class of wait-free (except for GC), population-oblivious, but capability-aware algorithms in order to execute these primitives efficiently as the number of cores rises! (Out of scope for this talk)
[Diagram: three cells, each holding the flat lattice of booleans — ⟘ below T and F]
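The flat boolean lattice from the diagram is easy to write down directly. This sketch adds a ⟙ element for conflicting information, matching the lattice pictures earlier in the talk; the names are illustrative.

```haskell
-- The flat lattice of booleans: Bot (⟘) below both values,
-- Top (⟙) above them representing contradiction.
data Flat = Bot | Val Bool | Top deriving (Eq, Show)

joinFlat :: Flat -> Flat -> Flat
joinFlat Bot x = x
joinFlat x Bot = x
joinFlat Top _ = Top
joinFlat _ Top = Top
joinFlat (Val a) (Val b)
  | a == b    = Val a
  | otherwise = Top   -- disagreement: contradictory information
```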
Promises, SAT Solving, Datalog, CvRDTs, Constraint Programming, Unification, Interval Arithmetic, Integer Linear Programming, Cone Programming, Hybrid Constraint/Linear Programming, Functional Reactive Programming, Probabilistic Programming, Provenance Tracking, Incremental Programming, unamb, …
Idea: Steal the features that make dedicated solvers for each of these specialized domains fast and apply them to the other domains!
We can borrow the watched-literal techniques evolved in zChaff and since honed in solvers such as miniSAT.
(x ⋁ ¬y ⋁ z) ⋀ (¬x ⋁ y ⋁ ¬z) ⋀ (x ⋁ ¬z ⋁ w)
Unit Propagation: (x) implies x = True
Empty Clauses: () implies we need to learn a clause and backtrack.
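Unit propagation itself fits in a few lines. In this sketch (names illustrative), a literal is a nonzero Int, negation is arithmetic negation, and clauses are assumed duplicate-free.

```haskell
import Data.List (delete)

type Lit    = Int    -- positive = variable, negative = its negation
type Clause = [Lit]

-- Assign a literal true: satisfied clauses drop out, and the
-- falsified literal is removed from the remaining clauses.
assign :: Lit -> [Clause] -> [Clause]
assign l = map (delete (negate l)) . filter (l `notElem`)

-- Repeatedly find unit clauses and propagate them. Returns the
-- forced literals and the residual clauses, or Nothing when an
-- empty clause (conflict) appears.
unitPropagate :: [Clause] -> Maybe ([Lit], [Clause])
unitPropagate cs
  | any null cs = Nothing                 -- empty clause: backtrack
  | otherwise = case [l | [l] <- cs] of
      []      -> Just ([], cs)            -- no units left
      (l : _) -> do
        (ls, rest) <- unitPropagate (assign l cs)
        pure (l : ls, rest)
```

For example, the clauses (x) ⋀ (¬x ⋁ y) ⋀ (¬y ⋁ z) force x, y, and z in turn.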
Most propagators can't produce anything until all of their inputs are non-⊥: p(…,⊥,…) = ⊥
So watch just one or two arguments: wake the propagator when those arguments increase, not when any argument increases.
When all of the inputs of a propagator are covered by top, the propagator can never fire again and can be removed from the network. p(x₁,x₂,…,xₙ), ∀i. xᵢ ⋖ ⟙
Input:
  A set of variables X
  A set of domains D(x) for each variable x in X. D(x) contains vx0, vx1, …, vxn, the possible values of x
  A set of unary constraints R1(x) on variable x that must be satisfied
  A set of binary constraints R2(x, y) on variables x and y that must be satisfied

function ac3 (X, D, R1, R2)
    for each x in X
        D(x) := { x in D(x) | R1(x) }
    worklist := { (x, y) | there exists a relation R2(x, y) or a relation R2(y, x) }
    do
        select any arc (x, y) from worklist
        worklist := worklist - (x, y)
        if arc-reduce (x, y)
            if D(x) is empty
                return failure
            else
                worklist := worklist + { (z, x) | z != y and there exists a relation R2(x, z) or a relation R2(z, x) }
    while worklist not empty

function arc-reduce (x, y)
    bool change = false
    for each vx in D(x)
        find a value vy in D(y) such that vx and vy satisfy the constraint R2(x, y)
        if there is no such vy
            D(x) := D(x) - vx
            change := true
    return change
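The same algorithm transcribes almost directly into Haskell. This is a sketch over Maps of domains and a direction-aware binary-constraint predicate; all names are illustrative.

```haskell
import qualified Data.Map as Map

type Var     = String
type Val     = Int
type Domains = Map.Map Var [Val]

-- arc-reduce: drop values of x that have no support in y's domain.
arcReduce :: (Var -> Var -> Val -> Val -> Bool)
          -> Domains -> Var -> Var -> (Domains, Bool)
arcReduce r2 ds x y = (Map.insert x keep ds, length keep /= length dx)
  where
    dx   = ds Map.! x
    dy   = ds Map.! y
    keep = [vx | vx <- dx, any (r2 x y vx) dy]

-- AC-3 proper: a worklist of arcs, requeueing the neighbors of any
-- variable whose domain shrinks. Nothing signals an empty domain.
ac3 :: (Var -> Var -> Val -> Val -> Bool)  -- binary constraints
    -> [(Var, Var)]                        -- constrained pairs
    -> Domains -> Maybe Domains
ac3 r2 arcs ds0 = go allArcs ds0
  where
    allArcs = arcs ++ [(b, a) | (a, b) <- arcs]
    go [] ds = Just ds
    go ((x, y) : wl) ds
      | not changed          = go wl ds'
      | null (ds' Map.! x)   = Nothing
      | otherwise            =
          go (wl ++ [(z, x) | (x', z) <- allArcs, x' == x, z /= y]) ds'
      where (ds', changed) = arcReduce r2 ds x y
```

With the single constraint x < y over domains {1,2,3}, AC-3 prunes D(x) to {1,2} and D(y) to {2,3}.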
Arc consistency (AC-3) is a form of propagation.
ancestor(X,Y) :- parent(X,Y).
ancestor(X,Z) :- ancestor(X,Y), parent(Y,Z).
parent(bob, nancy).
parent(nancy, drew).
parent(dan, nancy).
ancestor(X,Y) :- parent(X,Y).
ancestor(X,Z) :- ancestor(X,Y), parent(Y,Z).
Semi-naïve: Δₙ₊₁ancestor(X,Z) :- Δₙancestor(X,Y), parent(Y,Z).
[Diagram: the Δ of the ancestor table joined against the parent table, column 2 to column 1]
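Semi-naïve evaluation of this program can be sketched with Sets: each round joins only the newly derived Δ facts against parent, rather than the whole accumulated relation. Names here are illustrative.

```haskell
import qualified Data.Set as Set

type Rel = Set.Set (String, String)

parentRel :: Rel
parentRel = Set.fromList
  [("bob","nancy"), ("nancy","drew"), ("dan","nancy")]

-- One Δ step: extend each newly derived ancestor fact by one
-- parent edge (the recursive rule applied to Δ only).
step :: Rel -> Rel
step delta = Set.fromList
  [ (x, z) | (x, y)  <- Set.toList delta
           , (y', z) <- Set.toList parentRel, y == y' ]

-- Iterate to a fixed point: stop when the Δ of genuinely new
-- facts is empty.
ancestorRel :: Rel
ancestorRel = go parentRel parentRel
  where
    go acc delta
      | Set.null delta' = acc
      | otherwise       = go (acc `Set.union` delta') delta'
      where delta' = step delta `Set.difference` acc
```

On the facts above this derives ancestor(bob, drew) and ancestor(dan, drew) in one extra round, then terminates.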
This construction is good for cells with many outbound propagators.
Store monoidal annotations in the tree. Δ is a prefix sum. Merge updates into the tree on the left.
update in between.
This maximizes the size of the Δs used, reducing the number of times we join against full tables.
This requires us to be able to compute a dynamic topological ordering.
store perfectly, but we don’t need perfection.
for a small k.
aggregation” operations.
a cycle in the graph. If they do, blow up the world.
monotone.
“A program has an eventually consistent, coordination-free execution strategy if and only if it is expressible in (monotonic) Datalog”
— Joe Hellerstein (Consistency and Logical Modularity)
A CvRDT is a cell living in a distributed propagator network.
Conal Elliott’s work on unamb
computations that can make more computations demand driven, which also always build monotone propagators.
Any Questions?
Lock-free: some worker finishes in finite time.
Wait-free: every worker finishes in finite time.
Wait-free bounded O(f(n)): every worker finishes in O(f(n)) time when there are n workers.
Wait-free population-oblivious: wait-free bounded O(1).