On Higher-Order Probabilistic Computation: Relational Reasoning, Termination, and Bayesian Programming
Ugo Dal Lago
(Based on joint work with Michele Alberti, Raphaëlle Crubillé, Charles Grellois, Davide Sangiorgi, . . . )
IFIP WG 2.2 Annual Meeting
Probabilistic Models
◮ The environment is assumed to behave not deterministically, but probabilistically.
◮ Crucial when modeling uncertainty.
◮ Useful to handle complex domains.
◮ Example: a four-state chain over q0, q1, q2, q3 (the original transition diagram is not recoverable from the text; only the probabilities 1/4, 3/4, 1, 1/2, 1/2, 1/3, 2/3 survive).
◮ Abstractions:
    ◮ (Labelled) Markov Chains.
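The slide's transition diagram does not survive extraction; only the states and a few probabilities do. As a purely hypothetical reconstruction in that spirit (Python, not from the talk; the transition table below is an assumption, reusing the surviving probabilities, and is not the original diagram), one step of a Markov chain is just sampling from the distribution attached to the current state:

```python
import random

# hypothetical 4-state Markov chain reusing the probabilities
# that survive in the text (1/4, 3/4, 1, 1/2, 1/2, 1/3, 2/3)
P = {
    "q0": {"q1": 1 / 4, "q2": 3 / 4},
    "q1": {"q1": 1.0},                 # absorbing state
    "q2": {"q2": 1 / 2, "q3": 1 / 2},
    "q3": {"q1": 1 / 3, "q2": 2 / 3},
}

def step(state):
    # sample a successor state according to the distribution P[state]
    succs, probs = zip(*P[state].items())
    return random.choices(succs, weights=probs)[0]

# sanity check: every row is a probability distribution
assert all(abs(sum(row.values()) - 1.0) < 1e-9 for row in P.values())
```

Iterating `step` from `q0` produces one run of the chain; labels (as in a labelled Markov chain) would simply annotate each transition.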
Probabilistic Models
◮ Applications: robotics, artificial intelligence, natural language processing.
Randomized Computation
◮ Algorithms and automata are assumed to have the ability to sample from a distribution [dLMSS1956, R1963].
◮ This is a powerful tool when solving computational problems.
◮ Abstractions:
    ◮ Randomized algorithms; ◮ Probabilistic Turing machines; ◮ Labelled Markov chains.
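The slide's example image is lost. As a stand-in illustration of a randomized algorithm (Python, not from the talk), here is Freivalds' check: it verifies a claimed matrix product A·B = C in O(n²) time per trial, with one-sided error probability at most 2^(−trials).

```python
import random

def freivalds(A, B, C, trials=20):
    """Probabilistically check whether A @ B == C.
    If A @ B == C the answer is always True; otherwise each trial
    detects the error with probability >= 1/2."""
    n = len(A)
    for _ in range(trials):
        r = [random.randint(0, 1) for _ in range(n)]
        # compute A(Br) and Cr, both in O(n^2), and compare
        Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]
        ABr = [sum(A[i][j] * Br[j] for j in range(n)) for i in range(n)]
        Cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(n)]
        if ABr != Cr:
            return False
    return True
```

The point of the technique: multiplying by a random 0/1 vector replaces an O(n³) (naive) product with a cheap fingerprint, at the price of a small, quantifiable error probability.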
Randomized Computation
◮ Applications: algorithmics, cryptography, program verification.
Higher-Order Computation
◮ Mainly useful in programming.
◮ Functions are first-class citizens:
    ◮ They can be passed as arguments; ◮ They can be obtained as results.
◮ Motivations:
    ◮ Modularity; ◮ Code reuse; ◮ Conciseness.
◮ Models:
    ◮ λ-calculus.
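Functions-as-arguments and functions-as-results in a few lines (Python, not from the talk; `compose` and `twice` are illustrative names):

```python
def compose(f, g):
    # takes two functions, returns a new function: f after g
    return lambda x: f(g(x))

def twice(f):
    # a function built from a function argument
    return compose(f, f)

inc = lambda x: x + 1
assert twice(twice(inc))(0) == 4  # ((inc.inc).(inc.inc))(0)
```

Note how `twice(twice(inc))` reuses `twice` at two different "orders": once on a base function, once on a function that was itself computed.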
Higher-Order Computation
◮ Applications: functional programming, functional data structures, the λ-calculus.
Higher-Order Probabilistic Computation
Does it Make Sense? What Kind of Metatheory Does it Have? Applications?
A timeline (1980–2010): [Saheb-Djaromi], [JonesPlotkin], [JungTix], [DanosHarmer], . . . too many to list.
Outline
Part I: Relational Reasoning
Part II: Bayesian Functional Programming
Part III: Termination
Part I Relational Reasoning
Syntax and Operational Semantics of Λ⊕
◮ Terms: M ::= x | λx.M | MM | M ⊕ M;
◮ Values: V ::= λx.M;
◮ Value Distributions: functions D from values to R[0,1] such that ∑_V D(V) ≤ 1.
◮ Big-step approximation M ⇓ D:
    ◮ M ⇓ ∅;
    ◮ V ⇓ {V^1};
    ◮ if M ⇓ D and N ⇓ E, then M ⊕ N ⇓ (1/2)D + (1/2)E;
    ◮ if M ⇓ K and P[N/x] ⇓ E_P for every λx.P ∈ S(K), then MN ⇓ ∑_{λx.P ∈ S(K)} K(λx.P) · E_P.
◮ Semantics: ⟦M⟧ = sup_{M ⇓ D} D;
◮ Context Equivalence: M ≡ N iff for every context C it holds that ∑⟦C[M]⟧ = ∑⟦C[N]⟧, where
    C ::= [·] | λx.C | CM | MC | C ⊕ M | M ⊕ C
◮ Context Distance: δc(M, N) = sup_C |∑⟦C[M]⟧ − ∑⟦C[N]⟧|.
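The ⇓ rules can be prototyped directly. The sketch below (Python, not from the talk) represents Λ⊕ terms as a small AST and computes a finite approximation of ⟦M⟧ by fuel-bounded unfolding of the rules: values yield a Dirac distribution, ⊕ mixes its branches with weight 1/2, and application substitutes the argument unevaluated, as in the MN rule. Substitution is naive (no capture avoidance), which suffices for the closed examples used here.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Var: name: str
@dataclass(frozen=True)
class Lam: var: str; body: object
@dataclass(frozen=True)
class App: fun: object; arg: object
@dataclass(frozen=True)
class Choice: left: object; right: object

def subst(t, x, s):
    """Naive substitution t[s/x] (no capture avoidance)."""
    if isinstance(t, Var):
        return s if t.name == x else t
    if isinstance(t, Lam):
        return t if t.var == x else Lam(t.var, subst(t.body, x, s))
    if isinstance(t, App):
        return App(subst(t.fun, x, s), subst(t.arg, x, s))
    return Choice(subst(t.left, x, s), subst(t.right, x, s))

def eval_dist(t, fuel):
    """Finite approximation of the value distribution of t; the sup
    over all fuel values is the semantics of t."""
    if fuel == 0:
        return {}
    if isinstance(t, Lam):                 # values: Dirac distribution
        return {t: 1.0}
    dist = {}
    if isinstance(t, Choice):              # M ⊕ N: mix with weight 1/2
        for sub in (t.left, t.right):
            for v, p in eval_dist(sub, fuel - 1).items():
                dist[v] = dist.get(v, 0.0) + 0.5 * p
    elif isinstance(t, App):               # MN: sum over lambdas in K
        for lam, p in eval_dist(t.fun, fuel - 1).items():
            body = subst(lam.body, lam.var, t.arg)
            for v, q in eval_dist(body, fuel - 1).items():
                dist[v] = dist.get(v, 0.0) + p * q
    return dist                            # free variables: stuck, empty

I = Lam("x", Var("x"))
DELTA = Lam("x", App(Var("x"), Var("x")))
OMEGA = App(DELTA, DELTA)
```

For instance, `eval_dist(Choice(I, OMEGA), 50)` returns `{I: 0.5}`: the term converges to I with probability exactly 1/2, while the Ω branch consumes all the fuel without producing a value.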
Examples
◮ Notation: I = λx.x and Ω = ∆∆, where ∆ = λx.xx.
◮ I ⊕ Ω vs. I: not context equivalent, take C = [·]. Context distance? Consider Cn = (λx. x . . . x (n times))[·].
◮ I ⊕ Ω vs. Ω: not context equivalent, take C = [·]. Context distance? Cannot easily amplify.
◮ (λx.I) ⊕ (λx.Ω) vs. λx.(I ⊕ Ω): not context equivalent in CBV, take C = (λx.x(xI))[·]; apparently context equivalent in CBN.
◮ Y1 vs. Y2, where:
    Y1M →∗ M(Y2M) ⊕ M(Y3M)
    Y2M →∗ M(Y1M) ⊕ M(Y3M)
    Y3M →∗ M(Y1M) ⊕ M(Y2M)
A Labelled Markov Chain for Λ⊕
◮ States: terms and values.
◮ A term M, under the action eval, moves to each value V with probability ⟦M⟧(V).
◮ A value λx.N, under an action W (the argument being passed), moves to N{W/x} with probability 1.
Probabilistic Applicative Bisimulation
◮ If λx.M R λx.N, then M{L/x} R N{L/x} for every argument L.
◮ If M R N, then under the action eval, for every equivalence class E of R, ⟦M⟧(E) = ⟦N⟧(E).
Applicative Bisimilarity vs. Context Equivalence
◮ Bisimilarity: the union ∼ of all bisimulation relations.
◮ Is ∼ included in ≡? How can this be proved?
◮ Natural strategy: is ∼ a congruence? If this is the case:
    M ∼ N ⟹ C[M] ∼ C[N] ⟹ ∑⟦C[M]⟧ = ∑⟦C[N]⟧ ⟹ M ≡ N.
◮ This is a necessary sanity check anyway.
◮ The naïve proof by induction fails, due to application: from M ∼ N, one cannot directly conclude that LM ∼ LN.
Howe’s Technique
◮ Starting from a relation R, build its Howe lifting R^H, with R ⊆ R^H.
◮ R^H is a congruence whenever R is an equivalence.
◮ For bisimilarity: ∼ ⊆ ∼^H, ∼^H is a congruence, and a Key Lemma gives ∼^H ⊆ ∼; hence ∼ itself is a congruence.
Our Neighborhood
◮ Λ, where we observe convergence [Abramsky1990, Howe1993]: both soundness (∼ ⊆ ≡) and completeness (≡ ⊆ ∼) hold, in CBN and in CBV.
◮ Λ⊕ with nondeterministic semantics, where we observe convergence, in its may or must flavors [Ong1993, Lassen1998]:

          ∼ ⊆ ≡    ≡ ⊆ ∼
    CBN     ✓        ×
    CBV     ✓        ×
The Probabilistic Case
◮ Λ⊕ with probabilistic semantics:

          ∼ ⊆ ≡    ≡ ⊆ ∼
    CBN     ✓        ×
    CBV     ✓        ✓

◮ Counterexample for CBN: (λx.I) ⊕ (λx.Ω) and λx.(I ⊕ Ω) are context equivalent but not bisimilar.
◮ Where do these discrepancies come from? From testing!
◮ Bisimulation can be characterized by testing equivalence as follows:

    Calculus    Testing
    Λ           T ::= ω | a · T
    PΛ⊕         T ::= ω | a · T | T, T
    NΛ⊕         T ::= ω | a · T | ∧_{i∈I} Ti | . . .
The Probabilistic Case
◮ Λ⊕ with probabilistic semantics, now comparing similarity (written ≲) with the context preorder ≤:

          ≲ ⊆ ≤    ≤ ⊆ ≲
    CBN     ✓        ×
    CBV     ✓        ×

◮ Probabilistic simulation can be characterized by testing as follows:

    T ::= ω | a · T | T, T | T ∨ T

◮ Full abstraction can be recovered by endowing Λ⊕ with parallel disjunction [CDLSV2015]:

          ≲ ⊆ ≤    ≤ ⊆ ≲
    CBN     ✓        ×
    CBV     ✓        ✓
Context Distance: the Affine Case [CDL2015]
◮ Let us consider a simple fragment of Λ⊕ first.
◮ Preterms: M, N ::= x | λx.M | MM | M ⊕ M | Ω;
◮ Terms: any preterm M such that Γ ⊢ M, where the affine typing rules are:
    Γ, x ⊢ x;    if x, Γ ⊢ M then Γ ⊢ λx.M;    if Γ ⊢ M and ∆ ⊢ N then Γ, ∆ ⊢ MN;    if Γ ⊢ M and Γ ⊢ N then Γ ⊢ M ⊕ N.
◮ Behavioural Distance δb: the metric analogue to bisimilarity.
◮ Trace Distance δt: the maximum distance induced by traces, i.e., sequences of actions: δt(M, N) = sup_T |Pr(M, T) − Pr(N, T)|.
◮ Soundness and Completeness Results:

    δb ≤ δc: ✓    δc ≤ δb: ×    δt ≤ δc: ✓    δc ≤ δt: ✓

◮ Example: δt(I, I ⊕ Ω) = δt(I ⊕ Ω, Ω) = 1/2.
Context Distance: the General Case [CDL2016]
◮ The LMC we have worked with so far induces unsound metrics for Λ⊕. . .
◮ . . . because it does not adequately model copying.
◮ A Tuple LMC:
    ◮ Preterms: M ::= x | λx.M | λ!x.M | MM | M ⊕ M | !M
    ◮ Terms: any preterm M such that Γ ⊢ M, where:
        !Γ, x ⊢ x;    !Γ, !x ⊢ x;    if x, Γ ⊢ M then Γ ⊢ λx.M;    if !x, Γ ⊢ M then Γ ⊢ λ!x.M;    if !Γ ⊢ M then !Γ ⊢ !M;    if Γ, !Θ ⊢ M and ∆, !Θ ⊢ N then Γ, ∆, !Θ ⊢ MN;    if Γ ⊢ M and Γ ⊢ N then Γ ⊢ M ⊕ N.
    ◮ States: sequences of terms, rather than terms.
    ◮ Actions model not only parameter passing, but also copying of terms.
◮ Soundness and Completeness Results: both δt ≤ δc and δc ≤ δt hold.
◮ Examples: δt(!(I ⊕ Ω), !Ω) = 1/2, while δt(!(I ⊕ Ω), !I) = 1.
◮ Trivialisation: the context distance collapses to an equivalence in strongly normalising fragments, or in the presence of parallel disjunction.
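The second example shows how copying amplifies small distances: a context that takes n copies of I ⊕ Ω and observes convergence of all of them succeeds with probability (1/2)^n, while the same test on !I succeeds with probability 1, for an observable gap of 1 − (1/2)^n. A quick numeric check (Python, not from the talk; the "test n copies" context is the standard amplification argument, sketched here rather than formalized):

```python
# gap observed by a context testing n independent copies:
# !I passes with probability 1, !(I ⊕ Ω) with probability (1/2)^n
advantages = [1 - 0.5 ** n for n in range(1, 40)]

# the gap grows monotonically and approaches 1, so the supremum
# over all contexts is 1, i.e. the context distance is maximal
assert advantages == sorted(advantages)
assert max(advantages) > 0.999
```

Without `!`, the affine discipline forbids using the tested term more than once, which is exactly why "cannot easily amplify" held in the affine case.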
What would a sensible notion of distance look like?
Part II Bayesian Functional Programming
normalize(
  let x = sample(bern(5/7)) in
  let r = if x then 10 else 3 in
  observe 4 from poisson(r);
  return(x))

◮ Prior: x = true with probability 5/7, x = false with probability 2/7.
◮ Rate: r = 10 when x = true, r = 3 when x = false.
◮ Likelihood of the observation: poisson(10)(4) ≈ 0.019, poisson(3)(4) ≈ 0.168.
◮ Posterior: x = true with probability ≈ 0.22, x = false with probability ≈ 0.78.
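The numbers on the slide can be reproduced by hand: weight each branch of the prior by the Poisson likelihood of the observation, then renormalize. A direct computation (Python, not from the talk):

```python
import math

def poisson_pmf(k, lam):
    # probability of observing k under a Poisson with rate lam
    return math.exp(-lam) * lam ** k / math.factorial(k)

prior = {True: 5 / 7, False: 2 / 7}   # x ~ bern(5/7)
rate = {True: 10, False: 3}           # r = 10 if x then else 3

# unnormalized posterior weights after observing 4 from poisson(r)
weight = {x: prior[x] * poisson_pmf(4, rate[x]) for x in prior}
Z = sum(weight.values())              # model evidence (normalize)
posterior = {x: w / Z for x, w in weight.items()}
# posterior[True] ≈ 0.22, posterior[False] ≈ 0.78
```

Conditioning reverses the prior: although x = true is almost three times as likely a priori, observing 4 is far more probable under rate 3 than under rate 10, so the posterior favors x = false.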
Bayesian Functional Programming
◮ Languages: ANGLICAN, HAKARU.
normalize(
  let x = sample(gauss(0, 1)) in
  observe d from exp(1/f(x));
  return(x))
Bayesian Programming: Semantics
◮ Giving semantics to programming languages like Anglican or Hakaru is nontrivial:
    ◮ Real numbers; ◮ Sampling from continuous distributions; ◮ Conditioning.
◮ Key ingredients:
    ◮ In M ⇓ D, we need D to be a measure, because the set of terms is not countable anymore.
    ◮ Terms must thus be equipped with the structure of a measurable space.
    ◮ The application rule changes accordingly: from the countable sum
        if M ⇓ K and P[N/x] ⇓ E_P for every λx.P ∈ S(K), then MN ⇓ ∑_{λx.P ∈ S(K)} K(λx.P) · E_P
      to a Lebesgue integral
        if M ⇓ K and P[N/x] ⇓ E_P for every λx.P ∈ S(K), then MN ⇓ ∫ E_P · dK(λx.P).
    ◮ This Lebesgue integral does not necessarily exist: we must ensure that ⇓ gives rise to a stochastic kernel.
    ◮ In the presence of conditioning, we need even more.
Part III Termination
The Landscape: Type Theory
◮ Simple Types: τ ::= ι | τ → τ
    ◮ Sound for termination, in the absence of recursion. ◮ Poor expressive power. ◮ Intuitionistic Logic.
◮ Polymorphic Types: τ ::= · · · | α | ∀α.τ
    ◮ Second-order Logic. ◮ Very expressive, extensionally. ◮ Still poor, intensionally.
◮ Intersection Types: τ ::= · · · | τ ∧ τ
    ◮ Motivated by semantics. ◮ Complete for termination. ◮ Type inference is undecidable.
◮ Sized Types: τ ::= · · · | ι[ξ]
    ◮ Reasonably expressive, intensionally. ◮ Type inference remains decidable.
The Landscape: Recursion Theory
◮ Evaluation: in the deterministic case, Ms →∗ Ns; in the probabilistic case, ⟦Ms⟧ = Ds, where ∑Ds can be smaller than 1.
◮ Termination: ∃Ns ∈ NF; undecidable, Σ⁰₁-complete. Probabilistic analogue (almost-sure termination): ∑Ds = 1; Π⁰₂-complete.
◮ Uniform Termination: ∀s.∃Ns ∈ NF; Π⁰₂-complete. Probabilistic analogue: ∀s.∑Ds = 1; Π⁰₂-complete.
Deterministic Sized Types
◮ Pure λ-calculus with simple types is terminating.
    ◮ This can be proved in many ways, including by reducibility:
        ◮ For every type τ, define a set of reducible terms Red_τ.
        ◮ Prove that all reducible terms are normalizing. . .
        ◮ . . . and that all typable terms are reducible.
    ◮ But useless as a programming language.
◮ What if we endow it with full recursion as a fix binder?

    (fix x.M)V → M{fix x.M/x}V

◮ All the termination properties are lost, for very good reasons.
◮ Is everything lost? NO!
◮ Intuition: in fix f.λx : ι. M, recursive calls of the shape f(x − 1), f(x − 2), f(x − 3) are GOOD, while calls of the shape f(x) or f(x + 1) are BAD.
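The GOOD/BAD intuition is plain structural descent. A throwaway illustration (Python, not from the talk; `good` is an illustrative name): recursion on strictly smaller arguments bottoms out, while a single recursive call at x or x + 1 already diverges.

```python
def good(x):
    # every recursive call is on a strictly smaller argument,
    # so the recursion reaches the base case and terminates
    if x <= 0:
        return 0
    return 1 + good(x - 1)

# bad(x) = 1 + bad(x + 1) would never reach a base case (in a real
# interpreter it dies on the recursion limit), so we do not run it.
```

Sized types make exactly this descent visible in the type of the recursive function.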
Deterministic Sized Types, Technically
◮ Types: ξ ::= a | ω | ξ + 1 (index terms);    τ ::= ι[ξ] | τ → τ.
◮ Typing Fixpoints:
    if Γ, x : ι[a] → τ ⊢ M : ι[a + 1] → τ, then Γ ⊢ fix x.M : ι[ξ] → τ.
◮ Quite Powerful: can type many forms of structural recursion.
◮ Termination: proved by reducibility. . . but of an indexed form:
    ◮ Reducibility sets are of the form Red^θ_τ, where θ is an environment for index variables.
    ◮ The proof of reducibility for fix x.M is rather delicate.
◮ Type Inference: it is indeed decidable, but nontrivial.
Probabilistic Termination
◮ Examples:
    fix f.λx.if x > 0 then (if FairCoin then f(x − 1) else f(x + 1));   (unbiased random walk)
    fix f.λx.if x > 0 then (if BiasedCoin then f(x − 1) else f(x + 1));   (biased random walk)
    fix f.λx.if BiasedCoin then f(x + 1) else x.
◮ Non-Examples:
    fix f.λx.if FairCoin then f(x − 1) else (f(x + 1); f(x + 1));   (unbiased random walk, with two upward calls)
    fix f.λx.if BiasedCoin then f(x + 1) else f(x − 1);   (biased random walk, the “wrong” way)
◮ Probabilistic termination is thus:
    ◮ Sensitive to the actual distribution from which we sample.
    ◮ Sensitive to how many recursive calls we perform.
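The random-walk examples can be checked empirically. A simulation sketch (Python, not from the talk; `p_down` stands for the coin's bias toward the decreasing call, and the value 2/3 for the biased coin is an arbitrary choice, since the slide does not fix the bias):

```python
import random

def walk_terminates(p_down, start=10, max_steps=1_000_000):
    """Simulate fix f.λx. if x > 0 then (f(x−1) with prob p_down,
    else f(x+1)); report whether the walk reached 0 within the cap."""
    x = start
    for _ in range(max_steps):
        if x == 0:
            return True
        x += -1 if random.random() < p_down else 1
    return x == 0

random.seed(42)
# biased toward x − 1: termination is almost sure, and fast in practice
biased_down = sum(walk_terminates(2 / 3) for _ in range(200))
```

Here `biased_down` comes out as 200 of 200 runs. The unbiased walk (p_down = 1/2) also terminates almost surely, but with infinite expected time, so a step cap is a poor observer for it; with the "wrong" bias (p_down = 1/3) most runs never return at all.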
One-Counter Blind Markov Chains
◮ They are automata of the form (Q, δ) where:
    ◮ Q is a finite set of states.
    ◮ δ : Q → Dist(Q × {−1, 0, 1}).
◮ They are a very special form of One-Counter Markov Decision Processes [BBEK2011]:
    ◮ The model is fully probabilistic: there is no nondeterminism.
    ◮ The counter value is ignored.
◮ The probability of reaching a configuration where the counter is 0 can be approximated arbitrarily well in polynomial time.
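For intuition, consider the one-state special case: a blind walk that decrements the counter with probability p and increments it with probability 1 − p. The probability t of ever driving the counter down by one is then the least solution of t = p + (1 − p)·t², and reaching counter 0 from value k has probability t^k. A small fixpoint iteration (Python, not from the talk; the one-state chain is a simplification, not the general OCBMC algorithm):

```python
def down_step_prob(p_down, iters=10_000):
    """Least fixpoint of t = p_down + (1 - p_down) * t**2,
    computed by Kleene iteration from t = 0."""
    t = 0.0
    for _ in range(iters):
        t = p_down + (1.0 - p_down) * t * t
    return t

# upward drift (p_down = 1/3): the counter reaches 0 from value 1
# with probability exactly 1/2 (least root of t = 1/3 + (2/3) t^2)
assert abs(down_step_prob(1 / 3) - 0.5) < 1e-6
# downward drift (p_down = 2/3): reaching 0 is almost sure
assert abs(down_step_prob(2 / 3) - 1.0) < 1e-6
```

Iterating from 0 converges to the *least* fixpoint, which is the right one: t = 1 is always a solution, but it overstates the probability when the drift is upward.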
Probabilistic Sized Types [DLGrellois2017]
◮ Basic Idea: craft a sized-type system in such a way as to
mimic the recursive structure of programs by an OCBMC.
◮ Judgments:

    Γ | ∆ ⊢ M : µ

Every higher-order variable occurs at most once.
◮ Typing Fixpoints:

    Γ | x : σ ⊢ V : ι[a + 1] → τ    OCBMC(σ) terminates
    ───────────────────────────────────────────────────
    Γ | x : σ ⊢ V : ι[ξ] → τ

This is sufficient for typing:
◮ unbiased random walks;
◮ biased random walks.
◮ Typing Probabilistic Choice:

    Γ | ∆ ⊢ M : τ    Γ | Ω ⊢ N : ρ
    ──────────────────────────────
    Γ | ½∆ + ½Ω ⊢ M ⊕ N : ½τ + ½ρ

◮ Termination:
◮ by a quantitative, nontrivial refinement of reducibility;
◮ reducibility sets are now of the form Red^{θ,p}_τ, where p stands for the probability of being reducible;
◮ reducibility sets are continuous:

    Red^{θ,p}_τ = ⋂_{q<p} Red^{θ,q}_τ
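The random walks that the fixpoint rule certifies can be sketched as ordinary recursive programs. The following Python rendering (an assumption of this sketch, not the talk's calculus) is the biased random walk: `walk` calls itself on n−1 or n+1, choosing the former with probability 2/3, so the associated OCBMC terminates almost surely:

```python
import random

def walk(n, fuel=500):
    """Biased random walk as a recursive probabilistic program.
    The fuel parameter only guards the simulation against (extremely
    unlikely) long runs; it is not part of the modeled program."""
    if n == 0:
        return True
    if fuel == 0:
        return False
    if random.random() < 2 / 3:
        return walk(n - 1, fuel - 1)   # counter decrement, prob. 2/3
    return walk(n + 1, fuel - 1)       # counter increment, prob. 1/3

random.seed(1)
trials = 1_000
hits = sum(walk(10) for _ in range(trials))
print(f"{hits}/{trials} runs reached 0")
```

Each recursive call mirrors one OCBMC transition, which is exactly the correspondence the fixpoint rule exploits.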
Deterministic Intersection Types
◮ Question: what are simple types missing as a way to
precisely capture termination?
◮ There are very simple examples of normalizing terms which cannot be
typed: with ∆ = λx.xx, the term ∆(λx.x).
◮ Types:

    τ ::= ⋆ | A → B        A ::= {τ1, . . . , τn}

◮ Typing Rules (examples):

    {Γ ⊢ M : τi}1≤i≤n
    ─────────────────────────
    Γ ⊢ M : {τ1, . . . , τn}

    Γ ⊢ M : {A → B}    Γ ⊢ N : A
    ────────────────────────────
    Γ ⊢ MN : B

◮ Termination:
◮ again by reducibility.
◮ Completeness:
◮ by subject expansion, the dual of subject reduction.
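That ∆(λx.x) really does normalize, despite having no simple type, can be checked mechanically. Below is a minimal evaluator (an illustrative sketch, not from the talk; weak-head reduction suffices for this particular term) over terms encoded as nested tuples:

```python
# Terms: ("var", name) | ("lam", name, body) | ("app", fun, arg).
delta = ("lam", "x", ("app", ("var", "x"), ("var", "x")))  # λx.xx
ident = ("lam", "x", ("var", "x"))                          # λx.x

def subst(t, x, v):
    """Substitute v for x in t (capture cannot occur in this closed example)."""
    tag = t[0]
    if tag == "var":
        return v if t[1] == x else t
    if tag == "lam":
        return t if t[1] == x else ("lam", t[1], subst(t[2], x, v))
    return ("app", subst(t[1], x, v), subst(t[2], x, v))

def whnf(t, fuel=100):
    """Weak-head reduction with a fuel bound (t might diverge in general)."""
    while fuel and t[0] == "app":
        f = whnf(t[1], fuel)
        if f[0] != "lam":
            return ("app", f, t[2])
        t, fuel = subst(f[2], f[1], t[2]), fuel - 1
    return t

result = whnf(("app", delta, ident))
print(result)  # ("lam", "x", ("var", "x")), i.e. the identity
```

The term reduces in two beta steps to λx.x, yet no simple type can be assigned to ∆, since x would need both an arrow type and its own argument type; intersection types resolve exactly this.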
Oracle Intersection Types [BreuvartDL2017]
◮ Probabilistic choice can be seen as a form of read operation:

    M ⊕ N = if BitInput then M else N

◮ Types:

    τ ::= ⋆ | A → s · B        A ::= {τ1, . . . , τn}        s ∈ {0, 1}∗

◮ Typing Rules (examples):

    Γ ⊢ M : s · A
    ───────────────────
    Γ ⊕ N : 0s · A

    Γ ⊢ M : r · {A → s · B}    Γ ⊢ N : q · A
    ────────────────────────────────────────
    Γ ⊢ MN : (rqs) · B

◮ Termination and Completeness:
◮ formulated in a rather unusual way;
◮ proved as usual, but relative to a single probabilistic branch:

    P(M ↓) = Σ_{⊢ M : s · ⋆} 2^{−|s|}

This is unavoidable, due to recursion theory.
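The counting behind P(M ↓) = Σ_{⊢ M : s·⋆} 2^{−|s|} can be made concrete: each probabilistic choice reads one oracle bit (0 for the left branch, as in the rule for ⊕), and each terminating branch with oracle string s contributes 2^{−|s|}. The sketch below uses a toy term language (an assumption of this example): `"val"` for a normal form, `"omega"` for divergence, and `("oplus", M, N)` for choice:

```python
from fractions import Fraction

def branches(t, s=""):
    """Yield the oracle strings of all terminating branches of t."""
    if t == "val":
        yield s          # this branch terminates with oracle string s
    elif t == "omega":
        return           # this branch diverges: no contribution
    else:
        _, m, n = t
        yield from branches(m, s + "0")  # left choice reads bit 0
        yield from branches(n, s + "1")  # right choice reads bit 1

# Example term: val ⊕ (val ⊕ Ω); strings "0" and "10" terminate.
term = ("oplus", "val", ("oplus", "val", "omega"))
p = sum(Fraction(1, 2 ** len(s)) for s in branches(term))
print(p)  # 3/4 = 1/2 + 1/4
```

For genuine terms the set of terminating oracle strings need not be computable, which is the recursion-theoretic obstacle the slide alludes to; here the toy fragment makes the sum finite.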
Intersection Types and Computations
(Diagram: a deterministic computation reduces M to a single value V, which intersection types capture; a probabilistic computation branches from M into many possible values V, W, . . ., and oracle intersection types capture each individual branch.)
Monadic Intersection Types [BDL2017]
◮ They are a combination of oracle types and sized types.
◮ Intersections are needed for preciseness.
◮ Distributions of types make it possible to analyse