

slide-1
SLIDE 1

Continuous models of computation: computability, complexity, universality

Amaury Pouly

Université de Paris, IRIF, CNRS

27 January 2020

1 / 41

slide-2
SLIDE 2

Analog computers : the comeback!

1930 analog computers : ◮ hard to program ◮ highly specialized

2 / 41

slide-3
SLIDE 3

Analog computers : the comeback!

2000 1930 analog computers : ◮ hard to program ◮ highly specialized digital computers : ◮ « easy » to program ◮ general purpose

2 / 41

slide-4
SLIDE 4

Analog computers : the comeback!

2000 1930 analog computers : ◮ hard to program ◮ highly specialized ◮ obsolete? digital computers : ◮ « easy » to program ◮ general purpose

2 / 41

slide-5
SLIDE 5

Analog computers : the comeback!

2030? 2000 1930 analog computers : ◮ hard to program ◮ highly specialized ◮ obsolete? digital computers : ◮ « easy » to program ◮ general purpose ◮ analog? ◮ digital?

2 / 41

slide-6
SLIDE 6

Analog computers : the comeback!

2030? 2000 1930 analog computers : ◮ hard to program ◮ highly specialized ◮ obsolete? digital computers : ◮ « easy » to program ◮ general purpose ◮ analog? ◮ digital? ◮ both!

2 / 41

slide-7
SLIDE 7

Analog computers

Differential Analyser (“the Mathematica of 1920”) ; Admiralty Fire Control Table, British Navy (WW2)

3 / 41

slide-8
SLIDE 8

Church Thesis

Computability  [diagram: discrete models (Turing machine, Boolean circuits, logic, recursive functions, lambda calculus), quantum, analog/continuous]

Church Thesis

All reasonable models of computation are equivalent.

4 / 41

slide-9
SLIDE 9

Church Thesis

Complexity  [diagram: discrete models (Turing machine, Boolean circuits, logic, recursive functions, lambda calculus), quantum, analog/continuous, linked by question marks]

Effective Church Thesis

All reasonable models of computation are equivalent for complexity.

4 / 41

slide-10
SLIDE 10

From machines to models

Differential analyzer

5 / 41

slide-11
SLIDE 11

From machines to models

Differential analyzer  [diagram: GPAC basic units : constant (k), adder (u, v ↦ u + v), multiplier (u, v ↦ uv), integrator (u ↦ ∫ u)]

General Purpose Analog Computer, Shannon 1936

5 / 41

slide-12
SLIDE 12

From machines to models

Differential analyzer  [diagram: GPAC basic units : constant (k), adder (u, v ↦ u + v), multiplier (u, v ↦ uv), integrator (u ↦ ∫ u)]

General Purpose Analog Computer, Shannon 1936

y(0) = y0, y′(t) = p(y(t)) : Polynomial Differential Equation, Graça 2004

[plot: t ↦ y1(t)]
5 / 41
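The polynomial IVP format y(0) = y0, y′(t) = p(y(t)) is easy to experiment with numerically. Below is a minimal sketch (ours, not from the talk): a fixed-step RK4 integrator applied to the one-dimensional PIVP y′ = y, y(0) = 1, which is the GPAC "program" for exp. All function names are our own.

```python
import math

def rk4(p, y, h, steps):
    """Integrate the polynomial IVP y' = p(y) with fixed-step RK4."""
    for _ in range(steps):
        k1 = p(y)
        k2 = p([a + h / 2 * b for a, b in zip(y, k1)])
        k3 = p([a + h / 2 * b for a, b in zip(y, k2)])
        k4 = p([a + h * b for a, b in zip(y, k3)])
        y = [a + h / 6 * (u + 2 * v + 2 * w + x)
             for a, u, v, w, x in zip(y, k1, k2, k3, k4)]
    return y

# y' = y, y(0) = 1 generates exp; integrate to t = 1
y = rk4(lambda v: [v[0]], [1.0], 1e-3, 1000)
assert abs(y[0] - math.e) < 1e-6
```

The same driver handles any of the polynomial systems appearing later in the deck, since only the right-hand side `p` changes.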

slide-13
SLIDE 13

From machines to models

Differential analyzer  [diagram: GPAC basic units : constant (k), adder (u, v ↦ u + v), multiplier (u, v ↦ uv), integrator (u ↦ ∫ u)]

General Purpose Analog Computer, Shannon 1936

y(0) = y0, y′(t) = p(y(t)) : Polynomial Differential Equation, Graça 2004

[plot: t ↦ y1(t)]

5 / 41

slide-14
SLIDE 14

Example of dynamical system

[figure: pendulum with angle θ, length ℓ, mass m, gravity g]

θ̈ + (g/ℓ) sin(θ) = 0

6 / 41

slide-15
SLIDE 15

Example of dynamical system

[figure: pendulum with angle θ, length ℓ, mass m, gravity g]

θ̈ + (g/ℓ) sin(θ) = 0

y1′ = y2
y2′ = −(g/ℓ) y3
y3′ = y2 y4
y4′ = −y2 y3

⇔  y1 = θ, y2 = θ̇, y3 = sin(θ), y4 = cos(θ)

6 / 41
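The rewriting of the pendulum into a polynomial system can be checked numerically. A small sketch (ours, not from the slides): integrate the four-variable system with RK4 and verify that y3 and y4 keep tracking sin(y1) and cos(y1), as the correspondence claims.

```python
import math

def rk4_step(f, y, h):
    k1 = f(y)
    k2 = f([a + h / 2 * b for a, b in zip(y, k1)])
    k3 = f([a + h / 2 * b for a, b in zip(y, k2)])
    k4 = f([a + h * b for a, b in zip(y, k3)])
    return [a + h / 6 * (u + 2 * v + 2 * w + z)
            for a, u, v, w, z in zip(y, k1, k2, k3, k4)]

g, l = 9.81, 1.0
def pendulum(y):
    # y = (theta, theta', sin(theta), cos(theta))
    return [y[1], -(g / l) * y[2], y[1] * y[3], -y[1] * y[2]]

theta0 = 0.3
y = [theta0, 0.0, math.sin(theta0), math.cos(theta0)]
for _ in range(2000):                        # integrate to t = 2
    y = rk4_step(pendulum, y, 1e-3)
assert abs(y[2] - math.sin(y[0])) < 1e-6     # y3 stays equal to sin(theta)
assert abs(y[3] - math.cos(y[0])) < 1e-6     # y4 stays equal to cos(theta)
```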

slide-16
SLIDE 16

Example of dynamical system

[figures: pendulum, and its GPAC circuit built from multipliers, integrators and the constants −g/ℓ and −1, with outputs y1, y2, y3, y4]

θ̈ + (g/ℓ) sin(θ) = 0

y1′ = y2
y2′ = −(g/ℓ) y3
y3′ = y2 y4
y4′ = −y2 y3

⇔  y1 = θ, y2 = θ̇, y3 = sin(θ), y4 = cos(θ)

6 / 41

slide-17
SLIDE 17

Example of dynamical system

[figures: pendulum, and its GPAC circuit built from multipliers, integrators and the constants −g/ℓ and −1, with outputs y1, y2, y3, y4]

θ̈ + (g/ℓ) sin(θ) = 0

y1′ = y2
y2′ = −(g/ℓ) y3
y3′ = y2 y4
y4′ = −y2 y3

⇔  y1 = θ, y2 = θ̇, y3 = sin(θ), y4 = cos(θ)

Historical remark : the word “analog”

The pendulum and the circuit have the same equation. One can study one using the other by analogy.

6 / 41

slide-18
SLIDE 18

Does a balance scale compute a function?

Inputs : x, y ∈ [0, +∞)  [figure: balance scale loaded with x and y]

7 / 41

slide-19
SLIDE 19

Does a balance scale compute a function?

Inputs : x, y ∈ [0, +∞)  [figures: balance scale, balanced case x = y]

7 / 41

slide-20
SLIDE 20

Does a balance scale compute a function?

Inputs : x, y ∈ [0, +∞)  [figures: balance scale outcomes x = y and x > y]

7 / 41

slide-21
SLIDE 21

Does a balance scale compute a function?

Inputs : x, y ∈ [0, +∞)  [figures: balance scale outcomes x = y, x > y, x < y]

7 / 41

slide-22
SLIDE 22

Does a balance scale compute a function?

Inputs : x, y ∈ [0, +∞)  [figures: balance scale outcomes x = y, x > y, x < y]  Output : sign(x − y)?

7 / 41

slide-23
SLIDE 23

Computing with differential equations

Generable functions : y(0) = y0, y′(x) = p(y(x)), x ∈ R, f(x) = y1(x)

[plot: x ↦ y1(x)]

Shannon’s notion : sin, cos, exp, log, ...

8 / 41

slide-24
SLIDE 24

Computing with differential equations

Generable functions : y(0) = y0, y′(x) = p(y(x)), x ∈ R, f(x) = y1(x)  [plot: x ↦ y1(x)]  Shannon’s notion : sin, cos, exp, log, ...

Computable : y(0) = q(x), y′(t) = p(y(t)), x ∈ R, t ∈ R+, f(x) = lim_{t→∞} y1(t)

[plot: y1(t) converging to f(x)]

Modern notion : sin, cos, exp, log, Γ, ζ, ...

8 / 41

slide-25
SLIDE 25

Computing with differential equations

Generable functions : y(0) = y0, y′(x) = p(y(x)), x ∈ R, f(x) = y1(x)  [plot: x ↦ y1(x)]  Shannon’s notion : sin, cos, exp, log, ...  Considered "weak" : not Γ and ζ ; only analytic functions

Computable : y(0) = q(x), y′(t) = p(y(t)), x ∈ R, t ∈ R+, f(x) = lim_{t→∞} y1(t)

[plot: y1(t) converging to f(x)]

Modern notion : sin, cos, exp, log, Γ, ζ, ...

8 / 41

slide-26
SLIDE 26

Computing with differential equations

Generable functions : y(0) = y0, y′(x) = p(y(x)), x ∈ R, f(x) = y1(x)  [plot: x ↦ y1(x)]  Shannon’s notion : sin, cos, exp, log, ...  Considered "weak" : not Γ and ζ ; only analytic functions

Computable : y(0) = q(x), y′(t) = p(y(t)), x ∈ R, t ∈ R+, f(x) = lim_{t→∞} y1(t)

[plot: y1(t) converging to f(x)]

Modern notion : sin, cos, exp, log, Γ, ζ, ...  Turing powerful [Bournez et al., 2007]

8 / 41

slide-27
SLIDE 27

More formally

[figure: trajectories y1(t) from input x : crossing 1 means Yes, crossing −1 means No]

9 / 41

slide-28
SLIDE 28

More formally

[figure: trajectories y1(t) from input x : crossing 1 means Yes, crossing −1 means No]

Theorem (Bournez et al, 2010)

This is equivalent to a Turing machine.

9 / 41

slide-29
SLIDE 29

More formally

[figure: trajectories y1(t) from input x : crossing 1 means Yes, crossing −1 means No]

Theorem (Bournez et al, 2010)

This is equivalent to a Turing machine. ◮ analog computability theory ◮ purely continuous characterization of classical computability

9 / 41

slide-30
SLIDE 30

How does one prove such a result?

By computing/programming with differential equations! Two levels : Generable functions : ◮ « simple » basic blocks ◮ lots of ways to combine them ◮ very low level Computable functions : ◮ more comprehensible ◮ harder to combine ◮ higher level

10 / 41

slide-31
SLIDE 31

The theory of generable functions

11 / 41

slide-32
SLIDE 32

Generable functions (total, univariate)

Definition

f : R → R is generable if there exists d, p and y0 such that the solution y to y(0) = y0, y′(x) = p(y(x)) satisfies f(x) = y1(x) for all x ∈ R.

Types

◮ d ∈ N : dimension ◮ p ∈ R^d[R^n] : polynomial vector ◮ y0 ∈ R^d, y : R → R^d

[plot: x ↦ y1(x)]

Note : existence and uniqueness of y by the Cauchy-Lipschitz theorem.

12 / 41

slide-33
SLIDE 33

Generable functions (total, univariate)

Definition

f : R → R is generable if there exists d, p and y0 such that the solution y to y(0) = y0, y′(x) = p(y(x)) satisfies f(x) = y1(x) for all x ∈ R.

Types

◮ d ∈ N : dimension ◮ p ∈ R^d[R^n] : polynomial vector ◮ y0 ∈ R^d, y : R → R^d

Example : f(x) = x ◮ identity : y(0) = 0, y′ = 1 ⇒ y(x) = x

12 / 41

slide-34
SLIDE 34

Generable functions (total, univariate)

Definition

f : R → R is generable if there exists d, p and y0 such that the solution y to y(0) = y0, y′(x) = p(y(x)) satisfies f(x) = y1(x) for all x ∈ R.

Types

◮ d ∈ N : dimension ◮ p ∈ R^d[R^n] : polynomial vector ◮ y0 ∈ R^d, y : R → R^d

Example : f(x) = x² ◮ squaring :
y1(0) = 0, y1′ = 2y2 ⇒ y1(x) = x²
y2(0) = 0, y2′ = 1 ⇒ y2(x) = x

12 / 41

slide-35
SLIDE 35

Generable functions (total, univariate)

Definition

f : R → R is generable if there exists d, p and y0 such that the solution y to y(0) = y0, y′(x) = p(y(x)) satisfies f(x) = y1(x) for all x ∈ R.

Types

◮ d ∈ N : dimension ◮ p ∈ R^d[R^n] : polynomial vector ◮ y0 ∈ R^d, y : R → R^d

Example : f(x) = xⁿ ◮ nth power :
y1(0) = 0, y1′ = n y2 ⇒ y1(x) = xⁿ
y2(0) = 0, y2′ = (n − 1) y3 ⇒ y2(x) = xⁿ⁻¹
. . .
yn(0) = 0, yn′ = 1 ⇒ yn(x) = x

12 / 41

slide-36
SLIDE 36

Generable functions (total, univariate)

Definition

f : R → R is generable if there exists d, p and y0 such that the solution y to y(0) = y0, y′(x) = p(y(x)) satisfies f(x) = y1(x) for all x ∈ R.

Types

◮ d ∈ N : dimension ◮ p ∈ R^d[R^n] : polynomial vector ◮ y0 ∈ R^d, y : R → R^d

Example : f(x) = exp(x) ◮ exponential : y(0) = 1, y′ = y ⇒ y(x) = exp(x)

12 / 41

slide-37
SLIDE 37

Generable functions (total, univariate)

Definition

f : R → R is generable if there exists d, p and y0 such that the solution y to y(0) = y0, y′(x) = p(y(x)) satisfies f(x) = y1(x) for all x ∈ R.

Types

◮ d ∈ N : dimension ◮ p ∈ R^d[R^n] : polynomial vector ◮ y0 ∈ R^d, y : R → R^d

Example : f(x) = sin(x) or f(x) = cos(x) ◮ sine/cosine :
y1(0) = 0, y1′ = y2 ⇒ y1(x) = sin(x)
y2(0) = 1, y2′ = −y1 ⇒ y2(x) = cos(x)

12 / 41

slide-38
SLIDE 38

Generable functions (total, univariate)

Definition

f : R → R is generable if there exists d, p and y0 such that the solution y to y(0) = y0, y′(x) = p(y(x)) satisfies f(x) = y1(x) for all x ∈ R.

Types

◮ d ∈ N : dimension ◮ p ∈ R^d[R^n] : polynomial vector ◮ y0 ∈ R^d, y : R → R^d

Example : f(x) = tanh(x) ◮ hyperbolic tangent : y(0) = 0, y′ = 1 − y² ⇒ y(x) = tanh(x)

[plot: x ↦ tanh(x)]

12 / 41
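The tanh example above is a one-line generable program, so it makes a good sanity check. A quick numerical sketch (ours): integrate y′ = 1 − y², y(0) = 0 and compare against the library tanh.

```python
import math

# y' = 1 - y^2, y(0) = 0 generates tanh; RK4 check of the claim
def step(y, h):
    f = lambda v: 1.0 - v * v
    k1 = f(y)
    k2 = f(y + h / 2 * k1)
    k3 = f(y + h / 2 * k2)
    k4 = f(y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

y, h = 0.0, 1e-3
for _ in range(1500):          # integrate to x = 1.5
    y = step(y, h)
assert abs(y - math.tanh(1.5)) < 1e-9
```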

slide-39
SLIDE 39

Generable functions (total, univariate)

Definition

f : R → R is generable if there exists d, p and y0 such that the solution y to y(0) = y0, y′(x) = p(y(x)) satisfies f(x) = y1(x) for all x ∈ R.

Types

◮ d ∈ N : dimension ◮ p ∈ R^d[R^n] : polynomial vector ◮ y0 ∈ R^d, y : R → R^d

Example : f(x) = 1/(1 + x²) ◮ rational function : f′(x) = −2x/(1 + x²)² = −2x f(x)²

y1(0) = 1, y1′ = −2 y2 y1² ⇒ y1(x) = 1/(1 + x²)
y2(0) = 0, y2′ = 1 ⇒ y2(x) = x

12 / 41

slide-40
SLIDE 40

Generable functions (total, univariate)

Definition

f : R → R is generable if there exists d, p and y0 such that the solution y to y(0) = y0, y′(x) = p(y(x)) satisfies f(x) = y1(x) for all x ∈ R.

Types

◮ d ∈ N : dimension ◮ p ∈ Rd[Rn] : polynomial vector ◮ y0 ∈ Rd, y : R → Rd Example : f = g ± h ◮ sum/difference (f ± g)′ = f ′ ± g′

12 / 41

slide-41
SLIDE 41

Generable functions (total, univariate)

Definition

f : R → R is generable if there exists d, p and y0 such that the solution y to y(0) = y0, y′(x) = p(y(x)) satisfies f(x) = y1(x) for all x ∈ R.

Types

◮ d ∈ N : dimension ◮ p ∈ Rd[Rn] : polynomial vector ◮ y0 ∈ Rd, y : R → Rd Example : f = gh ◮ product (gh)′ = g′h + gh′

12 / 41

slide-42
SLIDE 42

Generable functions (total, univariate)

Definition

f : R → R is generable if there exists d, p and y0 such that the solution y to y(0) = y0, y′(x) = p(y(x)) satisfies f(x) = y1(x) for all x ∈ R.

Types

◮ d ∈ N : dimension ◮ p ∈ R^d[R^n] : polynomial vector ◮ y0 ∈ R^d, y : R → R^d

Example : f = 1/g ◮ inverse : f′ = −g′/g² = −g′f²

12 / 41

slide-43
SLIDE 43

Generable functions (total, univariate)

Definition

f : R → R is generable if there exists d, p and y0 such that the solution y to y(0) = y0, y′(x) = p(y(x)) satisfies f(x) = y1(x) for all x ∈ R.

Types

◮ d ∈ N : dimension ◮ p ∈ R^d[R^n] : polynomial vector ◮ y0 ∈ R^d, y : R → R^d

Example : f = ∫ g ◮ integral : f′ = g

12 / 41

slide-44
SLIDE 44

Generable functions (total, univariate)

Definition

f : R → R is generable if there exists d, p and y0 such that the solution y to y(0) = y0, y′(x) = p(y(x)) satisfies f(x) = y1(x) for all x ∈ R.

Types

◮ d ∈ N : dimension ◮ p ∈ Rd[Rn] : polynomial vector ◮ y0 ∈ Rd, y : R → Rd Example : f = g′ ◮ derivative f ′ = g′′ = (p1(z))′ = ∇p1(z) · z′

12 / 41

slide-45
SLIDE 45

Generable functions (total, univariate)

Definition

f : R → R is generable if there exists d, p and y0 such that the solution y to y(0) = y0, y′(x) = p(y(x)) satisfies f(x) = y1(x) for all x ∈ R.

Types

◮ d ∈ N : dimension ◮ p ∈ Rd[Rn] : polynomial vector ◮ y0 ∈ Rd, y : R → Rd Example : f = g ◦ h ◮ composition (z ◦ h)′ = (z′ ◦ h)h′ = p(z ◦ h)h′

12 / 41

slide-46
SLIDE 46

Generable functions (total, univariate)

Definition

f : R → R is generable if there exists d, p and y0 such that the solution y to y(0) = y0, y′(x) = p(y(x)) satisfies f(x) = y1(x) for all x ∈ R.

Types

◮ d ∈ N : dimension ◮ p ∈ Rd[Rn] : polynomial vector ◮ y0 ∈ Rd, y : R → Rd Example : f ′ = tanh ◦f ◮ Non-polynomial differential equation f ′′ = (tanh′ ◦f)f ′ = (1 − (tanh ◦f)2)f ′

12 / 41

slide-47
SLIDE 47

Generable functions (total, univariate)

Definition

f : R → R is generable if there exists d, p and y0 such that the solution y to y(0) = y0, y′(x) = p(y(x)) satisfies f(x) = y1(x) for all x ∈ R.

Types

◮ d ∈ N : dimension ◮ p ∈ Rd[Rn] : polynomial vector ◮ y0 ∈ Rd, y : R → Rd Example : f(0) = f0, f ′ = g ◦ f ◮ Initial Value Problem (IVP) f ′ = g′′ = (p(z))′ = ∇p(z) · z′

12 / 41

slide-48
SLIDE 48

Generable functions : a first summary

Nice theory for the class of total and univariate generable functions : ◮ analytic ◮ contains polynomials, sin, cos, tanh, exp ◮ stable under ±, ×, /, ◦ and Initial Value Problems (IVP) ◮ technicality on the field K of coefficients for stability under ◦

13 / 41

slide-49
SLIDE 49

Generable functions : a first summary

Nice theory for the class of total and univariate generable functions : ◮ analytic ◮ contains polynomials, sin, cos, tanh, exp ◮ stable under ±, ×, /, ◦ and Initial Value Problems (IVP) ◮ technicality on the field K of coefficients for stability under ◦ Limitations : ◮ total functions ◮ univariate

13 / 41

slide-50
SLIDE 50

Generable functions (generalization)

Definition

f : X ⊆ Rn → R is generable if X is open connected and ∃d, p, x0, y0, y such that y(x0) = y0, Jy(x) = p(y(x)) and f(x) = y1(x) for all x ∈ X. Jy(x) = Jacobian matrix of y at x

Types

◮ n ∈ N : input dimension ◮ d ∈ N : dimension ◮ p ∈ K^{d×d}[R^d] : polynomial matrix ◮ x0 ∈ K^n ◮ y0 ∈ K^d, y : X → R^d

Notes : ◮ Partial differential equation! ◮ Uniqueness of the solution y... ◮ ...but not existence (i.e. you have to show it exists)

14 / 41

slide-51
SLIDE 51

Generable functions (generalization)

Definition

f : X ⊆ Rn → R is generable if X is open connected and ∃d, p, x0, y0, y such that y(x0) = y0, Jy(x) = p(y(x)) and f(x) = y1(x) for all x ∈ X. Jy(x) = Jacobian matrix of y at x

Types

◮ n ∈ N : input dimension ◮ d ∈ N : dimension ◮ p ∈ K^{d×d}[R^d] : polynomial matrix ◮ x0 ∈ K^n ◮ y0 ∈ K^d, y : X → R^d

Example : f(x1, x2) = x1 x2² (n = 2, d = 3) ◮ monomial

y(0, 0) = (0, 0, 0),  Jy = ( y3²  2y2y3 ; 1  0 ; 0  1 )  ⇒  y(x) = (x1 x2², x1, x2)
14 / 41

slide-52
SLIDE 52

Generable functions (generalization)

Definition

f : X ⊆ Rn → R is generable if X is open connected and ∃d, p, x0, y0, y such that y(x0) = y0, Jy(x) = p(y(x)) and f(x) = y1(x) for all x ∈ X. Jy(x) = Jacobian matrix of y at x

Types

◮ n ∈ N : input dimension ◮ d ∈ N : dimension ◮ p ∈ K^{d×d}[R^d] : polynomial matrix ◮ x0 ∈ K^n ◮ y0 ∈ K^d, y : X → R^d

Example : f(x1, x2) = x1 x2² ◮ monomial

y1(0, 0) = 0, ∂x1 y1 = y3², ∂x2 y1 = 2y2y3 ⇒ y1(x) = x1 x2²
y2(0, 0) = 0, ∂x1 y2 = 1, ∂x2 y2 = 0 ⇒ y2(x) = x1
y3(0, 0) = 0, ∂x1 y3 = 0, ∂x2 y3 = 1 ⇒ y3(x) = x2

This is tedious!

14 / 41

slide-53
SLIDE 53

Generable functions (generalization)

Definition

f : X ⊆ Rn → R is generable if X is open connected and ∃d, p, x0, y0, y such that y(x0) = y0, Jy(x) = p(y(x)) and f(x) = y1(x) for all x ∈ X. Jy(x) = Jacobian matrix of y at x

Types

◮ n ∈ N : input dimension ◮ d ∈ N : dimension ◮ p ∈ K^{d×d}[R^d] : polynomial matrix ◮ x0 ∈ K^n ◮ y0 ∈ K^d, y : X → R^d

Last example : f(x) = 1/x for x ∈ (0, ∞) ◮ inverse function

y(1) = 1, ∂x y = −y² ⇒ y(x) = 1/x

14 / 41

slide-54
SLIDE 54

Generable functions : summary

Nice theory for the class of multivariate generable functions (over connected domains) : ◮ analytic ◮ contains polynomials, sin, cos, tanh, exp ◮ stable under ±, ×, /, ◦ and Initial Value Problems (IVP) ◮ technicality on the field K of coefficients for stability under ◦

15 / 41

slide-55
SLIDE 55

Generable functions : summary

Nice theory for the class of multivariate generable functions (over connected domains) : ◮ analytic ◮ contains polynomials, sin, cos, tanh, exp ◮ stable under ±, ×, /, ◦ and Initial Value Problems (IVP) ◮ technicality on the field K of coefficients for stability under ◦ Natural questions : ◮ analytic → isn’t that very limited? ◮ can we generate all analytic functions?

15 / 41

slide-56
SLIDE 56

Generable functions : summary

Nice theory for the class of multivariate generable functions (over connected domains) : ◮ analytic ◮ contains polynomials, sin, cos, tanh, exp ◮ stable under ±, ×, /, ◦ and Initial Value Problems (IVP) ◮ technicality on the field K of coefficients for stability under ◦ Natural questions : ◮ analytic → isn’t that very limited? ◮ can we generate all analytic functions? No : Γ and Riemann’s ζ are not generable.

15 / 41

slide-57
SLIDE 57

Why is this useful?

Writing polynomial ODEs by hand is hard.

16 / 41

slide-58
SLIDE 58

Why is this useful?

Writing polynomial ODEs by hand is hard. Using generable functions, we can build complicated multivariate partial functions using other operations, and we know they are solutions to polynomial ODEs by construction.

16 / 41

slide-59
SLIDE 59

Why is this useful?

Writing polynomial ODEs by hand is hard. Using generable functions, we can build complicated multivariate partial functions using other operations, and we know they are solutions to polynomial ODEs by construction.

Example : almost rounding function

There exists a generable function round such that for any n ∈ Z, x ∈ R, λ > 2 and µ > 0 :
◮ if x ∈ [n − 1/2, n + 1/2] then |round(x, µ, λ) − n| ≤ 1/2,
◮ if x ∈ [n − 1/2 + 1/λ, n + 1/2 − 1/λ] then |round(x, µ, λ) − n| ≤ e⁻µ.

16 / 41
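The actual construction of round takes real work, but a much weaker toy in the same spirit is easy to exhibit: x − sin(2πx)/(2π) is built only from generable pieces (polynomials and sin) and flattens around every integer. This is our illustration, not the round of the talk; in particular its error near n is only polynomially small in the distance to n, nothing like e⁻µ.

```python
import math

def soft_round(x):
    # Generable-style expression: never off by more than 1/2 from the
    # nearest integer, and very flat near each integer.
    # Our toy stand-in, far weaker than the round of the slide.
    return x - math.sin(2 * math.pi * x) / (2 * math.pi)

for n in range(-3, 4):
    assert abs(soft_round(n) - n) < 1e-12          # essentially exact at integers
    assert abs(soft_round(n + 0.05) - n) < 0.01    # flat near integers
    assert abs(soft_round(n + 0.49) - n) <= 0.5    # bounded error everywhere
```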

slide-60
SLIDE 60

The theory of computable functions

17 / 41

slide-61
SLIDE 61

Computable function

y(0) = q(x), y′(t) = p(y(t)), x ∈ R, t ∈ R+, f(x) = lim_{t→∞} y1(t)

[plot: y1(t) converging to f(x)]

[figures: balance scale outcomes x = y, x > y, x < y]  Inputs : x, y ∈ [0, +∞)  Output : sign(x − y)?

18 / 41

slide-62
SLIDE 62

The theory of computable functions

Important fact : ◮ contains generable functions ◮ continuous functions

19 / 41

slide-63
SLIDE 63

The theory of computable functions

Important fact : ◮ contains generable functions ◮ continuous functions ◮ stable under ±, ×, /

19 / 41

slide-64
SLIDE 64

The theory of computable functions

Important fact : ◮ contains generable functions ◮ continuous functions ◮ stable under ±, ×, / ◮ stable under ◦

19 / 41

slide-65
SLIDE 65

The theory of computable functions

Important fact : ◮ contains generable functions ◮ continuous functions ◮ stable under ±, ×, / ◮ stable under ◦ ◮ stable under limits

19 / 41

slide-66
SLIDE 66

The theory of computable functions

Important fact : ◮ contains generable functions ◮ continuous functions ◮ stable under ±, ×, / ◮ stable under ◦ ◮ stable under limits ◮ stable under iteration (with conditions)

19 / 41

slide-67
SLIDE 67

The theory of computable functions

Important fact : ◮ contains generable functions ◮ continuous functions ◮ stable under ±, ×, / ◮ stable under ◦ ◮ stable under limits ◮ stable under iteration (with conditions) Enough to simulate a Turing machine!

19 / 41

slide-68
SLIDE 68

The theory of computable functions

Important fact : ◮ contains generable functions ◮ continuous functions ◮ stable under ±, ×, / ◮ stable under ◦ ◮ stable under limits ◮ stable under iteration (with conditions) Enough to simulate a Turing machine! The proofs are too complicated to present here, but essentially it is all error management.

19 / 41

slide-69
SLIDE 69

Proof gem : iteration with differential equations

Assume f is generable, can we iterate f with an ODE? That is, build a generable y such that y(x, n) ≈ f^[n](x) for all n ∈ N

20 / 41

slide-70
SLIDE 70

Proof gem : iteration with differential equations

Assume f is generable, can we iterate f with an ODE? That is, build a generable y such that y(x, n) ≈ f^[n](x) for all n ∈ N

[plot: staircase trajectory against t with ticks 1/2, 1, 3/2, 2 and levels x, f(x)]

First half-unit of time : y′ ≈ 0, z′ ≈ f(y) − z

20 / 41

slide-71
SLIDE 71

Proof gem : iteration with differential equations

Assume f is generable, can we iterate f with an ODE? That is, build a generable y such that y(x, n) ≈ f^[n](x) for all n ∈ N

[plot: staircase trajectory against t with ticks 1/2, 1, 3/2, 2 and levels x, f(x)]

First half-unit of time : y′ ≈ 0, z′ ≈ f(y) − z ; second half-unit : y′ ≈ z − y, z′ ≈ 0

20 / 41

slide-72
SLIDE 72

Proof gem : iteration with differential equations

Assume f is generable, can we iterate f with an ODE? That is, build a generable y such that y(x, n) ≈ f^[n](x) for all n ∈ N

[plot: staircase trajectory against t with ticks 1/2, 1, 3/2, 2 and levels x, f(x), f^[2](x)]

First half-unit of time : y′ ≈ 0, z′ ≈ f(y) − z ; second half-unit : y′ ≈ z − y, z′ ≈ 0

20 / 41
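The alternating "sample and hold" dynamics above can be simulated directly. A sketch (ours): hard phase switching and an explicit Euler loop stand in for the smooth generable gating of the real construction; `gain` plays the role of the large factor that makes each phase converge exponentially.

```python
def iterate_via_ode(f, x, n_iters, gain=40.0, h=1e-4):
    """Two-phase dynamics: in phase A, z chases f(y) while y is frozen;
    in phase B, y chases z while z is frozen. One full period of length 1
    performs one application of f (a toy version of the slide's scheme)."""
    y, z = x, x
    for _ in range(n_iters):
        t = 0.0
        while t < 0.5:                      # phase A: z' = gain * (f(y) - z)
            z += h * gain * (f(y) - z)
            t += h
        t = 0.0
        while t < 0.5:                      # phase B: y' = gain * (z - y)
            y += h * gain * (z - y)
            t += h
    return y

f = lambda u: 0.5 * u + 1.0                 # toy map with fixed point 2
approx = iterate_via_ode(f, 0.0, 8)
exact = 0.0
for _ in range(8):                          # true discrete iteration
    exact = f(exact)
assert abs(approx - exact) < 1e-3
```

With gain 40 and phase length 1/2, each phase leaves a residual of roughly e⁻²⁰, which is why the continuous system tracks the discrete iteration so closely.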

slide-73
SLIDE 73

Recap

[figure: trajectories y1(t) from input x : crossing 1 means Yes, crossing −1 means No]

Theorem (Bournez et al, 2010)

This is equivalent to a Turing machine. ◮ analog computability theory ◮ purely continuous characterization of classical computability

21 / 41

slide-74
SLIDE 74

The complexity theory of computable functions

22 / 41

slide-75
SLIDE 75

Complexity of analog systems

◮ Turing machines : T(x) = number of steps to compute on x

23 / 41

slide-76
SLIDE 76

Complexity of analog systems

◮ Turing machines : T(x) = number of steps to compute on x ◮ GPAC :

Tentative definition

T(x) = ?? with y(0) = (x, 0, . . . , 0), y′ = p(y)

[plot: y1(t) converging to f(x)]

23 / 41

slide-77
SLIDE 77

Complexity of analog systems

◮ Turing machines : T(x) = number of steps to compute on x ◮ GPAC :

Tentative definition

T(x, µ) = ? with y(0) = (x, 0, . . . , 0), y′ = p(y)

[plot: y1(t) converging to f(x)]

23 / 41

slide-78
SLIDE 78

Complexity of analog systems

◮ Turing machines : T(x) = number of steps to compute on x ◮ GPAC :

Tentative definition

T(x, µ) = first time t such that |y1(t) − f(x)| ≤ e⁻µ, where y(0) = (x, 0, . . . , 0), y′ = p(y)

[plot: y1(t) converging to f(x)]

23 / 41

slide-79
SLIDE 79

Complexity of analog systems

◮ Turing machines : T(x) = number of steps to compute on x ◮ GPAC :

Tentative definition

T(x, µ) = first time t such that |y1(t) − f(x)| ≤ e⁻µ, where y(0) = (x, 0, . . . , 0), y′ = p(y)

[plot: y1(t) converging to f(x)]

Time contraction : z(t) = y(eᵗ)  [plot: z1(t) converging to f(x) much faster]

23 / 41

slide-80
SLIDE 80

Complexity of analog systems

◮ Turing machines : T(x) = number of steps to compute on x ◮ GPAC :

Tentative definition

T(x, µ) = first time t so that |y1(t) − f(x)| e−µ y(0) = (x, 0, . . . , 0) y′ = p(y) t

f(x) x y1(t)

  • z(t) = y(et)

t

f(x) x z1(t)

w(t) = y(eet) t

f(x) x w1(t)

23 / 41

slide-81
SLIDE 81

Complexity of analog systems

◮ Turing machines : T(x) = number of steps to compute on x ◮ GPAC : time contraction problem → open problem

Tentative definition

T(x, µ) = first time t such that |y1(t) − f(x)| ≤ e⁻µ, where y(0) = (x, 0, . . . , 0), y′ = p(y)

[plots: y1(t), z(t) = y(eᵗ), w(t) = y(e^{eᵗ}) all converging to f(x), ever faster]

Something is wrong...

All functions have constant time complexity.

23 / 41

slide-82
SLIDE 82

Time-space correlation of the GPAC

y(0) = q(x), y′ = p(y)  [plot: y1(t) from q(x) to f(x)]

z(t) = y(eᵗ)  [plot: z1(t) from q̃(x) to f(x)]

24 / 41

slide-83
SLIDE 83

Time-space correlation of the GPAC

y(0) = q(x), y′ = p(y)  [plot: y1(t) from q(x) to f(x)]

z(t) = y(eᵗ)  [plot: z1(t) from q̃(x) to f(x)]

extra component : w(t) = eᵗ  [plot: w(t) growing exponentially]

24 / 41

slide-84
SLIDE 84

Time-space correlation of the GPAC

y(0) = q(x), y′ = p(y)  [plot: y1(t) from q(x) to f(x)]

z(t) = y(eᵗ)  [plot: z1(t) from q̃(x) to f(x)]

Observation

Time scaling costs “space”.

⇒ Time complexity for the GPAC must involve time and space!

extra component : w(t) = eᵗ  [plot: w(t) growing exponentially]

24 / 41

slide-85
SLIDE 85

Complexity in the analog world

Complexity measure : length of the curve

[figure: two parametrizations of the same curve from x, one reaching at time 1 what the other reaches at time 10]

Time acceleration : same curve = same complexity!

25 / 41

slide-86
SLIDE 86

Complexity in the analog world

Complexity measure : length of the curve

[figure: two parametrizations of the same curve from x, one reaching at time 1 what the other reaches at time 10]

Time acceleration : same curve = same complexity!

[figure: two trajectories over the same time interval with very different lengths]

Same time, different curves : different complexity!

25 / 41
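The "same curve = same length" point is easy to see numerically. A sketch (ours): sample a unit-speed arc of the circle and an exponentially accelerated reparametrization of it, and compare their polygonal lengths.

```python
import math

def curve_length(points):
    # polygonal approximation of the length of a sampled curve
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

N = 20000
# y(t) = (cos t, sin t) on [0, 2]: unit speed, so length = elapsed time = 2
ts = [2.0 * i / N for i in range(N + 1)]
y = [(math.cos(t), math.sin(t)) for t in ts]

# accelerated reparametrization z(s) = y(e^s - 1) on [0, log 3]:
# far less "time", but exactly the same curve, hence the same length
ss = [math.log(1.0 + t) for t in ts]
z = [(math.cos(math.exp(s) - 1.0), math.sin(math.exp(s) - 1.0)) for s in ss]

assert abs(curve_length(y) - 2.0) < 1e-3
assert abs(curve_length(y) - curve_length(z)) < 1e-6
```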

slide-87
SLIDE 87

Characterization of polynomial time

Definition : L ∈ ANALOG-PTIME ⇔ ∃p polynomial such that ∀ word w, the solution of y(0) = (ψ(w), |w|, 0, . . . , 0), y′ = p(y), with ψ(w) = Σ_{i=1}^{|w|} wi 2⁻ⁱ and ℓ(t) = length of y over [0, t]

[figure: y1(t) starting from ψ(w), thresholds 1 and −1]

26 / 41
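The input encoding ψ is the only discrete-to-continuous step in the definition, so it is worth making concrete. A small sketch (ours), taking w to be a binary word; the talk's encoding may use a different digit set, but the shape is the same.

```python
def psi(w):
    # psi(w) = sum_{i=1}^{|w|} w_i * 2^{-i}: packs the word w into [0, 1]
    return sum(int(b) * 2.0 ** -(i + 1) for i, b in enumerate(w))

assert psi("1") == 0.5
assert psi("011") == 0.375          # 0/2 + 1/4 + 1/8
# psi alone is not injective over binary words ("11" vs "110"),
# which is why |w| also appears in y(0) = (psi(w), |w|, 0, ..., 0)
assert psi("11") == psi("110")
```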

slide-88
SLIDE 88

Characterization of polynomial time

Definition : L ∈ ANALOG-PTIME ⇔ ∃p polynomial such that ∀ word w, the solution of y(0) = (ψ(w), |w|, 0, . . . , 0), y′ = p(y), with ψ(w) = Σ_{i=1}^{|w|} wi 2⁻ⁱ and ℓ(t) = length of y over [0, t], satisfies :

1. if y1(t) ≥ 1 then w ∈ L (accept)
26 / 41

slide-89
SLIDE 89

Characterization of polynomial time

Definition : L ∈ ANALOG-PTIME ⇔ ∃p polynomial such that ∀ word w, the solution of y(0) = (ψ(w), |w|, 0, . . . , 0), y′ = p(y), with ψ(w) = Σ_{i=1}^{|w|} wi 2⁻ⁱ and ℓ(t) = length of y over [0, t], satisfies :

1. if y1(t) ≥ 1 then w ∈ L (accept)
2. if y1(t) ≤ −1 then w ∉ L (reject)

26 / 41

slide-90
SLIDE 90

Characterization of polynomial time

Definition : L ∈ ANALOG-PTIME ⇔ ∃p polynomial such that ∀ word w, the solution of y(0) = (ψ(w), |w|, 0, . . . , 0), y′ = p(y), with ψ(w) = Σ_{i=1}^{|w|} wi 2⁻ⁱ and ℓ(t) = length of y over [0, t], satisfies :

1. if y1(t) ≥ 1 then w ∈ L (accept)
2. if y1(t) ≤ −1 then w ∉ L (reject)
3. if ℓ(t) ≥ poly(|w|) then |y1(t)| ≥ 1 (computing beyond length poly(|w|) is forbidden)

26 / 41

slide-91
SLIDE 91

Characterization of polynomial time

Definition : L ∈ ANALOG-PTIME ⇔ ∃p polynomial such that ∀ word w, the solution of y(0) = (ψ(w), |w|, 0, . . . , 0), y′ = p(y), with ψ(w) = Σ_{i=1}^{|w|} wi 2⁻ⁱ and ℓ(t) = length of y over [0, t], satisfies :

1. if y1(t) ≥ 1 then w ∈ L (accept)
2. if y1(t) ≤ −1 then w ∉ L (reject)
3. if ℓ(t) ≥ poly(|w|) then |y1(t)| ≥ 1 (computing beyond length poly(|w|) is forbidden)

[figure: y1(t) must settle in accept (≥ 1) or reject (≤ −1) before length poly(|w|)]

Theorem

PTIME = ANALOG-PTIME

26 / 41

slide-92
SLIDE 92

Summary

ANALOG-PTIME  ANALOG-PR

[figures: language recognition with thresholds ±1 and length bound poly(|w|) ; real function computation y1(t) → f(x)]

Theorem

◮ L ∈ PTIME if and only if L ∈ ANALOG-PTIME ◮ f : [a, b] → R computable in polynomial time ⇔ f ∈ ANALOG-PR ◮ Analog complexity theory based on length ◮ Time of the Turing machine ⇔ length of the GPAC ◮ Purely continuous characterization of PTIME

27 / 41

slide-93
SLIDE 93

Summary

ANALOG-PTIME  ANALOG-PR

[figures: language recognition with thresholds ±1 and length bound poly(|w|) ; real function computation y1(t) → f(x)]

Theorem

◮ L ∈ PTIME if and only if L ∈ ANALOG-PTIME ◮ f : [a, b] → R computable in polynomial time ⇔ f ∈ ANALOG-PR ◮ Analog complexity theory based on length ◮ Time of the Turing machine ⇔ length of the GPAC ◮ Purely continuous characterization of PTIME ◮ Only rational coefficients needed

27 / 41

slide-94
SLIDE 94

Chemical Reaction Networks

28 / 41

slide-95
SLIDE 95

Chemical Reaction Networks

Definition : a reaction system is a finite set of ◮ molecular species y1, . . . , yn ◮ reactions of the form Σi ai yi →_f Σi bi yi (ai, bi ∈ N, f = rate)

Example (any resemblance to chemistry is purely coincidental) : 2H + O → H2O, C + O2 → CO2

29 / 41

slide-96
SLIDE 96

Chemical Reaction Networks

Definition : a reaction system is a finite set of ◮ molecular species y1, . . . , yn ◮ reactions of the form Σi ai yi →_f Σi bi yi (ai, bi ∈ N, f = rate)

Example (any resemblance to chemistry is purely coincidental) : 2H + O → H2O, C + O2 → CO2

Assumption : law of mass action : the reaction Σi ai yi →_k Σi bi yi has rate f(y) = k Πi yi^ai

29 / 41

slide-97
SLIDE 97

Chemical Reaction Networks

Definition : a reaction system is a finite set of ◮ molecular species y1, . . . , yn ◮ reactions of the form Σi ai yi →_f Σi bi yi (ai, bi ∈ N, f = rate)

Example (any resemblance to chemistry is purely coincidental) : 2H + O → H2O, C + O2 → CO2

Assumption : law of mass action : the reaction Σi ai yi →_k Σi bi yi has rate f(y) = k Πi yi^ai

Semantics : ◮ discrete ◮ differential ◮ stochastic

29 / 41

slide-98
SLIDE 98

Chemical Reaction Networks

Definition : a reaction system is a finite set of ◮ molecular species y1, . . . , yn ◮ reactions of the form Σi ai yi →_f Σi bi yi (ai, bi ∈ N, f = rate)

Example (any resemblance to chemistry is purely coincidental) : 2H + O → H2O, C + O2 → CO2

Assumption : law of mass action : the reaction Σi ai yi →_k Σi bi yi has rate f(y) = k Πi yi^ai

Semantics : ◮ discrete ◮ differential → yi′ = Σ_{reactions R} (bi^R − ai^R) f^R(y) ◮ stochastic

29 / 41

slide-99
SLIDE 99

Chemical Reaction Networks

Definition : a reaction system is a finite set of ◮ molecular species y1, . . . , yn ◮ reactions of the form Σi ai yi →_f Σi bi yi (ai, bi ∈ N, f = rate)

Example (any resemblance to chemistry is purely coincidental) : 2H + O → H2O, C + O2 → CO2

Assumption : law of mass action : the reaction Σi ai yi →_k Σi bi yi has rate f(y) = k Πi yi^ai

Semantics : ◮ discrete ◮ differential → yi′ = Σ_{reactions R} (bi^R − ai^R) k^R Πj yj^{aj^R} ◮ stochastic

29 / 41
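The differential mass-action semantics is mechanical enough to code directly. A sketch (ours, with made-up data structures): each reaction is a pair of stoichiometry maps plus a rate constant, and the right-hand side of the ODE is assembled exactly as in the formula yi′ = Σ_R (bi − ai) · rate_R.

```python
def mass_action_rhs(reactions, y):
    """reactions: list of (a, b, k) where a and b map species index ->
    stoichiometry. Under the law of mass action the reaction (a -> b)
    fires at rate k * prod_i y_i^{a_i}, and y_i' = sum_R (b_i - a_i) * rate_R."""
    dy = [0.0] * len(y)
    for a, b, k in reactions:
        rate = k
        for i, ai in a.items():
            rate *= y[i] ** ai
        for i in set(a) | set(b):
            dy[i] += (b.get(i, 0) - a.get(i, 0)) * rate
    return dy

# toy example: 2H + O -> H2O with rate constant 1
# species indices: 0 = H, 1 = O, 2 = H2O
reactions = [({0: 2, 1: 1}, {2: 1}, 1.0)]
dy = mass_action_rhs(reactions, [2.0, 3.0, 0.0])   # rate = 2^2 * 3 = 12
assert dy == [-24.0, -12.0, 12.0]
```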

slide-100
SLIDE 100

Chemical Reaction Networks (CRNs)

◮ CRNs with differential semantics and mass action law = polynomial ODEs ◮ polynomial ODEs are Turing complete

30 / 41

slide-101
SLIDE 101

Chemical Reaction Networks (CRNs)

◮ CRNs with differential semantics and mass action law = polynomial ODEs ◮ polynomial ODEs are Turing complete CRNs are Turing complete?

30 / 41

slide-102
SLIDE 102

Chemical Reaction Networks (CRNs)

CRNs are Turing complete? Two “slight” problems : ◮ concentrations cannot be negative (yi < 0) ◮ arbitrary reactions are not realistic

30 / 41

slide-103
SLIDE 103

Chemical Reaction Networks (CRNs)

CRNs are Turing complete? Two “slight” problems : ◮ concentrations cannot be negative (yi < 0) ◮ easy to solve ◮ arbitrary reactions are not realistic ◮ what is realistic?

30 / 41

slide-104
SLIDE 104

Chemical Reaction Networks (CRNs)

CRNs are Turing complete? Two “slight” problems : ◮ concentrations cannot be negative (yi < 0) ◮ easy to solve ◮ arbitrary reactions are not realistic ◮ what is realistic? Definition : a reaction is elementary if it has at most two reactants ⇒ can be implemented with DNA, RNA or proteins

30 / 41

slide-105
SLIDE 105

Chemical Reaction Networks (CRNs)

CRNs are Turing complete? Two “slight” problems : ◮ concentrations cannot be negative (yi < 0) ◮ easy to solve ◮ arbitrary reactions are not realistic ◮ what is realistic? Definition : a reaction is elementary if it has at most two reactants ⇒ can be implemented with DNA, RNA or proteins

Elementary reactions correspond to quadratic ODEs : ay + bz →_k · · · has rate f(y, z) = k y^a z^b

30 / 41

slide-106
SLIDE 106

Chemical Reaction Networks (CRNs)

CRNs are Turing complete? Two “slight” problems : ◮ concentrations cannot be negative (yi < 0) ◮ easy to solve ◮ arbitrary reactions are not realistic ◮ what is realistic? Definition : a reaction is elementary if it has at most two reactants ⇒ can be implemented with DNA, RNA or proteins

Elementary reactions correspond to quadratic ODEs : ay + bz →_k · · · has rate f(y, z) = k y^a z^b

Theorem (Folklore)

Every polynomial ODE can be rewritten as a quadratic ODE.

30 / 41
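The folklore quadratization can be seen on a one-line example. A sketch (ours): rewrite the cubic ODE y′ = y³ as a quadratic system by introducing z = y², then check numerically that the quadratic system reproduces the known closed-form solution.

```python
import math

def rk4_step(f, y, h):
    k1 = f(y)
    k2 = f([a + h / 2 * b for a, b in zip(y, k1)])
    k3 = f([a + h / 2 * b for a, b in zip(y, k2)])
    k4 = f([a + h * b for a, b in zip(y, k3)])
    return [a + h / 6 * (u + 2 * v + 2 * w + s)
            for a, u, v, w, s in zip(y, k1, k2, k3, k4)]

# y' = y^3 rewritten with z = y^2 as the quadratic system
#   y' = y*z,   z' = 2*y*y' = 2*y^2*z = 2*z^2
quad = lambda v: [v[0] * v[1], 2 * v[1] ** 2]

y0 = 0.5
v = [y0, y0 ** 2]
for _ in range(1000):                       # RK4 to t = 1 with h = 1e-3
    v = rk4_step(quad, v, 1e-3)
exact = 1.0 / math.sqrt(1.0 / y0 ** 2 - 2.0)  # closed form of y' = y^3 at t = 1
assert abs(v[0] - exact) < 1e-6
assert abs(v[1] - v[0] ** 2) < 1e-6           # the invariant z = y^2 is preserved
```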

slide-107
SLIDE 107

Chemical Reaction Networks (CRNs)

Definition : a reaction is elementary if it has at most two reactants ⇒ can be implemented with DNA, RNA or proteins

Elementary reactions correspond to quadratic ODEs : ay + bz →_k · · · has rate f(y, z) = k y^a z^b

Theorem (Work with François Fages, Guillaume Le Guludec)

Elementary mass-action-law reaction systems on finite universes of molecules are Turing-complete under the differential semantics.

Notes : ◮ proof preserves polynomial length ◮ in fact the following elementary reactions suffice :

∅ →_k x
x →_k x + z
x + y →_k x + y + z
x + y →_k ∅

30 / 41

slide-108
SLIDE 108

Universal differential equation

31 / 41

slide-109
SLIDE 109

Universal differential equations

Generable functions  [plot: x ↦ y1(x)]  subclass of analytic functions

Computable functions  [plot: y1(t) converging to f(x)]  any computable function

32 / 41

slide-110
SLIDE 110

Universal differential equations

Generable functions  [plot: x ↦ y1(x)]  subclass of analytic functions

Computable functions  [plot: y1(t) converging to f(x)]  any computable function

[plot: x ↦ y1(x)]

32 / 41

slide-111
SLIDE 111

Universal differential algebraic equation (DAE)

[plot: y(x) staying within ε of f]

Theorem (Rubel, 1981)

For any continuous functions f and ε, there exists y : R → R solution to

3y′⁴y′′y′′′′² − 4y′⁴y′′′²y′′′′ + 6y′³y′′²y′′′y′′′′ + 24y′²y′′⁴y′′′′ − 12y′³y′′y′′′³ − 29y′²y′′³y′′′² + 12y′′⁷ = 0

such that ∀t ∈ R, |y(t) − f(t)| ≤ ε(t).

33 / 41

slide-112
SLIDE 112

Universal differential algebraic equation (DAE)


Theorem (Rubel, 1981)

There exists a fixed polynomial p and k ∈ N such that for any continuous functions f and ε, there exists a solution y : R → R to p(y, y′, . . . , y^(k)) = 0 such that ∀t ∈ R, |y(t) − f(t)| ≤ ε(t).

33 / 41

slide-113
SLIDE 113

Universal differential algebraic equation (DAE)


Theorem (Rubel, 1981)

There exists a fixed polynomial p and k ∈ N such that for any continuous functions f and ε, there exists a solution y : R → R to p(y, y′, . . . , y^(k)) = 0 such that ∀t ∈ R, |y(t) − f(t)| ≤ ε(t).

Problem : this is a «weak» result.

33 / 41

slide-114
SLIDE 114

The problem with Rubel’s DAE

The solution y is not unique, even with added initial conditions :

p(y, y′, . . . , y^(k)) = 0, y(0) = α0, y′(0) = α1, . . . , y^(k)(0) = αk

In fact, this is fundamental for Rubel’s proof to work!

34 / 41

slide-115
SLIDE 115

The problem with Rubel’s DAE

The solution y is not unique, even with added initial conditions :

p(y, y′, . . . , y^(k)) = 0, y(0) = α0, y′(0) = α1, . . . , y^(k)(0) = αk

In fact, this is fundamental for Rubel’s proof to work!
◮ Rubel’s statement : this DAE is universal
◮ More realistic interpretation : this DAE allows almost anything

Open Problem (Rubel, 1981)

Is there a universal ODE y′ = p(y)? Note : explicit polynomial ODE ⇒ unique solution

34 / 41

slide-116
SLIDE 116

Rubel’s proof in one slide

◮ Take f(t) = e^(−1/(1−t^2)) for −1 < t < 1 and f(t) = 0 otherwise.
It satisfies (1 − t^2)^2 f′(t) + 2t f(t) = 0.

(figure : one bump over t)

35 / 41
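A quick numerical sanity check (mine, not from the slides): differentiating the exponent gives f′(t) = −2t(1−t^2)^(−2) f(t), i.e. the first-order relation (1 − t^2)^2 f′(t) + 2t f(t) = 0, which a central difference confirms at a few sample points.

```python
import math

def f(t):
    # the compactly supported bump from the slide
    return math.exp(-1.0 / (1.0 - t * t)) if -1.0 < t < 1.0 else 0.0

def df(t, h=1e-6):
    # central-difference approximation of f'
    return (f(t + h) - f(t - h)) / (2.0 * h)

for t in (-0.7, -0.2, 0.3, 0.8):
    residual = (1.0 - t * t) ** 2 * df(t) + 2.0 * t * f(t)
    assert abs(residual) < 1e-6
```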

slide-117
SLIDE 117

Rubel’s proof in one slide

◮ Take f(t) = e^(−1/(1−t^2)) for −1 < t < 1 and f(t) = 0 otherwise.
It satisfies (1 − t^2)^2 f′(t) + 2t f(t) = 0.
◮ For any a, b, c ∈ R, y(t) = c·f(at + b) satisfies
3y′^4 y′′ y′′′′^2 − 4y′^4 y′′′^2 y′′′′ + 6y′^3 y′′^2 y′′′ y′′′′ + 24y′^2 y′′^4 y′′′′ − 12y′^3 y′′ y′′′^3 − 29y′^2 y′′^3 y′′′^2 + 12y′′^7 = 0

Translation and rescaling : (figure : shifted and rescaled bumps over t)

35 / 41
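Why translation and rescaling preserve solutions: every monomial of Rubel's DAE has algebraic degree 7 and total derivative order 14, so substituting c·f(at + b) multiplies each term by the same factor c^7·a^14 and the equation still holds. A small bookkeeping sketch (monomials written as {derivative order: power}):

```python
# Rubel's DAE, term by term: coefficient plus {derivative order: power}.
terms = [
    (  3, {1: 4, 2: 1, 4: 2}),        #  3 y'^4 y'' y''''^2
    ( -4, {1: 4, 3: 2, 4: 1}),        # -4 y'^4 y'''^2 y''''
    (  6, {1: 3, 2: 2, 3: 1, 4: 1}),  #  6 y'^3 y''^2 y''' y''''
    ( 24, {1: 2, 2: 4, 4: 1}),        # 24 y'^2 y''^4 y''''
    (-12, {1: 3, 2: 1, 3: 3}),        # -12 y'^3 y'' y'''^3
    (-29, {1: 2, 2: 3, 3: 2}),        # -29 y'^2 y''^3 y'''^2
    ( 12, {2: 7}),                    #  12 y''^7
]

for _, m in terms:
    degree = sum(m.values())                              # picks up c^degree
    weight = sum(order * p for order, p in m.items())     # picks up a^weight
    assert (degree, weight) == (7, 14)
```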

slide-118
SLIDE 118

Rubel’s proof in one slide

◮ Take f(t) = e^(−1/(1−t^2)) for −1 < t < 1 and f(t) = 0 otherwise.
It satisfies (1 − t^2)^2 f′(t) + 2t f(t) = 0.
◮ For any a, b, c ∈ R, y(t) = c·f(at + b) satisfies
3y′^4 y′′ y′′′′^2 − 4y′^4 y′′′^2 y′′′′ + 6y′^3 y′′^2 y′′′ y′′′′ + 24y′^2 y′′^4 y′′′′ − 12y′^3 y′′ y′′′^3 − 29y′^2 y′′^3 y′′′^2 + 12y′′^7 = 0
◮ Can glue together arbitrarily many such pieces

(figure : several glued bumps over t)

35 / 41

slide-119
SLIDE 119

Rubel’s proof in one slide

◮ Take f(t) = e^(−1/(1−t^2)) for −1 < t < 1 and f(t) = 0 otherwise.
It satisfies (1 − t^2)^2 f′(t) + 2t f(t) = 0.
◮ For any a, b, c ∈ R, y(t) = c·f(at + b) satisfies
3y′^4 y′′ y′′′′^2 − 4y′^4 y′′′^2 y′′′′ + 6y′^3 y′′^2 y′′′ y′′′′ + 24y′^2 y′′^4 y′′′′ − 12y′^3 y′′ y′′′^3 − 29y′^2 y′′^3 y′′′^2 + 12y′′^7 = 0
◮ Can glue together arbitrarily many such pieces
◮ Can arrange so that f is a solution : piecewise pseudo-linear

(figure : glued bumps tracing a piecewise pseudo-linear curve over t)

35 / 41

slide-120
SLIDE 120

Rubel’s proof in one slide

◮ Take f(t) = e^(−1/(1−t^2)) for −1 < t < 1 and f(t) = 0 otherwise.
It satisfies (1 − t^2)^2 f′(t) + 2t f(t) = 0.
◮ For any a, b, c ∈ R, y(t) = c·f(at + b) satisfies
3y′^4 y′′ y′′′′^2 − 4y′^4 y′′′^2 y′′′′ + 6y′^3 y′′^2 y′′′ y′′′′ + 24y′^2 y′′^4 y′′′′ − 12y′^3 y′′ y′′′^3 − 29y′^2 y′′^3 y′′′^2 + 12y′′^7 = 0
◮ Can glue together arbitrarily many such pieces
◮ Can arrange so that f is a solution : piecewise pseudo-linear

Conclusion : Rubel’s equation allows any piecewise pseudo-linear function, and those are dense in C^0

35 / 41

slide-121
SLIDE 121

Universal initial value problem (IVP)

(figure : curve y1(x))

Theorem

There exists a fixed (vector of) polynomial p such that for any continuous functions f and ε, there exists α ∈ R^d such that y(0) = α, y′(t) = p(y(t)) has a unique solution y : R → R^d and ∀t ∈ R, |y1(t) − f(t)| ≤ ε(t).

36 / 41

slide-122
SLIDE 122

Universal initial value problem (IVP)


Notes : ◮ system of ODEs, ◮ y is analytic, ◮ we need d ≈ 300.

Theorem

There exists a fixed (vector of) polynomial p such that for any continuous functions f and ε, there exists α ∈ R^d such that y(0) = α, y′(t) = p(y(t)) has a unique solution y : R → R^d and ∀t ∈ R, |y1(t) − f(t)| ≤ ε(t).

36 / 41

slide-123
SLIDE 123

Universal initial value problem (IVP)


Notes : ◮ system of ODEs, ◮ y is analytic, ◮ we need d ≈ 300.

Theorem

There exists a fixed (vector of) polynomial p such that for any continuous functions f and ε, there exists α ∈ R^d such that y(0) = α, y′(t) = p(y(t)) has a unique solution y : R → R^d and ∀t ∈ R, |y1(t) − f(t)| ≤ ε(t).

Remark : α is usually transcendental, but computable from f and ε

36 / 41

slide-124
SLIDE 124

A simplified proof

binary stream generator : α ∈ R --ODE--> 0 1 1 0 1 0 1 0 0 1 1 1 . . . (the digits of α, plotted against t)

Note : this is the ideal curve; the real one is an approximation of it.

37 / 41

slide-125
SLIDE 125

A simplified proof

binary stream generator : α ∈ R --ODE--> 0 1 1 0 1 0 1 0 0 1 1 1 . . . digits of α

“Digital” to Analog Converter (fixed frequency) : --ODE--> signal over t
Note : approximate Lipschitz and bounded functions with fixed precision. That’s the trickiest part.

37 / 41

slide-126
SLIDE 126

A simplified proof

binary stream generator : α ∈ R --ODE--> 0 1 1 0 1 0 1 0 0 1 1 1 . . . digits of α

“Digital” to Analog Converter (fixed frequency) : --ODE--> signal over t

Note : we need something more : a fast-growing ODE.

37 / 41

slide-127
SLIDE 127

A simplified proof

binary stream generator : α ∈ R --ODE--> 0 1 1 0 1 0 1 0 0 1 1 1 . . . digits of α

“Digital” to Analog Converter (fixed frequency) : --ODE--> signal over t

Note : we need something more : an arbitrarily fast-growing ODE.

37 / 41

slide-128
SLIDE 128

A less simplified proof

binary stream generator : digits of α ∈ R (figure : unit spikes over t)

f(α, µ, λ, t) = 1/2 + (1/2)·tanh(µ·sin(2απ·4^round(t−1/4, λ) + 4π/3))

It’s horrible, but generable. round is the mysterious rounding function...

38 / 41
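The generable formula above is delicate, but the idea beneath it, that one real number α can carry an entire digit stream, is easy to illustrate. A sketch (mine, not the slides' construction): pack binary digits into base-4 positions so neighbouring digits never interfere, then read them back by shifting.

```python
# The digit stream from the slide, packed into one real number in base 4
# (digits 0/1 only), so the tail after position i contributes < 1/3 < 1.
digits = [0, 1, 1, 0, 1, 0, 1, 0, 0, 1, 1, 1]
alpha = sum(d * 4.0 ** -(i + 1) for i, d in enumerate(digits))

# Reading digit i back: shift by 4^(i+1); the integer part mod 4 is the digit.
recovered = [int(4 ** (i + 1) * alpha) % 4 for i in range(len(digits))]
assert recovered == digits
```

The ODE construction does the same extraction continuously in t, with tanh/sin/round standing in for the shift-and-truncate steps.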

slide-129
SLIDE 129

A less simplified proof

binary stream generator : digits of α ∈ R t 1 1 1 1 t d0 a0 d1 a1 d2 a2 d3 a3 dyadic stream generator : di = mi2−di, ai = 9i +

j<i dj

f(α, γ, t) = sin(2απ2round(t−1/4,γ))) round is the mysterious rounding function...

38 / 41

slide-130
SLIDE 130

A less simplified proof

(figure : binary stream and dyadic stream signals over t)

38 / 41

slide-131
SLIDE 131

A less simplified proof

(figure : binary and dyadic streams; one copy-signal step)

38 / 41

slide-132
SLIDE 132

A less simplified proof

(figure : binary and dyadic streams; two copy-signal steps)

38 / 41

slide-133
SLIDE 133

A less simplified proof

(figure : binary and dyadic streams; three copy-signal steps)

38 / 41

slide-134
SLIDE 134

A less simplified proof

(figure : binary and dyadic streams; four copy-signal steps)

38 / 41

slide-135
SLIDE 135

A less simplified proof

(figure : binary and dyadic streams; four copy-signal steps)

This copy operation is the “non-trivial” part.

38 / 41

slide-136
SLIDE 136

A less simplified proof

We can do almost piecewise constant functions... (figure : staircase signal over t)

38 / 41

slide-137
SLIDE 137

A less simplified proof

We can do almost piecewise constant functions... (figure : staircase signal over t)
◮ ...that are bounded by 1...
◮ ...and have a super slowly changing frequency.

38 / 41

slide-138
SLIDE 138

A less simplified proof

We can do almost piecewise constant functions... (figure : staircase signal over t)
◮ ...that are bounded by 1...
◮ ...and have a super slowly changing frequency.

How do we go to arbitrarily large and growing functions? Can a polynomial ODE even have arbitrary growth?

38 / 41

slide-139
SLIDE 139

An old question on growth

Building a fast-growing ODE that exists over R :

y1′ = y1 ⇒ y1(t) = exp(t)

39 / 41

slide-140
SLIDE 140

An old question on growth

Building a fast-growing ODE that exists over R :

y1′ = y1 ⇒ y1(t) = exp(t)
y2′ = y1y2 ⇒ y2(t) = exp(exp(t))

39 / 41

slide-141
SLIDE 141

An old question on growth

Building a fast-growing ODE that exists over R :

y1′ = y1 ⇒ y1(t) = exp(t)
y2′ = y1y2 ⇒ y2(t) = exp(exp(t))
. . .
yn′ = y1 · · · yn ⇒ yn(t) = exp(· · · exp(t) · · · ) := en(t)

39 / 41
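The cascade is easy to simulate for small t. A sketch with plain Euler (my normalization: yi(0) = 1, which gives the exact solutions y1 = e^t and y2 = e^(e^t − 1), one exponential per level):

```python
import math

def tower(n, t, dt=1e-5):
    """Euler-integrate y1' = y1, y2' = y1*y2, ..., yn' = y1*...*yn from yi(0) = 1."""
    y = [1.0] * n
    for _ in range(int(round(t / dt))):
        prod, dy = 1.0, []
        for yi in y:
            prod *= yi        # running product y1*...*yi
            dy.append(prod)   # yi' = y1*...*yi
        y = [yi + dt * di for yi, di in zip(y, dy)]
    return y

y = tower(2, 1.0)
assert abs(y[0] - math.e) < 1e-3
assert abs(y[1] - math.exp(math.e - 1)) < 1e-2
```

Already at n = 2 the growth is doubly exponential, which is why simulation is only feasible on a short interval.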

slide-142
SLIDE 142

An old question on growth

Building a fast-growing ODE that exists over R :

y1′ = y1 ⇒ y1(t) = exp(t)
y2′ = y1y2 ⇒ y2(t) = exp(exp(t))
. . .
yn′ = y1 · · · yn ⇒ yn(t) = exp(· · · exp(t) · · · ) := en(t)

Conjecture (Emil Borel, 1899)

With n variables, cannot do better than O_t(e_n(A·t^k)).

39 / 41

slide-143
SLIDE 143

An old question on growth

Counter-example (Vijayaraghavan, 1932)

y(t) = 1/(2 − cos(t) − cos(αt)) with α irrational (figure over t) : a sequence of arbitrarily growing spikes.

39 / 41
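The spikes are easy to observe numerically. A sketch (my choice α = √2; the slide leaves α unspecified): sample at t = 2πn, where cos(t) = 1 and the value is 1/(1 − cos(2πn√2)); by equidistribution of n√2 mod 1 the running maximum keeps growing as the range of n grows.

```python
import math

a = math.sqrt(2)  # any suitable irrational; sqrt(2) is convenient
y = lambda t: 1.0 / (2.0 - math.cos(t) - math.cos(a * t))

maxima = []
for bound in (1000, 10000, 100000):
    # largest spike seen among t = 2*pi*n for 1 <= n < bound
    maxima.append(max(y(2.0 * math.pi * n) for n in range(1, bound)))

# the running maximum strictly increases as we look further out
assert maxima[0] < maxima[1] < maxima[2]
```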

slide-144
SLIDE 144

An old question on growth

Counter-example (Vijayaraghavan, 1932)

1 2 − cos(t) − cos(αt) t Sequence of arbitrarily growing spikes. But not good enough for us.

39 / 41

slide-145
SLIDE 145

An old question on growth

Theorem

There exists a polynomial p : R^d → R^d such that for any continuous function f : R+ → R, we can find α ∈ R^d such that y(0) = α, y′(t) = p(y(t)) satisfies y1(t) ≥ f(t), ∀t ≥ 0.

39 / 41

slide-146
SLIDE 146

An old question on growth

Theorem

There exists a polynomial p : R^d → R^d such that for any continuous function f : R+ → R, we can find α ∈ R^d such that y(0) = α, y′(t) = p(y(t)) satisfies y1(t) ≥ f(t), ∀t ≥ 0.

Note : both results require α to be transcendental. Conjecture still open for rational (or algebraic) coefficients.

39 / 41

slide-147
SLIDE 147

Universal initial value problem (IVP)

(figure : curve y1(x))

Theorem

There exists a fixed (vector of) polynomial p such that for any continuous functions f and ε, there exists α ∈ R^d such that y(0) = α, y′(t) = p(y(t)) has a unique solution y : R → R^d and ∀t ∈ R, |y1(t) − f(t)| ≤ ε(t).

40 / 41

slide-148
SLIDE 148

Universal initial value problem (IVP)


Notes : ◮ system of ODEs, ◮ y is analytic, ◮ we need d ≈ 300.

Theorem

There exists a fixed (vector of) polynomial p such that for any continuous functions f and ε, there exists α ∈ R^d such that y(0) = α, y′(t) = p(y(t)) has a unique solution y : R → R^d and ∀t ∈ R, |y1(t) − f(t)| ≤ ε(t).

40 / 41

slide-149
SLIDE 149

Universal initial value problem (IVP)


Notes : ◮ system of ODEs, ◮ y is analytic, ◮ we need d ≈ 300.

Theorem

There exists a fixed (vector of) polynomial p such that for any continuous functions f and ε, there exists α ∈ R^d such that y(0) = α, y′(t) = p(y(t)) has a unique solution y : R → R^d and ∀t ∈ R, |y1(t) − f(t)| ≤ ε(t).

Remark : α is usually transcendental, but computable from f and ε

40 / 41

slide-150
SLIDE 150

Future work

Reaction networks :
◮ chemical : y′ = p(y)
◮ enzymatic : y′ = p(y) + e(t) ?

◮ Finer time complexity (linear)
◮ Nondeterminism
◮ Robustness
◮ « Space » complexity
◮ Other models
◮ Stochastic

41 / 41

slide-151
SLIDE 151

Backup slides

42 / 41

slide-152
SLIDE 152

Complexity of solving polynomial ODEs

y(0) = x, y′(t) = p(y(t)) (figure : two curves from x to y(t), one short and one long)

43 / 41

slide-153
SLIDE 153

Complexity of solving polynomial ODEs

y(0) = x, y′(t) = p(y(t))

Theorem

If y(t) exists, one can compute p, q such that |p/q − y(t)| ≤ 2^(−n) in time

poly(size of x and p, n, ℓ(t))

where ℓ(t) ≈ length of the curve between x and y(t).

(figure : two curves from x to y(t)) length of the curve = complexity = resource

43 / 41
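The "length as resource" can be computed directly on the simplest example (a sketch, mine: y′ = y with y(0) = 1, plain Euler). The phase-space curve length up to time t is ∫₀^t |y′(s)| ds = e^t − 1, so precision over longer times must be paid for in proportion to exponential length, not to t itself.

```python
import math

def arc_length(t, dt=1e-4):
    """Accumulate the curve length of y' = y, y(0) = 1 up to time t."""
    y, length = 1.0, 0.0
    for _ in range(int(round(t / dt))):
        dy = y                   # y' = y
        length += abs(dy) * dt   # accumulate |y'(s)| ds
        y += dy * dt             # Euler step
    return length

assert abs(arc_length(1.0) - (math.e - 1.0)) < 5e-3
```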

slide-154
SLIDE 154

Universal DAE revisited

(figure : curve y1(x))

Theorem

There exists a fixed polynomial p and k ∈ N such that for any continuous functions f and ε, there exist α0, . . . , αk ∈ R such that p(y, y′, . . . , y^(k)) = 0, y(0) = α0, y′(0) = α1, . . . , y^(k)(0) = αk has a unique analytic solution, and this solution satisfies |y(t) − f(t)| ≤ ε(t) for all t.

44 / 41