Solving Differential Equations through Means of Deep Learning - PowerPoint Presentation


SLIDE 1

Solving Differential Equations through Means of Deep Learning

Juliane Braunsmann

February 8, 2019

SLIDE 2

Table of Contents

1. Neural Networks
2. Introduction to machine learning applied to PDEs
3. Learning with a residual loss...
   ... and hard assignment of constraints
   ... and soft assignment of constraints
4. Summary


SLIDE 4

Machine Learning

Machine learning discovers rules to execute a data-processing task, given examples of what’s expected.

— François Chollet, Deep Learning with Python [Cho17]

We require three things:

▶ Input data points, e.g. images for image tagging, speech for speech recognition
▶ Examples of the expected output, e.g. tagged images, transcribed audio files
▶ A way to measure if the algorithm is doing a good job, i.e. to determine whether the output of the algorithm is close to the expected output; this is what enables learning


SLIDE 5

Formalization

Training data, consisting of pairs of input data and expected output:

{(x₁, y₁), …, (x_N, y_N)} ⊆ 𝒴 × 𝒵,

assumed to be i.i.d. samples (observations) from an unknown probability distribution P.

Goal: given a new input x ∈ 𝒴, predict an output ŷ ∈ 𝒵, i.e. find a prediction function f: x ↦ ŷ.

Assumption: (x, y) is another independent observation of P.

To measure whether the prediction function is doing a good job, we define a loss function L: 𝒵 × 𝒵 → ℝ, where L(y, ŷ) measures how close the expected output y is to the predicted output ŷ.
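This setup can be made concrete in a small sketch; the data points and the identity predictor below are made up for illustration only:

```python
# A toy instance of the formal setup: input space Y = R, output space Z = R.
# Training data: pairs (x_i, y_i), thought of as i.i.d. samples from an unknown P.
training_data = [(0.0, 0.1), (1.0, 0.9), (2.0, 2.1), (3.0, 2.9)]

def f(x):
    """A candidate prediction function f: x -> y_hat (here simply the identity)."""
    return x

def L(y, y_hat):
    """Squared loss: measures how close the expected output y is to y_hat."""
    return (y - y_hat) ** 2

# Evaluate the loss on one new observation (x, y).
x, y = 4.0, 4.2
print(L(y, f(x)))  # a small value, so f is doing a good job here
```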


SLIDE 6

Typical losses

Some typical loss functions are:

squared loss L(y, ŷ) = |y − ŷ|² for 𝒵 = ℝ,
zero-one loss L(y, ŷ) = 𝟙_{y ≠ ŷ} for arbitrary 𝒵,
cross-entropy loss L(y, ŷ) = −(y log(ŷ) + (1 − y) log(1 − ŷ)) for 𝒵 = [0, 1].
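These three losses are direct to implement; a minimal sketch, reading the zero-one loss as the indicator that y and ŷ differ:

```python
import math

def squared_loss(y, y_hat):
    # For Z = R: the squared distance |y - y_hat|^2.
    return (y - y_hat) ** 2

def zero_one_loss(y, y_hat):
    # For arbitrary Z: 1 if the prediction differs from the expected output, else 0.
    return 0 if y == y_hat else 1

def cross_entropy_loss(y, y_hat):
    # For Z = [0, 1]; y_hat must lie strictly inside (0, 1).
    return -(y * math.log(y_hat) + (1 - y) * math.log(1 - y_hat))

print(squared_loss(1.0, 0.5))        # 0.25
print(zero_one_loss("cat", "dog"))   # 1
print(cross_entropy_loss(1.0, 0.9))  # about 0.105
```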


SLIDE 7

Average loss

The quality of a prediction function is given by

E(f) = 𝔼_P[L(y, f(x))] = ∫_{𝒴×𝒵} L(y, f(x)) dP(x, y) ≈ (1/N) ∑_{i=1}^N L(yᵢ, f(xᵢ)) = ∑_{i=1}^N Eᵢ(f),

where Eᵢ(f) = (1/N) L(yᵢ, f(xᵢ)). Then find f∗ = arg min_f E(f), where the type of f is determined by the applied prediction algorithm.
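The sample average on the right-hand side is what one computes in practice; a minimal sketch of the approximation, with a crude arg min over two made-up candidate functions:

```python
def empirical_risk(f, data, L):
    """Sample-mean approximation of E(f) = E_P[L(y, f(x))]."""
    return sum(L(y, f(x)) for x, y in data) / len(data)

data = [(0.0, 0.0), (1.0, 2.0), (2.0, 4.0)]  # generated by y = 2x
squared = lambda y, y_hat: (y - y_hat) ** 2

# Pick the better of two candidate prediction functions
# (a crude arg min over a two-element function class).
candidates = [lambda x: x, lambda x: 2 * x]
best = min(candidates, key=lambda f: empirical_risk(f, data, squared))
print(empirical_risk(best, data, squared))  # 0.0, since f(x) = 2x fits exactly
```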


SLIDE 8

Machine learning techniques

There are many different types of prediction functions in machine learning:

▶ decision trees
▶ support vector machines
▶ naive Bayes classifiers
▶ k-nearest neighbor algorithm
▶ neural networks, specifically deep learning



SLIDE 10

Deep Learning

Deep learning is a specific subfield of machine learning: a new take on learning representations from data that puts an emphasis on learning successive layers of increasingly meaningful representations.

— François Chollet, Deep Learning with Python [Cho17]


SLIDE 11

Feedforward Neural Network¹

[Figure: a single neuron with inputs x₁, …, xₙ, weights w₁, …, wₙ and bias b, computing 𝜏(∑ᵢ₌₁ⁿ wᵢxᵢ + b)]

¹https://github.com/PetarV-/TikZ/tree/master/Multilayer%20perceptron
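The neuron in the figure computes exactly 𝜏(∑ᵢ wᵢxᵢ + b); a minimal sketch, using tanh as the activation 𝜏:

```python
import math

def neuron(x, w, b, tau=math.tanh):
    """A single neuron: activation tau applied to the weighted sum of inputs plus bias."""
    return tau(sum(w_i * x_i for w_i, x_i in zip(w, x)) + b)

# With zero weights and zero bias the output is tau(0) = tanh(0) = 0.
print(neuron([1.0, 2.0, 3.0], [0.0, 0.0, 0.0], 0.0))  # 0.0
```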


SLIDE 12

Feedforward Neural Network¹

[Figure: the same neuron computing 𝜏(∑ᵢ₌₁ⁿ wᵢxᵢ + b), now shown as part of a multilayer perceptron with input layer (I₁, I₂, I₃), a hidden layer, and output layer (O₁, O₂)]


SLIDE 13

Typical activation functions

Some typical activation functions are:

Sigmoid function: 𝜏(z) = 1 / (1 + exp(−z))
Tanh function: 𝜏(z) = (exp(z) − exp(−z)) / (exp(z) + exp(−z))
ReLU function: 𝜏(z) = max(z, 0)
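All three are one-liners in code; a minimal sketch:

```python
import math

def sigmoid(z):
    # Squashes the real line into (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def tanh(z):
    # Squashes the real line into (-1, 1).
    return (math.exp(z) - math.exp(-z)) / (math.exp(z) + math.exp(-z))

def relu(z):
    # Passes positive inputs through, clips negative ones to 0.
    return max(z, 0.0)

print(sigmoid(0.0))  # 0.5
print(tanh(0.0))     # 0.0
print(relu(-3.0))    # 0.0
```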


SLIDE 14

Typical activation functions

[Figure: graphs of the sigmoid, tanh, and (scaled) ReLU activation functions, with values between −1 and 1]


SLIDE 15

Formalization of (feed-forward) neural network

Given an input vector zˡ ∈ ℝⁿ, the output of the fully connected layer l + 1 is

(𝜏(∑_{i=1}^n wˡ_{i,j} zᵢ + bˡ_j))_{j=1,…,m} ∈ ℝᵐ.

Such layers can be concatenated, yielding a deep neural network parametrized by weight matrices Wˡ and bias vectors bˡ for each layer. We denote these parameters by 𝜄 and the corresponding neural network by f𝜄. We write

E(𝜄) = ∑_{i=1}^N Eᵢ(𝜄) = (1/N) ∑_{i=1}^N L(yᵢ, f𝜄(xᵢ)).
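As a sketch, a fully connected layer and the concatenation of such layers can be written in plain Python; the weights below are illustrative, not trained:

```python
import math

def dense_layer(z, W, b, tau=math.tanh):
    """Fully connected layer: output_j = tau(sum_i W[j][i] * z[i] + b[j])."""
    return [tau(sum(w_ji * z_i for w_ji, z_i in zip(row, z)) + b_j)
            for row, b_j in zip(W, b)]

def mlp(x, layers):
    """Concatenate dense layers; `layers` is a list of (W, b) parameter pairs."""
    z = x
    for W, b in layers:
        z = dense_layer(z, W, b)
    return z

# A network mapping R^2 -> R^1 through one hidden layer of width 2.
params = [([[1.0, 0.0], [0.0, 1.0]], [0.0, 0.0]),  # hidden layer (W1, b1)
          ([[0.5, 0.5]], [0.0])]                   # output layer (W2, b2)
print(mlp([0.0, 0.0], params))  # [0.0], since tanh(0) = 0 at every layer
```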


SLIDE 16

Training loop

[Figure: the machine learning training loop. Source: [Cho17]]
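The loop in the figure (predict, measure the loss, update the parameters) can be sketched for a one-parameter model fitted by gradient descent; the data and model here are made up for illustration:

```python
# A minimal training loop for a one-parameter model f_w(x) = w * x,
# minimizing the empirical squared loss by gradient descent.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # generated by y = 2x
w = 0.0
learning_rate = 0.05

for epoch in range(100):
    # Gradient of (1/N) * sum_i (y_i - w * x_i)^2 with respect to w.
    grad = sum(-2 * x * (y - w * x) for x, y in data) / len(data)
    w -= learning_rate * grad  # parameter update step

print(round(w, 3))  # converges toward 2.0
```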
