SLIDE 1

Forecasting Non-Stationary Time Series without Recurrent Connections

AP Engelbrecht

Department of Industrial Engineering and Computer Science Division, Stellenbosch University, South Africa, engel@sun.ac.za

Engelbrecht (Stellenbosch University) Non-Stationary Time Series Forecasting 3 May 2019 1 / 30

SLIDE 2

Presentation Outline I

1. Introduction
2. The Time Series Used
3. Recurrent Neural Networks
4. Dynamic Optimization Problems
5. Particle Swarm Optimization
6. PSO Training of NNs
7. Empirical Analysis
8. Conclusions

SLIDE 3

Introduction

The main goal of this study was to investigate whether recurrent connections or time delays are necessary when training neural networks (NNs) for non-stationary time series prediction using a dynamic particle swarm optimization (PSO) algorithm

Training of the NN is considered a dynamic optimization problem, due to the statistical properties of the time series changing over time

The quantum-inspired PSO (QSO) is a dynamic PSO with the ability to track optima in changing landscapes

SLIDE 4

The Time Series

Plots

International Airline Passengers (AIP), Australian Wine Sales (AWS)

SLIDE 5

The Time Series

Plots (cont)

US Accidental Death (USD), Sunspot Annual Measure (SAM)

SLIDE 6

The Time Series

Plots (cont)

Hourly Internet Traffic (HIT), Daily Minimum Temperature (DMT)

SLIDE 7

The Time Series

Plots (cont)

Mackey Glass (MG), Logistic Map (LM)

SLIDE 8

Feedforward Neural Networks

SLIDE 9

Recurrent Neural Networks

SLIDE 10

Dynamic Optimization Problems

Training of a NN is an optimization problem, with the objective to find the best values for weights and biases such that a given error function is minimized

Forecasting a non-stationary time series is a dynamic optimization process, due to the statistical properties of the time series changing over time

Dynamic optimization problems: search landscape properties change over time

optima change over time, in value and in position
new optima may appear
existing optima may disappear
changes are further characterized by change severity and change frequency
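The idea of a search landscape whose optimum moves over time can be sketched with a toy objective; the function, the drift schedule, and the severity parameter here are hypothetical, chosen only for illustration:

```python
import math

def dynamic_objective(x, t, severity=0.5):
    """A toy dynamic fitness landscape (hypothetical example):
    a parabola whose minimum drifts with time t."""
    optimum = severity * math.sin(0.1 * t)  # optimum position changes over time
    return (x - optimum) ** 2

# The best solution at t=0 is no longer best at t=20:
x_star_0 = 0.0
print(dynamic_objective(x_star_0, t=0))       # 0.0
print(dynamic_objective(x_star_0, t=20) > 0)  # True: the landscape has changed
```

Change severity corresponds to how far the optimum jumps; change frequency to how often `t` advances between evaluations.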

SLIDE 11

Dynamic Optimization Problems (cont)

Implications for optimization algorithms:

Need to adjust the values assigned to decision variables in order to track changing optima, without re-optimizing
For NN training, need to adapt weight and bias values to cope with concept drift, without re-training
Should have the ability to escape local minima
Need to continually inject diversity into the search

SLIDE 12

Particle Swarm Optimization

Introduction

What is particle swarm optimization (PSO)?

a simple, computationally efficient optimization method
population-based, stochastic search
individuals follow very simple behaviors: emulate the success of neighboring individuals, but also bias towards own experience of success
emergent behavior: discovery of optimal regions within a high-dimensional search space

SLIDE 13

Particle Swarm Optimization

Main Components

What are the main components?

a swarm of particles
each particle represents a candidate solution
elements of a particle represent the parameters to be optimized

The search process:

Position updates: x_i(t+1) = x_i(t) + v_i(t+1), with x_ij(0) ~ U(x_min,j, x_max,j)

Velocity (step size): drives the optimization process, and reflects both the experiential knowledge of the particles and socially exchanged information about promising areas in the search space
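The uniform initialization x_ij(0) ~ U(x_min,j, x_max,j) can be sketched as follows; the function name and the per-dimension bound lists are hypothetical:

```python
import random

def init_positions(ns, nx, x_min, x_max):
    """Initialize a swarm of ns particles in nx dimensions,
    with x_ij(0) ~ U(x_min_j, x_max_j) per dimension j."""
    return [[random.uniform(x_min[j], x_max[j]) for j in range(nx)]
            for _ in range(ns)]

# 20 particles in a 3-dimensional search space bounded by [-5, 5]:
swarm = init_positions(ns=20, nx=3, x_min=[-5.0] * 3, x_max=[5.0] * 3)
```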

SLIDE 14

Particle Swarm Optimization

Inertia Weight PSO

uses either the star (gbest PSO) or the social (lbest PSO) topology

velocity update per dimension:

v_ij(t+1) = w v_ij(t) + c1 r1j(t) [y_ij(t) − x_ij(t)] + c2 r2j(t) [ŷ_ij(t) − x_ij(t)], with v_ij(0) = 0

w is the inertia weight
c1, c2 are positive acceleration coefficients
r1j(t), r2j(t) ~ U(0, 1); note that a random number is sampled for each dimension
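The per-dimension velocity update can be sketched as below; the defaults for w, c1, c2 are commonly used values in the PSO literature, not taken from these slides:

```python
import random

def velocity_update(v, x, y, y_hat, w=0.729, c1=1.49445, c2=1.49445):
    """Inertia-weight velocity update, per dimension j:
    v_ij = w*v_ij + c1*r1j*(y_ij - x_ij) + c2*r2j*(yhat_ij - x_ij).
    A fresh random number is sampled for each dimension.
    The w, c1, c2 defaults are common choices, not from the slides."""
    return [w * v[j]
            + c1 * random.random() * (y[j] - x[j])
            + c2 * random.random() * (y_hat[j] - x[j])
            for j in range(len(v))]
```

When the particle already sits on both its personal best and the neighborhood best, the cognitive and social terms vanish and only the inertia term remains.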

SLIDE 15

Particle Swarm Optimization

PSO Algorithm

Create and initialize an nx-dimensional swarm, S;
repeat
    for each particle i = 1, ..., S.ns do
        if f(S.x_i) < f(S.y_i) then
            S.y_i = S.x_i;
        end
        for each particle î with particle i in its neighborhood do
            if f(S.y_i) < f(S.ŷ_î) then
                S.ŷ_î = S.y_i;
            end
        end
    end
    for each particle i = 1, ..., S.ns do
        update the velocity and position;
    end
until stopping condition is true;

SLIDE 16

Particle Swarm Optimization

Quantum-Inspired PSO (QSO)

Developed to find and track an optimum in changing search landscapes

Based on the quantum model of an atom, where orbiting electrons are replaced by a quantum cloud: a probability distribution governing the position of each electron

The swarm contains:

neutral particles, which follow the standard PSO updates
charged, or quantum, particles, randomly placed within a multi-dimensional sphere:

x_i(t+1) = x_i(t) + v_i(t+1)   if Q_i = 0
x_i(t+1) ~ B(ŷ(t), r_cloud)    if Q_i ≠ 0

where Q_i is the charge of particle i, and B(ŷ(t), r_cloud) is the ball of radius r_cloud centered at the global best position ŷ(t)
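The quantum update for charged particles can be sketched as follows; the distribution used inside the cloud varies between QSO variants, so uniform sampling in the ball is an assumption here, as is the function name:

```python
import math
import random

def quantum_position(y_hat, r_cloud):
    """Place a charged (quantum) particle uniformly at random inside the
    ball B(y_hat, r_cloud) centered at the global best position.
    Uniform-in-ball sampling is an assumption; QSO variants differ here."""
    nx = len(y_hat)
    # sample a direction uniformly on the unit sphere...
    d = [random.gauss(0.0, 1.0) for _ in range(nx)]
    norm = math.sqrt(sum(dj * dj for dj in d))
    # ...then a radius so that points are uniform in the ball's volume
    r = r_cloud * random.random() ** (1.0 / nx)
    return [y_hat[j] + r * d[j] / norm for j in range(nx)]
```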

SLIDE 17

Particle Swarm Optimization

Cooperative PSO

For large-scale optimization problems, a divide-and-conquer approach to address the curse of dimensionality:

Each particle is split into K separate parts of smaller dimension
Each part is then optimized using a separate sub-swarm
If K = nx, each dimension is optimized by a separate sub-swarm
Cooperative quantum PSO (CQSO) uses QSO in the sub-swarms
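The decomposition step, splitting nx decision variables into K parts of near-equal size with one part per sub-swarm, can be sketched as follows (the function name is hypothetical):

```python
def split_dimensions(nx, K):
    """Split nx decision variables into K contiguous parts of
    near-equal size, one part per cooperative sub-swarm."""
    base, extra = divmod(nx, K)
    parts, start = [], 0
    for k in range(K):
        size = base + (1 if k < extra else 0)  # spread the remainder
        parts.append(list(range(start, start + size)))
        start += size
    return parts

# e.g. 10 weights split across 3 sub-swarms:
print(split_dimensions(10, 3))  # [[0, 1, 2, 3], [4, 5, 6], [7, 8, 9]]
```

With K = nx each part is a single dimension, recovering the per-dimension sub-swarm case on the slide.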

SLIDE 18

PSO Training of NNs

When using PSO to train a NN:

each particle represents the weights and biases of one NN
the objective function is a cost function, e.g. the SSE
to prevent hidden unit saturation, use ReLU in the hidden units, and any activation function in the output units

For non-stationary time series prediction:

used cooperative PSO with QSO in the sub-swarms
RNNs used the modified hyperbolic tangent: f(net) = 1.7159 tanh(net/3)
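Decoding a particle into NN weights and scoring it with the SSE might look like the sketch below; the network shape (one input, ReLU hidden units, one linear output) and the layout of the particle vector are hypothetical choices made only for illustration:

```python
def decode_and_sse(particle, inputs, targets, n_hidden):
    """Decode a flat particle into the weights of a 1-input, 1-output
    feedforward NN with n_hidden ReLU hidden units, then return the SSE.
    The particle layout (hidden weights, hidden biases, output weights,
    output bias) is a hypothetical convention for this sketch."""
    wh = particle[:n_hidden]                      # input-to-hidden weights
    bh = particle[n_hidden:2 * n_hidden]          # hidden biases
    wo = particle[2 * n_hidden:3 * n_hidden]      # hidden-to-output weights
    bo = particle[-1]                             # output bias
    sse = 0.0
    for x, t in zip(inputs, targets):
        h = [max(0.0, wh[j] * x + bh[j]) for j in range(n_hidden)]  # ReLU
        out = sum(wo[j] * h[j] for j in range(n_hidden)) + bo       # linear output
        sse += (t - out) ** 2
    return sse
```

A PSO would then minimize `decode_and_sse` over particles of length 3 * n_hidden + 1.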

SLIDE 19

Control Parameters

SLIDE 20

Dynamic Scenarios

SLIDE 21

Performance Measure

Used the collective mean error,

F_mean = (1/T) Σ_{t=1}^{T} F(t)

where F(t) is the MSE at time t

Number of independent runs: 30
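The collective mean error is simply the average of the MSE values recorded over the run, e.g.:

```python
def collective_mean_error(mse_over_time):
    """Collective mean error: F_mean = (1/T) * sum over t of F(t),
    where F(t) is the MSE recorded at time step t."""
    T = len(mse_over_time)
    return sum(mse_over_time) / T

print(collective_mean_error([1.0, 2.0, 3.0]))  # 2.0
```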

SLIDE 22

Results

MG (Mackey Glass)

SLIDE 23

Results

HIT (Hourly Internet Traffic)

SLIDE 24

Results

DMT (Daily Minimum Temperature)

SLIDE 25

Results

SAM (Sunspot Annual Measure)

SLIDE 26

Results

LM (Logistic Map)

SLIDE 27

Results

AWS (Australian Wine Sales)

SLIDE 28

Results

AIP (International Airline Passengers)

SLIDE 29

Results

USD (US Accidental Death)

SLIDE 30

Conclusions

The aim of the study was to investigate whether recurrent connections or delays are necessary if a dynamic PSO is used to train a NN for time series prediction

Main observation: a FFNN trained with the cooperative quantum PSO performed better than the RNNs used, for most problems and scenarios

Where the CQSO-trained FFNN did not perform best, the differences in performance were not statistically significant

Future work will:

expand the study to other variants of recurrent NNs, and to more time series
develop dynamic architecture optimization approaches
