SLIDE 1

Model Structure Selection

Tartu 2008

slide-2
SLIDE 2

Neuron

  • Takes a number of inputs
  • Processes them
  • Uses the result in an activation function

http://www.bordalierinstitute.com/images/ Neuron.JPG
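The three bullet points can be sketched as a single artificial neuron: a weighted sum of the inputs passed through an activation function. The slide does not name the activation, so tanh is an assumption here, and the weights are arbitrary illustrative values:

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs plus a bias ...
    s = bias + sum(w * x for w, x in zip(weights, inputs))
    # ... passed through an activation function (tanh assumed)
    return math.tanh(s)

y = neuron([1.0, -2.0], [0.5, 0.25], bias=0.1)
```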

SLIDE 3

Neural network

  • Simple neural network
  • Fully connected two-layer feedforward network

  • Input layer
  • Hidden layer
  • Output layer
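The layer structure above can be sketched as a fully connected two-layer feedforward pass: input layer -> hidden layer (tanh assumed as activation) -> linear output layer. Weight values in the usage line are arbitrary:

```python
import math

def mlp(x, W1, b1, W2, b2):
    # Hidden layer: each unit is a tanh of a weighted sum of the inputs.
    hidden = [math.tanh(b + sum(w * xi for w, xi in zip(row, x)))
              for row, b in zip(W1, b1)]
    # Output layer: linear combination of the hidden-unit outputs.
    return [b + sum(w * h for w, h in zip(row, hidden))
            for row, b in zip(W2, b2)]

out = mlp([0.5, -1.0], W1=[[0.4, 0.1], [0.2, -0.3]], b1=[0.0, 0.1],
          W2=[[1.0, -1.0]], b2=[0.05])
```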
SLIDE 4

Recurrent networks

  • Outputs of the hidden units are fed back as inputs to the network
  • Time delay
  • The networks considered later do not have internal feedback
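The feedback with a time delay can be sketched as one recurrent step: the hidden outputs from the previous time step re-enter the network alongside the current input. All weight values below are arbitrary illustrative choices:

```python
import math

def recurrent_step(x, h_prev, W_in, W_fb, b):
    # Each hidden unit sees the current input x and, through a unit
    # time delay, the previous hidden outputs h_prev.
    return [math.tanh(bi
                      + sum(w * xi for w, xi in zip(wi, x))
                      + sum(w * hj for w, hj in zip(wf, h_prev)))
            for wi, wf, bi in zip(W_in, W_fb, b)]

h = [0.0, 0.0]                      # initial hidden state
for x in ([1.0], [0.5], [-1.0]):    # input sequence
    h = recurrent_step(x, h, W_in=[[0.3], [0.2]],
                       W_fb=[[0.1, 0.0], [0.0, 0.1]], b=[0.0, 0.0])
```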

SLIDE 5

System identification

  • Goals
    – High degree of automation
    – Numerical reliability
    – Computational efficiency
  • Experiment
  • Select model structure
  • Estimate model
  • Validate model
SLIDE 6

Experiment

  • Purpose
    – Data collection
    – Varying input(s) to observe the impact on outputs
  • Main issues
    – Choice of sampling frequency
    – Design of suitable input signal
    – Preprocessing of data (noise removal, nonlinearity tests, disturbances)

SLIDE 7

Estimate & validate model

  • Neural network community -> training or learning
  • Picking the model
  • Checking the requirements
SLIDE 8

Used terms

  • Poles and zeros
    – Consequences
  • Linear model structures
  • Linearity:

y(t) = G(q^-1)u(t) + H(q^-1)e(t)

  • One-step-ahead prediction:

ŷ(t|t-1) = H^-1(q^-1)G(q^-1)u(t) + [1 - H^-1(q^-1)]y(t)

  • True system:

y(t) = G0(q^-1)u(t) + H0(q^-1)e0(t)
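To make the predictor formula concrete, consider a first-order case (my example, not from the slides) with G(q^-1) = b1*q^-1 / (1 + a1*q^-1) and H(q^-1) = 1 / (1 + a1*q^-1). Then H^-1*G = b1*q^-1 and 1 - H^-1 = -a1*q^-1, so the one-step-ahead prediction reduces to:

```python
def one_step_ahead(y_prev, u_prev, a1, b1):
    # yhat(t|t-1) = H^-1(q^-1)G(q^-1)u(t) + [1 - H^-1(q^-1)]y(t)
    # which, for this first-order choice of G and H, becomes
    # yhat(t|t-1) = b1*u(t-1) - a1*y(t-1)
    return b1 * u_prev - a1 * y_prev
```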

SLIDE 9

Used terms vol 2

  • Model structure
  • Predictor form:
SLIDE 10

Used terms vol 3

  • Basic requirement:
  • A model is simply a particular choice of parameter vector

SLIDE 11

Another form of a model

  • Rewriting general model structure as:
SLIDE 12

The Finite Impulse Response (FIR)

  • Simplest type, let’s choose:
  • The predictor:
  • The parameter vector
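The extracted slide lost the formulas, but the standard FIR predictor is a plain linear regression on past inputs: yhat(t) = b1*u(t-1) + ... + bm*u(t-m), with parameter vector theta = [b1, ..., bm]. A minimal sketch (function name my own):

```python
def fir_predict(theta, u_past):
    # theta  = [b1, ..., bm]          (parameter vector)
    # u_past = [u(t-1), ..., u(t-m)]  (regression vector)
    # yhat(t) = dot product of the two
    return sum(b * u for b, u in zip(theta, u_past))
```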
SLIDE 13

ARX model structure (AutoRegressive, eXternal input)

  • The model corresponds to the choice:
  • The predictor takes the form:
  • The parameter vector is:
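The ARX predictor (textbook form; the slide's own formulas were lost in extraction) is linear in the parameters: the regression vector stacks negated past outputs and past inputs, and the prediction is its inner product with theta = [a1, ..., a_na, b1, ..., b_nb]:

```python
def arx_predict(theta, y_past, u_past):
    # Regression vector: [-y(t-1), ..., -y(t-na), u(t-1), ..., u(t-nb)]
    phi = [-y for y in y_past] + list(u_past)
    # yhat(t) = phi^T theta
    return sum(p * t for p, t in zip(phi, theta))
```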
SLIDE 14

ARMAX model structure

(AutoRegressive, Moving Average, external input)

  • Corresponds to the choice:
  • The predictor takes the form:
  • The parameter vector is:
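For ARMAX the predictor additionally regresses on past prediction errors eps(t-k) = y(t-k) - yhat(t-k), weighted by the moving-average parameters. A sketch under the standard textbook parameterization (function and argument names my own):

```python
def armax_predict(theta_a, theta_b, theta_c, y_past, u_past, eps_past):
    # yhat(t) = -a1*y(t-1) - ... + b1*u(t-1) + ... + c1*eps(t-1) + ...
    return (sum(-a * y for a, y in zip(theta_a, y_past))
            + sum(b * u for b, u in zip(theta_b, u_past))
            + sum(c * e for c, e in zip(theta_c, eps_past)))
```

Because the errors eps depend on earlier predictions, this is a pseudo-linear regression rather than a plain one.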
SLIDE 15

Output error model structure (OE)

  • Used when the only noise is white measurement noise
  • Corresponding to the choice of G and H

SLIDE 16

Output error model structure continues

  • Regression vector
  • Parameter vector
SLIDE 17

The State Space Innovations Form (SSIF)

  • Widely used alternative
  • Assume that system is described:
  • Optimal one-step ahead predictor:
SLIDE 18

SSIF

  • Also known as the Kalman filter
  • The matrix is known as the Kalman gain
  • Poles of the predictor:
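One step of the innovations-form predictor can be sketched as follows (scalar input and output for simplicity; the slide's own matrix symbols were lost, so A, B, C, K are the usual state-space names):

```python
def ssif_step(xhat, u, y, A, B, C, K):
    # yhat(t) = C xhat(t)
    yhat = sum(c * x for c, x in zip(C, xhat))
    # Innovation: eps(t) = y(t) - yhat(t)
    eps = y - yhat
    # State update: xhat(t+1) = A xhat(t) + B u(t) + K eps(t),
    # where K is the Kalman gain; the predictor poles are the
    # eigenvalues of A - K C.
    xhat_next = [sum(a * x for a, x in zip(row, xhat)) + b * u + k * eps
                 for row, b, k in zip(A, B, K)]
    return xhat_next, yhat
```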
SLIDE 19

Nonlinear Model Structures Based on Neural Networks

  • Selecting a model structure is more complicated
  • Family of model structures -> MLP networks
    – With this choice, two issues:
      • Inputs to the network
      • Internal network architecture
    – Often used approach: reusing input structures from linear models
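Reusing a linear input structure amounts to building, say, the ARX regression vector and feeding it to a network instead of a linear map. A minimal sketch (the tiny one-unit "network" is hypothetical, only there to make the example runnable):

```python
import math

def nnarx_predict(net, y_past, u_past):
    # NNARX: reuse the ARX regression vector
    # [y(t-1), ..., y(t-na), u(t-1), ..., u(t-nb)]
    # as the input of a neural network.
    phi = list(y_past) + list(u_past)
    return net(phi)

# Hypothetical one-unit "network", standing in for a trained MLP.
def tiny_net(phi):
    return math.tanh(sum(0.1 * v for v in phi))

yhat = nnarx_predict(tiny_net, [0.9, 0.8], [0.5])
```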

SLIDE 20

Nonlinear Model Structures Based on Neural Networks

  • Structural decisions are reasonable to handle
  • Suitable to design control systems
SLIDE 21

NNFIR and NNARX

  • Predictors are stable
  • This is important, as the stability issue is much more complex than for linear systems
  • Well suited when the system is deterministic and the noise level is insignificant
SLIDE 22

NNARMAX

  • Despite the feedforward network, the predictor has feedback
  • Regressors:
  • Recurrent network
  • Stability is a local property

SLIDE 23

NNOE

  • Some regressors are the predictions of past outputs

  • Has the same problems as NNARMAX
SLIDE 24

NNSSIF

  • Like NNOE and NNARMAX
  • Same problems with NNSSIF as with SSIF
    – Problem solving: two separate networks

SLIDE 25

Stability

  • Very important in control theory
  • Necessary that the control system is stable

  • Stability during training
  • Stability
SLIDE 26

Asymptotic stability

SLIDE 27

Exponential stability