Extended Kalman Filter: Nonlinear State Space Model Derivation




Nonlinear Functions

$f_n(x_n)$, $g_n(x_n)$, and $h_n(x_n)$ are multivariate, time-varying functions:

$$f_n(x_n) = \begin{bmatrix} f_{n,1}(x_n) \\ f_{n,2}(x_n) \\ \vdots \\ f_{n,\ell}(x_n) \end{bmatrix} \qquad
g_n(x_n) = \begin{bmatrix} g_{n,11}(x_n) & \cdots & g_{n,1m}(x_n) \\ g_{n,21}(x_n) & \cdots & g_{n,2m}(x_n) \\ \vdots & \ddots & \vdots \\ g_{n,\ell 1}(x_n) & \cdots & g_{n,\ell m}(x_n) \end{bmatrix} \qquad
h_n(x_n) = \begin{bmatrix} h_{n,1}(x_n) \\ h_{n,2}(x_n) \\ \vdots \\ h_{n,p}(x_n) \end{bmatrix}$$

J. McNames, Portland State University, ECE 539/639 Extended Kalman Filter, Ver. 1.02

Extended Kalman Filter

  • Derivation
  • Example application: frequency tracking

Taylor Series Approximations

Now consider a first-order Taylor expansion of $f_n(x)$ and $h_n(x)$ about $\hat x_n$:

$$f_n(x_n) \approx f_n(\hat x_n) + F_n (x_n - \hat x_n)$$
$$h_n(x_n) \approx h_n(\hat x_n) + H_n (x_n - \hat x_n)$$
$$g_n(x_n) \approx g_n(\hat x_n)$$

where the matrices $F_n$ and $H_n$ are the Jacobians of $f_n(x)$ and $h_n(x)$ evaluated at $\hat x_n$:

$$\underset{\ell\times\ell}{F_n} = \nabla_x f_n(x) = \left.\frac{\partial f_n(x)}{\partial x}\right|_{x=\hat x_n} \qquad
\underset{\ell\times m}{G_n} = g_n(\hat x_n) \qquad
\underset{p\times\ell}{H_n} = \nabla_x h_n(x) = \left.\frac{\partial h_n(x)}{\partial x}\right|_{x=\hat x_n}$$
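The first-order approximation error can be checked numerically. The sketch below (Python; the two-state $f_n$ is a hypothetical stand-in, not taken from the slides) evaluates the Taylor approximation as the state residual $(x_n - \hat x_n)$ shrinks:

```python
import numpy as np

# Hypothetical 2-state nonlinearity (illustration only): f(x) = [x1 + sin(x2), 0.9*x2]
def f(x):
    return np.array([x[0] + np.sin(x[1]), 0.9 * x[1]])

def jacobian_f(x):
    # Analytic Jacobian of the hypothetical f at x
    return np.array([[1.0, np.cos(x[1])],
                     [0.0, 0.9]])

x_hat = np.array([0.5, 0.2])              # linearization point
F = jacobian_f(x_hat)

for r in (1e-1, 1e-2, 1e-3):              # size of the state residual (x - x_hat)
    x = x_hat + r * np.array([1.0, -1.0])
    approx = f(x_hat) + F @ (x - x_hat)   # first-order Taylor approximation
    err = np.linalg.norm(f(x) - approx)
    print(f"residual {r:g}: approximation error {err:.2e}")
```

The error shrinks roughly quadratically with the residual, which is why the approximation is most accurate when the residual is small.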


Nonlinear State Space Model

Consider the nonlinear state-space model of a random process

$$x_{n+1} = f_n(x_n) + g_n(x_n)\, u_n$$
$$y_n = h_n(x_n) + v_n$$

where

$$\left\langle \begin{bmatrix} u_n \\ v_n \\ x_0 - \bar x_0 \end{bmatrix},
\begin{bmatrix} u_m \\ v_m \\ x_0 - \bar x_0 \\ 1 \end{bmatrix} \right\rangle =
\begin{bmatrix} Q_n \delta_{n,m} & 0 & 0 & 0 \\ 0 & R_n \delta_{n,m} & 0 & 0 \\ 0 & 0 & \Pi_0 & 0 \end{bmatrix}$$

This assumes no interaction between the process and measurement noise, $\langle u_n, v_m \rangle = 0$.



Affine State Space Approximation

$$x_{n+1} \approx f_n(\hat x_n) + F_n(x_n - \hat x_n) + G_n u_n
= F_n x_n + \underbrace{\left[f_n(\hat x_n) - F_n \hat x_n\right]}_{\text{requires estimation}} + G_n u_n$$

$$y_n \approx h_n(\hat x_n) + H_n(x_n - \hat x_n) + v_n
= H_n x_n + \underbrace{\left[h_n(\hat x_n) - H_n \hat x_n\right]}_{\text{requires estimation}} + v_n$$

  • The performance of the extended Kalman filter depends most critically on the accuracy of the Taylor series approximations
  • The approximations are most accurate when the state residual $(x_n - \hat x_n)$ is minimized
  • The EKF simply uses the best estimates that are available at the time that the residuals must be calculated


Taylor Series Approximations: Linearization Point

  • The Taylor series expansion could be about different estimates of $x_n$ for each of the three nonlinear functions
    – $\hat x_{n|n}$ is usually used for $f_n(\cdot)$ and $g_n(\cdot)$
    – $\hat x_{n|n-1}$ is usually used for $h_n(\cdot)$
    – The best possible estimate should be used in all cases to minimize the error in the Taylor series approximations
  • Note that, even if the estimates $\hat x_n$ and $\hat y_{n-1}$ are unbiased, the estimates $\hat x_{n+1|n}$ and $\hat y_{n|n-1}$ will be biased


Accounting for the Nonzero Mean

$$x_{n+1} \approx f_n(\hat x_n) + F_n(x_n - \hat x_n) + G_n u_n$$
$$y_n \approx h_n(\hat x_n) + H_n(x_n - \hat x_n) + v_n$$

  • The Taylor series approximation results in an affine model
  • There are two key differences between this model and the linear model
    – $x_n$ and $y_n$ have nonzero means
    – The coefficients are now random variables, not constants
  • The typical approach to converting this to our linear model is to subtract the means from both sides


Jacobians

In expanded form, the Jacobians are given as follows:

$$\nabla_x f_n(x) = \frac{\partial f_n(x)}{\partial x} = \begin{bmatrix}
\frac{\partial f_{n,1}(x)}{\partial x(1)} & \frac{\partial f_{n,1}(x)}{\partial x(2)} & \cdots & \frac{\partial f_{n,1}(x)}{\partial x(\ell)} \\
\frac{\partial f_{n,2}(x)}{\partial x(1)} & \frac{\partial f_{n,2}(x)}{\partial x(2)} & \cdots & \frac{\partial f_{n,2}(x)}{\partial x(\ell)} \\
\vdots & \vdots & \ddots & \vdots \\
\frac{\partial f_{n,\ell}(x)}{\partial x(1)} & \frac{\partial f_{n,\ell}(x)}{\partial x(2)} & \cdots & \frac{\partial f_{n,\ell}(x)}{\partial x(\ell)}
\end{bmatrix}$$

$$\nabla_x h_n(x) = \frac{\partial h_n(x)}{\partial x} = \begin{bmatrix}
\frac{\partial h_{n,1}(x)}{\partial x(1)} & \frac{\partial h_{n,1}(x)}{\partial x(2)} & \cdots & \frac{\partial h_{n,1}(x)}{\partial x(\ell)} \\
\frac{\partial h_{n,2}(x)}{\partial x(1)} & \frac{\partial h_{n,2}(x)}{\partial x(2)} & \cdots & \frac{\partial h_{n,2}(x)}{\partial x(\ell)} \\
\vdots & \vdots & \ddots & \vdots \\
\frac{\partial h_{n,p}(x)}{\partial x(1)} & \frac{\partial h_{n,p}(x)}{\partial x(2)} & \cdots & \frac{\partial h_{n,p}(x)}{\partial x(\ell)}
\end{bmatrix}$$
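When these Jacobians are coded by hand, a finite-difference check is a useful sanity test. The Python sketch below uses a hypothetical $h$ with $p = 1$ outputs and $\ell = 2$ states, loosely modeled on the frequency-tracking example later in the deck (the constants `a` and `c` are arbitrary):

```python
import numpy as np

a, c = 1.0, 0.3                    # arbitrary amplitude and phase offset

def h(x):
    # Hypothetical output function: one output, two states (second state unused)
    return np.array([a * np.sin(c + x[0])])

def jacobian_h(x):
    # Rows index outputs h_{n,i}; columns index states x(j)
    return np.array([[a * np.cos(c + x[0]), 0.0]])

def numeric_jacobian(func, x, eps=1e-6):
    # Central-difference approximation of the Jacobian at x
    J = np.zeros((func(x).size, x.size))
    for j in range(x.size):
        dx = np.zeros_like(x)
        dx[j] = eps
        J[:, j] = (func(x + dx) - func(x - dx)) / (2 * eps)
    return J

x0 = np.array([0.4, 0.1])
print(np.max(np.abs(jacobian_h(x0) - numeric_jacobian(h, x0))))  # small: finite-difference error only
```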



State Mean

The expected value of $x_{n+1}$ is

$$E[x_{n+1}] = F_n E[x_n] + E[f_n(\hat x_n)] - F_n E[\hat x_n] + G_n E[u_n]$$
$$\mu_{n+1} = F_n \mu_n + E[f_n(\hat x_n)] - F_n E[\hat x_n] = F_n\left(\mu_n - E[\hat x_n]\right) + E[f_n(\hat x_n)]$$

Thus we obtain a recursive update equation for the state mean. Again, if we assume that $\hat x_n$ is nearly deterministic,

$$E[\hat x_n] \approx \hat x_n \qquad E[f_n(\hat x_n)] \approx f_n(\hat x_n)$$

The mean recursion then simplifies to

$$\mu_{n+1} = F_n\left(\mu_n - \hat x_n\right) + f_n(\hat x_n)$$


Redefining the Normal Equations

The optimal estimate of the state is now defined as the optimal affine transformation of $y_n$:

$$\hat x_{n+1|n} \triangleq K_o y_n + k = K_o\left(y_n - E[y_n]\right) + E[x_{n+1}]$$

In terms of the normal equations we have

$$\begin{aligned}
\hat x_{n+1|n} &= \left\langle x_{n+1}, \operatorname{col}\{1, y_0, \dots, y_n\} \right\rangle \left\| \operatorname{col}\{1, y_0, \dots, y_n\} \right\|^{-2} \operatorname{col}\{1, y_0, \dots, y_n\} \\
&= \left\langle x_{n+1}, \operatorname{col}\{1, e_0, \dots, e_n\} \right\rangle \left\| \operatorname{col}\{1, e_0, \dots, e_n\} \right\|^{-2} \operatorname{col}\{1, e_0, \dots, e_n\} \\
&= \mu_{n+1} + \sum_{k=0}^{n} \left\langle x_{n+1}, e_k \right\rangle R_{e,k}^{-1}\, e_k
\end{aligned}$$

where the first innovation is now defined as $e_0 \triangleq y_0 - \hat y_0 = y_0 - E[y_0]$, and all the innovations have zero mean by definition.


Centering the State

$$\mu_{n+1} = F_n\left(\mu_n - \hat x_n\right) + f_n(\hat x_n)$$

$$\begin{aligned}
x_{c,n+1} &\triangleq x_{n+1} - \mu_{n+1} \\
&= \left(F_n x_n + f_n(\hat x_n) - F_n \hat x_n + G_n u_n\right) - \left(F_n \mu_n + f_n(\hat x_n) - F_n \hat x_n\right) \\
&= F_n(x_n - \mu_n) + \left(f_n(\hat x_n) - f_n(\hat x_n)\right) - F_n\left(\hat x_n - \hat x_n\right) + G_n u_n \\
&= F_n x_{c,n} + G_n u_n
\end{aligned}$$

Thus our complete linearized state space model becomes

$$x_{c,n+1} = F_n x_{c,n} + G_n u_n \qquad y_{c,n} = H_n x_{c,n} + v_n$$


Centering the Output Equation

$$E[y_n] = E[h_n(\hat x_n)] + H_n\left(E[x_n] - E[\hat x_n]\right) + E[v_n] = E[h_n(\hat x_n)] + H_n\left(E[x_n] - E[\hat x_n]\right)$$

Now if we assume that $\hat x_n$ is nearly deterministic (in other words, perfect), then

$$E[\hat x_n] \approx \hat x_n \qquad E[h_n(\hat x_n)] \approx h_n(\hat x_n)$$

and this simplifies to

$$E[y_n] \approx h_n(\hat x_n) + H_n\left(E[x_n] - \hat x_n\right)$$

Then the centered output is given by

$$\begin{aligned}
y_{c,n} &\triangleq y_n - E[y_n] \\
&\approx \left[h_n(\hat x_n) + H_n(x_n - \hat x_n) + v_n\right] - \left[h_n(\hat x_n) + H_n\left(E[x_n] - \hat x_n\right)\right] \\
&= H_n\left(x_n - E[x_n]\right) + v_n + \left(h_n(\hat x_n) - h_n(\hat x_n)\right) + H_n\left(\hat x_n - \hat x_n\right) \\
&= H_n x_{c,n} + v_n
\end{aligned}$$



Predicted State Recursions

$$\begin{aligned}
\mu_{n+1} &= F_n\left(\mu_n - \hat x_{n|n}\right) + f_n(\hat x_{n|n}) \\
\hat x_{n+1|n} &= \hat x_{c,n+1|n} + \mu_{n+1} \\
&= F_n \hat x_{c,n|n} + F_n\left(\mu_n - \hat x_{n|n}\right) + f_n(\hat x_{n|n}) \\
&= F_n\left(\hat x_{c,n|n} + \mu_n\right) - F_n \hat x_{n|n} + f_n(\hat x_{n|n}) \\
&= F_n \hat x_{n|n} - F_n \hat x_{n|n} + f_n(\hat x_{n|n}) \\
&= f_n(\hat x_{n|n})
\end{aligned}$$

Thus the time update can be obtained directly in terms of the estimated state without having to estimate the mean!

$$\hat x_{n+1|n} = f_n(\hat x_{n|n})$$
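The cancellation in this derivation is exact, not approximate, and is easy to verify numerically: for arbitrary values of $F_n$, $\mu_n$, $\hat x_{n|n}$, and $f_n(\hat x_{n|n})$, the centered route reproduces $f_n(\hat x_{n|n})$. A short Python check (all quantities are random stand-ins):

```python
import numpy as np

rng = np.random.default_rng(0)
F = rng.standard_normal((2, 2))       # any Jacobian F_n
mu_n = rng.standard_normal(2)         # any state mean mu_n
x_hat = rng.standard_normal(2)        # any filtered estimate xhat_{n|n}
f_val = rng.standard_normal(2)        # any value of f_n(xhat_{n|n})

mu_next = F @ (mu_n - x_hat) + f_val  # mean recursion
x_c = x_hat - mu_n                    # centered estimate xhat_{c,n|n}
x_pred = F @ x_c + mu_next            # centered route: xhat_{c,n+1|n} + mu_{n+1}
assert np.allclose(x_pred, f_val)     # equals f_n(xhat_{n|n}) exactly
```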


Linearization Points

$$x_{n+1} \approx f_n(\hat x_n) + F_n(x_n - \hat x_n) + G_n u_n$$
$$y_n \approx h_n(\hat x_n) + H_n(x_n - \hat x_n) + v_n$$

  • The best estimate of $x_n$ for the state prediction is $\hat x_n \triangleq \hat x_{n|n}$
  • It cannot be used for the output estimate because the innovation $e_n = y_n - \hat y_n$ is required to calculate $\hat x_{n|n}$
  • Thus, $\hat x_n \triangleq \hat x_{n|n-1}$ is used for the output linearization point
  • These estimates result in several simplifications

Filtered State Recursions

$$\begin{aligned}
\hat x_{n|n} &= \hat x_{c,n|n} + \mu_n \\
&= \left(\hat x_{c,n|n-1} + K_{f,n} e_n\right) + \mu_n \\
&= \left(\hat x_{c,n|n-1} + \mu_n\right) + K_{f,n} e_n \\
&= \hat x_{n|n-1} + K_{f,n} e_n
\end{aligned}$$

Thus the measurement updates can also be obtained directly in terms of the estimated state without having to estimate the state mean!

Estimating the State

$$x_{c,n+1} = F_n x_{c,n} + G_n u_n \qquad y_{c,n} = H_n x_{c,n} + v_n$$

  • The KF recursions will give us the optimal estimated state for the linearized model, $\hat x_{c,n|n-1}$ and $\hat x_{c,n|n}$
  • Conversion to the state estimates requires estimation of the state mean:

$$\hat x_{n|n-1} = \hat x_{c,n|n-1} + \mu_n \qquad \hat x_{n|n} = \hat x_{c,n|n} + \mu_n$$

  • It would be convenient if we could estimate the state directly


Linearization Summary

  • The Kalman filter is optimal for the standard state space model: it minimizes the mean square error over all possible linear estimators
  • The EKF is only “approximately optimal” (a nearly meaningless claim)
  • Three approximations are made:
    – First- and zero-order Taylor series approximations
    – $\hat x_{n|n-1}$ is assumed unbiased in the output approximation
    – $\hat x_{n|n-1}$ and $\hat x_{n|n}$ are assumed nearly deterministic
  • It’s not clear to me how critical each of these assumptions is

Estimating the Output

The affine model for the output is

$$y_n \approx h_n(\hat x_{n|n-1}) + H_n\left(x_n - \hat x_{n|n-1}\right) + v_n$$

Then the best affine estimate of $y_n$ given $\mathcal{L}\{1, y_0, \dots, y_{n-1}\}$ is given by

$$\hat y_{n|n-1} = \hat h_n(\hat x_{n|n-1}) + H_n\left(\hat x_{n|n-1} - \hat x_{n|n-1}\right) = \hat h_n(\hat x_{n|n-1}) \approx h_n(\hat x_{n|n-1})$$

As before, these approximations assume $\hat x_{n|n-1}$ is nearly deterministic:

$$\hat h_n(\hat x_{n|n-1}) \approx h_n(\hat x_{n|n-1})$$


Extended Kalman Filter (unsimplified)

$$\begin{aligned}
\hat x_{0|-1} &= \bar x_0 \\
P_{0|-1} &= \Pi_0 \\
e_n &= y_n - h_n(\hat x_{n|n-1}) \\
R_{e,n} &= R_n + H_n P_{n|n-1} H_n^* \\
K_{f,n} &= P_{n|n-1} H_n^* R_{e,n}^{-1} \\
\hat x_{n|n} &= \hat x_{n|n-1} + K_{f,n} e_n \\
\hat x_{n+1|n} &= f_n(\hat x_{n|n}) \\
P_{n|n} &= P_{n|n-1} - K_{f,n} R_{e,n} K_{f,n}^* \\
P_{n+1|n} &= F_n P_{n|n} F_n^* + G_n Q_n G_n^*
\end{aligned}$$
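A direct transcription of these equations into code might look as follows. This is a Python/NumPy sketch, not the course's MATLAB implementation; the two-state, scalar-output model quantities in the usage example (`F`, `G`, `Q`, `R`, `h`, `H`) are hypothetical stand-ins:

```python
import numpy as np

def ekf_step(x_pred, P_pred, y, f, h, F, H, G, Q, R):
    """One unsimplified EKF recursion (time index n suppressed)."""
    e = y - h(x_pred)                         # innovation e_n
    Re = R + H @ P_pred @ H.T                 # innovation covariance R_{e,n}
    Kf = P_pred @ H.T @ np.linalg.inv(Re)     # filtered Kalman gain K_{f,n}
    x_filt = x_pred + Kf @ e                  # measurement update
    P_filt = P_pred - Kf @ Re @ Kf.T          # filtered error covariance
    x_next = f(x_filt)                        # time update
    P_next = F @ P_filt @ F.T + G @ Q @ G.T   # predicted error covariance
    return x_next, P_next, x_filt, P_filt

# Hypothetical toy model for illustration only
x_pred = np.array([0.1, 0.0])
P_pred = np.eye(2)
F = np.array([[1.0, 1.0], [0.0, 1.0]])
G = np.array([[0.0], [1.0]])
Q = np.array([[0.01]])
R = np.array([[0.1]])
h = lambda x: np.array([np.sin(x[0])])
H = np.array([[np.cos(x_pred[0]), 0.0]])      # Jacobian of h at the linearization point
f = lambda x: F @ x                           # linear f for this toy example
x_next, P_next, x_filt, P_filt = ekf_step(x_pred, P_pred, np.array([0.2]), f, h, F, H, G, Q, R)
```

Note that `F` and `H` must be re-evaluated at the current linearization points on every recursion; the complete algorithm later in the deck makes this explicit.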


Innovations

It would be convenient if we could express the innovations directly in terms of the observations, rather than the centered observations. This would also eliminate the need to estimate $E[y_n]$.

$$e_n \triangleq y_{c,n} - \hat y_{c,n|n-1} = \left(y_n - E[y_n]\right) - \left(\hat y_{n|n-1} - E[y_n]\right) = y_n - \hat y_{n|n-1} \approx y_n - h_n(\hat x_{n|n-1})$$



Extended Kalman Filter (complete)

A more complete algorithm is as follows:

$$\begin{aligned}
\hat x_{0|-1} &= \bar x_0 \\
P_{0|-1} &= \Pi_0 \\
H_n &= \left.\nabla_x h_n(x)\right|_{x = \hat x_{n|n-1}} \\
K_{f,n} &= P_{n|n-1} H_n^* \left(H_n P_{n|n-1} H_n^* + R_n\right)^{-1} \\
\hat x_{n|n} &= \hat x_{n|n-1} + K_{f,n}\left(y_n - h_n(\hat x_{n|n-1})\right) \\
F_n &= \left.\nabla_x f_n(x)\right|_{x = \hat x_{n|n}} \\
G_n &= g_n(\hat x_{n|n}) \\
\hat x_{n+1|n} &= f_n(\hat x_{n|n}) \\
P_{n|n} &= \left(I - K_{f,n} H_n\right) P_{n|n-1} \\
P_{n+1|n} &= F_n P_{n|n} F_n^* + G_n Q_n G_n^*
\end{aligned}$$


Simplifying the EKF: Filtered Gain

There are a number of ways in which these nine equations can be simplified.

$$K_{f,n} = P_{n|n-1} H_n^* \left(H_n P_{n|n-1} H_n^* + R_n\right)^{-1}$$

$$R_{e,n} K_{f,n}^* = R_{e,n}\left(R_{e,n}^{-1} H_n P_{n|n-1}\right) = H_n P_{n|n-1}$$

$$\begin{aligned}
P_{n|n} &= P_{n|n-1} - K_{f,n} R_{e,n} K_{f,n}^* \\
&= P_{n|n-1} - K_{f,n} H_n P_{n|n-1} \\
&= \left(I - K_{f,n} H_n\right) P_{n|n-1}
\end{aligned}$$
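The equivalence of the two covariance-update forms can be confirmed numerically for arbitrary $P_{n|n-1}$, $H_n$, and $R_n$ (Python sketch; the dimensions are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
P = A @ A.T + 3 * np.eye(3)          # any positive definite P_{n|n-1}
H = rng.standard_normal((2, 3))      # any output Jacobian H_n
R = np.eye(2)                        # measurement noise covariance R_n

Re = R + H @ P @ H.T                 # innovation covariance
Kf = P @ H.T @ np.linalg.inv(Re)     # filtered Kalman gain
P_long = P - Kf @ Re @ Kf.T          # unsimplified form
P_short = (np.eye(3) - Kf @ H) @ P   # simplified form
assert np.allclose(P_long, P_short)
```

The simplified form saves a matrix multiply per recursion, though in practice the symmetric (long) form is sometimes preferred for numerical robustness.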


where $\nabla$ denotes the Jacobian operator, $\nabla_x \triangleq \frac{\partial}{\partial x}$.


Extended Kalman Filter (simplified)

The Schmidt extended Kalman filter (EKF) can now be initialized with

$$\hat x_{0|-1} = \bar x_0 \qquad P_{0|-1} = \Pi_0$$

and the recursive algorithm can be expressed as

$$\begin{aligned}
K_{f,n} &= P_{n|n-1} H_n^* \left(H_n P_{n|n-1} H_n^* + R_n\right)^{-1} \\
\hat x_{n|n} &= \hat x_{n|n-1} + K_{f,n}\left(y_n - h_n(\hat x_{n|n-1})\right) \\
\hat x_{n+1|n} &= f_n(\hat x_{n|n}) \\
P_{n|n} &= \left(I - K_{f,n} H_n\right) P_{n|n-1} \\
P_{n+1|n} &= F_n P_{n|n} F_n^* + G_n Q_n G_n^*
\end{aligned}$$



Example 1: Continuous- to Discrete-time Conversion

Using the coarse CT-DT conversion, we obtain a DT state space model:

$$\begin{aligned}
y(n) &= a \sin\!\left(2\pi T_s \bar f n + \theta(n)\right) + v(n) \\
\theta(n+1) &= \theta(n) + T_s\, \omega(n) \\
\omega(n+1) &= \omega(n) + u(n) \\
f(n) &= \bar f + \tfrac{1}{2\pi}\, \omega(n)
\end{aligned}$$

Note that the state update equation is linear:

$$\begin{bmatrix} \theta(n+1) \\ \omega(n+1) \end{bmatrix} =
\begin{bmatrix} 1 & T_s \\ 0 & 1 \end{bmatrix}
\begin{bmatrix} \theta(n) \\ \omega(n) \end{bmatrix} +
\begin{bmatrix} 0 \\ 1 \end{bmatrix} u(n)$$

$$x_{n+1} = F x_n + G\, u(n) \qquad \text{where } \left\langle u(n), u(k) \right\rangle = T_s\, \delta_{kn}$$
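The linear state update is easy to simulate. The Python sketch below (the sampling interval and initial state are arbitrary choices, not from the slides) propagates $[\theta;\,\omega]$ through the transition matrix with process noise satisfying $\langle u(n), u(k)\rangle = T_s \delta_{kn}$:

```python
import numpy as np

Ts = 0.01                                 # sampling interval (s); arbitrary choice
F = np.array([[1.0, Ts], [0.0, 1.0]])     # state transition for [theta; omega]
G = np.array([0.0, 1.0])                  # process noise drives omega only

rng = np.random.default_rng(2)
x = np.array([0.0, 2 * np.pi * 0.5])      # start with a 0.5 Hz offset from f_bar
for n in range(100):
    u = np.sqrt(Ts) * rng.standard_normal()   # variance Ts, so <u(n),u(k)> = Ts*delta_kn
    x = F @ x + G * u
print(x[0])   # accumulated phase: 100*Ts*omega_0 plus a small random-walk term
```

With the noise turned off, the phase after $k$ steps is exactly $k\, T_s\, \omega(0)$, which is a quick way to validate the transition matrix.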


EKF Discussion

  • The EKF can only be expected to perform well for nonlinear functions that can be accurately approximated “locally” about $\hat x_n$ as linear functions
  • This excludes functions with discontinuities and discontinuous slopes
  • Only smooth functions should be used
  • There is a positive feedback mechanism in the Taylor series approximations
    – If the estimate $\hat x_n$ is good, then the approximation will generally be accurate and the next estimate will be good (possibly better)
    – If the estimate is poor, then the next estimate will be poor (possibly worse)
  • If the EKF starts to lose track, the approximation error in the Taylor series approximation may exacerbate this problem in subsequent estimates


Example 1: Output Linearization

$$y(n) = a \sin\!\left(2\pi T_s \bar f n + \theta(n)\right) + v(n)$$

So we have

$$h_n(x_n) \triangleq a \sin\!\left(2\pi T_s \bar f n + \theta(n)\right) = a \sin\!\left(2\pi T_s \bar f n + x_n(1)\right)$$

and

$$H_n \triangleq \nabla_x h_n(x) = \begin{bmatrix} a \cos\!\left(2\pi T_s \bar f n + x_n(1)\right) & 0 \end{bmatrix}$$
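Putting the pieces together, a minimal EKF frequency tracker can be sketched as follows. This is a Python/NumPy illustration of the model above, not the course's MATLAB implementation (which appears later in the deck); the function name and parameter defaults are arbitrary:

```python
import numpy as np

def track_frequency(y, fs=1.0, f_mean=0.25, a=1.0, q=1e-4, r=0.1):
    """EKF sketch for y(n) = a*sin(2*pi*Ts*f_mean*n + theta(n)) + v(n)."""
    Ts = 1.0 / fs
    F = np.array([[1.0, Ts], [0.0, 1.0]])      # linearized state transition
    Q = Ts * np.diag([0.0, q])                 # process noise drives omega only
    x = np.zeros(2)                            # predicted state [theta; omega]
    P = 0.1 * np.eye(2)                        # predicted error covariance
    fi = np.zeros(len(y))
    for n in range(len(y)):
        phase = 2 * np.pi * Ts * f_mean * n + x[0]
        H = np.array([a * np.cos(phase), 0.0])     # output Jacobian H_n (row)
        Re = r + H @ P @ H                         # innovation covariance (scalar)
        Kf = P @ H / Re                            # filtered Kalman gain
        e = y[n] - a * np.sin(phase)               # innovation
        x_f = x + Kf * e                           # measurement update
        P_f = P - np.outer(Kf, Kf) * Re            # filtered error covariance
        fi[n] = f_mean + x_f[1] / (2 * np.pi)      # instantaneous frequency (Hz)
        x = F @ x_f                                # time update
        P = F @ P_f @ F.T + Q
    return fi
```

When the signal exactly matches the assumed model (e.g. `y = np.sin(2*np.pi*0.05*np.arange(N))` with `f_mean=0.05` and `a=1`), the innovation is zero at every step and the estimate stays at `f_mean`, which makes a convenient self-check.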


Example 1: Frequency Tracking

Develop a state space model to track the frequency of a quasi-periodic sinusoid measured from a noisy sensor. Consider the following continuous-time state space model:

$$\begin{aligned}
y(t) &= a \sin\!\left(2\pi \bar f t + \theta(t)\right) + v(t) \\
\dot\theta(t) &= \omega(t) \\
\dot\omega(t) &= u(t) \\
f(t) &= \bar f + \tfrac{1}{2\pi}\, \omega(t)
\end{aligned}$$



Example 1: Chirp Frequency Tracking

[Figure: signal panel (amplitude −1 to 1) and spectrogram (Frequency 0.1–0.5 Hz vs. Time 5–45 min) with the EKF frequency track overlaid; q=5.01e-005]


Example 1: Chirp Frequency Tracking

[Figure: signal panel (amplitude −1 to 1) and spectrogram (Frequency 0.1–0.5 Hz vs. Time 5–45 min) with the EKF frequency track overlaid; q=5.01e-007]


Example 1: Chirp Frequency Tracking

[Figure: signal panel (amplitude −1 to 1) and spectrogram (Frequency 0.1–0.5 Hz vs. Time 5–45 min) with the EKF frequency track overlaid; q=0.000501]


Example 1: Chirp Frequency Tracking

[Figure: signal panel (amplitude −1 to 1) and spectrogram (Frequency 0.1–0.5 Hz vs. Time 5–45 min) with the EKF frequency track overlaid; q=5.01e-006]



Example 1: MATLAB Code

clear; close all;
%====================================================================
% User-Specified Parameters
%====================================================================
fs = 1;    % Sample rate (Hz)
N  = 3000; % Number of observations from the process
Np = 500;  % Number of samples to throw away to account for transient

k1 = 1:N/4;
t1 = (k1-0.5)/fs;
yc = chirp(t1,0.05,t1(end),0.45);
y  = [yc fliplr(yc) yc fliplr(yc)]';

%====================================================================
% Kalman Filter Estimates
%====================================================================
q = [1e-6 1e-5 1e-4 1e-3 1e-2 1e-1];
for c1=1:length(q),
    fi = KalmanFrequencyTracker(y,fs,[],[q(c1)*var(y) var(y) 0.1]);
    NonparametricSpectrogram(y,fs,200);
    colormap(ColorSpiral);
    hold on;
    k = 1:length(y);
    t = (k-0.5)/(60*fs);
    h = plot(t,fi,'k',t,fi,'w');
    set(h(1),'LineWidth',1.5);
    set(h(2),'LineWidth',0.5);
    hold off;
    FigureSet(1,'LTX');
    AxisSet(8);


Example 1: Chirp Frequency Tracking

[Figure: signal panel (amplitude −1 to 1) and spectrogram (Frequency 0.1–0.5 Hz vs. Time 5–45 min) with the EKF frequency track overlaid; q=0.00501]


    title(sprintf('$q$=%5.3g',q(c1)*var(y)));
    print(sprintf('KFChirpQ%d',round(log10(q(c1)))),'-depsc');
end;


Example 1: Chirp Frequency Tracking

[Figure: signal panel (amplitude −1 to 1) and spectrogram (Frequency 0.1–0.5 Hz vs. Time 5–45 min) with the EKF frequency track overlaid; q=0.0501]



end;

pf = 0;                              % Default - no plotting
if nargout==0,                       % Plot if no output arguments
    pf = 1;
end;
if exist('pfa') & ~isempty(pfa),
    pf = pfa;
end;

%====================================================================
% Parameter Definitions
%====================================================================
nx = 2;                              % Dimension of the state vector

%====================================================================
% Preprocessing
%====================================================================
my = mean(y);                        % Save the mean value of y
y  = y(:) - mean(y);                 % Make sure y is a column vector
ny = length(y);                      % No. samples
Ts = 1/fs;                           % Sampling interval (sec)
wmean = fmean/(2*pi);                % Mean frequency
Q  = Ts*diag([0;nv(1)]);             % Process noise covariance matrix for discrete-time process model
R  = nv(2);                          % Measurement noise covariance matrix for discrete-time process model
P0 = nv(3)*eye(nx,nx);               % Initial predicted error covariance matrix

%====================================================================
% Memory Allocation
%====================================================================
fi = zeros(ny,1);
fv = zeros(ny,1);
yh = zeros(ny,1);
Hs = cell(ny,1);


Example 1: MATLAB Code

function [fi,fv,yh] = KalmanFrequencyTracker(y,fsa,fmna,nva,aa,pfa);
%KalmanFilterFrequencyTracker: Tracks instantaneous frequency
%
%   [fi,yh] = KalmanFrequencyTracker(y,fs,fmn,nv,a,pf);
%
%   y    Input signal.
%   fs   Sample rate (Hz). Default = 1 Hz.
%   fmn  A priori mean frequency.
%   nv   Noise variances of [fi y x0]. Default = [10 0.1 0.1].
%   a    Amplitude of the frequency component. Default = see manuscript.
%   pf   Plot flag: 0=none (default), 1=screen, 2=current figure.
%
%   fi   Instantaneous frequency (fi) estimates.
%   yh   Smoothed estimate of the input signal.
%
%   Uses the following model: a harmonic estimate of the signal
%   with time-varying amplitude and instantaneous frequency.
%
%   Example: Generate the parametric spectrogram of an intracranial
%   pressure signal using a Blackman-Harris window that is 45 s in
%   duration.
%
%      load Tremor.mat;
%      N = length(x);
%      fss = 750;                      % New sample rate (Hz)
%      Ns = ceil(N*fss/fs);            % Number of samples
%      sti = floor((fss/fs)*(si-1))+1; % Indices of spikes
%      ks = 1:Ns;                      % Sample index
%      ts = (ks-0.5)/fss;              % Time index
%      xs = zeros(Ns,1);               % Allocate memory for spike train
%      xs(sti) = 1;                    % Create spike train
%      KalmanFrequencyTracker(xs,fss);
%

%   S. Kim, J. McNames, "Tracking tremor frequency in spike trains

Fs  = cell(ny,1);
Res = cell(ny,1);
es  = cell(ny,1);
xhps = cell(ny,1);
Pps  = cell(ny+1,1);

%====================================================================
% Kalman Filter
%====================================================================
%--------------------------------------------------------------------
% Variable Initialization
%--------------------------------------------------------------------
xhp = zeros(nx,1);                   % Initial value of predicted state at time i=0
Pp  = P0;                            % Predicted state error covariance
Pps{1}  = Pp;
xhps{1} = xhp;

%--------------------------------------------------------------------
% Kalman Filter Recursions
%--------------------------------------------------------------------
for n=1:ny,
    H   = [a*cos(wmean*n*Ts+xhp(1)),0];
    Re  = R + H*Pp*H';               % Innovation covariance matrix
    Kf  = Pp*H'*inv(Re);             % Filtered estimate Kalman gain
    yhp = a*sin(wmean*n*Ts+xhp(1));  % Predicted estimate of the output
    e   = y(n) - yhp;                % Innovation
    xhm = xhp + Kf*e;                % Measurement update estimate of the state
    F   = [1 Ts;0 1];                % Linearized state transition matrix
    xhp = [xhm(1)+Ts*xhm(2);xhm(2)]; % Predicted state estimates
    Pm  = Pp - Kf*Re*Kf';            % Measurement state error covariance
    Pp  = F*Pm*F' + Q;               % Predicted state error covariance

    %----------------------------------------------------------------
    % Store Variables for Smoothing
    %----------------------------------------------------------------


%   using the extended Kalman filter," Proceedings of the 27th Annual
%   International Conference of the Engineering in Medicine and
%   Biology Society, September 2005.
%
%   Version 1.00 JM
%
%   See also Lowpass, ParametricSpectrogram, and KernelFilter.

%====================================================================
% Error Checking
%====================================================================
if nargin<2,
    help KalmanFrequencyTracker;
    return;
end;

%====================================================================
% Process Function Arguments
%====================================================================
fs = 1;
if exist('fsa','var') & ~isempty(fsa),
    fs = fsa;
end;
fmean = 2;
if exist('fmna','var') & ~isempty(fmna),
    fmean = fmna;
end;
a = 0.1*std(y);                      % Amplitude of the frequency component
if exist('aa','var') & ~isempty(aa),
    a = aa;
end;
nv = [0.01*var(y) var(y) 0.1];       % Noise Variances
if exist('nva','var') & ~isempty(nva),
    nv = nva;



    Hs{n}  = H;
    Fs{n}  = F;
    Res{n} = Re;
    es{n}  = e;
    xhps{n+1} = xhp;
    Pps{n+1}  = Pp;

    %----------------------------------------------------------------
    % Store Variables of Interest
    %----------------------------------------------------------------
    fi(n) = (xhm(2)+wmean)/(2*pi);   % Instantaneous frequency estimate
    fv(n) = Pm(2,2)/((2*pi).^2);     % Estimated error variance of estimate
end;

%====================================================================
% Post-processing
%====================================================================
yh = yh + my;

%====================================================================
% Plotting
%====================================================================
if pf>=1,
    fmean = wmean/(2*pi);
    ds = floor(fs/(4*fmean));
    Tx = ny/fs;
    NonparametricSpectrogram(decimate(y,ds),fs/ds,10/fmean,[],[],[],pf);
    hold on;
    k = 1:ny;
    t = (k-0.5)/fs;
    h = plot(t,fi,'k',t,fi,'w');
    set(h(1),'LineWidth',2.5);
    set(h(2),'LineWidth',1.5);
    hold off;
end;

%====================================================================


% Process Return Arguments
%====================================================================
if nargout==0,
    clear('fi','yh');
end;
