Autoregressive Models

J. McNames, Portland State University, ECE 539/639, Ver. 1.01

Autoregressive Models Overview

• Direct structures
• Types of estimators
• Parametric spectral estimation
• Parametric time-frequency analysis
• Order selection criteria
• Lattice structures?
• Maximum entropy
• Excitation with line spectra

Direct Structures

x(n) = -\sum_{k=1}^{P} a_k x(n-k) + w(n)

• Notation differs (again) from the text
• Essentially all of the techniques that we discussed for FIR filters can be applied
• Many ways to estimate (a minimal sketch follows below)
  – How to estimate the autocorrelation matrix?
  – Degree of windowing (full, pre-, post-, no/short)
  – Weighted least squares?
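
A minimal MATLAB sketch of a direct least-squares fit (the no-windowing case; the variable names and the synthetic AR(2) process are illustrative, not taken from the lecture code):

N = 1000; P = 2;
w = randn(N,1);
x = filter(1,[1 -0.9 0.5],w);   % synthesize a stable AR(2) process
y = x(P+1:N);                   % target vector
X = zeros(N-P,P);
for k = 1:P,
  X(:,k) = x(P+1-k:N-k);        % k-th lagged column of the data matrix
end;
ahat = -(X\y);                  % least squares: minimizes sum |y + X*a|^2
s2w  = mean((y + X*ahat).^2);   % estimate of the driving-noise variance

For this process, ahat should be close to [-0.9; 0.5], the coefficients in the sign convention used above.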

Properties

x(n) = -\sum_{k=1}^{P} a_k x(n-k) + w(n)

• If \hat{R} is positive definite (and Toeplitz?), the model will be
  – Stable
  – Causal
  – Minimum phase
  – Invertible
• True of all the estimators except the proposed "unbiased" technique

Problem Unification

x(n) = -\sum_{k=1}^{P} a_k x(n-k) + w(n)

• AR modeling is approximately equivalent to several other useful problems (see the sketch below)
  – Estimating the coefficients of a whitening filter
  – One-step-ahead prediction
  – Maximum entropy signal modeling
• In the MSE case, if the process is minimum phase, these are exactly equivalent
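
To make the equivalences concrete, a minimal sketch continuing the variables from the earlier sketch (xcorr assumes the Signal Processing Toolbox):

e  = filter([1; ahat],1,x);          % fitted A(z) acting as a whitening filter
xp = -X*ahat;                        % one-step-ahead predictions of y
r  = xcorr(e(P+1:end),20,'coeff');   % ~ unit impulse at lag 0 if e is white

The whitening-filter output e(n) is exactly the one-step-ahead prediction error y(n) - xp(n) for n > P.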


Windowing

We have discussed three types of windowing

• Data windowing: x_w(n) = w(n) x(n)
  – Used to reduce spectral leakage of nonparametric PSD estimators
• Correlation windowing: r_w(\ell) = \hat{r}(\ell) w(\ell)
  – Used to reduce variance of nonparametric PSD estimators
• Weighted least squares: E_e = \sum_{n=0}^{N-1} w_n^2 |y(n) - c^T x(n)|^2
  – Used to weight the influence of some observations more than others (see the sketch below)
  – Optimal when the error variance is non-constant in the deterministic data matrix case
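
A sketch of the weighted least squares variant, continuing the earlier variables (blackman assumes the Signal Processing Toolbox; any non-negative weights could be used instead):

wn = blackman(N-P);            % one weight per row of X and y (illustrative)
yw = wn.*y;                    % weight the targets
Xw = X.*wn;                    % weights each row (implicit expansion, R2016b+)
ahat_w = -(Xw\yw);             % minimizes sum w_n^2 |y(n) + x(n)'*a|^2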

Autoregressive Estimation Windowing

• Parametric windowing is mostly reserved for nonstationary applications
  – Time-frequency analysis
• The text seems to implicitly suggest using data windowing
  – This is a bad idea!
  – Biases the estimate
  – No obvious gain
• It is much better to perform a weighted least squares
  – Each row of the data matrix and output still corresponds to a specific time
  – Estimate is unbiased
  – Permits you to weight the influence of points near the center of the observation window (block) more heavily
  – Can be used with any non-negative window (need not be positive definite)

Frequency Domain Representation of the Error Signal

x(n) = h(n) * w(n)
e(n) = \hat{w}(n) = \hat{h}_i(n) * x(n)
E(e^{j\omega}) = \hat{H}_i(e^{j\omega}) X(e^{j\omega}) = \frac{X(e^{j\omega})}{\hat{H}(e^{j\omega})}
E = \sum_{n=-\infty}^{\infty} |e(n)|^2 = \frac{1}{2\pi} \int_{-\pi}^{\pi} \frac{|X(e^{j\omega})|^2}{|\hat{H}(e^{j\omega})|^2} \, d\omega

• In the AR case, H(z) = \frac{1}{A(z)}
• Note the frequency domain equations only exist if the signals are energy signals
  – Segments of stationary processes
• This means solving for the coefficients that minimize the error also minimizes the integral of the ratio of the ESDs (a numerical check follows below)
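
A quick numerical check of this identity on the FFT grid, continuing the earlier variables (the first P samples differ slightly because the FFT product corresponds to circular rather than linear convolution):

a  = [1; ahat];
e  = filter(a,1,x);                     % prediction error e(n)
Et = sum(abs(e).^2);                    % time-domain error energy
Ef = mean(abs(fft(a,N).*fft(x,N)).^2);  % (1/N) sum |A(k) X(k)|^2 (Parseval)
% Et and Ef agree up to the wrap-around in the first P samples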

AR Spectral Estimation

E = \sum_{n=-\infty}^{\infty} |e(n)|^2 = \frac{1}{2\pi} \int_{-\pi}^{\pi} \frac{|X(e^{j\omega})|^2}{|\hat{H}(e^{j\omega})|^2} \, d\omega = \frac{1}{2\pi} \int_{-\pi}^{\pi} |X(e^{j\omega})|^2 |\hat{A}(e^{j\omega})|^2 \, d\omega

• This means solving for the coefficients that minimize the error also minimizes the integral of the ratio of the ESDs
• Why can't we just make \hat{H}_i(e^{j\omega}) large or \hat{A}(e^{j\omega}) small at all frequencies?
  – Recall the constraint a_0 = 1 in \hat{a} = [1 \; \hat{a}_1 \; \cdots \; \hat{a}_P]^T
  – a_k = \frac{1}{2\pi} \int_{-\pi}^{\pi} A(e^{j\omega}) e^{j\omega k} \, d\omega, so a_0 = \frac{1}{2\pi} \int_{-\pi}^{\pi} A(e^{j\omega}) \, d\omega
  – Thus A(e^{j\omega}) is constrained to have unit area
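
Once \hat{a} and the noise variance are estimated, the spectral estimate follows directly. A sketch mirroring the dft = s2w./(abs(fft(ah,nz)).^2) line in the lecture's own code (the grid size here is illustrative):

nz   = 1024;                                % FFT grid size
Rhat = s2w./(abs(fft([1; ahat],nz)).^2);    % AR PSD estimate
f    = (0:nz-1)/nz;                         % frequency (cycles/sample)
plot(f(1:nz/2),10*log10(Rhat(1:nz/2)));
xlabel('Frequency (cycles/sample)'); ylabel('PSD (dB)');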


AR Spectral Estimation Properties

E = \sum_{n=-\infty}^{\infty} |e(n)|^2 = \frac{1}{2\pi} \int_{-\pi}^{\pi} \frac{|X(e^{j\omega})|^2}{|\hat{H}(e^{j\omega})|^2} \, d\omega = \frac{1}{2\pi} \int_{-\pi}^{\pi} |X(e^{j\omega})|^2 |\hat{A}(e^{j\omega})|^2 \, d\omega

• The frequency domain representation of the error makes it clear that the impact of the error across the full frequency range is uniform
  – There is no benefit to fitting lower or higher frequency ranges more accurately, in general
• The regions where |X(e^{j\omega})| > |\hat{H}(e^{j\omega})| contribute more to the total error
  – The error will be minimized if |\hat{H}(e^{j\omega})| is larger in these regions
  – This is part of the reason the estimate is more accurate near spectral peaks than valleys
  – Nonparametric estimators were also more accurate near peaks, but for very different reasons (spectral leakage)

Bias-Variance Tradeoff

• The order of the parametric model is the key parameter that controls the tradeoff between bias and variance (illustrated in the sketch below)
• P too large
  – The variance manifests differently than in nonparametric estimators
  – The spectrum may contain spurious peaks
  – It is also possible for a single frequency component to be split into distinct peaks
• P too small
  – Insufficient peaks
  – Peaks that are present are too wide or have the wrong shape
  – Can only do so much with a pair of complex-conjugate poles
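
An illustrative experiment (aryule and freqz assume the Signal Processing Toolbox; the signal and the orders are my choices, not the lecture's):

n  = (0:255)';
x1 = cos(0.2*pi*n) + 0.1*randn(256,1);   % one sinusoid in noise
for p = [2 4 40],
  [a,s2] = aryule(x1,p);                 % Yule-Walker AR(p) fit
  [H,wf] = freqz(sqrt(s2),a,512);        % spectral shape for this order
  % p = 2 yields one broad peak; p = 40 can show spurious or split peaks
end;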

Order Selection Problem

Unbiased: \hat{\mathrm{MSE}}_u = \frac{1}{N_f - N_i + 1 - P} \sum_{n=N_i}^{N_f} |e(n)|^2
Biased: \hat{\mathrm{MSE}}_b = \frac{1}{N_f - N_i + 1} \sum_{n=N_i}^{N_f} |e(n)|^2

• Goal: select the value of P that minimizes the MSE
• We have two estimates of the MSE, one from Chapter 8 and one from Chapter 9
• The one from Chapter 8 was unbiased
• Why don't we use it to obtain the best value of P?
  – Unbiasedness only holds in the deterministic data matrix case
  – The data matrix is always stochastic with AR models

Order Selection Methods Concept

• There are many order selection criteria
• All try to obtain an approximately unbiased estimate of the MSE
• All essentially add a penalty as the unscaled error increases


Possible Order Selection Methods

Let N_t \triangleq N_f - N_i + 1. Then

Model error: \hat{\sigma}_e^2 = \frac{1}{N_t} \sum_{n=N_i}^{N_f} |e(n)|^2
Final prediction error: \mathrm{FPE}(P) = \frac{N_t + P}{N_t - P} \hat{\sigma}_e^2
Akaike information criterion: \mathrm{AIC}(P) = N_t \log \hat{\sigma}_e^2 + 2P
Minimum description length: \mathrm{MDL}(P) = N_t \log \hat{\sigma}_e^2 + P \log N_t
Criterion autoregressive transfer: \mathrm{CAT}(P) = \frac{1}{N_t} \sum_{k=1}^{P} \frac{N_t - k}{N_t \hat{\sigma}_k^2} - \frac{N_t - P}{N_t \hat{\sigma}_P^2}

where \hat{\sigma}_k^2 denotes the model error for order k; a sketch of evaluating these criteria follows below.
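
A sketch of how the criteria might be evaluated in practice (arcov assumes the Signal Processing Toolbox; using the length of the error segment for N_t is an approximation, and x continues from the first sketch):

Pmax = 20;
Nt   = length(x) - Pmax;                 % approximates N_f - N_i + 1
FPE = zeros(Pmax,1); AIC = zeros(Pmax,1); MDL = zeros(Pmax,1);
for p = 1:Pmax,
  [a,s2e] = arcov(x,p);                  % least-squares fit and error variance
  FPE(p) = (Nt+p)/(Nt-p)*s2e;
  AIC(p) = Nt*log(s2e) + 2*p;
  MDL(p) = Nt*log(s2e) + p*log(Nt);
end;
[~,Pbest] = min(MDL);                    % e.g., select the order by MDL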

Order Selection Methods Comments

• The order selection decision is only critical when P is on the same order as N_t, say N_t/10 \le P < N_t
  – This is the difficult case of too little data to make a good decision
  – None of the order selection criteria work well in this case
• Otherwise
  – Similar performance will be obtained for a wide range of values of P
  – Can simply pick the value where the estimated parameter vector stops changing with increasing values of P
• The text suggests looking at diagnostic plots (residuals, ACF, PACF, etc.) and selecting the parameter manually
  – Works well in many applications

Maximum Entropy Motivation

• This discussion is based on [1]
• Suppose again that we have N observations of a random process x(n)
• If we knew the autocorrelation, we could calculate the AR parameters exactly (recall Chapters 4 and 6)
• With only N observations, we can estimate r(\ell) only for |\ell| < N
• Many signals have autocorrelations that are nonzero for \ell \ge N
• The segmentation may significantly impair the accuracy of the estimated parameters
  – Especially true of narrowband processes
• Nonparametric PSD estimation methods simply extrapolate the estimate \hat{r}(\ell) with zeros: \hat{r}(\ell) = 0 for \ell \ge N
• Can we do better?

Maximum Entropy Problem

• Suppose we know (i.e., have estimated) the autocorrelation of a WSS process for |\ell| \le P
• How do we extrapolate the estimated autocorrelation sequence for |\ell| > P?
• Let us denote the extrapolated values by r_e(\ell)
• Then the estimated PSD is given by

\hat{R}_x(e^{j\omega}) = \sum_{\ell=-P}^{P} r_x(\ell) e^{-j\ell\omega} + \sum_{|\ell|>P} r_e(\ell) e^{-j\ell\omega}

• We would like \hat{R}_x(e^{j\omega}) to have the same properties as a real PSD
  – Real-valued
  – Nonnegative
• These conditions are not sufficient for a unique extrapolation
• We need an additional constraint


Maximum Entropy Concept

\hat{R}_x(e^{j\omega}) = \sum_{\ell=-P}^{P} r_x(\ell) e^{-j\ell\omega} + \sum_{|\ell|>P} r_e(\ell) e^{-j\ell\omega}

• Pick r_e(\ell) such that the entropy of the process is maximized
• Entropy is a measure of randomness or uncertainty
• Maximizing the entropy is equivalent to
  – Making the (estimated) process x(n) as white as possible
  – Making the (estimated) PSD as flat as possible
• Places as few constraints as possible on x(n)
• Minimizes the amount of "statistical structure"

Entropy Defined

For a Gaussian random process with PSD R_x(e^{j\omega}), the entropy is given by

H(x) = \frac{1}{2\pi} \int_{-\pi}^{\pi} \ln R_x(e^{j\omega}) \, d\omega

• The maximum entropy estimate is the one that maximizes this equation
• However, we have the constraints

\frac{1}{2\pi} \int_{-\pi}^{\pi} R_x(e^{j\omega}) e^{j\ell\omega} \, d\omega = r_x(\ell), \quad |\ell| \le P

• Can solve by setting the partial derivative of H(x) with respect to r_e^*(\ell) to zero
  – This is tricky because we're optimizing a function of a complex-valued parameter
  – Details are in the appendix of the text
  – For now, assume r_e(\ell) is real-valued

Maximum Entropy Estimate Derived

\hat{R}_x(e^{j\omega}) = \sum_{\ell=-P}^{P} r_x(\ell) e^{-j\ell\omega} + \sum_{|\ell|>P} r_e(\ell) e^{-j\ell\omega}

H(x) = \frac{1}{2\pi} \int_{-\pi}^{\pi} \ln R_x(e^{j\omega}) \, d\omega

\frac{\partial H(x)}{\partial r_e^*(\ell)} = 0 = \frac{1}{2\pi} \int_{-\pi}^{\pi} \frac{1}{R_x(e^{j\omega})} \frac{\partial R_x(e^{j\omega})}{\partial r_e^*(\ell)} \, d\omega = \frac{1}{2\pi} \int_{-\pi}^{\pi} \frac{1}{R_x(e^{j\omega})} e^{j\ell\omega} \, d\omega

Defining A_x(e^{j\omega}) \triangleq \frac{1}{R_x(e^{j\omega})}, this gives

0 = \frac{1}{2\pi} \int_{-\pi}^{\pi} A_x(e^{j\omega}) e^{j\ell\omega} \, d\omega, \quad |\ell| > P

Maximum Entropy Estimate Properties

0 = q(\ell) = \frac{1}{2\pi} \int_{-\pi}^{\pi} Q_x(e^{j\omega}) e^{j\ell\omega} \, d\omega \quad \text{for } |\ell| > P

Q_x(e^{j\omega}) \triangleq \frac{1}{R_x(e^{j\omega})} = \sum_{\ell=-P}^{P} q(\ell) e^{-j\ell\omega}

R_x(e^{j\omega}) = \frac{1}{\sum_{\ell=-P}^{P} q(\ell) e^{-j\ell\omega}}

• The inverse DTFT of the reciprocal of the signal PSD, 1/R_x(e^{j\omega}), is zero for |\ell| > P
• Thus the maximum entropy PSD estimate for a Gaussian process is an all-pole power spectrum
• Can obtain a from the Yule-Walker equations with the known (or estimated) autocorrelation sequence for |\ell| \le P (see the sketch below)
• If the autocorrelation sequence is of interest, it can be obtained using the Yule-Walker equations as well (covered in ECE 5/638)
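
A sketch of the resulting recipe (xcorr and levinson assume the Signal Processing Toolbox; the signal x continues from the earlier sketches and the order is illustrative):

Pme = 10;
r = xcorr(x,Pme,'biased');          % estimate r(l) for |l| <= Pme
r = r(Pme+1:end);                   % keep lags 0..Pme
[a,s2] = levinson(r,Pme);           % solve the Yule-Walker equations
Rme = s2./(abs(fft(a,1024)).^2);    % all-pole (maximum entropy) PSD estimate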


Maximum Entropy Discussion

• Using maximum entropy for stationary process estimation (PSD, autocorrelation, etc.) has been thoroughly discussed and debated
• Rationale
  – In the absence of any information, impose the least amount of structure on x(n)
  – More reasonable than setting r(\ell) = 0 for large lags, as the nonparametric methods assume (?)
• Counter-argument
  – The maximum entropy approach imposes an all-pole structure
  – If the data is not all-pole, how do you know the estimated properties of interest will be accurate?
• Bottom line: it depends on the process (application)

Sinusoidal Excitation

x(n) = \sum_{k=1}^{L} A_k \cos(\omega_k n + \phi_k)

• All-pole models can also be fit to signals that consist of sums of sinusoids
• If L < P, can minimize the error criterion

E_P = \frac{d_0}{L} \sum_{k=1}^{L} \frac{R_x(e^{j\omega_k})}{\hat{R}_h(e^{j\omega_k})}

• It is not clear to me what the value of this model is
• Apparently problematic for nearly periodic signals

Example 1: Autoregressive PSD Estimation of a Chirp

Use a least-squares approach to autoregressive process estimation to perform a time-frequency analysis of the chirp signal.

Example 1: Autoregressive PSD

[Figure: autoregressive spectrogram of the chirp signal, with the signal plotted below. Window: 50 s, filter order: 2. Axes: frequency (Hz) versus time (s).]


Example 1: Nonparametric PSD

[Figure: nonparametric spectrogram of the chirp signal, with the signal plotted below. Window: 50 s. Axes: frequency (Hz) versus time (s).]

Example 1: Autoregressive PSD

[Figure: autoregressive spectrogram of the chirp signal, with the signal plotted below. Window: 100 s, filter order: 2. Axes: frequency (Hz) versus time (s).]

Example 1: Nonparametric PSD

[Figure: nonparametric spectrogram of the chirp signal, with the signal plotted below. Window: 100 s. Axes: frequency (Hz) versus time (s).]

Example 1: Autoregressive PSD

[Figure: autoregressive spectrogram of the chirp signal, with the signal plotted below. Window: 200 s, filter order: 2. Axes: frequency (Hz) versus time (s).]


Example 1: Nonparametric PSD

[Figure: nonparametric spectrogram of the chirp signal, with the signal plotted below. Window: 200 s. Axes: frequency (Hz) versus time (s).]

Example 1: Autoregressive PSD

[Figure: autoregressive spectrogram of the chirp signal, with the signal plotted below. Window: 50 s, filter order: 10. Axes: frequency (Hz) versus time (s).]

Example 1: Autoregressive PSD

[Figure: autoregressive spectrogram of the chirp signal, with the signal plotted below. Window: 100 s, filter order: 10. Axes: frequency (Hz) versus time (s).]

Example 1: Autoregressive PSD

[Figure: autoregressive spectrogram of the chirp signal, with the signal plotted below. Window: 200 s, filter order: 10. Axes: frequency (Hz) versus time (s).]


Example 1: Autoregressive PSD

[Figure: autoregressive spectrogram of the chirp signal. Window: 50 s, filter order: 5, estimator: unbiased/no window, window: rectangular.]

Example 1: Autoregressive PSD

[Figure: autoregressive spectrogram of the chirp signal. Window: 50 s, filter order: 5, estimator: unbiased/no window, window: Blackman.]

Example 1: Autoregressive PSD

[Figure: autoregressive spectrogram of the chirp signal. Window: 50 s, filter order: 5, estimator: modified covariance, window: rectangular.]

Example 1: Autoregressive PSD

[Figure: autoregressive spectrogram of the chirp signal. Window: 50 s, filter order: 5, estimator: modified covariance, window: Blackman.]


Example 1: MATLAB Code

clear; close all;
%====================================================================
% User-Specified Parameters
%====================================================================
fs = 1;       % Sample rate (Hz)
N  = 1000;    % Number of observations from the process
Np = 500;     % Number of samples to throw away to account for transient
k1 = 1:N/4;
t1 = (k1-0.5)/fs;
yc = chirp(t1,0.05,t1(end),0.45);
y  = [yc fliplr(yc) yc fliplr(yc)]';
%====================================================================
% Parametric Spectrograms
%====================================================================
fid = fopen('ARChirp.tex','w');
if fid==-1, error('Could not open ARChirp.tex file for writing'); end;
fo = [2 10];
wl = [50 100 200];
for c0=1:length(fo),
  for c1=1:length(wl),
    AutoregressiveSpectrogram(y,fs,wl(c1),fo(c0),0,0);
    colormap(ColorSpiral(256));
    title(sprintf('Autoregressive Spectrogram Window:%d s Filter Order:%d',wl(c1),fo(c0)));
    FigureSet(1,'Slides');
    AxisSet(8);
    fn = sprintf('ChirpARO%02dL%03d',fo(c0),wl(c1));

    print(fn,'-depsc');
    fprintf(fid,'%%==============================================================================\n');
    fprintf(fid,'\\newslide\n');
    fprintf(fid,'\\slideheading{Example \\arabic{exc}: Autoregressive PSD}\n');
    fprintf(fid,'%%==============================================================================\n');
    fprintf(fid,'\\hspace{-1.5em} \\includegraphics[scale=1]{Matlab/%s}\n\n',fn);
    if c0==1,
      NonparametricSpectrogram(y,fs,wl(c1));
      colormap(ColorSpiral(256));
      title(sprintf('Nonparametric Spectrogram Window:%d s',wl(c1)));
      FigureSet(2,'Slides');
      AxisSet(8);
      fn = sprintf('ChirpNPO%02dL%03d',fo(c0),wl(c1));
      print(fn,'-depsc');
      fprintf(fid,'%%==============================================================================\n');
      fprintf(fid,'\\newslide\n');
      fprintf(fid,'\\slideheading{Example \\arabic{exc}: Nonparametric PSD}\n');
      fprintf(fid,'%%==============================================================================\n');
      fprintf(fid,'\\hspace{-1.5em} \\includegraphics[scale=1]{Matlab/%s}\n\n',fn);
    end;
  end;
end;
fo = 5;
wl = 50;
for c1=0:1,
  for c2=0:1,
    et = c1;
    wt = c2;
    AutoregressiveSpectrogram(y,fs,wl,fo,et,wt);
    colormap(ColorSpiral(256));
    if et==0, estimatorType = 'Unbiased/no window';
    else estimatorType = 'Modified Covariance'; end;
    if wt==0, windowType = 'Rectangular';
    else windowType = 'Blackman'; end;

    st = sprintf('Window:%d s Filter Order:%d Estimator:%s Window:%s',wl,fo,estimatorType,windowType);
    title(st);
    FigureSet(1,'Slides');
    AxisSet(8);
    fn = sprintf('ChirpARO%02dL%03dE%1dW%1d',fo,wl,et,wt);
    print(fn,'-depsc');
    fprintf(fid,'%%==============================================================================\n');
    fprintf(fid,'\\newslide\n');
    fprintf(fid,'\\slideheading{Example \\arabic{exc}: Autoregressive PSD}\n');
    fprintf(fid,'%%==============================================================================\n');
    fprintf(fid,'\\hspace{-1.5em} \\includegraphics[scale=1]{Matlab/%s}\n\n',fn);
  end;
end;
fclose(fid);

Example 1: MATLAB Code

function [S,t,f] = AutoregressiveSpectrogram(x,fsa,wla,foa,eta,wta,fra,nfa,nsa,pfa);
%AutoregressiveSpectrogram: Generates the spectrogram of the signal
%
%   [S,t,f] = AutoregressiveSpectrogram(x,fs,wl,fo,et,wt,fr,nf,ns,pf);
%
%   x    Input signal.
%   fs   Sample rate (Hz). Default = 1 Hz.
%   wl   Length of window to use (sec). Default = 1024 samples.
%        If a vector, specifies entire window.
%   fo   Model order. Default = 30.
%   et   Estimator type: 0=Unbiased/no window,
%        1=Modified covariance (default).
%   wt   Window type: 0=Rectangular, 1=Blackman (default).
%   fr   Minimum and maximum frequencies to display (Hz).
%        Default = [0 fs/2].
%   nf   Number of frequencies to evaluate (vertical resolution).
%        Default = max(128,round(wl/2)).
%   ns   Requested number of times (horizontal pixels) to evaluate.
%        Default = min(400,length(x)).
%   pf   Plot flag: 0=none (default), 1=screen.
%
%   S    Matrix containing the image of the spectrogram.
%   t    Times at which the spectrogram was evaluated (s).
%   f    Frequencies at which the spectrogram was evaluated (Hz).
%
%   Calculates estimates of the spectral content at the specified
%   times using an autoregressive model. The mean of the signal is
%   removed as a preprocessing step. The square root of the power
%   spectral density is calculated and displayed. To limit
%   computation, decimate the signal if necessary to make the upper
%   frequency range approximately equal to half the sampling
%   frequency.
%
%   If only the window length is specified, the blackman window is


%   applied as a data window. The specified window length should be
%   odd. If called with a window with an even number of elements, a
%   zero is appended to make the window odd.
%
%   Example: Generate the parametric spectrogram of an intracranial
%   pressure signal using a Blackman-Harris window that is 45 s in
%   duration.
%
%      load ICP.mat;
%      x = decimate(icp,5);
%      AutoregressiveSpectrogram(x,fs/5,30,30);
%
%   D. G. Manolakis, V. K. Ingle, S. M. Kogon, "Statistical and
%   Adaptive Signal Processing," McGraw-Hill, 2000.
%
%   Version 1.00 JM
%
%   See also specgram, window, decimate, and Spectrogram.

%====================================================================
% Error Checking
%====================================================================
if nargin<1,
  help AutoregressiveSpectrogram;
  return;
end;
if length(x)==0, error('Signal is empty.'); end;
if var(x)==0, error('Signal is constant.'); end;
if exist('fra') & length(fra)==1,
  error('Frequency range must be a 2-element vector.');

end;
%====================================================================
% Calculate Basic Signal Statistics
%====================================================================
nx = length(x);
xr = x;
mx = mean(x);
sx = std(x);
%====================================================================
% Process Function Arguments
%====================================================================
fs = 1;
if exist('fsa') & ~isempty(fsa), fs = fsa; end;
wl = min(1024,nx);                 % Default window length
if exist('wla') & ~isempty(wla),
  wl = max(3,round(wla*fs));
  if ~rem(wl,2),
    wl = wl + 1;                   % Make odd
  end;
end;
fo = 30;                           % Default filter order
if exist('foa') & ~isempty(foa), fo = foa; end;
et = 1;                            % Modified covariance estimation type
if exist('eta') & ~isempty(eta), et = eta; end;
wt = 1;                            % Blackman data matrix window

if exist('wta') & ~isempty(wta), wt = wta; end;
fmin = 0;                          % Lowest frequency to display
fmax = fs/2;                       % Highest frequency to display
if exist('fra') & ~isempty(fra),
  fmin = max(fra(1),0);
  fmax = min(fra(2),fs/2);
end;
nf = max(2^9,round(wl/2));
if exist('nfa') & ~isempty(nfa), nf = nfa; end;
ns = min(1000,nx);
if exist('nsa') & ~isempty(nsa), ns = min(nsa,nx); end;
pf = 0;                            % Default - no plotting
if nargout==0,                     % Plot if no output arguments
  pf = 1;
end;
if exist('pfa') & ~isempty(pfa), pf = pfa; end;
%====================================================================
% More Error Checking
%====================================================================
if fo>wl,
  error('Filter order is larger than the window length.');
end;
%====================================================================

% Preprocessing
%====================================================================
switch wt,
  case 0,
    wn = ones(wl-fo,1);
  case 1,
    wn = blackman(wl-fo+2);
    wn = wn(2:end-1);              % Trim off the zeros
end;
x  = x(:).';                       % Make into a row vector
wn = wn(:);                        % Make into a column vector
x  = x - mx;                       % Remove mean
%====================================================================
% Variable Allocations & Initialization
%====================================================================
st = (nx-1)/(ns-1);                % Step size (samples)
st = max(st,1);
te = 1:st:nx;                      % Evaluation times (in samples)
nt = length(te);                   % No. evaluation times
nz = nf*(fs/(fmax-fmin));          % No. window points needed in FFT
nz = 2^(ceil(log2(nz)));           % Convert to power of 2 for FFT
b0 = floor(nz*(fmin/fs))+1;        % Lower frequency bin index
b1 = ceil (nz*(fmax/fs))+1;        % Upper frequency bin index
fi = b0:b1;                        % Frequency indices
f  = fs*(fi-1)/nz;                 % Frequencies
nf = length(fi);                   % No. of frequencies that PSD is evaluated at
%====================================================================
% Main loop
%====================================================================
switch et,
  case 0,                          % Short/no window
    X = zeros(wl-fo,fo);
    y = zeros(wl-fo,1);
  case 1,                          % Modified Covariance
    X = zeros(2*(wl-fo),fo);


    y = zeros(2*(wl-fo),1);
end;
S = zeros(nf,nt);                  % Power Spectral Density
for c1 = 1:nt,
  ic = te(c1);                     % Sample index of segment center
  i0 = round(ic-wl/2);             % Index of first segment element
  i1 = i0 + (wl-1);                % Index of last segment element
  k  = (i0:i1);                    % Collection of indices
  a0 = max(-i0+1,0);               % Number of zeros to append at front
  a1 = max(i1-nx,0);               % Number of zeros to append at end
  k  = k(1+a0:wl-a1);              % Subset of valid indices
  xs = [x(1)*ones(1,a0) x(k) x(nx)*ones(1,a1)].';  % Data segment
  %----------------------------------------------------------------
  % Create the Target Vector and Data Matrix (full windowing)
  %----------------------------------------------------------------
  y(1:wl-fo) = wn.*xs(fo+1:wl);
  for c2=1:fo,
    X(1:wl-fo,c2) = wn.*xs(fo-(c2-1):wl-c2);
  end;
  if et==1,
    y(wl-fo+(1:wl-fo)) = wn.*xs(1:wl-fo);
    for c2=1:fo,
      X(wl-fo+(1:wl-fo),c2) = wn.*xs(c2+1:wl-fo+c2);
    end;
  end;
  ah  = -pinv(X)*y;
  s2w = mean(abs(y + X*ah).^2);
  ah  = [1;ah];
  dft = s2w./(abs(fft(ah,nz)).^2); % Calculate FFT
  S(:,c1) = dft(fi).';
  drawnow;
end;
t = (te-1)/fs;

%====================================================================
% Postprocessing
%====================================================================
x = xr;
t = t(:);                          % Convert to column vector
f = f(:);                          % Convert to column vector
%====================================================================
% Plot Results
%====================================================================
if pf>=1,
  td = 1;                          % Time Divider
  if max(t)>2000,
    td = 60;                       % Use minutes
  end;
  Tmax = (nx-1)/fs;
  if pf~=2,
    figure;
    FigureSet(1);
    colormap(jet(256));
  end;
  fmax = max(f);
  k  = 1:length(x);
  tx = (k-1)/fs;
  tp = t;
  tmax = max(t);
  tw = [(wl/2-1)/fs,(tmax-(wl/2)/fs)];
  xl = 'Time (s)';                 % X-axis label
  if max(t)>2000,                  % Convert to minutes
    tx = tx/60;
    tp = tp/60;
    tmax = tmax/60;
    tw = tw/60;
    xl = 'Time (min)';

  end
  %----------------------------------------------------------------
  % Parametric Spectrogram
  %----------------------------------------------------------------
  ha1 = axes('Position',[0.15 0.21 0.81 0.69]);
  s = reshape(abs(S),nf*nt,1);
  p = [0 prctile(s,98)];
  imagesc(tp,f,abs(S),p); %,[Smin Smax]);
  xlim([0 tmax]);
  ylim([fmin fmax]);
  set(ha1,'YTickLabel',[]);
  set(ha1,'XAxisLocation','Top');
  set(ha1,'YDir','normal');
  hold on;
  h = plot([1;1]*tw,[0 fs],'k');
  set(h,'LineWidth',2.0);
  h = plot([1;1]*tw,[0 fs],'w');
  set(h,'LineWidth',1.0);
  hold off;
  %----------------------------------------------------------------
  % Colorbar
  %----------------------------------------------------------------
  ha2 = axes('Position',[0.135 0.21 0.01 0.69]);
  colorbar(ha2);
  set(ha2,'Box','Off')
  set(ha2,'YTick',[]);
  %----------------------------------------------------------------
  % Power Spectral Density
  %----------------------------------------------------------------
  ha3 = axes('Position',[0.08 0.21 0.05 0.69]);
  psd = mean((abs(S).^2).').^(1/2);
  plot(psd,f,'r'); %,[Smin Smax]);

  ylim([fmin fmax]);
  xlim([0 max(psd)]);
  ylabel('Frequency (Hz)');
  set(gca,'XAxisLocation','Top');
  %----------------------------------------------------------------
  % Signal
  %----------------------------------------------------------------
  ha4 = axes('Position',[0.15 0.10 0.81 0.10]);
  h = plot(tx,x);
  set(h,'LineWidth',1.5);
  ymin = min(x);
  ymax = max(x);
  yrng = ymax-ymin;
  ymin = ymin - 0.02*yrng;
  ymax = ymax + 0.02*yrng;
  xlim([0 tmax]);
  ylim([ymin ymax]);
  xlabel(xl);
  axes(ha1);
  AxisSet;
end;
%====================================================================
% Process Return Arguments
%====================================================================
if nargout==0,
  clear('S','t','f');
end;


Summary

• AR models of stochastic processes have many advantages
  – Least squares estimates reduce to solving a linear set of equations
  – Properties of the estimators are known
  – Several good choices for estimating properties
  – Can weight observations
• They are also used for many nearly equivalent problems
  – Whitening
  – One-step-ahead linear prediction
  – Parametric process modeling
  – Maximum entropy process/PSD/autocorrelation estimation

References

[1] Monson H. Hayes, Statistical Digital Signal Processing and Modeling. John Wiley & Sons, Inc., 1996.