
SLIDE 1

May 5, 2003

Signal Classification through Multifractal Analysis and Complex Domain Neural Networks

  • V. Cheung, K. Cannons, W. Kinsner, and J. Pear*

Department of Electrical & Computer Engineering, Signal and Data Compression Laboratory
*Department of Psychology
University of Manitoba, Winnipeg, Manitoba, Canada

CCECE 2003

SLIDE 2

Cheung 1 / 16 CCECE03

Outline

  • Introduction
  • Background

► Variance fractal dimension trajectory
► Kohonen self-organizing feature map
► Probabilistic neural network
► Complex domain neural network

  • Experimental Results and Discussion
  • Conclusion
SLIDE 3

Introduction

  • Classification of signals that are:

► Stochastic
► Self-affine
► Non-stationary
► Multivariate
► From non-linear systems

  • e.g., multi-channel speech signals, multi-lead ECGs, or EEGs

SLIDE 4

Fish Dishabituation Signals

[Figure: fish tank with mirror; recording axes X, Y, and Z]

SLIDE 5

System Design

[Diagram: Signals → Feature Extraction (VFDT, SOFM) → Signal Classification (PNN, CNN)]

SLIDE 6

Variance Fractal Dimension Trajectory

  • Temporal multifractal characterization

► Calculate the variance fractal dimension of a small segment of the signal in a sliding-window fashion over the entire signal [Kins94]
► Reveals the underlying complexity of the signal
► Provides a normalizing effect

  • Advantages of the variance fractal dimension

► Easy to compute

■ Measure the variance of amplitude increments at different scales

► Can be computed in real-time
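The computation described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the implementation from [Kins94]: the lag set, window size, and step are hypothetical choices. The variance of amplitude increments is measured at several lags, the Hurst exponent H is taken from the slope of the log-log fit, and for a 1-D time series the variance fractal dimension is D = 2 - H.

```python
import numpy as np

def variance_fractal_dimension(x, lags=(1, 2, 4, 8, 16)):
    """Variance fractal dimension of a 1-D signal segment.

    For a self-affine signal, Var[x(t+dt) - x(t)] ~ dt^(2H); the
    Hurst exponent H is half the slope of the log-log fit, and for
    a time series the dimension is D = 2 - H.
    """
    log_lags, log_vars = [], []
    for k in lags:
        v = np.var(x[k:] - x[:-k])  # variance of amplitude increments at lag k
        if v > 0:
            log_lags.append(np.log2(k))
            log_vars.append(np.log2(v))
    slope, _ = np.polyfit(log_lags, log_vars, 1)
    return 2.0 - slope / 2.0

def vfdt(x, window=512, step=64):
    """Slide a window over the signal; one dimension estimate per window."""
    return np.array([
        variance_fractal_dimension(x[i:i + window])
        for i in range(0, len(x) - window + 1, step)
    ])
```

As a sanity check, Brownian motion (H ≈ 0.5) should yield a trajectory hovering near D ≈ 1.5.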


SLIDE 7

VFDT Plot

[Plot: variance fractal dimension trajectory of a sample signal]

SLIDE 8

Self-Organizing Feature Maps (SOFM)

  • Topology-preserving neural networks using competitive unsupervised learning [Koho84]

  • Two uses in this paper

► Clustering

■ Aid in constructing the training and testing sets

► Feature Extraction

■ Dimensionality reduction
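The competitive-learning procedure behind both uses can be illustrated with a small NumPy sketch. The grid size, decay schedules, and learning rate below are illustrative choices, not the paper's settings: each sample is matched to its best-matching unit (BMU), and the BMU and its grid neighbours are pulled toward the sample.

```python
import numpy as np

def train_sofm(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Train a 2-D Kohonen self-organizing feature map.

    data: (n_samples, n_features). Returns the weight grid with
    shape (gx, gy, n_features). Learning rate and neighbourhood
    radius both decay exponentially over training.
    """
    rng = np.random.default_rng(seed)
    gx, gy = grid
    w = rng.random((gx, gy, data.shape[1]))
    # Grid coordinates of every neuron, used for neighbourhood distances.
    coords = np.stack(
        np.meshgrid(np.arange(gx), np.arange(gy), indexing="ij"), axis=-1)
    steps, t = epochs * len(data), 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            lr = lr0 * np.exp(-t / steps)
            sigma = sigma0 * np.exp(-t / steps)
            # Best-matching unit: neuron whose weights are closest to x.
            d = np.linalg.norm(w - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighbourhood around the BMU on the grid.
            gdist2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
            h = np.exp(-gdist2 / (2 * sigma ** 2))
            w += lr * h[..., None] * (x - w)
            t += 1
    return w

def project(w, x):
    """Map a sample to its BMU grid coordinate (dimensionality reduction)."""
    d = np.linalg.norm(w - x, axis=-1)
    return np.unravel_index(np.argmin(d), d.shape)
```

Projecting each sample to its BMU coordinate reduces an arbitrary-dimensional input to a 2-D grid position, which is the dimensionality-reduction use mentioned above.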


SLIDE 9

Probabilistic Neural Networks

  • Neural network implementation of the Bayes-optimal decision rule [Spec88]

► e.g., spam filters

  • Advantages

► Asymptotically Bayes-optimal

■ Good classifiers

► Trains orders of magnitude faster than other NNs

  • Disadvantages

► Slower execution than other NNs
► Require large amounts of memory
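A minimal sketch of the PNN decision rule: every training sample becomes a pattern unit with a Gaussian Parzen kernel, a summation layer averages the kernels per class, and the output layer picks the class with the largest density estimate. The smoothing parameter `sigma` is a hypothetical choice, not a value from the paper. The sketch also makes both disadvantages visible: every training sample is touched at classification time (slow execution) and must be stored (large memory).

```python
import numpy as np

def pnn_classify(x, train_x, train_y, sigma=0.1):
    """Probabilistic neural network decision rule [Spec88].

    x: sample to classify, shape (n_features,).
    train_x, train_y: stored training patterns and their labels.
    Returns the label with the largest Parzen density estimate.
    """
    classes = np.unique(train_y)
    scores = []
    for c in classes:
        xc = train_x[train_y == c]               # pattern units of class c
        d2 = np.sum((xc - x) ** 2, axis=1)       # squared distances to x
        scores.append(np.mean(np.exp(-d2 / (2 * sigma ** 2))))
    return classes[int(np.argmax(scores))]
```

Note there is no training loop at all beyond storing the patterns, which is why a PNN "trains" orders of magnitude faster than backpropagation networks.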


SLIDE 10

Complex Domain Neural Networks (CNN)

  • Advantages

► Works with inputs in their natural complex-valued form
► Faster training
► Better generalization

  • Disadvantages

► More complexity

■ Convoluted partial derivatives involving complex analysis

  • Ref: [Mast94]
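A forward pass through such a network can be sketched as follows. The split-tanh activation (tanh applied separately to the real and imaginary parts) is one common choice for complex-valued networks, assumed here for illustration rather than taken from [Mast94]; the layer sizes are likewise hypothetical.

```python
import numpy as np

def c_tanh(z):
    """Split activation: tanh applied separately to real and imaginary
    parts, one common choice for complex-domain networks."""
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def forward(x, W1, b1, W2, b2):
    """One-hidden-layer complex-valued network: weights, biases,
    inputs, and all intermediate activations are complex numbers."""
    h = c_tanh(W1 @ x + b1)      # hidden layer
    return c_tanh(W2 @ h + b2)   # output layer
```

Training requires gradients of a real-valued error with respect to complex weights, which is where the convoluted partial derivatives mentioned above arise.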


SLIDE 11

CNN Architecture

[Diagram: three-layer CNN architecture with input layer, hidden layer, and output layer]

SLIDE 12

Experiment #1

  • Feature extraction: VFDT; classifier: PNN; input: X-axis signals

Confusion matrix (rows: expected class; columns: classified as):

Expected |    1    2    3    4 | Class. Rate
    1    |   24    0    0    0 |  100.00%
    2    |    3  135    4    4 |   92.47%
    3    |    0   12  111   65 |   59.04%
    4    |    0   23   70   93 |   50.00%

Average Correct Classification Rate: 66.73%
95% Confidence Interval: [62.77%, 70.69%]
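The confidence interval on this slide can be reproduced from the confusion-matrix totals (363 of 544 segments correct) with a normal-approximation binomial interval; a short sketch:

```python
import math

def proportion_ci(correct, total, z=1.96):
    """Normal-approximation 95% confidence interval (z = 1.96)
    for a classification rate p = correct / total."""
    p = correct / total
    half = z * math.sqrt(p * (1 - p) / total)
    return p - half, p + half

# 24 + 135 + 111 + 93 = 363 correct out of 544 segments.
lo, hi = proportion_ci(363, 544)  # -> approximately (0.6277, 0.7069)
```

This reproduces both the 66.73% average rate and the [62.77%, 70.69%] interval reported above.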

SLIDE 13

Experimental Results Summary

All results use VFDT feature extraction.

Signal | Classifier | Classification Rate (%) by class 1, 2, 3, 4 | Average Rate (%)
X      | PNN        | 100, 92, 59, 50                             | 67
Z      | PNN        |  63, 29, 47, 91                             | 58
X & Z  | PNN        | 100, 95, 84, 95                             | 91
X & Z  | CNN        |  96, 87, 80, 91                             | 87

SLIDE 14

SOFM Feature Extraction

Average classification rates with VFDT features alone versus VFDT followed by SOFM dimensionality reduction:

Signal | Classifier | VFDT + SOFM (%) | VFDT (%)
X      | PNN        | 66              | 67
Z      | PNN        | 61              | 58
X & Z  | PNN        | 88              | 91
X & Z  | CNN        | 85              | 87

SLIDE 15

Conclusions

  • A system capable of classifying self-affine, stochastic, non-stationary, multivariate signals originating from non-linear processes was developed
  • Feature extraction involving variance fractal dimensions and self-organizing feature maps was shown to be effective
  • Probabilistic neural networks and complex domain neural networks were shown to be capable of performing the desired classification

SLIDE 16

Acknowledgements

  • Natural Sciences and Engineering Research Council (NSERC) of Canada

  • University of Manitoba
SLIDE 17

References

[ChCa03] V. Cheung and K. Cannons, Signal Classification through Multifractal Analysis and Neural Networks. B.Sc. Thesis, Dept. of Electrical and Computer Engineering, University of Manitoba, Winnipeg, MB, 106 pp., 2003.

[Kins94] W. Kinsner, "Batch and real-time computation of a fractal dimension based on variance of a time series," Technical Report DEL94-6, University of Manitoba, June 15, 1994, (v+17) 22 pp.

[Koho84] T. Kohonen, Self-Organization and Associative Memory. Berlin: Springer-Verlag, 1984.

[Mast94] T. Masters, Signal and Image Processing with Neural Networks: A C++ Sourcebook. New York, NY: John Wiley & Sons, 1994.

[Spec88] D. F. Specht, "Probabilistic neural networks for classification, mapping, or associative memory," IEEE International Conference on Neural Networks, vol. 1, pp. 525-532, July 1988.