Institute of Integrated Sensor Systems, Dept. of Electrical Engineering and Information Technology

Comparison of PSO-Based Optimized Feature Computation for Automated Configuration of Multi-Sensor Systems

Kuncup Iswandy and Andreas Koenig


SLIDE 1

Institute of Integrated Sensor Systems
Dept. of Electrical Engineering and Information Technology

Comparison of PSO-Based Optimized Feature Computation for Automated Configuration of Multi-Sensor Systems

Kuncup Iswandy and Andreas Koenig

SLIDE 2

Overview

  • Introduction
  • Feature Computation Methods
  • Optimization Methods
  • Experiments and Results
  • Conclusions
SLIDE 3

Introduction

Motivation

  • Intelligent sensor systems find increasingly widespread application, e.g., in the fields of ambient intelligence and sensor networks.
  • The growing spectrum of available sensor principles and implementations requires:
  • a large variety of sensor electronics
  • sensor signal processing techniques
  • The design process goes through:
  • the principal steps of sensor selection and scene optimization,
  • choice of signal and feature processing,
  • dimensionality reduction, and
  • classification
SLIDE 4

Introduction

Motivation

Currently, optimizing the design of intelligent sensor systems is:

  • still elaborated manually by a human designer
  • tedious
  • a time- and labor-consuming process
  • potentially suboptimal in outcome
SLIDE 5

Introduction

Goals:

  • contribute to the design automation activities for intelligent (multi-)sensor systems
  • focus on the optimization of feature computation
  • compare genetic algorithms and particle swarm optimization

[Figure: general architecture of an intelligent sensor system: Sensor & Scene → Signal Processing & Feature Computation → Dimension Reduction → Classification, with an optimization loop feeding assessment results back as parameters of the processing stages.]

SLIDE 6

Feature Computation Methods

The roles of feature computation techniques:

  • extract the meaningful information from the raw sensor response data
  • reduce the dimension of the variable vector of a pattern
  • increase computation speed and classification accuracy

For the application of gas sensor systems in particular, two feature computation (kernel) techniques have been proposed:

  • multi-level threshold
  • Gaussian windowing

[Figure: sensor response patterns (conductance [a.u.] vs. time [ms]) during two temperature cycles; gas stimuli: H2 7 ppm, CH4 1000 ppm, ethanol 0.8 ppm, CO 40 ppm.]

SLIDE 7

Feature Computation Methods

Multi-Level Threshold (MLT)

MLT computes features similarly to a histogram or amplitude distribution. There are two MLT modes, i.e., differential mode (DM) and cumulative mode (CM). The features of MLT are computed as below, with:

  • y_s : magnitude value of the sensor signal, with s = 1, 2, ..., N_r
  • N_r : total number of samples of a pattern
  • i : the number of features (i = T − 1)
  • T : the number of thresholds used
  • T_p and T_q : threshold levels, with q = 2, 3, ..., T and p = q − 1 for DM, and with q = T and p = 1, 2, ..., T − 1 for CM

[Figure: feature computation of MLT for a gas stimulus presentation of the first derivative of the conductance.]

$$z_i = \sum_{s=1}^{N_r} \delta(y_s, T_p, T_q)$$

$$\delta(y_s, T_p, T_q) = \begin{cases} 1, & T_p \le y_s \le T_q \\ 0, & \text{otherwise} \end{cases}$$
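As a sketch, the MLT feature computation above can be written in Python; the signal values and threshold levels below are illustrative stand-ins, not values from the experiments:

```python
import numpy as np

def mlt_features(y, levels, mode="differential"):
    """Count samples of y falling between threshold levels.

    differential mode (DM): bins [T_{q-1}, T_q] for q = 2..T
    cumulative mode (CM):   bins [T_p, T_T] for p = 1..T-1
    """
    y = np.asarray(y, dtype=float)
    T = np.asarray(levels, dtype=float)
    if mode == "differential":
        pairs = [(T[q - 1], T[q]) for q in range(1, len(T))]
    else:  # cumulative
        pairs = [(T[p], T[-1]) for p in range(len(T) - 1)]
    return [int(np.sum((lo <= y) & (y <= hi))) for lo, hi in pairs]

signal = [0.1, 0.4, 0.7, 1.2, 1.8, 2.5]   # toy sensor magnitudes (assumed)
levels = [0.0, 0.5, 1.5, 3.0]             # T = 4 threshold levels (assumed)
dm = mlt_features(signal, levels, "differential")   # -> [2, 2, 2]
cm = mlt_features(signal, levels, "cumulative")     # -> [6, 4, 2]
```

In DM each sample is counted once in its amplitude bin (a histogram), while in CM every bin shares the top threshold, so the counts accumulate.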

SLIDE 8

Feature Computation Methods

Gaussian Windowing Function

Gaussian windowing extracts features directly from the conductance curves (transient responses). Each feature is represented by a kernel base, i.e., a Gaussian exponential function, with parameters mean μ and standard deviation σ. The features are computed as:

[Figure: feature computation of Gaussian windowing (window time slicing) for a normalized conductance curve.]

$$z_i = \sum_{s=1}^{N_r} y_s \cdot G(s, \mu_i, \sigma_i)$$

$$G(s, \mu_i, \sigma_i) = \exp\left(-\frac{1}{2}\left(\frac{s - \mu_i}{\sigma_i}\right)^{2}\right)$$
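A minimal Python sketch of the Gaussian-windowing features above; the conductance curve and the kernel parameters are illustrative assumptions:

```python
import numpy as np

def gaussian_window_features(y, mus, sigmas):
    """z_i = sum_s y_s * exp(-0.5 * ((s - mu_i) / sigma_i)**2), s = 1..Nr."""
    y = np.asarray(y, dtype=float)
    s = np.arange(1, len(y) + 1)          # sample index s = 1..Nr
    return np.array([np.sum(y * np.exp(-0.5 * ((s - mu) / sig) ** 2))
                     for mu, sig in zip(mus, sigmas)])

curve = [0.0, 0.0, 1.0, 0.0, 0.0]         # toy normalized conductance (assumed)
z = gaussian_window_features(curve, mus=[3.0], sigmas=[1.0])   # -> [1.0]
```

Each kernel acts as a soft time window: samples near μ_i contribute with weight close to 1, distant samples are attenuated.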

SLIDE 9

Optimization Methods

Genetic Algorithms for the real-valued case

GA procedure:

  1. Randomly generate a real-valued population.
  2. Select individuals for mating using roulette-wheel selection.
  3. Apply discrete recombination with rate = 0.8, for example:
     X = (x1, x2, x3, x4, x5) → X′ = (x1, x2, y3, x4, x5)
     Y = (y1, y2, y3, y4, y5) → Y′ = (y1, y2, x3, y4, y5)
  4. Apply mutation with Gaussian perturbation and rate = 0.01, for example:
     X′ = (x1, x2, ..., xs, ..., xi), 1 ≤ s ≤ i
     xs_new = xs + αs · N(0, σs)
  5. Replacement: elitism over 10 % of the parent population and the offspring individuals.
  6. While the stopping criterion is not met, repeat from step 2; otherwise stop.
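The six steps above can be sketched as a minimal real-valued GA in Python; the toy fitness function, the search ranges, and the mutation width are assumptions added for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(x):                          # toy fitness (assumed): peak at origin
    return 1.0 / (1.0 + np.sum(x ** 2))

def roulette(pop, fit):                  # step 2: roulette-wheel selection
    idx = rng.choice(len(pop), size=len(pop), p=fit / fit.sum())
    return pop[idx]

def recombine(pop, rate=0.8):            # step 3: discrete recombination
    child = pop.copy()
    for a in range(0, len(pop) - 1, 2):
        if rng.random() < rate:
            m = rng.random(pop.shape[1]) < 0.5   # genes swapped between mates
            child[a, m], child[a + 1, m] = pop[a + 1, m], pop[a, m]
    return child

def mutate(pop, rate=0.01, sigma=0.1):   # step 4: Gaussian perturbation
    return pop + (rng.random(pop.shape) < rate) * rng.normal(0, sigma, pop.shape)

pop = rng.uniform(-1, 1, size=(20, 5))   # step 1: random real-valued population
init_best = max(fitness(x) for x in pop)
for _ in range(50):                      # step 6: iterate
    fit = np.array([fitness(x) for x in pop])
    elite = pop[np.argsort(fit)[-2:]]    # step 5: keep the best 10 % (2 of 20)
    pop = mutate(recombine(roulette(pop, fit)))
    pop[:2] = elite
best = max(fitness(x) for x in pop)      # never below init_best, thanks to elitism
```

Because the elite individuals are copied into every new generation, the best fitness is monotonically non-decreasing over the run.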
SLIDE 10

Optimization Methods

Particle Swarm Optimization for the real-valued case

PSO is one of the evolutionary computation techniques: a population-based search algorithm, related to bird flocking and swarm theory, whose population of random solutions is called particles. PSO algorithm, with:

  • w = 1 : inertia weight
  • C1 = C2 = 2 : positive constants
  • Rand() and rand() : random functions in [0, 1]
  • p_i : best previous position of the i-th particle
  • p_g : best particle among all particles

[Figure: update of particle position x_i and velocity v_i toward p_i and p_g in the d-dimensional search space.]

$$v_{id}(t+1) = w\,v_{id}(t) + C_1\,\mathrm{rand}()\,\big(p_{id} - x_{id}\big) + C_2\,\mathrm{Rand}()\,\big(p_{gd} - x_{id}\big)$$

$$x_{id}(t+1) = x_{id}(t) + v_{id}(t+1)$$
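A compact Python sketch of the two update equations above, using the slide's parameters w = 1 and C1 = C2 = 2; the toy objective and the velocity clamping to [-4, 4] are assumptions added for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
w, C1, C2 = 1.0, 2.0, 2.0                # parameters from the slide

def f(x):                                # toy objective (assumed): maximize
    return 1.0 / (1.0 + np.sum(np.square(x), axis=-1))

n, d = 20, 3
x = rng.uniform(-1, 1, (n, d))           # particle positions
v = np.zeros((n, d))                     # particle velocities
p, pf = x.copy(), f(x)                   # personal bests and their fitness
g = p[np.argmax(pf)]                     # global best particle
init_best = pf.max()

for _ in range(100):
    r1, r2 = rng.random((n, d)), rng.random((n, d))
    v = w * v + C1 * r1 * (p - x) + C2 * r2 * (g - x)
    v = np.clip(v, -4, 4)                # clamping (assumed; keeps w = 1 stable)
    x = x + v
    fx = f(x)
    better = fx > pf                     # update personal and global bests
    p[better], pf[better] = x[better], fx[better]
    g = p[np.argmax(pf)]
```

Since personal bests are only ever overwritten by better positions, the swarm's best fitness never decreases.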

SLIDE 11

Optimization Methods

Feature Assessment: Nonparametric Overlap Measure (NPOM)

NPOM is inspired by the probability estimation in the edited nearest-neighbor method and is used as the optimization criterion (fitness function). q_o is normalized in [0, 1]; a value of 1 indicates no overlap of the class regions. NPOM is computed as:

$$q_o = \frac{1}{2 N_L} \sum_{c=1}^{L} \sum_{j=1}^{N_c} \frac{1}{k} \sum_{i=1}^{k} \left(q_{NN_{ji}} + 1\right)$$

with

$$n_i = 1 - \frac{d_{NN_{ji}}}{d_{NN_{jk}}}, \qquad q_{NN_{ji}} = \begin{cases} n_i, & \omega_j = \omega_i \\ -n_i, & \omega_j \ne \omega_i \end{cases}$$
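A Python sketch of the overlap measure as given above; the normalization follows the reconstructed formula, so treat this as an approximation of the authors' NPOM rather than a reference implementation, and the example point clouds are assumptions:

```python
import numpy as np

def npom(X, labels, k=5):
    """Nonparametric overlap measure: distance-weighted neighbor votes,
    signed by class agreement, normalized to [0, 1] (1 = no overlap)."""
    X, labels = np.asarray(X, float), np.asarray(labels)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)          # a sample is not its own neighbor
    total = 0.0
    for j in range(len(X)):
        nn = np.argsort(D[j])[:k]        # k nearest neighbors, nearest first
        n = 1.0 - D[j, nn] / D[j, nn[-1]]      # n_i = 1 - d_ji / d_jk
        sign = np.where(labels[nn] == labels[j], 1.0, -1.0)
        total += np.mean(sign * n + 1.0)
    return total / (2.0 * len(X))

pts = np.array([[0.00, 0.00], [0.10, 0.00], [0.00, 0.10], [0.10, 0.10],
                [0.05, 0.05], [0.02, 0.08],
                [10.00, 10.00], [10.10, 10.00], [10.00, 10.10], [10.10, 10.10],
                [10.05, 10.05], [10.02, 10.08]])
q_sep = npom(pts, [0] * 6 + [1] * 6)     # well-separated classes: high q_o
q_mix = npom(pts, [0, 1] * 6)            # interleaved labels: lower q_o
```

Neighbors of the wrong class subtract from the score, so interleaved classes push q_o toward 0 while separated ones push it toward 1.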

SLIDE 12

Optimization Methods

Approaches of Feature Computation / Feature Selection

Wrapper approach: Sensor → Raw feature computation → Dimension reduction → Classifier train/test; the assessment-and-modification loop adjusts the parameters of the processing chain based on the classification result R.

Filter approach: the same processing chain, but the assessment-and-modification loop adjusts the parameters based on the overlap measure q_o instead of the classification result.

[Figure: block diagrams of the wrapper and filter approaches.]

SLIDE 13

Optimization Methods

Genetic Algorithms for the binary case

  • Representation: binary (switch variables)
  • One-point crossover; rate = 0.8
  • Mutation; rate = 0.01
  • Reproduction: best 10 % of parents and offspring

[Figure: one-point crossover of parents P1 and P2 into offspring Ofs1 and Ofs2 at a random cut point, and mutation of a parent P into offspring Ofs by flipping randomly chosen bits between 1 and 0.]
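The two binary operators above can be sketched in Python; population handling and fitness evaluation follow the real-valued GA and are omitted here:

```python
import numpy as np

rng = np.random.default_rng(2)

def one_point_crossover(p1, p2, rate=0.8):
    """Swap the tails of two binary parents after a random cut point."""
    if rng.random() < rate:
        cut = int(rng.integers(1, len(p1)))
        return (np.concatenate([p1[:cut], p2[cut:]]),
                np.concatenate([p2[:cut], p1[cut:]]))
    return p1.copy(), p2.copy()

def bit_flip(ind, rate=0.01):
    """Flip each bit independently with probability `rate`."""
    return np.where(rng.random(len(ind)) < rate, 1 - ind, ind)

p1 = np.zeros(8, dtype=int)
p2 = np.ones(8, dtype=int)
o1, o2 = one_point_crossover(p1, p2)     # complementary parents: bits conserved
m = bit_flip(p2)
```

With complementary parents, one-point crossover only redistributes the set bits between the two offspring, which makes the operator easy to check.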

SLIDE 14

Optimization Methods

Particle Swarm Optimization for the binary case

The original PSO is designed for real-valued problems. In the binary version (BPSO), each particle dimension is assigned a binary value, i.e., 0 or 1, and velocities are restricted to the interval [-4, 4]. The parameters w, C1, and C2 are the same as in the original PSO. BPSO algorithm:

$$v_{id}(t+1) = w\,v_{id}(t) + C_1\,\mathrm{rand}()\,\big(p_{id} - x_{id}\big) + C_2\,\mathrm{Rand}()\,\big(p_{gd} - x_{id}\big)$$

$$x_{id} = \begin{cases} 1, & U(0,1) < \mathrm{sigmoid}(v_{id}) \\ 0, & \text{otherwise} \end{cases}, \qquad \mathrm{sigmoid}(v_{id}) = \frac{1}{1 + e^{-v_{id}}}$$
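A Python sketch of one BPSO iteration following the equations above; the particle initialization and best positions are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def bpso_step(x, v, p, g, w=1.0, c1=2.0, c2=2.0):
    """One BPSO iteration: real-valued velocity update, probabilistic bits."""
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (p - x) + c2 * r2 * (g - x)
    v = np.clip(v, -4.0, 4.0)            # velocities restricted to [-4, 4]
    x = (rng.random(x.shape) < sigmoid(v)).astype(int)   # U(0,1) < sigmoid(v)
    return x, v

x = rng.integers(0, 2, size=(4, 6))      # 4 particles, 6 binary switches
v = np.zeros((4, 6))
p = x.copy()                             # personal bests (toy initialization)
g = np.ones(6, dtype=int)                # global best (toy: all switches on)
x2, v2 = bpso_step(x, v, p, g)
```

Clamping to [-4, 4] keeps sigmoid(v) away from 0 and 1, so every bit retains a nonzero probability of flipping in either direction.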

SLIDE 15

Experiments and Results

Data Description and Parameter Setting

  • Benchmark data of a gas sensor system; gas types: H2, CH4, ethanol, and CO.
  • The data set consists of 810 measured values and 264 patterns, separated into training (144 patterns) and test (120 patterns) sets.
  • Each experiment is repeated over 10 runs; each run is limited to 100 iterations.
  • The population size is 20 individuals for both GA and PSO.
  • The number of nearest neighbors is set to five, both for the overlap (NPOM) measurement and for the kNN voting classifier.
  • The classification accuracy is estimated using the holdout method and leave-one-out cross-validation.

SLIDE 16

Experiments and Results

Comparison between GA and PSO

MLT, differential mode (recognition accuracy via kNN; entries are Mean / Std):

| Method | Overlap q_o | train(%) | test(%) | test-LOO(%) |
|--------|-------------|----------|---------|-------------|
| GA | 0.9950 / 0.0035 | 99.44 / 0.55 | 99.67 / 0.58 | 99.17 / 0.79 |
| PSO | 1.00 / 0 | 100 / 0 | 100 / 0 | 99.83 / 0.35 |

MLT, cumulative mode (recognition accuracy via kNN; entries are Mean / Std):

| Method | Overlap q_o | train(%) | test(%) | test-LOO(%) |
|--------|-------------|----------|---------|-------------|
| GA | 0.9878 / 0.0044 | 98.89 / 0.36 | 99.50 / 6.36 | 98.67 / 1.48 |
| PSO | 0.9953 / 0.0024 | 99.10 / 0.34 | 99.92 / 0.89 | 99.83 / 0.35 |

SLIDE 17

Experiments and Results

Gaussian windowing feature computation optimized by PSO (recognition accuracy via kNN; entries are Mean / Std):

| No. of kernels (×10) | Overlap q_o | train(%) | test(%) | test-LOO(%) |
|----------------------|-------------|----------|---------|-------------|
| 3 | 0.9806 / 0.0044 | 97.91 / 0.65 | 99.00 / 0.66 | 95.50 / 2.29 |
| 4 | 0.9791 / 0.0081 | 97.78 / 0.79 | 99.00 / 0.77 | 95.83 / 1.76 |
| 5 | 0.9794 / 0.0021 | 98.13 / 0.03 | 99.67 / 0.43 | 96.08 / 1.11 |
| 6 | 0.9797 / 0.0034 | 97.71 / 0.74 | 98.75 / 0.90 | 94.92 / 2.17 |
| 7 | 0.9795 / 0.0015 | 98.13 / 0.57 | 99.25 / 0.73 | 96.92 / 1.11 |
| 8 | 0.9786 / 0.0027 | 97.92 / 0.46 | 99.00 / 0.53 | 95.67 / 0.95 |
| 9 | 0.9786 / 0.0031 | 97.92 / 0.46 | 99.08 / 0.61 | 95.83 / 1.36 |
| 10 | 0.9787 / 0.0016 | 98.13 / 0.47 | 99.75 / 0.40 | 96.08 / 0.88 |
SLIDE 18

Experiments and Results

Feature Selection optimized by PSO

As the next step, feature selection is applied in order to select the most relevant features, which shows a further improvement of the classification accuracy (kNN):

| No. of kernels (×10) | Selected features | Overlap q_o | train(%) | test(%) | test-LOO(%) |
|----------------------|-------------------|-------------|----------|---------|-------------|
| 3 | 10 | 0.9806 | 99.31 | 100 | 99.17 |
| 4 | 10 | 0.9791 | 98.61 | 100 | 98.33 |
| 5 | 36 | 0.9794 | 99.31 | 100 | 98.33 |
| 6 | 26 | 0.9797 | 99.31 | 100 | 94.92 |
| 7 | 29 | 0.9795 | 99.31 | 100 | 98.33 |
| 8 | 34 | 0.9786 | 99.31 | 100 | 99.17 |
| 9 | 41 | 0.9786 | 99.31 | 99.17 | 98.33 |
| 10 | 50 | 0.9787 | 99.31 | 100 | 98.33 |

SLIDE 19

Conclusions

  • Contribution to the optimization of the feature computation step.
  • Investigation of the MLT and Gaussian windowing methods optimized by evolutionary techniques (i.e., GA and PSO).
  • The MLT methods require less computation and achieve better results than Gaussian windowing.
  • PSO performs better than GA.
  • Applying feature selection demonstrated a further improvement of the accuracy rate.

Future work:

  • Consider applications in sensor networks.
  • Develop a library/toolbox of feature computation techniques.
  • Apply feature-level fusion in order to advance the accuracy rate.
  • Combine feature assessment functions (wrapper and filter) with regard to multi-objective optimization.

SLIDE 20

Thank You!