PERFORMANCE ANALYSIS OF PPFNN: PROBABILISTIC POTENTIAL FUNCTION NEURAL NETWORK
Gursel Serpen, Hong Jiang, Lloyd G. Allred*


SLIDE 1

PERFORMANCE ANALYSIS OF PPFNN

  • PROBABILISTIC POTENTIAL FUNCTION NEURAL NETWORK

  • Gursel Serpen, Hong Jiang, Lloyd G. Allred*

  • Electrical Engineering & Computer Science Department
    The University of Toledo, Toledo, OH 43606

  • *Software Engineering Division
    Ogden Air Logistics Center, Hill AFB, UT 84056

SLIDE 2

PROPERTIES OF NEURO-CLASSIFIER

  1. Offer fast (real-time) training and classification cycles, even if implemented in software.
  2. Do not require an initial guess for the network topology; rather, adapt topologically to the particular instance of the classification problem at hand, in a dynamic way, as training progresses.
  3. Discover clustering properties of the training data and adapt to a minimal network topology in terms of needed computational resources.
  4. Implement an incremental learning procedure and hence do not disturb the previous state of the network, but simply add new computational resources to the existing network topology.
  5. Require only a small number of heuristically specified parameters, with network performance insensitive to large variations in the values of those parameters.
  6. Form classification boundaries that optimally separate the classes, which are likely to be formed from sets of disconnected subclasses in the pattern space; the joint probability density function of a particular class is likely to have many modes.

SLIDE 3

SIGNIFICANT NEURAL CLASSIFICATION PARADIGMS

  ♦ The Multi-Layer Feedforward Network
    Initial network topology needs guessing; slow training speed makes it unsuitable for real-time implementations.

  ♦ The Radial Basis Function Network
    Network initialization depends on clustering properties of the training data (k-means); the hidden layer node count must be chosen.

  ♦ The Probabilistic Neural Network
    A pattern layer node for each training pattern; potentially large node counts.

  ♦ Learning Vector Quantization Networks
    Codebook vector initialization: no well-defined procedure exists.

SLIDE 4

TOPOLOGY OF PPFNN

  • Pattern layer nodes distribute the incoming signal values to hidden layer nodes.

  • Hidden layer nodes map the incoming signals as follows:

        exp(−α ‖x − x_k‖²)

    where α (Alpha) is a spread parameter of the exponential function centered at x_k.

  • Output layer nodes sum the incoming weighted signals and pass the weighted sum through:

        output = 0               if weighted sum < 0
                 weighted sum    if weighted sum ∈ [0, 1]
                 1               if weighted sum > 1

  • A MAXNET layer follows the output layer.

  [Figure: PPFNN topology — inputs feed the pattern layer, hidden layer, and output layer (weights w_ij), followed by the MAXNET layer producing the outputs.]
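The hidden-node mapping and the output-layer clamp described above can be sketched in a few lines (a minimal sketch assuming NumPy; the function names are illustrative, not from the paper):

```python
import numpy as np

def hidden_activation(x, x_k, alpha):
    """Potential function of a hidden node centered at training
    pattern x_k: exp(-alpha * ||x - x_k||^2)."""
    x, x_k = np.asarray(x, dtype=float), np.asarray(x_k, dtype=float)
    return float(np.exp(-alpha * np.sum((x - x_k) ** 2)))

def output_activation(weighted_sum):
    """Output-layer nonlinearity: clamp the weighted sum to [0, 1]."""
    return min(max(weighted_sum, 0.0), 1.0)
```

With a large alpha the potential function is narrow and a node responds only near its center; with a small alpha, nodes overlap broadly — the spread behavior that the Alpha parameter controls.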

SLIDE 5

NETWORK CREATION PROCESS

  • 2-CLASS PROBLEM

SLIDE 6
SLIDE 7

1. Initialize the PPFNN and assume a value for parameter Alpha.
2. Present a new feature vector (using k as index for feature vectors) and compute the network output.
3. If the network classifies the vector correctly for each class, no action is needed.
4. Else:
   A. Add a new hidden layer node (using i as index for nodes),
   B. Center the potential function represented by the new hidden layer node around this vector, and
   C. Repeat for each class (using j as index for classes):
      ∗ If the pattern belongs to the class and function f_i^k is positive, no action.
      ∗ Else if the pattern does not belong to the class and function f_i^k is negative, no action.
      ∗ Else if the pattern belongs to the class and function f_i^k is negative, connect the output of hidden node i to the output node for class j through a weight of +γ_ij^k.
      ∗ Else if the pattern does not belong to the class and function f_i^k is positive, connect the output of hidden node i to the output node for class j through a weight of −γ_ij^k.
5. Repeat the procedure until all training patterns are processed.
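The procedure above can be sketched as follows (an illustrative NumPy sketch, not the authors' code: it interprets f_i^k as the current class-j weighted sum at the presented pattern, and uses a fixed weight magnitude gamma for all γ_ij^k):

```python
import numpy as np

def train_ppfnn(patterns, labels, n_classes, alpha=4.0, gamma=1.0):
    """Incremental PPFNN creation following the slide's procedure.
    centers[i] holds the pattern that hidden node i is centered on;
    weights[i][j] is the connection from hidden node i to class output j."""
    centers, weights = [], []

    def weighted_sums(x):
        # Per-class weighted sums of the hidden-node potential functions.
        s = np.zeros(n_classes)
        for c, w in zip(centers, weights):
            s += w * np.exp(-alpha * np.sum((x - c) ** 2))
        return s

    for x, label in zip(np.asarray(patterns, dtype=float), labels):
        s = weighted_sums(x)
        # Step 3: correct when the own-class sum is positive and no other is.
        if all((s[j] > 0) == (j == label) for j in range(n_classes)):
            continue
        # Steps 4A/4B: add a hidden node centered on this pattern.
        centers.append(x.copy())
        w = np.zeros(n_classes)
        for j in range(n_classes):          # Step 4C
            if j == label and s[j] <= 0:
                w[j] = gamma                # excitatory weight corrects the miss
            elif j != label and s[j] > 0:
                w[j] = -gamma               # inhibitory weight suppresses a false positive
        weights.append(w)
    return centers, weights

def ppfnn_predict(x, centers, weights, alpha=4.0):
    """Classify x by the largest per-class weighted sum (MAXNET's role)."""
    s = np.zeros(len(weights[0]))
    x = np.asarray(x, dtype=float)
    for c, w in zip(centers, weights):
        s += w * np.exp(-alpha * np.sum((x - c) ** 2))
    return int(np.argmax(s))
```

On well-separated data the loop tends to add one node per cluster and then stops growing, reflecting the incremental, topology-adapting behavior claimed on the properties slide.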

SLIDE 8
SLIDE 9

SIMULATION RESULTS

CLASSIFICATION PERFORMANCES OF NEURAL NETWORK ALGORITHMS

  Classification Rate (%) on Test Data

  Test Data    MLP     LVQ     RBF     PNN     PPFNN
  2-Spiral¹    50.00   55.73   98.96   89.58   91.67
  IRIS         78.00   82.67   80.00    —      96.00
  Sonar        53.85   62.98   71.15   74.04   73.08
  Vowel        36.57   11.11   56.67    —      52.32
  Wisconsin    59.94   87.88   66.67   95.15   95.76
  Cleveland    55.17   57.93   65.86   55.86   58.28
  Thyroid      36.74   81.86   72.09    —      78.14

  ¹ Training and testing data sets are the same for this problem.
SLIDE 10
SLIDE 11
TRAINING TIME (IN SECONDS) REQUIREMENTS OF NEURAL NETWORK ALGORITHMS

           2-Spiral   IRIS    Sonar   Vowel   Wisconsin   Cleveland   Thyroid
  MLP      3556       12572   47400   52800   10317       8834        9746
  LVQ      1500       1320    7800    5237    1653        1835        450
  RBF      120        120     360     3600    137         495         65
  PNN      120         —      886      —      220         2532         —
  PPFNN    120        120     621     3777    98          1010        120

SLIDE 12
SENSITIVITY OF CLASSIFICATION PERFORMANCE OF PPFNN AS ALPHA VARIES

  Benchmark          Testing Interval   Max Variation in Classification Performance
  Problem            for Alpha          Training Data        Test Data
  Sonar              [4.0, 15]          1.24%                2.89%
  Vowel              [3.5, 15]          3.18%                3.03%
  Wisconsin          [0.8, 6.8]         0.00%                1.82%
  Cleveland          [6.8, 18.8]        6.39%                4.14%
  Thyroid Disease    [0.0012, 12]       21.74%               23.26%
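The variation figures above come from sweeping Alpha over an interval and recording the spread of the classification rate. A self-contained sketch of such a sweep, using a toy potential-function scorer and made-up data (every name and number here is illustrative, not from the paper):

```python
import numpy as np

def classification_rate(train, train_labels, test, test_labels, alpha):
    """Percent of test patterns classified correctly by a toy
    potential-function scorer: each class score is the sum of
    exp(-alpha * ||x - x_k||^2) over that class's training patterns."""
    train = np.asarray(train, dtype=float)
    classes = sorted(set(train_labels))
    correct = 0
    for x, y in zip(np.asarray(test, dtype=float), test_labels):
        scores = {c: sum(np.exp(-alpha * np.sum((x - train[i]) ** 2))
                         for i in range(len(train)) if train_labels[i] == c)
                  for c in classes}
        if max(scores, key=scores.get) == y:
            correct += 1
    return 100.0 * correct / len(test_labels)

# Sweep Alpha across an interval and record the spread in classification rate.
patterns = [[0, 0], [0, 1], [5, 5], [5, 6]]   # made-up, well-separated data
labels = [0, 0, 1, 1]
rates = [classification_rate(patterns, labels, patterns, labels, a)
         for a in np.linspace(0.8, 6.8, 13)]
max_variation = max(rates) - min(rates)       # the tables' "maximum variation"
```

On such well-separated toy data the rate does not move at all as Alpha varies; the table shows the same computation on the real benchmarks, where only the Thyroid Disease problem exhibits large sensitivity.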

SLIDE 13

CONCLUSIONS

  • Simulation results demonstrate that PPFNN performance is comparable to or better than that of MLP, RBF, LVQ and PNN on the set of seven benchmark problems considered.

  • Performance criteria employed in the simulation study included network training time and classification rates for test data.

  • Simulation results indicated that PPFNN consistently performed in the leading group of classifiers over the set of problems tested, which was not the case for the rest of the neuro-classifier algorithms.

  • The training time requirements of the PPFNN were generally minimal, leading to the conclusion that the PPFNN algorithm is a good choice for real-time implementation.

  • PPFNN performance was not affected by large variations in the value of the only adjustable parameter, Alpha, which determines the spread of the potential functions.

  • In conclusion, simulation results indicate that PPFNN is a robust neuro-classifier algorithm suitable for real-time environments.