

SLIDE 1

A Self-Organizing Fuzzy Neural Networks

  • H. S. LIN, X. Z. GAO, XIANLIN HUANG, AND Z. Y. SONG

SLIDE 2

Abstract

This paper proposes a novel clustering algorithm for the structure learning of fuzzy neural networks. Our clustering algorithm uses a reward-and-penalty mechanism to adapt the fuzzy neural network prototypes at every training sample. Compared with classical clustering algorithms, the new algorithm can partition the input data on-line, update the clusters pointwise, and self-organize the fuzzy neural structure.

SLIDE 3

Abstract

All rules are self-created, and they grow automatically with more incoming data. There are no conflicting rules in the created fuzzy neural networks. Our approach shows that supervised clustering algorithms can be suitable for the structure learning of self-organizing fuzzy neural networks. The identification of several typical nonlinear dynamical systems and the prediction of time series data are employed to demonstrate the effectiveness of the proposed fuzzy neural networks and their learning algorithm.

SLIDE 4

Introduction

It is well known that fuzzy logic provides human reasoning capabilities to capture uncertainties that cannot be described by precise mathematical models. Neural networks offer remarkable advantages, such as adaptive learning, parallelism, fault tolerance, and generalization. They have proved to be powerful techniques in the discipline of system control, especially when the controlled system is difficult to model accurately, or has large uncertainties and strong nonlinearities.

SLIDE 5

Introduction

Therefore, fuzzy logic and neural networks have been widely adopted in the model-free adaptive control of nonlinear systems. Numerous kinds of neural fuzzy systems have been proposed in the literature, and most of them are suitable only for off-line cases. Some on-line learning methods for neural fuzzy systems have been studied as well.

SLIDE 6

Introduction

In this paper, we propose a novel on-line clustering algorithm of structure learning for our fuzzy neural networks. This new clustering algorithm employs the mechanism of reward and penalty action used in Learning Vector Quantization (LVQ). With its on-line structure and parameter learning capability and fast convergence, our fuzzy neural network is a suitable candidate for real-time applications.

SLIDE 7

Structure of Fuzzy Neural Networks

(Figure: five-layer network topology with inputs $x_1, x_2, \ldots, x_d$ and outputs $y_1, y_2, \ldots, y_d$.)

SLIDE 8

Structure of Fuzzy Neural Networks

Layer 1: Each node in this layer only transmits input values to the next layer directly. Thus, the function of the $i$-th node is defined as

$f = u_i^{(1)} = x_i, \qquad a = f \tag{1}$

Layer 2: Each node in this layer corresponds to one linguistic label of one of the input variables in Layer 1. The operation performed in this layer is

$f = -\frac{\left(u_i^{(2)} - m_{ij}\right)^2}{\sigma_{ij}^2}, \qquad a = e^f \tag{2}$

where $m_{ij}$ and $\sigma_{ij}$ are the center and width of the Gaussian membership function.
SLIDE 9

Structure of Fuzzy Neural Networks

Layer 3: Nodes in this layer are rule nodes and constitute the antecedents of the fuzzy rule base. The input and output functions of the $i$-th rule node are

$f = \prod_{i=1}^{n} u_i^{(3)}, \qquad a = f \tag{3}$

Layer 4: The nodes in this layer are called "output-term nodes". The links in Layer 4 perform the fuzzy OR operation on the rules that have the same consequent:

$f = \sum_{j=1}^{J} u_j^{(4)}, \qquad a = \min(1, f) \tag{4}$

SLIDE 10

Structure of Fuzzy Neural Networks

Layer 5: These nodes and the attached Layer 5 links act as the defuzzifier. The following functions perform the Center Of Area (COA) defuzzification method:

$f = \sum_{ij} w_{ij}^{(5)} u_i^{(5)} = \sum_{ij} \left(m_{ij}\sigma_{ij}\right) u_i^{(5)}, \qquad a = \frac{f}{\sum_{ij} \sigma_{ij} u_i^{(5)}} \tag{5}$

Based on the above structure, an on-line learning algorithm will be proposed to determine the proper centers and widths of the term nodes.
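The five layers can be chained into one forward pass. The sketch below assumes one label per (input, label) pair, one consequent term per rule, and Gaussian labels as in Layer 2; all names and data layouts are assumptions for illustration:

```python
import math

def fnn_forward(x, in_m, in_s, rules, rule_to_term, out_m, out_s):
    """Minimal forward pass through the five-layer fuzzy neural network.
    in_m[i][j], in_s[i][j]: center/width of label j of input i (Layer 2).
    rules: list of label-index tuples, one index per input (Layer 3).
    rule_to_term[r]: output-term index fed by rule r (Layer 4).
    out_m[k], out_s[k]: center/width of output term k (Layer 5)."""
    # Layers 1-2: transmit inputs and compute Gaussian membership grades.
    mu = [[math.exp(-((x[i] - in_m[i][j]) ** 2) / in_s[i][j] ** 2)
           for j in range(len(in_m[i]))] for i in range(len(x))]
    # Layer 3: product (fuzzy AND) over the rule's antecedents.
    strengths = [math.prod(mu[i][j] for i, j in enumerate(rule)) for rule in rules]
    # Layer 4: bounded sum (fuzzy OR) of rules sharing a consequent.
    u = [0.0] * len(out_m)
    for r, s in enumerate(strengths):
        u[rule_to_term[r]] += s
    u = [min(1.0, v) for v in u]
    # Layer 5: center-of-area defuzzification.
    num = sum(out_m[k] * out_s[k] * u[k] for k in range(len(u)))
    den = sum(out_s[k] * u[k] for k in range(len(u)))
    return num / den if den else 0.0
```

With a single rule fully activated, the output collapses to the center of the single output term, which is a quick sanity check on the COA formula.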

SLIDE 11

Learning Algorithm for Structure Identification

In our fuzzy neural networks, for every on-line incoming training pattern, we first use the novel clustering algorithm to identify the structure, and next apply the backpropagation algorithm to optimize the parameters. In our learning method, only the training data are needed. The input/output-term nodes and rule nodes are created dynamically as learning proceeds upon receiving on-line incoming training data. During the learning process, new input-term nodes, output-term nodes, and rule nodes will be added.

SLIDE 12

Learning Algorithm for Structure Identification

The main idea of our clustering algorithm is that, for every input datum, we first find the winner clusters in the input and output spaces, respectively. Next, as in the fuzzy ARTMAP, we check whether the winner cluster in the input space is connected to the winner cluster in the output space. If so, we assume that the winner cluster in the output space is the correct prediction of the winner cluster in the input space, which is analogous to the fact that, in the fuzzy ARTMAP, the fuzzy ARTb category is the correct prediction for the fuzzy ARTa category activated by an input.

SLIDE 13

Learning Algorithm for Structure Identification

If not, we assume that a mismatch occurs between the winner cluster in the input space and the winner cluster in the output space, and we begin to search for another cluster in the input space that will match the winner cluster in the output space. The reward-and-penalty mechanism is employed in our clustering algorithm. We can describe the novel clustering algorithm as follows.

SLIDE 14

Learning Algorithm for Structure Identification

Step 1: Initialize the fuzzy system with zero clusters: $n_I = 0$, $n_O = 0$. Step 2: The first input and output training vectors are selected as the centers of the first clusters in the input and output spaces, respectively. We connect the first cluster in the input space to the first cluster in the output space, and set the number of data belonging to each cluster to one.

SLIDE 15

Learning Algorithm for Structure Identification

Step 3: For an input data point in the input space, we compute the distances between the input vector and the existing input-space clusters using the Euclidean metric:

$d_p = \sqrt{\sum_{l=1}^{q} \left(x^l - Im_p^l\right)^2}$

The nearest cluster (winner neuron) $j$ is chosen by selecting the minimum distance $d_j$. Next, the following algorithms are used:

SLIDE 16

Learning Algorithm for Structure Identification

If $d_j$ is larger than a certain value $d_{vigilance}$, we assume that this input datum does not belong to any existing cluster, and we form a new cluster. The newly added cluster is the winner cluster in the input space. If $d_j$ is smaller than $d_{vigilance}$, we assume that cluster $j$ is the winner cluster in the input space. The procedure applied to the input space in Step 3 is also adopted in the output space, so we can likewise find the winner cluster in the output space.
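The winner search with the vigilance test can be sketched as follows; the function name and return convention are illustrative assumptions:

```python
import math

def find_winner(x, centers, d_vigilance):
    """Step 3: find the winner cluster for input x among the existing
    cluster centers; spawn a new cluster when the nearest one is farther
    than the vigilance threshold d_vigilance."""
    if not centers:
        return 0, True                      # no clusters yet: create the first one
    dists = [math.dist(x, c) for c in centers]
    j = min(range(len(dists)), key=dists.__getitem__)
    if dists[j] > d_vigilance:
        return len(centers), True           # the newly added cluster is the winner
    return j, False
```

The same routine can be run once on the input space and once on the output space to obtain both winner clusters.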

SLIDE 17

Learning Algorithm for Structure Identification

Step 4: We check the mapping from the input clusters to the output clusters. (1) If the winner cluster in the input space is a new cluster, we connect this new cluster to the winner cluster in the output space, and update the parameters of the winner cluster in the output space:

$Om_{winner} = \frac{Oc_{winner} \times Om_{winner} + y_{eph}}{Oc_{winner} + 1}$

where $y_{eph}$ is the current training sample in the output space.

SLIDE 18

Learning Algorithm for Structure Identification

(2) If the winner cluster of the input space is already connected to the winner cluster of the output space, we adopt the following algorithm to update the centers, variances, and counters of the winner clusters (the output-space update is shown):

$Om_{winner} = \frac{Oc_{winner} \times Om_{winner} + y_{eph}}{Oc_{winner} + 1}$

$O\sigma_{winner}^2 = \frac{Oc_{winner} \times O\sigma_{winner}^2 + \left(y_{eph} - Om_{winner}\right)^2}{Oc_{winner} + 1}$

$Oc_{winner} = Oc_{winner} + 1$
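This reward step is a counter-weighted running-mean/variance update. A sketch of one plausible reading of the slide's equations (variable names are mine, not the paper's):

```python
def reward_update(m, var, count, sample):
    """Reward: pull the winner cluster's center m and variance var toward
    the current training sample, then increment its counter count."""
    m_new = (count * m + sample) / (count + 1)
    var_new = (count * var + (sample - m) ** 2) / (count + 1)
    return m_new, var_new, count + 1
```

With scalar statistics, rewarding a cluster at (m=1, var=0, count=1) with the sample 3 moves the center to the running mean 2 and opens the variance accordingly.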

SLIDE 19

Learning Algorithm for Structure Identification

(3) If the winner cluster of the input space is not yet connected to the winner cluster of the output space, we use the following algorithm to punish the winner cluster of the input space:

$Im_{winner} = \frac{Ic_{winner} \times Im_{winner} + x_{eph}}{Ic_{winner} + 1}$

$I\sigma_{winner}^2 = \frac{Ic_{winner} \times I\sigma_{winner}^2 + \left(x_{eph} - Im_{winner}\right)^2}{Ic_{winner} + 1}$

$Ic_{winner} = Ic_{winner} + 1$

SLIDE 20

Learning Algorithm for Structure Identification

After that, we return to Step 3 to search for another cluster in the input space that will match the winner cluster in the output space:

$Im_{winner} = \frac{Ic_{winner} \times Im_{winner} - x_{eph}}{Ic_{winner} - 1}$

$I\sigma_{winner}^2 = \frac{Ic_{winner} \times I\sigma_{winner}^2 - \left(x_{eph} - Im_{winner}\right)^2}{Ic_{winner} - 1}$

$Ic_{winner} = Ic_{winner} - 1$
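The penalty is the mirror image of the reward: the sample's contribution is subtracted and the counter decremented. A sketch under the same naming assumptions as before (the guard against emptying a cluster is my addition):

```python
def penalty_update(m, var, count, sample):
    """Penalty: push the wrongly matched winner cluster away from the
    sample by removing its contribution, and decrement the counter."""
    if count <= 1:
        return m, var, count                # keep at least one sample's statistics
    m_new = (count * m - sample) / (count - 1)
    var_new = (count * var - (sample - m) ** 2) / (count - 1)
    return m_new, max(var_new, 0.0), count - 1
```

Applying `penalty_update` directly after the matching `reward_update` undoes that reward, which is the sense in which the two steps are symmetric.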

SLIDE 21

Learning Algorithm for Structure Identification

Our structure learning algorithm is actually a supervised clustering method for identifying the structure of the fuzzy neural networks. As we know, supervised clustering algorithms are effective, and they converge fast. Furthermore, our fuzzy neural network has a remarkable self-learning ability: it can self-generate fuzzy rules and self-adapt the structure and synaptic weights. Note that there are no conflicting rules in the generated fuzzy neural networks. In summary, the proposed structure learning strategy provides a new way to utilize a class of supervised clustering algorithms for the on-line structure learning of fuzzy neural networks.


SLIDE 23

Parameter Learning of Fuzzy Neural Networks

We use the backpropagation algorithm to tune the parameters of the fuzzy neural networks. The centers and variances of the clusters in Layer 5 are updated by

$Om_k(t+1) = Om_k(t) + \eta \left[y_1(t) - y(t)\right] \frac{O\sigma_k(t)\, f_k^{(4)}(t)}{f_1(t)}$

$O\sigma_k(t+1) = O\sigma_k(t) + \eta \left[y_1(t) - y(t)\right] \frac{Om_k(t)\, f_k^{(4)}(t)\, f_1(t) - f_2(t)\, f_k^{(4)}(t)}{f_1^2(t)}$

where $y_1(t)$ is the desired output, and $f_1(t)$ and $f_2(t)$ denote the denominator and numerator of the COA computation in Layer 5.

SLIDE 24

Parameter Learning of Fuzzy Neural Networks

The centers and variances of the clusters in Layer 2 are updated by

$Im_j^i(t+1) = Im_j^i(t) + \eta \times error_j(t) \times f_j^{(3)}(t) \times \frac{2\left(x_i - Im_j^i(t)\right)}{\left(I\sigma_j^i(t)\right)^2}$

$I\sigma_j^i(t+1) = I\sigma_j^i(t) + \eta \times error_j(t) \times f_j^{(3)}(t) \times \frac{2\left(x_i - Im_j^i(t)\right)^2}{\left(I\sigma_j^i(t)\right)^3}$

SLIDE 25

Simulations

Example 1 — Identification of an SISO dynamic system: the plant to be identified is described by the following difference equation:

$y(t+1) = \frac{y(t)}{1 + y^2(t)} + u^3(t)$
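The difference equation can be simulated directly to generate training data. The sinusoidal excitation $u(t)$ below is an assumption; these slides do not specify the input signal:

```python
import math

def plant_step(y, u):
    """One step of the SISO plant: y(t+1) = y(t)/(1 + y(t)^2) + u(t)^3."""
    return y / (1.0 + y * y) + u ** 3

def simulate(steps):
    """Roll the plant forward under a hypothetical sinusoidal excitation."""
    y, trajectory = 0.0, []
    for t in range(steps):
        u = math.sin(2.0 * math.pi * t / 25.0)   # assumed input, not from the paper
        y = plant_step(y, u)
        trajectory.append(y)
    return trajectory
```

Pairs (y(t), u(t)) → y(t+1) from such a rollout are what the identification model is trained on.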

SLIDE 26

Simulations

Root-mean-square errors during learning.

(Figure: RMS errors vs. training iterations, up to 100 000 iterations.)

SLIDE 27

Simulations

Outputs of SISO system and identification model.

(Figure: output vs. time in samples.)

SLIDE 28

Simulations

Example 2 — Identification of an MISO dynamic system: the plant to be identified is a two-input, one-output dynamic system described by the following equation:

$y(x_1, x_2) = \left(1 - x_1\right)^2 + \left(1 - x_2\right)^2$

SLIDE 29

Simulations

Outputs of the MISO system and the identification model. (Figure: outputs vs. time in samples.)

SLIDE 30

Simulations

Example 3 — Identification of the MIMO dynamic system: the plant is described as:

$\begin{bmatrix} y_1(t+1) \\ y_2(t+1) \end{bmatrix} = \begin{bmatrix} \dfrac{y_1(t)}{1 + y_2^2(t)} \\ \dfrac{y_1(t)\, y_2(t)}{1 + y_2^2(t)} \end{bmatrix} + \begin{bmatrix} u_1(t) \\ u_2(t) \end{bmatrix}$

SLIDE 31

Simulations

Response of the MIMO system and the identification model (output $y_1$). (Figure: $y_1$ vs. time in samples.)

SLIDE 32

Simulations

Response of the MIMO system and the identification model (output $y_2$). (Figure: $y_2$ vs. time in samples.)

SLIDE 33

Simulations

Example 4 — Prediction of time series. The performance of our fuzzy neural networks in dealing with real-world prediction problems is demonstrated here by predicting the time series of an automobile gearbox. This slightly nonlinear time series is collected using a sound level meter, and the analyzed acoustic data is provided by a Japanese automobile manufacturer.

SLIDE 34

Simulations

It represents the external sound level of an automatic transmission system. Below is a brief summary of the experimental measurement conditions and the instrumentation used: ① 1000 r/min rotational velocity of the automobile engine; ② 5 Nm load torque; ③ integrated sound level meter (Ono Sokki LA-5110); ④ 5 ms sampling period.

SLIDE 35

Simulations

Actual and predicted time series of the gearbox. (Figure: sound level vs. time in samples.)

SLIDE 36

Conclusions

In our paper, a novel clustering algorithm is proposed for the structure learning of fuzzy neural networks. This clustering algorithm can partition the input data on-line and self-organize the fuzzy neural structure. Therefore, no a priori knowledge of the distribution of the input data is needed to initialize the fuzzy rules; they are generated automatically with the incoming training data.

SLIDE 37

Conclusions

Our fuzzy neural networks can use this on- line training algorithm for the structure and parameter training. The effectiveness of the proposed learning algorithm is verified by the identification of dynamical nonlinear systems and prediction of time series.