  1. A Self-Organizing Fuzzy Neural Networks H. S. LIN, X. Z. GAO, XIANLIN HUANG, AND Z. Y. SONG

  2. Abstract: This paper proposes a novel clustering algorithm for the structure learning of fuzzy neural networks. Our clustering algorithm uses a reward and penalty mechanism to adapt the fuzzy neural network's prototypes at every training sample. Compared with classical clustering algorithms, the new algorithm can partition the input data on-line, update the clusters pointwise, and self-organize the fuzzy neural structure.

  3. Abstract: All rules are self-created, and they grow automatically as more data arrive. There are no conflicting rules in the created fuzzy neural networks. Our approach shows that supervised clustering algorithms can be suitable for the structure learning of self-organizing fuzzy neural networks. The identification of several typical nonlinear dynamical systems and the prediction of time series data are employed to demonstrate the effectiveness of the proposed fuzzy neural networks and their learning algorithm.

  4. Introduction: It is well known that fuzzy logic provides human reasoning capabilities to capture uncertainties that cannot be described by precise mathematical models. Neural networks offer remarkable advantages, such as adaptive learning, parallelism, fault tolerance, and generalization. They have proved to be powerful techniques in the discipline of system control, especially when the controlled system is difficult to model accurately, or has large uncertainties and strong nonlinearities.

  5. Introduction: Therefore, fuzzy logic and neural networks have been widely adopted in model-free adaptive control of nonlinear systems. Numerous kinds of neural fuzzy systems have been proposed in the literature, and most of them are suitable only for off-line cases. Some on-line learning methods for neural fuzzy systems have been studied as well.

  6. Introduction: In this paper, we propose a novel on-line clustering algorithm for the structure learning of our fuzzy neural networks. This new clustering algorithm employs the reward and penalty mechanism used in Learning Vector Quantization (LVQ). Our fuzzy neural network, with its on-line structure and parameter learning capability, is a suitable candidate for real-time applications due to its fast convergence.

  7. Structure of Fuzzy Neural Networks [five-layer network diagram: input nodes x_1, x_2 at the bottom layer, output nodes y_1, y_2, ..., y_d at the top layer]

  8. Structure of Fuzzy Neural Networks
  Layer 1: Each node in this layer only transmits input values to the next layer directly. Thus, the function of the $i$-th node is defined as $f = u_i^{(1)} = x_i$ and $a^{(1)} = f$.
  Layer 2: Each node in this layer corresponds to one linguistic label of one of the input variables in Layer 1. The operation performed in this layer is $f = -\dfrac{\left(u_i^{(2)} - m_{ij}\right)^2}{\sigma_{ij}^2}$ and $a^{(2)} = e^{f}$, where $m_{ij}$ and $\sigma_{ij}$ are the center and width of the Gaussian membership function.

  9. Structure of Fuzzy Neural Networks
  Layer 3: Nodes in this layer are rule nodes and constitute the antecedents of the fuzzy rule base. The input and output functions of the $i$-th rule node are $f = \prod_{i=1}^{n} u_i^{(3)}$ and $a^{(3)} = f$.
  Layer 4: The nodes in this layer are called "output-term nodes". The links in Layer 4 perform the fuzzy OR operation over the rule nodes that share the same consequent: $f = \sum_{j=1}^{J} u_j^{(4)}$ and $a^{(4)} = \min(1, f)$.

  10. Structure of Fuzzy Neural Networks
  Layer 5: These nodes and the attached Layer 5 links act as the defuzzifier. The following functions perform the Center Of Area (COA) defuzzification method: $f = \sum_{ij} w_{ij}^{(5)} u_i^{(5)} = \sum_{ij} \left(m_{ij}\,\sigma_{ij}\right) u_i^{(5)}$ and $a^{(5)} = \dfrac{f}{\sum_{ij} \sigma_{ij}\, u_i^{(5)}}$.
  Based on the above structure, an on-line learning algorithm will be proposed to determine the proper centers and widths of the term nodes.
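To make the layer functions concrete, the following is a minimal sketch of the forward pass through the five layers, assuming a single-output network in which every rule uses one Gaussian term per input and is linked to exactly one output-term node; the function and variable names (forward, in_m, rule_to_term, etc.) are illustrative assumptions, not taken from the paper.

import numpy as np

# Minimal sketch of the five-layer forward pass (illustrative names).
def forward(x, in_m, in_s, rule_to_term, out_m, out_s):
    # x:            (n,)   input vector, passed through unchanged by Layer 1
    # in_m, in_s:   (R, n) centers / widths of each rule's Layer-2 Gaussians
    # rule_to_term: (R,)   index of the output-term node each rule is linked to
    # out_m, out_s: (T,)   centers / widths of the Layer-4 output-term nodes
    mu = np.exp(-((x - in_m) ** 2) / in_s ** 2)   # Layer 2: membership degrees
    firing = mu.prod(axis=1)                      # Layer 3: product (fuzzy AND)
    u = np.zeros(len(out_m))
    np.add.at(u, rule_to_term, firing)            # Layer 4: sum the rules sharing a consequent
    u = np.minimum(1.0, u)                        #          and bound the fuzzy OR at 1
    # Layer 5: center-of-area defuzzification, a = sum(m*sigma*u) / sum(sigma*u)
    return (out_m * out_s * u).sum() / (out_s * u).sum()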

  11. Learning Algorithm for Structure Identification: In our fuzzy neural networks, for every on-line incoming training pattern, we first use the novel clustering algorithm to identify the structure, and next apply the backpropagation algorithm to optimize the parameters. In our learning method, only the training data is needed. The input-term, output-term, and rule nodes are created dynamically as learning proceeds upon receiving on-line incoming training data. During the learning process, new input-term, output-term, and rule nodes will be added.

  12. Learning Algorithm for Structure Identification: The main idea of our clustering algorithm is that, for every input datum, we first find the winner clusters in the input and output spaces, respectively. Next, as in fuzzy ARTMAP, we check whether the winner cluster in the input space is connected to the winner cluster in the output space. If so, we assume that the winner cluster in the output space is the correct prediction of the winner cluster in the input space, which is analogous to fuzzy ARTMAP, where the fuzzy ARTa category activated by the input correctly predicts the fuzzy ARTb category activated by the target output.

  13. Learning Algorithm for Structure Identification: If not, we assume that a mismatch occurs between the winner cluster in the input space and the winner cluster in the output space, and we begin to search for another cluster in the input space that matches the winner cluster in the output space. The reward and penalty mechanism is employed in our clustering algorithm. We describe the novel clustering algorithm as follows.

  14. Learning Algorithm for Structure Identification
  Step 1: Initialize the fuzzy system with zero clusters: $In = 0$, $On = 0$ (the numbers of clusters in the input and output spaces).
  Step 2: The first input and output training vectors are selected as the centers of the first clusters in the input and output spaces, respectively. We connect the first cluster in the input space to the first cluster in the output space, and set the number of data belonging to each cluster to one.
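As a concrete reading of Steps 1 and 2, the sketch below shows one possible set of cluster containers and how the first training pair seeds them; the container layout and the initial width sigma0 are assumptions that the slide does not specify.

import numpy as np

# Assumed bookkeeping per space: centers (m), variances (s2), data counters (c),
# plus the set of links between input-space and output-space clusters.
in_m, in_s2, in_c = [], [], []     # input space:  In = len(in_m) clusters (Step 1: zero)
out_m, out_s2, out_c = [], [], []  # output space: On = len(out_m) clusters (Step 1: zero)
links = set()                      # pairs (input cluster index, output cluster index)

def init_with_first_pair(x, y, sigma0=1.0):
    # Step 2: the first (x, y) pair becomes the center of the first cluster in each
    # space, the two clusters are connected, and their counters are set to one.
    in_m.append(np.asarray(x, dtype=float))
    in_s2.append(np.full(len(x), sigma0 ** 2))   # assumed initial variance
    in_c.append(1)
    out_m.append(np.asarray(y, dtype=float))
    out_s2.append(np.full(len(y), sigma0 ** 2))
    out_c.append(1)
    links.add((0, 0))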

  15. Learning Algorithm for Structure Identification
  Step 3: For an input data point in the input space, we compute the distances between the input vector and the existing input-space clusters using the Euclidean metric: $d_p = \sqrt{\sum_{i=1}^{q} \left(x_i - Im_{pi}\right)^2}$, where $q$ is the input dimension and $Im_p$ is the center of the $p$-th input-space cluster. The nearest cluster (winner neuron) is chosen by selecting the minimum $d_j$. Next, the following procedure is used:

  16. Learning Algorithm for Structure Identification
  If $d_j$ is larger than a certain value $d_{vigilance}$, we assume that this input datum does not belong to any existing cluster, and we form a new cluster; the newly added cluster is the winner cluster in the input space. If $d_j$ is smaller than $d_{vigilance}$, we assume that cluster $j$ is the winner cluster in the input space. The Step 3 procedure for the input space is also adopted in the output space, so we likewise find the winner cluster in the output space.
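A small sketch of the winner search in Step 3 together with the vigilance test above; the function name, the list-of-centers representation, and the choice of d_vigilance are assumptions. The same routine applies in the output space with y and its own vigilance value.

import numpy as np

def find_winner(v, centers, d_vigilance):
    # v:       incoming vector (x in the input space, y in the output space)
    # centers: existing cluster centers in that space (non-empty after Step 2)
    # Returns (index of the winner cluster, True if a new cluster was created).
    d = [np.linalg.norm(v - c) for c in centers]    # Euclidean distances d_p
    j = int(np.argmin(d))                           # nearest cluster (winner neuron)
    if d[j] > d_vigilance:                          # no existing cluster is close enough:
        centers.append(np.asarray(v, dtype=float))  # form a new cluster; it becomes the winner
        return len(centers) - 1, True
    return j, False                                 # cluster j is the winner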

  17. Learning Algorithm for Structure Identification
  Step 4: We check the mapping from the input clusters to the output clusters.
  (1) If the winner cluster in the input space is a new cluster, we connect this new cluster to the winner cluster in the output space, and update the parameters of the winner cluster in the output space:
  $eph\_Om_{winner} = \dfrac{Om_{winner} \times Oc_{winner} + y}{Oc_{winner} + 1}$
  where $Om_{winner}$, $O\sigma_{winner}$, and $Oc_{winner}$ denote the center, width, and data count of the winner cluster in the output space, $y$ is the current output sample, and $eph\_Om_{winner}$ is the newly computed center.

  18. Learning Algorithm for Structure Identification
  $O\sigma_{winner}^2 = \dfrac{Oc_{winner} \times \left(O\sigma_{winner}^2 + Om_{winner}^2\right) + y^2}{Oc_{winner} + 1} - eph\_Om_{winner}^2$
  $Om_{winner} = eph\_Om_{winner}$
  $Oc_{winner} = Oc_{winner} + 1$
  (2) If the winner cluster of the input space is already connected to the winner cluster of the output space, we adopt the following algorithm to update the center, variance, and counter of the winner cluster in the input space:

  19. Learning Algorithm for Structure Identification
  $eph\_Im_{winner} = \dfrac{Im_{winner} \times Ic_{winner} + x}{Ic_{winner} + 1}$
  $I\sigma_{winner}^2 = \dfrac{Ic_{winner} \times \left(I\sigma_{winner}^2 + Im_{winner}^2\right) + x^2}{Ic_{winner} + 1} - eph\_Im_{winner}^2$
  $Im_{winner} = eph\_Im_{winner}$
  $Ic_{winner} = Ic_{winner} + 1$
  (3) If the winner cluster of the input space is not yet connected to the winner cluster of the output space, we use the following algorithm to punish the winner cluster of the input space:

  20. Learning Algorithm for Structure Identification
  $eph\_Im_{winner} = \dfrac{Im_{winner} \times Ic_{winner} - x}{Ic_{winner} - 1}$
  $I\sigma_{winner}^2 = \dfrac{Ic_{winner} \times \left(I\sigma_{winner}^2 + Im_{winner}^2\right) - x^2}{Ic_{winner} - 1} - eph\_Im_{winner}^2$
  $Im_{winner} = eph\_Im_{winner}$
  $Ic_{winner} = Ic_{winner}$ (the counter is left unchanged)
  After that, we return to Step 3 to search for another cluster in the input space that matches the winner cluster in the output space.
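The updates on slides 17-20 all share the same pointwise mean/variance form. The sketch below collects them in a single illustrative Cluster container, with m, s2, and c standing for the Om/Im centers, the Osigma^2/Isigma^2 variances, and the Oc/Ic counters; it is one reading of the slides, not code from the paper.

import numpy as np

class Cluster:
    # Illustrative cluster container: center m, variance s2, data counter c.
    def __init__(self, v, sigma0=1.0):
        self.m = np.asarray(v, dtype=float)
        self.s2 = np.full(len(self.m), sigma0 ** 2)  # assumed initial variance
        self.c = 1

    def reward(self, v):
        # Cases (1) and (2): pull the prototype toward the sample v and update
        # the variance and the counter pointwise.
        new_m = (self.m * self.c + v) / (self.c + 1)
        self.s2 = (self.c * (self.s2 + self.m ** 2) + v ** 2) / (self.c + 1) - new_m ** 2
        self.m = new_m
        self.c += 1

    def penalize(self, v):
        # Case (3): push the prototype away from the mismatched sample v; the
        # counter is left unchanged. Assumes c > 1, so the divisor is nonzero.
        new_m = (self.m * self.c - v) / (self.c - 1)
        self.s2 = (self.c * (self.s2 + self.m ** 2) - v ** 2) / (self.c - 1) - new_m ** 2
        self.m = new_m

In case (1) the winner output cluster is rewarded with the output sample y, in case (2) the winner input cluster is rewarded with the input sample x, and in case (3) the winner input cluster is penalized with x before the search in Step 3 is repeated.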

  21. Learning Algorithm for Structure Identification: Our structure learning algorithm is actually a supervised clustering method for identifying the structure of the fuzzy neural networks. As we know, supervised clustering algorithms are effective and converge fast. Furthermore, our fuzzy neural network has a remarkable self-learning ability: it can self-generate fuzzy rules and self-adapt its structure and synaptic weights. Note that there are no conflicting rules in the generated fuzzy neural networks. In summary, the proposed structure learning strategy provides a new way to utilize a class of supervised clustering algorithms for the on-line structure learning of fuzzy neural networks.
