CHAPTER III: Neural Networks as Associative Memory

Ugur HALICI - METU EEE - ANKARA - 11/18/2004
EE543 - ANN - CHAPTER 3


  1. Introduction

One of the primary functions of the brain is associative memory. We associate faces with names and letters with sounds, and we can recognize people even if they are wearing sunglasses or have grown older. In this chapter, the basic definitions concerning associative memory are given first, and then it is explained how neural networks can be made linear associators so as to perform as interpolative memory. Next, it is explained how the Hopfield network can be used as autoassociative memory, and then the Bipolar Associative Memory network, which is designed to operate as heteroassociative memory, is introduced.

  2. 3.1. Associative Memory

In an associative memory we store a set of patterns µ^k, k = 1..K, so that the network responds by producing whichever of the stored patterns most closely resembles the one presented to the network.

Suppose that the stored patterns, which are called exemplars or memory elements, are in the form of pairs of associations, µ^k = (u^k, y^k), where u^k ∈ R^N, y^k ∈ R^M, k = 1..K. According to the mapping ϕ: R^N → R^M that they implement, we distinguish the following types of associative memories:
• Interpolative associative memory
• Accretive associative memory

3.1. Associative Memory: Interpolative Memory

In interpolative associative memory, when u = u^r is presented to the memory it responds by producing y^r of the stored association. However, if u differs from u^r by an amount ε, that is, if u = u^r + ε is presented to the memory, then the response differs from y^r by some amount ε^r. Therefore in interpolative associative memory we have

    ϕ(u^r + ε) = y^r + ε^r  such that  ε = 0 ⇒ ε^r = 0,  r = 1..K    (3.1.1)

  3. 3.1. Associative Memory: Accretive Memory

In accretive associative memory, when u is presented to the memory it responds by producing y^r of the stored association such that u^r is the one closest to u among the u^k, k = 1..K, that is,

    ϕ(u) = y^r  such that  ‖u^r − u‖ = min_k ‖u^k − u‖,  k = 1..K    (3.1.2)

3.1. Associative Memory: Heteroassociative - Autoassociative

The accretive associative memory in the form given above, that is, with u^k and y^k different, is called heteroassociative memory. However, if the stored exemplars are in a special form such that the desired patterns and the input patterns are the same, that is, y^k = u^k for k = 1..K, then it is called autoassociative memory. In such a memory, whenever u is presented it responds by u^r, which is the closest one to u among the u^k, k = 1..K, that is,

    ϕ(u) = u^r  such that  ‖u^r − u‖ = min_k ‖u^k − u‖,  k = 1..K    (3.1.3)
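The accretive mapping of Eqs. (3.1.2) and (3.1.3) can be sketched directly as a nearest-neighbour lookup, before any neural implementation is introduced. The patterns below are hypothetical toy values chosen only to illustrate the definitions:

```python
import numpy as np

def accretive_recall(u, U, Y):
    """Accretive recall (eq. 3.1.2): return y^r where u^r is the
    stored input pattern closest to u in Euclidean norm."""
    r = np.argmin(np.linalg.norm(U - u, axis=1))
    return Y[r]

# Stored exemplar pairs (u^k, y^k) -- hypothetical toy data
U = np.array([[1.0, 0.0], [0.0, 1.0]])   # input patterns u^k
Y = np.array([[1.0, 1.0], [-1.0, 1.0]])  # associated outputs y^k

# A noisy probe near u^1 recalls y^1 (heteroassociative case)
probe = np.array([0.9, 0.1])
assert np.allclose(accretive_recall(probe, U, Y), Y[0])

# Autoassociative case (eq. 3.1.3): simply set Y = U
assert np.allclose(accretive_recall(probe, U, U), U[0])
```

This is the behaviour a recurrent network implementing accretive memory is expected to reproduce through its dynamics, as discussed in the following sections.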

  4. 3.1. Associative Memory

While interpolative memories can be implemented by using feed-forward neural networks, it is more appropriate to use recurrent networks as accretive memories. The advantage of using recurrent networks as associative memory is their convergence to one of a finite number of stable states when started at some initial state. The basic goals are:
• to be able to store as many exemplars as we need, each corresponding to a different stable state of the network,
• to have no other stable states,
• to have the stable state that the network converges to be the one closest to the applied pattern.

The problems that we face are:
• the capacity of the network is restricted, depending on the number and properties of the patterns to be stored,
• some of the exemplars may not be among the stable states,
• some spurious stable states different from the exemplars may arise by themselves,
• the converged stable state may be other than the one closest to the applied pattern.

  5. 3.1. Associative Memory

One way of using recurrent neural networks as associative memory is to fix the external input of the network and present the input pattern u^r to the system by setting x(0) = u^r. If we relax such a network, then it will converge to the attractor x* for which x(0) is within the basin of attraction, as explained in Chapter 2. If we are able to place each µ^k as an attractor of the network by a proper choice of the connection weights, then we expect the network to relax to the attractor x* = µ^r that is related to the initial state x(0) = u^r. For good performance, we need the network to converge only to one of the stored patterns µ^k, k = 1..K.

Unfortunately, some initial states may converge to spurious states, which are undesired attractors of the network representing none of the stored patterns. Spurious states may arise by themselves depending on the model used and the patterns stored. The capacity of neural associative memories is restricted by the size of the network. If we increase the number of stored patterns in a fixed-size neural network, spurious states arise inevitably. Sometimes the network may converge not to a spurious state, but to a memory pattern that is not so close to the pattern presented.
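As a preview of this first scheme, the relaxation from x(0) = u^r can be sketched with an outer-product (Hebbian) weight matrix and bipolar states. The synchronous sign updates and the toy patterns below are illustrative assumptions, not the chapter's exact model (the Hopfield network is treated in detail later):

```python
import numpy as np

def hebbian_weights(patterns):
    """Outer-product (Hebbian) weights for bipolar patterns; zero diagonal."""
    P = np.array(patterns, dtype=float)
    W = P.T @ P
    np.fill_diagonal(W, 0.0)
    return W

def relax(W, x0, max_steps=50):
    """Iterate x <- sign(W x) from x(0) = u^r until a fixed point (attractor)."""
    x = np.array(x0, dtype=float)
    for _ in range(max_steps):
        x_new = np.where(W @ x >= 0, 1.0, -1.0)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

# Two hypothetical bipolar exemplars placed as attractors
mu1 = [1, 1, 1, -1, -1, -1]
mu2 = [1, -1, 1, -1, 1, -1]
W = hebbian_weights([mu1, mu2])

# A corrupted version of mu1 (one flipped component) relaxes back to mu1
noisy = [1, 1, 1, -1, -1, 1]
assert np.array_equal(relax(W, noisy), np.array(mu1, dtype=float))
```

With only two patterns in six neurons the memory is far below capacity, so both exemplars are stable and the corrupted probe falls inside the basin of attraction of µ^1; packing in more patterns would produce exactly the spurious-state problems described above.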

  6. 3.1. Associative Memory

What we expect for a feasible operation is that, at least for the memory patterns themselves, if any of the stored patterns is presented to the network by setting x(0) = µ^k, then the network should stay converged to x* = µ^k (Figure 3.1).

Figure 3.1. In associative memory each memory element is assigned to an attractor.

A second way to use recurrent networks as associative memory is to present the input pattern u^r to the system as an external input. This can be done by setting θ = u^r, where θ is the threshold vector whose i-th component corresponds to the threshold of neuron i. After setting x(0) to some fixed value, we relax the network and then wait until it converges to an attractor x*. For good performance, we desire the network to have a single attractor x* = µ^k for each stored input pattern u^k, so that the network converges to this attractor independent of its initial state. Another solution is to use predetermined initial values chosen so that they lie within the basin of attraction of µ^k whenever u^k is applied. We will consider this kind of network in more detail in Chapter 7, where we will examine how these recurrent networks are trained.
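The second scheme can be sketched in the same spirit: the probe enters as the threshold vector θ, and the state is relaxed from a fixed x(0). The zero initial state, the synchronous update rule, and the toy patterns below are all illustrative assumptions:

```python
import numpy as np

def relax_with_bias(W, theta, x0, max_steps=50):
    """Relax x <- sign(W x + theta) from a fixed x(0); theta = u^r carries
    the input pattern as an external input (threshold vector)."""
    x = np.array(x0, dtype=float)
    for _ in range(max_steps):
        x_new = np.where(W @ x + theta >= 0, 1.0, -1.0)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

# Hypothetical bipolar exemplars stored via outer-product weights
mu1 = np.array([1, 1, -1, -1], dtype=float)
mu2 = np.array([1, -1, 1, -1], dtype=float)
W = np.outer(mu1, mu1) + np.outer(mu2, mu2)
np.fill_diagonal(W, 0.0)

# Present u^r = mu1 through theta; x(0) is the same fixed (zero) state
# regardless of which pattern is applied
assert np.array_equal(relax_with_bias(W, theta=mu1, x0=np.zeros(4)), mu1)
assert np.array_equal(relax_with_bias(W, theta=mu2, x0=np.zeros(4)), mu2)
```

Here the same fixed initial state reaches a different attractor for each applied pattern, which is exactly the behaviour the paragraph above asks for.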

  7. 3.2. Linear Associators: Orthonormal Patterns

It is quite easy to implement interpolative associative memory when the set of input memory elements {u^k} constitutes an orthonormal set of vectors, that is,

    (u^i)ᵀ u^j = 1 if i = j, 0 if i ≠ j    (3.2.1)

where T denotes transpose. Using the Kronecker delta, we write simply

    (u^i)ᵀ u^j = δ_ij    (3.2.2)

The mapping function ϕ(u) defined below may be used to establish an interpolative associative memory:

    ϕ(u) = Wᵀ u    (3.2.3)

where

    W = Σ_k u^k × y^k    (3.2.4)

Here the symbol × is used to denote the outer product of the vectors u^k ∈ R^N and y^k ∈ R^M, which is defined as

    u^k × y^k = u^k (y^k)ᵀ = (y^k (u^k)ᵀ)ᵀ    (3.2.5)

resulting in a matrix of size N by M.
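A minimal sketch of the linear associator of Eqs. (3.2.3)-(3.2.5), using the standard basis of R^3 as a conveniently orthonormal input set (a hypothetical choice; any orthonormal set works):

```python
import numpy as np

# Orthonormal input patterns u^k: rows of the identity satisfy eq. (3.2.1)
U = np.eye(3)                        # rows are u^1, u^2, u^3 in R^3 (N = 3)
Y = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])           # rows are y^1, y^2, y^3 in R^2 (M = 2)

# W = sum_k u^k (y^k)^T  (eq. 3.2.4), an N-by-M matrix
W = sum(np.outer(U[k], Y[k]) for k in range(3))
assert W.shape == (3, 2)

# Recall phi(u) = W^T u (eq. 3.2.3) recovers y^r exactly when u = u^r,
# because the cross terms (u^k)^T u^r vanish by orthonormality
for k in range(3):
    assert np.allclose(W.T @ U[k], Y[k])

# Interpolative behaviour (eq. 3.1.1): a perturbed input u^1 + eps yields
# y^1 plus a proportionally small deviation W^T eps
eps = np.array([0.0, 0.1, 0.0])
assert np.allclose(W.T @ (U[0] + eps), Y[0] + W.T @ eps)
```

Exact recall depends entirely on the orthonormality assumption; with merely linearly independent inputs the cross terms in Wᵀu^r no longer cancel, which motivates the more general constructions later in the chapter.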
