Ugur HALICI - METU EEE - ANKARA 11/18/2004 EE543 - ANN - CHAPTER 5 1
CHAPTER V: Annealing by Stochastic Neural Networks for Optimization

Introduction
Two major classes of optimization techniques are deterministic gradient methods and stochastic annealing methods. Gradient descent algorithms are greedy: they are subject to the fundamental limitation of being easily trapped in local minima of the cost function. Hopfield networks, for example, usually converge to a local minimum of the energy function. Stochastic annealing algorithms overcome this problem because they provide an opportunity to escape from local minima. The Boltzmann machine has the capability of escaping local minima through a relaxation technique based on simulated annealing.
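The escape mechanism can be illustrated with a minimal simulated annealing sketch (this is a generic illustration, not the Boltzmann machine itself; the cost function, cooling schedule, and parameter values are chosen here only for demonstration). Downhill moves are always accepted, while uphill moves are accepted with probability exp(-ΔE/T), which lets the search jump out of a local minimum while the temperature T is still high.

```python
import math
import random

def cost(x):
    # Illustrative double-well cost: a local minimum near x ~ 1.1
    # and a deeper, global minimum near x ~ -1.3.
    return x**4 - 3 * x**2 + x

def simulated_annealing(x0, t_init=5.0, t_min=1e-3, cooling=0.95,
                        steps_per_t=50, seed=0):
    rng = random.Random(seed)
    x, t = x0, t_init
    while t > t_min:
        for _ in range(steps_per_t):
            x_new = x + rng.gauss(0.0, 0.5)        # random perturbation
            delta = cost(x_new) - cost(x)
            # Accept downhill moves always; accept uphill moves with
            # probability exp(-delta/t), allowing escape from local minima.
            if delta < 0 or rng.random() < math.exp(-delta / t):
                x = x_new
        t *= cooling                               # geometric cooling schedule
    return x

# Start in the basin of the shallower (local) minimum; with a high initial
# temperature the search can still cross the barrier toward the global one.
x_star = simulated_annealing(x0=1.4)
print(x_star, cost(x_star))
```

A pure gradient descent started at the same point would slide into the nearby local minimum and stay there; the stochastic acceptance rule is what gives annealing its chance to do better.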