Adaptation in Weight Space through Gradient Descent for a Hopfield Net as Static Optimizer: Is It Feasible?


This paper reports the results of an empirical simulation study of weight adaptation through gradient descent for a Hopfield neural network configured as a static optimizer and tested on the traveling salesman problem. Adaptation through gradient descent, within the context of recurrent and non-recurrent back-propagation training, was attempted in the weight space, which is high-dimensional: on the order of 1,000,000,000,000 weights for the two-dimensional node array of a Hopfield network configured for a 1000-city problem instance. Despite substantial empirical work, practically no noteworthy progress could be recorded toward realizing adaptation in the weight space: the adaptation algorithm failed to locate a set of weight values that established the solutions of the traveling salesman problem instances as local minima in the Lyapunov space. Accordingly, the findings of this paper suggest that alternative adaptation schemes with a small number of freely adjustable parameters should be considered.
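The quoted weight count follows directly from the network's geometry: an N-city traveling salesman instance is mapped onto an N × N node array (one row per city, one column per tour position), and a fully connected Hopfield network over those N² neurons carries one weight per ordered neuron pair. A minimal sketch of this arithmetic (illustrative only; the function name is not from the paper):

```python
def hopfield_weight_count(n_cities: int) -> int:
    """Number of weights in a fully connected Hopfield network whose
    neurons form an n_cities x n_cities array (the standard TSP mapping)."""
    n_neurons = n_cities * n_cities   # one neuron per (city, tour-position) pair
    return n_neurons * n_neurons      # fully connected: one weight per neuron pair

# A 1000-city instance yields 10**6 neurons and 10**12 weights,
# matching the order of magnitude cited in the abstract.
print(hopfield_weight_count(1000))  # prints 1000000000000
```

This dimensionality is what makes direct gradient-descent search in the weight space so demanding, which is the feasibility question the paper examines.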

  • Abstract
  • Introduction
  • Hopfield Network and Adaptation
  • Simulation Study
  • Conclusions
  • References
