Second-Order Model-Based Optimizers: SQ and NR

Excerpt

This chapter presents two search algorithms that use second-order models of the surface: successive quadratic (SQ) and Newton–Raphson (NR). They presume more about the surface than the gradient-based optimizers do. Consequently, they are faster when the surface is compatible with the concepts behind the algorithms, which include a continuum, deterministic surface, no flat spots, and an initial trial solution in the vicinity of the optimum. These approaches are often accepted as the premier optimization methods, and they are components in next-level gradient-based optimizers. So they need to be presented, even though I find that they are wholly inappropriate for many applications, whose features are inconsistent with the concepts on which these algorithms are predicated.

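For concreteness, a minimal one-dimensional sketch of the two update ideas is given below (in Python). This is not the book's code: the function names, the finite-difference step size h, the test function, and the stopping tolerances are illustrative assumptions only. NR steps to the minimum of a local quadratic built from derivative estimates, while SQ fits a quadratic through three trial points and jumps to its vertex.

def newton_raphson_min(f, x0, h=1e-4, tol=1e-8, max_iter=50):
    """Newton-Raphson search for a minimum of f(x), using central finite
    differences (step size h) to estimate the first and second derivatives."""
    x = x0
    for _ in range(max_iter):
        f_plus, f_mid, f_minus = f(x + h), f(x), f(x - h)
        d1 = (f_plus - f_minus) / (2.0 * h)            # estimate of f'(x)
        d2 = (f_plus - 2.0 * f_mid + f_minus) / h**2   # estimate of f''(x)
        if d2 == 0.0:                                  # flat spot: model fails
            break
        step = d1 / d2
        x -= step                                      # x_new = x - f'/f''
        if abs(step) < tol:
            break
    return x

def successive_quadratic_min(f, x_a, x_b, x_c, tol=1e-8, max_iter=50):
    """Successive quadratic: fit a quadratic through three trial points,
    move to its vertex, replace the worst point, and repeat."""
    pts = [x_a, x_b, x_c]
    for _ in range(max_iter):
        x1, x2, x3 = sorted(pts)
        f1, f2, f3 = f(x1), f(x2), f(x3)
        num = (x2 - x1) ** 2 * (f2 - f3) - (x2 - x3) ** 2 * (f2 - f1)
        den = (x2 - x1) * (f2 - f3) - (x2 - x3) * (f2 - f1)
        if den == 0.0:                                 # degenerate fit: stop
            break
        x_new = x2 - 0.5 * num / den                   # vertex of fitted quadratic
        if abs(x_new - x2) < tol:
            return x_new
        worst = max(pts, key=f)                        # drop the highest-f point
        pts[pts.index(worst)] = x_new
    return x2

if __name__ == "__main__":
    f = lambda x: (x - 2.0) ** 2 + 1.0                 # simple test surface
    print(newton_raphson_min(f, x0=5.0))               # approaches 2.0
    print(successive_quadratic_min(f, 0.0, 1.0, 4.0))  # approaches 2.0

On a surface that is nearly quadratic near the optimum, both updates converge in very few iterations, which is the speed advantage the excerpt refers to; on surfaces with flat spots, noise, or discontinuities, the derivative and vertex estimates break down.
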
9.1 Introduction
9.2 Successive Quadratic
9.3 Newton–Raphson
9.4 Perspective on CSLS, ISD, SQ, and NR
9.5 Choosing Step Size for Numerical Estimate of Derivatives
9.6 Takeaway
9.7 Exercises
