Gradient-Based Optimizer Solutions: LM, RLM, CG, BFGS, RG, and GRG


Features of a particular application (constraints, inflections, steep valleys) pose difficulties for SQ and NR, which are second-order model-based optimizers. This chapter covers several solutions that typify the many similar adaptations of the basic incremental steepest descent (ISD) and Newton-type approaches. The topics of this chapter are Levenberg–Marquardt (LM), conjugate gradient (CG), Broyden–Fletcher–Goldfarb–Shanno (BFGS), and generalized reduced gradient (GRG). LM blends ISD and NR to ensure that each iteration moves downhill, rather than converging to a saddle point or maximum, or jumping to an absurd location. CG and BFGS are for unconstrained applications, with CG being a variant of Cauchy's sequential line search method and BFGS a quasi-Newton approach. GRG is for handling constraints.
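The LM blending of ISD and NR described above can be sketched as a damped Newton iteration: a damping factor λ is added to the Hessian diagonal, so a large λ pushes the step toward steepest descent while a small λ recovers the Newton step, and λ is adjusted based on whether the trial step actually reduces the objective. The following is a minimal illustrative sketch, not the chapter's implementation; the test function (Rosenbrock, a classic steep-valley case) and all parameter values (initial λ, the 0.2/10 adjustment factors) are assumptions for demonstration.

```python
import numpy as np

def rosenbrock(p):
    """Steep-valley test function with minimum at (1, 1)."""
    x, y = p
    return (1 - x)**2 + 100 * (y - x**2)**2

def grad(p):
    x, y = p
    return np.array([-2*(1 - x) - 400*x*(y - x**2),
                     200*(y - x**2)])

def hess(p):
    x, y = p
    return np.array([[2 - 400*(y - x**2) + 800*x**2, -400*x],
                     [-400*x, 200.0]])

def lm_minimize(f, g, H, x0, lam=1e-3, tol=1e-8, max_iter=500):
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        gx = g(x)
        if np.linalg.norm(gx) < tol:
            break
        # Damped Newton step: large lam -> short steepest-descent-like step,
        # small lam -> full Newton step.
        step = np.linalg.solve(H(x) + lam * np.eye(len(x)), -gx)
        x_trial = x + step
        f_trial = f(x_trial)
        if f_trial < fx:
            # Downhill move: accept it and trust the quadratic model more.
            x, fx = x_trial, f_trial
            lam = max(lam * 0.2, 1e-12)
        else:
            # Uphill or absurd move: reject it, lean toward steepest descent.
            lam *= 10.0
    return x, fx

x_opt, f_opt = lm_minimize(rosenbrock, grad, hess, [-1.2, 1.0])
```

Because λ only grows when a trial step fails to decrease f, the iterate sequence is monotonically downhill, which is exactly the safeguard the LM blend provides over a pure Newton step.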

10.2 Levenberg–Marquardt (LM)
10.3 Scaled Variables
10.4 Conjugate Gradient (CG)
10.5 Broyden–Fletcher–Goldfarb–Shanno (BFGS)
10.6 Generalized Reduced Gradient (GRG)
