Numerical Experience with a Class of Self-Scaling Quasi-Newton Algorithms

Abstract

Self-scaling quasi-Newton methods for unconstrained optimization update the Hessian approximation by a formula that depends on two parameters (say, τ and θ), where τ = 1 yields the unscaled Broyden family, and θ = 0 and θ = 1 yield the BFGS and DFP updates, respectively. In previous work, conditions were obtained on these parameters that imply global and superlinear convergence of self-scaling methods on convex objective functions. This paper discusses the practical performance of several new algorithms designed to satisfy these conditions.
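To illustrate the two-parameter family described above, the following is a minimal NumPy sketch of one self-scaling Broyden-family update of the Hessian approximation B. The function name and the exact placement of the scaling factor τ are assumptions for illustration (conventions vary across the literature); the parameter settings τ = 1, θ = 0 and τ = 1, θ = 1 recover the unscaled BFGS and DFP updates, consistent with the abstract.

```python
import numpy as np

def self_scaling_broyden_update(B, s, y, tau=1.0, theta=0.0):
    """Sketch of one self-scaling Broyden-family update of the
    Hessian approximation B (name and tau placement are assumptions).

    s = x_new - x (step), y = g_new - g (gradient difference).
    tau = 1, theta = 0 -> unscaled BFGS; tau = 1, theta = 1 -> unscaled DFP.
    """
    Bs = B @ s
    sBs = s @ Bs          # curvature of B along the step
    sy = s @ y            # must be positive for a convex problem
    # Broyden-family correction with parameter theta
    w = np.sqrt(sBs) * (y / sy - Bs / sBs)
    B_family = B - np.outer(Bs, Bs) / sBs + theta * np.outer(w, w)
    # Scale the inherited part by tau, then add the rank-one term
    # that enforces the secant condition B_new @ s = y
    return tau * B_family + np.outer(y, y) / sy

# Example on the convex quadratic f(x) = 0.5 x^T A x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
B = np.eye(2)
x, x_new = np.array([1.0, 1.0]), np.array([0.5, 0.2])
s = x_new - x
y = A @ x_new - A @ x  # gradient difference for this quadratic
B = self_scaling_broyden_update(B, s, y, tau=1.0, theta=0.0)
# The secant condition B @ s = y holds for any tau and theta
```

Note that the secant condition is preserved for every choice of τ and θ, since the scaled term satisfies B_family @ s = 0 and the added rank-one term maps s to y.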
