TY - JOUR

T1 - An accelerated structured quasi-Newton method with a diagonal second-order Hessian approximation for nonlinear least squares problems

AU - Huynh, Duc Quoc

AU - Hwang, Feng Nan

N1 - Publisher Copyright:
© 2023 Elsevier B.V.

PY - 2024/5/1

Y1 - 2024/5/1

N2 - Our study focuses on exploring new variants of the structured quasi-Newton (SQN) method with a secant-like diagonal approximation (SLDA) of the second-order term of the Hessian for solving nonlinear least squares (NLS) problems. In addition, an accelerated version of SQN-SLDA, referred to as ASQN-SLDA, is also considered. In ASQN-SLDA, we rescale the search direction after the first backtracking linesearch procedure to produce a more aggressive step that increases the reduction in the objective function value. The proposed methods are conceptually simple and easy to implement. We prove that the proposed methods are globally convergent under appropriate assumptions and report several numerical experiments based on a suite of benchmark problems for NLS. The numerical results show that ASQN-SLDA is more robust than several baseline methods, including SQN-SLDA, the generalized Gauss–Newton (GN) method, the Levenberg–Marquardt method, and the hybrid GN with Broyden–Fletcher–Goldfarb–Shanno (BFGS) method, also called H-GN-BFGS. In terms of computing time, ASQN-SLDA outperforms H-GN-BFGS for most test problems, and the speedup becomes more significant as the problem size increases. Furthermore, due to the trade-off between the number of iterations required for convergence and the overhead incurred by ASQN-SLDA, the benefit of the acceleration step is most evident for the largest problems.

AB - Our study focuses on exploring new variants of the structured quasi-Newton (SQN) method with a secant-like diagonal approximation (SLDA) of the second-order term of the Hessian for solving nonlinear least squares (NLS) problems. In addition, an accelerated version of SQN-SLDA, referred to as ASQN-SLDA, is also considered. In ASQN-SLDA, we rescale the search direction after the first backtracking linesearch procedure to produce a more aggressive step that increases the reduction in the objective function value. The proposed methods are conceptually simple and easy to implement. We prove that the proposed methods are globally convergent under appropriate assumptions and report several numerical experiments based on a suite of benchmark problems for NLS. The numerical results show that ASQN-SLDA is more robust than several baseline methods, including SQN-SLDA, the generalized Gauss–Newton (GN) method, the Levenberg–Marquardt method, and the hybrid GN with Broyden–Fletcher–Goldfarb–Shanno (BFGS) method, also called H-GN-BFGS. In terms of computing time, ASQN-SLDA outperforms H-GN-BFGS for most test problems, and the speedup becomes more significant as the problem size increases. Furthermore, due to the trade-off between the number of iterations required for convergence and the overhead incurred by ASQN-SLDA, the benefit of the acceleration step is most evident for the largest problems.

KW - Hessian matrix

KW - Nonlinear least-squares problems

KW - Quasi-Newton

KW - Secant-like diagonal approximation

UR - http://www.scopus.com/inward/record.url?scp=85179464791&partnerID=8YFLogxK

U2 - 10.1016/j.cam.2023.115718

DO - 10.1016/j.cam.2023.115718

M3 - Journal article

AN - SCOPUS:85179464791

SN - 0377-0427

VL - 442

JO - Journal of Computational and Applied Mathematics

JF - Journal of Computational and Applied Mathematics

M1 - 115718

ER -