The projected Levenberg-Marquardt method for the solution of a system of equations with convex constraints is known to converge locally quadratically to a possibly nonisolated solution if a certain error bound condition holds. This condition turns out to be quite strong since it implies that the...
Persistent link: https://www.econbiz.de/10010937802
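A minimal sketch of the projected Levenberg-Marquardt idea from the entry above, not the authors' exact method: each iteration solves the LM-regularized Gauss-Newton subproblem for F(x) = 0 and projects the trial point back onto the convex feasible set. The regularization rule lambda = mu * ||F(x)||^2 and the toy problem are illustrative assumptions.

```python
import numpy as np

def projected_lm(F, J, project, x0, mu=1.0, tol=1e-10, max_iter=50):
    """Sketch of a projected Levenberg-Marquardt iteration for F(x) = 0
    subject to x lying in a convex set given by its projection operator."""
    x = x0
    for _ in range(max_iter):
        Fx, Jx = F(x), J(x)
        if np.linalg.norm(Fx) < tol:
            break
        # LM step: (J^T J + lambda I) d = -J^T F with lambda tied to ||F||^2,
        # a common choice in the local-error-bound setting (an assumption here).
        lam = mu * (Fx @ Fx)
        d = np.linalg.solve(Jx.T @ Jx + lam * np.eye(len(x)), -Jx.T @ Fx)
        x = project(x + d)          # pull the trial point back into the set
    return x

# Toy usage: solve x1^2 + x2^2 = 1, x1 = x2 over the nonnegative orthant.
F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])
J = lambda x: np.array([[2*x[0], 2*x[1]], [1.0, -1.0]])
project = lambda x: np.maximum(x, 0.0)   # projection onto x >= 0
print(projected_lm(F, J, project, np.array([2.0, 0.5])))
```

Tying the regularization parameter to the residual norm is one standard way to obtain fast local convergence under an error bound; other rules are possible.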
In this paper, we propose a Shamanskii-like Levenberg-Marquardt method for nonlinear equations. At every iteration, not only an LM step but also m−1 approximate LM steps are computed, where m is a positive integer. Under the local error bound condition, which is weaker than nonsingularity, we...
Persistent link: https://www.econbiz.de/10010998305
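A rough sketch of the Shamanskii-like structure described above, under the assumption that the "approximate LM steps" reuse the Jacobian, and hence the assembled LM matrix, from the start of the outer iteration while refreshing the residual; details beyond that are illustrative guesses, not the paper's algorithm.

```python
import numpy as np

def shamanskii_lm(F, J, x0, m=3, mu=1.0, tol=1e-10, max_outer=50):
    """Sketch of a Shamanskii-like LM iteration: the matrix J^T J + lambda I
    is assembled once and reused for m consecutive steps."""
    x = x0
    for _ in range(max_outer):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        Jx = J(x)
        lam = mu * (Fx @ Fx)
        A = Jx.T @ Jx + lam * np.eye(len(x))   # frozen LM matrix
        for _ in range(m):                     # one LM step + m-1 approximate ones
            Fx = F(x)                          # residual is refreshed each inner step
            x = x + np.linalg.solve(A, -Jx.T @ Fx)
    return x

# Toy usage: solve x1^2 = 1, x1*x2 = 1 from a positive starting point.
F = lambda x: np.array([x[0]**2 - 1.0, x[0]*x[1] - 1.0])
J = lambda x: np.array([[2*x[0], 0.0], [x[1], x[0]]])
print(shamanskii_lm(F, J, np.array([2.0, 2.0])))   # converges near (1, 1)
```

The payoff of this structure is one matrix factorization per m steps rather than per step.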
By combining a gradient strategy, the Moreau-Yosida regularization, a limited-memory BFGS update, and a proximal method, we propose a trust-region method for nonsmooth convex minimization. The search direction is a combination of the gradient direction and the trust-region direction. The global...
Persistent link: https://www.econbiz.de/10010896513
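Of the ingredients listed in this entry, the Moreau-Yosida regularization is the most self-contained: its gradient is available in closed form from the proximal operator, via grad e_gamma(x) = (x - prox_{gamma f}(x)) / gamma. The sketch below takes f = ||.||_1 purely as an assumed stand-in for the general nonsmooth convex objective, and uses plain gradient steps where the paper combines gradient and trust-region directions.

```python
import numpy as np

def prox_l1(x, gamma):
    """Proximal operator of gamma * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - gamma, 0.0)

def moreau_yosida_grad(x, gamma):
    """Gradient of the Moreau-Yosida regularization of ||.||_1:
    (x - prox_{gamma f}(x)) / gamma, which is Lipschitz continuous
    even though f itself is nonsmooth."""
    return (x - prox_l1(x, gamma)) / gamma

def smoothed_gradient_descent(x0, gamma=0.5, step=0.4, iters=100):
    # Minimize the smooth envelope of ||x||_1 by plain gradient steps;
    # a trust-region/L-BFGS method would consume the same gradient oracle.
    x = x0
    for _ in range(iters):
        x = x - step * moreau_yosida_grad(x, gamma)
    return x

print(smoothed_gradient_descent(np.array([3.0, -2.0, 0.7])))  # tends to 0
```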
In practical applications related to, for instance, machine learning, data mining and pattern recognition, one is commonly dealing with noisy data lying near some low-dimensional manifold. A well-established tool for extracting the intrinsically low-dimensional structure from such data is...
Persistent link: https://www.econbiz.de/10010998274
We present a local convergence analysis of the Gauss-Newton method for solving nonlinear least squares problems. Using more precise majorant conditions than in earlier studies such as Chen (Comput Optim Appl 40:97–118, 2008), Chen and Li (Appl Math Comput 170:686–705, 2005), Chen and Li (Appl...
Persistent link: https://www.econbiz.de/10011241270
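For reference, the Gauss-Newton iteration whose local convergence the entry above analyzes: each step solves the linearized least-squares problem J(x) d ≈ -r(x). The toy exponential fit below is an illustrative assumption.

```python
import numpy as np

def gauss_newton(r, J, x0, tol=1e-10, max_iter=50):
    """Plain Gauss-Newton for min ||r(x)||^2: each step solves the
    linearized least-squares problem J(x) d ~= -r(x)."""
    x = x0
    for _ in range(max_iter):
        rx, Jx = r(x), J(x)
        d, *_ = np.linalg.lstsq(Jx, -rx, rcond=None)
        x = x + d
        if np.linalg.norm(d) < tol:
            break
    return x

# Toy data fit: y = exp(a*t) with unknown a, residuals r_i = exp(a*t_i) - y_i.
t = np.linspace(0.0, 1.0, 20)
y = np.exp(0.7 * t)
r = lambda a: np.exp(a[0] * t) - y
J = lambda a: (t * np.exp(a[0] * t)).reshape(-1, 1)
print(gauss_newton(r, J, np.array([0.0])))   # recovers a close to 0.7
```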
In this work we propose a class of quasi-Newton methods to minimize a twice differentiable function with Lipschitz continuous Hessian. These methods are based on the quadratic regularization of Newton's method, with explicit algebraic rules for computing the regularizing parameter. The...
Persistent link: https://www.econbiz.de/10011241274
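A sketch of a quadratically regularized Newton step of the kind described above: solve (H + sigma*I) d = -g with an algebraic rule for sigma. The specific rule sigma = sqrt(L * ||grad f(x)||), with L the Hessian's Lipschitz constant, is one common explicit choice assumed here; the paper's own rules may differ.

```python
import numpy as np

def reg_newton(grad, hess, x0, L=1.0, tol=1e-10, max_iter=100):
    """Sketch of quadratically regularized Newton:
    (H_k + sigma_k I) d = -g_k with an explicit algebraic sigma_k."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        sigma = np.sqrt(L * np.linalg.norm(g))   # assumed explicit rule
        H = hess(x)
        x = x + np.linalg.solve(H + sigma * np.eye(len(x)), -g)
    return x

# Toy usage on f(x) = x1^4 + x2^2, minimized at the origin.
grad = lambda x: np.array([4*x[0]**3, 2*x[1]])
hess = lambda x: np.array([[12*x[0]**2, 0.0], [0.0, 2.0]])
print(reg_newton(grad, hess, np.array([2.0, 3.0])))  # approaches (0, 0)
```

The regularization vanishes as the gradient does, so the iteration reduces to Newton's method near a solution.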
We propose a new nonmonotone filter method to promote global and fast local convergence for sequential quadratic programming algorithms. Our method uses two filters: a standard, global g-filter for global convergence, and a local nonmonotone l-filter that allows us to establish fast local...
Persistent link: https://www.econbiz.de/10010896525
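A sketch of the basic filter mechanics the entry above builds on: the filter stores (constraint violation, objective) pairs, and a trial point is accepted only if it is not dominated by any stored pair, up to small margins. The margin test and dominance rule below are one standard variant, assumed for illustration; the paper's contribution is the interplay of a global g-filter with a nonmonotone local l-filter, which this sketch does not reproduce.

```python
def acceptable(filter_entries, h_trial, f_trial, gamma=1e-5):
    """A trial point is acceptable to the filter if, against every stored
    pair (h, f), it improves feasibility or optimality by a small margin."""
    return all(h_trial <= (1 - gamma) * h or f_trial <= f - gamma * h
               for h, f in filter_entries)

def add_to_filter(filter_entries, h_new, f_new):
    """Insert (h_new, f_new) and drop the entries it dominates."""
    kept = [(h, f) for h, f in filter_entries
            if h < h_new or f < f_new]       # keep non-dominated entries
    kept.append((h_new, f_new))
    return kept

# Usage: each pair is (constraint violation, objective value).
flt = [(0.5, 10.0), (0.1, 12.0)]
print(acceptable(flt, 0.05, 11.0))   # True: more feasible than all entries
flt = add_to_filter(flt, 0.05, 11.0)
```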
Recently, an affine-scaling interior-point algorithm, ASL, was developed for box-constrained optimization problems with a single linear constraint (Gonzalez-Lima et al., SIAM J. Optim. 21:361–390, 2011). This note extends the algorithm to handle more general polyhedral constraints. With a line...
Persistent link: https://www.econbiz.de/10010998324
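A sketch of the generic affine-scaling idea for box constraints only: scale the negative gradient componentwise by the distance to the bound it moves toward, then damp the step so the iterate stays strictly interior. This is not the ASL algorithm, which additionally maintains the linear constraint; the function name, the scaling, and the step-length rule are illustrative assumptions.

```python
import numpy as np

def affine_scaling_step(x, g, lower, upper, theta=0.9):
    """One affine-scaling step for min f(x) s.t. lower <= x <= upper."""
    dist = np.where(g > 0.0, x - lower, upper - x)  # distance to the bound moved toward
    step = -dist * g                                # scaled steepest-descent direction
    # Largest multiple of the step that keeps x strictly inside the box.
    ratios = np.full_like(x, np.inf)
    neg, pos = step < 0.0, step > 0.0
    ratios[neg] = (lower[neg] - x[neg]) / step[neg]
    ratios[pos] = (upper[pos] - x[pos]) / step[pos]
    alpha = min(1.0, theta * ratios.min())
    return x + alpha * step

# Toy usage: minimize ||x - c||^2 over the unit box with c outside it.
lower, upper = np.zeros(2), np.ones(2)
c = np.array([2.0, -1.0])
x = np.array([0.5, 0.5])
for _ in range(100):
    x = affine_scaling_step(x, 2 * (x - c), lower, upper)
print(x)   # approaches the box point closest to c, i.e. (1, 0)
```

The scaling shrinks steps in components close to their bounds, which is what keeps the iterates interior without an explicit barrier.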
Persistent link: https://www.econbiz.de/10008925520
[Figure not available: see fulltext.] Copyright Springer Science+Business Media New York 2015
Persistent link: https://www.econbiz.de/10011241263