Combining a gradient strategy with the Moreau-Yosida regularization, a limited-memory BFGS update, and a proximal method, we propose a trust-region method for nonsmooth convex minimization. The search direction is a combination of the gradient direction and the trust-region direction. The global...
Persistent link: https://www.econbiz.de/10010896513
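As a rough illustration of the regularization this record refers to (the specific algorithm is not reproduced here), the Moreau-Yosida regularization of a nonsmooth convex function $f$ with parameter $\lambda > 0$ is the smooth function
\[
F_\lambda(x) = \min_{z \in \mathbb{R}^n} \Bigl\{ f(z) + \tfrac{1}{2\lambda}\,\|z - x\|^2 \Bigr\},
\]
whose gradient $\nabla F_\lambda(x) = \tfrac{1}{\lambda}\bigl(x - p_\lambda(x)\bigr)$ is given in terms of the proximal point $p_\lambda(x)$, the unique minimizer above; gradient and limited-memory BFGS steps can then be applied to $F_\lambda$ in place of $f$.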
Recently an affine-scaling interior-point algorithm (ASL) was developed for box-constrained optimization problems with a single linear constraint (Gonzalez-Lima et al., SIAM J. Optim. 21:361–390, 2011). This note extends the algorithm to handle more general polyhedral constraints. With a line...
Persistent link: https://www.econbiz.de/10010998324
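For orientation only (problem form inferred from the abstract, not quoted from the paper), the original ASL setting is
\[
\min_x \; f(x) \quad \text{s.t.} \quad a^{\mathsf T} x = b, \quad l \le x \le u,
\]
and the extension described here replaces the single linear constraint $a^{\mathsf T} x = b$ by general polyhedral constraints such as $Ax = b$, $l \le x \le u$.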
We propose a new nonmonotone filter method to promote global and fast local convergence for sequential quadratic programming algorithms. Our method uses two filters: a standard, global g-filter for global convergence, and a local nonmonotone l-filter that allows us to establish fast local...
Persistent link: https://www.econbiz.de/10010896525
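As a generic sketch of the filter mechanism mentioned above (the paper's precise g-filter and l-filter tests are not given in this snippet), a trial point with objective value $f$ and constraint violation $h$ is typically acceptable to a filter $\mathcal{F} = \{(f_j, h_j)\}$ if, for every entry $j$,
\[
f \le f_j - \gamma\, h_j \quad \text{or} \quad h \le (1-\gamma)\, h_j,
\]
for a small constant $\gamma \in (0,1)$; a nonmonotone variant relaxes this test so that occasional increases in $f$ or $h$ are tolerated, which is what permits fast local convergence.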
In this work we propose a class of quasi-Newton methods to minimize a twice-differentiable function with a Lipschitz continuous Hessian. These methods are based on a quadratic regularization of Newton’s method, with explicit algebraic rules for computing the regularizing parameter. The...
Persistent link: https://www.econbiz.de/10011241274
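A minimal sketch of the underlying iteration (the paper's explicit formulas for the regularizing parameter are not reproduced here): given $g_k = \nabla f(x_k)$ and a Hessian or quasi-Newton matrix $B_k$, the step solves
\[
(B_k + \sigma_k I)\, d_k = -g_k, \qquad x_{k+1} = x_k + d_k,
\]
where $\sigma_k \ge 0$ is the regularizing parameter, computed by an explicit algebraic rule rather than by an inner iterative search.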
Persistent link: https://www.econbiz.de/10012063818
Many derivative-free methods for constrained problems are not efficient for minimizing functions on “thin” domains. Other algorithms, like those based on Augmented Lagrangians, deal with thin constraints using penalty-like strategies. When the constraints are computationally inexpensive but...
Persistent link: https://www.econbiz.de/10010994061
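For context (standard definition, not taken from the record), an augmented Lagrangian for equality constraints $h(x) = 0$ penalizes violation as
\[
L_\rho(x, \lambda) = f(x) + \lambda^{\mathsf T} h(x) + \tfrac{\rho}{2}\,\|h(x)\|^2,
\]
so that iterates on a “thin” feasible set are handled through the penalty term $\tfrac{\rho}{2}\|h(x)\|^2$ rather than by forcing every trial point to remain feasible.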