In this paper we propose an accelerated version of the cubic regularization of Newton's method [6]. The original version, used for minimizing a convex function with Lipschitz-continuous Hessian, guarantees a global rate of convergence of order O(1/k^2), where k is the iteration counter. Our...
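The basic (non-accelerated) cubic regularization of Newton's method repeatedly minimizes the model m(s) = g·s + ½h·s² + (M/6)|s|³ built at the current iterate. A minimal one-dimensional sketch of that idea follows; the test function, the constant M, and the iteration budget are illustrative assumptions, not taken from the paper:

```python
import math

def cubic_newton_step(g, h, M):
    """Exact minimizer of the 1-D cubic model
    m(s) = g*s + 0.5*h*s**2 + (M/6)*abs(s)**3."""
    if g == 0.0:
        return 0.0
    sign = -1.0 if g > 0.0 else 1.0   # the step opposes the gradient
    # With s = sign*t, t >= 0, the stationarity condition
    # g + h*s + (M/2)*s*|s| = 0 becomes (M/2)*t**2 + h*t - |g| = 0;
    # take its nonnegative root.
    t = (-h + math.sqrt(h * h + 2.0 * M * abs(g))) / M
    return sign * t

def cubic_newton(f_grad, f_hess, x0, M, iters=20):
    """Cubic regularization of Newton's method (basic, non-accelerated form)."""
    x = x0
    for _ in range(iters):
        x += cubic_newton_step(f_grad(x), f_hess(x), M)
    return x
```

For example, minimizing f(x) = x⁴ + x² (whose third derivative is bounded by 24 on [-1, 1], so M = 24 is a valid local Lipschitz constant for the Hessian) from x₀ = 1 drives the iterates to the minimizer 0.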
Persistent link: https://www.econbiz.de/10005065351
Persistent link: https://www.econbiz.de/10000107230
Persistent link: https://www.econbiz.de/10004616209
Persistent link: https://www.econbiz.de/10009603166
We consider iterative trust region algorithms for the unconstrained minimization of an objective function $F(\underline{x})$, $\underline{x} \in \mathcal{R}^{n}$, when F is differentiable but no derivatives are available, and when each model of F is a linear or a quadratic polynomial. The...
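A derivative-free trust region iteration of the linear-model kind described here can be sketched as follows: build a linear interpolation model of F from n+1 function values, step to the model minimizer on the trust-region boundary, and shrink the radius on failure. The sampling scheme, radius update, and test function are simplified illustrative choices, not the paper's algorithm:

```python
import numpy as np

def linear_model(f, x, delta):
    """Linear interpolation model of f around x, built from n+1 samples:
    x itself plus x + delta*e_i for each coordinate direction."""
    n = len(x)
    f0 = f(x)
    g = np.array([(f(x + delta * np.eye(n)[i]) - f0) / delta for i in range(n)])
    return f0, g

def tr_step(g, delta):
    """Minimizer of the linear model over the trust region ||s|| <= delta."""
    norm = np.linalg.norm(g)
    return np.zeros_like(g) if norm == 0.0 else -delta * g / norm

def dfo_tr(f, x, delta=1.0, iters=50):
    """Toy derivative-free trust-region loop: only values of f are used."""
    for _ in range(iters):
        f0, g = linear_model(f, x, delta)
        s = tr_step(g, delta)
        if f(x + s) < f0:      # successful step: accept the point
            x = x + s
        else:                  # unsuccessful: shrink the trust region
            delta *= 0.5
    return x
```

On a smooth test problem such as f(x) = ||x - 1||², the loop decreases f monotonically over accepted steps and homes in on the minimizer as the radius shrinks.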
Persistent link: https://www.econbiz.de/10010998246
In this paper we derive efficiency estimates of the regularized Newton's method as applied to constrained convex minimization problems and to variational inequalities. We study a one-step Newton's method and its multistep accelerated version, which converges on smooth convex problems as O(1/k^3...
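The paper's own one-step scheme is not reproduced here; as a generic illustration of a regularized Newton iteration, the sketch below damps the Newton system as (H + γI)s = -g with γ tied to the gradient norm so the regularization vanishes at a solution. The rule γ = √‖g‖ and the test function are my own illustrative assumptions:

```python
import numpy as np

def reg_newton(grad, hess, x0, iters=30):
    """Regularized Newton iteration: solve (H(x) + gamma*I) s = -g(x),
    with gamma = sqrt(||g(x)||) so damping fades near a stationary point."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad(x)
        gamma = np.sqrt(np.linalg.norm(g))
        s = np.linalg.solve(hess(x) + gamma * np.eye(len(x)), -g)
        x = x + s
    return x
```

On a smooth strongly convex function such as f(x) = Σᵢ (xᵢ⁴ + xᵢ²), the iterates converge rapidly to the minimizer at the origin.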
Persistent link: https://www.econbiz.de/10005043350
In this paper we consider the problem of minimizing a smooth function by using the adaptive cubic regularized (ARC) framework. We focus on the computation of the trial step as a suitable approximate minimizer of the cubic model and discuss the use of matrix-free iterative methods. Our approach...
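One simple matrix-free way to compute an approximate ARC trial step (a minimal sketch, not necessarily the method this paper analyzes) is to run gradient descent on the cubic model m(s) = gᵀs + ½sᵀHs + (σ/3)‖s‖³, touching H only through Hessian-vector products. The step size, iteration budget, and test matrix below are illustrative assumptions:

```python
import numpy as np

def cubic_model_step(g, hvp, sigma, lr=0.05, iters=500):
    """Approximately minimize m(s) = g^T s + 0.5 s^T H s + (sigma/3)*||s||^3
    by gradient descent; H enters only via the Hessian-vector product hvp."""
    s = np.zeros_like(g)
    for _ in range(iters):
        # grad m(s) = g + H s + sigma * ||s|| * s
        grad_m = g + hvp(s) + sigma * np.linalg.norm(s) * s
        s -= lr * grad_m
    return s
```

Because only hvp(s) is needed, H never has to be formed explicitly; for a convex model the residual g + Hs + σ‖s‖s shrinks geometrically with the iteration count.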
Persistent link: https://www.econbiz.de/10011151825
Persistent link: https://www.econbiz.de/10009268518
Persistent link: https://www.econbiz.de/10003999484
Persistent link: https://www.econbiz.de/10004129887