Persistent link: https://www.econbiz.de/10008925536
Persistent link: https://www.econbiz.de/10009149880
In this paper, we present a recurrent neural network for solving mixed linear complementarity problems (MLCPs) with positive semi-definite matrices. The proposed neural network is derived from an NCP function and has lower complexity than other existing models. In theoretical and...
Persistent link: https://www.econbiz.de/10009194098
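
As a rough illustration of the NCP-function approach named in the abstract above: the snippet does not say which NCP function the authors use, so the sketch below assumes the common Fischer-Burmeister function and applies it to a plain LCP rather than the paper's mixed formulation. The neurodynamic model dx/dt = -phi(x, Mx + q) drives the complementarity residual to zero.

    import numpy as np

    def fischer_burmeister(a, b):
        # NCP function: phi(a, b) = 0  iff  a >= 0, b >= 0, a*b = 0
        return a + b - np.sqrt(a**2 + b**2)

    def lcp_neurodynamics(M, q, x0, rate=0.1, steps=5000):
        # Forward-Euler integration of dx/dt = -phi(x, Mx + q);
        # a fixed point of the dynamics solves the LCP.
        x = x0.astype(float).copy()
        for _ in range(steps):
            x -= rate * fischer_burmeister(x, M @ x + q)
        return x

    M = np.array([[2.0, 1.0], [1.0, 2.0]])  # hypothetical positive definite example
    q = np.array([-1.0, -1.0])
    x = lcp_neurodynamics(M, q, np.zeros(2))
    print(x, M @ x + q)  # x >= 0, Mx + q >= 0, complementarity residual ~ 0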
Persistent link: https://www.econbiz.de/10008674173
Persistent link: https://www.econbiz.de/10008674177
In this paper, we propose a new sequential-systems-of-linear-equations (SSLE) filter algorithm, which is an infeasible QP-free method. The new algorithm needs to solve a few reduced systems of linear equations with the same nonsingular coefficient matrix, and after finitely many iterations, only...
Persistent link: https://www.econbiz.de/10009141317
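
The computational point of the SSLE abstract above is that several linear systems share one nonsingular coefficient matrix, so a single factorization can be reused for every right-hand side. A minimal sketch of that factor-once/solve-many pattern (the matrix and right-hand sides below are arbitrary, not the paper's reduced systems):

    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 5)) + 5 * np.eye(5)  # hypothetical nonsingular matrix
    lu, piv = lu_factor(A)                           # factor once, O(n^3)

    for b in (rng.standard_normal(5) for _ in range(3)):
        x = lu_solve((lu, piv), b)                   # each additional solve is only O(n^2)
        assert np.allclose(A @ x, b)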
Conjugate gradient methods, which usually generate descent search directions, are useful for large-scale optimization. Narushima et al. (SIAM J Optim 21:212-230, 2011) have proposed a three-term conjugate gradient method which satisfies a sufficient descent condition. We extend this...
Persistent link: https://www.econbiz.de/10011151822
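
To make the "sufficient descent" idea above concrete, here is one classical three-term direction update (not necessarily the Narushima et al. variant): by construction it gives g'd = -||g||^2 exactly, so descent holds regardless of the line search.

    import numpy as np

    def three_term_direction(g_new, g_old, d_old):
        # d = -g + beta*d_old - theta*y with beta, theta chosen so that
        # the beta and theta terms cancel in g_new' d, leaving -||g_new||^2.
        y = g_new - g_old
        denom = g_old @ g_old
        beta = (g_new @ y) / denom
        theta = (g_new @ d_old) / denom
        return -g_new + beta * d_old - theta * y

    g_old = np.array([1.0, -2.0]); g_new = np.array([0.5, 0.3])
    d_old = np.array([-1.0, 2.0])
    d_new = three_term_direction(g_new, g_old, d_old)
    print(np.isclose(g_new @ d_new, -(g_new @ g_new)))  # True: descent guaranteed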
In this paper, a generalized recurrent neural network is proposed for solving ϵ-insensitive support vector regression (ϵ-ISVR). The ϵ-ISVR is first formulated as a convex non-smooth programming problem, and then a generalized recurrent neural network with lower model complexity is designed for...
Persistent link: https://www.econbiz.de/10011051206
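
The convex non-smooth structure mentioned above comes from the ϵ-insensitive loss itself: residuals inside the ϵ-tube cost nothing, larger ones are penalized linearly, and the max/abs kinks make the problem non-smooth. A minimal sketch (the ϵ value is arbitrary):

    import numpy as np

    def eps_insensitive_loss(y_true, y_pred, eps=0.1):
        # Zero inside the eps-tube, linear outside; convex but non-smooth.
        return np.maximum(0.0, np.abs(y_true - y_pred) - eps)

    y = np.array([1.0, 2.0, 3.0])
    f = np.array([1.05, 2.5, 2.0])
    print(eps_insensitive_loss(y, f, eps=0.1))  # [0.  0.4 0.9]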
In this paper, we propose a strongly sub-feasible direction method for the solution of inequality constrained optimization problems whose objective functions are not necessarily differentiable. The algorithm combines the subgradient aggregation technique with the ideas of generalized cutting...
Persistent link: https://www.econbiz.de/10011052810
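
As background for the cutting-plane ideas named in the abstract above: a non-differentiable convex objective is approximated from below by the maximum of its linearizations built from subgradients. The sketch below shows only that piecewise-linear model, not the paper's aggregation scheme or its strongly sub-feasible direction machinery.

    import numpy as np

    def cutting_plane_model(x, points, values, subgrads):
        # Lower model: max over cuts f(x_i) + g_i * (x - x_i).
        return max(v + g * (x - p) for p, v, g in zip(points, values, subgrads))

    # Example: f(x) = |x|, with subgradient sign(x) at each sample point.
    pts = [-1.0, 0.5, 2.0]
    vals = [abs(p) for p in pts]
    subg = [np.sign(p) for p in pts]
    for x in (-0.5, 0.0, 1.0):
        print(x, cutting_plane_model(x, pts, vals, subg), abs(x))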
In this work we propose a class of quasi-Newton methods to minimize a twice differentiable function with Lipschitz continuous Hessian. These methods are based on the quadratic regularization of Newton's method, with explicit algebraic rules for computing the regularization parameter. The...
Persistent link: https://www.econbiz.de/10011241274
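
The quadratic regularization of Newton's method mentioned above amounts to shifting the Hessian by sigma*I before solving for the step. The snippet does not give the authors' explicit rule for sigma, so the sketch below assumes a simple illustrative choice, sigma = c*||grad||, which vanishes near a solution and recovers the pure Newton step.

    import numpy as np

    def regularized_newton_step(grad, hess, c=1.0):
        # Solve (H + sigma*I) d = -g; the shift keeps the system well
        # conditioned and the step a descent direction. The rule
        # sigma = c*||grad|| is an assumption, not the paper's formula.
        sigma = c * np.linalg.norm(grad)
        return np.linalg.solve(hess + sigma * np.eye(len(grad)), -grad)

    # Hypothetical quadratic test problem f(x) = 0.5 x'Ax - b'x.
    A = np.array([[3.0, 1.0], [1.0, 2.0]]); b = np.array([1.0, 1.0])
    x = np.zeros(2)
    for _ in range(50):
        g = A @ x - b
        x = x + regularized_newton_step(g, A)
    print(x, np.linalg.solve(A, b))  # iterates approach the exact minimizer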