We consider a new class of huge-scale problems, the problems with sparse subgradients. The most important functions of this type are piece-wise linear. For optimization problems with uniform sparsity of corresponding linear operators, we suggest a very efficient implementation of subgradient...
Persistent link: https://www.econbiz.de/10010610488
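To make the sparsity idea in the entry above concrete (the abstract is truncated, so the authors' exact scheme is not reproduced here), consider the standard piecewise-linear model, written in notation of our own choosing:
$$ f(x) = \max_{1 \le i \le m} \big( \langle a_i, x \rangle + b_i \big), \qquad g(x) = a_{i(x)} \in \partial f(x), $$
where $i(x)$ is any index attaining the maximum. If every $a_i$ has at most $p \ll n$ nonzero entries, a subgradient step $x^{+} = x - h\, g(x)$ changes only $p$ coordinates of $x$, so the inner products $\langle a_i, x \rangle$ can be refreshed at a cost governed by the sparsity pattern rather than by the full dimension $n$.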
<Para ID="Par1">We consider a class of nonsmooth convex optimization problems where the objective function is a convex differentiable function regularized by the sum of the group reproducing kernel norm and <InlineEquation ID="IEq1"> <EquationSource Format="TEX">$$\ell _1$$</EquationSource> <EquationSource Format="MATHML"> <math xmlns:xlink="http://www.w3.org/1999/xlink"> <msub> <mi>ℓ</mi> <mn>1</mn> </msub> </math> </EquationSource> </InlineEquation>-norm of the problem variables. This class of problems has many applications in...</equationsource></equationsource></inlineequation></para>
Persistent link: https://www.econbiz.de/10011241278
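The truncated abstract above does not show the algorithm itself; as background on the $\ell_1$ component of such composite regularizers, its proximal map has a simple closed form (componentwise soft-thresholding). A minimal sketch, with a function name of our own choosing:

import numpy as np

def soft_threshold(z, lam):
    # Proximal operator of lam * ||.||_1: shrink each entry of z toward zero by lam.
    # Illustration only; the name and interface are ours, not taken from the paper.
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

# Example: soft_threshold(np.array([3.0, -0.5, 1.2]), 1.0) gives roughly [2.0, 0.0, 0.2].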
We consider a class of nonsmooth convex optimization problems where the objective function is the composition of a strongly convex differentiable function with a linear mapping, regularized by the group reproducing kernel norm. This class of problems arises naturally from applications in group...
Persistent link: https://www.econbiz.de/10010845793
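The group reproducing kernel norm in the entry above is not spelled out in the truncated abstract; as a simpler point of comparison, for the plain non-overlapping group regularizer $\Omega(x) = \sum_{g} \|x_g\|_2$ the proximal map acts by block soft-thresholding:
$$ \big[\mathrm{prox}_{\lambda \Omega}(z)\big]_g = \Big(1 - \frac{\lambda}{\|z_g\|_2}\Big)_{+} z_g, $$
where $(t)_+ = \max\{t, 0\}$ and the expression is read as $0$ when $z_g = 0$. Whole groups are either shrunk or zeroed at once, which is the structural effect group regularizers are used for.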
In this paper, we develop new subgradient methods for solving nonsmooth convex optimization problems. These methods are the first for which the whole sequence of test points is endowed with worst-case performance guarantees. The new methods are derived from a relaxed estimating...
Persistent link: https://www.econbiz.de/10010927696
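For contrast with the guarantee claimed in the entry above, recall the classical projected subgradient method on a closed convex set $Q$, $x_{i+1} = \pi_Q(x_i - h_i g_i)$ with $g_i \in \partial f(x_i)$: its standard worst-case bound covers only the best point seen so far,
$$ \min_{0 \le i \le k} f(x_i) - f^{*} \;\le\; \frac{\|x_0 - x^{*}\|^2 + \sum_{i=0}^{k} h_i^2 \|g_i\|^2}{2 \sum_{i=0}^{k} h_i}, $$
and says nothing about an individual late iterate, which is exactly the gap that per-iterate guarantees address.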
We introduce an axiomatic formalism for the concept of the center of a set in a Euclidean space. Then we explain how to exploit possible symmetries and possible cyclicities in the set in order to localize its center. Special attention is paid to the determination of centers in cones of matrices....
Persistent link: https://www.econbiz.de/10010995357
Persistent link: https://www.econbiz.de/10010995460
In many applications it is possible to justify a reasonable bound on the possible variation of the subgradients of the objective function rather than on their uniform magnitude. In this paper we develop a new class of efficient primal-dual subgradient schemes for such problem classes.
Persistent link: https://www.econbiz.de/10005043014
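One natural way to formalize the assumption in the entry above (the truncated abstract does not show the authors' exact condition) is to replace the classical bound $\|g(x)\| \le L$ for all subgradients on the feasible set $Q$ by a bound on their variation,
$$ \|g(x) - g(y)\| \;\le\; \Delta \qquad \text{for all } x, y \in Q,\; g(x) \in \partial f(x),\; g(y) \in \partial f(y), $$
which can be much smaller than $2L$ when the subgradients cluster around a common direction, for instance when a steep linear term dominates the objective.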
In this paper we present a new approach for constructing subgradient schemes for different types of nonsmooth problems with convex structure. Our methods are primal-dual since they are always able to generate a feasible approximation to the optimum of an appropriately formulated dual problem....
Persistent link: https://www.econbiz.de/10005043237
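A representative scheme of this primal-dual type (sketched here in generic notation, not necessarily in the exact form analyzed in the paper) is dual averaging: with a prox-function $d$ that is strongly convex on the feasible set $Q$ and a nondecreasing sequence $\beta_k > 0$, the iterates are
$$ x_{k+1} = \arg\min_{x \in Q} \Big\{ \sum_{i=0}^{k} \langle g_i, x \rangle + \beta_k\, d(x) \Big\}, \qquad g_i \in \partial f(x_i). $$
By convexity, the averaged linear model $\frac{1}{k+1}\sum_{i=0}^{k} \big[ f(x_i) + \langle g_i, x - x_i \rangle \big]$ is a lower bound on $f$, which is what allows such schemes to certify a dual gap along the way.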
In this paper we propose and analyze a variant of the level method [4], which is an algorithm for minimizing nonsmooth convex functions. The main work per iteration is spent on 1) minimizing a piecewise-linear model of the objective function and on 2) projecting onto the intersection of the...
Persistent link: https://www.econbiz.de/10005008186
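For reference, in the classical level method the two operations named above take the following form (notation chosen here for illustration; the variant proposed in the paper may differ). With the cutting-plane model and the two bounds
$$ \check f_k(x) = \max_{0 \le i \le k} \big[ f(x_i) + \langle g_i, x - x_i \rangle \big], \qquad \hat f_k = \min_{x \in Q} \check f_k(x), \qquad f_k^{\mathrm{rec}} = \min_{0 \le i \le k} f(x_i), $$
one fixes a level $\ell_k = (1-\lambda)\hat f_k + \lambda f_k^{\mathrm{rec}}$ with $\lambda \in (0,1)$ and takes $x_{k+1} = \pi_{L_k}(x_k)$, the projection of the current point onto $L_k = \{x \in Q : \check f_k(x) \le \ell_k\}$. Computing $\hat f_k$ is the piecewise-linear minimization and computing $x_{k+1}$ is the projection step mentioned in the abstract.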
In this paper we develop a new primal-dual subgradient method for nonsmooth convex optimization problems. This scheme is based on a self-concordant barrier for the basic feasible set. It is suitable for finding approximate solutions with certain relative accuracy. We discuss some applications of...
Persistent link: https://www.econbiz.de/10005065359
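For reference, a convex function $F$, three times continuously differentiable on the interior of a closed convex set $Q$ and tending to $+\infty$ along sequences approaching the boundary of $Q$, is a $\nu$-self-concordant barrier in the standard Nesterov-Nemirovskii sense (the paper may use an equivalent variant) if for all $x \in \operatorname{int} Q$ and all directions $h$
$$ \big| D^3 F(x)[h,h,h] \big| \le 2 \big( D^2 F(x)[h,h] \big)^{3/2}, \qquad \big| DF(x)[h] \big| \le \nu^{1/2} \big( D^2 F(x)[h,h] \big)^{1/2}. $$
The standard example is $F(x) = -\sum_{j=1}^{n} \ln x_j$, an $n$-self-concordant barrier for the nonnegative orthant.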