News

Conjugate gradient methods form a class of iterative algorithms that are highly effective for solving large-scale unconstrained optimisation problems.
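As a minimal illustration of the conjugate gradient idea (this sketch is not taken from the cited work), the linear CG iteration solves a symmetric positive-definite system A x = b, the setting where the method's conjugate search directions are easiest to see:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive-definite A via conjugate gradients."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x          # residual
    p = r.copy()           # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        # Update the direction so it stays conjugate to the previous ones
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)   # converges in at most n = 2 iterations here
```

In exact arithmetic CG reaches the solution in at most n iterations, which is why it scales to large sparse problems where direct factorisation is impractical.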
In the setting of Hilbert spaces, we show that a hybrid steepest-descent algorithm converges strongly to a solution of a convex minimization problem over the fixed point set of a finite family of ...
Unlike the metaphorical mountaineer, optimization researchers can program their gradient descent algorithms to take steps of any size. Giant leaps are tempting but also risky, as they could overshoot ...
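The overshooting risk of a large step size can be seen on the simplest possible objective, f(x) = x², whose gradient is 2x. This toy demo (an illustration, not any paper's method) shows one step size contracting toward the minimum and a larger one diverging:

```python
def gradient_descent(step, x0=1.0, iters=20):
    """Minimize f(x) = x**2 (gradient 2x) with a fixed step size."""
    x = x0
    for _ in range(iters):
        x -= step * 2 * x   # each update multiplies x by (1 - 2*step)
    return x

small = gradient_descent(0.1)   # |1 - 2*0.1| = 0.8 < 1: shrinks toward 0
large = gradient_descent(1.1)   # |1 - 2*1.1| = 1.2 > 1: overshoots, blows up
```

For this quadratic the update is x ← (1 − 2η)x, so the iteration converges exactly when |1 − 2η| < 1, i.e. 0 < η < 1, making the stability threshold explicit.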
In this paper we present two algorithms for LC¹ unconstrained optimization problems which use the second-order Dini upper directional derivative. These methods are simple and easy to implement. We ...