Scientific Publications

Abstract

Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems, and they have been studied extensively in recent years. In this paper, a new family of conjugate gradient methods is proposed for unconstrained optimization. This family includes two existing practical nonlinear conjugate gradient methods, produces a descent search direction at every iteration, and converges globally provided that the line search satisfies the Wolfe conditions. Numerical experiments testing the efficiency of the new family indicate that it is promising. In addition, the methods related to this family are discussed in a unified manner.
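
For context, the abstract refers to the standard nonlinear conjugate gradient framework: a search direction d_k = -g_k + beta_k * d_{k-1}, with a step length chosen to satisfy the Wolfe conditions. The Python sketch below illustrates that generic framework only; the beta formula shown (Polak-Ribiere+) and the names nonlinear_cg, f, grad are assumptions for illustration, not the specific family proposed in the paper.

    import numpy as np
    from scipy.optimize import line_search  # step lengths satisfying the strong Wolfe conditions

    def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
        # Generic nonlinear CG loop; an illustrative sketch, not the paper's method.
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                                   # first direction: steepest descent
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # Step length alpha satisfying the (strong) Wolfe conditions.
            alpha = line_search(f, grad, x, d, gfk=g)[0]
            if alpha is None:                    # search failed: restart along -g
                d = -g
                alpha = line_search(f, grad, x, d, gfk=g)[0]
                if alpha is None:
                    break
            x = x + alpha * d
            g_new = grad(x)
            # Polak-Ribiere+ coefficient: a common stand-in; the paper's family
            # defines its own choice of beta.
            beta = max(0.0, g_new @ (g_new - g) / (g @ g))
            d = -g_new + beta * d
            g = g_new
        return x

    # Example: minimize the 5-dimensional Rosenbrock function.
    from scipy.optimize import rosen, rosen_der
    x_min = nonlinear_cg(rosen, rosen_der, np.zeros(5))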


BibTeX

@article{uniusa517,
    title={A new family of globally convergent conjugate gradient methods},
    author={Badereddine Sellami and Yacine Chaib},
    journal={Annals of Operations Research},
    year={2016},
    volume={241},
    pages={497--513},
    publisher={Springer}
}