Scientific Publications


Abstract

The conjugate gradient method is a useful and powerful
approach for solving large-scale minimization problems. In this paper,
a new nonlinear conjugate gradient method is proposed for large-scale
unconstrained optimization. The method combines two existing practical
nonlinear conjugate gradient methods, bringing together the strong
global convergence properties of the Fletcher-Reeves method (abbreviated FR)
and the good numerical performance of the Polak-Ribière-Polyak method
(abbreviated PRP). It produces a descent search direction at every
iteration and converges globally provided that the line search satisfies
the Wolfe conditions. Our numerical results show that the new method is
very efficient on the given test problems. In addition, we study methods
related to the new nonlinear conjugate gradient method.
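
For reference, the FR and PRP methods mentioned in the abstract use the standard conjugate gradient update below; this is a reminder of the textbook definitions only, not the specific hybrid parameter proposed in the paper, which is not reproduced on this page:

\[
  d_k = -g_k + \beta_k d_{k-1}, \qquad
  \beta_k^{\mathrm{FR}} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2}, \qquad
  \beta_k^{\mathrm{PRP}} = \frac{g_k^{\top}(g_k - g_{k-1})}{\|g_{k-1}\|^2},
\]

where \(g_k = \nabla f(x_k)\). The Wolfe conditions on the step size \(\alpha_k\) require

\[
  f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k g_k^{\top} d_k,
  \qquad
  \nabla f(x_k + \alpha_k d_k)^{\top} d_k \ge c_2 g_k^{\top} d_k,
  \qquad 0 < c_1 < c_2 < 1.
\]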


BibTeX

@article{uniusa699,
    title={Global convergence of a nonlinear conjugate gradient method for unconstrained optimization},
    author={Badereddine Sellami and Mohammed Belloufi and Yacine Chaib},
    journal={RAIRO. Operations Research},
    year={2017},
    volume={},
    number={},
    pages={},
    publisher={}
}