Scientific Publications



Abstract

Conjugate gradient methods are an important class of methods for unconstrained
optimization, especially for large-scale problems, and they have recently
received considerable attention. In this paper, we propose a new two-parameter
family of conjugate gradient methods for unconstrained optimization. The family
not only includes three existing practical nonlinear conjugate gradient methods,
but also contains another family of conjugate gradient methods as a subfamily.
With the Wolfe line search, the two-parameter family is shown to guarantee the
descent property of each search direction. Some general convergence results are
also established for the family. Numerical results show that the methods are
efficient on the given test problems. In addition, the methods related to this
family are discussed in a uniform way.
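For readers who want a concrete picture of how such a scheme is organized, the
sketch below implements a generic nonlinear conjugate gradient loop with a
strong Wolfe line search (via scipy.optimize.line_search). The two-parameter
update beta_two_param is a hypothetical hybrid written for illustration only:
the abstract does not state the paper's actual formula, and the blend itself,
the parameter names lam and mu, and the function names are assumptions, not
the authors' method.

import numpy as np
from scipy.optimize import line_search

def beta_two_param(g_new, g_old, d_old, lam, mu):
    """Hypothetical two-parameter hybrid beta (illustration only,
    not the family defined in the paper)."""
    y = g_new - g_old
    num = (1.0 - mu) * (g_new @ g_new) + mu * (g_new @ y)
    den = (1.0 - lam) * (g_old @ g_old) + lam * (d_old @ y)
    return num / den

def cg_two_param(f, grad, x0, lam=0.5, mu=0.5, tol=1e-6, max_iter=500):
    """Generic nonlinear CG loop with a strong Wolfe line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Strong Wolfe line search (SciPy's implementation)
        alpha, *_ = line_search(f, grad, x, d, gfk=g)
        if alpha is None:           # line search failed: restart with -g
            d = -g
            alpha, *_ = line_search(f, grad, x, d, gfk=g)
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = beta_two_param(g_new, g, d, lam, mu)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function
if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    x_star = cg_two_param(rosen, rosen_der, np.array([-1.2, 1.0]))
    print(x_star)                   # should approach [1.0, 1.0]

As a design note, this particular (assumed) blend reduces to the classical
Fletcher-Reeves (mu=0, lam=0), Dai-Yuan (mu=0, lam=1), Polak-Ribiere-Polyak
(mu=1, lam=0), and Hestenes-Stiefel (mu=1, lam=1) formulas at the corner
settings, which illustrates the kind of structure the abstract alludes to.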


BibTeX

@article{uniusa513,
    title={A new two-parameter family of nonlinear conjugate gradient methods},
    author={Badereddine Sellami and Yamina Laskri and Rachid Benzine},
    journal={Optimization: A Journal of Mathematical Programming and Operations Research},
    year={2013},
    volume={64},
    number={4},
    pages={993-1009},
    publisher={Taylor \& Francis}
}