Scientific Publications



Abstract

Nonlinear conjugate gradient methods play a major role in solving unconstrained optimization problems, especially large-scale ones. In this thesis, we develop new conjugate gradient parameters and study the global convergence and sufficient descent property of the resulting methods under the strong Wolfe line search (SWLS). Numerical results show that the proposed methods are effective and robust in minimizing a set of unconstrained optimization problems.
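
For context only, and in standard notation (the specific formulas for the parameter β_k developed in the thesis are not reproduced here), a generic nonlinear conjugate gradient iteration for minimizing f with gradient g_k = ∇f(x_k), together with the strong Wolfe conditions and the sufficient descent property mentioned above, reads:

    x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k,

where the step length \alpha_k satisfies the strong Wolfe conditions, with constants 0 < c_1 < c_2 < 1,

    f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k \, g_k^{T} d_k, \qquad
    \lvert g(x_k + \alpha_k d_k)^{T} d_k \rvert \le c_2 \, \lvert g_k^{T} d_k \rvert,

and the sufficient descent property means that, for some constant c > 0,

    g_k^{T} d_k \le -c \, \lVert g_k \rVert^{2} \quad \text{for all } k.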

Furthermore, the proposed algorithms were extended to problems from nonparametric statistics, specifically the estimation of the mode function and of the conditional mode regression function.
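
As a sketch in standard notation (the estimators actually studied in the thesis may differ in their exact form), the mode and the conditional mode are commonly defined through kernel density estimators, so that computing them is a smooth maximization problem to which a conjugate gradient method can be applied:

    \hat f_n(x) = \frac{1}{n h_n} \sum_{i=1}^{n} K\!\left(\frac{x - X_i}{h_n}\right), \qquad
    \hat\theta_n = \arg\max_{x} \hat f_n(x),

    \hat\theta_n(x) = \arg\max_{y} \hat f_n(y \mid x), \qquad
    \hat f_n(y \mid x) = \frac{\hat f_n(x, y)}{\hat f_n(x)},

where K is a kernel, h_n the bandwidth, and \hat f_n(x, y) the joint kernel density estimator of the pair (X, Y).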

Keywords: Conjugate gradient method, Inexact line search, Descent condition, Numerical comparisons, Global convergence, Mode function, Conditional mode regression, Kernel estimators.


BibTeX

@phdthesis{uniusa5187,
    title={Using conjugate gradient methods for regression function},
    author={Abdelhamid MEHAMDIA},
    year={2024},
    school={University of Souk Ahras}
}