Amina HALLAL (2024). Quelques techniques pour améliorer les performances des méthodes à directions conjuguées [Some techniques to improve the performance of conjugate-direction methods]. PhD thesis, University of Souk Ahras.
Scientific Publications
Abstract
The conjugate gradient method is an effective method for solving unconstrained nonlinear optimization problems. In this thesis, we propose three new hybrid conjugate gradient algorithms.
In the first method, the parameter β_k is computed as a convex combination of two parameters, taken from the Hager-Zhang and Dai-Yuan conjugate gradient methods. In the second method, β_k is determined as a convex combination of three parameters, taken from the Dai-Yuan, Conjugate Descent, and Hestenes-Stiefel conjugate gradient methods. In the last method, β_k is computed as a convex combination of four parameters, taken from the Dai-Yuan, Fletcher-Reeves, Polak-Ribière-Polyak, and Hestenes-Stiefel conjugate gradient methods. The convex combination parameter of the first method can be computed when the search direction satisfies the pure conjugacy condition, while those of the second and third methods can be computed when the Dai-Liao (D-L) conjugacy condition is satisfied.
We have established the global convergence of all three methods under the strong Wolfe conditions. Numerical results demonstrate the effectiveness of the three new hybrid methods. Additionally, we applied the third method to image restoration problems and showed its efficiency.
Key words: Unconstrained optimization; Strong Wolfe line search; Conjugate gradient method; Convex combination; Global convergence; Numerical comparisons.
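The hybrid scheme described in the abstract can be illustrated with a minimal sketch. The code below is not the thesis's algorithm: it uses a fixed mixing weight `theta` (whereas the thesis derives it from a conjugacy condition) and an exact line search valid only for quadratics (whereas the thesis uses a strong Wolfe line search). It shows only the core idea: the direction update d_{k+1} = -g_{k+1} + β_k d_k, with β_k a convex combination of the Dai-Yuan and Hestenes-Stiefel parameters.

```python
import numpy as np

def hybrid_cg(A, b, x0, theta=0.5, tol=1e-8, max_iter=200):
    """Sketch of a hybrid CG method on the quadratic f(x) = 0.5 x'Ax - b'x.

    beta_k is the convex combination (1 - theta) * beta_DY + theta * beta_HS.
    Here theta is a fixed illustrative constant; the thesis instead computes
    the combination parameter from a conjugacy condition.
    """
    x = x0.astype(float)
    g = A @ x - b                        # gradient of the quadratic model
    d = -g                               # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ A @ d)   # exact step length (quadratic case only)
        x = x + alpha * d
        g_new = A @ x - b
        y = g_new - g                    # gradient difference y_k
        denom = d @ y
        beta_dy = (g_new @ g_new) / denom    # Dai-Yuan parameter
        beta_hs = (g_new @ y) / denom        # Hestenes-Stiefel parameter
        beta = (1 - theta) * beta_dy + theta * beta_hs   # convex combination
        d = -g_new + beta * d            # conjugate direction update
        g = g_new
    return x

# Usage: solve a small symmetric positive-definite system via the quadratic model.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)              # SPD matrix
b = rng.standard_normal(5)
x = hybrid_cg(A, b, np.zeros(5))
```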
Information
| Item Type | Thesis |
|---|---|
| Divisions | Faculty of Science and Technology |
| ePrint ID | 4925 |
| Date Deposited | 2024-03-21 |
| Further Information | Google Scholar |
| URI | https://univ-soukahras.dz/en/publication/article/4925 |
BibTex
@phdthesis{uniusa4925,
  title={Quelques techniques pour améliorer les performances des méthodes à directions conjuguées},
  author={Amina HALLAL},
  year={2024},
  school={University of Souk Ahras}
}