
Published Journal Articles

2024

Two new limited-memory preconditioned conjugate gradient algorithms for nonlinear optimization problems

2024-02
Journal of Intelligent & Fuzzy Systems (Issue : 2) (Volume : 46)
Conjugate gradient (CG) techniques are a class of unconstrained optimization algorithms with strong local and global convergence properties and minimal memory needs. Quasi-Newton methods are reliable and efficient on a wide range of problems; they converge faster than conjugate gradient methods and require fewer function evaluations, but they demand substantially more storage, and if the problem is ill-conditioned they may require many iterations. A further class, the preconditioned conjugate gradient methods, combines the conjugate gradient and quasi-Newton approaches. In this work, we propose two new limited-memory preconditioned conjugate gradient methods (New1 and New2) for solving nonlinear unconstrained minimization problems, using a new modified symmetric rank-one update (NMSR1), a new modified Davidon-Fletcher-Powell update (NMDFP), and projected vectors. We prove that these modifications fulfill certain conditions, and we also prove the descent condition of the new technique. Numerical results on standard nonlinear unconstrained test problems show the efficiency of the proposed algorithms.
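For orientation, a generic preconditioned CG iteration has the form below, where H_{k+1} denotes the quasi-Newton-style preconditioner; the paper's NMSR1 and NMDFP updates define particular choices of H_{k+1} and are not reproduced here:

    x_{k+1} = x_k + \alpha_k d_k, \qquad
    d_{k+1} = -H_{k+1}\, g_{k+1} + \beta_k d_k,

where g_k = \nabla f(x_k), \alpha_k is the line-search step length, and \beta_k is the conjugacy parameter.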
2023

Two New Preconditioned Conjugate Gradient Methods for Minimization Problems

2023-01
Mathematics and Statistics (Issue : 1) (Volume : 11)
In application to general functions, the conjugate gradient and quasi-Newton methods each have particular advantages and disadvantages. Conjugate gradient (CG) techniques are a class of unconstrained optimization algorithms with strong local and global convergence properties and minimal memory needs. Quasi-Newton methods are reliable and efficient on a wide range of problems; they converge faster than conjugate gradient methods and require fewer function evaluations, but they have the disadvantage of requiring substantially more storage, and if the problem is ill-conditioned they may take many iterations. A new class has been developed, termed the preconditioned conjugate gradient (PCG) method, which combines the conjugate gradient and quasi-Newton methods. In this work, two new preconditioned conjugate gradient algorithms, New PCG1 and New PCG2, are proposed for solving nonlinear unconstrained optimization problems. New PCG1 combines the Hestenes-Stiefel (HS) conjugate gradient method with a new self-scaling symmetric rank-one (SR1) update, and New PCG2 combines HS with a new self-scaling Davidon-Fletcher-Powell (DFP) update. Both algorithms use the strong Wolfe line search condition. Numerical comparisons show that the new computational schemes outperform standard preconditioned conjugate gradient algorithms.
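A minimal sketch of a preconditioned Hestenes-Stiefel CG loop under a strong Wolfe line search is given below, using SciPy's line_search (which enforces the strong Wolfe conditions). The identity is used as a placeholder preconditioner; the paper's New PCG1 / New PCG2 would replace precond with the new self-scaling SR1 and DFP updates, which are not reproduced here.

    import numpy as np
    from scipy.optimize import line_search

    def pcg_hs(f, grad, x0, tol=1e-6, max_iter=500):
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        precond = lambda v: v              # placeholder preconditioner: H_k = I
        d = -precond(g)
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # step length alpha_k satisfying the strong Wolfe conditions
            alpha = line_search(f, grad, x, d)[0]
            if alpha is None:              # line search failed: restart with -H g
                d = -precond(g)
                continue
            x_new = x + alpha * d
            g_new = grad(x_new)
            y = g_new - g
            # Hestenes-Stiefel parameter (preconditioner applied to y_k)
            beta = (g_new @ precond(y)) / (d @ y)
            d = -precond(g_new) + beta * d
            x, g = x_new, g_new
        return x

For example, pcg_hs(lambda v: float(v @ v), lambda v: 2 * v, np.ones(4)) converges to the zero vector in a few iterations.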
2022

A NEW MODIFIED CONJUGATE GRADIENT FOR NONLINEAR MINIMIZATION PROBLEMS

2022-10
Science Journal of University of Zakho (Issue : 4) (Volume : 10)
The conjugate gradient method is a highly effective and well-known technique for solving unconstrained nonlinear minimization problems, with many applications; conjugate gradient techniques are widely applied to large-scale unconstrained minimization problems. In this paper, we suggest a new conjugate gradient parameter, based on the parameter of Dai and Liao, for solving nonlinear unconstrained minimization problems. We study the descent property, the sufficient descent property, and the global convergence of the new method, and we introduce numerical data demonstrating its efficacy.
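For reference, the Dai and Liao parameter that the abstract builds on is the standard

    \beta_k^{DL} = \frac{g_{k+1}^{\top}(y_k - t\, s_k)}{d_k^{\top} y_k},
    \qquad s_k = x_{k+1} - x_k, \quad y_k = g_{k+1} - g_k, \quad t > 0;

the paper's new parameter is a modification of this formula and is not reproduced here.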
2021

A New conjugate gradient method for unconstrained optimization problems with descent property

2021-05
General Letters in Mathematics (GLM) (Issue : 2) (Volume : 9)
In this paper, we propose a new conjugate gradient method for solving nonlinear unconstrained optimization problems. The new parameter consists of three parts, the first being the Hestenes-Stiefel (HS) parameter. The proposed method satisfies the descent condition, the sufficient descent condition, and the conjugacy condition. We give numerical results showing the efficiency of the suggested method.
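For reference, the Hestenes-Stiefel parameter forming the first part, together with the standard CG recurrence, is

    \beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \qquad
    d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad y_k = g_{k+1} - g_k;

the remaining two parts of the proposed parameter are defined in the paper and are not reproduced here.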
2019

A New Parameter Conjugate Gradient Method Based on Three Terms Unconstrained Optimization

2019-11
General Letters in Mathematics (Issue : 1) (Volume : 7)
In this paper, we suggest a new conjugate gradient method for solving nonlinear unconstrained optimization problems using a three-term conjugate gradient approach. We establish the descent condition and the sufficient descent condition of the suggested method.
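A generic three-term CG direction, of which the paper's method is a specific instance (the particular third term and parameters are defined in the paper and not reproduced here), takes the form

    d_{k+1} = -g_{k+1} + \beta_k d_k + \theta_k y_k, \qquad y_k = g_{k+1} - g_k,

where \beta_k and \theta_k are the two scalar parameters of the method.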
2016

New conjugate gradient method for unconstrained optimization with logistic mapping

2016-06
Journal of University of Zakho (Issue : 1) (Volume : 4)
In this paper, we suggest a new conjugate gradient algorithm for unconstrained optimization based on the logistic mapping; the descent condition and the sufficient descent condition for our method are established. Numerical results show that the presented algorithm is more efficient for solving nonlinear unconstrained optimization problems compared with the Dai-Yuan (DY) method.
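For reference, the logistic mapping and the Dai-Yuan (DY) parameter used as the comparison baseline are the standard

    z_{n+1} = r\, z_n (1 - z_n), \qquad
    \beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^{\top} y_k}, \quad y_k = g_{k+1} - g_k;

how the logistic mapping enters the new parameter is specified in the paper and is not reproduced here.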
2014

New Iterative Conjugate Gradient Method for Nonlinear Unconstrained Optimization Using Homotopy Technique

2014-06
IOSR Journal of Mathematics (IOSR-JM) (Issue : 3) (Volume : 10)
We propose a new hybrid conjugate gradient method for unconstrained optimization using a homotopy formula; the parameter is computed as a convex combination of the Polak-Ribiere (PR) [9] and Al-Bayati and Al-Assady (BA) [1] parameters.
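The homotopy (convex-combination) construction referred to above has the generic form

    \beta_k = (1 - \theta_k)\, \beta_k^{PR} + \theta_k\, \beta_k^{BA},
    \qquad \theta_k \in [0, 1], \qquad
    \beta_k^{PR} = \frac{g_{k+1}^{\top} y_k}{\|g_k\|^2};

\beta_k^{PR} is the standard Polak-Ribiere parameter, while the Al-Bayati and Al-Assady parameter \beta_k^{BA} and the choice of \theta_k are defined in the paper and not reproduced here.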

New Homotopy Conjugate Gradient for Unconstrained Optimization using Hestenes-Stiefel and Conjugate Descent

2014-05
IOSR Journal of Engineering (IOSRJEN) (Issue : 05) (Volume : 04)
In this paper, we suggest a hybrid conjugate gradient method for unconstrained optimization using a homotopy formula; we calculate the parameter \beta_k as a convex combination of HS (Hestenes-Stiefel) [5] and CD (Conjugate Descent) [3].
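Both baseline parameters here are standard; with y_k = g_{k+1} - g_k, the convex combination takes the form

    \beta_k = (1 - \theta_k)\, \beta_k^{HS} + \theta_k\, \beta_k^{CD},
    \qquad
    \beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k},
    \qquad
    \beta_k^{CD} = -\frac{\|g_{k+1}\|^2}{d_k^{\top} g_k};

the homotopy choice of \theta_k is defined in the paper and is not reproduced here.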
