
Scientific Research

2023

A NEW THREE-TERM CONJUGATE GRADIENT ALGORITHM FOR SOLVING MINIMIZATION PROBLEMS

2023-12
Science Journal of University of Zakho (Issue: 4) (Volume: 11)
Optimization methods are used to determine the best value of a function over a given domain; they are studied and employed mainly in mathematics, computer science, and physics. This work presents a novel three-term conjugate gradient (CG) approach for unconstrained optimization problems. The new approach satisfies both the descent condition and the sufficient descent condition, and its global convergence has also been analyzed. The outcomes of numerical trials on a few well-known test functions demonstrate how effective the new modified method is in terms of the number of iterations (NOI) and the number of function evaluations (NOF).
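The abstract does not reproduce the method's specific coefficients, so the sketch below is only a minimal, hypothetical illustration of the generic three-term CG iteration d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k on which such methods are built; the beta/theta pairing and the Armijo line search are placeholder assumptions, not the authors' formulas.

```python
import numpy as np

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic three-term CG skeleton: d_{k+1} = -g_{k+1} + beta*d_k - theta*y_k.

    The beta/theta choices below are illustrative placeholders (a common
    Hestenes-Stiefel-style pairing), not the coefficients from the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple Armijo backtracking line search (a stand-in).
        alpha, c, rho = 1.0, 1e-4, 0.5
        while alpha > 1e-12 and f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d.dot(y)
        if abs(denom) < 1e-12:            # safeguard: restart with steepest descent
            beta = theta = 0.0
        else:
            beta = g_new.dot(y) / denom   # placeholder second-term coefficient
            theta = g_new.dot(d) / denom  # placeholder third-term coefficient
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

# Example: the 2D Rosenbrock function, minimizer at (1, 1).
rosen = lambda v: (1 - v[0])**2 + 100 * (v[1] - v[0]**2)**2
rosen_g = lambda v: np.array([-2 * (1 - v[0]) - 400 * v[0] * (v[1] - v[0]**2),
                              200 * (v[1] - v[0]**2)])
print(three_term_cg(rosen, rosen_g, [-1.2, 1.0]))
```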

A New Conjugate Gradient Algorithm for Minimization Problems Based on the Modified Conjugacy Condition

2023-06
Mathematics and Statistics (Issue: 4) (Volume: 11)
Optimization refers to the process of finding the best possible solution to a problem within a given set of constraints. It involves maximizing or minimizing a specific objective function while adhering to those constraints. Optimization is used in many fields, including mathematics, engineering, economics, computer science, and data science. The objective function can be a simple equation, a complex algorithm, or a mathematical model that describes a system or process, and many optimization techniques are available, including linear programming, nonlinear programming, genetic algorithms, simulated annealing, and particle swarm optimization; these techniques use different algorithms to search for the optimal solution. This paper concerns unconstrained optimization, whose goal is to minimize an objective function of real variables with no restrictions on their values. Based on a modified conjugacy condition, we offer a new conjugate gradient (CG) approach for nonlinear unconstrained optimization problems. The new method satisfies the descent condition and the sufficient descent condition. We compare the numerical results of the new method with the Hestenes-Stiefel (HS) method. As demonstrated by the numerical results on certain well-known nonlinear test functions, our novel method is quite effective in terms of the number of iterations (NOI) and the number of function evaluations (NOF).
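For reference, the classical conjugacy condition and the Hestenes-Stiefel baseline that the paper compares against are the standard definitions below; the paper's own modified conjugacy condition is not reproduced in the abstract and is not shown here.

```latex
d_{k+1}^{T} y_k = 0, \qquad y_k = g_{k+1} - g_k, \qquad
\beta_k^{HS} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k}, \qquad
d_{k+1} = -g_{k+1} + \beta_k^{HS}\, d_k .
```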

A New Quasi-Newton Method with PCG Method for Nonlinear Optimization Problems

2023-01
Mathematics and Statistics (Issue: 1) (Volume: 11)
The major stationary iterative method used to solve nonlinear optimization problems is the quasi-Newton (QN) method. Symmetric Rank-One (SR1) is a method in the quasi-Newton family. This algorithm converges toward the true Hessian quickly and has computational advantages for sparse or partially separable problems [1]. Thus, investigating the efficiency of the SR1 algorithm is significant. However, the matrix generated by the SR1 update is not guaranteed to be positive definite, and the denominator of the update may vanish or become zero. To overcome these drawbacks of the SR1 method, and to obtain better performance than the standard SR1 method, in this work we derive a new vector y_k^* depending on the Barzilai-Borwein step size and use it to obtain a new SR1 method; this updating formula is then combined with the preconditioned conjugate gradient (PCG) method. With the aid of an inexact line search procedure using the strong Wolfe conditions, the new SR1 method is proposed and its performance is evaluated in comparison to the conventional SR1 method. It is proven that the updated matrix of the new SR1 method, H_{k+1}^{new}, is symmetric and positive definite, given that H_k is initialized to the identity matrix. In this study, the proposed method solved 13 problems effectively in terms of the number of iterations (NI) and the number of function evaluations (NF). Regarding NF, the new SR1 method also outperformed the classic SR1 method. The proposed method is shown to be more efficient in solving relatively large-scale problems (5,000 variables) compared to the original method. From the numerical results, the proposed method turned out to be significantly faster, effective, and suitable for solving large-dimension nonlinear equations.
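The paper's modified vector y_k^* is not given in the abstract; as context, here is a minimal sketch of the standard SR1 inverse-Hessian update with its usual skip safeguard (the failure mode the modification targets) and the first Barzilai-Borwein step length. The function names are illustrative, not the authors' code.

```python
import numpy as np

def sr1_update(H, s, y, r=1e-8):
    """Standard SR1 update of the inverse-Hessian approximation H:
    H+ = H + v v^T / (v^T y), with v = s - H y.

    The usual safeguard skips the update when the denominator v^T y
    (nearly) vanishes -- the known failure mode of SR1.
    """
    v = s - H @ y
    denom = v.dot(y)
    if abs(denom) < r * np.linalg.norm(v) * np.linalg.norm(y):
        return H  # skip: denominator too small, update would be unstable
    return H + np.outer(v, v) / denom

def bb_step(s, y):
    """First Barzilai-Borwein step length: alpha = s^T s / s^T y,
    falling back to 1.0 when the curvature s^T y is not positive."""
    sy = s.dot(y)
    return s.dot(s) / sy if sy > 0 else 1.0
```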
2022

HYBRIDIZATION OF GRADIENT-BASED METHODS WITH GENETIC ALGORITHM FOR SOLVING SYSTEMS OF LINEAR EQUATIONS

2022-11
Journal of Duhok University (Issue: 2) (Volume: 25)
In this paper, we propose two hybrid methods that combine gradient-based methods with a genetic algorithm for solving systems of linear equations with fast convergence. The first proposed hybrid method uses the steepest descent method and the second the Cauchy-Barzilai-Borwein (CBB) method. These algorithms are based on minimizing the residual of the solution while retaining genetic characteristics. They are compared with the standard genetic algorithm and standard gradient-based methods in order to show their accuracy and convergence speed. Since the conjugate gradient method is recommended for solving large, sparse, symmetric positive definite systems, we also compare the numerical results of our proposed algorithms with this method. The numerical results demonstrate the robustness and efficiency of the proposed algorithms. Moreover, we observe that our hybridization of the CBB method and the genetic algorithm gives more accurate results with faster convergence than the other mentioned methods in all given cases.
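A minimal sketch of the hybridization idea follows, assuming fitness is the residual norm ||b - Ax||, one steepest-descent refinement step per generation, and uniform crossover with Gaussian mutation; the paper's actual genetic operators and its CBB variant are not reproduced.

```python
import numpy as np

def sd_step(A, b, x):
    """One steepest-descent step for A x = b with A symmetric positive
    definite; exact line search gives alpha = r^T r / (r^T A r)."""
    r = b - A @ x
    rr = r.dot(r)
    if rr == 0.0:
        return x
    return x + (rr / r.dot(A @ r)) * r

def hybrid_ga(A, b, pop_size=20, gens=200, seed=0):
    """GA over candidate solution vectors with fitness = residual norm;
    each generation, every individual is refined by one gradient step.
    A sketch of the hybrid idea only, not the paper's exact scheme."""
    rng = np.random.default_rng(seed)
    n = b.size
    pop = rng.normal(size=(pop_size, n))
    for _ in range(gens):
        pop = np.array([sd_step(A, b, x) for x in pop])   # gradient refinement
        fit = np.linalg.norm(pop @ A.T - b, axis=1)       # residual norms
        elite = pop[np.argsort(fit)[: pop_size // 2]]     # selection
        # Uniform crossover + small Gaussian mutation to refill the population.
        m = pop_size - len(elite)
        parents = elite[rng.integers(0, len(elite), size=(m, 2))]
        mask = rng.random((m, n)) < 0.5
        children = np.where(mask, parents[:, 0], parents[:, 1])
        children = children + 0.01 * rng.normal(size=children.shape)
        pop = np.vstack([elite, children])
    fit = np.linalg.norm(pop @ A.T - b, axis=1)
    return pop[np.argmin(fit)]

# Example on a small SPD system (hypothetical data).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(hybrid_ga(A, b))  # approaches the exact solution of A x = b
```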

New search direction of steepest descent method for solving large linear systems

2022-08
General Letters in Mathematics (Issue: 2) (Volume: 12)
The steepest descent (SD) method is well known as the simplest method in optimization. In this paper, we propose a new SD search direction for solving systems of linear equations Ax = b. We also prove that the proposed SD method with exact line search satisfies the descent condition and possesses global convergence properties. The proposed method is motivated by previous work on the SD method by Zubai'ah-Mustafa-Rivaie-Ismail (ZMRI) [2]. Numerical comparisons with a classical SD algorithm and the ZMRI algorithm show that this algorithm is very effective in terms of the number of iterations (NOI) and CPU time.
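For reference, the classical SD iteration for Ax = b (A symmetric positive definite), obtained by minimizing f(x) = (1/2) x^T A x - b^T x with exact line search, is given below; the new and ZMRI search directions modify d_k and are not reproduced here.

```latex
g_k = A x_k - b = -r_k, \qquad d_k = -g_k = r_k, \qquad
\alpha_k = \frac{r_k^{T} r_k}{r_k^{T} A r_k}, \qquad
x_{k+1} = x_k + \alpha_k d_k ,
```

so the descent condition holds automatically: d_k^T g_k = -||r_k||^2 < 0 whenever r_k is nonzero.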
2021

A MODIFIED CONJUGATE GRADIENT METHOD WITH GLOBAL CONVERGENCE FOR UNCONSTRAINED OPTIMIZATION PROBLEMS

2021-10
Journal of Duhok University (Issue: 2) (Volume: 24)
In this paper, a new conjugate gradient method for unconstrained optimization is suggested. The new method is based on a modification of the Conjugate Descent (CD) formula; it satisfies the sufficient descent condition and is globally convergent. Numerical evidence shows that the new algorithm is competitive with existing conjugate gradient methods.
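The Conjugate Descent formula of Fletcher that serves as the starting point is standard; the modified coefficient itself is not given in the abstract and is not shown here.

```latex
\beta_k^{CD} = -\frac{\lVert g_{k+1} \rVert^{2}}{d_k^{T} g_k}, \qquad
d_{k+1} = -g_{k+1} + \beta_k^{CD}\, d_k ,
```

where the sufficient descent condition means g_k^T d_k <= -c ||g_k||^2 for some constant c > 0.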
2016

New Quasi-Newton (DFP) with Logistic Mapping

2016-10
University of Zakho (Issue: 1) (Volume: 4)
In this paper, we propose a modification of the self-scaling quasi-Newton (DFP) method for unconstrained optimization using logistic mapping. We show that it produces a positive definite matrix. Numerical results demonstrate that the new algorithm is superior to the standard DFP method with respect to the number of iterations (NOI) and the number of function evaluations (NOF).
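The abstract does not spell out how the logistic map enters the self-scaling, so the sketch below only shows the two standard ingredients, the classical DFP inverse-Hessian update and the logistic map, with the coupling left as an assumption.

```python
import numpy as np

def dfp_update(H, s, y):
    """Classical DFP update of the inverse-Hessian approximation:
    H+ = H + s s^T / (s^T y) - (H y)(H y)^T / (y^T H y).
    It preserves positive definiteness whenever the curvature s^T y > 0."""
    Hy = H @ y
    return H + np.outer(s, s) / s.dot(y) - np.outer(Hy, Hy) / y.dot(Hy)

def logistic_map(x, r=4.0):
    """Logistic map x_{n+1} = r x (1 - x); chaotic on (0, 1) for r = 4.
    How the paper derives its self-scaling parameter from this map is
    not stated in the abstract, so no coupling is shown here."""
    return r * x * (1.0 - x)
```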

Sufficient Condition of Convergence of the Fletcher-Reeves Method for Solving Almost Symmetric Problems

2016-07
University of Duhok (Issue: 1) (Volume: 19)
The conjugate gradient method is one of the most important algorithms for solving large, sparse symmetric positive definite (SPD) systems. In this paper, a sufficient condition for convergence of the Fletcher-Reeves conjugate gradient method on almost symmetric matrices is presented: the Euclidean norm of the nonsymmetric part of the positive definite matrix must be less than a threshold value related to the smallest and largest eigenvalues of the symmetric part of the matrix under consideration.
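The symmetric/nonsymmetric splitting underlying this condition is the standard one; the exact threshold is not given in the abstract and is not filled in here.

```latex
A = A_s + A_n, \qquad
A_s = \tfrac{1}{2}\,(A + A^{T}), \qquad
A_n = \tfrac{1}{2}\,(A - A^{T}),
```

with the convergence condition bounding the Euclidean norm of A_n in terms of the smallest and largest eigenvalues of A_s.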
2015

Convergence of the Barzilai-Borwein Method for Slightly Unsymmetric Linear Systems

2015-11
University of Zakho (Issue: 1) (Volume: 3)
Due to its simplicity and numerical efficiency, the Barzilai and Borwein (BB) gradient method has received considerable attention in different scientific fields. In this paper, a sufficient condition for convergence of the BB method when the coefficient matrix of the linear algebraic equations is slightly unsymmetric with a positive definite symmetric part is presented.
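For reference, the two classical BB step lengths (standard definitions, stated here for context):

```latex
s_{k-1} = x_k - x_{k-1}, \qquad y_{k-1} = g_k - g_{k-1}, \qquad
\alpha_k^{BB1} = \frac{s_{k-1}^{T} s_{k-1}}{s_{k-1}^{T} y_{k-1}}, \qquad
\alpha_k^{BB2} = \frac{s_{k-1}^{T} y_{k-1}}{y_{k-1}^{T} y_{k-1}},
```

and the iteration is x_{k+1} = x_k - alpha_k g_k.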

Modification of Fletcher-Reeves Method with Wolfe Condition

2015-09
International Journal of Enhanced Research in Science Technology & Engineering (Issue: 5) (Volume: 4)
In this paper, we propose a new conjugate gradient method for unconstrained optimization problems using the logistic equation. One of the remarkable properties of conjugate gradient methods is their ability to generate descent directions while storing only a few vectors, which is why they are widely used for large-scale unconstrained optimization problems.
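For context, the Fletcher-Reeves coefficient and the standard Wolfe conditions referenced by the title are:

```latex
\beta_k^{FR} = \frac{\lVert g_{k+1} \rVert^{2}}{\lVert g_k \rVert^{2}}, \qquad
f(x_k + \alpha d_k) \le f(x_k) + c_1\, \alpha\, g_k^{T} d_k, \qquad
g(x_k + \alpha d_k)^{T} d_k \ge c_2\, g_k^{T} d_k ,
```

with 0 < c_1 < c_2 < 1.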
2006

An investigation of minimization conjugate gradient type methods

2006-05
Dohuk University (Issue: 1) (Volume: 9)
2005

A modified algorithm for the combined Barrier-Penalty constrained problem

2005-06
Dohuk University (Issue: 1) (Volume: 8)
