
Scientific Research

2023

THE NEW RANK ONE CLASS FOR UNCONSTRAINED PROBLEMS SOLVING

2023-04
Science Journal of University of Zakho (Issue: 2) (Volume: 11)
The quasi-Newton approach is one of the most well-known iterative methods for solving unconstrained optimization problems, widely recognized for its high accuracy and fast convergence. In this work, we derive a new algorithm for the symmetric rank-one (SR1) method, with the step length selected according to the strong Wolfe line search criteria. We also prove a new quasi-Newton equation and a positive-definiteness theorem for the updated matrix. Preliminary computational testing on a set of fourteen unconstrained optimization test functions indicates that the new method is more effective and robust than the classical SR1 method in terms of the number of iterations and function evaluations.
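
For context, the classical SR1 update that this paper takes as its starting point can be sketched in a few lines of Python/NumPy; the paper's modified algorithm and its new quasi-Newton equation are not reproduced here.

```python
import numpy as np

def sr1_update(B, s, y, r_tol=1e-8):
    """Classical symmetric rank-one (SR1) update of the Hessian
    approximation B, with step s = x_{k+1} - x_k and gradient change
    y = g_{k+1} - g_k. Not the paper's modified update."""
    r = y - B @ s                              # secant-equation residual
    denom = r @ s
    # Standard safeguard: skip the update when the denominator is tiny,
    # since SR1 is otherwise numerically unstable.
    if abs(denom) < r_tol * np.linalg.norm(r) * np.linalg.norm(s):
        return B
    return B + np.outer(r, r) / denom
```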

New spectral LS conjugate gradient method for nonlinear unconstrained optimization

2023-01
International Journal of Computer Mathematics (Issue: 4) (Volume: 100)
In this work, we propose a novel spectral conjugate gradient algorithm for unconstrained nonlinear optimization problems. First, we theoretically prove that the proposed method satisfies the sufficient descent condition and the conjugacy condition, and we establish a global convergence theorem. The experimental setup uses Powell's conjugacy condition coupled with a cubic polynomial line search under strong Wolfe conditions to ensure quick convergence. The experimental results demonstrate that the proposed method shows superior performance in terms of the number of iterations to convergence and the number of function evaluations when compared to traditional methods such as Liu and Storey (LS) and Conjugate Descent (CD).
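
As a rough illustration of the ingredients named above, here is a minimal sketch of a spectral CG direction built on the classical Liu-Storey parameter β_k^LS = -g_{k+1}^T y_k / (d_k^T g_k); the spectral parameter θ is taken as the Barzilai-Borwein ratio, which is an assumption, since the abstract does not state the paper's own choice.

```python
import numpy as np

def spectral_ls_direction(g_new, g_old, d, s):
    """One spectral CG direction d+ = -theta * g_new + beta_LS * d.
    beta_LS is the classical Liu-Storey parameter; theta is the
    Barzilai-Borwein ratio (an assumed choice, not the paper's)."""
    y = g_new - g_old
    beta_ls = -(g_new @ y) / (d @ g_old)   # classical LS formula
    theta = (s @ s) / (s @ y)              # BB spectral parameter (assumption)
    return -theta * g_new + beta_ls * d
```
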
2022

Conjugated Gradient with Four Terms for Nonlinear Unconstrained Optimization

2022-05
General Letters in Mathematics (GLM) (Issue: 1) (Volume: 12)
The nonlinear conjugate gradient (CG) technique is an effective tool for addressing large-scale minimization problems and can be used in a variety of applications. In this article, we present a novel conjugate gradient approach based on two hypotheses: we equate the two hypotheses and extract a good parameter from them. To obtain a new conjugate gradient method, we multiply the new parameter by a control parameter and substitute it into the second equation, yielding a new formula for 𝛽𝑘. The proposed method has global convergence properties. When compared to the two most common conjugate gradient techniques, our algorithm outperforms them in terms of both the number of iterations (NOIs) and the number of function evaluations (NOFs). According to the numerical results, the new technique is efficient in practical computation and superior to previous comparable approaches in many instances.
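
The abstract does not give the four-term formula itself, so the sketch below only shows the generic nonlinear CG loop that such a formula plugs into, with the classical Hestenes-Stiefel β as a placeholder and SciPy's strong-Wolfe line search for the step length.

```python
import numpy as np
from scipy.optimize import line_search

def cg_minimize(f, grad, x0, max_iter=1000, tol=1e-6):
    """Generic nonlinear CG skeleton with a strong-Wolfe line search.
    beta below is the classical Hestenes-Stiefel choice, standing in
    for the paper's four-term formula, which the abstract does not state."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:              # line search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0]
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        beta = (g_new @ y) / (d @ y)   # Hestenes-Stiefel placeholder
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```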

A New Algorithm for Spectral Conjugate Gradient in Nonlinear Optimization

2022-03
Mathematics and Statistics (Issue: 2) (Volume: 10)
Nonlinear conjugate gradient (CG) algorithms have been used to solve large-scale unconstrained optimization problems. Because of their minimal memory requirements and global convergence properties, they are widely used in a variety of fields, and the approach has recently undergone many investigations and modifications aimed at enhancing it. The conjugate gradient is also highly relevant to everyday problems: whatever we do, we strive for the best outcome, such as the highest profit, the lowest loss, the shortest road, or the shortest time; these are referred to as minima and maxima in mathematics, and one way to find them is spectral gradient descent. For multidimensional unconstrained objective functions, the spectral conjugate gradient (SCG) approach is a strong tool. In this study, we describe a novel SCG technique and quantify its performance. Under standard assumptions, we establish the descent condition, the sufficient descent theorem, the conjugacy condition, and global convergence criteria using the strong Wolfe and Powell line searches. Numerical data and performance graphs were produced using benchmark functions, drawn from commonly used classical test functions, to demonstrate the efficacy of the recommended approach. According to the numerical results, the suggested strategy is more efficient than some existing techniques. In addition, we show how the new method may be utilized to improve solutions and outcomes.
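
As a concrete reference for the line-search conditions mentioned above, here is a minimal check of the strong Wolfe conditions for a trial step; the constants c1 and c2 are conventional choices, not values taken from the paper.

```python
import numpy as np

def strong_wolfe_holds(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Test whether a trial step alpha along direction d satisfies the
    strong Wolfe conditions (standard textbook form, not the paper's code)."""
    slope0 = grad(x) @ d                                     # initial slope
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * slope0
    curvature = abs(grad(x + alpha * d) @ d) <= c2 * abs(slope0)
    return armijo and curvature
```
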
2021

Global convergence of new three terms conjugate gradient for unconstrained optimization

2021-10
General Letters in Mathematics (GLM) (Issue: 1) (Volume: 11)
In this paper, a new formula for 𝛽𝑘 is suggested for the conjugate gradient method for solving unconstrained optimization problems, based on three terms and a cubic step size. Our newly proposed CG method satisfies the descent condition, the sufficient descent condition, the conjugacy condition, and global convergence properties. Numerical comparisons with two standard conjugate gradient algorithms show that this algorithm is very effective in terms of the number of iterations and the number of function evaluations.
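
The abstract does not state the new 𝛽𝑘 itself, so as an illustration of the three-term family here is the classical Zhang-Zhou-Li three-term PRP direction, whose construction makes the descent condition hold automatically.

```python
import numpy as np

def three_term_prp_direction(g_new, g_old, d):
    """Classical three-term PRP direction (Zhang-Zhou-Li style), a
    well-known member of the three-term family; the paper's own beta_k
    differs. By construction g_new @ d_new = -||g_new||^2, so the
    descent condition holds automatically."""
    y = g_new - g_old
    gg = g_old @ g_old
    beta = (g_new @ y) / gg        # classical PRP parameter
    theta = (g_new @ d) / gg
    return -g_new + beta * d - theta * y
```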

ENHANCE THE EFFICIENCY OF RMIL'S FORMULA FOR MINIMUM PROBLEM

2021-10
Journal of University of Duhok (Pure and Eng. Sciences) (Issue: 2) (Volume: 24)
In this paper, a new formula for 𝜷𝒌 is suggested for the conjugate gradient method for solving unconstrained optimization problems, based on a modification and update of the RMIL formula with the inclusion of a parameter and a cubic step size. Our newly proposed CG method satisfies the descent condition and global convergence properties. Numerical comparisons with the standard RMIL conjugate gradient algorithm show that this algorithm is very effective in terms of the number of iterations and the number of function evaluations.
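
For reference, the unmodified RMIL parameter that this paper builds on is sketched below; the added parameter and the cubic step size of the proposed method are not reproduced.

```python
import numpy as np

def beta_rmil(g_new, g_old, d):
    """Classical RMIL conjugate gradient parameter
        beta_k = g_{k+1}^T (g_{k+1} - g_k) / ||d_k||^2,
    i.e. the baseline formula the paper modifies."""
    y = g_new - g_old
    return (g_new @ y) / (d @ d)
```
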
2015

A New Conjugate Gradient for Nonlinear Unconstrained Optimization

2015-05
International Journal of Advanced Research in Engineering & Management (IJAREM) (Issue: 2) (Volume: 1)
The conjugate gradient method is a very useful technique for solving minimization problems and has wide applications in many fields. In this paper, we propose a new conjugate gradient method for nonlinear unconstrained optimization. The given method satisfies the descent condition under the strong Wolfe line search and the global convergence property for uniformly convex functions. Numerical results based on the number of iterations (NOI) and the number of function evaluations (NOF) show that the new 𝛽𝑘^(New) performs better than the Hestenes-Stiefel (HS) CG method.
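
The comparison method named above is Hestenes-Stiefel; its classical parameter is sketched below, since the abstract does not give the proposed 𝛽𝑘^(New).

```python
import numpy as np

def beta_hs(g_new, g_old, d):
    """Classical Hestenes-Stiefel parameter
        beta_k = g_{k+1}^T y_k / (d_k^T y_k),
    the baseline the paper compares against."""
    y = g_new - g_old
    return (g_new @ y) / (d @ y)
```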

A Modification of Quasi-Newton (DFP) Method for Solving Unconstrained Optimization Problems

2015-04
International Journal of Advanced Research in Engineering & Management (IJAREM) (Issue: 1) (Volume: 1)
The quasi-Newton method is a very useful technique for solving minimization problems and has wide applications in many fields. In this paper, we develop a new class of DFP method for unconstrained optimization. The given method satisfies the quasi-Newton condition and the positive-definiteness theorem under the strong Wolfe line search. Numerical results based on the number of iterations (NOI) and the number of function evaluations (NOF) show that the new method (New5) performs better than the standard DFP method.
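
For context, the standard DFP update of the inverse-Hessian approximation that the paper modifies is sketched below; the new class (New5) itself is not reproduced.

```python
import numpy as np

def dfp_update(H, s, y):
    """Classical DFP update of the inverse Hessian approximation:
        H+ = H + s s^T / (s^T y) - (H y)(H y)^T / (y^T H y).
    H stays symmetric positive definite whenever s^T y > 0, which the
    strong Wolfe line search guarantees."""
    Hy = H @ y
    return H + np.outer(s, s) / (s @ y) - np.outer(Hy, Hy) / (y @ Hy)
```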

Improve Performance of Fletcher-Reeves (FR) Method

2015-04
International Journal of Enhanced Research in Science Technology & Engineering (Issue: 4) (Volume: 4)
Conjugate gradient (CG) methods are popular for solving nonlinear unconstrained optimization problems because they require little computational memory. In this paper, we propose a new conjugate gradient parameter (𝛃𝐤^(New1)) that possesses global convergence properties under both exact and inexact line searches. The given method satisfies the sufficient descent condition under the strong Wolfe line search. Numerical results based on the number of iterations (NOI) and the number of function evaluations (NOF) show that the new 𝛃𝐤^(New1) performs better than the Fletcher-Reeves (FR) CG method.
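
The baseline here is the classical Fletcher-Reeves parameter, sketched below; the proposed 𝛃𝐤^(New1) is not stated in the abstract.

```python
import numpy as np

def beta_fr(g_new, g_old):
    """Classical Fletcher-Reeves parameter
        beta_k = ||g_{k+1}||^2 / ||g_k||^2,
    the baseline the paper improves on."""
    return (g_new @ g_new) / (g_old @ g_old)
```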
