Ahmed Mustafa


Assistant Teacher

Specialties

Numerical Optimization in Mathematics

Education

Mathematics

Mathematics at University of Zakho

2015

Membership


Department Coordinator
2023-01-01 – present

Member, Teaching Staff
2015-10-29 – present

Academic Title

Assistant Teacher

2021-01-03

Published Journal Articles

EUROPEAN JOURNAL OF PURE AND APPLIED MATHEMATICS (Issue : 3) (Volume : 18)
An Enhanced Conjugate Gradient Method for Nonlinear Minimization Problems


Because of their computational efficiency and minimal memory requirements, conjugate gradient methods are a fundamental family of algorithms for handling large-scale unconstrained nonlinear optimization problems. A new version of the Hestenes-Stiefel (HS) method is presented in this study with the goal of improving convergence properties without compromising ease of use. We rigorously prove the global convergence of the proposed approach under standard assumptions and show that it satisfies the conjugacy, descent, and sufficient descent conditions. Extensive numerical tests, covering a wide range of benchmark problems, show that the proposed method routinely outperforms the classical HS method in terms of function evaluations and iteration count.
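For reference, the classical Hestenes-Stiefel scheme that this abstract builds on can be sketched as follows. This is a generic baseline with a simple Armijo backtracking line search (the paper itself uses stronger line-search conditions) and an illustrative restart safeguard; it is not the enhanced method proposed in the article.

```python
import numpy as np

def hs_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=500):
    """Classical Hestenes-Stiefel (HS) nonlinear conjugate gradient.
    Illustrative baseline only, not the paper's proposed variant."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (the paper uses strong Wolfe)
        alpha, c1, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                    # gradient difference y_k
        beta = (g_new @ y) / (d @ y)     # Hestenes-Stiefel beta
        d = -g_new + beta * d
        if g_new @ d >= 0:               # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic the iteration recovers the minimizer to the gradient tolerance.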

 2025-08
Science Journal of University of Zakho (Issue : 2) (Volume : 11)
THE NEW RANK ONE CLASS FOR UNCONSTRAINED PROBLEMS SOLVING


The quasi-Newton approach is one of the most well-known iterative methods for solving unconstrained problems. Quasi-Newton methods are well recognized for their high precision and fast convergence. In this work, we derive a new algorithm for the symmetric rank-one (SR1) method. The step length is selected according to the strong Wolfe line search criteria. We also prove a new quasi-Newton equation and a positive-definiteness theorem. Preliminary computational testing on a set of fourteen unconstrained optimization test functions leads to the conclusion that the new method is more effective and robust than the classical SR1 implementation in terms of iteration count and function evaluations.
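For context, the classical SR1 update that this work modifies maintains an inverse-Hessian approximation with a single symmetric rank-one correction per step. The sketch below is a generic baseline with an Armijo backtracking line search and the standard skip rule for small denominators; the function name and safeguards are illustrative, not the paper's new algorithm.

```python
import numpy as np

def sr1_minimize(f, grad, x0, tol=1e-6, max_iter=200, r=1e-8):
    """Classical symmetric rank-one (SR1) quasi-Newton method.
    Illustrative baseline only, not the paper's proposed variant."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(x.size)                   # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g
        # Armijo backtracking (the paper selects steps via strong Wolfe)
        alpha, c1, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= rho
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        v = s - H @ y
        # SR1 update, skipped when the denominator is too small
        if abs(v @ y) > r * np.linalg.norm(v) * np.linalg.norm(y):
            H = H + np.outer(v, v) / (v @ y)
        x, g = x_new, g_new
    return x
```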

 2023-04
International Journal of Computer Mathematics (Issue : 4) (Volume : 100)
New spectral LS conjugate gradient method for nonlinear unconstrained optimization


In this work, we propose a novel algorithm to perform spectral conjugate gradient descent for an unconstrained, nonlinear optimization problem. First, we theoretically prove that the proposed method satisfies the sufficient descent condition, the conjugacy condition, and the global convergence theorem. The experimental setup uses Powell’s conjugacy condition coupled with a cubic polynomial line search using strong Wolfe conditions to ensure quick convergence. The experimental results demonstrate that the proposed method shows superior performance in terms of the number of iterations to convergence and the number of function evaluations when compared to traditional methods such as Liu and Storey (LS) and Conjugate Descent (CD).

 2023-01
General Letters in Mathematics (GLM) (Issue : 12) (Volume : 1)
Conjugated Gradient with Four Terms for Nonlinear Unconstrained Optimization


The nonlinear conjugate gradient (GJG) technique is an effective tool for large-scale minimization and can be used in a variety of applications. In this article, we present a novel conjugate gradient approach based on two hypotheses: we equate the two hypotheses and recover a suitable parameter. To obtain a new conjugate gradient, we multiply the new parameter by a control parameter and substitute it into the second equation, yielding a fresh formula for 𝛽𝑘. The method has global convergence properties. Compared with the two most common conjugate gradient techniques, our algorithm outperforms them in both the number of iterations (NOI) and the number of function evaluations (NOF). According to numerical results, the new technique is efficient in real computation and superior to previous comparable approaches in many instances.

 2022-05
Mathematics and Statistics (Issue : 10) (Volume : 2)
A New Algorithm for Spectral Conjugate Gradient in Nonlinear Optimization


Nonlinear conjugate gradient (CG) algorithms have been used to solve large-scale unconstrained optimization problems. Because of their minimal memory needs and global convergence properties, they are widely used in a variety of fields, and the approach has recently undergone many investigations and modifications. Optimization is significant in daily life: whatever we do, we strive for the best outcome, such as the highest profit, the lowest loss, the shortest road, or the shortest time, referred to in mathematics as minima and maxima, and spectral gradient descent is one such method. For multidimensional unconstrained objective functions, the spectral conjugate gradient (SCJG) approach is a strong tool. In this study, we describe a new SCG technique and quantify its performance. Under standard assumptions, we establish the descent condition, sufficient descent theorem, conjugacy condition, and global convergence criteria using strong Wolfe-Powell line searches. Numerical data and graphs constructed using common benchmark functions demonstrate the efficacy of the recommended approach; according to the numerical statistics, the suggested strategy is more efficient than some current techniques. In addition, we show how the new method may be utilized to improve solutions and outcomes.
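As a reference point, a spectral conjugate gradient direction combines a spectral scaling of the gradient with a CG correction term. The sketch below is illustrative only: it uses a Barzilai-Borwein-type spectral parameter and an HS-type beta, whereas the paper derives its own parameters.

```python
import numpy as np

def spectral_direction(g_new, g, d, s):
    """One generic spectral-CG search direction: d+ = -theta*g+ + beta*d.
    Illustrative parameter choices, not the paper's derivation."""
    y = g_new - g                        # gradient difference
    theta = (s @ s) / (s @ y)            # Barzilai-Borwein spectral parameter
    beta = (g_new @ y) / (d @ y)         # HS-type beta, for illustration
    return -theta * g_new + beta * d
```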

 2022-03
General Letters in Mathematics (GLM) (Issue : 1) (Volume : 11)
Global convergence of new three terms conjugate gradient for unconstrained optimization


In this paper, a new formula for 𝛽𝑘 is suggested for the conjugate gradient method for solving unconstrained optimization problems, based on three terms and a cubic step size. Our newly proposed CG method has the descent condition, sufficient descent condition, conjugacy condition, and global convergence properties. Numerical comparisons with two standard conjugate gradient algorithms show that this algorithm is very effective in terms of the number of iterations and the number of function evaluations.

 2021-10
Journal of University of Duhok (Pure and Eng. Sciences) (Issue : 2) (Volume : 24)
ENHANCE THE EFFICIENCY OF RMIL'S FORMULA FOR MINIMUM PROBLEM


In this paper, a new formula for 𝜷𝒌 is suggested for the conjugate gradient method for solving unconstrained optimization problems, based on a modification of RMIL's formula with the inclusion of a parameter and a cubic step size. Our novel proposed CG method has the descent condition and global convergence properties. Numerical comparisons with the standard RMIL conjugate gradient algorithm show that this algorithm is very effective in terms of the number of iterations and the number of function evaluations.

 2021-10
International Journal of Advanced Research in Engineering & Management (IJAREM) (Issue : 2) (Volume : 1)
A New Conjugate Gradient for Nonlinear Unconstrained Optimization


The conjugate gradient method is a very useful technique for solving minimization problems and has wide applications in many fields. In this paper we propose a new conjugate gradient method for nonlinear unconstrained optimization. The given method satisfies the descent condition under the strong Wolfe line search and has the global convergence property for uniformly convex functions. Numerical results based on the number of iterations (NOI) and the number of function evaluations (NOF) have shown that the new 𝛽𝑘^New performs better than the Hestenes-Stiefel (HS) CG method.

 2015-05
International Journal of Advanced Research in Engineering & Management (IJAREM) (Issue : 1) (Volume : 1)
A Modification of Quasi-Newton (DFP) Method for Solving Unconstrained Optimization Problems


The quasi-Newton method is a very useful technique for solving minimization problems and has wide applications in many fields. In this paper we develop a new class of DFP methods for unconstrained optimization. The given method satisfies the quasi-Newton condition and the positive-definiteness theorem under the strong Wolfe line search. Numerical results based on the number of iterations (NOI) and the number of function evaluations (NOF) have shown that the new method (New5) performs better than the standard DFP method.
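For context, the classical DFP inverse-Hessian update that the paper modifies can be sketched as follows. This is a generic baseline with an Armijo backtracking line search and a standard curvature check; it is illustrative only, not the proposed New5 method.

```python
import numpy as np

def dfp_minimize(f, grad, x0, tol=1e-6, max_iter=200):
    """Classical DFP quasi-Newton method with backtracking line search.
    Illustrative baseline only, not the paper's modified class."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(x.size)                   # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g
        # Armijo backtracking (the paper uses strong Wolfe)
        alpha, c1, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= rho
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                   # curvature check keeps H positive definite
            Hy = H @ y
            H = H + np.outer(s, s) / sy - np.outer(Hy, Hy) / (y @ Hy)
        x, g = x_new, g_new
    return x
```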

 2015-04
International Journal of Enhanced Research in Science Technology & Engineering, (Issue : 4) (Volume : 4)
Improve Performance of Fletcher-Reeves (FR) Method


Conjugate gradient (CG) methods are famous for solving nonlinear unconstrained optimization problems because they require little computational memory. In this paper, we propose a new conjugate gradient parameter (𝛃𝐤^New1) which possesses global convergence properties under both exact and inexact line searches. The given method satisfies the sufficient descent condition under the strong Wolfe line search. Numerical results based on the number of iterations (NOI) and the number of function evaluations (NOF) have shown that the new 𝛃𝐤^New1 performs better than the Fletcher-Reeves (FR) CG method.
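The classical CG methods that the papers in this list compare against (Fletcher-Reeves, Hestenes-Stiefel, Liu-Storey, Conjugate Descent) differ only in the formula for 𝛽𝑘. A reference sketch of the standard formulas, not any of the proposed variants:

```python
import numpy as np

def beta_classical(g_new, g, d, variant="FR"):
    """Classical conjugate gradient update parameters for reference."""
    y = g_new - g                        # gradient difference y_k
    if variant == "FR":                  # Fletcher-Reeves
        return (g_new @ g_new) / (g @ g)
    if variant == "HS":                  # Hestenes-Stiefel
        return (g_new @ y) / (d @ y)
    if variant == "LS":                  # Liu-Storey
        return -(g_new @ y) / (g @ d)
    if variant == "CD":                  # Conjugate Descent (Fletcher)
        return -(g_new @ g_new) / (g @ d)
    raise ValueError(f"unknown variant: {variant}")
```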

 2015-04

Thesis

2015-12-01
A New Self-Scaling Preconditioned Conjugate Gradient and Family of New Scaled CG for Nonlinear Unconstrained Optimization


 2015

Presentation

A hall on the ground floor at the College of Education
2023-02
Conjugated Gradient with Four Terms for Nonlinear Unconstrained Optimization

Published in an international journal

 2023
A hall on the ground floor at the College of Education
2022-04
ENHANCE THE EFFICIENCY OF RMIL'S FORMULA FOR MINIMUM PROBLEM

Presenting a Seminar related to a published research in a local journal

 2022
A hall on the ground floor at the College of Education
2022-04
A New Algorithm for Spectral Conjugate Gradient in Nonlinear Optimization

Presenting a Seminar related to a published research in an international journal

 2022
A hall on the ground floor at the College of Education
2022-02
Global convergence of new three terms conjugate gradient for unconstrained optimization

Presenting a Seminar related to a published research in an international journal

 2022
Online
2021-04
Steepest Descent Method

Attending / Presenting Seminar

 2021
Online
2021-04
Concept of Optimization

Attending / Presenting Seminar

 2021
A hall on the ground floor at the College of Education
2020-06
Basic Concepts of Optimization

Attending / Presenting Seminar

 2020

Workshop

Online
2021-12
Quality Assurance from A-Z

Participation with a research paper or a seminar presentation in workshops inside or outside the country

 2021
A hall on the ground floor at the Dept. of Chemistry
2018-04
Model System Workshop.


 2018

Training Course

2023-01-19,2023-07-19
Cutting Edge

English language

 2023
2020-10-18,2020-11-12
Certificate qualifying session in English Language proficiency Course with intermediate level (University of Zakho).


 2020
2016-02-01,2016-04-30
Certificate qualifying session in Academic Training Program (University of Zakho).


 2016
2012-10-01,2012-11-30
Certificate qualifying session in computers (ICDL) (University of Zakho).


 2012
2012-09-12,2012-11-12
Certificate of English Proficiency

Certificate qualifying session in English (University of Zakho).

 2012