حسين عجيل خطاب


Assistant Lecturer

Specializations

Optimization

Education

Ph.D. in Optimization

University of Zakho

2024

M.Sc. in Numerical Optimization

Mathematics, University of Zakho

2014

B.Sc. in Mathematics

Mathematics, University of Zakho

2011

Memberships


2025

2025-01-05 – present
Member of the Quality Assurance Committee in the Department of Mathematics

Department Coordinator

2024

2024-09-01 – 2025-01-04
Member of the Quality Assurance Committee in the Department of Mathematics

Ranking

2019

2019-04-16 – 2019-05-15
Member of the Examination Committee in the Department of Mathematics

Examination Committee

2016

2016-05-02 – 2016-09-15
Member of the Examination Committee in the Department of Mathematics

Examination Committee

Academic Title

Assistant Lecturer

2015-06-15

Publications

European Journal of Pure and Applied Mathematics (Vol. 18, Issue 3)
An Enhanced Conjugate Gradient Method for Nonlinear Minimization Problems

Because of their computational efficiency and minimal memory requirements, conjugate gradient techniques are a fundamental family of algorithms for handling large-scale unconstrained nonlinear optimization problems. A new version of the Hestenes-Stiefel (HS) technique is presented in this study with the goal of improving convergence properties without compromising ease of use. We rigorously prove the global convergence of the proposed approach under standard assumptions and show that it satisfies the conjugacy, descent, and sufficient descent conditions. Numerous numerical tests, covering a wide range of benchmark problems, show that the suggested method routinely performs better than the traditional HS approach in terms of function evaluations and iteration count.
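
For background, the classical Hestenes-Stiefel update that this study modifies can be sketched in a few lines. This is a minimal NumPy sketch of the standard method only, not the enhanced variant proposed in the paper; the Armijo backtracking line search and the restart safeguard are illustrative choices, not part of the paper.

```python
import numpy as np

def hs_conjugate_gradient(f, grad, x0, tol=1e-8, max_iter=500):
    """Classical Hestenes-Stiefel nonlinear CG:
    beta_k = g_{k+1}^T y_k / (d_k^T y_k), with y_k = g_{k+1} - g_k."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = 1.0                          # Armijo backtracking line search
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                        # gradient difference y_k
        denom = d @ y
        beta = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0  # HS parameter
        d = -g_new + beta * d
        if d @ g_new >= 0:                   # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# usage: minimize the strictly convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = hs_conjugate_gradient(f, grad, [0.0, 0.0])
```

For general nonlinear problems a (strong) Wolfe line search, as used in much of the CG literature, is normally preferred over plain backtracking.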

 2025-08
Journal of Intelligent & Fuzzy Systems (Vol. 46, Issue 2)
Two new limited-memory preconditioned conjugate gradient algorithms for nonlinear optimization problems

The conjugate gradient (CG) techniques are a class of unconstrained optimization algorithms with strong local and global convergence qualities and minimal memory needs. Quasi-Newton methods are reliable and efficient on a wide range of problems; they converge faster than conjugate gradient methods and require fewer function evaluations, but they require substantially more storage, and if the problem is ill-conditioned, they may need many iterations. There is another class, termed preconditioned conjugate gradient methods, which combines the conjugate gradient and quasi-Newton approaches. In this work, we propose two new limited-memory preconditioned conjugate gradient methods (New1 and New2) for solving nonlinear unconstrained minimization problems, using a new modified symmetric rank-one (NMSR1) update and a new modified Davidon-Fletcher-Powell (NMDFP) update, together with projected vectors. We prove that these modifications fulfill certain conditions, and the descent condition of the new technique is also proved. The numerical results show the efficiency of the proposed new algorithms on a set of standard nonlinear unconstrained test problems.

 2024-02
Mathematics and Statistics (Vol. 11, Issue 1)
Two New Preconditioned Conjugate Gradient Methods for Minimization Problems

In application to general functions, each of the conjugate gradient and quasi-Newton methods has particular advantages and disadvantages. Conjugate gradient (CG) techniques are a class of unconstrained optimization algorithms with strong local and global convergence qualities and minimal memory needs. Quasi-Newton methods are reliable and efficient on a wide range of problems; they converge faster than the conjugate gradient method and require fewer function evaluations, but they have the disadvantage of requiring substantially more storage, and if the problem is ill-conditioned, they may take several iterations. A new class has been developed, termed the preconditioned conjugate gradient (PCG) method, which combines the conjugate gradient and quasi-Newton methods. In this work, two new preconditioned conjugate gradient algorithms, New PCG1 and New PCG2, are proposed to solve nonlinear unconstrained optimization problems. New PCG1 combines the Hestenes-Stiefel (HS) conjugate gradient method with a new self-scaling symmetric rank-one (SR1) update, and New PCG2 combines the Hestenes-Stiefel (HS) method with a new self-scaling Davidon-Fletcher-Powell (DFP) update. The algorithms use the strong Wolfe line search condition. Numerical comparisons show that the new algorithms outperform standard preconditioned conjugate gradient algorithms.
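
The preconditioning idea itself is easiest to see in the linear setting: in the classical preconditioned CG for A x = b, an approximation to the inverse of A reshapes the search directions, which is the role the quasi-Newton matrix plays in the nonlinear PCG methods described in this paper. The sketch below is the textbook linear algorithm with a Jacobi (diagonal) preconditioner, not the paper's New PCG1/PCG2.

```python
import numpy as np

def preconditioned_cg(A, b, M_inv, tol=1e-10, max_iter=100):
    """Classical preconditioned CG for A x = b with A symmetric positive definite.
    M_inv approximates A^{-1} and plays the role of the quasi-Newton matrix
    in the nonlinear PCG setting."""
    x = np.zeros_like(b)
    r = b - A @ x                 # residual (negative gradient of the quadratic)
    z = M_inv @ r                 # preconditioned residual
    d = z.copy()
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ad = A @ d
        alpha = (r @ z) / (d @ Ad)        # exact line search step for the quadratic
        x = x + alpha * d
        r_new = r - alpha * Ad
        z_new = M_inv @ r_new
        beta = (r_new @ z_new) / (r @ z)  # preconditioned Fletcher-Reeves-type beta
        d = z_new + beta * d
        r, z = r_new, z_new
    return x

# usage: Jacobi preconditioner built from the diagonal of A
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
M_inv = np.diag(1.0 / np.diag(A))
x = preconditioned_cg(A, b, M_inv)
```

A good preconditioner clusters the eigenvalues of M_inv @ A, which is exactly why blending quasi-Newton curvature information into CG can reduce iteration counts.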

 2023-01
Science Journal of University of Zakho (Vol. 10, Issue 4)
A New Modified Conjugate Gradient for Nonlinear Minimization Problems

The conjugate gradient method is a highly effective technique for solving unconstrained nonlinear minimization problems and is one of the most well-known methods, with many applications. Conjugate gradient techniques are widely applied to large-scale unconstrained minimization problems. In this paper, we suggest a new conjugate gradient parameter for solving nonlinear unconstrained minimization problems, based on the parameter of Dai and Liao. We study the descent property, the sufficient descent property, and the global convergence of the new method, and we present numerical data to demonstrate its efficacy.
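
The standard Dai-Liao parameter that this work builds on has the form beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k), with t >= 0, s_k = x_{k+1} - x_k, and y_k = g_{k+1} - g_k. A minimal sketch of that standard formula follows; the paper's modified parameter is not reproduced, and the value of t and the test vectors are illustrative.

```python
import numpy as np

def dai_liao_beta(g_new, s, y, d, t=0.1):
    """Standard Dai-Liao conjugate gradient parameter:
    beta = g_{k+1}^T (y_k - t * s_k) / (d_k^T y_k),
    where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k."""
    return (g_new @ (y - t * s)) / (d @ y)

# usage with small hand-picked vectors (illustrative only)
g_new = np.array([1.0, 1.0])   # gradient at the new iterate
s = np.array([1.0, 0.0])       # step s_k = x_{k+1} - x_k
y = np.array([-1.0, 1.0])      # gradient difference y_k
d = np.array([-2.0, 1.0])      # previous search direction
beta = dai_liao_beta(g_new, s, y, d, t=0.5)
```

Setting t = 0 recovers the Hestenes-Stiefel parameter, which is why the Dai-Liao family is a natural starting point for new CG parameters.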

 2022-10
General Letters in Mathematics (GLM) (Vol. 9, Issue 2)
A New conjugate gradient method for unconstrained optimization problems with descent property

In this paper, we propose a new conjugate gradient method for solving nonlinear unconstrained optimization problems. The new method consists of three parts, the first of which is the Hestenes-Stiefel (HS) parameter. The proposed method satisfies the descent condition, the sufficient descent condition, and the conjugacy condition. We give some numerical results to show the efficiency of the suggested method.

 2021-05
General Letters in Mathematics (Vol. 7, Issue 1)
A New Parameter Conjugate Gradient Method Based on Three Terms Unconstrained Optimization

In this paper, we suggest a new conjugate gradient method for solving nonlinear unconstrained optimization problems using a three-term conjugate gradient method. We establish the descent condition and the sufficient descent condition of the suggested method.
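
Three-term methods commonly take the direction d_{k+1} = -g_{k+1} + beta_k d_k + theta_k y_k. The sketch below uses the Hestenes-Stiefel beta and a fixed illustrative theta; the paper's specific parameter choices are not reproduced.

```python
import numpy as np

def three_term_direction(g_new, g, d, theta=0.1):
    """Generic three-term CG search direction:
    d_{k+1} = -g_{k+1} + beta_k d_k + theta_k y_k,
    here with the Hestenes-Stiefel beta and a fixed illustrative theta."""
    y = g_new - g                      # gradient difference y_k
    beta = (g_new @ y) / (d @ y)       # Hestenes-Stiefel parameter
    return -g_new + beta * d + theta * y

# usage with small hand-picked vectors (illustrative only)
g = np.array([2.0, 0.0])
d = np.array([-2.0, 1.0])
g_new = np.array([1.0, 1.0])
d_new = three_term_direction(g_new, g, d, theta=0.1)
```

The extra y_k term is typically chosen so that the sufficient descent condition holds independently of the line search, which is the usual motivation for three-term schemes.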

 2019-11
Journal of University of Zakho (Vol. 4, Issue 1)
New conjugate gradient method for unconstrained optimization with logistic mapping

In this paper, we suggest a new conjugate gradient algorithm for unconstrained optimization based on logistic mapping; the descent condition and the sufficient descent condition for our method are established. Numerical results show that the presented algorithm is more efficient for solving nonlinear unconstrained optimization problems compared with the Dai-Yuan (DY) method.

 2016-06
IOSR Journal of Mathematics (IOSR-JM) (Vol. 10, Issue 3)
New Iterative Conjugate Gradient Method for Nonlinear Unconstrained Optimization Using Homotopy Technique

A new hybrid conjugate gradient method for unconstrained optimization is obtained by using a homotopy formula; we compute the parameter as a convex combination of PR (Polak-Ribiere) [9] and BA (Al-Bayati and Al-Assady) [1].

 2014-06
IOSR Journal of Engineering (IOSRJEN) (Vol. 04, Issue 05)
New Homotopy Conjugate Gradient for Unconstrained Optimization using Hestenes-Stiefel and Conjugate Descent

In this paper, we suggest a hybrid conjugate gradient method for unconstrained optimization by using a homotopy formula. We calculate the parameter β as a convex combination of HS (Hestenes-Stiefel) [5] and CD (Conjugate Descent) [3].
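
The convex-combination construction can be sketched directly from the standard formulas beta_HS = g_{k+1}^T y_k / (d_k^T y_k) and beta_CD = ||g_{k+1}||^2 / (-d_k^T g_k). The homotopy formula the paper uses to choose theta is not reproduced here; theta in [0, 1] is a free input, and the test vectors are illustrative.

```python
import numpy as np

def hybrid_beta(g_new, g, d, theta):
    """Hybrid CG parameter as a convex combination:
    beta = (1 - theta) * beta_HS + theta * beta_CD, with
    beta_HS = g_{k+1}^T y_k / (d_k^T y_k)       (Hestenes-Stiefel)
    beta_CD = ||g_{k+1}||^2 / (-d_k^T g_k)      (Fletcher's Conjugate Descent)."""
    y = g_new - g
    beta_hs = (g_new @ y) / (d @ y)
    beta_cd = (g_new @ g_new) / (-(d @ g))
    return (1.0 - theta) * beta_hs + theta * beta_cd

# usage with small hand-picked vectors (illustrative only)
g = np.array([2.0, 0.0])       # previous gradient
d = np.array([-2.0, 1.0])      # previous search direction
g_new = np.array([1.0, 1.0])   # new gradient
beta = hybrid_beta(g_new, g, d, theta=0.5)
```

At theta = 0 the scheme reduces to pure HS and at theta = 1 to pure CD, so a homotopy in theta interpolates continuously between the two methods.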

 2014-05

Theses

2014-12-29
Homotopy and Homotopy Perturbation Methods for Unconstrained Optimization

M.Sc. Thesis

 2014

Training Courses

2021-11-13 – 2021-12-29
Intermediate

Training Course on English

 2021
2021-09-05 – 2021-10-27
Pre-Intermediate

Training Course on English

 2021
2020-10-29 – 2020-12-22
Elementary

Training Course on English

 2020
2015-04-19 – 2015-05-28
Training Course on Teaching Methods

Teaching Methods

 2015
2012-09-01 – 2012-11-01
Training Course on English Proficiency

Three months course in English Proficiency

 2012
2012-06-01 – 2012-08-01
Training Course on Computer Proficiency

Training course on Computer Proficiency

 2012