Ayad Ramadhan Ali


Assistant Lecturer

Specialties

Numerical Analysis

Education

M.Sc. in Mathematics

Mathematics at University of Zakho

2022

B.Sc. in Mathematics

University of Zakho, Zakho

2015

Academic Title

Assistant Lecturer

2023-10-25

Researcher

2022-12-20

Assistant Researcher

2017-10-01

Published Journal Articles

JOURNAL OF APPLIED COMPUTER SCIENCE & MATHEMATICS (Issue : 2) (Volume : 19)
A Modified Sine–Cosine Algorithm with Improved Convergence for Solving Optimization Problems

Swarm intelligence–based metaheuristics have emerged as powerful tools for solving complex optimization problems due to their adaptability and ease of implementation. Among them, the sine–cosine algorithm (SCA) is a well-known method, but it often suffers from slow convergence and premature stagnation in local optima. To address these limitations, this study introduces a modified sine–cosine algorithm (MSCA) that incorporates an adaptive operator to achieve a better balance between global exploration and local exploitation. The proposed MSCA was extensively evaluated using 23 classical benchmark functions, categorized into unimodal, multimodal, and fixed-dimension multimodal groups. Its performance was benchmarked against several state-of-the-art algorithms and the standard SCA. Experimental results demonstrate that MSCA consistently outperforms the competitor algorithms in terms of convergence speed, accuracy, and robustness. Furthermore, statistical validation using the Wilcoxon rank-sum test and Friedman test confirms the significant superiority and scalability of MSCA across high-dimensional search spaces. Overall, the proposed MSCA offers a reliable and effective optimization framework with strong potential for addressing diverse and large-scale real-world applications.
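The abstract does not spell out the adaptive operator, but the baseline the paper modifies is the standard SCA position-update rule. A minimal NumPy sketch of that standard update (the function name `sca_step` and parameter defaults are illustrative, not taken from the paper):

```python
import numpy as np

def sca_step(X, best, t, T, a=2.0, rng=None):
    """One position update of the standard sine-cosine algorithm (SCA).

    X    : (n_agents, dim) array of current positions
    best : (dim,) best solution found so far (the destination point)
    t, T : current and maximum iteration, controlling the shrinking amplitude r1
    """
    rng = np.random.default_rng() if rng is None else rng
    r1 = a - t * (a / T)                        # linearly decreasing from a to 0
    n, d = X.shape
    r2 = rng.uniform(0.0, 2.0 * np.pi, (n, d))  # random angle
    r3 = rng.uniform(0.0, 2.0, (n, d))          # random weight on the destination
    r4 = rng.uniform(0.0, 1.0, (n, d))          # sine/cosine switch
    sin_move = X + r1 * np.sin(r2) * np.abs(r3 * best - X)
    cos_move = X + r1 * np.cos(r2) * np.abs(r3 * best - X)
    return np.where(r4 < 0.5, sin_move, cos_move)
```

As r1 decays to zero the moves shrink toward the current positions, which is exactly the exploration-to-exploitation transition the adaptive operator in MSCA aims to improve.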

 2025-11
IJISCS (International Journal of Information System and Computer Science) (Issue : 2) (Volume : 9)
A NOVEL APPROACH: THREE-GROUP EXPLORATION STRATEGY ALGORITHM FOR SOLVING OPTIMIZATION PROBLEMS

In this study, we present a novel optimization technique, the Three-Group Exploration Strategy (TGES) algorithm, inspired by the collaborative group dynamics often seen in problem-solving. We conducted extensive testing on 26 widely recognized benchmark functions, providing a rigorous comparison between TGES and several well-established optimization algorithms. These results highlight TGES’s effectiveness in finding optimal solutions with high reliability and accuracy. Furthermore, the practical applicability of TGES is demonstrated by successfully solving six real-world engineering problems, showcasing its adaptability and robustness. The experimental results indicate that TGES not only exhibits superior optimization performance but also achieves faster convergence and higher solution quality compared to several leading algorithms. These findings establish the TGES algorithm as a strong and adaptable tool for solving a variety of engineering optimization problems.

 2025-08
Journal of Duhok University (Issue : 2) (Volume : 25)
HYBRIDIZATION GRADIENT BASED METHODS WITH GENETIC ALGORITHM FOR SOLVING SYSTEMS OF LINEAR EQUATIONS

In this paper, we propose two hybrid methods that combine gradient-based methods with a genetic algorithm for solving systems of linear equations with fast convergence. The first proposed hybrid method uses the steepest descent method and the second the Cauchy-Barzilai-Borwein (CBB) method. These algorithms are based on minimizing the residual of the solution by means of genetic operators. They are compared with the standard genetic algorithm and standard gradient-based methods to demonstrate their accuracy and convergence speed. Since the conjugate gradient method is recommended for solving large, sparse, symmetric positive definite systems, we also compare the numerical results of our proposed algorithms with this method. The numerical results demonstrate the robustness and efficiency of the proposed algorithms. Moreover, we observe that our hybridization of the CBB method with the genetic algorithm gives more accurate results with faster convergence than the other mentioned methods in all given cases.
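The paper’s exact hybridization scheme is not reproduced here; as a hedged illustration of the general idea, the sketch below couples a toy genetic algorithm (fitness = residual norm ||Ax - b||) with one exact-line-search steepest-descent refinement per offspring, a Lamarckian-style hybrid. All names and parameter values are illustrative assumptions:

```python
import numpy as np

def sd_step(A, b, x):
    """One exact-line-search steepest-descent step toward the solution of Ax = b,
    assuming A is symmetric positive definite."""
    r = b - A @ x
    denom = r @ (A @ r)
    return x if denom == 0 else x + (r @ r) / denom * r

def hybrid_ga_sd(A, b, pop_size=20, gens=200, rng=None):
    """Toy hybrid: a GA over candidate solutions whose fitness is the residual
    norm, with each offspring refined by one steepest-descent step."""
    rng = np.random.default_rng(1) if rng is None else rng
    n = len(b)
    pop = rng.standard_normal((pop_size, n))
    for _ in range(gens):
        fitness = np.linalg.norm(pop @ A.T - b, axis=1)
        order = np.argsort(fitness)
        parents = pop[order[: pop_size // 2]]       # truncation selection
        i = rng.integers(0, len(parents), pop_size)
        j = rng.integers(0, len(parents), pop_size)
        mask = rng.random((pop_size, n)) < 0.5      # uniform crossover
        children = np.where(mask, parents[i], parents[j])
        children += 0.01 * rng.standard_normal((pop_size, n))  # mutation
        pop = np.array([sd_step(A, b, c) for c in children])   # gradient refinement
        pop[0] = parents[0]                          # elitism: keep the best
    fitness = np.linalg.norm(pop @ A.T - b, axis=1)
    return pop[np.argmin(fitness)]
```

With elitism and one refinement step per offspring, the GA supplies global search while the gradient step supplies fast local convergence, which is the division of labor the abstract describes.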

 2022-11
General Letters in Mathematics (GLM) (Issue : 2) (Volume : 12)
New search direction of steepest descent method for solving large linear systems

The steepest descent (SD) method is well known as the simplest method in optimization. In this paper, we propose a new SD search direction for solving systems of linear equations Ax = b. We also prove that the proposed SD method with exact line search satisfies the descent condition and possesses global convergence properties. This proposed method is motivated by previous work on the SD method by Zubai’ah-Mustafa-Rivaie-Ismail (ZMRI) [2]. Numerical comparisons with the classical SD algorithm and the ZMRI algorithm show that this algorithm is very effective in terms of the number of iterations (NOI) and CPU time.
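For context, the classical SD baseline used in the comparison can be written in a few lines for a symmetric positive definite A; the paper’s new search direction and the ZMRI direction are not reproduced here. A minimal sketch (function name and defaults are illustrative):

```python
import numpy as np

def steepest_descent(A, b, x0=None, tol=1e-8, max_iter=10_000):
    """Classical steepest descent for Ax = b with exact line search,
    assuming A is symmetric positive definite. Returns the solution and
    the number of iterations (NOI), the metric used in the comparison."""
    x = np.zeros_like(b, dtype=float) if x0 is None else x0.astype(float)
    for k in range(max_iter):
        r = b - A @ x                     # residual = negative gradient of 1/2 x'Ax - b'x
        if np.linalg.norm(r) < tol:
            return x, k
        alpha = (r @ r) / (r @ (A @ r))   # exact line search step length
        x = x + alpha * r
    return x, max_iter
```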

 2022-08

Thesis

2022
New Gradient Optimization and Genetic Algorithms Hybridized for Fast Convergence

Hybrid Between New Gradient Methods and Genetic Algorithm for solving Linear Optimization Problems

 2025

Training Course

2023-01-01 – 2023-06-28
Pedagogy Training Course

Pedagogy Training Course

 2023
2021-09-05 – 2021-10-27
Pre-Intermediate

Training Course on English

 2021
2020-10-29 – 2020-12-22
Elementary

Training Course on English

 2020