
Published Journal Articles

2024

A new conjugate gradient for unconstrained optimization problems and its applications in neural networks

2024-01
Indonesian Journal of Electrical Engineering and Computer Science (Issue : 1) (Volume : 33)
We introduce a novel, efficient, and effective conjugate gradient approach for large-scale unconstrained optimization problems. The primary goal is to improve the conjugate gradient method's search direction in order to propose a new, more effective method based on the modified vector v_k^*, which depends on the step size of Barzilai and Borwein. The suggested algorithm has the following traits: (i) it achieves global convergence; (ii) numerical results for large-scale functions show that the proposed algorithm is superior to other comparable optimization methods in terms of the number of iterations (NI) and the number of function evaluations (NF); and (iii) it is applied to training neural networks to improve their performance.
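
For reference, the two classical Barzilai-Borwein step sizes on which the modified vector v_k^* is built take the standard forms below, with s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1} (a sketch of the standard formulas only; the paper's vector v_k^* itself is not reproduced here):

\alpha_k^{BB1} = \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}},
\qquad
\alpha_k^{BB2} = \frac{s_{k-1}^{\top} y_{k-1}}{y_{k-1}^{\top} y_{k-1}}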

Two new classes of conjugate gradient method based on logistic mapping

2024-01
Telecommunication, Computing, Electronics and Control (Issue : 1) (Volume : 22)
Following the standard method proposed by Polak, Ribière, and Polyak (PRP), in this work we introduce two new nonlinear conjugate gradient methods for solving unconstrained optimization problems, both based on PRP. The standard PRP method performs well numerically but does not satisfy the global convergence condition. In this paper we modify two attractive and powerful parameters that achieve better performance and better numerical results than the PRP method; moreover, each of our robust methods satisfies the descent condition and the global convergence condition under the Wolfe conditions. The second method is further modified using a logistic mapping form. The main novelty lies in the numerical results, which demonstrate good performance in comparison with the standard method.
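
For context, the classical PRP coefficient and the logistic map referred to above take the following standard forms (a sketch only; the paper's two modified parameters are not reproduced here):

\beta_k^{PRP} = \frac{g_k^{\top} \left( g_k - g_{k-1} \right)}{\| g_{k-1} \|^2},
\qquad
x_{n+1} = r \, x_n \left( 1 - x_n \right), \quad r \in (0, 4]
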
2023

Optimizing Accuracy of Stroke Prediction Using Logistic Regression

2023-04
Journal of Technology and Informatics (JoTI) (Issue : 2) (Volume : 4)
An unexpected limitation of blood supply to the brain and heart causes the majority of strokes. Stroke severity can be reduced by being aware of the many stroke warning signs in advance. A stroke may result if the flow of blood to a portion of the brain stops suddenly. In this research, we present a strategy for predicting the early onset of stroke disease using the Logistic Regression (LR) algorithm. To improve the performance of the model, preprocessing techniques including SMOTE, feature selection, and outlier handling were applied to the dataset. These steps balanced the class distribution, identified and removed unimportant features, and handled outliers. Stroke risk rises with the existence of increased blood pressure, body mass, heart conditions, average blood glucose levels, smoking, prior stroke, and age. Impairment occurs as the brain's neurons gradually die, depending on which area of the brain is affected by the reduced blood supply. Early diagnosis of symptoms can be extremely helpful in predicting stroke and supporting a healthy lifestyle. Furthermore, we performed an experiment using logistic regression and compared it to a number of other studies that used the same machine learning model and the same dataset. The results showed that our method achieved the highest F1 score and area under the curve (AUC), with an accuracy of 86%, compared to the other five studies in the same field, making it a successful tool for stroke disease prediction. The predictive model for stroke has prospective applications and thus remains significant for academics and practitioners in the fields of medicine and the health sciences.
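
As an illustration of the workflow this abstract describes, here is a minimal sketch in Python, assuming scikit-learn and imbalanced-learn; the file name "stroke.csv" and the column names are hypothetical placeholders, not the paper's actual dataset schema:

# Minimal sketch: SMOTE class balancing + logistic regression, evaluated by
# F1 and AUC as in the abstract. Assumes features are already numeric.
import pandas as pd
from imblearn.over_sampling import SMOTE
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score, roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("stroke.csv")                       # hypothetical file name
X, y = df.drop(columns=["stroke"]), df["stroke"]     # hypothetical target column

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

# Oversample the minority (stroke) class on the training split only.
X_bal, y_bal = SMOTE(random_state=42).fit_resample(X_train, y_train)

model = LogisticRegression(max_iter=1000).fit(X_bal, y_bal)
print("F1 :", f1_score(y_test, model.predict(X_test)))
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))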

A New Quasi-Newton Method For Non-Linear Optimization Problems And Its Application In Artificial Neural Networks (ANN)

2023-01
International Research Journal of Modernization in Engineering Technology and Science (Issue : 1) (Volume : 5)
The Quasi-Newton (QN) method is a widely used stationary iterative method for solving unconstrained optimization problems. One particular method within the Quasi-Newton family is the Symmetric Rank-One (SR1) method. In this research, we propose a new variant of the Quasi-Newton SR1 method that utilizes the Barzilai-Borwein step size. Our analysis demonstrates that the updated matrix resulting from the proposed method is both symmetric and positive definite. Additionally, our numerical experiments show that the proposed SR1 method, when combined with the PCG method, is effective in solving unconstrained optimization problems, as evidenced by its low number of iterations and function evaluations. Furthermore, we demonstrate that our proposed SR1 method is more efficient in solving large-scale problems with a varying number of variables compared to the original method. The numerical results of applying the new SR1 method to neural network problems also reveal its effectiveness.
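
For reference, the classical SR1 update of the inverse-Hessian approximation that the proposed variant modifies has the standard form below, with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k (the Barzilai-Borwein-based modification itself is not reproduced here):

H_{k+1} = H_k + \frac{\left( s_k - H_k y_k \right) \left( s_k - H_k y_k \right)^{\top}}{\left( s_k - H_k y_k \right)^{\top} y_k}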

A New Quasi-Newton Method with PCG Method for Nonlinear Optimization Problems

2023-01
Mathematics and Statistics (Issue : 1) (Volume : 11)
The major stationary iterative method used to solve nonlinear optimization problems is the quasi-Newton (QN) method. Symmetric Rank-One (SR1) is a method in the quasi-Newton family. This algorithm converges towards the true Hessian quickly and has computational advantages for sparse or partially separable problems [1]. Thus, investigating the efficiency of the SR1 algorithm is significant. However, the matrix generated by the SR1 update is not always positive definite, and the denominator may vanish. To overcome these drawbacks of the SR1 method, and to obtain better performance than the standard SR1 method, in this work we derive a new vector y_k^* depending on the Barzilai-Borwein step size, yielding a new SR1 method; this updating formula is then combined with the preconditioned conjugate gradient (PCG) method. With the aid of an inexact line search governed by the strong Wolfe conditions, the new SR1 method is proposed and its performance evaluated in comparison to the conventional SR1 method. It is proven that the updated matrix of the new SR1 method, H_{k+1}^{new}, is symmetric and positive definite, given that H_k is initialized to the identity matrix. In this study, the proposed method solved 13 problems effectively in terms of the number of iterations (NI) and the number of function evaluations (NF), and it also outperformed the classic SR1 method with respect to NF. The proposed method is shown to be more efficient in solving relatively large-scale problems (5,000 variables) compared to the original method. From the numerical results, the proposed method turned out to be significantly faster, effective, and suitable for solving large-dimension nonlinear equations.
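
A minimal sketch in Python of the standard SR1 inverse-Hessian update with the usual safeguard against the vanishing denominator mentioned above (the skip rule and threshold are the textbook safeguard, not the paper's y_k^* modification):

import numpy as np

def sr1_update(H, s, y, r=1e-8):
    """Standard SR1 inverse-Hessian update. The update is skipped when the
    denominator (s - H y)^T y is too small -- the drawback noted above."""
    v = s - H @ y
    denom = v @ y
    if abs(denom) < r * np.linalg.norm(v) * np.linalg.norm(y):
        return H                       # skip: denominator (nearly) vanishes
    return H + np.outer(v, v) / denom
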
2022

A new three-term conjugate gradient method for training neural networks with global convergence

2022-10
International Research Journal of Modernization in Engineering Technology and Science (Issue : 1) (Volume : 5)
Conjugate gradient (CG) methods constitute excellent neural network training methods owing to their simplicity, flexibility, numerical efficiency, and low memory requirements. In this paper, we introduce a new three-term conjugate gradient method for solving optimization problems and test it on artificial neural networks (ANN) by training a feed-forward neural network. The new method satisfies the descent condition and the sufficient descent condition, and the global convergence of the new (NTTCG) method has been established. The results of numerical experiments on some well-known test functions show that our new modified method is very effective with respect to the number of function evaluations and the number of iterations; we also include numerical results for training feed-forward neural networks in comparison with other well-known methods in this field.
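
For context, a generic three-term CG search direction, of the family to which the NTTCG method belongs, follows the standard template below with y_k = g_{k+1} - g_k (the paper's specific choices of \beta_k and \theta_k are not reproduced here):

d_0 = -g_0,
\qquad
d_{k+1} = -g_{k+1} + \beta_k \, d_k + \theta_k \, y_k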

A New Version Coefficient of Three-Term Conjugate Gradient Method to Solve Unconstrained Optimization

2022-10
New Trends in Mathematical Sciences (Issue : 4) (Volume : 10)
This paper presents a new three-term conjugate gradient method for solving unconstrained optimization problems. The main aim is to upgrade the search direction of the conjugate gradient method to obtain a more effective three-term method. Our new method satisfies the descent condition, the sufficient descent condition, and the global convergence property. Furthermore, the numerical results from an implementation of our new method on some unconstrained optimization test functions show that it has better numerical performance than the standard (PRP) method in terms of the number of iterations (NOI) and the number of function evaluations (NOF).
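
As a point of reference for the PRP baseline used in the comparison, here is a minimal nonlinear CG loop in Python; the non-negative (PRP+) restart and the simple backtracking line search are common textbook simplifications, not the paper's method:

import numpy as np

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear CG with a PRP-style coefficient and a
    backtracking (Armijo) line search -- a baseline sketch only."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                 # safeguard: fall back to steepest descent
            d = -g
        alpha = 1.0                    # backtrack until the Armijo condition holds
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # PRP+ coefficient
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
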
2020

A new self-scaling variable metric (DFP) method for unconstrained optimization problems

2020-10
General Letters in Mathematics (Issue : 1) (Volume : 9)
In this study, a new self-scaling variable metric (VM) updating method for solving nonlinear unconstrained optimization problems is presented. The general strategy of the new VM-updating method is to propose a new quasi-Newton condition used to update the usual DFP Hessian approximation a number of times, in a way specified at certain iterations, together with the PCG method, to improve the performance of the Hessian approximation. We show that it produces a positive definite matrix. Experimental results indicate that the new suggested method is more efficient than the standard DFP method with respect to the number of function evaluations (NOF) and the number of iterations (NOI).
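
For reference, the standard DFP inverse-Hessian update on which the self-scaling variant is built (standard form; the paper's new quasi-Newton condition is not reproduced here):

H_{k+1} = H_k + \frac{s_k s_k^{\top}}{s_k^{\top} y_k} - \frac{H_k y_k y_k^{\top} H_k}{y_k^{\top} H_k y_k}
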
2019

A New Quasi-Newton (SR1) With PCG Method for Unconstrained Nonlinear Optimization

2019-12
International Journal of Advanced Trends in Computer Science and Engineering (Issue : 6) (Volume : 8)
The quasi-Newton (QN) equation plays a key role in contemporary nonlinear optimization. In this paper, we present a new symmetric rank-one (SR1) method that uses the preconditioned conjugate gradient (PCG) method for solving unconstrained optimization problems. The suggested method has an algorithm in which the usual SR1 Hessian approximation is updated. We show that the new quasi-Newton (SR1) method maintains the quasi-Newton condition and the positive definiteness property. Numerical experiments are reported which show that the new algorithm produces better results than the normal (SR1) method with the PCG algorithm, based on the number of iterations (NOI) and the number of function evaluations (NOF).
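
For context, a textbook preconditioned conjugate gradient (PCG) iteration for a symmetric positive definite system A x = b, sketched in Python; M_solve applies the preconditioner inverse and is a generic placeholder, not the paper's specific setup:

import numpy as np

def pcg(A, b, M_solve, x0=None, tol=1e-8, max_iter=1000):
    """Textbook PCG for SPD A; M_solve(r) returns M^{-1} r."""
    x = np.zeros_like(b) if x0 is None else np.asarray(x0, dtype=float)
    r = b - A @ x
    z = M_solve(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        z = M_solve(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p      # update the search direction
        rz = rz_new
    return x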

A new class of three-term conjugate gradient methods for solving unconstrained minimization problems

2019-12
General Letters in Mathematics (Issue : 2) (Volume : 7)
Conjugate gradient (CG) methods, which usually generate descent search directions, are beneficial for large-scale unconstrained optimization models because of their low memory requirements and simplicity. This paper studies three-term CG methods for unconstrained optimization. We modify a three-term CG method based on the formula t^* suggested by Babaie-Kafaki and Ghanbari [11], using some well-known CG formulas for unconstrained optimization. Our proposed method satisfies both the descent and the sufficient descent conditions. Furthermore, under exact line search the new proposed method reduces to the classical CG method. The numerical results from an implementation of the suggested method on some standard unconstrained optimization test functions show that it is promising and exhibits better numerical performance than the three-term (ZHS-CG) method.

MODIFIED CONJUGATE GRADIENT METHOD FOR TRAINING NEURAL NETWORKS BASED ON LOGISTIC MAPPING

2019-10
Journal of Duhok University (Issue : 1) (Volume : 22)
In this paper, we suggest a modified conjugate gradient method for training neural networks which guarantees the descent and sufficient descent conditions. The global convergence of our proposed method has been studied. Finally, the test results show that, in general, the modified method is superior and more efficient when compared to other standard conjugate gradient methods.

A New Conjugate Gradient Coefficient for Unconstrained Optimization Based On Dai-Liao

2019-03
Science Journal of University of Zakho (Issue : 1) (Volume : 7)
This paper proposes a new conjugate gradient method for unconstrained optimization based on the Dai-Liao (DL) formula; the descent condition and sufficient descent condition for our method are established. The numerical results and comparison show that the proposed algorithm is potentially efficient when compared with the (PR) method, in terms of the number of iterations (NOI) and the number of function evaluations (NOF).
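
For reference, the Dai-Liao coefficient on which the proposed method is based has the standard form below, with s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k, and parameter t > 0 (the paper's modified coefficient is not reproduced here):

\beta_k^{DL} = \frac{g_{k+1}^{\top} \left( y_k - t \, s_k \right)}{d_k^{\top} y_k}
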
2016

A New Conjugate Gradient for Unconstrained Optimization Based on Step Size of Barzilai and Borwein

2016-06
Science Journal of University of Zakho (Issue : 1) (Volume : 4)
In this paper, a new formula for the conjugate gradient coefficient is suggested for solving unconstrained optimization problems, based on the step size of Barzilai and Borwein. Our new proposed CG method satisfies the descent condition, the sufficient descent condition, and the global convergence property. Numerical comparisons with a standard conjugate gradient algorithm show that this algorithm is very effective in terms of the number of iterations and the number of function evaluations.
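
For reference, the descent and sufficient descent conditions cited throughout these abstracts take the standard forms (with c > 0 a constant independent of k):

g_k^{\top} d_k < 0,
\qquad
g_k^{\top} d_k \le -c \, \| g_k \|^2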
