Some New Algorithms For Unconstrained Optimization And Their Application To Neural Networks
2016-11-20
In this thesis, we propose eight new algorithms for unconstrained optimization: four conjugate gradient (CG) algorithms and four preconditioned conjugate gradient (PCG) algorithms with self-scaling quasi-Newton updates. The first three CG algorithms are developed from the Dai-Liao (DL) algorithm, and the fourth is based on the Polak-Ribière (PR) method. We study several properties of these algorithms and show that all of them satisfy the descent and sufficient descent conditions; the first three also satisfy the conjugacy condition, and global convergence is established for the first and third algorithms. In addition, four PCG algorithms with self-scaling quasi-Newton SR1 and DFP updates are proposed, and we show that they satisfy the quasi-Newton condition and preserve positive definiteness. All eight algorithms are applied to a set of standard nonlinear test functions, where their numerical results demonstrate their efficiency in comparison with other well-known algorithms. Finally, we apply the new algorithms to the training of neural networks, where they improve training efficiency relative to well-known algorithms.
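To make the Dai-Liao framework mentioned above concrete, the following is a minimal sketch of a CG iteration using the classical Dai-Liao choice of the parameter beta with a backtracking (Armijo) line search. This is an illustrative textbook baseline, not one of the thesis's new variants; the function name `dai_liao_cg` and the parameter values (`t`, `tol`, the Armijo constants) are assumptions made for this sketch.

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=10000):
    """Conjugate gradient sketch with the classical Dai-Liao beta:
    beta_k = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k),
    where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search along d.
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        beta = (g_new @ y - t * (g_new @ s)) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if g_new @ d >= 0:  # restart safeguard if d is not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function from the standard start point (-1.2, 1).
rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
rosen_g = lambda z: np.array([
    -2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0]**2),
    200 * (z[1] - z[0]**2),
])
x_star = dai_liao_cg(rosen, rosen_g, [-1.2, 1.0])
```

The restart safeguard (falling back to the steepest-descent direction) is a common practical device; the thesis's variants instead modify beta itself so that the descent and sufficient descent conditions hold by construction.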
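The two properties verified for the PCG algorithms, the quasi-Newton (secant) condition and positive definiteness, can be illustrated with one self-scaling DFP update of an inverse-Hessian approximation. This sketch uses the standard Oren-Luenberger scaling factor as an assumed choice; it is a generic textbook form, not the thesis's specific self-scaling variant.

```python
import numpy as np

def self_scaling_dfp(H, s, y):
    """One self-scaling DFP update of an inverse-Hessian approximation H,
    given the step s = x_{k+1} - x_k and gradient change y = g_{k+1} - g_k.
    theta = s^T y / (y^T H y) is the Oren-Luenberger scaling factor
    (an assumed choice for this sketch)."""
    Hy = H @ y
    yHy = y @ Hy
    sy = s @ y
    theta = sy / yHy
    H_dfp = H - np.outer(Hy, Hy) / yHy   # DFP curvature correction
    return theta * H_dfp + np.outer(s, s) / sy

# Usage: check the quasi-Newton condition H_new @ y == s, and that
# positive definiteness is preserved when the curvature condition
# s^T y > 0 holds (here enforced by construction of y).
rng = np.random.default_rng(0)
n = 5
H = np.eye(n)
s = rng.standard_normal(n)
y = s + 0.1 * rng.standard_normal(n)   # keeps s^T y > 0
H_new = self_scaling_dfp(H, s, y)
qn_ok = np.allclose(H_new @ y, s)
pd_ok = bool(np.all(np.linalg.eigvalsh(H_new) > 0))
```

Note that the secant condition holds for any positive scaling factor, since the scaled DFP term annihilates y exactly; positive definiteness additionally requires s^T y > 0, which line searches satisfying the Wolfe conditions guarantee.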