Optimization is the process of finding the best possible solution to a problem within a given set of constraints. It involves maximizing or minimizing an objective function subject to those constraints, and it is applied in many fields, including mathematics, engineering, economics, computer science, and data science. The objective function may be a simple equation, a complex algorithm, or a mathematical model describing a system or process.
Many optimization techniques are available, including linear programming, nonlinear programming, genetic algorithms, simulated annealing, and particle swarm optimization; each searches for an optimal solution with a different algorithm. This paper concerns unconstrained optimization, whose goal is to minimize an objective function of real variables with no restrictions on their values. In this study,
based on the modified conjugacy condition, we offer a new
conjugate gradient (CG) approach for nonlinear
unconstrained optimization problems. The new method satisfies the descent condition and the sufficient descent condition. We compare the numerical results of the new
method with the classical Hestenes-Stiefel (HS) method. Numerical results on several well-known nonlinear test functions show that the new method is effective in terms of the number of iterations (NOI) and the number of function evaluations (NOF).
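To make the baseline concrete, the following is a minimal sketch of the classical Hestenes-Stiefel conjugate gradient iteration against which the abstract's new method is compared. It is not the paper's proposed method; the Armijo backtracking line search, the restart rule, and the quadratic test function are all illustrative assumptions.

```python
import numpy as np

def hs_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Minimize f with the Hestenes-Stiefel CG method (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search along d -- an assumed choice;
        # the paper's line-search conditions may differ.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                   # gradient difference y_k
        # Hestenes-Stiefel conjugacy parameter: beta = g_{k+1}^T y_k / d_k^T y_k
        beta = (g_new @ y) / (d @ y)
        d = -g_new + beta * d           # new search direction
        if g_new @ d >= 0:              # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Example on a simple convex quadratic (assumed test function)
f = lambda x: x[0]**2 + 5 * x[1]**2
grad = lambda x: np.array([2 * x[0], 10 * x[1]])
x_star = hs_cg(f, grad, [3.0, -2.0])
```

A descent-condition check in such implementations (the `g_new @ d >= 0` restart above) is what the abstract's descent and sufficient-descent properties guarantee analytically for the proposed method.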