K. Karthikeyan, Associate Professor, Mathematics Division, VIT University, Vellore, Tamil Nadu, India
Numerical optimization algorithms provide the most effective methods in continuous optimization. They respond to the growing interest in optimization in engineering, science, and business by focusing on the methods best suited to practical problems. In this article, we propose alternative iterative algorithms, with different orders of convergence, for the minimization of nonlinear functions. A comparative study among the proposed algorithms and Newton's algorithm is then carried out by means of examples.
Keywords
Nonlinear functions, Newton's method, Ostrowski's method, eighth-order convergence
INTRODUCTION
An optimization problem consists of maximizing or minimizing a real function by systematically choosing input values from an allowed set and computing the value of the function. The generalization of optimization theory and techniques to other formulations comprises a large area of applied mathematics. More generally, optimization includes finding "best available" values of some objective function over a given domain, for a variety of types of objective functions and domains. Many optimization problems, with or without constraints, arise in fields such as science, engineering, economics, and the management sciences, wherever numerical information is processed. In recent times, many problems in business and engineering design have been modeled as optimization problems in order to make optimal decisions. Indeed, numerical optimization techniques have made deep inroads into almost all branches of engineering and mathematics.

Several methods [8, 10, 16, 20, 21] are available for solving unconstrained minimization problems. These methods can be classified into two categories: non-gradient and gradient methods. Non-gradient methods require only objective function values, not derivatives, to find a minimum. Gradient methods require, in addition to function values, the first and in some cases the second derivatives of the objective function. Since they use more information about the function being minimized, gradient methods are generally more efficient than non-gradient methods. All unconstrained minimization methods are iterative in nature: they start from an initial trial solution and proceed towards the minimum point sequentially.

Many such methods exist for the unconstrained nonlinear minimization problems arising in diverse fields of engineering and technology. For instance, multi-step nonlinear conjugate gradient methods [3], a scaled nonlinear conjugate gradient algorithm [1], and the ABS-MPVT algorithm [12] are used for solving unconstrained optimization problems. Newton's method [13] is used for various classes of optimization problems, such as unconstrained minimization problems and equality-constrained minimization problems. A proximal bundle method with inexact data [17] is used for minimizing unconstrained nonsmooth convex functions. An implicit and adaptive inverse preconditioned gradient method [2] is used for solving nonlinear minimization problems. A new algorithm [6] is used for solving unconstrained optimization problems posed as sum-of-squares minimization. A derivative-based algorithm [9] is used for a particular class of mixed-variable optimization problems. A globally convergent derivative-free descent method [14] is used for nonlinear complementarity problems.
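As a concrete illustration of a gradient-type iteration, the following is a minimal sketch of the classical Newton iteration for univariate unconstrained minimization, which solves the stationarity condition f'(x) = 0 via x_{k+1} = x_k - f'(x_k)/f''(x_k). The function name, arguments, and tolerances are illustrative choices, not taken from this paper.

```python
def newton_minimize(df, d2f, x0, tol=1e-10, max_iter=100):
    """Minimize a smooth univariate function given its first and second
    derivatives df and d2f, starting from x0.

    Illustrative sketch only: solves the stationarity condition
    f'(x) = 0 by Newton's iteration x <- x - f'(x)/f''(x).
    """
    x = x0
    for k in range(max_iter):
        g = df(x)
        if abs(g) < tol:            # stationary point reached
            return x, k
        x = x - g / d2f(x)          # Newton step on f'(x) = 0
    return x, max_iter
```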
Vinay Kanwar et al. [18] introduced new algorithms, called the external touch technique and the orthogonal intersection technique, for solving nonlinear equations. A. M. Ostrowski [5] introduced a fourth-order convergent iteration scheme for solving nonlinear equations. Sharma and Guha [7] introduced a family of modified Ostrowski's methods with accelerated sixth-order convergence. Chun and Ham [11] proposed some sixth-order variants of Ostrowski's root-finding method. Kou et al. [15] introduced some variants of Ostrowski's method with seventh-order convergence. Grau et al. [4] proposed an improvement to Ostrowski's root-finding method. Miquel Grau-Sanchez [19] proposed improvements to the efficiency of some three-step iterative Newton-like methods. Recently, Jisheng Kou and Xiuhua Wang [20] introduced some improvements of Ostrowski's method with eighth-order convergence. In this article, we introduce alternative algorithms for the minimization of nonlinear functions, and a comparative study is established among the seven new algorithms and Newton's algorithm by means of examples.
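To fix ideas, the sketch below shows how Ostrowski's classical fourth-order root-finding scheme can be applied to the stationarity condition f'(x) = 0 to minimize a smooth univariate function. This is only an illustration under that assumption; the seven algorithms proposed in this paper are not reproduced here, and the helper names mirror those of the Newton sketch above.

```python
def ostrowski_minimize(df, d2f, x0, tol=1e-10, max_iter=100):
    """Ostrowski's fourth-order iteration applied to g(x) = f'(x) = 0,
    with g = df and g' = d2f.

    Classical scheme:
        y     = x - g(x)/g'(x)
        x_new = y - [g(x) / (g(x) - 2 g(y))] * g(y)/g'(x)
    """
    x = x0
    for k in range(max_iter):
        gx = df(x)
        if abs(gx) < tol:           # stationary point reached
            return x, k
        dgx = d2f(x)
        y = x - gx / dgx            # Newton predictor
        gy = df(y)
        x = y - gy * gx / ((gx - 2.0 * gy) * dgx)   # Ostrowski corrector
    return x, max_iter
```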
NEW ALGORITHMS
NUMERICAL ILLUSTRATIONS
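By way of a usage example, a small driver of the kind below can compare the iteration counts of the two sketches given earlier. The test function is a made-up, strictly convex example, chosen only because its minimizer is easy to bracket; it is not one of this paper's numerical illustrations, and none of the paper's reported results are reproduced.

```python
import math

# Made-up strictly convex test problem (not from the paper):
# f(x) = x^2 + e^x, whose unique minimizer solves 2x + e^x = 0.
f   = lambda x: x**2 + math.exp(x)
df  = lambda x: 2.0 * x + math.exp(x)     # f'(x)
d2f = lambda x: 2.0 + math.exp(x)         # f''(x) > 0 everywhere

xn, kn = newton_minimize(df, d2f, x0=1.0)
xo, ko = ostrowski_minimize(df, d2f, x0=1.0)
print(f"Newton:    x* = {xn:.12f} in {kn} iterations")
print(f"Ostrowski: x* = {xo:.12f} in {ko} iterations")
```

On smooth convex problems of this kind, the higher-order corrector typically reaches the same tolerance in fewer iterations than the plain Newton step, at the cost of one extra derivative evaluation per iteration.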
CONCLUSION
In this paper, we introduced seven alternative numerical algorithms for the minimization of nonlinear unconstrained optimization problems and compared them with Newton's method. It is clear from the above numerical results that the rates of convergence of algorithm (1) to algorithm (7) are in general faster than that of Newton's algorithm. In particular, algorithm (5) and algorithm (4) converge much faster than the remaining algorithms. In real-life problems, the variables cannot be chosen arbitrarily; rather, they have to satisfy certain specified conditions called constraints. Such problems are known as constrained optimization problems. In the near future, we plan to extend the proposed new algorithms to constrained optimization problems.
References |
|