Two new efficient sixth order iterative methods for solving nonlinear equations
*Corresponding author: Obadah Said Solaiman, obadah_sy@yahoo.com
This article was originally published by Elsevier and was migrated to Scientific Scholar after the change of Publisher.
Peer review under responsibility of King Saud University.
Abstract
In this paper, we present two new iterative methods, one of them free from the second derivative, for solving nonlinear equations. We derive these methods based on the Taylor series expansion and Halley's method. The convergence analysis of the two methods is discussed, and it is established that both methods have sixth order of convergence. Several numerical examples show that the new methods are comparable with well-known existing methods of the same order.
Keywords
41-xx
65-xx
Root finding method
Halley’s method
Nonlinear equations
Iterative methods
Order of convergence

1 Introduction
One of the most important questions in mathematics is how to find a solution of the nonlinear equation f(x) = 0. The most famous method, attributed to Newton, can be derived from the first two terms of the Taylor series of f about the current iterate, which gives the iteration x_{n+1} = x_n - f(x_n)/f'(x_n).
A large number of researchers have tried to improve Newton's method in order to obtain higher accuracy and a higher order of convergence; see, for example, Abbasbandy (2003), Chun (2006), He (2003), Madhu and Jayaraman (2016), and the references therein.
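As a concrete reference point, Newton's iteration can be sketched in a few lines; this is a minimal illustration only (the helper name `newton` and the test equation x^3 - 2 = 0 are our own choices, not taken from the paper):

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton's method: x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    x = x0
    for n in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x, n + 1  # approximate root and iterations used
    return x, max_iter

# Example: solve x^3 - 2 = 0 starting from x0 = 1.5
root, iters = newton(lambda x: x**3 - 2, lambda x: 3 * x**2, 1.5)
print(root)  # close to 2**(1/3) = 1.2599210...
```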
If we stop at the third term of the Taylor series, we obtain f(x_n) + f'(x_n)(x - x_n) + (1/2) f''(x_n)(x - x_n)^2 = 0. Replacing (x - x_n) in the quadratic term by the Newton step -f(x_n)/f'(x_n) and solving for x yields Halley's method, x_{n+1} = x_n - 2 f(x_n) f'(x_n) / (2 f'(x_n)^2 - f(x_n) f''(x_n)), which has third order of convergence.
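The classical Halley iteration admits an equally short sketch (again a minimal illustration; the test function is our own choice):

```python
def halley(f, df, d2f, x0, tol=1e-12, max_iter=50):
    """Halley's method: x_{n+1} = x_n - 2 f f' / (2 f'^2 - f f'')."""
    x = x0
    for n in range(max_iter):
        fx, dfx, d2fx = f(x), df(x), d2f(x)
        step = 2 * fx * dfx / (2 * dfx**2 - fx * d2fx)
        x -= step
        if abs(step) < tol:
            return x, n + 1
    return x, max_iter

# Same test equation as before: x^3 - 2 = 0
root, iters = halley(lambda x: x**3 - 2, lambda x: 3 * x**2,
                     lambda x: 6 * x, 1.5)
```

Compared with Newton's method, each iteration also requires the second derivative, which is the cost the derivative-free modifications discussed below try to avoid.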
Recently, Kogan et al. (2017) proved that among all one-point iterative methods without memory of order p, methods of order are the most efficient methods.
Many researchers have tried to modify Halley's method, either by increasing the order of convergence or by reducing the number of function evaluations per iteration, that is, by writing the new schemes without the second derivative, since the second derivative of a function is not always easy to find. Noor and Noor (2007) implemented a two-step Halley method that uses Newton's method as a predictor and Halley's method as a corrector, and proved that this two-step method has sixth order of convergence. Noor et al. (2007) modified the method of Noor and Noor (2007) using a finite-difference scheme, established their algorithm without use of the second derivative, and showed that the resulting scheme is of fifth order of convergence. Hafiz and Al-Goria (2012) established two new methods, of orders seven and nine, based on a weighted combination of the midpoint and Simpson quadrature formulas together with the predictor-corrector technique. Very recently, Kumar et al. (2017) developed a parameter-based family of sixth-order iterations free from the second derivative for solving nonlinear equations.
Commonly in the literature, the efficiency index is used to compare different iterative methods. This index is defined as p^{1/m}, where p is the order of convergence and m is the total number of functional evaluations per iteration.
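The efficiency indices compared in this paper can be computed directly from the definition p^{1/m}; the (p, m) pairs below are those stated in the text for each method:

```python
# Efficiency index E = p**(1/m): p = order of convergence,
# m = functional evaluations per iteration.
methods = {
    "Newton (p=2, m=2)": 2 ** (1 / 2),
    "Halley (p=3, m=3)": 3 ** (1 / 3),
    "MH1 (p=6, m=5)":    6 ** (1 / 5),
    "MH2 (p=6, m=4)":    6 ** (1 / 4),
}
for name, e in methods.items():
    print(f"{name}: {e:.3f}")
# Halley ~ 1.442, MH1 ~ 1.431, MH2 ~ 1.565
```

This reproduces the ordering discussed later: MH2 improves on both MH1 and Halley's method, while MH1 alone is slightly less efficient than Halley's method.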
In this paper, we present two new iterative methods, one of them free from the second derivative. We derive these methods based on the Taylor series expansion of the function and the use of Halley's method within the expansion. In order to reduce the number of functional evaluations at each iteration, we approximate the second derivative in the first scheme and construct a new scheme of the same order with a higher efficiency index. Both techniques are of sixth order of convergence.
The rest of the paper is divided as follows. The new methods are described in Section 2. In Section 3, the convergence analysis is carried out to establish the order of convergence. In Section 4, the methods are tested on some numerical examples, and comparisons of the results of our methods with other well-known methods and methods of the same order are summarized in tables. Finally, the conclusion of the paper is given in Section 5.
2 The new methods
2.1 Method 1
Consider a nonlinear equation of the type f(x) = 0, where f is a real, sufficiently differentiable function defined on a real interval I.
For simplicity, assume that α is a simple zero of f, that is, f(α) = 0 and f'(α) ≠ 0, and assume that x_0 is an initial guess sufficiently close to α. We obtain from the Taylor series expansion of the function f that
For a given initial guess x_0, compute the approximate solution x_{n+1} by the following iterative scheme:
Algorithm 2.1 is called the modified Halley method (MH1) and has sixth order of convergence. At each iteration, (MH1) needs two evaluations of the function, two evaluations of the first derivative, and one evaluation of the second derivative. Therefore, the efficiency index of this modified method is 6^{1/5} ≈ 1.431, which is worse than the 3^{1/3} ≈ 1.442 of Halley's method.
2.2 Method 2
Now, in order to improve the efficiency index of Algorithm 2.1, we approximate the second derivative so as to reduce the number of functional evaluations needed in each iteration. The method thus obtained is free from the second derivative. Towards this, we approximate the second derivative using a combination of data already known from the previous steps.
Consider an interpolating function whose coefficients a, b, c, and d are unknowns to be found, with the following interpolation conditions:
A system of four linear equations in four unknowns is obtained from the above conditions. Solving this system gives
For a given initial guess x_0, compute the approximate solution x_{n+1} by the following iterative scheme:
We call Algorithm 2.2 the modified Halley method (MH2); it has sixth order of convergence. At each iteration, (MH2) needs two evaluations of the function and two evaluations of the first derivative. Therefore, the efficiency index of this modified method is 6^{1/4} ≈ 1.565, which is better than the 6^{1/5} ≈ 1.431 of (MH1) and the 3^{1/3} ≈ 1.442 of Halley's method.
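The paper's exact interpolation conditions are not reproduced above, but the general idea, determining four coefficients from four conditions and differentiating the resulting model to estimate the second derivative, can be illustrated with Hermite conditions at two points. This is a sketch under our own assumptions (a cubic model and conditions on f and f' at two points x and y); the paper's conditions may differ:

```python
import numpy as np

def second_derivative_estimate(f, df, x, y):
    """Fit a cubic m(t) = a + b*t + c*t^2 + d*t^3 to the Hermite conditions
    m(x)=f(x), m'(x)=f'(x), m(y)=f(y), m'(y)=f'(y), then return m''(y)
    as an estimate of f''(y). Illustrative only."""
    A = np.array([
        [1.0, x, x**2,   x**3],   # m(x)  = f(x)
        [0.0, 1.0, 2*x, 3*x**2],  # m'(x) = f'(x)
        [1.0, y, y**2,   y**3],   # m(y)  = f(y)
        [0.0, 1.0, 2*y, 3*y**2],  # m'(y) = f'(y)
    ])
    rhs = np.array([f(x), df(x), f(y), df(y)])
    a, b, c, d = np.linalg.solve(A, rhs)
    return 2 * c + 6 * d * y      # m''(y)

# Check on f(t) = t^3 - 2, whose true second derivative is 6t; since f is
# itself a cubic, the fitted model is exact here.
est = second_derivative_estimate(lambda t: t**3 - 2,
                                 lambda t: 3 * t**2, 1.0, 1.2)
print(est)  # approximately 7.2 (= 6 * 1.2)
```

Any scheme of this shape replaces the f'' evaluation by quantities already computed in the predictor step, which is what raises the efficiency index from 6^{1/5} to 6^{1/4}.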
In a special case, Algorithm 2.1 reduces to the two-step scheme y_n = x_n - f(x_n)/f'(x_n), x_{n+1} = y_n - f(y_n)/f'(y_n), which is the two-step Newton method considered by Traub (1964). This technique is of fourth order of convergence, which shows that (MH1) is a generalization of the two-step Newton method.
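Traub's two-step Newton scheme can be sketched directly (the test equation is again our own illustrative choice):

```python
def two_step_newton(f, df, x0, tol=1e-12, max_iter=50):
    """Traub's two-step Newton method (fourth order):
       y_n     = x_n - f(x_n)/f'(x_n)
       x_{n+1} = y_n - f(y_n)/f'(y_n)"""
    x = x0
    for n in range(max_iter):
        y = x - f(x) / df(x)
        x_new = y - f(y) / df(y)
        if abs(x_new - x) < tol:
            return x_new, n + 1
        x = x_new
    return x, max_iter

root, iters = two_step_newton(lambda x: x**3 - 2, lambda x: 3 * x**2, 1.5)
```

Each iteration costs two function and two derivative evaluations, the same budget as (MH2), which is why the sixth-order methods of this paper improve on it in efficiency.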
3 Convergence analysis
Now, we will discuss the convergence analysis of Algorithms 2.1 and 2.2.
Let α be a simple zero of a sufficiently differentiable function f in an open interval I. If x_0 is sufficiently close to α, then the method defined by Algorithm 2.1 is of sixth order of convergence.
Let α be a root of f, and let e_n = x_n - α be the error at the nth iteration. By using Taylor's series about α we have
Let α be a simple zero of a sufficiently differentiable function f in an open interval I. If x_0 is sufficiently close to α, then the method defined by Algorithm 2.2 is of sixth order of convergence.
With the same assumptions as in the previous theorem, we have
4 Numerical examples
In this section we present seven different examples to show the efficiency of the new methods (14) and (17). We compare the new methods with the second-order Newton method, the third-order Halley method, the sixth-order (NR) method proposed by Noor and Noor (2007), and the sixth-order (PCH) method presented by Noor et al. (2007). To do so, we consider the following seven test examples: The stopping criterion of the computer programs is based on the distance between two consecutive approximations of the zero. The computations have been carried out using Mathematica 9 with 10,000 significant digits.
Table 1 shows the number of iterations n needed to satisfy the stopping criterion, the approximate zero x_n, the distance between two consecutive approximations of the zero, the value of the function at the approximate zero after the required iterations, and the computational order of convergence (COC), which can be approximated using the formula first introduced by Weerakoon and Fernando (2000).
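The COC formula itself is not reproduced above; a commonly used difference-based approximation, which needs only the last four iterates and not the exact zero, can be sketched as follows (the Newton iterates on x^2 - 2 = 0 are illustrative data of our own):

```python
import math

def coc(xs):
    """Approximate the computational order of convergence from the last
    four iterates, using ratios of successive differences:
    rho ~ ln|(x3-x2)/(x2-x1)| / ln|(x2-x1)/(x1-x0)|."""
    x3, x2, x1, x0 = xs[-1], xs[-2], xs[-3], xs[-4]
    return (math.log(abs((x3 - x2) / (x2 - x1)))
            / math.log(abs((x2 - x1) / (x1 - x0))))

# Iterates of Newton's method on f(x) = x^2 - 2 from x0 = 1.5;
# the estimated order should be close to 2.
xs = [1.5]
for _ in range(4):
    x = xs[-1]
    xs.append(x - (x * x - 2) / (2 * x))
print(coc(xs))
```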
Table 1. Number of iterations n, approximate zero x_n, and computational order of convergence (COC) for each method on the seven test problems.

| Problem | Method | n  | x_n | COC |
|---------|--------|----|-----|-----|
| 1 | Newton | 6  |   | 2 |
| 1 | Halley | 5  |   | 3 |
| 1 | NR     | 4  |   | 6 |
| 1 | PCH    | 4  |   | 6 |
| 1 | MH1    | 4  |   | 6 |
| 1 | MH2    | 3  |   | 6 |
| 2 | Newton | 7  | 2 | 2 |
| 2 | Halley | 5  | 2 | 3 |
| 2 | NR     | 3  | 2 | 6 |
| 2 | PCH    | 4  | 2 | 6 |
| 2 | MH1    | 3  | 2 | 6 |
| 2 | MH2    | 3  | 2 | 6 |
| 3 | Newton | 5  |   | 2 |
| 3 | Halley | 4  |   | 3 |
| 3 | NR     | 3  |   | 6 |
| 3 | PCH    | 3  |   | 6 |
| 3 | MH1    | 3  |   | 6 |
| 3 | MH2    | 3  |   | 6 |
| 4 | Newton | 5  |   | 2 |
| 4 | Halley | 5  |   | 3 |
| 4 | NR     | 3  |   | 6 |
| 4 | PCH    | 3  |   | 6 |
| 4 | MH1    | 3  |   | 6 |
| 4 | MH2    | 3  |   | 6 |
| 5 | Newton | 7  |   | 2 |
| 5 | Halley | 5  |   | 3 |
| 5 | NR     | 4  |   | 6 |
| 5 | PCH    | 4  |   | 6 |
| 5 | MH1    | 3  |   | 6 |
| 5 | MH2    | 3  |   | 6 |
| 6 | Newton | 13 | 3 | 2 |
| 6 | Halley | 7  | 3 | 3 |
| 6 | NR     | 5  | 3 | 6 |
| 6 | PCH    | 6  | 3 | 6 |
| 6 | MH1    | 5  | 3 | 6 |
| 6 | MH2    | 5  | 3 | 6 |
| 7 | Newton | 9  |   | 2 |
| 7 | Halley | 5  |   | 3 |
| 7 | NR     | 4  |   | 6 |
| 7 | PCH    | 4  |   | 6 |
| 7 | MH1    | 4  |   | 6 |
| 7 | MH2    | 4  |   | 6 |
The iteration counts n in Table 1 show that the new methods need fewer iterations than the other methods to reach the stopping criterion, or the same number of iterations in some cases when compared with the (NR) and (PCH) methods. Therefore, the approximate solutions obtained by the two new methods are comparable with those of the well-known existing methods.
Table 2 lists the number of iterations needed to obtain an approximation of the solution under the stopping criterion. With the same convergence criterion set for all methods, the number of iterations required by the new methods is less than or equal to the number of iterations needed by the other methods of the same order.
Table 2. Number of iterations for each method on the seven test examples.

| Method | Ex. 1 | Ex. 2 | Ex. 3 | Ex. 4 | Ex. 5 | Ex. 6 | Ex. 7 |
|--------|-------|-------|-------|-------|-------|-------|-------|
| Newton | 9     | 10    | 9     | 9     | 10    | 17    | 13    |
| Halley | 7     | 7     | 6     | 7     | 7     | 10    | 8     |
| NR     | 5     | 5     | 4     | 4     | 5     | 7     | 6     |
| PCH    | 5     | 5     | 4     | 5     | 5     | 7     | 6     |
| MH1    | 5     | 5     | 4     | 4     | 5     | 6     | 5     |
| MH2    | 5     | 5     | 4     | 4     | 5     | 7     | 5     |
5 Conclusion
In this paper we implemented two new efficient root-finding methods for solving nonlinear equations. The first was derived using Taylor's expansion together with Halley's method; in the second, we approximated the second derivative in order to increase the efficiency index of the first method. Overall, the two methods are comparable to other well-known methods of the same order. The convergence of both algorithms has been proved, and the order of convergence has been established to be six. Seven examples were tested, showing the capability of the methods.
References
- Improving Newton-Raphson method for nonlinear equations by modified Adomian decomposition method. Appl. Math. Comput. 2003;145:887-893.
- A new iterative method for solving nonlinear equations. Appl. Math. Comput. 2006;187:415-422.
- New ninth and seventh order methods for solving nonlinear equations. Eur. Sci. J. 2012;8(27):83-95.
- A new, exact and easy method for finding the roots of equations generally, and that without any previous reduction. Philos. Trans. R. Soc. London. 1694;18:136-148.
- A new iterative method for solving algebraic equations. Appl. Math. Comput. 2003;135:81-84.
- To the question of efficiency of iterative methods. Appl. Math. Lett. 2017;66:40-46.
- A family of higher order iterations free from second derivative for nonlinear equations in . J. Comput. Appl. Math. 2017.
- Higher order methods for nonlinear equations and their basins of attraction. Mathematics. 2016;4(2):22.
- Predictor-corrector Halley method for nonlinear equations. Appl. Math. Comput. 2007;188:1587-1591.
- A new modified Halley method without second derivatives for nonlinear equation. Appl. Math. Comput. 2007;189:1268-1273.
- Modified Householder iterative method for nonlinear equations. Appl. Math. Comput. 2007;190:1534-1539.
- Iterative Methods for Solution of Equations. Englewood Cliffs, NJ: Prentice-Hall; 1964.
- A variant of Newton's method with accelerated third-order convergence. Appl. Math. Lett. 2000;13:87-93.