Newton's Method
In calculus, Newton's Method (also called the Newton-Raphson method) is a recursive algorithm for approximating the root of a differentiable function. We know simple formulas for finding the roots of linear and quadratic equations, and there are also more complicated formulas for cubic and quartic equations. At one time it was hoped that similar formulas would be found for quintic and higher-degree equations, but it was later shown by Niels Henrik Abel that no such general formulas exist. The Newton-Raphson method approximates the roots of polynomial equations of any degree. In fact, the method works for any equation, polynomial or not, as long as the function is differentiable in the desired interval.
Let $f(x)$ be a differentiable function. Select a point $x_0$, based on a first approximation to the root, arbitrarily close to the function's root. To approximate the root you then recursively calculate using:

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$

As you recursively calculate, the $x_n$'s become increasingly better approximations of the function's root. As the number of approximations $n$ grows, $\lim_{n \to \infty} x_n = r$, where $f(r) = 0$.
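The recursive calculation above can be sketched in code. This is a minimal illustration, not part of the original text; the function name and iteration count are arbitrary choices.

```python
def newtons_method(f, f_prime, x0, iterations=10):
    """Return the successive approximations x_1, ..., x_n of a root of f."""
    approximations = []
    x = x0
    for _ in range(iterations):
        x = x - f(x) / f_prime(x)  # the Newton-Raphson update
        approximations.append(x)
    return approximations
```

For instance, with $f(x) = x^2 - 2$, $f'(x) = 2x$, and $x_0 = 1$, the iterates converge rapidly to $\sqrt{2}$.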
Examples
Find the root of the function .
As you can see, $x_n$ gradually approaches zero (which we know is the root of $f(x)$). One can approach the function's root with arbitrary accuracy.
Answer: $f(x)$ has a root at $x = 0$.
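The example's function was not preserved above, but the convergence it describes can be reproduced with any differentiable stand-in that has its root at $x = 0$; here we assume $f(x) = x^3 + x$ purely for illustration.

```python
# Assumed stand-in function with root at x = 0 (not the original example).
def f(x):
    return x ** 3 + x

def f_prime(x):
    return 3 * x ** 2 + 1

x = 1.0  # initial guess x_0
for n in range(1, 6):
    x = x - f(x) / f_prime(x)  # Newton-Raphson update
    print(f"x_{n} = {x:.10f}")  # each x_n is closer to the root 0
```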
Notes
This method fails when $f'(x_n) = 0$. In that case, one should choose a new starting point. Occasionally it may happen that $f(x)$ and $f'(x)$ have a common root. To detect whether this is true, we should first find the solutions of $f'(x) = 0$, and then check the value of $f(x)$ at these places.
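The failure case can be guarded against in code. This is a sketch under the assumptions above; the function name and exception choice are illustrative.

```python
def safe_newton_step(f, f_prime, x):
    """One Newton-Raphson step, refusing to divide by a zero derivative."""
    d = f_prime(x)
    if d == 0:
        # The update x - f(x)/f'(x) is undefined here;
        # the caller should choose a new starting point.
        raise ZeroDivisionError("f'(x) = 0; choose a new starting point")
    return x - f(x) / d
```

For example, with $f(x) = x^2 - 1$ the step fails at $x = 0$, where $f'(x) = 2x = 0$.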
Newton's method also may not converge for every function; take as an example $f(x) = \sqrt{|x|}$.
For this function, choosing any $x_0 \neq 0$ would cause successive approximations to alternate back and forth between $x_0$ and $-x_0$, so no amount of iteration would get us any closer to the root than our first guess.
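The alternating behavior can be checked numerically. Assuming $f(x) = \sqrt{|x|}$, a standard function exhibiting exactly this failure, each Newton step sends $x_n$ to $-x_n$:

```python
import math

def f(x):
    return math.sqrt(abs(x))

def f_prime(x):
    # derivative of sqrt(|x|) for x != 0
    return math.copysign(1 / (2 * math.sqrt(abs(x))), x)

x = 1.0  # any nonzero starting guess behaves the same way
for n in range(1, 5):
    x = x - f(x) / f_prime(x)  # Newton-Raphson update
    print(f"x_{n} = {x}")  # alternates: -1.0, 1.0, -1.0, 1.0
```

Each step computes $x - \sqrt{|x|} \cdot 2\sqrt{|x|} \cdot \operatorname{sgn}(x) = x - 2x = -x$, so the iterates never approach the root at $0$.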