HP Forums

Full Version: Re: New Root Seeking Algorithms


Your comparison with Newton's method is wrong.

Newton's method requires that you use f '(x) explicitly. If you don't, what you are really doing is measuring the convergence of a fixed point iteration that uses a function approximating f '(x). Simply stated, you are comparing your method against the accuracy of your approximation to f '.

The last example you give is for a polynomial with 6 distinct roots. If you use the derivative explicitly, an initial guess of 7 will give the root to 10 significant digits in 6 iterations: six calls of f and six calls of f '. If you do one more iteration and carry sufficient significant digits, you will have the root to 20 significant digits. Your example falls short because the inaccuracy of f ' slows the convergence rate.
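The original post's polynomial isn't quoted here, so as a stand-in this sketch uses a hypothetical polynomial with six distinct roots, p(x) = (x-1)(x-2)...(x-6), and runs Newton's method with the exact derivative from an initial guess of 7:

```python
# Newton's method with an explicit derivative, on a stand-in polynomial
# p(x) = (x-1)(x-2)...(x-6).  The nearest root to the guess x0 = 7 is 6.

def p(x):
    r = 1.0
    for k in range(1, 7):
        r *= x - k
    return r

def dp(x):
    # Exact derivative by the product rule: sum of products with one factor omitted.
    total = 0.0
    for i in range(1, 7):
        term = 1.0
        for k in range(1, 7):
            if k != i:
                term *= x - k
        total += term
    return total

x = 7.0  # initial guess
for _ in range(20):
    step = p(x) / dp(x)
    x -= step  # one call of f and one call of f ' per iteration
    if abs(step) < 1e-13:
        break

print(x)  # converges to the nearest root, x = 6
```

Because the derivative is exact, the convergence is quadratic: once the iterate is close, the number of correct digits roughly doubles each step.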

One of the shortcomings of Newton's method is that you have to know and call the derivative. If you don't know the derivative and only want to call the function itself, use the secant method. This is closer to what you are doing, but with the secant method the error in the approximation of f ' shrinks with every iteration.
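A minimal secant-method sketch on the same stand-in polynomial: only f itself is called, and the slope through the last two iterates replaces f '. As the iterates crowd toward the root, that slope approximates f ' better and better, which is why the error in the derivative estimate shrinks each step.

```python
# Secant method on the stand-in polynomial p(x) = (x-1)(x-2)...(x-6).
# No derivative is called; the finite-difference slope through the last
# two iterates stands in for f '(x).

def f(x):
    r = 1.0
    for k in range(1, 7):
        r *= x - k
    return r

x0, x1 = 8.0, 7.0  # two starting guesses
for _ in range(20):
    f0, f1 = f(x0), f(x1)
    if f1 == f0:
        break  # slope unavailable; iterates have merged
    x2 = x1 - f1 * (x1 - x0) / (f1 - f0)  # (f1-f0)/(x1-x0) stands in for f '
    if abs(x2 - x1) < 1e-13:
        x1 = x2
        break
    x0, x1 = x1, x2

print(x1)  # converges to the root x = 6
```

The secant method's convergence order is the golden ratio, about 1.618: slower than Newton's quadratic rate per iteration, but each iteration costs only one call of f instead of one of f and one of f '.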

I suggest you look at the fixed point iteration theorem and calculate the theoretical convergence rate. This will give you a better picture of how your method behaves.



I understand the secant method dominates when f ' is only available numerically. Is this true?