New root-seeking algorithm

▼ Namir Posting Freak Posts: 2,247 Threads: 200 Joined: Jun 2005 11-25-2005, 11:21 PM

I presented this new root-seeking algorithm (I call it the Quartile algorithm) at the HHC2005 conference. It is a root-bracketing method that improves on the Bisection method:

```
Given: Root-bracketing guesses A and B
       Tolerance Tolr

P-Code:
Let alpha = 0.25
Let FA = F(A)
Let FB = F(B)
Do
  If ABS(FA) < ABS(B) Then
    Xm = A + alpha * (B - A)
  Else
    Xm = A + (1 - alpha) * (B - A)
  End
  Let Fm = F(Xm)
  If Fm * FA > 0 Then
    A = Xm
    FA = Fm
  Else
    B = Xm
    FB = Fm
  End
Loop Until Abs(A - B) < Tolr
Root = (A + B) / 2
```

J. P. Moreau has implemented the Quartile algorithm in BASIC, C++, and other languages. The BASIC code is found here.

▼ Marcus von Cube, Germany Posting Freak Posts: 3,283 Threads: 104 Joined: Jul 2005 11-26-2005, 05:05 AM

Namir,

Quote:
```
Do
  If ABS(FA) < ABS(B) Then
```

I assume you mean:

```
Do
  If ABS(FA) < ABS(FB) Then
```

Can you elaborate a little about the heuristics behind the algorithm?

Marcus

▼ Namir Posting Freak Posts: 2,247 Threads: 200 Joined: Jun 2005 11-26-2005, 07:22 AM

You are correct. Here is the corrected p-code:

```
P-Code:
Let alpha = 0.25
Let FA = F(A)
Let FB = F(B)
Do
  If ABS(FA) < ABS(FB) Then
    Xm = A + alpha * (B - A)
  Else
    Xm = A + (1 - alpha) * (B - A)
  End
  Let Fm = F(Xm)
  If Fm * FA > 0 Then
    A = Xm
    FA = Fm
  Else
    B = Xm
    FB = Fm
  End
Loop Until Abs(A - B) < Tolr
Root = (A + B) / 2
```

▼ Namir Posting Freak Posts: 2,247 Threads: 200 Joined: Jun 2005 11-26-2005, 02:08 PM

The two most common root-bracketing methods are:

1. The Bisection method. This method compares the signs of the function values.
2. The Regula Falsi method, which uses the function values as weights to determine the new guess.

The Quartile method compares the absolute function values to determine the next guess. Thus, its approach lies somewhere in between the above two methods. If you look at the p-code and change alpha to 0.5, you get the Bisection method.
Thus, the Bisection method is a special case of the Quartile method, where the new guess is systematically taken as the middle of the root-bracketing interval.

Edited: 26 Nov 2005, 2:09 p.m.

▼ Patrick Senior Member Posts: 266 Threads: 32 Joined: Jan 1970 12-03-2005, 10:59 PM

Sorry, Namir, I don't mean to be impolite, but so what? It is relatively simple to develop umpteen variations on a theme; the question is whether or not one of these variations is actually better in some way. Are there any analyses which motivate the choice of alpha = 0.25? In what categories of problems does this algorithm work better than others? How much better? Does it converge more frequently than others? Is it faster?

▼ Namir Posting Freak Posts: 2,247 Threads: 200 Joined: Jun 2005 12-10-2005, 09:24 AM

The Bisection method is a reliable method despite its slowness. Often, complex root-seeking implementations use a combination of Newton's method (or similar/better methods) with the Bisection method as a backup plan in case Newton's method yields guesses that stray away. The Quartile method is a root-bracketing method that is better than the Bisection method. The value for alpha can be in the range of 0.25 to 0.3. The Quartile algorithm uses a slightly different approach: it compares the absolute values of the function at the endpoints of the interval [A, B]. The Quartile method is not a variant of Bisection; the Bisection method is a special case of the Quartile method (setting alpha to 0.5). The method of False Position is the next higher-up root-bracketing method; it uses the function values as weights. That algorithm does not systematically shrink the interval containing the root, so to stop iterating you compare the current new guess with the last new guess. If you are going to resort to the Bisection method as the main or backup method, then you can replace it with the Quartile method to have fewer iterations. The reduction in iterations depends on the function and the initial root-bracketing interval.
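The Quartile loop described above translates directly into a short program. Here is a minimal Python sketch; the name `quartile_root` and the demo function x^2 - 2 are my own illustration, with alpha = 0.25 as in the p-code:

```python
def quartile_root(f, a, b, tol=1e-10, alpha=0.25):
    """Quartile root-bracketing search; f(a) and f(b) must differ in sign."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must bracket a root")
    while abs(a - b) >= tol:
        # Probe closer to the endpoint whose function value is smaller
        # in magnitude -- the root is more likely to lie on that side.
        if abs(fa) < abs(fb):
            xm = a + alpha * (b - a)
        else:
            xm = a + (1 - alpha) * (b - a)
        fm = f(xm)
        if fm * fa > 0:          # root lies in [xm, b]
            a, fa = xm, fm
        else:                    # root lies in [a, xm]
            b, fb = xm, fm
    return (a + b) / 2

# Example: the positive root of x^2 - 2 (i.e. sqrt(2))
root = quartile_root(lambda x: x * x - 2, 0.0, 2.0)
```

Note that every pass discards at least an alpha fraction of the bracket (for alpha <= 0.5), so the loop always terminates for a valid bracket; setting alpha = 0.5 makes both branches pick the midpoint, which is exactly the Bisection method.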
Newton's method is popular in root solving. It uses calls to the function and its first derivative. Halley's method is generally better than Newton's; it calls the function and its first two derivatives. The Householder method is even faster, but calls the function and its first three derivatives! By using function calls to approximate the derivatives, I found that Halley's method is in general the optimum method (considering iterations and function calls).

Patrick, I welcome your suggestions for root-seeking algorithms (both variants of other methods and original algorithms). I think the subject is fascinating and I am interested in all suggestions.

Namir

Edited: 10 Dec 2005, 9:28 a.m.

▼ ECL Member Posts: 111 Threads: 22 Joined: Jul 2007 12-10-2005, 10:26 PM

Namir, I too am interested in root methods. I am not familiar with the Householder method. I am dealing with simple methods (modified false position, Newton-Raphson...). I prefer non-symbolic derivative evaluation because I primarily use the 33s, although this can be overcome with a divided-difference approach.

▼ Namir Posting Freak Posts: 2,247 Threads: 200 Joined: Jun 2005 12-10-2005, 10:39 PM

The Householder method is relatively new. Like Newton's and Halley's methods, it is based on the Taylor expansion. The Householder method includes up to the third derivative.

Attributes of the method:

+ Very fast convergence rate
+ Requires the first, second, and third function derivatives!
+ Using derivative-approximation methods results in five function calls per iteration
+ Compared to other methods, this method takes fewer iterations but a higher number of function calls

The basic Householder iteration uses:

```
X = X - f(X) * [f'(X)^2 - f(X) f''(X) / 2] /
               [f'(X)^3 - f(X) f'(X) f''(X) + f'''(X) f(X)^2 / 6]

where f(x)    is the target function (f(x) = 0)
      f'(x)   is the first derivative
      f''(x)  is the second derivative
      f'''(x) is the third derivative
```

Pseudo-code:

```
Obtain initial guess x and Tolerance
Obtain or set value for MaxIterations
Clear Diverge flag
Set IterCount = 0
Loop
  Increment IterCount
  If IterCount > MaxIterations Then Set Diverge flag and exit loop
  h = 0.01 * (1 + |x|)
  f0  = f(x)
  fp1 = f(x + h)
  fp2 = f(x + 2h)
  fm1 = f(x - h)
  fm2 = f(x - 2h)
  D1 = (fp1 - fm1) / (2h)
  D2 = (fp1 - 2 f0 + fm1) / h^2
  D3 = (fp2 - 2 fp1 + 2 fm1 - fm2) / (2 h^3)
  diff = f0 * (D1^2 - f0 * D2 / 2) / (D1^3 - f0 * D1 * D2 + D3 * f0^2 / 6)
  x = x - diff
Until |diff| < Tolerance
Return root as x and Diverge flag
```

Edited: 11 Dec 2005, 8:38 a.m.
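The third-order Householder iteration with five-point finite differences, as described above, can be sketched in Python; the name `householder_root` and the test function are my own, not from the thread:

```python
def householder_root(f, x, tol=1e-10, max_iter=100):
    """Third-order Householder iteration; derivatives approximated by
    central differences (five function calls per iteration)."""
    for _ in range(max_iter):
        h = 0.01 * (1 + abs(x))          # step scaled to the current guess
        f0 = f(x)
        fp1, fp2 = f(x + h), f(x + 2 * h)
        fm1, fm2 = f(x - h), f(x - 2 * h)
        d1 = (fp1 - fm1) / (2 * h)                          # ~ f'(x)
        d2 = (fp1 - 2 * f0 + fm1) / (h * h)                 # ~ f''(x)
        d3 = (fp2 - 2 * fp1 + 2 * fm1 - fm2) / (2 * h**3)   # ~ f'''(x)
        diff = f0 * (d1 * d1 - f0 * d2 / 2) / (
            d1**3 - f0 * d1 * d2 + d3 * f0 * f0 / 6)
        x -= diff
        if abs(diff) < tol:
            return x, False    # converged; Diverge flag clear
    return x, True             # Diverge flag set

root, diverged = householder_root(lambda x: x * x - 2, 1.0)
```

Because the derivatives are approximated rather than analytic, the step `diff` only vanishes where f0 itself vanishes, so the fixed point of the iteration is still the root of f.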
