Math Question - Printable Version

+- HP Forums (https://archived.hpcalc.org/museumforum)
+-- Forum: HP Museum Forums (https://archived.hpcalc.org/museumforum/forum-1.html)
+--- Forum: Old HP Forum Archives (https://archived.hpcalc.org/museumforum/forum-2.html)
+--- Thread: Math Question (/thread-233673.html)
Math Question - Namir - 11-06-2012

I am looking at the Jacobi method, which is the simplest (and slowest) iterative method for solving a large system of linear equations. Such systems are too large for traditional methods to work. I am solving:

A x = b

The Jacobi method uses the following matrix form for its iterations:

D x = b - (L + U) x
where D, L and U are the diagonal, strictly lower triangular, and strictly upper triangular parts of A, respectively. The algorithm is guaranteed to converge, for any initial guess for x, if matrix A is diagonally dominant.

Now suppose I add a term P x to both sides of the original system:

(A + P) x = b + P x

The iteration formula becomes:

(D + P) x = b + P x - (L + U) x

If I use a guess for x on the right-hand side, the iterations do not converge. However, if I first solve for x using a traditional method to get a good approximation, and use that approximation to evaluate the term P x, then the above equation converges to a more accurate solution. In fact, using this approach the expression b + P x can be regarded as equal to b', a new constants vector. So basically I am changing the original equation into a more favorable form by updating matrix A and vector b.

My question is this: am I cheating in the method I described above? Can you offer any better solutions?

Namir

PS: Whatever approach works with the Jacobi method can also be applied to the Gauss-Seidel method, which is a better method. There are algorithms (variants of the conjugate gradient method) that solve large linear systems of equations even when matrix A is not diagonally dominant or symmetric. I am trying to go back to basics and see if the Jacobi and Gauss-Seidel methods can get a face lift.
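[Editor's note: to make the plain Jacobi iteration above concrete, here is a minimal NumPy sketch. The test matrix, tolerance, and iteration cap are illustrative choices, not from the post; the system is chosen to be strictly diagonally dominant so the iteration is guaranteed to converge.]

```python
import numpy as np

def jacobi(A, b, x0=None, tol=1e-10, max_iter=500):
    """Jacobi iteration for A x = b: repeatedly solve D x_new = b - (L + U) x."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    x = np.zeros_like(b) if x0 is None else np.asarray(x0, dtype=float)
    D = np.diag(A)               # diagonal of A, as a vector
    R = A - np.diagflat(D)       # L + U: the off-diagonal part of A
    for _ in range(max_iter):
        x_new = (b - R @ x) / D  # one Jacobi sweep
        if np.linalg.norm(x_new - x, ord=np.inf) < tol:
            return x_new
        x = x_new
    return x

# Strictly diagonally dominant example system (made up for illustration)
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [1.0, 2.0, 6.0]])
b = np.array([6.0, 8.0, 9.0])
x = jacobi(A, b)
```

Starting from the zero vector, each sweep updates every component of x simultaneously from the previous iterate, which is what distinguishes Jacobi from Gauss-Seidel (where updated components are used immediately within a sweep).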
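[Editor's note: a sketch of the shifted scheme Namir describes, under two assumptions the post does not pin down: that P is a small diagonal matrix (so it strengthens diagonal dominance), and that x_star is the good approximation obtained beforehand by a direct solve. Note that the shifted system (A + P) x = b + P x_star has the original solution only when x_star is exact; subtracting (A + P) x_true = b + P x_true gives (A + P)(x - x_true) = P (x_star - x_true), so the bias introduced is small when P is small relative to A. That is one way to answer the "am I cheating?" question.]

```python
import numpy as np

def shifted_jacobi(A, b, P_diag, x_star, tol=1e-10, max_iter=500):
    """Jacobi on the shifted system (A + P) x = b + P x_star,
    with P diagonal (an assumption) and x_star a fixed prior approximation."""
    A = np.asarray(A, dtype=float)
    P_diag = np.asarray(P_diag, dtype=float)
    b_shift = np.asarray(b, dtype=float) + P_diag * x_star  # b' = b + P x_star
    D = np.diag(A) + P_diag          # diagonal of A + P: more dominant than A's
    R = A - np.diagflat(np.diag(A))  # L + U is unchanged by a diagonal P
    x = np.zeros_like(b_shift)       # deliberately start far from x_star
    for _ in range(max_iter):
        x_new = (b_shift - R @ x) / D
        if np.linalg.norm(x_new - x, ord=np.inf) < tol:
            return x_new
        x = x_new
    return x

# Illustration: with x_star taken as the exact solution, the shifted
# iteration converges back to it, confirming b' = b + P x is consistent.
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [1.0, 2.0, 6.0]])
b = np.array([6.0, 8.0, 9.0])
x_star = np.linalg.solve(A, b)   # stands in for the "good approximation"
P_diag = np.ones(3)              # hypothetical small diagonal shift
x = shifted_jacobi(A, b, P_diag, x_star)
```

In this sketch the added diagonal P makes (A + P) more strictly dominant than A, so the shifted iteration converges at least as reliably; the trade-off is the bias term (A + P)^(-1) P (x_star - x_true) when x_star is only approximate.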
Edited: 6 Nov 2012, 10:41 a.m.