Interpolations for Two and Three Independent Variables
HP Forums (https://archived.hpcalc.org/museumforum) > Old HP Forum Archives > Thread: Interpolations for Two and Three Independent Variables (/thread204931.html)
Interpolations for Two and Three Independent Variables - Namir - 11-15-2011

Recently I was looking for information about multivariable interpolation. I was curious about interpolation with two and three independent variables. My first stop was Wikipedia. The algorithm presented there dealt with interpolation based on the coordinates of the independent variables forming a square. That is, the bilinear interpolation used the points (x1,y1), (x1,y2), (x2,y1), and (x2,y2). Further search on the Internet located a paper that dealt with bilinear and trilinear interpolation. The paper showed more elegant equations for square-shaped coordinates for the bilinear interpolation, and also included equations for the cube-shaped trilinear interpolation. Implementing these algorithms in BASIC, RPN, or RPL should be easy.

After implementing the bilinear and trilinear interpolations, I asked myself, "What if the anchor coordinates for the bilinear interpolation do not form a square? Likewise, what if the anchor coordinates for the trilinear interpolation do not form a cube?" I proceeded to calculate the coefficients for the multivariable polynomials used for the bilinear and trilinear interpolations. After a little bit of math wrangling, I realized that my best bet was to use Vandermonde-type matrices. Such matrices allow me to calculate the coefficients of the interpolating polynomials. In addition, you need fewer anchor points than the square or cube approaches.

One advantage of calculating the interpolating polynomial coefficients is that you can reuse the coefficients for additional interpolations that use the same anchor points. This advantage is not available when using interpolation equations that simply yield the final answer. Using Vandermonde-type matrices also allowed me to easily implement quadratic interpolations for two and three independent variables.
If the implementation uses a language that works with matrices, then the calculations are greatly simplified! Such is the case with Matlab, RPL, and the HP-71B with the Math ROM.
Namir
Re: Interpolations for Two and Three Independent Variables - Marcus von Cube, Germany - 11-15-2011

Namir, do you think the matrix support in the WP 34S is strong enough to support your algorithms? Can you post some background and a solution to look at?
Re: Interpolations for Two and Three Independent Variables - Namir - 11-15-2011

In the case of a bilinear interpolation, one can build the Vandermonde matrix A with one row per anchor point, each row containing the terms 1, X, Y, and X*Y evaluated at that point.

Re: Interpolations for Two and Three Independent Variables - Marcus von Cube, Germany - 11-15-2011

Namir, thanks. It doesn't look impossible to implement on the 34S. Do you have a benchmark example?
Instead of multiplying with the inverse of the matrix, a linear solve should do as well.
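As a sketch of the idea discussed above (fit the bilinear form z = a + b*x + c*y + d*x*y through four anchor points by solving the Vandermonde system directly, rather than inverting the matrix), here is a minimal Python version. The function names are my own; the thread itself targets BASIC/RPN/RPL and the WP 34S.

```python
def solve_linear(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting
    (the 'linear solve' alternative to multiplying by the inverse)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

def bilinear_coeffs(points):
    """points: four (x, y, z) anchors, not necessarily forming a square.
    Fits z = a + b*x + c*y + d*x*y and returns [a, b, c, d]."""
    A = [[1.0, x, y, x * y] for x, y, _ in points]  # Vandermonde rows
    z = [p[2] for p in points]
    return solve_linear(A, z)

def bilinear_eval(coeffs, x, y):
    """Reuse the fitted coefficients for any number of interpolations."""
    a, b, c, d = coeffs
    return a + b * x + c * y + d * x * y
```

Once `bilinear_coeffs` has run, repeated calls to `bilinear_eval` reuse the same coefficients, which is the advantage Namir points out over closed-form interpolation formulas.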
Re: Interpolations for Two and Three Independent Variables - Namir - 11-15-2011

Here are benchmarks for the bilinear and trilinear interpolation:

[Interpolation data table (X, Y, F(X,Y)) not preserved in the archive]

Speed is not a real issue here because there are no iterations.
Namir
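Since the benchmark data did not survive the archive, here is a hedged Python sketch of the trilinear case described earlier: fitting F = a + b*X + c*Y + d*T + e*X*Y + f*X*T + g*Y*T + h*X*Y*T through eight anchor points, which need not form a cube, only yield a nonsingular matrix. Names are illustrative, not from the thread.

```python
def trilinear_coeffs(points):
    """points: eight (x, y, t, f) anchors. Builds the Vandermonde-style
    matrix with rows [1, x, y, t, xy, xt, yt, xyt] and solves for the
    eight coefficients by Gaussian elimination with partial pivoting."""
    n = len(points)
    M = [[1.0, x, y, t, x*y, x*t, y*t, x*y*t, f]  # augmented row
         for x, y, t, f in points]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            fac = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= fac * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

def trilinear_eval(co, x, y, t):
    a, b, c, d, e, f, g, h = co
    return a + b*x + c*y + d*t + e*x*y + f*x*t + g*y*t + h*x*y*t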
Re: Interpolations for Two and Three Independent Variables - Namir - 11-15-2011

In the case of a biquadratic interpolation, one can build the Vandermonde matrix A as [matrix not preserved in the archive].

Re: Interpolations for Two and Three Independent Variables - Namir - 11-15-2011

The advantage of using Vandermonde-type matrices is that you can use them to define your own custom interpolation function. For example, if I want to interpolate using:

Z = a + b/X + c/Y + d*X*Y

then the Vandermonde matrix will have the following columns:
Column 1: all ones.
Column 2: 1/X.
Column 3: 1/Y.
Column 4: X*Y.

Since we have 4 coefficients, we need four rows in the Vandermonde matrix.

Another example uses the following model, which involves the three independent variables X, Y, and T:

Z = a + b*sin(X) + c*sin(2*X) + d*ln(Y/T) + e*X*T

Then the Vandermonde matrix will have the following columns:
Column 1: all ones.
Column 2: sin(X).
Column 3: sin(2*X).
Column 4: ln(Y/T).
Column 5: X*T.

Since we have 5 coefficients, we need five rows in the Vandermonde matrix.

Thus the Vandermonde matrix approach gives us the power to work with any number of variables that suits our application, and to choose from a wide variety (thousands if not millions) of models for interpolation. This conclusion answers an old question of mine about pushing new limits for interpolation. Another way to look at this approach: we are basically using Vandermonde-style matrices to do a curve fit with the minimum required number of data points (meaning the R-squared of the fitted model is always 1).

Namir
Edited: 15 Nov 2011, 5:45 p.m.
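The custom-model idea above generalizes naturally: each basis function supplies one column of the Vandermonde-style matrix, with one anchor point per coefficient so the fit is exact. A minimal Python sketch, using the Z = a + b/X + c/Y + d*X*Y model from the post (function names are my own):

```python
def fit_custom(basis, points):
    """basis: list of functions f(x, y), one per column of the
    Vandermonde-style matrix; points: (x, y, z) anchors, one per
    coefficient. Solves the square system exactly and returns the
    coefficient list."""
    n = len(basis)
    M = [[f(x, y) for f in basis] + [z] for x, y, z in points]
    for col in range(n):  # Gaussian elimination with partial pivoting
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            fac = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= fac * M[col][c]
    co = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * co[c] for c in range(r + 1, n))
        co[r] = (M[r][n] - s) / M[r][r]
    return co

# Columns for Z = a + b/X + c/Y + d*X*Y, as described in the post:
basis = [
    lambda x, y: 1.0,      # column 1: all ones
    lambda x, y: 1.0 / x,  # column 2: 1/X
    lambda x, y: 1.0 / y,  # column 3: 1/Y
    lambda x, y: x * y,    # column 4: X*Y
]
```

Swapping in a different `basis` list (e.g. sin(X), sin(2*X), ln(Y/T), X*T with a third variable) gives any of the custom models the post describes, which is the "curve fit with R-squared always 1" view in the closing paragraph.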
