Hi, Valentin --
Quote:
KS: [...] I'd believe that SOLVE and INTEG act upon only the real-valued portion of the calculated expression [...]."
VA: No, there are some equations involving complex values where the built-in SOLVE succeeds in returning a complex root, so your statement doesn't always apply. Alas, this won't work in general...
Since you stated it, I'm sure it's true. However, it seems almost accidental: the HP-35s lacks the processing speed (it's not even as fast as the Pioneers) to solve efficiently for complex-valued roots of user-defined equations. Furthermore, the predecessor RPN-based models did not offer this capability, so some new code would have had to be developed.
Even the slow HP-15C allows programs that use and produce complex numbers to serve as integrands for INTEG and as functions for SOLVE. However, only the real part of the program's output is utilized by SOLVE/INTEG. In many cases, though, this is sufficient for practical applications.
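To illustrate the idea (this is a sketch in Python, not HP-15C code): a root finder can operate on a complex-valued function while looking only at the real part of each result, just as SOLVE does with a complex-mode program. The function f and the bisection solver below are my own hypothetical examples.

```python
import cmath
import math

def f(x):
    # complex-valued "program": e^(jx); only its real part, cos(x), is
    # seen by the solver, mimicking the HP-15C's SOLVE behavior
    return cmath.exp(1j * x)

def solve_real_part(func, a, b, tol=1e-12):
    # simple bisection on Re(func); assumes a sign change on [a, b]
    fa = func(a).real
    for _ in range(200):
        m = 0.5 * (a + b)
        fm = func(m).real
        if abs(fm) < tol or (b - a) < tol:
            return m
        if (fa > 0) != (fm > 0):
            b = m          # root lies in [a, m]
        else:
            a, fa = m, fm  # root lies in [m, b]
    return 0.5 * (a + b)

root = solve_real_part(f, 1.0, 2.0)
print(root)  # ≈ pi/2, where Re(e^{jx}) = cos(x) crosses zero
```

The real part alone suffices here because the zero of Re(f) is exactly what's sought; whenever that is the case, the HP-15C's restriction costs nothing.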
Quote:
"KS: [...] it would entail solving a real-valued system of twice the dimension, after first recognizing that the system is complex-valued. [...]"
VA: If by "twice the dimension" you mean that a 5x5 complex system would mandatorily require solving a 10x10 real one, that's not so, you can get by with smaller systems by using algorithms dealing with other than the tired [[A -B][B A]] scheme.
I surmise that the same 5x5 system could be solved directly if complex arithmetic were supported, but the total number of real operations would be comparable. "Tired" the real transformation might be, but it is straightforward, easily reversible, and generally reliable (except perhaps if the complex entries happen to produce an ill-conditioned real matrix). It simply provides an explicit decomposition of the complex-valued arithmetic into its real-valued steps, allowing the same real algorithms to be used, thus making it a natural approach for the HP-15C.
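The [[A -B][B A]] scheme can be demonstrated in a few lines. This sketch (Python, not HP-15C code; the 2x2 data is hypothetical) solves an n x n complex system (A + jB)z = c + jd as the 2n x 2n real system [[A -B],[B A]][x; y] = [c; d], then recovers z = x + jy:

```python
def solve(M, v):
    # plain Gaussian elimination with partial pivoting on a real system
    n = len(M)
    M = [row[:] + [v[i]] for i, row in enumerate(M)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

# a 2x2 complex system, split into real parts A, B and right side c, d
A = [[1.0, 2.0], [3.0, 4.0]]   # Re of the coefficient matrix
B = [[0.5, -1.0], [2.0, 0.0]]  # Im of the coefficient matrix
c, d = [1.0, 0.0], [0.0, 1.0]  # Re and Im of the right-hand side

n = len(A)
big = [A[i] + [-x for x in B[i]] for i in range(n)] + \
      [B[i] + A[i] for i in range(n)]          # the [[A -B],[B A]] embedding
xy = solve(big, c + d)
z = [complex(xy[i], xy[n + i]) for i in range(n)]

# verify that (A + jB) z reproduces c + jd
for i in range(n):
    lhs = sum(complex(A[i][k], B[i][k]) * z[k] for k in range(n))
    assert abs(lhs - complex(c[i], d[i])) < 1e-9
print(z)
```

Note that the doubled system contains each real entry twice, so the count of distinct real multiplications is comparable to doing the complex arithmetic directly, as argued above.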
Here's an illustration for anyone who might not understand the purpose of the HP-15C's matrix transformations for complex numbers -- for instance, why the left-side multiplicand is transformed to a square matrix, but not the right-side multiplicand. The relevant HP-15C functions:
Py,x ("convert to partitioned")
Cy,x ("convert to complex")
AP->A~ (MATRIX 2)
A~->AP (MATRIX 3)
{a, b, c, d} are real-valued; they could be scalars, or matrices with {a, b} and {c, d} of suitably-matching dimensions.
(a+jb)*(c+jd) (two complex-valued multiplicands)
[a b]*[c d] (entered as matrices in "complex" form)
[ a] [ c] ("partitioned" form)
[ b] * [ d]
[ a -b]*[ c] (transformed to square real-valued matrix)
[ b a] [ d]
[ac - bd] ("partitioned" matrix product)
[bc + ad]
[(ac-bd) (bc+ad)] ("complex" form of matrix product)
(ac-bd)+j(bc+ad) (complex-valued result)
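The steps above can be checked numerically with scalars (a Python sketch, with hypothetical values for a, b, c, d; on the HP-15C these transformations are done via Py,x and MATRIX 2/3). Only the left multiplicand is expanded to the square form [[a, -b], [b, a]]; the right one stays in partitioned form [c; d], and ordinary real matrix multiplication then yields the partitioned product:

```python
a, b = 2.0, 3.0   # left multiplicand  a + jb
c, d = 5.0, -1.0  # right multiplicand c + jd

left = [[a, -b], [b, a]]  # square real transform of the left multiplicand
part = [c, d]             # partitioned form of the right multiplicand

# ordinary real matrix-vector product gives [ac - bd, bc + ad],
# i.e. the partitioned form of the complex product
prod = [left[0][0] * part[0] + left[0][1] * part[1],
        left[1][0] * part[0] + left[1][1] * part[1]]

assert complex(prod[0], prod[1]) == complex(a, b) * complex(c, d)
print(prod)  # [13.0, 13.0] for these inputs: (2+3j)*(5-1j) = 13+13j
```

This makes the asymmetry clear: the square transform encodes the *multiplication by* a+jb as a real linear map, while the partitioned form merely stores the operand's components.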
These manual procedures make logical sense to those who know their linear algebra well, but are probably not very intuitive to those who don't. Given the limitations of the hardware, however, I'd say that the developers of the HP-15C did a near-ideal job.
-- KS
Edited: 9 May 2008, 3:03 a.m.