woelen
Super Administrator
Posts: 8027
Registered: 20-8-2005
Location: Netherlands
Member Is Offline
Mood: interested
Test suite for polynomial root finding software?
I ported the Fortran code for Jenkins-Traub polynomial root finding to Java. I ported the general-purpose solver CPOLY, which can be used to find the roots of any polynomial over the field of complex numbers, and I also ported the special-purpose solver RPOLY, which only works on polynomials with real coefficients. RPOLY is 4 times as fast as CPOLY and is a nice addition if all of your coefficients are real.
Of course I tested my implementation with some home-made polynomials, but I am wondering whether there is a well-known published test suite of polynomials that I can use with my code. If anyone knows about such a suite I would love to read about it. I searched the internet for quite some time but could not find a test suite for Jenkins-Traub myself.
turd
National Hazard
Posts: 800
Registered: 5-3-2006
Member Is Offline
Mood: No Mood
No, sorry.
But why not run it against a reference implementation with random coefficients? You can call Fortran, C or C++ from Java via JNI. If it survives a night of that, it should be good. The only problem I see is rounding errors.
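For what it's worth, the JNI side is not much Java code; something like the sketch below (the class, method and library names are made up for illustration, and the C glue that actually calls the Fortran routine still has to be written):
Code:
// Hypothetical JNI wrapper around a native RPOLY build (all names illustrative).
public final class RpolyNative {
    static {
        // Loads librpolyjni.so / rpolyjni.dll, the shared library wrapping the Fortran code.
        System.loadLibrary("rpolyjni");
    }

    // coeffs: real coefficients, highest degree first.
    // Returns the roots as interleaved (re, im) pairs: [re0, im0, re1, im1, ...].
    public static native double[] roots(double[] coeffs);

    private RpolyNative() { }
}
The matching C function (named Java_RpolyNative_roots for a class in the default package) then converts the arrays and calls the Fortran RPOLY.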
woelen
Super Administrator
Posts: 8027
Registered: 20-8-2005
Location: Netherlands
Member Is Offline
Mood: interested
Using a JNI coupling for a piece of Fortran code does not sound like a nice task to me; it is quite a hassle to get up and running. Besides that, a JNI coupling with random polynomials does not give a reference test; I can do that kind of test as described below.
An alternative method of testing is generating random polynomials, solving these with RPOLY or CPOLY, and then multiplying the roots back into a polynomial and comparing it with the starting one. This test certainly will reveal problems, but with random polynomials you miss the special pathological cases (e.g. multiple roots, or roots very close to each other in a cluster). I already tried a polynomial with 14 well-separated roots with magnitudes between 0.001 and 1, three roots near 0.01 separated by less than 0.00001, and another three roots exactly equal to 0.01. That gives six roots which are really pathological. The software I have written does a reasonable job on this: the 14 well-separated roots are found to better than 14-digit accuracy, and the roots near or at 0.01 are found to 3-digit precision, e.g. 0.01001 instead of 0.01. So I have some confidence in the software, but I would still like to run a published test suite and compare my results with the published ones.
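Spelled out, that "multiply the roots back and compare" check looks roughly like the sketch below; in the real test the roots would of course come from my RPOLY/CPOLY port instead of being hard-coded:
Code:
// Rebuild a monic polynomial from computed roots and compare with the original
// coefficients (highest degree first). Known roots are used here so the sketch
// runs on its own; in the real test they come from the solver under test.
public final class RootsBackCheck {

    // Multiply out the product of (x - root_k); returns {realCoeffs, imagCoeffs}.
    static double[][] coeffsFromRoots(double[] rootsRe, double[] rootsIm) {
        int n = rootsRe.length;
        double[] re = new double[n + 1];
        double[] im = new double[n + 1];
        re[0] = 1.0;                              // start from the constant polynomial 1
        for (int k = 0; k < n; k++) {
            for (int j = k + 1; j >= 1; j--) {    // go downwards so old values are still in place
                double newRe = re[j] - (rootsRe[k] * re[j - 1] - rootsIm[k] * im[j - 1]);
                double newIm = im[j] - (rootsRe[k] * im[j - 1] + rootsIm[k] * re[j - 1]);
                re[j] = newRe;
                im[j] = newIm;
            }
        }
        return new double[][] { re, im };
    }

    public static void main(String[] args) {
        // Original monic test polynomial: (x - 1)(x - 2)(x - 3) = x^3 - 6x^2 + 11x - 6
        double[] original = { 1.0, -6.0, 11.0, -6.0 };
        double[] rootsRe = { 1.0, 2.0, 3.0 };     // pretend these came back from the solver
        double[] rootsIm = { 0.0, 0.0, 0.0 };

        double[][] rebuilt = coeffsFromRoots(rootsRe, rootsIm);
        double maxErr = 0.0;
        for (int j = 0; j < original.length; j++) {
            maxErr = Math.max(maxErr, Math.abs(rebuilt[0][j] - original[j]));
            maxErr = Math.max(maxErr, Math.abs(rebuilt[1][j]));   // imaginary parts should vanish
        }
        System.out.println("max coefficient error: " + maxErr);
        // For a non-monic original, divide its coefficients by the leading one before comparing.
    }
}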
turd
National Hazard
Posts: 800
Registered: 5-3-2006
Member Is Offline
Mood: No Mood
Quote: Originally posted by woelen | Using a JNI coupling for a piece of Fortran code does not sound like a nice task to me; it is quite a hassle to get up and running. Besides that, a JNI coupling with random polynomials does not give a reference test; I can do that kind of test as described below. |
You could also generate a common input file for Fortran and Java and then compare the output files with a script.
Quote: | This test certainly will reveal problems, but with random polynomials you miss the special pathological cases (e.g. multiple roots, or roots very close to each other in a cluster). |
Having implemented this, you obviously know much more about this topic than I do, so I don't think I can contribute anything valuable, but intuitively
I would do the following:
Generate a random integer n>0, the number of clusters.
For each cluster generate a random integer kn>0, the number of roots in this cluster.
For each cluster generate a random complex, the average root in this cluster.
For each cluster generate a random positive real, the variance of the roots in this cluster.
With these values generate the complex roots in each cluster: first the absolute deviation from the average, then the "phase angle" (modern random number generator libraries let you draw from any distribution you wish).
From this it is trivial to construct the coefficients of the polynomial.
See if you find the roots and log the differences.
Repeat until a difference too large is found.
Then add a few special cases: make a random percentage of the roots in a cluster purely imaginary or purely real. Add the complex conjugates of a certain percentage of the roots. Evidently, to test the real-coefficient case, add the complex conjugate of every root.
So that would be my naive brute-force way of approaching this, but I'm sure there are more intelligent ways.
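A rough sketch of that recipe in Java (all distributions and magnitudes below are arbitrary choices, not taken from any published suite):
Code:
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Generates clustered test roots following the recipe above.
public final class ClusteredRootGenerator {
    public static void main(String[] args) {
        Random rnd = new Random(42);
        List<double[]> roots = new ArrayList<>();             // each entry is {re, im}

        int clusters = 1 + rnd.nextInt(5);                    // n > 0 clusters
        for (int c = 0; c < clusters; c++) {
            int k = 1 + rnd.nextInt(4);                       // k_n > 0 roots in this cluster
            double centerRe = 2.0 * rnd.nextDouble() - 1.0;   // random complex cluster center
            double centerIm = 2.0 * rnd.nextDouble() - 1.0;
            double spread = Math.pow(10.0, -1 - rnd.nextInt(8)); // cluster radius, 1e-1 .. 1e-8

            for (int i = 0; i < k; i++) {
                double radius = spread * rnd.nextDouble();        // deviation from the center
                double phase = 2.0 * Math.PI * rnd.nextDouble();  // the "phase angle"
                double re = centerRe + radius * Math.cos(phase);
                double im = centerIm + radius * Math.sin(phase);
                roots.add(new double[] { re, im });
                // For the real-coefficient (RPOLY) case, also add the complex conjugate:
                // roots.add(new double[] { re, -im });
            }
        }
        for (double[] r : roots) {
            System.out.println(r[0] + " " + r[1] + "i");
        }
        // Multiplying out the product of (x - root) then gives the test polynomial.
    }
}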
woelen
Super Administrator
Posts: 8027
Registered: 20-8-2005
Location: Netherlands
Member Is Offline
Mood: interested
Your idea does help me. I was thinking along the lines of generating random coefficients, solving for those coefficients, and then determining the polynomial again. What you propose is the other way around: first generate a set of zeros, construct the polynomial from them (which indeed is fairly trivial, though not 100% trivial: you have to write the code carefully to minimize accumulation of rounding errors), and then solve the polynomial again.
I will determine the polynomial from the roots in quad double precision (106-bit accuracy) and then solve it again with normal doubles (53 bits of precision).
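The heart of such double-double (106-bit) arithmetic is the usual pair of error-free transformations; each coefficient is kept as a (hi, lo) pair and the multiply-by-(x - root) loop uses these primitives instead of plain + and *. A minimal sketch of the building blocks (Java 9+ for Math.fma), not the DoubleDouble code I will actually use:
Code:
// Error-free transformations on which double-double coefficient construction can be built.
final class ErrorFree {
    // Knuth's TwoSum: s + e equals a + b exactly, with s = fl(a + b).
    static double[] twoSum(double a, double b) {
        double s = a + b;
        double bv = s - a;
        double e = (a - (s - bv)) + (b - bv);
        return new double[] { s, e };
    }

    // TwoProd via fused multiply-add: p + e equals a * b exactly, with p = fl(a * b).
    static double[] twoProd(double a, double b) {
        double p = a * b;
        double e = Math.fma(a, b, -p);
        return new double[] { p, e };
    }

    public static void main(String[] args) {
        double[] s = twoSum(1.0, 1e-17);       // the 1e-17 would be lost in a plain addition
        System.out.println(s[0] + " + " + s[1]);
        double[] p = twoProd(1.0 / 3.0, 3.0);
        System.out.println(p[0] + " + " + p[1]);
    }
}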
Thanks for your idea! Sometimes the thoughts of someone who is fresh to the problem do add new insights.
[Edited on 25-2-14 by woelen]
AJKOER
Radically Dubious
Posts: 3026
Registered: 7-5-2011
Member Is Offline
Mood: No Mood
I would not so much test, but rather look at the iteration formula for potential problems (like dividing by a quantity that could become close to zero). I would put in a check to warn the user, or switch to a different technique automatically.
Somebody checking your software will deliberately use improbable, problematic inputs, so put in intelligent oversight.
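In code such a safeguard amounts to something like this; the threshold and the reaction (warn, or switch methods) are of course up to the implementer:
Code:
// Guard against dividing by a quantity that has become (nearly) zero.
final class DivisionGuard {
    static double safeQuotient(double numerator, double denominator) {
        double eps = 1e-300;   // illustrative threshold, to be tuned per algorithm
        if (Math.abs(denominator) < eps) {
            // warn the user here, or let the caller switch to a different technique
            throw new ArithmeticException("near-zero divisor: " + denominator);
        }
        return numerator / denominator;
    }
}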
[Edited on 19-3-2014 by AJKOER]
woelen
Super Administrator
Posts: 8027
Registered: 20-8-2005
Location: Netherlands
Member Is Offline
Mood: interested
@AJKOER: Your strategy may work for very simple problems, but the problem I have at hand is amazingly complex. There is not a single iteration formula for the roots; the program I have at hand uses a fascinating strategy to home in on them. Writing a general-purpose polynomial root finder which works for all kinds of input problems is an art (and a science) in its own right. For this reason the old programs CPoly and RPoly are still used in industry as general-purpose solvers, even though they are 40 years old. They have been ported to C, C++ and JavaScript, but basically they are still the same.
I myself have taken another piece of software (MPSolve 2.2) and turned it into a more flexible and easy-to-use polynomial solver, which can be integrated into your own C or C++ programs. Have a look at this code and you will understand what I wrote above.
http://woelen.homescience.net/science/software/mpsolve.html
Unfortunately, porting this to Java is really daunting, even if only a standard IEEE 53-bit floating-point version is ported.
@turd: I implemented your strategy of first determining roots, constructing polynomials based on those roots, and then solving those polynomials. Using that I could indeed improve the algorithms of Jenkins and Traub. The original CPoly and RPoly programs sometimes give inaccurate results because they stop as soon as a root is sufficiently isolated. What I do is perform one additional iteration step once isolation is guaranteed, and this is an enormous improvement. It really is fascinating to see that this issue has been lingering around for a few decades already and only very few people have run across it:
http://software.intel.com/en-us/forums/topic/281693
Apparently a commercial version of RPoly is available in the IMSL library and they have fixed the issue. My Java port also fixes the issue, most likely in a different way than IMSL does, but the output of my version is much more accurate than that of the original RPoly from Netlib.
I am really surprised by the speed of my ports. The Java version of the RPoly program takes approx. 10 microseconds to find all 10 zeros of a tenth-degree polynomial, and approx. 100 microseconds for a 30th-degree polynomial. These figures were obtained on a cheap $400 nettop PC, based on a Shuttle XH61V with a cheap Core i3 processor. This really is amazing.
I also made a 106-bit precision port, this time of another program by Dario Bini, based on Aberth's method: http://jblevins.org/mirror/amiller/pzeros.f90
pzeros.f90 is another established polynomial solver and I was able to port it to Java at 106-bit precision (using DoubleDouble arithmetic). At that extended precision, solving a 30th-degree polynomial takes approx. 900 microseconds on the same little PC.
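For anyone who wants to reproduce such timings: they are only meaningful after the JIT has warmed up. A rough sketch of how such a micro-benchmark can be taken (the solver is passed in as a function here, because this sketch does not use my actual API):
Code:
import java.util.Random;
import java.util.function.UnaryOperator;

// Micro-benchmark sketch; plug the real root finder in where the placeholder lambda is.
public final class SolverTiming {
    static double measureMicros(UnaryOperator<double[]> solver, double[] coeffs) {
        int warmup = 10_000, runs = 100_000;
        for (int i = 0; i < warmup; i++) solver.apply(coeffs);   // let the JIT compile everything
        long t0 = System.nanoTime();
        for (int i = 0; i < runs; i++) solver.apply(coeffs);
        long t1 = System.nanoTime();
        return (t1 - t0) / 1000.0 / runs;                        // microseconds per call
    }

    public static void main(String[] args) {
        Random rnd = new Random(1);
        double[] coeffs = new double[31];                        // a 30th-degree polynomial
        for (int i = 0; i < coeffs.length; i++) coeffs[i] = 2.0 * rnd.nextDouble() - 1.0;
        // Placeholder "solver" (just copies the array) so the harness runs on its own.
        System.out.printf("%.2f microseconds per call%n", measureMicros(c -> c.clone(), coeffs));
    }
}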
The source code of all these Java ports will soon appear on my website; I first want to clean up the code and add comments to make it easier to understand (although it still will not be easy to understand all of it). Using the code in your own Java programs, however, will be very easy.
[Edited on 19-3-14 by woelen]