This is the mail archive of the gsl-discuss@sources.redhat.com mailing list for the GSL project.
Re: minimization using the Brent algorithm (brent.c)
- From: Fabrice Rossi <rossi at ufrmd dot dauphine dot fr>
- To: Z F <mail4me9999 at yahoo dot com>
- Cc: gsl-discuss at sources dot redhat dot com
- Date: Mon, 17 Mar 2003 22:59:50 +0100
- Subject: Re: minimization using the Brent algorithm (brent.c)
- References: <20030317213750.19495.qmail@web20415.mail.yahoo.com>
Stated like this, I mostly agree with you, except maybe on vocabulary.
What is called the Brent algorithm or the golden section search is
exactly what is (currently) implemented in GSL. But as Michael and you
pointed out, in practical situations we need a way to start the
algorithm. The rationale for not including this in GSL (well, it once
was, in my first implementation of multidimensional minimization) is,
to my mind, that it is highly problem dependent. Nevertheless, this
discussion seems to show that a general solution (maybe suboptimal) is
needed.
By the way, a simple example of why this is problem dependent: take
multidimensional minimization. In general, you have to perform a line
search at each step. As you want something fully automated, in some
cases you have to provide the now famous middle point! For instance,
when you are working with conjugate gradient algorithms, you must have a
correct (not very precise, but still correct) estimation of the minimum
in the line search. It is a good idea to rely on the Brent algorithm to
do this line search, but as you have already evaluated the gradient at
the starting point, you can use this information to find the middle
point more efficiently than you could in the general case (in which you
cannot assume knowledge of the derivative). Moreover, in some particular
applications (for instance neural networks) and under particular
circumstances, you can tune your algorithm even further, for instance
by reusing results from previous line searches.
Fabrice Rossi