This is the mail archive of the gsl-discuss@sources.redhat.com mailing list for the GSL project.



Multivariate minimization


Dear friends,

I have some questions regarding the multivariate minimisation routines.
I have to minimise a highly non-linear function of a large number of
variables (about 5000). I want to use the vector BFGS method because I
have heard it is the best method for high-dimensional problems. Is that
correct?
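
For reference, this is roughly how I set the minimiser up at the moment
(only a sketch; my_f, my_df and my_fdf are placeholders for my own
negative log-likelihood code, and the step size, tolerance and iteration
limit are just values I picked):

#include <gsl/gsl_multimin.h>
#include <gsl/gsl_vector.h>

/* placeholders for my own log-likelihood code (the GSL minimiser works
   on a function to be minimised, so these return minus the loglik) */
double my_f (const gsl_vector *x, void *params);
void my_df (const gsl_vector *x, void *params, gsl_vector *g);
void my_fdf (const gsl_vector *x, void *params, double *f, gsl_vector *g);

int
minimise (gsl_vector *x, void *params, size_t n)
{
  gsl_multimin_function_fdf func;
  const gsl_multimin_fdfminimizer_type *T;
  gsl_multimin_fdfminimizer *s;
  size_t iter = 0;
  int status;

  func.n = n;                   /* about 5000 in my case */
  func.f = &my_f;
  func.df = &my_df;
  func.fdf = &my_fdf;
  func.params = params;

  T = gsl_multimin_fdfminimizer_vector_bfgs;
  s = gsl_multimin_fdfminimizer_alloc (T, n);

  /* initial step size 0.01, line search tolerance 0.1 */
  gsl_multimin_fdfminimizer_set (s, &func, x, 0.01, 0.1);

  do
    {
      iter++;
      status = gsl_multimin_fdfminimizer_iterate (s);
      if (status)
        break;
      /* stop when the gradient norm drops below 1e-4 */
      status = gsl_multimin_test_gradient (s->gradient, 1e-4);
    }
  while (status == GSL_CONTINUE && iter < 10000);

  gsl_vector_memcpy (x, gsl_multimin_fdfminimizer_x (s));
  gsl_multimin_fdfminimizer_free (s);
  return status;
}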

Secondly, I have a design question. The direction matrix in the BFGS
method is an approximation of the Hessian matrix (the matrix of second
derivatives), and at the optimum this approximation equals the true
Hessian. I am using the routine to maximise a log-likelihood function,
so I need the Hessian at the optimum, which means estimating 12502500
distinct elements. I would therefore like to ask whether it is possible
to somehow return the BFGS approximation of the Hessian at the end of
the optimisation procedure. This would save a lot of time and effort.
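
At the moment the only alternative I see is to rebuild the Hessian
myself from finite differences of the gradient at the optimum, along
these lines (a rough sketch; my_df and the step h are my own choices,
nothing GSL-specific):

#include <gsl/gsl_matrix.h>
#include <gsl/gsl_vector.h>

/* placeholder for my gradient function */
void my_df (const gsl_vector *x, void *params, gsl_vector *g);

/* Forward-difference estimate of the Hessian at the optimum xopt:
   H[i][j] ~ (g_j(xopt + h*e_i) - g_j(xopt)) / h, then symmetrised.
   This needs n+1 gradient evaluations, which for n = 5000 is exactly
   the expensive step I would like to avoid. */
void
hessian_fd (const gsl_vector *xopt, void *params, gsl_matrix *H, double h)
{
  size_t n = xopt->size;
  size_t i, j;
  gsl_vector *g0 = gsl_vector_alloc (n);
  gsl_vector *g1 = gsl_vector_alloc (n);
  gsl_vector *xp = gsl_vector_alloc (n);

  my_df (xopt, params, g0);

  for (i = 0; i < n; i++)
    {
      gsl_vector_memcpy (xp, xopt);
      gsl_vector_set (xp, i, gsl_vector_get (xopt, i) + h);
      my_df (xp, params, g1);

      for (j = 0; j < n; j++)
        gsl_matrix_set (H, i, j,
                        (gsl_vector_get (g1, j) - gsl_vector_get (g0, j)) / h);
    }

  /* symmetrise: H <- (H + H^T) / 2 */
  for (i = 0; i < n; i++)
    for (j = i + 1; j < n; j++)
      {
        double hij = 0.5 * (gsl_matrix_get (H, i, j)
                            + gsl_matrix_get (H, j, i));
        gsl_matrix_set (H, i, j, hij);
        gsl_matrix_set (H, j, i, hij);
      }

  gsl_vector_free (g0);
  gsl_vector_free (g1);
  gsl_vector_free (xp);
}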

The NAG routines have an option which allows the user to retrieve the
approximation of the Hessian.

My problem is that the estimate I obtain at the end of the optimisation
is sometimes not positive definite (negative definite when minimising)
because of limited numerical accuracy, whereas the approximation used
internally by the BFGS method is always positive definite.
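
For what it is worth, the way I notice the problem is by attempting a
Cholesky factorisation of the estimate (after flipping the sign when
maximising). If I read the documentation correctly,
gsl_linalg_cholesky_decomp reports GSL_EDOM when the matrix is not
positive definite, so something like this works as a check:

#include <gsl/gsl_matrix.h>
#include <gsl/gsl_linalg.h>
#include <gsl/gsl_errno.h>

/* Returns 1 if the symmetric matrix A is positive definite, 0 if not.
   On success A is overwritten by its Cholesky factor. */
int
is_positive_definite (gsl_matrix *A)
{
  int status;
  gsl_error_handler_t *old = gsl_set_error_handler_off ();
  status = gsl_linalg_cholesky_decomp (A);
  gsl_set_error_handler (old);
  return (status == GSL_SUCCESS);
}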

Thanks for your help,

Przem


