This is the mail archive of the gsl-discuss@sources.redhat.com mailing list for the GSL project.



Re: Help: working with large gsl_vector


Toan T Nguyen wrote:
Hi, thank you for the analysis. Apparently, my analytical calculation is able to guess the minimum reasonably well, so I need only about 20000 iterations to find it.

Nice!


However, the strange thing is that if I take the final minimized vector and run it through the conjugate gradient routine again, the energy (my function to minimize) keeps decreasing for a few thousand more iterations (although it changes only in the 4th significant digit). Do you know what's wrong?

It might be related to the restarting of the algorithm. A conjugate gradient method computes its descent direction from the gradient at the current point and from the previously used direction. The first descent direction is (minus) the gradient alone. When you restart the algorithm, it uses the gradient alone rather than the descent direction it would otherwise have built from its history.


I followed the example in the GSL documentation and set my success criterion as

status = gsl_multimin_test_gradient (s->gradient, 1e-20);

1e-20 is very, very small, maybe too close to machine precision to give reliable results. But I'm far from being an expert in this domain. What looks strange is that if the algorithm stops because of this criterion, I don't see why it can continue again afterwards. It would mean that the norm of the gradient goes up after having reached 1e-20 once. This might be related to numerical errors.


The initial setup of the vector is

T = gsl_multimin_fdfminimizer_conjugate_pr;
s = gsl_multimin_fdfminimizer_alloc (T, 3*nparts);
gsl_multimin_fdfminimizer_set (s, &energy, x, 0.001, 1e-12);

1e-12 is again quite small.


For clarity, I used dimensionless units so ALL the nearest-neighbor distances and the coupling constants are of the order of unity. nparts is the number of particles, which is about 100000.

Could you give some advice? Thanks a lot in advance.

Toan

I think that you are basically putting the code under some pressure, because of the size of your problem and the very low tolerances you are using. To my knowledge, it has not been tested under such constraints, so it is difficult to give sensible advice.


Fabrice


