Free Python optimization framework

Monday, December 3, 2007

constraint handling for the NLP/NSP solver ralg

I have implemented some changes to ralg; it can now handle constrained problems (as well as first derivatives df, dc, dh; subgradients are also allowed). Currently only the maximum residual is used, so the solver effectively works with the single constraint c(x) = max_{i,j} {0, c_i(x), |h_j(x)|}. I intend to add more accurate handling of constraints, especially box-bound and linear ones, in the future.
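To make the max-residual idea concrete, here is a minimal Python sketch of how such a single scalar constraint can be formed from inequality constraints c_i(x) <= 0 and equality constraints h_j(x) = 0 (the function and argument names are illustrative, not ralg's actual internals):

    def max_residual(x, c_funcs, h_funcs):
        # single scalar constraint: the worst violation among
        # 0, all c_i(x) (violated if positive) and all |h_j(x)|
        vals = [0.0]
        vals += [c(x) for c in c_funcs]
        vals += [abs(h(x)) for h in h_funcs]
        return max(vals)

A point x is then feasible (up to contol) exactly when max_residual(x) <= contol.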
Below you can see the same benchmark example as was published some days ago.
I tried reducing the gradtol value for ALGENCAN to achieve a better f_opt, but it yielded either an even larger value (30.72) or endless calculations (or at least very long ones; 1 minute was not enough).

One feature of the ralg constraints implementation: there is no need to calculate df at infeasible points (where the max constraint residual exceeds contol). And (as is common for non-smooth solvers) vice versa: there is no need to calculate constraint derivatives at feasible points.
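A hedged sketch of that dichotomy (max_residual, max_residual_subgradient and the contol handling here are placeholders for illustration, not the actual ralg source):

    def choose_subgradient(p, x, contol=1e-6):
        # at infeasible points only the constraints are consulted,
        # at feasible points only the objective is consulted
        if p.max_residual(x) > contol:
            return p.max_residual_subgradient(x)  # df is not evaluated here
        else:
            return p.df(x)  # constraint derivatives are not evaluated here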


Other ralg features:
  • can handle ill-conditioned, non-smooth, noisy problems
  • each iteration consumes 5*n^2 multiplication operations (float64*float64); there is another modification of r-alg that consumes only 4*n^2, which may be implemented in the future (see the sketch after this list)
  • one of r-alg's drawbacks is low precision (no better than ~1e-6 on a 32-bit architecture). Our department leader Petro I. Stetsyuk has an idea, implemented in a Fortran ralg version, that allows achieving precision up to 1e-40. Maybe it will be implemented in the Python ralg code in the future.
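For the 5*n^2 count above, here is a sketch of one r-algorithm step with space dilation in the direction of the difference of two successive subgradients (a simplified textbook form of Shor's r-algorithm, not the exact OpenOpt code; alpha is the dilation coefficient and hs the step size, both illustrative). Each of the five matrix-vector or rank-1 operations costs about n^2 float64 multiplications:

    import numpy as np

    def ralg_step(B, x, g, g_prev, alpha=2.0, hs=0.1):
        # B is the n x n space-transformation matrix (identity at start);
        # g, g_prev are subgradients at the current and previous points
        r = np.dot(B.T, g - g_prev)                    # 1st n^2: B^T (g - g_prev)
        xi = r / np.linalg.norm(r)
        Bxi = np.dot(B, xi)                            # 2nd n^2: B xi
        B = B + (1.0/alpha - 1.0) * np.outer(Bxi, xi)  # 3rd n^2: rank-1 dilation update
        Btg = np.dot(B.T, g)                           # 4th n^2: B^T g
        d = -np.dot(B, Btg / np.linalg.norm(Btg))      # 5th n^2: direction -B B^T g / ||B^T g||
        return B, x + hs * d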
