
CUTEst.jl and AutoDiff -- compute high-order derivatives #318

Closed
mahaa2 opened this issue Jan 2, 2024 · 10 comments


mahaa2 commented Jan 2, 2024

Hi, is there a way to use auto-diff within CUTEst.jl?

For example, when using something like

f(t) = -hprod(cute_model, x + u * t, u)
ForwardDiff.derivative(f, 0.0)

I get an error.

tmigot (Member) commented Jan 2, 2024

Hi @mahaa2!
I don't think it is possible, because the problems are coded in a different language.
However, several of the problems are available at https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl in ADNLPModels format.

mahaa2 (Author) commented Jan 2, 2024

Hi @tmigot,

Thanks for the reply.
For example, how could I load the model "WATSON" from the CUTE library in this ADNLP format?

Thanks.

tmigot (Member) commented Jan 2, 2024

Sure, no problem.

using ADNLPModels, OptimizationProblems

nlp = OptimizationProblems.ADNLPProblems.watson()
nlp = OptimizationProblems.ADNLPProblems.watson(type = Float32) # gives access to the model in single precision instead of `Float64`
nls = OptimizationProblems.ADNLPProblems.watson(use_nls = true)

The last call returns the problem in NLS (nonlinear least squares) format so you can access the residual directly.
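For illustration, a minimal sketch of accessing the residual of the NLS variant, assuming NLPModels is loaded for the residual API:

using ADNLPModels, NLPModels, OptimizationProblems

nls = OptimizationProblems.ADNLPProblems.watson(use_nls = true)
x = nls.meta.x0       # starting point stored with the model
r = residual(nls, x)  # residual vector F(x), where f(x) = ½‖F(x)‖²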

Feel free to have a look at the package documentation https://jso.dev/OptimizationProblems.jl/dev/. In particular, it shows how to run benchmarks, etc.

The aim is to add as many problems as possible there, but this is a slow process.

mahaa2 (Author) commented Jan 2, 2024

Hi @tmigot,

Thanks again.
But it seems that only a few of the CUTE models (from here https://www.cuter.rl.ac.uk//mastsif.html)
are available. I am not sure. At least this part

OptimizationProblems.ADNLPProblems.watson()

doesn't seem to give me any output, nor does it seem to be in the list of models ...

mahaa2 (Author) commented Jan 2, 2024

Another thing: this didn't run, for example. If I am making mistakes, let me know.

using OptimizationProblems
using ADNLPModels
using ForwardDiff
const FD = ForwardDiff  # short alias used below

nlp = OptimizationProblems.ADNLPProblems.chnrosnb_mod()

x = randn(nlp.meta.nvar);
v = randn(nlp.meta.nvar);

f(t) = -hprod(nlp, x + v * t, v)  # note: hprod is exported by NLPModels, not ADNLPModels
f(0.0)
FD.derivative(f, 0.0)

tmigot (Member) commented Jan 2, 2024

> Hi @tmigot,
>
> Thanks again. But it seems that only a few of the CUTE models (from here https://www.cuter.rl.ac.uk//mastsif.html) are available. I am not sure. At least this part
>
> OptimizationProblems.ADNLPProblems.watson()
>
> doesn't seem to give me any output, nor does it seem to be in the list of models ...

I am not sure I understand. I tried this with OptimizationProblems v0.7.3 and ADNLPModels v0.7.0:

julia> nlp = OptimizationProblems.ADNLPProblems.watson()
ADNLPModel - Model with automatic differentiation backend ADModelBackend{
  ForwardDiffADGradient,
  ForwardDiffADHvprod,
  EmptyADbackend,
  EmptyADbackend,
  EmptyADbackend,
  ForwardDiffADHessian,
  EmptyADbackend,
}
  Problem name: watson
   All variables: ████████████████████ 31     All constraints: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
            free: ████████████████████ 31                free: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
           lower: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                lower: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
           upper: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                upper: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
         low/upp: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0              low/upp: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
           fixed: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                fixed: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
          infeas: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0               infeas: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
            nnzh: (  0.00% sparsity)   496             linear: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
                                                    nonlinear: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
                                                         nnzj: (------% sparsity)

  Counters:
             obj: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                 grad: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                 cons: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
        cons_lin: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0             cons_nln: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                 jcon: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
           jgrad: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                  jac: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0              jac_lin: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
         jac_nln: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                jprod: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0            jprod_lin: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
       jprod_nln: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0               jtprod: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0           jtprod_lin: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
      jtprod_nln: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                 hess: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                hprod: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
           jhess: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0               jhprod: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0

In case computing the directional derivative of hprod doesn't work as you did, you can also access the objective function directly via nlp.f and then use ForwardDiff to compute all the derivatives that you need.
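For illustration, a minimal sketch of this workaround: nesting ForwardDiff.derivative over a scalar slice of nlp.f gives higher-order directional derivatives along a direction v.

using ADNLPModels, OptimizationProblems, ForwardDiff

nlp = OptimizationProblems.ADNLPProblems.watson()
x = nlp.meta.x0
v = randn(nlp.meta.nvar)

phi(t) = nlp.f(x + t * v)  # scalar slice of the objective along v

d1 = ForwardDiff.derivative(phi, 0.0)                                  # ∇f(x)ᵀv
d2 = ForwardDiff.derivative(t -> ForwardDiff.derivative(phi, t), 0.0)  # vᵀ∇²f(x)v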

Adding more problems to OptimizationProblems.jl is a work in progress and quite slow as we are understaffed.

Maybe @dpo or @abelsiqueira would have more insights.

tmigot changed the title from "CUTEst.jl and AutoDiff" to "CUTEst.jl and AutoDiff -- compute high-order derivatives" on Jan 2, 2024
mahaa2 (Author) commented Jan 2, 2024

I am getting an error if I load it in this manner:

using ADNLPModels
using OptimizationProblems

[screenshot of the error]

dpo (Member) commented Jan 3, 2024

@mahaa2 The code that Tangi shows also works for me on macOS with Julia 1.10.0 and

  [54578032] ADNLPModels v0.7.0
  [f6369f11] ForwardDiff v0.10.36
  [a4795742] NLPModels v0.20.0
  [5049e819] OptimizationProblems v0.7.3

Could you say what platform you are on, what version of Julia, and what versions of the packages you are using?

You have to add and use NLPModels in order to have access to hprod() (ADNLPModels does not reexport it).

However, what you’re trying to do isn’t currently working. We will investigate.
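For reference, a minimal sketch of the hprod call itself once NLPModels is loaded; it is the outer ForwardDiff.derivative on top of it that currently fails:

using ADNLPModels, NLPModels, OptimizationProblems

nlp = OptimizationProblems.ADNLPProblems.chnrosnb_mod()
x = randn(nlp.meta.nvar)
v = randn(nlp.meta.nvar)

Hv = hprod(nlp, x, v)  # Hessian-vector product ∇²f(x)v, exported by NLPModels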

mahaa2 (Author) commented Jan 3, 2024

I am using:

  • Ubuntu 22.04.3 LTS
  • Julia v1.6.7 and VS Code

I have just changed the package version of OptimizationProblems, and now it seems to work.

abelsiqueira (Member) commented

Looks like the issue was solved, so I am closing this. Feel free to reopen otherwise.
