NLPModels computes gradient with elements out of order #316
Comments
Is that with the JuMP model, or the ADNLPModel? Could you please give a snippet that reproduces the issue?
@dpo I don't know what NLPModels uses under the hood to compute the gradient, but to reproduce it:

```julia
using NLPModels, CUTEst
nlp = CUTEstModel("HS27")
grad(nlp, nlp.meta.x0)
```

A screenshot with the result is attached; the initial values are [2.0, 2.0, 2.0].
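As a cross-check independent of the SIF decoding, here is a minimal sketch assuming ADNLPModels.jl is available and the standard Hock-Schittkowski HS27 objective f(x) = 0.01(x1 - 1)^2 + (x2 - x1^2)^2; an ADNLPModel built directly from that objective returns the gradient in the natural variable order:

```julia
using ADNLPModels, NLPModels

# Standard HS27 objective (assumption; the constraint does not enter the objective gradient)
f(x) = 0.01 * (x[1] - 1)^2 + (x[2] - x[1]^2)^2

adnlp = ADNLPModel(f, [2.0, 2.0, 2.0])   # unconstrained model at x0 = [2.0, 2.0, 2.0]
grad(adnlp, adnlp.meta.x0)               # ≈ [16.02, -4.0, 0.0]
```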
@dpo this behaviour is strange because in some NLP problems the gradient is computed in the correct order.
Ok, you're in the wrong repository. You're talking about CUTEst.jl models. |
Hi @MaiconMares, what order were you expecting? The paper's, or the one used by other packages? Maybe it is related to the setup options that we use to read the CUTEst SIF file.
Hi @abelsiqueira, the correct order should be [16.02, -4.0, 0.0]. How can I change the way the CUTEst Julia package reads .SIF files?
See line 264 in 9467c8c and check the docs of the function.
Hey @MaiconMares, what is the reference you use to state what the "correct" formulation is?
Thank you @abelsiqueira, that was the problem. `CUTEstModel("HS8", lfirst=false, lvfirst=false)`
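A minimal sketch of the same fix applied to HS27 from this issue, assuming `lvfirst` controls whether linear variables are moved to the front when the SIF file is decoded:

```julia
using CUTEst, NLPModels

# Keep the SIF file's natural variable order instead of reordering
# linear variables to the front (assumption about what reorders HS27).
nlp = CUTEstModel("HS27", lvfirst=false)
g = grad(nlp, nlp.meta.x0)   # expected ≈ [16.02, -4.0, 0.0] at x0 = [2.0, 2.0, 2.0]
finalize(nlp)                # release the decoded problem before reopening it
```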
In some nonlinear problems the `grad()` function computes the gradient correctly but returns the partial derivatives out of order. For example, in the HS27 problem the correct result should be [16.02, -4.0, 0.0], but it returns [0.0, -4.0, 16.02]. I don't understand why this behaviour is happening.
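For reference, a hand check of the expected values, assuming the standard Hock-Schittkowski HS27 objective f(x) = 0.01(x1 - 1)^2 + (x2 - x1^2)^2 with x3 appearing only in the constraint:

```julia
# At x0 = [2.0, 2.0, 2.0]:
#   ∂f/∂x1 = 0.02*(x1 - 1) - 4*x1*(x2 - x1^2) = 0.02 + 16 = 16.02
#   ∂f/∂x2 = 2*(x2 - x1^2)                    = -4.0
#   ∂f/∂x3 = 0.0   (x3 does not appear in the objective)
x = [2.0, 2.0, 2.0]
g = [0.02*(x[1] - 1) - 4*x[1]*(x[2] - x[1]^2), 2*(x[2] - x[1]^2), 0.0]
# g == [16.02, -4.0, 0.0], i.e. the order reported as correct above.
```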