only_fgh! doesn't work if algorithm does not need Hessian #718
Comments
Funny you should open this, as another user has also just made me aware of this. I'll have to think about it, but it'll be fixed.
Just to clarify, would this allow for a
I think that should be the job of the constructor of a `TwiceDifferentiable` that takes in a `OnceDifferentiable`. I have a new version of Optim in the works and this part will be changed quite a bit, so I'm sort of only patching here.
Got it. Looking forward to the new design. 👍
You can be one of the lucky beta testers... I'll provide you with many internet points! (though they'll be Myspace ad credits...)
This seems like a problem in NLSolversBase.
Sorry everyone! Kept slipping out of my mind. Should work now. |
Thanks so much! I'll give it a whirl soon. |
I can use `Optim.only_fg!` with a gradient-free method. This can be convenient, in the sense that you only have to write `fg!` once and your code then works for all gradient-free and gradient-required algorithms, which can be useful e.g. for benchmarking purposes. As stated in the docs, a gradient-free algorithm can call `fg!` with `G === nothing` to request the function value only and not the gradient. However, this behavior is broken for higher derivatives.
Here `NelderMead()` needs neither the gradient nor the Hessian, yet `only_fgh!` fails. Instead I'd expect `NelderMead` to call `fgh!` with both `G === nothing` and `H === nothing`, to request the function value only, without the gradient or the Hessian.

Similarly, `LBFGS()` needs the gradient but not the Hessian, and again `only_fgh!` fails. In this case I'd expect `LBFGS` to call `fgh!` with `H === nothing`, to request the function value and the gradient but not the Hessian.
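The expected contract can be sketched as an `fgh!` that guards every derivative block, so each method can pass `nothing` for the arguments it does not need (a sketch using the same hypothetical Rosenbrock objective; whether `only_fgh!` actually forwards `nothing` this way for lower-order methods is exactly what this issue is about):

```julia
using Optim

function fgh!(F, G, H, x)
    # Hessian only when requested
    if H !== nothing
        H[1, 1] = 2.0 - 400.0 * x[2] + 1200.0 * x[1]^2
        H[1, 2] = -400.0 * x[1]
        H[2, 1] = -400.0 * x[1]
        H[2, 2] = 200.0
    end
    # Gradient only when requested
    if G !== nothing
        G[1] = -2.0 * (1.0 - x[1]) - 400.0 * x[1] * (x[2] - x[1]^2)
        G[2] = 200.0 * (x[2] - x[1]^2)
    end
    if F !== nothing
        return (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
    end
    return nothing
end

optimize(Optim.only_fgh!(fgh!), [0.0, 0.0], Newton())      # uses F, G, and H
optimize(Optim.only_fgh!(fgh!), [0.0, 0.0], LBFGS())       # expected: H === nothing
optimize(Optim.only_fgh!(fgh!), [0.0, 0.0], NelderMead())  # expected: G === H === nothing
```

With the behavior requested here, the last two calls would succeed by invoking `fgh!` with `nothing` for the unneeded derivatives.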