RFC: more generic convergence assessment #530
Conversation
I love that the commits have two versions of you as authors :) I like the change. Does this pass all tests locally @jonathanBieler?
Codecov Report
@@           Coverage Diff           @@
##           master     #530   +/-   ##
=======================================
  Coverage   89.87%   89.87%
=======================================
  Files          35       35
  Lines        1777     1777
=======================================
  Hits         1597     1597
  Misses        180      180

Continue to review full report at Codecov.
I don't know why there are two of me there... tests are passing, yes; I had a small mistake in the PR.
Thanks, this is the right way to go. Will all the g_residual methods still be necessary now? (Those for the nondifferentiable cases.)
At some point we could also redo the convergence assessment to make it more flexible for the user.
@@ -13,6 +13,14 @@ update_h!(d, state, method::SecondOrderOptimizer) = hessian!(d, state.x)

after_while!(d, state, method, options) = nothing

initial_convergence(d, state, method, initial_x, options) = false
initial_convergence(d, state, method::ZerothOrderOptimizer, initial_x, options) = false
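For context on the hook introduced here: initial_convergence lets a method declare convergence at the starting point before the main loop runs, and the two fallbacks above simply return false. As a rough, hedged sketch of how a method defined outside Optim might overload it (the type name MySolver is made up; only the hook's signature and the g_residual(gradient(d)) ≤ options.g_tol pattern come from this diff, and it assumes Optim's gradient!, gradient and g_residual helpers are reachable from outside the package):

# Hypothetical first-order method defined outside Optim; the type name is an
# assumption, only the hook's signature comes from the diff above.
struct MySolver <: Optim.FirstOrderOptimizer end

# Report convergence at the starting point when the initial gradient already
# meets the tolerance, instead of always returning false.
function Optim.initial_convergence(d, state, method::MySolver, initial_x, options)
    Optim.gradient!(d, initial_x)
    Optim.g_residual(Optim.gradient(d)) ≤ options.g_tol
end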
Is this used anywhere?
converged = x_converged || f_converged || g_converged

return x_converged, f_converged, g_converged, converged, f_increased
end

gradient_convergence_assessment(state::AbstractOptimizerState, d, options) = g_residual(gradient(d)) ≤ options.g_tol
gradient_convergence_assessment(state::ZerothOrderState, d, options) = false
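These two methods pull the gradient-based check out of the generic convergence assessment, so a state type that has no gradient can simply opt out. A minimal, hedged sketch of how an external solver could plug in its own stopping measure instead (MyState and its radius field are hypothetical, not Optim API; reusing options.g_tol as the tolerance is just one possible choice, and the result presumably feeds the g_converged flag above):

# Hypothetical state for a derivative-free solver with its own stopping measure.
struct MyState{T} <: Optim.AbstractOptimizerState
    x::Vector{T}
    radius::T   # e.g. the size of the region the solver is currently searching
end

# Treat the custom measure the way the default method treats the gradient
# residual: converged once it drops below the tolerance.
Optim.gradient_convergence_assessment(state::MyState, d, options) =
    state.radius ≤ options.g_tol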
Is this used anywhere?
Yes, in Jonathan's solver outside of Julia :) He wants to make some ZerothOrderSolvers outside of Optim but hook into everything else in Optim.
@jonathanBieler could you add a few tests? Just to make sure everything works as expected.
I've added a couple of tests. I had to make a DummyMethod type and change a few things; since those were not used anywhere, I assumed that's fine.
I've tried to implement some of the changes discussed in #309, which allow my two minimal examples of a ZerothOrderOptimizer and a FirstOrderOptimizer defined outside of Optim to run.
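For readers who want to do the same: the sketch below is not the PR's actual test code, just a hedged illustration of the zeroth-order case. The names SimpleSearch and SimpleSearchState are made up, it assumes ZerothOrderOptimizer and ZerothOrderState are abstract types that can be subtyped from outside the package, and a complete solver would still have to implement the rest of Optim's internal interface (initial_state, update_state!, and so on).

using Optim

# Hypothetical solver and state defined outside of Optim.
struct SimpleSearch <: Optim.ZerothOrderOptimizer end

mutable struct SimpleSearchState{T} <: Optim.ZerothOrderState
    x::Vector{T}
end

# The fallbacks added in this PR already return false for both hooks, so these
# overloads are redundant; they are written out only to show where a custom
# solver would plug in its own logic.
Optim.initial_convergence(d, state, method::SimpleSearch, initial_x, options) = false
Optim.gradient_convergence_assessment(state::SimpleSearchState, d, options) = false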