
RFC: more generic convergence assessment #530
Merged: 3 commits into JuliaNLSolvers:master on Feb 15, 2018

Conversation

jonathanBieler (Contributor)

I've tried to implement some of the changes discussed in #309, which allow my two minimal examples of a ZerothOrderOptimizer and a FirstOrderOptimizer defined outside of Optim to run.
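As a rough illustration of what that enables (hypothetical type names, not code from this PR), an external solver starts by subtyping Optim's abstract optimizer and state types so the generic convergence code added here can dispatch on it; its own state initialization and update steps are omitted:

    # Hypothetical sketch, not code from this PR: a zeroth-order method defined
    # outside Optim, hooking into Optim's generic machinery by subtyping its
    # abstract optimizer/state types.
    using Optim

    struct ExternalSearch <: Optim.ZerothOrderOptimizer end

    mutable struct ExternalSearchState{Tx,T} <: Optim.ZerothOrderState
        x::Tx            # current iterate
        x_previous::Tx   # previous iterate (assumed to be read by the x/f checks)
        f_x_previous::T  # previous objective value (same assumption)
    end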

pkofod (Member) commented Feb 5, 2018

I love that the commits have two versions of you as authors :)

I like the change. Does this pass all tests locally for you?

codecov bot commented Feb 5, 2018

Codecov Report

Merging #530 into master will not change coverage.
The diff coverage is 100%.


@@           Coverage Diff           @@
##           master     #530   +/-   ##
=======================================
  Coverage   89.87%   89.87%           
=======================================
  Files          35       35           
  Lines        1777     1777           
=======================================
  Hits         1597     1597           
  Misses        180      180
Impacted Files Coverage Δ
src/multivariate/optimize/optimize.jl 93.18% <100%> (-0.3%) ⬇️
src/utilities/assess_convergence.jl 100% <100%> (ø) ⬆️
...c/multivariate/solvers/zeroth_order/nelder_mead.jl 78.01% <100%> (+0.15%) ⬆️

Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update ede177e...58d1952.

jonathanBieler (Contributor, Author)

I don't know why there are two of me there... Yes, tests are passing; I had a small mistake in the PR.

anriseth (Contributor) left a comment

Thanks, this is the right way to go. Will all the g_residual methods still be necessary now? (Those for the nondifferentiable cases)

At some point we could also redo the convergence assessment to make it more flexible for the user.

@@ -13,6 +13,14 @@ update_h!(d, state, method::SecondOrderOptimizer) = hessian!(d, state.x)

after_while!(d, state, method, options) = nothing

initial_convergence(d, state, method, initial_x, options) = false
initial_convergence(d, state, method::ZerothOrderOptimizer, initial_x, options) = false
anriseth (Contributor)

Is this used anywhere?
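For illustration, a hypothetical example of specializing this hook from outside Optim (MyGradientMethod is a made-up name; the check mirrors the gradient_convergence_assessment line below and assumes the gradient of d has already been evaluated at initial_x):

    # Hypothetical example, not part of the PR: a first-order method defined
    # outside Optim declares convergence at the starting point when the initial
    # gradient is already below the tolerance.
    struct MyGradientMethod <: Optim.FirstOrderOptimizer end

    Optim.initial_convergence(d, state, method::MyGradientMethod, initial_x, options) =
        Optim.g_residual(Optim.gradient(d)) ≤ options.g_tol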

converged = x_converged || f_converged || g_converged

return x_converged, f_converged, g_converged, converged, f_increased
end

gradient_convergence_assessment(state::AbstractOptimizerState, d, options) = g_residual(gradient(d)) ≤ options.g_tol
gradient_convergence_assessment(state::ZerothOrderState, d, options) = false
anriseth (Contributor)

Is this used anywhere?

pkofod (Member)

Yes, in Jonathan's solver outside of Optim :) He wants to make some ZerothOrderSolvers outside of Optim but hook into everything else in Optim.
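Continuing the hypothetical ExternalSearch sketch from above, that works out of the box: because the state subtypes Optim.ZerothOrderState, dispatch picks the gradient-free fallback and the objective is never asked for a gradient.

    # Hypothetical usage, not code from this PR.
    st = ExternalSearchState([0.0, 0.0], [0.0, 0.0], 0.0)
    d  = Optim.NonDifferentiable(x -> sum(abs2, x), [0.0, 0.0])
    Optim.gradient_convergence_assessment(st, d, Optim.Options())  # returns false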

pkofod (Member) commented Feb 8, 2018

@jonathanBieler could you add a few tests? Just to make sure everything works as expected.

jonathanBieler (Contributor, Author)

I've added a couple of tests. I had to make a DummyMethod type and change a few things; since those were not used anywhere, I assumed that's fine.
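For a sense of what such a test can look like (illustrative only; the actual tests added in the PR may differ, and the dummy type names are made up):

    # Illustrative sketch, not the PR's actual test code.
    using Optim, Test

    struct DummyZerothMethod <: Optim.ZerothOrderOptimizer end
    struct DummyZerothState  <: Optim.ZerothOrderState end

    @testset "generic convergence hooks" begin
        d = Optim.NonDifferentiable(x -> sum(abs2, x), [1.0, 1.0])
        @test Optim.initial_convergence(d, DummyZerothState(), DummyZerothMethod(),
                                        [1.0, 1.0], Optim.Options()) == false
        @test Optim.gradient_convergence_assessment(DummyZerothState(), d,
                                                    Optim.Options()) == false
    end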

pkofod merged commit 0d304a1 into JuliaNLSolvers:master on Feb 15, 2018.