GSoC 2018 proposals #523
MathProgBase is on its way out; MathOptInterface is the new one.
Parameter estimation of differential equations is a good source of optimization problems as well: http://docs.juliadiffeq.org/latest/analysis/parameter_estimation.html. But I think that this project would have to be merged with (2). You should probably have something harder set up as a backup to do alongside this.
Do they have any reliable dates for when MOI gets nonlinear programming support?
Great :)
Yeah, I don't have much experience with GSoC, so others will have to advise on more appropriate scaling of the difficulty and size of projects.
I would say it's always good to have a hard (algorithm) problem and an easy (benchmarking/testing) problem, so you can pivot between them as necessary. This pairing should exist per student IMO.
I should maybe mention that we actually have a repo for MathProgBase integration already, though it's of course of limited importance without constraints.
Now that I think about it, this is nonlinear systems and not optimization. Do you want to include nonlinear systems as well in this project?
I'll try to think of more ideas...
I actually have a Broyden implementation, I just haven't pushed it to NLsolve yet, but nonlinear systems are at least as interesting as optimization for a GSoC proposal from my point of view. Remember, you're also more than welcome to suggest that your students participate @cortner ;)
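For background, Broyden's method replaces the exact Jacobian in Newton's method for F(x) = 0 with a rank-one update. A generic sketch under that textbook definition (not the implementation mentioned above):

```julia
using LinearAlgebra

# Broyden's ("good") method for F(x) = 0: keep a Jacobian approximation J
# and correct it with a rank-one update after every step.
function broyden(F, x; tol = 1e-10, maxiter = 100)
    J = Matrix{Float64}(I, length(x), length(x))  # crude initial Jacobian guess
    Fx = F(x)
    for _ in 1:maxiter
        norm(Fx) < tol && break
        dx = -(J \ Fx)                                       # quasi-Newton step
        x = x + dx
        Fx_new = F(x)
        J = J + (Fx_new - Fx - J * dx) * dx' / dot(dx, dx)   # rank-one update
        Fx = Fx_new
    end
    return x
end

# Roots of this small system are at (1, 1) and (-1, -1):
broyden(x -> [x[1]^2 + x[2]^2 - 2, x[1] - x[2]], [1.5, 0.5])
```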
If you're talking about equality constraints, then I think the manifold interface is perfectly fine: just compute the projection onto {Ax=b}. Inequality constraints are a whole other thing though. Things I'd like to see in the Optimverse (not particularly formatted for a GSoC):
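As an aside, the projection onto {Ax = b} mentioned above has a closed form when A has full row rank; a minimal sketch:

```julia
# Orthogonal (Euclidean) projection onto the affine set {x : A*x = b},
# assuming A has full row rank: P(x) = x - A'*(A*A')^{-1}*(A*x - b).
project_affine(A, b, x) = x - A' * ((A * A') \ (A * x - b))
```

A manifold implementation would presumably cache a factorization of A*A' rather than re-solving it on every call.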
@ChrisRackauckas's suggestion of a pairing between "hard" (implementation of new things) and "easy" (benchmarking/refactoring) seems like a good one. There are some TODOs on manifold optimization if anyone's interested, but that's more niche.
Non-monotone linesearch
I know what you're getting at, but can I ask how you would prefer a call to a line search procedure to look? Just curious as to what API you have in mind.
Ideally, linesearch(alpha_guess, phi, phi_p, phi_pp), where phi(alpha) is defined as f(x + alpha*d). Caching could be hidden from the linesearches by having the functions phi, phi_p, and phi_pp memoize their results.
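A hypothetical sketch of that interface (all names illustrative, not from LineSearches.jl): phi memoizes its last evaluation, and the linesearch itself never sees f, x, or d.

```julia
using LinearAlgebra

# Build phi(alpha) = f(x + alpha*d) and its derivative phi_p(alpha) = g(x + alpha*d)'d,
# with phi remembering its last evaluated point.
function make_phi(f, g, x, d)
    last_alpha = Ref(NaN)
    last_val = Ref(NaN)
    phi = function (alpha)
        if alpha != last_alpha[]
            last_alpha[] = alpha
            last_val[] = f(x .+ alpha .* d)
        end
        last_val[]
    end
    phi_p(alpha) = dot(g(x .+ alpha .* d), d)
    return phi, phi_p
end

# A plain backtracking (Armijo) linesearch written against that interface.
function linesearch(alpha_guess, phi, phi_p; c = 1e-4, rho = 0.5)
    phi0, dphi0 = phi(0.0), phi_p(0.0)
    alpha = alpha_guess
    while phi(alpha) > phi0 + c * alpha * dphi0  # sufficient decrease condition
        alpha *= rho
    end
    return alpha
end

# Usage with f(x) = sum(x.^2), g(x) = 2x:
phi, phi_p = make_phi(x -> sum(abs2, x), x -> 2 .* x, [1.0, 2.0], [-1.0, -2.0])
linesearch(1.0, phi, phi_p)  # returns 1.0; the full step minimizes phi here
```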
What are phi_p and phi_pp? Edit: maybe I should clarify: I'm not trying to argue against anything. I've been thinking about an improved API myself, so I'm curious as to what you have in mind.
I think that's already implemented
Probably first and second derivatives along the step-size parameter.
If you're okay with …
I was not clear, sorry: looking at https://github.com/JuliaNLSolvers/LineSearches.jl/blob/master/src/static.jl there are lines like …
I think this sort of change will make the linesearches more easily usable from other packages.
Refactoring line searches as functions of \phi(\alpha) also aligns better with the line searches implemented in #303 (and hopefully merged into Optim within the month).
Yeah, currently the linesearches are tied into the "Optimverse", but if we could accommodate all our needs in Optim and still provide an easy-to-use API for "other people", that would be the best. I don't see what the trouble is here, though. Caching is done at the f level as you say @antoine-levitt, but that is fine, right? This just means that if you use an NDifferentiable underneath, you get some extra benefits. I think this is doable and should be done, we've just not done it yet.
What I'm saying is basically that in Optim, phi would be defined in terms of the NDifferentiable.
Ah, that's much simpler than what I had in mind. Then yes this is a very nice API. |
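To illustrate the idea from the exchange above, here is a hypothetical sketch (the glue function `phis` is my own name; `value!`, `gradient!`, and `gradient` are NLSolversBase accessors as I understand them) where the objective object's own cache does the memoization:

```julia
using NLSolversBase, LinearAlgebra

# phi and phi_p evaluate through a OnceDifferentiable, so linesearches
# stay oblivious to x, d, and the underlying objective.
function phis(df::OnceDifferentiable, x, d)
    x_new = similar(x)
    phi(alpha) = (x_new .= x .+ alpha .* d; value!(df, x_new))
    function phi_p(alpha)
        x_new .= x .+ alpha .* d
        gradient!(df, x_new)
        dot(gradient(df), d)   # directional derivative along d
    end
    return phi, phi_p
end

# Usage with f(x) = sum(x.^2):
df = OnceDifferentiable(x -> sum(abs2, x), [0.0, 0.0]; autodiff = :forward)
phi, phi_p = phis(df, [1.0, 2.0], [-1.0, -2.0])
phi(0.5), phi_p(0.5)   # (1.25, -5.0)
```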
@pkofod and I thought it would be useful to brainstorm about possible GSoC 2018 projects.
If anybody has ideas for projects, or feedback on the points below, that would be great.
Constrained optimization
Implement better support for constrained optimization in Optim, and possibly connect it to MathProgBase.
Main goal: Implement a first-order interior point method.
By the summer, we should have the interior-point Newton algorithm written by Tim Holy implemented in the Optim universe, so I envision that this project can build on that work.
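For context, the standard log-barrier formulation that interior-point methods are built on (textbook background, not a description of the specific algorithm above): for inequality constraints c_i(x) \ge 0, one solves a sequence of smooth unconstrained subproblems

\min_x \; f(x) - \mu \sum_i \log c_i(x), \qquad \mu \to 0^+,

driving \mu toward zero. A first-order variant would apply gradient-based steps to each barrier subproblem instead of Newton steps.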
Secondary goal: Set up a MathProgBase interface, and benchmark the algorithms against Ipopt (or similar).
Benchmarking / parameter tuning
Implement a type of "parameter tuning/recommendation" software to work on top of the algorithms, in the spirit of what @cortner proposed. Potential research for inspiration is that of Philippe Toint. See, for example, Section 3.2 in http://perso.fundp.ac.be/~phtoint/pubs/NTR-06-2015.pdf
Secondary goal: Create a benchmarking suite that makes it easy to discover regressions and aid the development of new algorithms.
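A minimal sketch of what such a suite's core loop might look like (the problem list, tolerance, and output format are made up for illustration):

```julia
using Optim

# Hypothetical mini-suite: each problem is (name, objective, start, known minimum).
problems = [
    ("rosenbrock", x -> (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2,
     [-1.2, 1.0], 0.0),
    ("quadratic",  x -> sum(abs2, x), [3.0, -4.0], 0.0),
]

methods = [NelderMead(), LBFGS(), GradientDescent()]

for (name, f, x0, fmin) in problems, m in methods
    res = optimize(f, x0, m)
    # Flag regressions: did we reach the known minimum to tolerance?
    ok = abs(Optim.minimum(res) - fmin) < 1e-6
    println(rpad(name, 12), rpad(string(typeof(m).name.name), 18),
            "f_calls = ", Optim.f_calls(res), "  ok = ", ok)
end
```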
Exotic optimization examples
I think it could be interesting to create multiple "exotic" optimization examples that show the flexibility of Optim in handling input types. Easy examples such as RecursiveArrays, complex numbers, and manifolds can be a starting point. A cool goal could be to optimize over a function space, as represented with, for example, ApproxFuns. I'm sure difficulties will arise that will require us to change the code somehow, such as considerations of inner products other than \ell_2. (Is this sufficient for GSoC?)
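As one concrete illustration of the complex-number case, a sketch (the gradient convention follows the usual identification of C^n with R^{2n}; details may differ from Optim's documented complex support):

```julia
using Optim

# Complex least squares: minimize ‖A*z - b‖² over z in C².
A = [2.0 + 1.0im  1.0 + 0.0im;
     0.5 + 0.0im  1.0 - 2.0im]
b = [1.0 + 0.0im, 0.0 - 1.0im]

f(z) = sum(abs2, A * z - b)

# Gradient with respect to the real and imaginary parts, packed into
# a complex array: for this f it is 2 * A' * (A*z - b).
g!(G, z) = (G .= 2 .* (A' * (A * z - b)))

res = optimize(f, g!, zeros(ComplexF64, 2), ConjugateGradient())
```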