GSoC 2018 proposals #523

Closed · anriseth opened this issue Feb 1, 2018 · 19 comments

@anriseth (Contributor) commented Feb 1, 2018

@pkofod and I thought it would be useful to brainstorm about possible GSoC 2018 projects.
If anybody has ideas for projects or feedback on the points below that would be great.

Constrained optimization
Implement better support for constrained optimization in Optim and possibly connect it up against MathProgBase.
Main goal: Implement a first-order interior point method.
By the summer, the interior-point Newton algorithm written by Tim Holy should be implemented in the Optim universe, so I envision this project building on that work.
Secondary goal: Set up a MathProgBase interface, and benchmark the algorithms against Ipopt (or similar).
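
To make the main goal concrete, here is a minimal log-barrier sketch of a first-order interior-point iteration for min f(x) subject to c(x) .> 0 (all names are illustrative; a real implementation needs feasibility safeguards and proper stopping rules):

using LinearAlgebra: norm

# Gradient descent on the log-barrier objective f(x) - mu * sum(log, c(x)),
# shrinking the barrier parameter mu between inner solves.
function barrier_descent(g, c, Jc, x; mu=1.0, step=1e-2, iters=500)
    for _ in 1:10                  # outer loop: shrink mu
        for _ in 1:iters           # inner loop: plain gradient steps
            grad = g(x) - Jc(x)' * (mu ./ c(x))
            norm(grad) < sqrt(mu) && break
            x -= step * grad
        end
        mu /= 10
    end
    return x
end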

Benchmarking / parameter tuning
Implement a type of "parameter tuning/recommendation" software to work on top of the algorithms, in the spirit of what @cortner proposed. Philippe Toint's research is a potential source of inspiration; see, for example, Section 3.2 of http://perso.fundp.ac.be/~phtoint/pubs/NTR-06-2015.pdf
Secondary goal: Create a benchmarking suite that makes it easy to discover regressions and aid the development of new algorithms.

Exotic optimization examples
I think it could be interesting to create multiple "exotic" optimization examples that show off the flexibility of Optim in handling different input types. Easy cases such as RecursiveArrays, complex numbers, and manifolds can be a starting point. A cool goal could be to optimize over a function space, represented with, for example, ApproxFun. I'm sure difficulties will arise that require us to change the code somehow, such as accounting for inner products other than ℓ₂. (Is this sufficient for GSoC?)
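
For the manifolds case, the sort of example I have in mind (a sketch using Optim's manifold support; untested):

using Optim, LinearAlgebra

# Smallest eigenvalue of a symmetric matrix: minimize the quadratic form
# x'Ax over the unit sphere instead of calling an eigensolver.
A = Diagonal(1.0:5.0)
f(x) = dot(x, A * x)
g!(G, x) = (G .= 2 .* (A * x))
x0 = normalize(ones(5))
res = optimize(f, g!, x0, GradientDescent(manifold=Optim.Sphere()))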

@ChrisRackauckas (Contributor):

> Implement better support for constrained optimization in Optim and possibly connect it up against MathProgBase.

MathProgBase is out. MathOptInterface is the new one.

> Exotic optimization examples

Parameter estimation of differential equations is a good source of optimization problems as well.

http://docs.juliadiffeq.org/latest/analysis/parameter_estimation.html
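
A minimal version of that pipeline might look roughly like this (a sketch following the linked docs, using DiffEqParamEstim.jl; untested here):

using DifferentialEquations, DiffEqParamEstim, Optim

# Fit the growth rate p of u' = p*u to noisy data, then hand the resulting
# loss function to Optim; keyword arguments are forwarded to the ODE solver.
f(u, p, t) = p[1] * u
prob = ODEProblem(f, 1.0, (0.0, 1.0), [1.5])
t = collect(range(0.0, 1.0, length=20))
data = exp.(1.5 .* t) .+ 0.01 .* randn(20)
loss = build_loss_objective(prob, Tsit5(), L2Loss(t, data); saveat=t, verbose=false)
res = optimize(loss, [1.0], BFGS())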

But I think that this project would have to be merged with (2). You should probably have something harder set up as a backup to do alongside this.

@anriseth (Contributor, Author) commented Feb 1, 2018

> MathProgBase is out. MathOptInterface is the new one.

Do they have any reliable dates for when MOI gets nonlinear programming support?

> Parameter estimation of differential equations is a good source of optimization problems as well.

Great :)

> But I think that this project would have to be merged with (2). You should probably have something harder set up as a backup to do alongside this.

Yeah, I don't have much experience with GSoC, so others will have to advise on the appropriate scaling of the difficulty and size of projects.

@ChrisRackauckas (Contributor):

I would say it's always good to have a hard (algorithm) problem and an easy (benchmarking/testing) problem, so you can pivot between them as necessary. This pairing should exist per student, IMO.

@pkofod (Member) commented Feb 1, 2018

I should maybe mention that we actually have a repo for MathProgBase integration already, though it is of course of limited importance without constraints.

@cortner (Contributor) commented Feb 1, 2018

  • Integrate Hessian-free solvers, such as Newton-Krylov and Broyden, into Optim.jl (I've ported a Newton-Krylov solver to Julia, but it isn't integrated with Optim, and I haven't got around to Broyden; a bare-bones Broyden sketch is at the end of this comment).

Now that I think about it, this is nonlinear systems and not optimisation. Do you want to include nonlinear systems in this project as well?

  • Design a type system that handles different kinds of constraints appropriately. E.g., Antoine's manifold optimisation will work very well for "simple" constraints where the projection is more or less explicit. Should affine constraints be handled by his system, or treated by a yet simpler mechanism? More complex constraints can be treated via interior point or augmented Lagrangian methods, etc.

I'll try to think of more ideas . . .
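
For concreteness on the first point, a bare-bones "good Broyden" iteration for solving F(x) = 0 (illustrative only; a real NLsolve method needs line searches, restarts, and convergence diagnostics):

using LinearAlgebra

function broyden(F, x; tol=1e-8, maxiter=100)
    J = Matrix{Float64}(I, length(x), length(x))    # initial Jacobian estimate
    Fx = F(x)
    for _ in 1:maxiter
        norm(Fx) < tol && break
        dx = -(J \ Fx)                              # quasi-Newton step
        x = x + dx
        Fnew = F(x)
        J += (Fnew - Fx - J * dx) * dx' / dot(dx, dx)   # rank-one secant update
        Fx = Fnew
    end
    return x
end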

@pkofod (Member) commented Feb 1, 2018

I actually have a Broyden implementation, I just haven't pushed it to NLsolve yet, but nonlinear systems are at least as interesting as optimization for a GSoC proposal from my point of view. Remember, you're also more than welcome to suggest that your students participate, @cortner ;)

@antoine-levitt (Contributor):

> Design a type system that handles different kinds of constraints appropriately. E.g., Antoine's manifold optimisation will work very well for "simple" constraints where the projection is more or less explicit. Should affine constraints be handled by his system, or treated by a yet simpler mechanism? More complex constraints can be treated via interior point or augmented Lagrangian methods, etc.

If you're talking about equality constraints, then I think the manifold interface is perfectly fine: just compute the projection onto {Ax = b}. Inequality constraints are a whole other thing, though.
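
Concretely, a hypothetical affine "manifold" for Optim's interface might look like this (a sketch; retract! maps a point onto {Ax = b}, project_tangent! projects gradients onto the null space of A):

using Optim, LinearAlgebra

struct AffineSet{TA,Tb} <: Optim.Manifold
    A::TA
    b::Tb
end
# Orthogonal projection onto {x : A*x = b}: x - A'*(A*A')^{-1}*(A*x - b)
Optim.retract!(M::AffineSet, x) = (x .-= M.A' * ((M.A * M.A') \ (M.A * x .- M.b)); x)
# Tangent vectors lose their component in the row space of A
Optim.project_tangent!(M::AffineSet, g, x) = (g .-= M.A' * ((M.A * M.A') \ (M.A * g)); g)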

Things I'd like to see in the Optimverse (not particularly formatted for a GSoC):

  • Line searches interface refactor, so LineSearches.jl only works with a 1D function
  • Consistent audit of type specificity
  • Comparative benchmark with a state-of-the-art optimization package for unconstrained optimization, and battle-testing / fine-tuning of the best performing methods in Optim, culminating in a decision tree (possibly automated) for solver selection
  • Quasi-Newton (Broyden/Newton-Krylov) methods in NLsolve, and the previous item applied to NLsolve
  • Merges between Optim and other libraries for unconstrained optimization (BlackBoxOptim/NLOpt?)
  • Investigation of large-scale parallelism (is it just a matter of plugging in DistributedArrays or something or does Optim need to do something specific?)
  • Automatic multi-precision (e.g. start optimizing with Float32 and then switch to Float64 / BigFloat according to the user-specified tolerance; see the sketch after this list)
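
On the multi-precision point, a crude version can already be expressed by warm-starting across precisions, e.g. (untested):

using Optim

rosenbrock(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
# Coarse solve in Float32, then warm-start the Float64 solve from its minimizer
res32 = optimize(rosenbrock, Float32[-1.2, 1.0], LBFGS(), Optim.Options(g_tol=1f-3))
res64 = optimize(rosenbrock, Float64.(Optim.minimizer(res32)), LBFGS(), Optim.Options(g_tol=1e-12))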

@ChrisRackauckas suggestion of a pairing between "hard" (implementation of new things) and "easy" (benchmarking/refactoring) seems like a good one.

There are some TODOs on manifold optimization if anyone's interested, but that's more niche.

@cortner (Contributor) commented Feb 7, 2018

Non-monotone linesearch

@pkofod (Member) commented Feb 7, 2018

> Line searches interface refactor, so LineSearches.jl only works with a 1D function

I know where/what you're getting at, but can I ask how you would prefer a call to a line search procedure to look? Just curious what API you have in mind.

@antoine-levitt (Contributor):

Ideally, linesearch(alpha_guess, phi, phi_p, phi_pp), where phi is defined as f(x + alpha d). Caching could be hidden from the line searches by having the functions phi{,p,pp} memoize their results.

@pkofod (Member) commented Feb 7, 2018

What are phi_p and phi_pp?

edit: Maybe I should clarify: I'm not trying to argue against anything. I've been thinking about an improved API myself, so I'm curious what you have in mind.

@cortner (Contributor) commented Feb 7, 2018

> Caching could be hidden from the line searches by having the functions phi{,p,pp} memoize their results.

I think that's already implemented

> What are phi_p and phi_pp?

Probably the first and second derivatives with respect to the step-size parameter.
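
If that's right, then e.g. an Armijo backtracking in the proposed style would only ever touch those 1-D functions (hypothetical sketch):

# phi(a) = f(x + a*d); phi_p(a) is its derivative in the step size a
function backtracking(alpha, phi, phi_p; c=1e-4, rho=0.5)
    phi0, slope0 = phi(0.0), phi_p(0.0)
    while phi(alpha) > phi0 + c * alpha * slope0    # sufficient-decrease test
        alpha *= rho                                # shrink the step
    end
    return alpha
end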

@pkofod (Member) commented Feb 7, 2018

If you're okay with the line search being a callable type, then it's sort of already implemented close to that way, I think.

@antoine-levitt (Contributor):

I was not clear, sorry: looking at https://github.com/JuliaNLSolvers/LineSearches.jl/blob/master/src/static.jl there are lines like x_scratch .= x .+ alpha.*s. Ideally this should not be the case, because line searches are about one thing, sort-of-minimizing phi(alpha), and it doesn't matter that phi(alpha) is defined as f(x + alpha s). So a cleaner interface would pass not f (and its derivatives) but phi (and its derivatives). This causes some concern for caching, which is done at the level of f, but that is solvable. Writing this, I realize it may be wasteful, because then x_scratch .= x .+ alpha.*s needs to be done multiple times (unless that is cached too). I am used to thinking of arithmetic as cheap and function evaluations as expensive, but that might not always be the case. In the end it's probably more trouble than it's worth, forget I said anything ;-)

@cortner (Contributor) commented Feb 7, 2018

I think this sort of change would make the line searches more easily usable from other packages.

@anriseth (Contributor, Author) commented Feb 7, 2018

Refactoring the line searches as functions of phi(alpha) also aligns better with the line searches implemented in #303 (which will hopefully be merged into Optim within the month).

@pkofod (Member) commented Feb 7, 2018

Yeah, currently the line searches are tied into the "Optimverse", but if we could accommodate all our needs in Optim and still provide an easy-to-use API for "other people", that would be the best.

I don't see what the trouble is here, though. Caching is done at the f level as you say @antoine-levitt, but that is fine, right? This just means that if you use an NDifferentiable underneath, you get some extra benefits. I think this is doable and should be done; we've just not done it yet.

@pkofod (Member) commented Feb 7, 2018

What I'm saying is basically that, in Optim, phi would be defined in terms of the NDifferentiable:

od = OnceDifferentiable(f, x)          # the NDifferentiable caches objective (and gradient) evaluations
# x_scratch, x and s live out here in some form (maybe in a struct or similar)
function myphi(alpha)
    x_scratch .= x .+ alpha .* s       # trial point along the search direction s
    value!(od, x_scratch)              # cached objective value at the trial point
end
linesearch(alpha_guess, myphi)
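
And if the repeated x .+ alpha.*s is a worry, myphi could be wrapped in a small memoizer (hypothetical sketch) so re-evaluations at the same alpha cost nothing:

# Wrap a 1-D function so consecutive calls with the same alpha reuse the result
function memoize1(phi)
    last_alpha, last_value = Ref(NaN), Ref(NaN)
    return function (alpha)
        if alpha != last_alpha[]
            last_alpha[] = alpha
            last_value[] = phi(alpha)
        end
        last_value[]
    end
end
linesearch(alpha_guess, memoize1(myphi))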

@antoine-levitt (Contributor):

Ah, that's much simpler than what I had in mind. Then yes, this is a very nice API.

pkofod closed this as completed May 23, 2019