Documentation: inconsistency in whether constraints are zero #460
Hi @timholy!
But doesn't the fundamental problem represented by NLPModels say that it solves …? Is there the equivalent of …?
It solves `c_L ≤ c(x) ≤ c_U`. cc @tmigot
Yes, you have this information in `cons(nlp, x)[nlp.meta.jfix] ≈ nlp.meta.lcon[nlp.meta.jfix]` or `cons(nlp, x)[nlp.meta.jfix] ≈ nlp.meta.ucon[nlp.meta.jfix]`.
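For illustration (not from the thread), here is a minimal runnable sketch of that check. The toy objective, constraint, and starting point are made up; `lcon = ucon = [1.0]` marks the single constraint as an equality with a nonzero right-hand side:

```julia
using ADNLPModels, NLPModels

# Toy model (hypothetical): minimize sum of squares subject to x1 + x2 + x3 = 1,
# written with the right-hand side kept at 1, so lcon = ucon = [1.0].
nlp = ADNLPModel(x -> sum(x .^ 2), [0.7, 0.2, 0.1],
                 x -> [x[1] + x[2] + x[3]], [1.0], [1.0])

x = nlp.meta.x0
nlp.meta.jfix                                               # indices of equality constraints (lcon .== ucon)
cons(nlp, x)[nlp.meta.jfix] ≈ nlp.meta.lcon[nlp.meta.jfix]  # true at this feasible x
```

The last line evaluates to `true` because `x0` satisfies the equality as declared, even though the raw constraint value is `1.0`, not `0.0`.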
I agree with you.
There seems to be an inconsistency in the docs. Below the equation you quote, we state that the constraints are `c_i = 0`.
Sorry, just circling back to this. @dpo I guess your point is that, given the desirability of (notationally) unifying equalities and inequalities, the better approach is to drop the guarantee of `c_i = 0` for equality constraints. I've edited the title of this issue accordingly. Perhaps someone with appropriate privileges should label it a documentation issue and transfer it to NLPModels instead?
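For reference, a sketch (my paraphrase, not the docs' exact wording) of the unified formulation, in which an equality constraint is simply one whose lower and upper bounds coincide and need not be zero:

```latex
\min_{x \in \mathbb{R}^n} \ f(x)
\quad \text{subject to} \quad
c_L \le c(x) \le c_U, \qquad \ell \le x \le u
% an equality constraint c_i(x) = b_i corresponds to (c_L)_i = (c_U)_i = b_i.
```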
@timholy If you prefer the right-hand side of your equality constraints to be zero, it’s enough to move the constant into the constraint and set the right-hand side to zero explicitly when you define the model. In other words, the safe thing to do is to always evaluate `cons(nlp, x)` and compare it against `nlp.meta.lcon` and `nlp.meta.ucon`.
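A minimal sketch of such a check; the helper name `feasible` and the tolerance are made up for illustration, and `nlp` can be any `AbstractNLPModel`:

```julia
using NLPModels

# Check feasibility against the declared bounds rather than against zero.
function feasible(nlp, x; atol = 1e-8)
    c = cons(nlp, x)
    all(nlp.meta.lcon .- atol .<= c .<= nlp.meta.ucon .+ atol) &&
        all(nlp.meta.lvar .- atol .<= x .<= nlp.meta.uvar .+ atol)
end
```

Usage: `feasible(nlp, nlp.meta.x0)`.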
Right. To be clear, the way this came up was using OptimizationProblems.jl as a repository of test problems for an external optimizer, and then being surprised that the solution wasn't as stated in the documentation. It's really just a documentation issue, but one that's a significant "gotcha" for external consumers of your interface.
Thanks for bringing this to our attention!
This eliminates the inconsistency in `c_i = 0` vs `c_L <= c_i <= c_U` notes in JuliaSmoothOptimizers#460. It also:
- adds `Manifest.toml` to `.gitignore`
- standardizes on `y` rather than `λ` for Lagrange multipliers
- clarifies or polishes wording in various places
- adds `NLPModels` to `docs/Project.toml` (this is standard and might allow simplification of your `workflow`s)
Consider HS62, for which the constraint is `x₁ + x₂ + x₃ - 1 = 0`
and the initial value is
`x0 = [0.7, 0.2, 0.1]`
which sums to 1. Thus you expect the initial constraint value to be zero. However, the constraint evaluated at `x0` does not come out to zero. My suspicion is that the bug is in this package, but I'm unsure. Could it be neglecting to incorporate the constant offset in the JuMP model?
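A sketch of how one might check this, assuming the JuMP-based OptimizationProblems.jl API and NLPModelsJuMP for the conversion; exact function and package names may differ across versions:

```julia
using OptimizationProblems, NLPModelsJuMP, NLPModels

nlp = MathOptNLPModel(hs62())   # wrap the JuMP model as an NLPModel (assumed API)
c0  = cons(nlp, nlp.meta.x0)    # constraint value at x0 = [0.7, 0.2, 0.1]
c0 .- nlp.meta.lcon             # ≈ 0 componentwise if x0 satisfies the equality as declared
```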