Added problem hessians and tests #104
Conversation
Hm. For some reason your git is messing with the file permissions. I haven't seen that issue before, but have a look here. Maybe that can fix it.
Can I ask where you can see that there is a problem? 1) so I can see the error message, and 2) so I can see whether it is fixed if I try to fix it.
If you take a look here https://github.com/JuliaOpt/Optim.jl/pull/104/files, then after each file name it says 100644 → 100755, which shows the change in file permissions.
I have no idea why that happened. I just chmod'ed, committed, and pushed. Now it should be fine, right?
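For future reference, a minimal sketch of that fix using Julia's built-in `chmod` (the file list below is a placeholder, not the actual files touched in this PR; afterwards you commit and push the mode change as usual):

```julia
# Placeholder file list -- not the actual files from this PR.
for f in ["src/optimize.jl", "test/problems.jl"]
    chmod(f, 0o644)  # rw-r--r--: drop the executable bit git recorded as 100755
end
```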
Looks much better.
```
@@ -77,70 +87,75 @@ examples["Fletcher-Powell"] = OptimizationProblem("Fletcher-Powell",
    fletcher_powell,
    fletcher_powell_gradient!,
    fletcher_powell_hessian!,
    [0.0, 0.0, 0.0],
    [[0.0, 0.0, 0.0]], # TODO: Fix
    [-1.0, 0.0, 0.0], # Same as in source
```
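As a usage sketch (not part of the diff): running a derivative-free method on this problem, with the field names and the `UnconstrainedProblems` module path assumed rather than taken from the PR:

```julia
using Optim

# Assumed field names (f, initial_x); see the conversation below for why
# a gradient-based method may be less safe on this problem.
prob = Optim.UnconstrainedProblems.examples["Fletcher-Powell"]
res = optimize(prob.f, prob.initial_x, NelderMead())
println(Optim.minimizer(res))  # should approach (1, 0, 0), per the discussion below
```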
So this works now with Newton's method?
No, because the Newton tests are only run for problems with `twicedifferentiablefunction == true`.
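For concreteness, a hedged sketch of what that gating looks like in a test loop (only the flag name comes from the comment above; the `examples` dict, field names, and the modern `optimize` signature are assumptions):

```julia
using Optim

# Sketch only: `examples` maps names to problem objects with fields
# f, g!, h!, initial_x, and the twicedifferentiablefunction flag.
for (name, prob) in examples
    prob.twicedifferentiablefunction || continue  # skip problems without Hessians
    res = optimize(prob.f, prob.g!, prob.h!, prob.initial_x, Newton())
    @assert Optim.converged(res) "Newton failed on $name"
end
```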
Optim.nelder_mead finds the optimum, but I'm not sure what would happen with a method based on the gradient or Hessian. The gradient isn't really well defined at the optimum, I guess, since d atan(y/x)/dx is 1/(x^2/y + y), the solution is (x, y, z) = (1, 0, 0), and you would also have a 1/y-ish term in the Hessian.
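For reference, writing that derivative out (plain chain rule; it matches the expression above up to sign):

```latex
\frac{\partial}{\partial x}\,\arctan\!\left(\frac{y}{x}\right)
  = \frac{1}{1+(y/x)^{2}}\cdot\left(-\frac{y}{x^{2}}\right)
  = -\frac{y}{x^{2}+y^{2}}
  = -\frac{1}{x^{2}/y + y}, \qquad y \neq 0.
```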
Minor nit: we don't put spaces around `^`.
I've changed the `^` spacing locally, and it'll be fixed when I push next time.
So I changed the `^` spacing, and I also changed the initial value from [2.8, 1.8] to [2.0, 1.5], just to start a bit further away from [3, 2]. To be honest, I don't think it makes sense to think too much about what the starting values should be in these test problems until the problem with starting in concave regions is resolved. Whenever I start in a concave region, I get errors. I wouldn't expect that to happen, but I have to read a bit more into how Optim's Newton algorithm is globalized.
I don't think there's anything special done to globalize Newton's method.
Okay, so in that case I guess it actually should throw an error if I start it in a concave region, right? I mean, the direction is going to be uphill, and without a correction to the Hessian, the line search isn't going to help.
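A tiny self-contained illustration of that point (toy numbers, nothing from Optim): with an indefinite Hessian, the unmodified Newton step can point uphill, so no step length along it can produce descent.

```julia
using LinearAlgebra

g = [1.0, 0.0]              # gradient at the current point
H = [-2.0 0.0; 0.0 1.0]     # indefinite Hessian: one concave direction
d = -(H \ g)                # pure Newton step, no Hessian modification
# dot(g, d) > 0 means d points uphill; a line search along d cannot help.
println(dot(g, d) > 0 ? "uphill step" : "descent step")
```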
@johnmyleswhite Do you have further comments here? If not, I think it can be merged when the commits have been squashed.
No more comments. Merge ready now for sure. |
Commits: permission change; Changed space around ^
Squashed and pushed... Let's see what Travis has to say!
@johnmyleswhite Can you push the button? I'm not authorized to do that.
Added problem hessians and tests
PR as a follow-up to #101, where I messed up the git history.