
Improve obj and grad! #375

Merged
2 commits merged into JuliaSmoothOptimizers:main from obj_grad_optimized on Aug 26, 2024

Conversation

amontoison
Member

No description provided.

@amontoison amontoison requested a review from dpo August 21, 2024 20:29

codecov bot commented Aug 21, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 77.49%. Comparing base (f2ad6a4) to head (e8c16d2).
Report is 73 commits behind head on main.

Additional details and impacted files
@@             Coverage Diff             @@
##             main     #375       +/-   ##
===========================================
- Coverage   89.11%   77.49%   -11.63%     
===========================================
  Files           5        8        +3     
  Lines         790     1524      +734     
===========================================
+ Hits          704     1181      +477     
- Misses         86      343      +257     

☔ View full report in Codecov by Sentry.

@amontoison
Member Author

Memory leak in cutest_cigr_...

Please submit a bug report with steps to reproduce this fault, and any error messages that follow (in their entirety). Thanks.
Exception: EXCEPTION_ACCESS_VIOLATION at 0x6c301825 -- .text at D:\a\CUTEst.jl\CUTEst.jl\deps\files\libHS6_double.DLL (unknown line)
in expression starting at D:\a\CUTEst.jl\CUTEst.jl\test\nlpmodelstest.jl:1
.text at D:\a\CUTEst.jl\CUTEst.jl\deps\files\libHS6_double.DLL (unknown line)
.text at D:\a\CUTEst.jl\CUTEst.jl\deps\files\libHS6_double.DLL (unknown line)
cutest_cigr_ at D:\a\CUTEst.jl\CUTEst.jl\deps\files\libHS6_double.DLL (unknown line)
cutest_cigr_ at D:\a\CUTEst.jl\CUTEst.jl\src\libcutest.jl:411
cigr at D:\a\CUTEst.jl\CUTEst.jl\src\core_interface.jl:3312 [inlined]
grad! at D:\a\CUTEst.jl\CUTEst.jl\src\julia_interface.jl:32

src/julia_interface.jl

function NLPModels.obj(nlp::CUTEstModel{T}, x::AbstractVector) where T
  @lencheck nlp.meta.nvar x
  x_ = Vector{T}(x)
Member

Maybe this could be preallocated in the problem structure?

@amontoison (Member Author) Aug 21, 2024

Yes, we could, but it's the same issue with all routines.
I'm wondering whether we shouldn't just remove the dispatch for input vectors of the wrong type.
If the CUTEstModel is in double precision, the user should provide input arrays in double precision.
What do you think?

Otherwise, we need to duplicate storage in the CUTEstModel.
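
For reference, a minimal sketch of the preallocation idea. It assumes a hypothetical buffer field workspace_nvar::Vector{T} stored in the CUTEstModel; no such field exists in the current struct.

function NLPModels.obj(nlp::CUTEstModel{T}, x::AbstractVector) where T
  @lencheck nlp.meta.nvar x
  # Reuse the hypothetical preallocated buffer instead of allocating Vector{T}(x) on every call.
  copyto!(nlp.workspace_nvar, x)
  # Dispatches to the StrideOneVector{T} method discussed later in this thread,
  # which performs the actual CUTEst call.
  return obj(nlp, nlp.workspace_nvar)
end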

Member

Most of Julia dispatches that way. It's useful because we routinely call problem functions with views.

Member Author

Views are handled by the other method with x::StrideOneVector{T}.

A method like this is only used when x is a Vector{Float32} or a CuVector{Float64} while nlp is a CUTEstModel{Float64}.
But should we handle that case?
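
To make the dispatch concrete, here is a small usage sketch; the problem name is arbitrary, and the method split follows the description above (stride-one inputs hit the specialized method, everything else falls back to the converting AbstractVector method).

using CUTEst, NLPModels

nlp = CUTEstModel("ROSENBR")      # a CUTEstModel{Float64}
x64 = copy(nlp.meta.x0)           # Vector{Float64}
v = view(x64, 1:nlp.meta.nvar)    # contiguous view, still stride one

obj(nlp, x64)                     # StrideOneVector{Float64} method, no conversion
obj(nlp, v)                       # views are handled by the same method
obj(nlp, Float32.(x64))           # AbstractVector fallback: allocates a Vector{Float64} copy

finalize(nlp)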

function NLPModels.grad!(nlp::CUTEstModel{T}, x::AbstractVector, g::AbstractVector) where T
  @lencheck nlp.meta.nvar x g
  x_ = Vector{T}(x)
  g_ = Vector{T}(g)
Member

Preallocate this too?
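
For reference, a hedged sketch (not necessarily the exact code merged here) of how such a fallback can convert the inputs, call the existing stride-one method, and copy the result back into g:

function NLPModels.grad!(nlp::CUTEstModel{T}, x::AbstractVector, g::AbstractVector) where T
  @lencheck nlp.meta.nvar x g
  x_ = Vector{T}(x)    # convert the input point to the model's precision
  g_ = Vector{T}(g)    # temporary gradient buffer in the model's precision
  grad!(nlp, x_, g_)   # dispatches to the StrideOneVector{T} method
  g .= g_              # copy the gradient back into the caller's array
  return g
end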

@amontoison amontoison merged commit 217a575 into JuliaSmoothOptimizers:main Aug 26, 2024
12 of 15 checks passed
@amontoison amontoison deleted the obj_grad_optimized branch August 26, 2024 17:29