Pros and cons of using other interfaces inside julia_interface.jl
#42
Perhaps we could just run some benchmarks on a few large problems to determine what the penalty would be.
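A minimal sketch of such a measurement, assuming BenchmarkTools.jl and the GENROSE instance used later in this thread; `obj` is the NLPModels call, and the specialized `ufn(n, x)` signature is an assumption based on the calls shown further down:

```julia
# Sketch only, not code from this repository: compare the julia (NLPModels)
# interface against the specialized interface on a large problem.
using BenchmarkTools, CUTEst, NLPModels

nlp = CUTEstModel("GENROSE", "param", "N=10000")  # large instance, as used below
x = nlp.meta.x0
try
  println(@benchmark obj($nlp, $x))                # julia (NLPModels) interface
  println(@benchmark ufn($(nlp.meta.nvar), $x))    # specialized interface (assumed signature)
finally
  finalize(nlp)
end
```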
I'm wondering if the core interface should be exported at all. Most users will use the julia interface (I think) and if you don't want to work with NLPModels, the specialized interface is as friendly as it could be. What do you think? As a side comment, the non…
I think not exporting it is best. Also, the specialized interface has a reasonable number of functions defined.
Maybe we should remove some? I feel using … This might take some time.
Ok, I agree with not exporting the core interface. Do you think a user is likely to use the specialized interface with nlp? If they use an nlp, won't they just use the julia interface? This is by no means urgent.
There might be (hopefully insignificant) speed differences between julia and specialized, but I think the main reason to choose one over the other is the specific functions CUTEst provides, like …
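For instance (the specific function named here was lost in extraction; `uofg` is one that appears later in this thread), the specialized interface exposes CUTEst-style calls directly, such as evaluating the objective and gradient together. The return shape shown below is an assumption:

```julia
# Sketch only: the exact return values of the specialized uofg are assumed here.
using CUTEst

nlp = CUTEstModel("DIXMAANJ")          # problem name borrowed from the benchmarks below
x = nlp.meta.x0
try
  f, g = uofg(nlp.meta.nvar, x, true)  # objective and gradient in one CUTEst call
  println(f)
finally
  finalize(nlp)
end
```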
We should benchmark the different interfaces, but I agree that the difference is probably not significant. I'm happy to expose the Julia and specialized interfaces.
I made some benchmarks on the NLPModels interface vs the specialized interface (here). Do you think we should invest more time in this? The NLPModels interface is already using …
Did you by any chance plot a flame graph to try and see where the difference comes from? Would it help if the Julia interface called the specialized interface?
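One way to collect such a flame graph, assuming ProfileView.jl is available (Profile is a standard library); the problem name and the `obj` call are taken from the benchmarks below:

```julia
# Sketch: profile the julia-interface objective call and view the flame graph.
using Profile, ProfileView
using CUTEst, NLPModels

nlp = CUTEstModel("DIXMAANJ")
x = nlp.meta.x0
try
  obj(nlp, x)                 # warm up so compilation is not profiled
  Profile.clear()
  @profile for _ in 1:100_000
    obj(nlp, x)
  end
  ProfileView.view()          # opens the flame graph
finally
  finalize(nlp)
end
```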
Okay, so, investigating did help. For both problems, … What's strange, though, is the difference in …
I didn't find anything very useful with the flame graph. Creating array-like variables (…)
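If the truncated remark above refers to the small wrapper arrays (`io_err = Cint[0]`, `f = Cdouble[0]`, and so on) that the core-interface calls build on every evaluation, their cost can be measured in isolation. A sketch under that assumption:

```julia
# Sketch: measure only the per-call construction of the 1-element wrapper arrays
# used by the core interface (this is my reading of the truncated comment above).
using BenchmarkTools

t = @benchmark begin
  io_err = Cint[0]       # status flag
  n = Cint[3000]         # dimension wrapped as a 1-element array
  f = Cdouble[0]         # objective value output
  grad = Cint[0]         # gradient flag
end
println(t)
```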
I suppose the problem is that …
It's possible that there is a slowdown because of type instability, I'll have to check, but for larger problems the times of equivalent functions are equivalent. Try the following after the fix on uofg:

```julia
using BenchmarkTools
using CUTEst

function foo()
  nlp = CUTEstModel("DIXMAANJ")
  try
    x = nlp.meta.x0

    # julia (NLPModels) interface
    t = @benchmark objcons($nlp, $x)
    println(t)

    # core interface
    io_err = Cint[0]
    n = Cint[nlp.meta.nvar]
    f = Cdouble[0]
    g = Cdouble[]
    grad = Cint[0]
    t = @benchmark uofg($io_err, $n, $x, $f, $g, $grad)
    println(t)

    # specialized interface
    n = nlp.meta.nvar
    t = @benchmark uofg($n, $x, false)
    println(t)
  finally
    finalize(nlp)
  end
end
foo()
```

However, for some problems:

```julia
using BenchmarkTools
using CUTEst

function foo()
  nlp = CUTEstModel("GENROSE", "param", "N=10000")
  try
    io_err = Cint[0]
    n = Cint[nlp.meta.nvar]
    x = nlp.meta.x0
    f = Cdouble[0]
    g = Cdouble[]
    grad = Cint[0]

    # core interface: objective only, then objective and gradient
    t = @benchmark ufn($io_err, $n, $x, $f)
    println(t)
    t = @benchmark uofg($io_err, $n, $x, $f, $g, $grad)
    println(t)
  finally
    finalize(nlp)
  end
end
foo()
```
I just noticed that the in-place functions in the specialized interface don't return anything. I believe that's an error, e.g.,

```julia
function usetup(input::Int, out::Int, io_buffer::Int, n::Int)
  x = Array{Cdouble}(n)
  x_l = Array{Cdouble}(n)
  x_u = Array{Cdouble}(n)
  usetup!(input, out, io_buffer, n, x, x_l, x_u)
end

function usetup!(input::Int, out::Int, io_buffer::Int, n::Int,
                 x::Vector{Float64}, x_l::Vector{Float64}, x_u::Vector{Float64})
  io_err = Cint[0]
  usetup(io_err, Cint[input], Cint[out], Cint[io_buffer], Cint[n], x, x_l, x_u)
  @cutest_error
  return x, x_l, x_u
end
```

instead of what we have now. Also, …
Better yet would be to phase out either the specialized or core interface. It's too much code to maintain and it's essentially duplicated.
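A hedged usage sketch of the pair proposed above, to show why returning the arrays matters; the unit numbers and dimension are placeholders, not values CUTEst.jl necessarily uses:

```julia
# Sketch only: input/out/io_buffer are hypothetical Fortran unit numbers and n a
# placeholder dimension; the point is the calling convention, not the values.
input, out, io_buffer, n = 42, 6, 11, 1000

# Allocating convenience form: get the arrays back directly.
x, x_l, x_u = usetup(input, out, io_buffer, n)

# In-place form: fill caller-provided storage and, with the fix above, still
# return the arrays so the call can be destructured or chained.
xs, xls, xus = zeros(n), zeros(n), zeros(n)
x, x_l, x_u = usetup!(input, out, io_buffer, n, xs, xls, xus)
```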
I agree on phasing out the specialized interface. The core interface is just a wrapper for ccalls using the internal …
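To make the "thin wrapper" point concrete, a rough sketch of what such a core-interface function looks like; the symbol name `cutest_uofg_`, the `"libcutest"` handle, and the exact argument types are assumptions for illustration, not the repository's actual code:

```julia
# Illustrative only: a core-interface function as a direct ccall into the CUTEst
# library. Symbol name, library handle, and types are assumed, not verified.
function uofg(io_err::Vector{Cint}, n::Vector{Cint}, x::Vector{Cdouble},
              f::Vector{Cdouble}, g::Vector{Cdouble}, grad::Vector{Cint})
  ccall((:cutest_uofg_, "libcutest"), Cvoid,
        (Ptr{Cint}, Ptr{Cint}, Ptr{Cdouble}, Ptr{Cdouble}, Ptr{Cdouble}, Ptr{Cint}),
        io_err, n, x, f, g, grad)
  return nothing
end
```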
Can this be closed now?
Yes, thank you.
We need to evaluate the pros and cons of using the core or specialized interfaces inside julia_interface.jl. From #40: …