What would be good ways to store and persist a GP optimized with GP.jl? The purpose would be to make predictions with mean and standard deviation at a later time than training.
Hi @mhangaard, this is something I've thought about before using BSON.jl.
It's certainly something that would be useful in the package; however, I'm not sure I personally have the time at the moment, so I've left it as an open issue.
If you keep hold of the training data and the kernel specification, all you need is the vector of parameters output by get_params. Then you would just re-create the GP object and feed the stored parameters to set_params!. Anything beyond that (such as some kind of serialisation scheme for the kernel and mean specification) would need some serious thought.
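To make that concrete, here is a minimal sketch of the parameter-only approach, assuming you still have the training data `x`, `y` and know the kernel/mean specification. It stores the hyperparameter vector from `get_params` in a plain text file and restores it with `set_params!`; the call to `GaussianProcesses.update_target!` is my assumption about refreshing the cached quantities after resetting parameters, so double-check against the package internals.

```julia
using GaussianProcesses, DelimitedFiles

# Train as usual (x, y assumed to be your training data)
gp = GP(x, y, MeanZero(), SE(0.0, 0.0), -1.0)
optimize!(gp)

# Persist only the optimised hyperparameter vector
writedlm("gp_params.txt", GaussianProcesses.get_params(gp))

# Later: re-create the GP with the SAME data and kernel specification,
# then restore the stored hyperparameters
params = vec(readdlm("gp_params.txt"))
gp2 = GP(x, y, MeanZero(), SE(0.0, 0.0), -1.0)
GaussianProcesses.set_params!(gp2, params)
GaussianProcesses.update_target!(gp2)  # assumed: refresh cached log-likelihood etc.
```

The fragile part, as noted above, is that nothing in `gp_params.txt` records *which* kernel and mean were used; that specification has to be kept alongside the data by hand.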
One option for this @mhangaard is to use JLD.jl. A minimal working example would look something like:
using Random, Distributions, GaussianProcesses, JLD
Random.seed!(13579) # Set the seed using the 'Random' package
n = 20; # number of training points
x = 2π * rand(n); # predictors
y = sin.(x) + 0.05*randn(n); # regressors
# Select mean and covariance function
mZero = MeanZero() # Zero mean function
kern = SE(0.0,0.0) # Squared exponential kernel
logObsNoise = -1.0 # log standard deviation of observation noise
gp = GP(x,y,mZero,kern,logObsNoise) # Fit the GP
optimize!(gp) # Optimise the parameters
From there the model can be saved with save("gp_model.jld", "gp", gp) and subsequently loaded back in with gp = load("gp_model.jld", "gp").
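Since the original question was about making predictions with mean and standard deviation after reloading, here is a short sketch of how that might look with the JLD round trip. `predict_y` returns the predictive mean and variance (so take the square root for a standard deviation); the file name and test grid are just for illustration.

```julia
using GaussianProcesses, JLD

# Reload the trained model saved earlier
gp = load("gp_model.jld", "gp")

# Predict at new inputs; predict_y returns mean and variance
xtest = collect(range(0, stop=2π, length=50))
mu, sigma2 = predict_y(gp, xtest)
sigma = sqrt.(sigma2)  # predictive standard deviation
```

This only works if the loading session has the same package versions available, since JLD serialises the Julia types themselves; for long-term storage the parameter-vector approach above is more robust.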