Batches of Initial Conditions #533
Comments
As per the … In your code, from what I can understand (it would be helpful to see how …):
```julia
n_samples = 20
ic = randn(n_samples, 3)
prob = ODEProblem(lorenz!, ic[1, :], tspan)
prob_func(prob, i, repeat) = remake(prob, u0 = ic[i, :])
ensemble_prob = EnsembleProblem(prob, prob_func = prob_func)
sim = solve(ensemble_prob, Tsit5(), EnsembleThreads(), saveat = t_vals, trajectories = n_samples)
u_vals = Array(sim[1:end])
```

I made the following changes to my code after reading your suggestion:

```julia
function predict_traj(pp, init_c)
    Array(solve(prob_nn, Tsit5(), u0 = init_c, p = pp, saveat = t_vals))
end
```
```julia
function loss_batch(pp, batch_ic, batch_traj)
    N_ic = size(batch_traj)[3]
    sum_loss = 0.0
    for k in 1:N_ic
        pred = predict_traj(pp, batch_ic[:, k])
        sum_loss += sum(abs2, batch_traj[:, :, k] .- pred)
    end
    sum_loss
end
```
```julia
b_s = 5
train_loader = Flux.Data.DataLoader((u_vals[:, 1, :], u_vals), batchsize = b_s, shuffle = false)
numEpochs = 100

optfun = OptimizationFunction((θ, p, batch_ic, batch_traj) -> loss_batch(θ, batch_ic, batch_traj),
                              Optimization.AutoZygote())
optprob = OptimizationProblem(optfun, p_M)

using IterTools: ncycle
res1 = Optimization.solve(optprob, Optimisers.ADAM(0.05), ncycle(train_loader, numEpochs),
                          callback = callback)
```

This trains successfully. There is still some redundancy in passing both … I suppose it makes more sense to split … Nevertheless, I wonder whether it would be a good idea for …
I have been trying to adapt this minibatching tutorial, but using batches of initial conditions rather than batches over time. I was able to come up with this rather inelegant method:

The `Optimization.solve` call returns successfully, but I am passing two copies of `u_vals` into `Flux.Data.DataLoader` and disregarding the second one in the definition of the `OptimizationFunction`.

My original approach was to define the `DataLoader` with only one array:

but this gets me the following error:

I am able to evaluate my loss function using the `DataLoader`, but `Optimization.solve` is not able to handle this. Is there a way to define batches over initial conditions without passing two copies to `DataLoader`?
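For what it's worth, a single-array `DataLoader` yields each batch as a bare array, batched along the last dimension, rather than wrapped in a tuple, which is plausibly what trips up `Optimization.solve`'s argument handling. A dependency-free mimic of that batching behavior (the helper name `batch_last` is made up for illustration):

```julia
# Hypothetical helper mimicking DataLoader's batching over the last dimension.
batch_last(data, bs) =
    (selectdim(data, ndims(data), i:min(i + bs - 1, size(data)[end]))
     for i in 1:bs:size(data)[end])

u_demo = randn(3, 50, 20)                       # state × time × trajectory
sizes = [size(b) for b in batch_last(u_demo, 5)]
# each of the 4 batches keeps all states and times and takes 5 trajectories
```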