Scheduling within models #41
Hi @tomhosking,

Thank you for opening this issue. Yes indeed, some models may need several warmup steps during training. To allow such behavior, the current training epoch is automatically passed to the model by the trainer (see benchmark_VAE/src/pythae/trainers/base_trainer/base_trainer.py, lines 473 to 475 at commit 0f5c0cc).

The current epoch can be retrieved in the model with epoch = kwargs.pop("epoch") and then used in your scheduling logic. Note that you will need to install the library from source:

$ git clone https://github.com/clementchadebec/benchmark_VAE.git
$ cd benchmark_VAE
$ pip install -e .

As an example, you can check the implementation of the disentangled_beta_vae, which uses a specific KL scheduling procedure. I will think of a way to include this for other models as well.

Best,
Clément
Original issue from @tomhosking:

Some models make use of some sort of scheduling or annealing internally (e.g. KL warmup or temperature annealing) based on the current step index. What is the correct way to implement this within pythae?
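The epoch-based scheduling described in the reply can be sketched as follows. This is a minimal, self-contained illustration of the pattern, not pythae's actual API: the kl_weight function, the warmup_epochs parameter, and the linear ramp are assumptions made for the example, and SchedulableLoss is a hypothetical stand-in for a model's loss computation.

```python
def kl_weight(epoch, warmup_epochs=10):
    # Assumed linear KL warmup: ramp the weight from 0 to 1
    # over the first warmup_epochs, then hold it at 1.
    return min(1.0, epoch / warmup_epochs)


class SchedulableLoss:
    """Hypothetical stand-in for a model's loss computation.

    Mimics the pattern from the reply: the trainer injects the
    current epoch as a keyword argument, and the model pops it
    with kwargs.pop("epoch") to drive its internal schedule.
    """

    def loss(self, recon_loss, kld, **kwargs):
        # Retrieve the epoch passed in by the trainer
        # (defaulting to 0 if it was not provided).
        epoch = kwargs.pop("epoch", 0)
        beta = kl_weight(epoch)
        # Annealed ELBO: reconstruction term plus weighted KL term.
        return recon_loss + beta * kld


model_loss = SchedulableLoss()
early = model_loss.loss(1.0, 2.0, epoch=0)   # beta = 0: reconstruction only
late = model_loss.loss(1.0, 2.0, epoch=20)   # beta = 1: full ELBO
```

With this shape, any annealing scheme (sigmoid, cyclical, temperature decay) can be swapped in by replacing kl_weight, without touching the trainer.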