This repository has been archived by the owner on Nov 3, 2022. It is now read-only.

Add AdaMod optimizer #531

Open
mffigueroa opened this issue Jan 31, 2020 · 0 comments · May be fixed by #532

Comments


mffigueroa commented Jan 31, 2020

I'd like to add the AdaMod optimizer to the keras-contrib optimizers.
Paper reference: https://arxiv.org/abs/1910.12249
I modified the Adam optimizer code from the main Keras repo, adding the exponential moving average of past learning rates via the beta_3 coefficient and the clamping of learning rates described in the paper.
Here is my current branch:
https://github.com/mffigueroa/keras-contrib/commits/user/mffigueroa/adamod
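For context, a minimal NumPy sketch of the update rule described above (not the branch's actual Keras code, just the algorithm from the paper): AdaMod follows Adam, but additionally keeps an exponential moving average `s` of the per-parameter adaptive learning rates (controlled by `beta_3`) and clamps the current rate to that average before applying the step. Function and variable names here are illustrative.

```python
import numpy as np

def adamod_step(theta, grad, state, lr=1e-3, beta1=0.9, beta2=0.999,
                beta3=0.999, eps=1e-8):
    """One AdaMod update (https://arxiv.org/abs/1910.12249).

    Identical to Adam except that the per-parameter step size is
    clamped by an exponential moving average of past step sizes.
    """
    m, v, s, t = state
    t += 1
    m = beta1 * m + (1 - beta1) * grad        # first moment (as in Adam)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (as in Adam)
    m_hat = m / (1 - beta1 ** t)              # bias correction
    v_hat = v / (1 - beta2 ** t)
    eta = lr / (np.sqrt(v_hat) + eps)         # Adam's per-parameter lr
    s = beta3 * s + (1 - beta3) * eta         # EMA of past learning rates
    eta_hat = np.minimum(eta, s)              # clamp the current lr by the EMA
    theta = theta - eta_hat * m_hat
    return theta, (m, v, s, t)

# Example: a single step on a 1-D parameter.
theta = np.array([1.0])
state = (np.zeros(1), np.zeros(1), np.zeros(1), 0)  # m, v, s, t
theta, state = adamod_step(theta, np.array([0.5]), state)
```

Since `s` starts at zero, the clamp keeps early steps small, which is the paper's stated motivation (avoiding the large, unstable learning rates Adam can produce early in training).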

@mffigueroa mffigueroa linked a pull request Jan 31, 2020 that will close this issue