
Add AdaMod optimizer to keras-contrib #532

Open · mffigueroa wants to merge 5 commits into master from User/mffigueroa/adamod

Conversation

mffigueroa

- What I did
I copied the Adam optimizer from the main Keras repo and modified it according to the AdaMod paper.

- How I did it
Mainly, I added an exponential moving average of the adaptive learning rate (the step size obtained after the gradient is divided by the square root of the 2nd-moment estimate), controlled by the new beta_3 parameter. This exponential average is then used as an upper bound on the adaptive learning rate. The 1st- and 2nd-moment bias corrections had to be split into two separate statements because, per the AdaMod paper, the bias-corrected 1st moment is applied only after this upper bounding. See the AdaMod paper for further details.
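For reference, a minimal NumPy sketch of the per-parameter update described above (the actual PR implements this with Keras backend ops; the function name, argument names, and default hyperparameter values here are illustrative assumptions, not taken from the diff):

```python
import numpy as np

def adamod_update(param, grad, m, v, s, t,
                  lr=0.001, beta_1=0.9, beta_2=0.999, beta_3=0.999,
                  epsilon=1e-8):
    """One AdaMod step for a single parameter array (NumPy sketch)."""
    # Standard Adam first- and second-moment estimates.
    m = beta_1 * m + (1. - beta_1) * grad
    v = beta_2 * v + (1. - beta_2) * np.square(grad)

    # The two bias corrections live in separate statements: the 2nd-moment
    # correction folds into the adaptive step size, while the 1st-moment
    # correction is applied only after the step size has been bounded.
    m_hat = m / (1. - beta_1 ** t)
    step_size = lr * np.sqrt(1. - beta_2 ** t) / (np.sqrt(v) + epsilon)

    # New in AdaMod: exponential average of the adaptive step size,
    # controlled by beta_3 and used as an upper bound on the current step.
    s = beta_3 * s + (1. - beta_3) * step_size
    bounded_step = np.minimum(step_size, s)

    param = param - bounded_step * m_hat
    return param, m, v, s
```

The state arrays m, v, and s all start at zero and t is the 1-based step count, mirroring the Adam convention used in Keras.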

- How you can verify it
I added a unit test in the same fashion as the other unit tests in the optimizers directory.
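A rough sketch of what such a test could look like, assuming the optimizer is exported as keras_contrib.optimizers.AdaMod and reusing the shared _test_optimizer helper that the other optimizer tests in this repo call (the class name and accuracy target are assumptions, not copied from the actual diff):

```python
from keras_contrib.tests import optimizers
from keras_contrib.optimizers import AdaMod  # assumed export name


def test_adamod():
    # _test_optimizer trains a small reference model and asserts that it
    # reaches the given accuracy target, as in the other optimizer tests.
    optimizers._test_optimizer(AdaMod(), target=0.45)
```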


This pull request fixes #531

@mffigueroa mffigueroa changed the title User/mffigueroa/adamod Add AdaMod optimizer to keras-contrib Jan 31, 2020