AdamP: Slowing Down the Slowdown for Momentum Optimizers on Scale-invariant Weights (ICLR 2021)
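A minimal sketch of the idea named in the title, under the assumption that AdamP's key step is projecting the momentum update of a scale-invariant weight onto the plane orthogonal to that weight, so its norm stops growing and shrinking the effective step size; the function, threshold, and hyperparameters below are illustrative, not the official implementation:

```python
import numpy as np

def project_if_scale_invariant(weight, grad, update, delta=0.1, eps=1e-8):
    """Illustrative projection step (an assumption, not the official AdamP code).

    If the weight behaves as scale-invariant (its direction is nearly orthogonal
    to its gradient), drop the radial component of the update so the weight norm
    does not inflate and slow down the effective learning rate.
    """
    w, g, u = weight.ravel(), grad.ravel(), update.ravel()
    cos = abs(w @ g) / (np.linalg.norm(w) * np.linalg.norm(g) + eps)
    if cos < delta / np.sqrt(w.size):            # heuristic scale-invariance test
        w_unit = w / (np.linalg.norm(w) + eps)
        u = u - (u @ w_unit) * w_unit            # keep only the tangential part
    return u.reshape(update.shape)
```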
Training a fully connected neural network (FCNN) using different optimizers for the backpropagation algorithm and comparing the number of epochs each takes to converge, along with their classification performance. Also building an autoencoder to obtain the hidden representation and using it for classification.
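A hedged Keras sketch of the kind of comparison this describes; the dataset (MNIST), network size, and optimizer list are assumptions:

```python
import tensorflow as tf

# Train the same FCNN with several optimizers and compare validation accuracy.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

def make_fcnn():
    return tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

for name in ["sgd", "rmsprop", "adam"]:
    model = make_fcnn()
    model.compile(optimizer=name,
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    history = model.fit(x_train, y_train, epochs=5, verbose=0,
                        validation_data=(x_test, y_test))
    print(name, history.history["val_accuracy"][-1])
```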
Generating a TensorFlow model that predicts values of a sine wave
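A minimal TensorFlow sketch of such a model, assuming a plain regression setup (the network size and training length are arbitrary choices):

```python
import numpy as np
import tensorflow as tf

# Fit y = sin(x) on [0, 2*pi] with a small dense network.
x = np.random.uniform(0, 2 * np.pi, size=(1000, 1)).astype("float32")
y = np.sin(x)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(1,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=100, verbose=0)
print(model.predict(np.array([[np.pi / 2]], dtype="float32")))  # should be close to 1.0
```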
A NumPy-based neural network package implementation
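The sort of building block such a package typically contains, shown as a hedged sketch; the class name and interface are assumptions rather than the package's actual API:

```python
import numpy as np

# A dense layer with forward and backward passes written in plain NumPy.
class Dense:
    def __init__(self, n_in, n_out, lr=0.01):
        self.W = np.random.randn(n_in, n_out) * np.sqrt(2.0 / n_in)
        self.b = np.zeros(n_out)
        self.lr = lr

    def forward(self, x):
        self.x = x                                # cache input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        grad_in = grad_out @ self.W.T             # gradient w.r.t. the layer input
        self.W -= self.lr * self.x.T @ grad_out   # plain SGD parameter update
        self.b -= self.lr * grad_out.sum(axis=0)
        return grad_in
```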
In the following repository you'll find material and examples related to optimizers used in machine learning methods
Experiments comparing the DFW neural network optimizer with SGD and Adam on both vision and language tasks
Introduction to the Adam Optimizer with examples
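For reference, the standard Adam update rule as a short NumPy sketch (default hyperparameters from the original paper; not tied to any particular repository's code):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update. `t` is the 1-based step count used for bias correction."""
    m = beta1 * m + (1 - beta1) * grad            # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2       # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)                  # bias-corrected moments
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```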
Unofficial implementation of the Adan optimizer with Schedule-Free
PyTorch library to test optimizers by visualizing how they descend on your images. You can draw your own custom loss landscape and see what different optimizers do.
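A hedged illustration of the same idea on a synthetic 2D surface instead of an image; the loss function, learning rate, and optimizer choices are assumptions, not this library's API:

```python
import torch
import matplotlib.pyplot as plt

def loss_fn(p):
    return p[0] ** 2 + 10 * p[1] ** 2        # a simple ill-conditioned bowl

# Run a few optimizers on the same surface and plot their trajectories.
for name, opt_cls in [("SGD", torch.optim.SGD), ("Adam", torch.optim.Adam)]:
    p = torch.tensor([2.0, 1.5], requires_grad=True)
    opt = opt_cls([p], lr=0.05)
    path = [p.detach().clone()]
    for _ in range(200):
        opt.zero_grad()
        loss_fn(p).backward()
        opt.step()
        path.append(p.detach().clone())
    xs = [q[0].item() for q in path]
    ys = [q[1].item() for q in path]
    plt.plot(xs, ys, label=name)

plt.legend()
plt.xlabel("w1")
plt.ylabel("w2")
plt.show()
```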