Repository of Jupyter notebook tutorials for teaching the Deep Learning Course at the University of Amsterdam (MSc AI), Fall 2023
The purpose of this repo is to make it easy to get started with JAX, Flax, and Haiku. It contains my "Machine Learning with JAX" series of tutorials (YouTube videos and Jupyter Notebooks) as well as the content I found useful while learning about the JAX ecosystem.
An open-source library for accelerating and streamlining model training and serving with JAX. 🚀
JAX implementations of various deep reinforcement learning algorithms.
This is the official repository for the paper "Flora: Low-Rank Adapters Are Secretly Gradient Compressors".
JAX/Flax implementation of finite-size scaling
An implementation of the Adan optimizer for optax.
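Optimizers like Adan slot into optax because every optax optimizer is a `GradientTransformation`: a pure `init(params) -> state` function paired with `update(grads, state, params) -> (updates, state)`. The sketch below illustrates that pattern in plain Python for SGD with momentum, using dicts of floats in place of JAX pytrees and with no optax dependency; the function names and hyperparameters are illustrative, not optax's own.

```python
from typing import Dict, NamedTuple

class SGDState(NamedTuple):
    """Optimizer state carried between steps (here, momentum buffers)."""
    momentum: Dict[str, float]

def sgd_init(params: Dict[str, float]) -> SGDState:
    # Mirror the parameter structure with zero-initialized momentum.
    return SGDState(momentum={k: 0.0 for k in params})

def sgd_update(grads: Dict[str, float], state: SGDState,
               lr: float = 0.1, beta: float = 0.9):
    # Accumulate momentum, then emit parameter *updates* (deltas),
    # as optax transformations do, rather than new parameters.
    new_momentum = {k: beta * state.momentum[k] + grads[k] for k in grads}
    updates = {k: -lr * v for k, v in new_momentum.items()}
    return updates, SGDState(momentum=new_momentum)

def apply_updates(params: Dict[str, float],
                  updates: Dict[str, float]) -> Dict[str, float]:
    # Counterpart of optax.apply_updates: add the deltas to the params.
    return {k: params[k] + updates[k] for k in params}

params = {"w": 1.0}
state = sgd_init(params)
updates, state = sgd_update({"w": 2.0}, state)
params = apply_updates(params, updates)
# First step: momentum = 2.0, update = -0.1 * 2.0, so w = 0.8
```

Keeping the optimizer as a pure init/update pair is what lets optax compose transformations with `optax.chain` and run them under `jax.jit`; a third-party optimizer only has to expose these two functions to interoperate.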
JAX implementation of "Classical and Quantum Algorithms for Orthogonal Neural Networks" (Kerenidis et al., 2021).
H-Former is a VAE for generating in-between fonts (or combining fonts). Its encoder uses a PointNet and a transformer to compute a code vector for a glyph. Its decoder is composed of multiple independent decoders, each acting on the code vector to reconstruct a point cloud representing a glyph.
Goal-conditioned reinforcement learning like 🔥
Direct port of TD3_BC to JAX using Haiku and optax.
The (unofficial) vanilla version of WaveRNN
dm-haiku implementation of hyperbolic neural networks
Variational Graph Autoencoder implemented using JAX+Jraph
A helper library for training dm-haiku models.
Neural implicit digital elevation model