Simple implementation of reverse-mode automatic differentiation on numpy arrays (Jupyter Notebook, updated Apr 18, 2021)
Yet another automatic differentiation engine to perform efficient and analytically precise partial differentiation of mathematical expressions.
A simple library for building computational graphs with autodiff support.
F-1 method
A Micrograd-inspired (and largely copied) small autodiff engine.
Reverse-mode second-order automatic differentiation for Python (WIP)
Dualitic is a Python package for forward mode automatic differentiation using dual numbers.
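The dual-number technique behind forward-mode packages like the one above can be sketched in a few lines. The `Dual` class and `derivative` helper here are illustrative names, not the API of Dualitic or any other listed project:

```python
class Dual:
    """Dual number a + b*eps with eps**2 == 0; the dual part carries the derivative."""
    def __init__(self, real, dual=0.0):
        self.real, self.dual = real, dual

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.real + other.real, self.dual + other.dual)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule falls out of (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps
        return Dual(self.real * other.real,
                    self.real * other.dual + self.dual * other.real)

def derivative(f, x):
    """Evaluate f at x with a unit dual part to read off f'(x)."""
    return f(Dual(x, 1.0)).dual

# f(x) = x*x + x  →  f'(x) = 2x + 1; at x = 3 this is 7
print(derivative(lambda x: x * x + x, 3.0))  # 7.0
```

One pass through the function yields the derivative with respect to one input, which is why forward mode suits functions with few inputs and many outputs.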
yacc/lex parser for reverse-mode automatic differentiation
Simple automatic differentiation implementation in Python
Lightweight automatic differentiation and error propagation library
Experiments with forward gradients on optimization test functions
A brief (and inaccurate) history of derivatives, with a brief (and incomplete) Python implementation
Realization of models from existing papers
My implementation of Andrej Karpathy's Micrograd library for backpropagation and simple neural net training
C++20 numerical and analytical derivative computations
micrograd (smol autodiff lib by @karpathy) ported into various languages
A toy forward-mode autodiff utility written in Python
C++ header-only library for scientific programming.
[wip] Lightweight Automatic Differentiation & DeepLearning Framework implemented in pure Julia.
Automatic differentiation: a tool for differentiating multivariable functions, vectors, matrices, and more. All done in C++, no libraries!
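Several of the entries above (the numpy implementation, the Micrograd ports) use reverse-mode autodiff: build a computational graph during the forward pass, then propagate gradients backward. A minimal scalar sketch of that technique, with illustrative names rather than any listed project's API:

```python
class Value:
    """Scalar node in a computational graph with reverse-mode autodiff."""
    def __init__(self, data, _parents=(), _local_grads=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents          # nodes this value depends on
        self._local_grads = _local_grads  # d(self)/d(parent) for each parent

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other), (other.data, self.data))

    def backward(self):
        # topologically sort the graph, then apply the chain rule in reverse
        order, visited = [], set()
        def build(v):
            if id(v) not in visited:
                visited.add(id(v))
                for p in v._parents:
                    build(p)
                order.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(order):
            for p, g in zip(v._parents, v._local_grads):
                p.grad += g * v.grad

# f(x, y) = x*y + x  →  df/dx = y + 1, df/dy = x
x, y = Value(3.0), Value(4.0)
f = x * y + x
f.backward()
print(x.grad, y.grad)  # 5.0 3.0
```

One backward pass yields gradients for every input at once, which is why reverse mode dominates in machine learning, where a scalar loss depends on many parameters.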