knowledge-distillation
Here are 469 public repositories matching this topic.
- Compressing Image Captioning Network using Knowledge Distillation (Jupyter Notebook, updated Mar 6, 2018)
- A directory of research paper summaries in the field of Deep Learning (updated Apr 28, 2018)
- Improving Convolutional Networks via Attention Transfer (ICLR 2017) (Jupyter Notebook, updated Jul 11, 2018)
- A simple Chainer implementation of Hinton's knowledge distillation (KD) (Jupyter Notebook, updated Jul 23, 2018)
- Neural Network Compression (Jupyter Notebook, updated Aug 14, 2018)
- Infrastructures™ for Machine Learning Training/Inference in Production (updated May 24, 2019)
- Zero-Shot Knowledge Distillation in Deep Networks (ICML 2019) (Python, updated Jun 20, 2019)
- Knowledge Distillation using TensorFlow (Python, updated Aug 12, 2019)
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) (Python, updated Sep 9, 2019)
- Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019) (Python, updated Sep 9, 2019)
- Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf (Python, updated Oct 3, 2019)
- Reimplementations of notable papers in PyTorch (Jupyter Notebook, updated Oct 18, 2019)
- Code for knowledge distillation work to enhance fine-grained disease recognition (Python, updated Nov 4, 2019)
- Simple PyTorch code for knowledge distillation (Python, updated Nov 19, 2019)
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added) (Python, updated Nov 21, 2019)
- An implementation of Frosst & Hinton's "Distilling a Neural Network Into a Soft Decision Tree" (Python, updated Dec 30, 2019)
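Several of the repositories above implement Hinton's original knowledge distillation objective: the student is trained on a weighted blend of the usual cross-entropy on hard labels and a KL divergence between the teacher's and student's temperature-softened output distributions. A minimal PyTorch sketch of that loss follows; the function name, temperature `T=4.0`, and weight `alpha=0.9` are illustrative choices, not taken from any listed repository.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style KD: blend soft-target KL (at temperature T) with hard-label CE."""
    # Soften both distributions with temperature T; scale the KL term by T^2
    # so its gradient magnitude stays comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Standard cross-entropy against the ground-truth hard labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage with random logits for an 8-example, 10-class batch.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
```

In practice the teacher's logits come from a frozen, pretrained network run in `torch.no_grad()` mode, and only the student's parameters receive gradients from this loss.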