
MinJunKang/CIFAR-100--Knowledge-Distillation-with-Augmented-Data


CIFAR-100--Knowledge-Distillation-with-Augmented-Data

Knowledge distillation on CIFAR-100 with augmented data.

Paper: "Distilling the Knowledge in a Neural Network" (Hinton et al., 2015) — https://arxiv.org/abs/1503.02531
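The repository does not show the training loss itself, but the referenced paper combines a temperature-softened KL term against the teacher's outputs with the usual cross-entropy on hard labels. Below is a minimal NumPy sketch of that loss; the function names (`distillation_loss`, `softmax`) and the default `T`/`alpha` values are illustrative assumptions, not the repo's actual code.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces softer probabilities.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton et al. (2015): alpha * T^2 * KL(teacher || student at temperature T)
    plus (1 - alpha) * cross-entropy against the hard labels.
    The T^2 factor keeps the soft-target gradients comparable in scale."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12)
                             - np.log(p_student + 1e-12)), axis=-1)
    # Hard-label cross-entropy at temperature 1.
    q = softmax(student_logits, 1.0)
    ce = -np.log(q[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * ce))
```

When the student's logits match the teacher's exactly, the KL term vanishes and only the hard-label term remains.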

Teacher Model

ResNet-20 (can be swapped for ResNet-50 or other variants)

Student Model

MobileNet or CNN-LSTM

Data

ImageNet Dataset : http://www.image-net.org/

How to get the dataset:

Link: http://hpkim0512.blogspot.com/2017/12/sdf.html?m=1

Training Method

Training stops early if the validation-set accuracy does not improve for more than 50 epochs (taken as a sign of overfitting).
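In Keras this stopping rule corresponds to the built-in `EarlyStopping` callback with `monitor='val_acc'` and `patience=50`. The plain-Python sketch below shows the same logic explicitly; the class name and method are illustrative, not taken from the repository.

```python
class EarlyStopping:
    """Stop training once validation accuracy has not improved
    for `patience` consecutive epochs (50 in this repository)."""

    def __init__(self, patience=50):
        self.patience = patience
        self.best = -float("inf")
        self.wait = 0  # epochs since the last improvement

    def update(self, val_acc):
        """Call once per epoch; returns True when training should stop."""
        if val_acc > self.best:
            self.best = val_acc
            self.wait = 0
        else:
            self.wait += 1
        return self.wait >= self.patience
```

A typical training loop would check `update(val_acc)` after each epoch and break out of the loop when it returns True.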

Programming Language

Python 3.6 with TensorFlow and Keras

OS dependency

Windows 10, Ubuntu Linux

Result
