# MNIST-multitask

Reproduce the ICLR 2018 under-review paper "MULTI-TASK LEARNING ON MNIST IMAGE DATASETS".

The paper argues that pre-training a network on MNIST-like datasets can boost performance.
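The pre-training idea can be sketched as a shared trunk with one classification head per dataset, so that gradients from both MNIST and FashionMNIST shape the shared features. This is a minimal NumPy sketch under assumed dimensions (flattened 784-dim inputs, a 64-unit hidden layer); the paper's actual architecture is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # stabilize before exponentiating
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class MultiTaskNet:
    """A shared ReLU trunk with one softmax head per dataset."""

    def __init__(self, d_in=784, d_hid=64, n_classes=10,
                 tasks=("mnist", "fashion")):
        self.W_shared = rng.normal(0.0, 0.01, (d_in, d_hid))  # shared trunk
        self.heads = {t: rng.normal(0.0, 0.01, (d_hid, n_classes))
                      for t in tasks}                          # per-task heads

    def forward(self, x, task):
        h = np.maximum(0.0, x @ self.W_shared)  # shared ReLU features
        return softmax(h @ self.heads[task]), h

    def train_step(self, x, y, task, lr=0.1):
        """One cross-entropy gradient step; every task updates the trunk."""
        p, h = self.forward(x, task)
        grad_logits = p.copy()
        grad_logits[np.arange(len(y)), y] -= 1.0  # d(CE)/d(logits)
        grad_logits /= len(y)
        W_head = self.heads[task]
        grad_h = (grad_logits @ W_head.T) * (h > 0)   # backprop through ReLU
        self.heads[task] = W_head - lr * (h.T @ grad_logits)
        self.W_shared -= lr * (x.T @ grad_h)
        return float(-np.log(p[np.arange(len(y)), y] + 1e-12).mean())
```

After joint pre-training on both datasets, the target head (say, FashionMNIST) can be fine-tuned alone, which is the setup the M+F column below refers to.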

## results

| dataset | single-task | M+F | single-task (paper reported) | M+F (paper reported) |
| --- | --- | --- | --- | --- |
| MNIST | 0.996 | 0.9956 | 0.9956 | 0.9971 |
| FashionMNIST | 0.9394 | 0.942 | 0.9432 | 0.9518 |

## discussion

In my reproduction, FashionMNIST performs better when the network is first pre-trained on MNIST+FashionMNIST, but MNIST does not benefit from pre-training.

The gap between the reproduction and the paper's reported numbers may result from differences in data preprocessing.
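A preprocessing mismatch is plausible because MNIST pipelines commonly differ in normalization alone. Below is a sketch of two common choices; the mean/std constants 0.1307/0.3081 are widely used MNIST statistics, and which variant the repo or the paper used is an assumption here, not something the paper states.

```python
import numpy as np

def scale_to_unit(images):
    """Scale raw uint8 pixels into [0, 1] (one common choice)."""
    return images.astype(np.float32) / 255.0

def standardize(images, mean=0.1307, std=0.3081):
    """Scale to [0, 1], then subtract the dataset mean and divide by std.

    The defaults are the commonly cited MNIST statistics; FashionMNIST
    has different statistics, so reusing MNIST's values is itself a
    preprocessing decision that could shift results.
    """
    return (images.astype(np.float32) / 255.0 - mean) / std
```

Training with one variant and evaluating against numbers produced with the other shifts the input distribution the network sees, which can account for accuracy differences of the size in the table above.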

## Author

Po-Chih Huang / @pochih
