MAML VAE Implementation in PyTorch

A PyTorch implementation of MetaVAE for few-shot image generation. The model is trained on the Omniglot dataset.
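
At a high level, the model couples a standard VAE with a MAML-style inner/outer loop: for each character class (a task), a copy of the VAE is adapted on a small support set, and the meta-parameters are then updated from the loss on the query set. The sketch below is illustrative only and is not taken from this repository: it assumes a small MLP VAE on 28x28 inputs, a latent size of 64, and a first-order MAML approximation (the repository may implement full second-order MAML); the names MetaVAE, vae_loss, and meta_train_step are hypothetical.

```python
# Illustrative sketch only (not the repository's code): a small VAE plus a
# first-order MAML meta-update, assuming 28x28 grayscale Omniglot characters.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


class MetaVAE(nn.Module):
    def __init__(self, latent_dim=64):
        super().__init__()
        self.enc = nn.Linear(784, 256)
        self.fc_mu = nn.Linear(256, latent_dim)
        self.fc_logvar = nn.Linear(256, latent_dim)
        self.dec = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(), nn.Linear(256, 784))

    def forward(self, x):
        h = F.relu(self.enc(x.flatten(1)))
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterisation trick
        return self.dec(z), mu, logvar


def vae_loss(recon_logits, x, mu, logvar):
    rec = F.binary_cross_entropy_with_logits(recon_logits, x.flatten(1), reduction="sum")
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + kld


def meta_train_step(model, meta_opt, tasks, inner_lr=0.01, update_step=5):
    """One first-order MAML step: adapt a copy per task on its support set,
    then update the meta-parameters from the query-set losses."""
    meta_opt.zero_grad()
    for x_spt, x_qry in tasks:                       # one task = images of one character class
        fast = copy.deepcopy(model)                  # task-specific fast weights
        inner_opt = torch.optim.SGD(fast.parameters(), lr=inner_lr)
        for _ in range(update_step):                 # inner-loop adaptation (--update_step)
            recon, mu, logvar = fast(x_spt)
            inner_opt.zero_grad()
            vae_loss(recon, x_spt, mu, logvar).backward()
            inner_opt.step()
        recon, mu, logvar = fast(x_qry)              # evaluate adapted weights on the query set
        grads = torch.autograd.grad(vae_loss(recon, x_qry, mu, logvar), fast.parameters())
        for p, g in zip(model.parameters(), grads):  # first-order approximation of the meta-gradient
            p.grad = g.clone() if p.grad is None else p.grad + g
    meta_opt.step()
```

With a task sampler that yields (support, query) tensor pairs, calling meta_train_step inside an epoch loop with torch.optim.Adam(model.parameters()) as meta_opt mirrors the overall structure of the training command below: a number of inner updates per task followed by one meta-update.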

Omniglot Dataset

The Omniglot dataset is designed for developing more human-like learning algorithms. It contains 1623 different handwritten characters from 50 different alphabets. Each of the 1623 characters was drawn online via Amazon's Mechanical Turk by 20 different people.
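
If you do not already have the data, one convenient way to fetch it is torchvision's built-in Omniglot loader (an assumption for illustration; the repository may ship its own loader). With background=True it downloads the images_background split to ./omniglot-py/images_background, which is the path passed to --meta_dataroot in the Usage section below.

```python
# Convenience sketch for fetching Omniglot with torchvision (illustrative,
# not necessarily how this repository loads the data).
from torchvision import datasets, transforms

tf = transforms.Compose([
    transforms.Resize(28),        # downscale the 105x105 originals
    transforms.ToTensor(),
])
background = datasets.Omniglot(root=".", background=True, download=True, transform=tf)
img, character_class = background[0]
print(img.shape, character_class)  # torch.Size([1, 28, 28]) 0
```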

Usage

  1. Train Meta_VAE on 5-shot:
$ python meta_train.py --name mt_vae_results --meta_dataroot omniglot-py/images_background/ --k_spt 5 --k_qry 5 --update_step 100 --finetune_step 100 --num_epochs 500
  2. Test Meta_VAE on 5-shot (a conceptual sketch of this stage follows below):
$ python fine_tuning.py --name mt_vae_results --meta_dataroot omniglot-py/images_background/ --k_spt 5 --k_qry 5 --update_step 100 --finetune_step 100 --test_epochs 50
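
Conceptually, the 5-shot test stage starts from the meta-trained weights, runs a short adaptation loop on the few support images of a novel character, and then decodes samples from the latent prior. The sketch below only illustrates that idea, reusing the hypothetical MetaVAE and vae_loss definitions from the earlier sketch; the checkpoint handling and adaptation schedule are assumptions, not the repository's exact code.

```python
# Hedged sketch: 5-shot adaptation followed by generation (assumes the
# MetaVAE / vae_loss definitions from the earlier sketch).
import torch

model = MetaVAE()                     # in practice, load the meta-trained weights here
x_spt = torch.rand(5, 1, 28, 28)      # placeholder for 5 support images of a novel character

opt = torch.optim.SGD(model.parameters(), lr=0.01)
for _ in range(100):                  # --finetune_step gradient steps on the support set
    recon, mu, logvar = model(x_spt)
    opt.zero_grad()
    vae_loss(recon, x_spt, mu, logvar).backward()
    opt.step()

with torch.no_grad():                 # generate new drawings of the adapted character
    z = torch.randn(16, 64)           # 16 samples from the latent prior (latent_dim=64)
    samples = torch.sigmoid(model.dec(z)).view(16, 1, 28, 28)
```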
