task2

Deadline: December 1, 23:59.

Save your notebook as task2/SurnameTask2.ipynb.

IMPORTANT: the code must not be written in PyTorch/TensorFlow. For deep learning, use JAX.

  1. [Reporter: Vladimirov] Visualize message passing on a factor graph with the Sum-Product algorithm. Message directions should be shown in the resulting plots. The graph and model parameters must be sampled randomly on each run of the notebook. Visualization format: step-by-step plots via interactive plots or animation.
  2. [Reporter: Chernikov] Generate a Bayesian network randomly. Generate a dataset from it. Estimate the likelihood and compare it with the likelihood of random ("out-of-distribution") data. Convert the network to a Markov random field. Estimate the likelihood of the previously generated data and of random data. Repeat the experiment multiple times.
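
As a starting point for the sum-product task, the message updates on a toy two-variable chain can be sketched in plain NumPy and checked against brute-force enumeration (the factor names `f1`, `f12`, `f2` are illustrative, not part of the assignment):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random chain factor graph: x1 -- f12 -- x2, plus unary factors f1, f2.
f1 = rng.random(2)          # unary factor over binary x1
f2 = rng.random(2)          # unary factor over binary x2
f12 = rng.random((2, 2))    # pairwise factor over (x1, x2)

# Sum-product messages flowing left to right along the chain.
m_f1_to_x1 = f1                       # leaf factor -> variable
m_x1_to_f12 = m_f1_to_x1              # variable -> factor (product of other incoming messages)
m_f12_to_x2 = f12.T @ m_x1_to_f12     # factor -> variable: sum over x1
m_f2_to_x2 = f2                       # leaf factor -> variable

# The marginal of x2 is the normalized product of its incoming messages.
marg_x2 = m_f12_to_x2 * m_f2_to_x2
marg_x2 /= marg_x2.sum()

# Brute-force check: enumerate the full joint and marginalize.
joint = f1[:, None] * f12 * f2[None, :]
brute = joint.sum(axis=0)
brute /= brute.sum()
assert np.allclose(marg_x2, brute)
```

In the assignment, each intermediate message (with its direction) would be drawn on the graph at every step; the brute-force marginal serves as a correctness check.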

  3. [Reporter: Marat Khusainov] Visualize the KL divergence (in three variants: KL(p1, p2), KL(p2, p1), and 0.5 KL(p1, p2) + 0.5 KL(p2, p1)) and the JS distance between the true data distribution and the GAN distribution. Visualize the dependence of these divergences on the number of optimization iterations. Dataset: synthetic.
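
A minimal sketch of the three KL variants and a grid-based JS estimate for two 1-D Gaussians (the parameter values are placeholders; in the task the distributions come from the data and the GAN):

```python
import numpy as np

def kl_gauss(m1, s1, m2, s2):
    # Closed-form KL(N(m1, s1^2) || N(m2, s2^2)).
    return np.log(s2 / s1) + (s1**2 + (m1 - m2) ** 2) / (2 * s2**2) - 0.5

m1, s1, m2, s2 = 0.0, 1.0, 1.0, 2.0
kl_pq = kl_gauss(m1, s1, m2, s2)            # KL(p1, p2)
kl_qp = kl_gauss(m2, s2, m1, s1)            # KL(p2, p1)
kl_sym = 0.5 * kl_pq + 0.5 * kl_qp          # symmetrized variant

# JS divergence has no closed form for the Gaussian mixture midpoint,
# so estimate it numerically on a dense grid.
xs = np.linspace(-12.0, 12.0, 8001)
dx = xs[1] - xs[0]
p = np.exp(-0.5 * ((xs - m1) / s1) ** 2) / (s1 * np.sqrt(2 * np.pi))
q = np.exp(-0.5 * ((xs - m2) / s2) ** 2) / (s2 * np.sqrt(2 * np.pi))
m = 0.5 * (p + q)
js = 0.5 * dx * np.sum(p * np.log(p / m)) + 0.5 * dx * np.sum(q * np.log(q / m))
```

Note the asymmetry: here `kl_pq` and `kl_qp` differ, while `js` is bounded by `log 2`, which is what the plots over training iterations should make visible.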

  4. [Reporter: Maksim Tyurikov] Reproduce the M1 and M2 models from Kingma for a dataset with noisy labels. Plot the performance of both models as a function of the percentage of wrong (noisy) labels. Dataset: MNIST or a similar dataset.
  5. [Reporter: TODO] Compare the performance of models trained via maximum likelihood optimization, MAP + Laplace approximation, and the ELBO under different dataset shifts: covariate shift, prior probability shift, and concept shift. Plot performance as a function of the shift magnitude. Model: any model, starting from logistic regression.
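
For the MAP + Laplace branch, a minimal JAX sketch on logistic regression: find the MAP weights by gradient descent, then approximate the posterior as a Gaussian with covariance equal to the inverse Hessian of the negative log-posterior (data generation and the plain-GD optimizer are simplifications):

```python
import jax
import jax.numpy as jnp

# Synthetic binary classification data (illustrative, not the task's dataset).
X = jax.random.normal(jax.random.PRNGKey(0), (200, 2))
w_true = jnp.array([2.0, -1.0])
y = (jax.nn.sigmoid(X @ w_true)
     > jax.random.uniform(jax.random.PRNGKey(1), (200,))).astype(jnp.float32)

def neg_log_post(w):
    logits = X @ w
    nll = jnp.sum(jnp.logaddexp(0.0, logits) - y * logits)  # Bernoulli NLL
    return nll + 0.5 * jnp.sum(w**2)                        # N(0, I) prior -> MAP

# MAP estimate via plain gradient descent (a real run would use a proper optimizer).
w = jnp.zeros(2)
g = jax.grad(neg_log_post)
for _ in range(500):
    w = w - 0.01 * g(w)

# Laplace approximation: posterior ~ N(w_MAP, H^{-1}).
H = jax.hessian(neg_log_post)(w)
cov = jnp.linalg.inv(H)
```

The same model trained with pure maximum likelihood (drop the prior term) and with an ELBO objective would then be evaluated under each shift type.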

  6. [Reporter: Dmitry Protasov] Train a VAE. Train a vanilla AE to match the VAE's performance using a GAN. Sample from the AE and analyze the sampling quality. Dataset: MNIST or a similar dataset.
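
A minimal JAX sketch of the VAE objective with the reparameterization trick; the linear encoder/decoder and the fixed unit posterior variance are simplifications for brevity (the task would use real networks):

```python
import jax
import jax.numpy as jnp

def elbo(params, x, key):
    We, Wd = params                           # linear encoder/decoder weights
    mu = We @ x                               # approximate-posterior mean
    logvar = jnp.zeros_like(mu)               # unit posterior variance, for brevity
    eps = jax.random.normal(key, mu.shape)
    z = mu + jnp.exp(0.5 * logvar) * eps      # reparameterization trick
    x_hat = Wd @ z
    recon = -jnp.sum((x - x_hat) ** 2)        # Gaussian log-likelihood up to a constant
    kl = 0.5 * jnp.sum(jnp.exp(logvar) + mu**2 - 1.0 - logvar)  # KL(q(z|x) || N(0, I))
    return recon - kl

params = (0.1 * jnp.ones((2, 3)), 0.1 * jnp.ones((3, 2)))
x = jnp.array([1.0, -1.0, 0.5])
loss, grads = jax.value_and_grad(lambda p: -elbo(p, x, jax.random.PRNGKey(0)))(params)
```

Sampling from the trained VAE is then `decoder(z)` with `z ~ N(0, I)`; the vanilla AE has no such prior, which is exactly what its sampling analysis should expose.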

  7. [Reporter: TODO] Plot the vector field of a contractive AE. Plot a heatmap of its norm. Plot the VAE density. Compare the results. Analyze how the AE's density-estimation quality depends on sigma. Dataset: several synthetic datasets from the paper.
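
The two quantities this task plots can be sketched in JAX: the contractive penalty (Frobenius norm of the encoder Jacobian, computed with `jax.jacfwd`) and the reconstruction vector field `r(x) - x` to be drawn as arrows. The tiny fixed-weight encoder/decoder below are stand-ins for a trained model:

```python
import jax
import jax.numpy as jnp

def encoder(x):
    # Stand-in for a trained contractive-AE encoder.
    W = jnp.array([[1.0, 0.5], [0.2, 1.0]])
    return jnp.tanh(W @ x)

def decoder(h):
    # Stand-in decoder.
    W = jnp.array([[1.0, 0.2], [0.5, 1.0]])
    return W @ h

def contractive_penalty(x):
    # Squared Frobenius norm of the encoder Jacobian — the CAE regularizer.
    J = jax.jacfwd(encoder)(x)
    return jnp.sum(J**2)

def vector_field(x):
    # Reconstruction vector field r(x) - x, the quantity to plot as arrows.
    return decoder(encoder(x)) - x

pen = contractive_penalty(jnp.array([0.3, -0.2]))
v = vector_field(jnp.array([0.3, -0.2]))
```

Evaluating `vector_field` on a 2-D grid gives the quiver plot, and the norm of the same values gives the heatmap.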

  8. [Reporter: Kseniia Petrushina] Plot the vector field of a denoising AE. Plot a heatmap of its norm. Plot the VAE density. Compare the results. Analyze how the AE's density-estimation quality depends on sigma. Dataset: several synthetic datasets from the paper.
  9. [Reporter: Galina Boeva] Reproduce the two GAN optimization methods: using Eq. 1 and the alternative method with better gradients (see the paper). Compare the convergence of both models: their overall losses as well as their generator and discriminator losses.
  10. [Reporter: Parviz Karimov] Train a vanilla autoencoder. Approximate the latent-variable distribution with a uniform and a normal distribution. Try to sample from it and analyze the sampling quality.
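
For the GAN task, the two generator objectives can be sketched in JAX: the saturating loss `log(1 - D(G(z)))` from the minimax formulation and the non-saturating alternative `-log D(G(z))`. The snippet differentiates both with respect to the discriminator output to show why the alternative gives stronger gradients early in training:

```python
import jax
import jax.numpy as jnp

def saturating_gen_loss(d_fake):
    # Minimax-style generator objective: minimize log(1 - D(G(z))).
    return jnp.mean(jnp.log(1.0 - d_fake))

def nonsaturating_gen_loss(d_fake):
    # Alternative objective: maximize log D(G(z)) for stronger early gradients.
    return -jnp.mean(jnp.log(d_fake))

# Early in training the discriminator confidently rejects fakes: D(G(z)) ~ 0.
d = jnp.array([1e-3])
g_sat = jax.grad(saturating_gen_loss)(d)
g_ns = jax.grad(nonsaturating_gen_loss)(d)

# The saturating loss has a nearly vanishing gradient; the alternative does not.
assert jnp.abs(g_ns[0]) > jnp.abs(g_sat[0])
```

Plugging each loss into the same generator/discriminator training loop and logging both losses per iteration gives the convergence comparison the task asks for.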