
jmtomczak / vae_vampprior

Code for the paper "VAE with a VampPrior", J.M. Tomczak & M. Welling

Home Page: https://jmtomczak.github.io/deebmed.html

License: MIT License

Language: Python (100%)
Topics: deep-learning, representation-learning, generative-model, variational-autoencoders

vae_vampprior's People

Contributors

jmtomczak, jramapuram


vae_vampprior's Issues

Why use GatedDense in VAE?

I notice that you use GatedDense in self.q_z_layers to encode q(z | x):

[screenshot: the self.q_z_layers definition using GatedDense]

Is there any reason to use GatedDense rather than a simple MLP? Thanks!
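For context, GatedDense follows the gated linear unit pattern: a linear "content" path multiplied element-wise by a sigmoid gate, giving each unit a learned soft on/off mechanism that a plain MLP layer lacks. A minimal numpy sketch of the idea (the repo's actual layer is a PyTorch module; the weights here are random placeholders, not trained parameters):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GatedDense:
    """Gated dense layer: out = h(x) * sigmoid(g(x)).

    Numpy sketch of the gating idea; the repo's GatedDense is a
    PyTorch module with the same two-linear-path structure.
    """
    def __init__(self, d_in, d_out, rng):
        self.W_h = rng.standard_normal((d_in, d_out)) * 0.1
        self.W_g = rng.standard_normal((d_in, d_out)) * 0.1
        self.b_h = np.zeros(d_out)
        self.b_g = np.zeros(d_out)

    def __call__(self, x):
        h = x @ self.W_h + self.b_h           # linear "content" path
        g = sigmoid(x @ self.W_g + self.b_g)  # per-unit gate in (0, 1)
        return h * g

rng = np.random.default_rng(0)
layer = GatedDense(4, 3, rng)
out = layer(rng.standard_normal((2, 4)))
print(out.shape)  # (2, 3)
```

Because the gate is bounded in (0, 1), the output is a per-unit attenuation of the content path, which tends to ease optimization in deeper encoders compared with a plain linear + nonlinearity stack.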

can we calculate KL divergence of VampPrior and posterior without sampling?

Hi, in lines 78~80 of VAE.py, when calculating the KL divergence between the VampPrior p(z) and the posterior q(z|x), we plug z_q into the log-densities of N(z_p_mean, z_p_logvar) and N(z_q_mean, z_q_logvar) and take the difference of the two log outputs:

log_p_z = self.log_p_z(z_q)

log_q_z = log_Normal_diag(z_q, z_q_mean, z_q_logvar, dim=1)

KL = -(log_p_z - log_q_z)

Since we already have the means and variances of the prior and the posterior, can we compute the KL divergence between the two Gaussian distributions in closed form instead? i.e.

KL(q || p) = 1/2 * Σ_i [ log(σ²_p,i / σ²_q,i) + (σ²_q,i + (μ_q,i − μ_p,i)²) / σ²_p,i − 1 ]

This is because in line 226, z_q is just drawn from N(z_q_mean, z_q_logvar):

z_q = self.reparameterize(z_q_mean, z_q_logvar)

So can we skip the sampling step? Thanks!
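For two diagonal Gaussians the closed-form KL does agree with the sampled estimate; a minimal numpy sketch comparing the two (numpy stands in for the repo's PyTorch code, and log_normal_diag mirrors the shape of the repo's log_Normal_diag up to an additive constant that cancels in the difference):

```python
import numpy as np

def kl_closed_form(mu_q, logvar_q, mu_p, logvar_p):
    # KL(q || p) for diagonal Gaussians, summed over dimensions.
    return 0.5 * np.sum(
        logvar_p - logvar_q
        + (np.exp(logvar_q) + (mu_q - mu_p) ** 2) / np.exp(logvar_p)
        - 1.0
    )

def kl_monte_carlo(mu_q, logvar_q, mu_p, logvar_p, n=200000, seed=0):
    # Estimate KL by sampling z ~ q and averaging log q(z) - log p(z).
    rng = np.random.default_rng(seed)
    z = mu_q + np.exp(0.5 * logvar_q) * rng.standard_normal((n, mu_q.size))

    def log_normal_diag(z, mu, logvar):
        # Unnormalized log-density; the -0.5*log(2*pi) constants cancel
        # in the log q(z) - log p(z) difference below.
        return np.sum(-0.5 * (logvar + (z - mu) ** 2 / np.exp(logvar)), axis=1)

    return np.mean(log_normal_diag(z, mu_q, logvar_q)
                   - log_normal_diag(z, mu_p, logvar_p))

mu_q, logvar_q = np.array([0.5, -0.2]), np.array([0.1, -0.3])
mu_p, logvar_p = np.array([0.0, 0.0]), np.array([0.0, 0.0])
cf = kl_closed_form(mu_q, logvar_q, mu_p, logvar_p)
mc = kl_monte_carlo(mu_q, logvar_q, mu_p, logvar_p)
print(cf, mc)  # the two estimates agree closely
```

Note, however, that the closed form only applies when p(z) is a single Gaussian. The VampPrior is a mixture of K variational posteriors over pseudo-inputs, and the KL between a Gaussian and a mixture has no closed form, which is presumably why the code uses the single-sample estimate.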

ImportError: No module named 'Model'

Trying to get this working today, but I'm getting the following error:

python experiment.py --dataset_name=cifar10

load data
Files already downloaded and verified
create model
Traceback (most recent call last):
  File "experiment.py", line 160, in
    run(args, kwargs)
  File "experiment.py", line 124, in run
    from models.VAE import VAE
  File "/workspace/vae_vampprior/models/VAE.py", line 20, in
    from Model import Model
ImportError: No module named 'Model'

Any ideas?

wrong dependency: z1_q instead of z1_p?

In the generative part of the forward pass, when computing p(x | z1, z2):

x_mean, x_logvar = self.p_x(z1_q, z2_q)

We pass z1_q. This means we have a dependency such as:

[screenshot: the resulting dependency structure]

I suppose there is a self.reparameterize(z1_p_mean, z1_p_logvar) missing, i.e.:

# p(z1 | z2)
z1_p_mean, z1_p_logvar = self.p_z1(z2_q)
z1_p = self.reparameterize(z1_p_mean, z1_p_logvar)

# x_mean = p(x|z1,z2)
x_mean, x_logvar = self.p_x(z1_p, z2_q)

Let me know if I am wrong :-)
Very good code anyway! 👍
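Both the existing z1_q path and the proposed z1_p fix go through the same reparameterization step, which can be sketched in numpy (the repo's reparameterize applies the same mu + std * eps transform in PyTorch; the shapes below are illustrative):

```python
import numpy as np

def reparameterize(mu, logvar, rng):
    # Sample z = mu + sigma * eps with eps ~ N(0, I), so the sample
    # is a deterministic function of (mu, logvar) given the noise eps
    # and gradients can flow through both parameters.
    std = np.exp(0.5 * logvar)
    eps = rng.standard_normal(mu.shape)
    return mu + std * eps

rng = np.random.default_rng(0)
mu = np.zeros((10000, 2)) + np.array([1.0, -1.0])
logvar = np.zeros((10000, 2))  # unit variance
z = reparameterize(mu, logvar, rng)
print(z.mean(axis=0))  # close to [1, -1]
```

Whether to feed p_x with z1_q (the posterior sample, as in reconstruction-style training) or a fresh z1_p drawn from p(z1 | z2) is exactly the design question this issue raises.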

VampPrior Generation

I'm unsure of the role the pseudo-inputs play in generation with a VampPrior. This is likely due to my lack of understanding of your paper; I find it hard to reconcile these implementation details with it (my lack of expertise in this area).

It seems that generation uses a subset of the pseudo-inputs, as per:

means = self.means(self.idle_input)[0:N]
z_sample_gen_mean, z_sample_gen_logvar = self.q_z(means)
z_sample_rand = self.reparameterize(z_sample_gen_mean, z_sample_gen_logvar)

I understand how to generate from these pseudo-inputs using reparameterization, but how would I sample from the aggregated prior itself? It is this prior that one would normally sample from in a VAE (though in the vanilla VAE it wouldn't be aggregated).
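Since the VampPrior is a uniform mixture p(z) = (1/K) Σ_k q(z | u_k) over learned pseudo-inputs u_k, sampling from the aggregated prior amounts to picking a component uniformly at random and then reparameterizing, which is what the snippet above does when the components are not subsampled. A numpy sketch under that reading (toy_q_z is a hypothetical stand-in for the model's encoder q_z, not the repo's actual network):

```python
import numpy as np

def sample_vampprior(pseudo_inputs, q_z, n_samples, rng):
    # VampPrior: p(z) = (1/K) * sum_k q(z | u_k).
    # To sample: pick a component k uniformly, then draw z ~ q(z | u_k).
    K = pseudo_inputs.shape[0]
    ks = rng.integers(0, K, size=n_samples)   # uniform mixture components
    mu, logvar = q_z(pseudo_inputs[ks])       # encode the chosen pseudo-inputs
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps    # reparameterize

# Toy stand-in encoder: maps each pseudo-input to a mean, fixed variance.
def toy_q_z(u):
    return u, np.full_like(u, -2.0)

rng = np.random.default_rng(0)
pseudo = np.array([[0.0, 0.0], [5.0, 5.0]])   # K = 2 pseudo-inputs
z = sample_vampprior(pseudo, toy_q_z, 10000, rng)
print(z.shape)  # (10000, 2)
```

Generating from a fixed subset [0:N] instead, as in the quoted code, draws one sample per selected component rather than sampling from the mixture itself.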
