jmtomczak / vae_vampprior
Code for the paper "VAE with a VampPrior", J.M. Tomczak & M. Welling
Home Page: https://jmtomczak.github.io/deebmed.html
License: MIT License
Hi, in lines 78–80 of VAE.py (commit bb6ff3e), when calculating the KL divergence between the VampPrior p(z) and the posterior q(z|x), we evaluate z_q under both N(z_p_mean, z_p_logvar) and N(z_q_mean, z_q_logvar), and then take the difference of the two Gaussian log-densities.
Since we already have the mean and variance of both the prior and the posterior, can we directly calculate the KL divergence between the two Gaussian distributions in closed form? This is because in line 226 (commit bb6ff3e), z_q is simply drawn from N(z_q_mean, z_q_logvar).
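For reference: when both distributions are single diagonal Gaussians, the KL term does have a closed form. A minimal sketch (assuming PyTorch; log_normal_diag is a stand-in for the repo's log-density helper) comparing it with the single-sample Monte Carlo estimate used in the code:

```python
import math

import torch

def log_normal_diag(z, mean, logvar):
    # Log density of a diagonal Gaussian, summed over the last dimension.
    return -0.5 * torch.sum(
        logvar + (z - mean) ** 2 / logvar.exp() + math.log(2 * math.pi),
        dim=-1,
    )

def kl_two_gaussians_analytic(q_mean, q_logvar, p_mean, p_logvar):
    # Closed-form KL( N(q_mean, exp(q_logvar)) || N(p_mean, exp(p_logvar)) ),
    # summed over latent dimensions (diagonal covariances).
    return 0.5 * torch.sum(
        p_logvar - q_logvar
        + (q_logvar.exp() + (q_mean - p_mean) ** 2) / p_logvar.exp()
        - 1.0,
        dim=-1,
    )

def kl_monte_carlo(z_q, q_mean, q_logvar, p_mean, p_logvar):
    # Single-sample estimate log q(z_q|x) - log p(z_q), matching the
    # log-density-difference approach in lines 78-80.
    return log_normal_diag(z_q, q_mean, q_logvar) - log_normal_diag(z_q, p_mean, p_logvar)
```

Note that the VampPrior itself is a mixture over pseudo-inputs rather than a single Gaussian, so the closed form above only applies when p(z) is one Gaussian; for the mixture prior, the Monte Carlo estimate remains necessary.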
Trying to get this working today, but I'm getting the following error:
load data
Files already downloaded and verified
create model
Traceback (most recent call last):
  File "experiment.py", line 160, in <module>
    run(args, kwargs)
  File "experiment.py", line 124, in run
    from models.VAE import VAE
  File "/workspace/vae_vampprior/models/VAE.py", line 20, in <module>
    from Model import Model
ImportError: No module named 'Model'
Any ideas?
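This error pattern usually means the code was written for Python 2, where `from Model import Model` finds the sibling `Model.py` implicitly; Python 3 dropped implicit relative imports, hence `No module named 'Model'`. A sketch of a Python 2/3-compatible fix (the demo builds a throwaway `models` package mirroring the repo's layout, so file names here are illustrative):

```python
import importlib
import os
import sys
import tempfile

# Build a throwaway "models" package to demonstrate the idiom.
pkg_root = tempfile.mkdtemp()
pkg = os.path.join(pkg_root, "models")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "Model.py"), "w") as f:
    f.write("class Model(object):\n    pass\n")

# The proposed fix for the import at the top of models/VAE.py:
fix = (
    "try:\n"
    "    from .Model import Model   # Python 3: explicit relative import\n"
    "except (ImportError, ValueError):\n"
    "    from Model import Model    # Python 2: implicit relative import\n"
)
with open(os.path.join(pkg, "VAE.py"), "w") as f:
    f.write(fix)

sys.path.insert(0, pkg_root)
vae = importlib.import_module("models.VAE")
print(vae.Model)  # the import now resolves under Python 3
```

Alternatively, running the scripts under Python 2, or adding the `models` directory to `PYTHONPATH`, sidesteps the issue without editing the source.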
In the generative part of the forward pass, when computing the conditional p(x | z1, z2) (vae_vampprior/models/HVAE_2level.py, line 294 at commit bb6ff3e), I suppose a self.reparameterize(z1_p_mean, z1_p_logvar) call is missing, i.e.:
# p(z1 | z2)
z1_p_mean, z1_p_logvar = self.p_z1(z2_q)
z1_p = self.reparameterize(z1_p_mean, z1_p_logvar)
# x_mean = p(x|z1,z2)
x_mean, x_logvar = self.p_x(z1_p, z2_q)
Let me know if I am wrong :-)
Very good code anyway!
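For context, `reparameterize` in VAE code of this kind is typically the standard reparameterization trick for a diagonal Gaussian. A minimal sketch (assuming PyTorch; the repo's exact implementation may differ in details):

```python
import torch

def reparameterize(mean, logvar):
    # z = mean + sigma * eps with eps ~ N(0, I), so sampling stays
    # differentiable with respect to mean and logvar.
    std = torch.exp(0.5 * logvar)
    eps = torch.randn_like(std)
    return mean + eps * std
```

The point of the suggested fix above is that the generative path should draw z1 stochastically from p(z1 | z2) rather than pass the mean through deterministically.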
I'm unsure of the role the pseudo-inputs play in generation using a VampPrior. This is likely due to my lack of understanding of your paper; I find it hard to reconcile these implementation details with the paper (my lack of expertise in this area).
It seems that when you generate, you generate using a subset of the pseudo-inputs, as per:
means = self.means(self.idle_input)[0:N]
z_sample_gen_mean, z_sample_gen_logvar = self.q_z(means)
z_sample_rand = self.reparameterize(z_sample_gen_mean, z_sample_gen_logvar)
I understand how to generate from these pseudo-inputs using reparameterization, but how would I sample from the aggregated prior itself? After all, it is this prior that one would normally sample from in a VAE (though in the vanilla VAE it wouldn't be aggregated).
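One way to sample from the VampPrior itself follows from its definition as a uniform mixture, p(z) = (1/K) Σ_k q(z | u_k) over K learned pseudo-inputs u_k: draw a component index k uniformly at random, then reparameterize through the encoder at u_k. A sketch (argument names mirror the repo's `q_z`, `means`, and `idle_input`, but the function itself is a hypothetical helper):

```python
import torch

def sample_from_vampprior(q_z, means_layer, idle_input, n_samples):
    # The VampPrior is the mixture p(z) = (1/K) * sum_k q(z | u_k).
    # Sampling from a uniform mixture = pick k uniformly, then z ~ q(z | u_k).
    pseudo_inputs = means_layer(idle_input)       # (K, input_dim) pseudo-inputs u_k
    K = pseudo_inputs.shape[0]
    idx = torch.randint(0, K, (n_samples,))       # uniform component choice
    z_mean, z_logvar = q_z(pseudo_inputs[idx])    # encoder applied to chosen u_k
    eps = torch.randn_like(z_mean)
    return z_mean + torch.exp(0.5 * z_logvar) * eps
```

The snippet quoted from the repo does essentially this but with the first N pseudo-inputs instead of a random component index, which is fine for visualizing samples but is not an exact draw from the mixture.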