  • But we can compute the KL divergence between N(0,I) and Q(z|X) in closed form if both are Gaussian, which is what we typically do. Once trained, the distribution of Q(z|X) will look like a quilt of 10 Gaussians for MNIST (analogous to the picture you described for MNIST with the uniform distribution), and the mixture will be broader than N(0,I). I guess we have to manually inspect this mixture distribution in the latent space and try to come up with a way to sample from it. On the other hand, this blog claims that we should sample from N(0,I) at generation time, not Q(z|X). Commented Mar 12, 2018 at 23:17
  • the blog: towardsdatascience.com/… Commented Mar 12, 2018 at 23:18
  • I edited my answer to address your question on why you might expect the "quilt" to be approximately N(0,I). Commented Mar 13, 2018 at 18:25
  • @shimao I think it is misleading for some tutorials to call $KL(Q(z|x) \Vert P(z))$ a regularizer term; it is simply a term in the lower bound. Moreover, for generative purposes, the only thing we care about is $P(x) = \int P(x|z)P(z)\,dz$. $P(z)$ is the prior and we get $P(x|z)$ by training, so we don't actually care about $Q(z|x)$. Commented May 12, 2018 at 5:23
  • @me_Tchaikovsky I'm not exactly sure what you're getting at, but there is indeed value in knowing how close $Q(z|x)$ comes to $P(z|x)$, because that determines whether it makes sense to sample from VAEs in the straightforward way (sampling the latent space and then running the decoder). Commented May 12, 2018 at 5:28
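
The closed-form KL term discussed in these comments, for a diagonal-Gaussian $Q(z|x) = N(\mu, \mathrm{diag}(\sigma^2))$ against the $N(0, I)$ prior, can be sketched as follows. This is a minimal NumPy illustration; the function name `kl_to_standard_normal` is a hypothetical helper, not from any specific library:

```python
import numpy as np

def kl_to_standard_normal(mu, logvar):
    """Closed-form KL( N(mu, diag(exp(logvar))) || N(0, I) ).

    This is the standard Gaussian KL formula:
        KL = -0.5 * sum(1 + log(sigma^2) - mu^2 - sigma^2)
    which appears as one term of the VAE lower bound.
    """
    return -0.5 * np.sum(1.0 + logvar - mu**2 - np.exp(logvar))

# Sanity check: KL vanishes when Q(z|x) equals the prior N(0, I),
# and grows as the posterior mean drifts away from zero.
print(kl_to_standard_normal(np.zeros(2), np.zeros(2)))  # 0.0
print(kl_to_standard_normal(np.ones(2), np.zeros(2)))   # 1.0
```

Because this expression is differentiable in `mu` and `logvar`, it can be minimized directly during training without any Monte Carlo estimate of the KL term.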