What is the purpose of the KL divergence term in the VAE loss function?

Machine Learning — Hard

Key points

  • The KL divergence term regularizes the encoder's approximate posterior q(z|x) toward a standard Gaussian prior p(z) = N(0, I)
  • Without it, the encoder could place each input in an arbitrary, isolated region of latent space, behaving like a plain autoencoder
  • This regularization keeps the latent space smooth and densely packed, which enables meaningful interpolation and sampling from the prior
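To make the regularization concrete, here is a minimal sketch of the KL term. For a diagonal-Gaussian encoder q(z|x) = N(μ, σ²) and prior N(0, I), the KL divergence has the closed form KL = -½ Σ (1 + log σ² − μ² − σ²), which is what VAE implementations typically add to the reconstruction loss. The function name and the encoder outputting log-variance are illustrative assumptions, not from the original text.

```python
import numpy as np

def kl_divergence(mu, log_var):
    # Closed-form KL( N(mu, sigma^2) || N(0, I) ) for a diagonal Gaussian,
    # with the encoder parameterized by mu and log(sigma^2):
    #   KL = -0.5 * sum(1 + log(sigma^2) - mu^2 - sigma^2)
    return -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var))

# When the encoder's output matches the prior (mu = 0, sigma = 1),
# the penalty is exactly zero.
print(kl_divergence(np.zeros(4), np.zeros(4)))  # 0.0

# Drifting away from the standard normal is penalized:
# shifting one mean to 2.0 contributes -0.5 * (1 + 0 - 4 - 1) = 2.0.
print(kl_divergence(np.array([2.0, 0.0, 0.0, 0.0]), np.zeros(4)))  # 2.0
```

In training, this term is summed with the reconstruction loss to form the negative ELBO; its pull toward N(0, I) is what keeps nearby latent codes decoding to similar outputs.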
