Learning in Variational Autoencoders with Kullback-Leibler and Rényi Integral Bounds


ICML 2018 Workshop on Theoretical Foundations and Applications of Deep Generative Models (2018).


Abstract

In this paper we propose two novel bounds on the log-likelihood, based on the Kullback-Leibler and Rényi divergences, which can be used for variational inference and in particular for training Variational AutoEncoders. Our proposal is motivated by the difficulties encountered when training VAEs on continuous datasets of high-contrast images, such as handwritten digits and characters, where numerical issues often arise unless noise is added, either to the dataset during training or to the generative model given by the decoder. The new bounds, obtained by maximizing the likelihood of an interval around each observation rather than a pointwise density, allow numerically stable training without adding any extra source of noise to the data.
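
To make the interval idea concrete, below is a minimal sketch of replacing the pointwise density in a VAE's reconstruction term with the probability mass of a small interval around each pixel. This is an illustration of the general technique, not the authors' exact bound; the Gaussian decoder, the function name interval_log_likelihood, the default half_width (half of one level on a 256-level intensity scale mapped to [0, 1]), and the clamping constant are all assumptions made for the example. Because a probability mass is at most 1, its logarithm stays bounded above even when the decoder variance collapses on saturated pixels, which is where pointwise densities blow up.

    import torch
    from torch.distributions import Normal

    def interval_log_likelihood(x, mu, sigma, half_width=1.0 / 512):
        """Log-probability that each observation lies in the interval
        [x - half_width, x + half_width] under a Gaussian decoder N(mu, sigma).

        Unlike the pointwise log-density log N(x; mu, sigma), this mass is
        always <= 1, so the objective stays bounded even when sigma shrinks
        on high-contrast pixels (e.g. values saturated at 0 or 1).
        (Illustrative sketch; not the exact bound derived in the paper.)
        """
        dist = Normal(mu, sigma)
        # P(x - h <= X <= x + h) = CDF(x + h) - CDF(x - h)
        mass = dist.cdf(x + half_width) - dist.cdf(x - half_width)
        # Clamp to avoid log(0) when the interval falls far in the tails.
        return torch.log(mass.clamp(min=1e-12))

In a VAE training loop, this term would stand in for the usual Gaussian reconstruction log-likelihood, summed over pixels and combined with the KL (or Rényi) regularizer to form the bound.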


