Constraining Variational Inference with Geometric Jensen-Shannon Divergence

Published in NeurIPS, 2020

Recommended citation: Deasy J, Simidjievski N and Lio P (2020). Constraining Variational Inference with Geometric Jensen-Shannon Divergence. arXiv preprint arXiv:2006.10599. http://jacobdeasy.github.io/files/publications/2020-10-18-gjs.pdf

This paper introduces geometric Jensen-Shannon VAEs, a generalisation of the beta-VAE family based on a closed-form interpolation between forward and reverse Kullback-Leibler divergence. The work was carried out during my PhD with Prof. Pietro Lio at the University of Cambridge and was accepted at NeurIPS 2020.
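To give a flavour of the idea, here is a minimal sketch of a skew geometric Jensen-Shannon divergence between two univariate Gaussians. It is illustrative only: the skew convention used below (geometric mean G_alpha proportional to p^alpha * q^(1-alpha), with weights (1-alpha) and alpha on the two KL terms) is one possible choice and may differ from the paper's exact parameterisation, but it shows the key property that the divergence stays in closed form for Gaussians and recovers forward KL at alpha = 0 and reverse KL at alpha = 1.

```python
import math


def kl_gaussian(m0, s0, m1, s1):
    """KL(N(m0, s0^2) || N(m1, s1^2)) for univariate Gaussians."""
    return math.log(s1 / s0) + (s0 ** 2 + (m0 - m1) ** 2) / (2 * s1 ** 2) - 0.5


def geometric_mean_gaussian(m0, s0, m1, s1, alpha):
    """Normalised geometric mean p^alpha * q^(1-alpha) of two Gaussians.

    The result is itself Gaussian, with precision-weighted parameters --
    this is what keeps the whole divergence in closed form.
    """
    precision = alpha / s0 ** 2 + (1 - alpha) / s1 ** 2
    var = 1.0 / precision
    mean = var * (alpha * m0 / s0 ** 2 + (1 - alpha) * m1 / s1 ** 2)
    return mean, math.sqrt(var)


def gjs(m0, s0, m1, s1, alpha):
    """Skew geometric-JS divergence between p = N(m0, s0^2) and q = N(m1, s1^2).

    One illustrative convention (not necessarily the paper's exact one):
    JS^G_alpha(p || q) = (1 - alpha) * KL(p || G_alpha) + alpha * KL(q || G_alpha).
    """
    mg, sg = geometric_mean_gaussian(m0, s0, m1, s1, alpha)
    return ((1 - alpha) * kl_gaussian(m0, s0, mg, sg)
            + alpha * kl_gaussian(m1, s1, mg, sg))
```

Under this convention, `gjs(..., alpha=0.0)` equals the forward KL(p || q), `gjs(..., alpha=1.0)` equals the reverse KL(q || p), and intermediate alpha values interpolate between the two, which is what lets the divergence act as a tunable constraint in a VAE objective.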