Please do not share or link.
- Revisit Probabilistic PCA and Factor Analysis.
- Generalize to deep latent Gaussian models (DLGMs) and describe how inference is done: amortized variational inference / stochastic backpropagation with inference networks.
- Generalize amortized variational inference to implicit distributions: adversarial autoencoders, BiGAN/ALI, AVB.
- Formulate CycleGAN as a deep latent Gaussian model with an implicit prior distribution, where inference is done using amortized variational inference with an implicit approximate posterior distribution.
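As a concrete anchor for the amortized-inference step above, here is a minimal NumPy sketch of an inference network with the reparameterization trick, the core of stochastic backpropagation in DLGMs. All names, shapes, and the linear encoder are hypothetical choices for illustration, not from the papers cited below.

```python
import numpy as np

rng = np.random.default_rng(0)

def inference_net(x, W_mu, W_ls):
    """Toy linear encoder: maps observation x to the mean and log-std
    of the Gaussian approximate posterior q(z | x)."""
    return W_mu @ x, W_ls @ x

def sample_q(mu, log_sigma, rng):
    """Reparameterized sample z = mu + sigma * eps with eps ~ N(0, I),
    so gradients can flow through the sampling step."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(log_sigma) * eps

# Dimensions chosen arbitrarily for the sketch: 5-d observation, 2-d latent.
x = rng.standard_normal(5)
W_mu = rng.standard_normal((2, 5))
W_ls = 0.01 * rng.standard_normal((2, 5))

mu, log_sigma = inference_net(x, W_mu, W_ls)
z = sample_q(mu, log_sigma, rng)

# Analytic KL(q(z|x) || p(z)) against a standard-normal prior p(z) = N(0, I);
# this is the regularization term of the ELBO and is always non-negative.
kl = 0.5 * np.sum(np.exp(2 * log_sigma) + mu ** 2 - 1 - 2 * log_sigma)
print(z.shape, kl)
```

Replacing the Gaussian q(z | x) with a network whose density cannot be evaluated is exactly the move to implicit approximate posteriors (AVB, BiGAN/ALI) in the later bullets.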
- M. E. Tipping and C. M. Bishop, "Probabilistic Principal Component Analysis," Journal of the Royal Statistical Society, Series B (Statistical Methodology), vol. 61, pp. 611–622, 1999.
- D. J. Rezende, S. Mohamed, and D. Wierstra, "Stochastic Backpropagation and Approximate Inference in Deep Generative Models," in Proceedings of the 31st International Conference on Machine Learning, Beijing, China, 2014, vol. 32, no. 2, pp. 1278–1286.
- J.-Y. Zhu, T. Park, P. Isola, and A. A. Efros, "Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks," arXiv preprint, Mar. 2017.
- Z. Hu, Z. Yang, R. Salakhutdinov, and E. P. Xing, "On Unifying Deep Generative Models," arXiv preprint, Jun. 2017.
- L. Mescheder, S. Nowozin, and A. Geiger, "Adversarial Variational Bayes: Unifying Variational Autoencoders and Generative Adversarial Networks," in Proceedings of the 34th International Conference on Machine Learning, 2017, vol. 70, pp. 2391–2400.
- D. Tran, R. Ranganath, and D. Blei, "Hierarchical Implicit Models and Likelihood-Free Variational Inference," to appear in Advances in Neural Information Processing Systems 30, 2017.