Tech Talk: Cycle-Consistent Adversarial Learning as Approximate Bayesian Inference
Amazon Machine Learning Community Tech Talk, Berlin.
We show how to approximate the KL divergence (in fact, any f-divergence) between implicit distributions using density ratio estimation by probabilistic classification.
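The density ratio trick behind this talk can be sketched in a few lines: train a probabilistic classifier to distinguish samples from two distributions whose densities we never evaluate, and read the log density ratio off its logit. The sketch below (plain NumPy, with a hand-rolled logistic regression; the Gaussian choice is only for checking the answer) is a minimal illustration under those assumptions, not the talk's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Treat p = N(1, 1) and q = N(0, 1) as "implicit": we only draw samples.
# The true KL(p || q) is 0.5, which lets us sanity-check the estimate.
xp = rng.normal(1.0, 1.0, size=5000)
xq = rng.normal(0.0, 1.0, size=5000)

# Probabilistic classification: label p-samples 1 and q-samples 0, then fit
# logistic regression. At the optimum its logit approximates log p(x)/q(x).
x = np.concatenate([xp, xq])
y = np.concatenate([np.ones_like(xp), np.zeros_like(xq)])
X = np.stack([x, np.ones_like(x)], axis=1)  # feature + bias column

w = np.zeros(2)
for _ in range(2000):  # plain gradient ascent on the Bernoulli log-likelihood
    prob = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.1 * X.T @ (y - prob) / len(y)

# KL(p || q) = E_p[log p/q], so average the logit over samples from p.
kl_est = float(np.stack([xp, np.ones_like(xp)], axis=1) @ w @ np.ones(1)
               if False else np.mean(np.stack([xp, np.ones_like(xp)], axis=1) @ w))
print(kl_est)  # should land near the true value 0.5
```

Replacing the mean logit with other convex functions of the ratio yields the corresponding f-divergence estimates, which is what makes the trick general.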
We illustrate how to build complicated probability distributions in a modular fashion using the Bijector API from TensorFlow Probability.
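The modular idea behind bijectors can be shown without TensorFlow Probability itself: an invertible map with a tractable log-Jacobian turns one density into another via the change-of-variables formula, and such maps compose. The following is a minimal NumPy sketch of that idea (the class and method names deliberately echo, but are not, the TFP API).

```python
import numpy as np

# Change of variables: log p_Y(y) = log p_X(f_inv(y)) + log |d f_inv / dy|.

class Shift:
    def __init__(self, b): self.b = b
    def forward(self, x): return x + self.b
    def inverse(self, y): return y - self.b
    def inverse_log_det_jacobian(self, y): return 0.0  # unit Jacobian

class Scale:
    def __init__(self, s): self.s = s
    def forward(self, x): return self.s * x
    def inverse(self, y): return y / self.s
    def inverse_log_det_jacobian(self, y): return -np.log(abs(self.s))

class Chain:
    """Apply bijectors right-to-left on forward, as tfb.Chain does."""
    def __init__(self, bijectors): self.bijectors = bijectors
    def forward(self, x):
        for b in reversed(self.bijectors):
            x = b.forward(x)
        return x
    def log_prob(self, y, base_log_prob):
        # Walk the chain in reverse, accumulating inverse log-Jacobians.
        ldj = 0.0
        for b in self.bijectors:
            ldj += b.inverse_log_det_jacobian(y)
            y = b.inverse(y)
        return base_log_prob(y) + ldj

# Push a standard normal through y = 2x + 1, giving N(1, 4).
std_normal = lambda x: -0.5 * (x**2 + np.log(2 * np.pi))
dist = Chain([Shift(1.0), Scale(2.0)])
print(dist.log_prob(1.0, std_normal))  # equals log N(1 | 1, 4) = -0.5*log(8*pi)
```

Swapping in richer invertible maps (e.g. learned ones) while keeping the same composition machinery is exactly the modularity the talk illustrates with TFP's Bijector API.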
ICML 2018 Workshop on Theoretical Foundations and Applications of Deep Generative Models (TAGDM), Stockholm.
We derive cycle-consistent adversarial learning (CycleGAN) as a special case of variational inference in a latent-variable model with implicit priors, establishing a Bayesian …
We give an in-depth practical guide to variational autoencoders from a probabilistic perspective.
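The central object in that guide is the evidence lower bound (ELBO); in the standard notation, with encoder $q_\phi(z \mid x)$, decoder $p_\theta(x \mid z)$, and prior $p(z)$:

$$
\log p_\theta(x) \;\ge\; \mathbb{E}_{q_\phi(z \mid x)}\!\left[\log p_\theta(x \mid z)\right] \;-\; \mathrm{KL}\!\left(q_\phi(z \mid x) \,\|\, p(z)\right)
$$

Maximizing the right-hand side jointly over $\theta$ and $\phi$ trains the VAE: the first term rewards reconstruction, the KL term regularizes the approximate posterior toward the prior.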