An Illustrated Guide to the Knowledge Gradient Acquisition Function
We give a short illustrated reference guide to the Knowledge Gradient acquisition function with an implementation from scratch in TensorFlow Probability.
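As a taste of the idea, here is a minimal Monte Carlo sketch of the Knowledge Gradient over a discrete candidate set, in plain NumPy rather than TensorFlow Probability. The posterior mean, covariance, and noise level below are made-up illustrative numbers, not values from the post:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discrete design: a GP posterior over 5 candidate points,
# summarized by its mean vector and covariance matrix (illustrative numbers).
mu = np.array([0.0, 0.3, 0.5, 0.2, -0.1])
A = rng.normal(size=(5, 5))
Sigma = A @ A.T / 5 + 1e-6 * np.eye(5)   # a valid covariance matrix
noise = 0.1                               # observation-noise variance

def knowledge_gradient(i, n_samples=4000):
    """Monte Carlo estimate of KG for sampling candidate i:
    E[max_j mu_new(j)] - max_j mu(j), where mu_new is the posterior
    mean after fantasizing an observation y at point i."""
    s2 = Sigma[i, i] + noise
    # Posterior-mean update direction from GP conditioning on one observation.
    direction = Sigma[:, i] / s2
    y = rng.normal(mu[i], np.sqrt(s2), size=n_samples)
    mu_new = mu[None, :] + (y - mu[i])[:, None] * direction[None, :]
    return mu_new.max(axis=1).mean() - mu.max()

kg = np.array([knowledge_gradient(i) for i in range(5)])
```

The KG value of a candidate is the expected improvement in the best posterior mean after observing there; it is nonnegative in expectation, though a Monte Carlo estimate can dip slightly below zero.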
Our paper "Variational Inference for Graph Convolutional Networks in the Absence of Graph Data and Adversarial Settings" was accepted to NeurIPS 2020 as a Spotlight Presentation …
We summarize the notation, identities, and derivations underlying the sparse variational Gaussian process (SVGP) framework.
We show how to approximate the KL divergence (in fact, any f-divergence) between implicit distributions using density ratio estimation by probabilistic classification.
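The core trick can be sketched in a few lines: train a probabilistic classifier to distinguish samples of p (label 1) from samples of q (label 0); its logit then approximates log p(x)/q(x). A minimal NumPy version with hand-rolled logistic regression, using two Gaussians whose true log-ratio is known analytically:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
xp = rng.normal(1.0, 1.0, n)  # samples from p = N(1, 1)
xq = rng.normal(0.0, 1.0, n)  # samples from q = N(0, 1)

x = np.concatenate([xp, xq])
y = np.concatenate([np.ones(n), np.zeros(n)])  # 1 = "from p", 0 = "from q"

# Fit logistic regression by gradient descent on the cross-entropy loss.
w, b = 0.0, 0.0
for _ in range(3000):
    probs = 1.0 / (1.0 + np.exp(-(w * x + b)))
    w -= 1.0 * np.mean((probs - y) * x)
    b -= 1.0 * np.mean(probs - y)

# With balanced classes, the classifier logit approximates log p(x)/q(x).
# Analytically here, log N(x; 1, 1) - log N(x; 0, 1) = x - 1/2.
estimated = w * 2.0 + b   # estimated log-ratio at x = 2
analytic = 2.0 - 0.5
```

Plugging such log-ratio estimates into the defining integral of an f-divergence then yields the Monte Carlo approximation the post develops.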
We illustrate how to build complicated probability distributions in a modular fashion using the Bijector API from TensorFlow Probability.
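To give a flavor of the idea, here is a toy stand-in for a bijector in plain NumPy (not the actual TFP class): a transform bundling `forward`, `inverse`, and the log-determinant of the inverse Jacobian, which together turn a base density into a transformed one via the change-of-variables formula. Pushing a standard normal through `exp` yields a log-normal:

```python
import numpy as np

# A minimal bijector-style object sketching the idea behind TFP's Bijector API.
class Exp:
    def forward(self, x):
        return np.exp(x)
    def inverse(self, y):
        return np.log(y)
    def inverse_log_det_jacobian(self, y):
        # d/dy log(y) = 1/y, so log|det J| = -log(y)
        return -np.log(y)

def normal_log_prob(x):
    # log density of N(0, 1)
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

bij = Exp()
# Change of variables: log p_Y(y) = log p_X(inv(y)) + log|det J_inv(y)|.
y = 2.0
log_prob_y = normal_log_prob(bij.inverse(y)) + bij.inverse_log_det_jacobian(y)

# Analytic standard log-normal density at y = 2, for comparison.
analytic = -np.log(y) - 0.5 * np.log(2 * np.pi) - 0.5 * np.log(y) ** 2
```

Chaining several such transforms is what makes the approach modular: each bijector contributes its own Jacobian term, and composition just sums them.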
We give an in-depth practical guide to variational autoencoders from a probabilistic perspective.
We compare NumPy's `mgrid` and `meshgrid` for building coordinate grids — what each does, why both exist, and how broadcasting often makes them optional.
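The comparison fits in a few lines. `meshgrid` takes explicit 1-D axes (defaulting to `'xy'` indexing), `mgrid` uses slice syntax with `'ij'` indexing, and broadcasting lets you skip the dense grid entirely:

```python
import numpy as np

# meshgrid: explicit 1-D axes, default indexing='xy' (columns vary first).
X, Y = np.meshgrid(np.arange(3), np.arange(2))            # each shape (2, 3)

# mgrid: slice syntax, always 'ij' indexing (rows vary first).
I, J = np.mgrid[0:2, 0:3]                                  # each shape (2, 3)

# With indexing='ij', meshgrid reproduces mgrid exactly.
Xi, Yj = np.meshgrid(np.arange(2), np.arange(3), indexing="ij")

# Broadcasting often makes both optional: combine sparse axes directly.
x = np.arange(3)              # shape (3,)
y = np.arange(2)[:, None]     # shape (2, 1)
Z = x + y                     # shape (2, 3), no dense grid materialized
```

For large grids the broadcasting form also avoids allocating the full coordinate arrays, which is why both functions are frequently optional in practice.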