Probabilistic Models

BORE: Bayesian Optimization by Density Ratio Estimation

Bayesian optimization (BO) is among the most effective and widely used black-box optimization methods. BO proposes solutions according to an explore-exploit trade-off criterion encoded in an acquisition function, many of which are derived from the …
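
To give a flavour of the density-ratio idea (a hedged sketch, not the paper's implementation): the acquisition function can be taken to be the class posterior of a probabilistic classifier trained to separate the best-performing fraction gamma of the observations from the rest, so that proposing the next point reduces to maximizing the classifier's output. The function propose_next and all settings below are illustrative.

    # Illustrative sketch of classifier-based BO; the acquisition is the class
    # posterior of a classifier trained to separate the top-gamma observations
    # from the rest. Assumed, simplified code -- not the paper's implementation.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    def propose_next(X, y, bounds, gamma=0.25, n_candidates=2000, seed=None):
        rng = np.random.default_rng(seed)
        tau = np.quantile(y, gamma)              # threshold at the gamma-quantile
        z = (y <= tau).astype(int)               # 1 = "good" (minimization), 0 = "bad"
        clf = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=1000)
        clf.fit(X, z)
        lo, hi = bounds
        candidates = rng.uniform(lo, hi, size=(n_candidates, X.shape[1]))
        scores = clf.predict_proba(candidates)[:, 1]   # acquisition ~ P(good | x)
        return candidates[np.argmax(scores)]

    # Toy usage: minimize a 1-D quadratic.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(20, 1))
    y = (X[:, 0] - 1.0) ** 2
    print(propose_next(X, y, bounds=(-3.0, 3.0), seed=0))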

Variational Inference for Graph Convolutional Networks in the Absence of Graph Data and Adversarial Settings

We propose a framework that lifts the capabilities of graph convolutional networks (GCNs) to scenarios where no input graph is given and increases their robustness to adversarial attacks. We formulate a joint probabilistic model that considers a …

Variational Graph Convolutional Networks

We propose a framework that lifts the capabilities of graph convolutional networks (GCNs) to scenarios where no input graph is given and increases their robustness to adversarial attacks. We formulate a joint probabilistic model that considers a …
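
As a rough illustration of treating the graph itself as a latent variable (an assumed sketch, not the paper's code): one can parameterize a Bernoulli-like posterior over every edge, draw a relaxed (concrete) sample of the adjacency matrix, and propagate node features through it as in a standard GCN layer. LatentGraphGCNLayer and all dimensions below are hypothetical.

    # Assumed minimal sketch of a GCN layer over a *learned* adjacency matrix:
    # relaxed Bernoulli edge samples make the graph differentiable end to end,
    # so the layer can be trained even when no input graph is observed.
    import torch
    import torch.nn as nn

    class LatentGraphGCNLayer(nn.Module):
        def __init__(self, n_nodes, in_dim, out_dim):
            super().__init__()
            self.edge_logits = nn.Parameter(torch.zeros(n_nodes, n_nodes))  # q(A)
            self.linear = nn.Linear(in_dim, out_dim)

        def forward(self, x, temperature=0.5):
            # Relaxed Bernoulli (concrete) sample of the adjacency matrix.
            u = torch.rand_like(self.edge_logits).clamp(1e-6, 1 - 1e-6)
            logits = self.edge_logits + torch.log(u) - torch.log(1 - u)
            a = torch.sigmoid(logits / temperature)
            a = (a + a.t()) / 2                    # keep the graph symmetric
            a = a + torch.eye(a.size(0))           # add self-loops
            deg = a.sum(dim=1, keepdim=True)
            return self.linear((a / deg) @ x)      # row-normalized propagation

    # Toy usage: 5 nodes with 3 features each, no observed graph.
    layer = LatentGraphGCNLayer(n_nodes=5, in_dim=3, out_dim=2)
    print(layer(torch.randn(5, 3)).shape)          # torch.Size([5, 2])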

A Cheatsheet for Sparse Variational Gaussian Processes

A summary of notation, identities and derivations for the sparse variational Gaussian process (SVGP) framework.
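
For reference, the central object such a cheatsheet revolves around is the SVGP evidence lower bound; in the standard formulation with inducing variables (a textbook form, not necessarily the cheatsheet's exact notation):

    \mathcal{L} \;=\; \sum_{i=1}^{N} \mathbb{E}_{q(f_i)}\!\left[\log p(y_i \mid f_i)\right]
      \;-\; \mathrm{KL}\!\left[\, q(\mathbf{u}) \,\|\, p(\mathbf{u}) \,\right],
    \qquad q(\mathbf{u}) = \mathcal{N}(\mathbf{m}, \mathbf{S}),
    \quad q(f_i) = \int p(f_i \mid \mathbf{u})\, q(\mathbf{u})\, \mathrm{d}\mathbf{u}.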

Cycle-Consistent Adversarial Learning as Approximate Bayesian Inference

We formalize the problem of learning interdomain correspondences in the absence of paired data as Bayesian inference in a latent variable model (LVM), where one seeks the underlying hidden representations of entities from one domain as entities from …
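
As a small illustration of the loss this view reinterprets (a hedged sketch, not the paper's model): the familiar L1 cycle-consistency terms below correspond to Laplace observation models in a latent variable formulation. The mappings G and F and the toy data are stand-ins.

    # Hedged sketch of the cycle-consistency term: reconstruct x after mapping
    # to the other domain and back. G, F, and the data are illustrative only.
    import torch
    import torch.nn as nn

    G = nn.Linear(2, 2)   # maps domain A -> domain B
    F = nn.Linear(2, 2)   # maps domain B -> domain A

    def cycle_consistency_loss(x_a, x_b):
        # || F(G(x_a)) - x_a ||_1 + || G(F(x_b)) - x_b ||_1: the usual CycleGAN
        # reconstruction terms, read as Laplace log-likelihoods in the LVM view.
        return (F(G(x_a)) - x_a).abs().mean() + (G(F(x_b)) - x_b).abs().mean()

    print(cycle_consistency_loss(torch.randn(8, 2), torch.randn(8, 2)).item())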

A Tutorial on Variational Autoencoders with a Concise Keras Implementation

An in-depth practical guide to variational autoencoders from a probabilistic perspective.
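
A minimal sketch of the core computation such a tutorial builds up to, assuming a Gaussian encoder, a Bernoulli decoder, and the reparameterization trick (an illustrative condensation, not the tutorial's exact Keras code):

    # Assumed minimal VAE core: encoder gives a Gaussian q(z|x), z is sampled
    # via the reparameterization trick, and the loss is the negative ELBO =
    # reconstruction term + KL(q(z|x) || N(0, I)).
    import numpy as np
    import tensorflow as tf
    from tensorflow import keras

    latent_dim, original_dim = 2, 784

    encoder = keras.Sequential([
        keras.Input(shape=(original_dim,)),
        keras.layers.Dense(256, activation="relu"),
        keras.layers.Dense(2 * latent_dim),        # outputs [mean, log-variance]
    ])
    decoder = keras.Sequential([
        keras.Input(shape=(latent_dim,)),
        keras.layers.Dense(256, activation="relu"),
        keras.layers.Dense(original_dim, activation="sigmoid"),
    ])

    def negative_elbo(x):
        mean, log_var = tf.split(encoder(x), 2, axis=-1)
        eps = tf.random.normal(tf.shape(mean))
        z = mean + tf.exp(0.5 * log_var) * eps     # reparameterization trick
        x_hat = decoder(z)
        recon = -tf.reduce_sum(                    # Bernoulli log-likelihood
            x * tf.math.log(x_hat + 1e-7) + (1 - x) * tf.math.log(1 - x_hat + 1e-7),
            axis=-1)
        kl = -0.5 * tf.reduce_sum(
            1 + log_var - mean ** 2 - tf.exp(log_var), axis=-1)
        return tf.reduce_mean(recon + kl)

    # Toy usage on a random batch.
    x = np.random.rand(16, original_dim).astype("float32")
    print(float(negative_elbo(x)))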