Machine Learning

BORE: Bayesian Optimization by Density Ratio Estimation

Bayesian optimization (BO) is among the most effective and widely used black-box optimization methods. BO proposes solutions according to an explore-exploit trade-off criterion encoded in an acquisition function, many of which are derived from the …
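
As a rough sketch of the core idea (not the implementation from the paper), the acquisition step can be reduced to fitting a probabilistic classifier that separates "good" observations (those below a quantile of the observed targets) from the rest, and proposing the input that maximizes the predicted class probability. The quantile `gamma`, the random candidate search, and the choice of `RandomForestClassifier` below are illustrative assumptions.

```python
# Minimal, illustrative sketch of a BORE-style acquisition step.
# Assumes a scalar objective observed at inputs X with values y (lower is better).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def propose_next(X, y, bounds, gamma=0.25, n_candidates=1000, seed=0):
    rng = np.random.default_rng(seed)
    # Label points as "good" (1) if at or below the gamma-quantile of observed values.
    tau = np.quantile(y, gamma)
    z = (y <= tau).astype(int)
    # The class-probability estimate acts as a (monotone transform of a) density-ratio estimate.
    clf = RandomForestClassifier(random_state=seed).fit(X, z)
    # Maximize the classifier output over random candidates (illustrative only).
    lo, hi = bounds
    candidates = rng.uniform(lo, hi, size=(n_candidates, X.shape[1]))
    scores = clf.predict_proba(candidates)[:, 1]
    return candidates[np.argmax(scores)]
```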

Progress Review 2020

Annual progress review of my PhD candidature for 2019–2020.

AutoGluon

AutoGluon is a library for asynchronous, distributed hyperparameter optimization (HPO) and neural architecture search (NAS) that implements numerous state-of-the-art methods. I was a core developer of the [Gaussian process-based multi-fidelity searcher](https://autogluon.mxnet.io/api/autogluon.searcher.html#gpmultifidelitysearcher) module.

Variational Inference for Graph Convolutional Networks in the Absence of Graph Data and Adversarial Settings

We propose a framework that lifts the capabilities of graph convolutional networks (GCNs) to scenarios where no input graph is given and increases their robustness to adversarial attacks. We formulate a joint probabilistic model that considers a …

Model-based Asynchronous Hyperparameter and Neural Architecture Search

We introduce a model-based asynchronous multi-fidelity method for hyperparameter and neural architecture search that combines the strengths of asynchronous Hyperband and Gaussian process-based Bayesian optimization. At the heart of our method is a …
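
For context on the asynchronous Hyperband half of the combination, here is a minimal sketch of the promotion rule only; the Gaussian process-based searcher at the heart of the method is not shown, and the data structures and names below are illustrative. Whenever a worker frees up, a configuration is promoted to the next rung if its result ranks in the top 1/η at its current rung; otherwise a new configuration is started at the lowest rung.

```python
# Illustrative sketch of an asynchronous successive-halving promotion rule
# (the scheduling component only; the model-based searcher is not shown).

def next_job(rungs, eta=3):
    """rungs: list (lowest to highest rung) of dicts mapping config id -> recorded loss.
    Returns (config_id, target_rung_index) to promote, or None to start a new config."""
    for r in reversed(range(len(rungs) - 1)):
        results = rungs[r]
        if not results:
            continue
        # Top 1/eta configs at rung r, ranked by loss (lower is better).
        cutoff = max(1, len(results) // eta)
        ranked = sorted(results, key=results.get)[:cutoff]
        for config in ranked:
            if config not in rungs[r + 1]:
                # Promote immediately; do not wait for the rung to fill (asynchronous).
                return config, r + 1
    return None  # nothing promotable: start a new configuration at the lowest rung
```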

Variational Graph Convolutional Networks

We propose a framework that lifts the capabilities of graph convolutional networks (GCNs) to scenarios where no input graph is given and increases their robustness to adversarial attacks. We formulate a joint probabilistic model that considers a …

A Cheatsheet for Sparse Variational Gaussian Processes

A summary of notation, identities and derivations for the sparse variational Gaussian process (SVGP) framework.
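
For reference, the central object in that framework is the SVGP evidence lower bound (ELBO); a standard form, with inducing variables u and variational distribution q(u) (notation may differ slightly from the cheatsheet's), is:

```latex
% Standard SVGP evidence lower bound (notation may differ from the cheatsheet).
\mathcal{L}
  = \sum_{n=1}^{N} \mathbb{E}_{q(f_n)}\!\left[ \log p(y_n \mid f_n) \right]
  - \mathrm{KL}\!\left[ q(\mathbf{u}) \,\|\, p(\mathbf{u}) \right],
\qquad
q(f_n) = \int p(f_n \mid \mathbf{u}) \, q(\mathbf{u}) \, \mathrm{d}\mathbf{u}.
```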

Density Ratio Estimation for KL Divergence Minimization between Implicit Distributions

This post demonstrates how to approximate the KL divergence (in fact, any f-divergence) between implicit distributions, using density ratio estimation by probabilistic classification.
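
A minimal sketch of the underlying trick (the classifier and toy distributions here are illustrative choices, not the post's implementation): train a probabilistic classifier to distinguish samples from p (label 1) from samples from q (label 0); with balanced classes its logit estimates log p(x) − log q(x), so averaging the logits over samples from p gives a Monte Carlo estimate of KL(p‖q).

```python
# Minimal sketch: estimate KL(p || q) from samples only, via probabilistic classification.
import numpy as np
from sklearn.linear_model import LogisticRegression

def kl_estimate(samples_p, samples_q):
    X = np.concatenate([samples_p, samples_q])
    z = np.concatenate([np.ones(len(samples_p)), np.zeros(len(samples_q))])
    # With balanced classes, the classifier's logit approximates log p(x) - log q(x).
    clf = LogisticRegression().fit(X, z)
    log_ratio = clf.decision_function(samples_p)
    return log_ratio.mean()  # Monte Carlo estimate of E_p[log p(x)/q(x)]

# Toy check: p = N(1, 1), q = N(0, 1); the true KL divergence is 0.5.
rng = np.random.default_rng(0)
p = rng.normal(1.0, 1.0, size=(5000, 1))
q = rng.normal(0.0, 1.0, size=(5000, 1))
print(kl_estimate(p, q))
```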

Building Probability Distributions with the TensorFlow Probability Bijector API

We illustrate how to build complicated probability distributions in a modular fashion using the Bijector API from TensorFlow Probability.
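
As a small, self-contained example of the kind of construction the post covers (this particular chain is illustrative, not taken from the post, and assumes a recent TensorFlow Probability release):

```python
# Illustrative example: push a standard normal through a chain of bijectors
# (bijectors in a Chain are applied right to left).
import tensorflow_probability as tfp

tfd = tfp.distributions
tfb = tfp.bijectors

base = tfd.Normal(loc=0., scale=1.)
# x -> exp(0.5 * x + 1.0): Scale acts first, then Shift, then Exp.
chain = tfb.Chain([tfb.Exp(), tfb.Shift(1.0), tfb.Scale(0.5)])
dist = tfd.TransformedDistribution(distribution=base, bijector=chain)

samples = dist.sample(5)            # draw samples from the transformed distribution
log_probs = dist.log_prob(samples)  # densities account for the change of variables
```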

Contributed Talk: Cycle-Consistent Adversarial Learning as Approximate Bayesian Inference