Machine Learning

Progress Review 2020

PhD candidature annual progress review for 2019-2020.

AutoGluon

AutoGluon is a library for asynchronous, distributed hyperparameter optimization (HPO) and neural architecture search (NAS) that implements numerous state-of-the-art methods. I was a core developer of the [Gaussian process-based multi-fidelity searcher](https://autogluon.mxnet.io/api/autogluon.searcher.html#gpmultifidelitysearcher) module.
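For reference, a rough usage sketch against the AutoGluon 0.0.x API documented at the link above: a `HyperbandScheduler` constructed with `searcher='bayesopt'` dispatches to the GP-based multi-fidelity searcher. The toy `train_fn`, search space, and scheduler arguments here are illustrative assumptions rather than verbatim from those docs.

```python
# Rough sketch against the AutoGluon 0.0.x API (autogluon.mxnet.io); the toy objective
# and exact argument names are illustrative -- check the linked docs for signatures.
import autogluon as ag

@ag.args(
    lr=ag.space.Real(1e-4, 1e-1, log=True),
    wd=ag.space.Real(1e-6, 1e-2, log=True),
    epochs=9,
)
def train_fn(args, reporter):
    # Stand-in for real training: report a (fake) accuracy after each epoch so the
    # multi-fidelity searcher can stop unpromising configurations early.
    for epoch in range(1, args.epochs + 1):
        accuracy = (epoch / args.epochs) * (1.0 - args.lr)  # hypothetical objective
        reporter(epoch=epoch, accuracy=accuracy)

# Asynchronous Hyperband scheduling with the GP-based multi-fidelity searcher.
scheduler = ag.scheduler.HyperbandScheduler(
    train_fn,
    resource={'num_cpus': 2, 'num_gpus': 0},
    searcher='bayesopt',
    num_trials=20,
    time_attr='epoch',
    reward_attr='accuracy',
    max_t=9,
    grace_period=1,
    reduction_factor=3,
)
scheduler.run()
scheduler.join_jobs()
```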

Model-based Asynchronous Hyperparameter Optimization

We introduce a model-based asynchronous multi-fidelity hyperparameter optimization (HPO) method, combining the strengths of asynchronous Hyperband and Gaussian process-based Bayesian optimization. Our method obtains substantial speed-ups in wall-clock …

Variational Graph Convolutional Networks

We propose a framework that lifts the capabilities of graph convolutional networks (GCNs) to scenarios where no input graph is given and increases their robustness to adversarial attacks. We formulate a joint probabilistic model that considers a …

Density Ratio Estimation for KL Divergence Minimization between Implicit Distributions

This post demonstrates how to approximate the KL divergence (in fact, any f-divergence) between implicit distributions, using density ratio estimation by probabilistic classification.
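As a minimal sketch of the idea: train a probabilistic classifier to distinguish samples from `p` (label 1) and `q` (label 0); with balanced samples, its logit estimates `log p(x) - log q(x)`, and averaging the logit over samples from `p` gives a Monte Carlo estimate of `KL(p || q)`. The Gaussian `p` and `q` below are stand-ins chosen so the estimate can be checked against the closed-form KL; the post itself targets genuinely implicit distributions.

```python
# Density ratio estimation by probabilistic classification: the classifier's logit
# approximates log p(x) - log q(x), so KL(p || q) ~ mean logit under samples from p.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# p = N(1, 1), q = N(0, 4); "implicit" here means we only ever touch their samples.
x_p = rng.normal(1.0, 1.0, size=(5000, 1))
x_q = rng.normal(0.0, 2.0, size=(5000, 1))

# Quadratic features suffice to represent the exact log-ratio of two Gaussians.
X = np.vstack([x_p, x_q])
feats = np.hstack([X, X**2])
y = np.concatenate([np.ones(len(x_p)), np.zeros(len(x_q))])
clf = LogisticRegression(C=1e3).fit(feats, y)

# Monte Carlo estimate of KL(p || q) = E_p[log p(x) - log q(x)].
logit_p = clf.decision_function(np.hstack([x_p, x_p**2]))
print("estimated KL(p||q):  ", logit_p.mean())

# Closed-form KL between the two Gaussians, for comparison.
mu_p, s_p, mu_q, s_q = 1.0, 1.0, 0.0, 2.0
kl_true = np.log(s_q / s_p) + (s_p**2 + (mu_p - mu_q)**2) / (2 * s_q**2) - 0.5
print("closed-form KL(p||q):", kl_true)
```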

Building Probability Distributions with the TensorFlow Probability Bijector API

We illustrate how to build complicated probability distributions in a modular fashion using the Bijector API from TensorFlow Probability.
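For instance, a `TransformedDistribution` pushes a base distribution through a bijector, and `Chain` composes bijectors into more elaborate transformations; the particular distributions below are illustrative rather than the exact ones built in the post.

```python
# Composing bijectors from TensorFlow Probability to build new distributions.
import tensorflow_probability as tfp

tfd, tfb = tfp.distributions, tfp.bijectors

# Push a standard Normal through exp to obtain a LogNormal.
log_normal = tfd.TransformedDistribution(
    distribution=tfd.Normal(loc=0.0, scale=1.0),
    bijector=tfb.Exp(),
)

# Chain bijectors (applied right to left): scale, shift, then exponentiate.
shifted_log_normal = tfd.TransformedDistribution(
    distribution=tfd.Normal(loc=0.0, scale=1.0),
    bijector=tfb.Chain([tfb.Exp(), tfb.Shift(shift=0.5), tfb.Scale(scale=2.0)]),
)

x = shifted_log_normal.sample(3)
print(shifted_log_normal.log_prob(x))  # log density via the change-of-variables formula
```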

Contributed Talk: Cycle-Consistent Adversarial Learning as Approximate Bayesian Inference

Cycle-Consistent Adversarial Learning as Approximate Bayesian Inference

We formalize the problem of learning interdomain correspondences in the absence of paired data as Bayesian inference in a latent variable model (LVM), where one seeks the underlying hidden representations of entities from one domain as entities from …

Variational Bayes for Implicit Probabilistic Models

Expanding the scope and applicability of variational inference to encompass implicit probabilistic models.

Aboleth

Aboleth is a minimalistic TensorFlow framework for scalable Bayesian deep learning and Gaussian process approximation.
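As a rough illustration of the random-feature style of GP approximation that Aboleth builds on, here is a generic NumPy sketch of random Fourier features combined with Bayesian linear regression; this is not Aboleth's layer-based API.

```python
# Generic sketch: random Fourier features + Bayesian linear regression ~ approximate GP.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data.
X = rng.uniform(-3.0, 3.0, size=(100, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)

# Random Fourier features approximating an RBF kernel with lengthscale `ell`.
n_features, ell = 200, 1.0
W = rng.standard_normal((X.shape[1], n_features)) / ell
b = rng.uniform(0.0, 2 * np.pi, size=n_features)
Phi = np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Bayesian linear regression on the features == approximate GP regression.
noise, prior = 0.1, 1.0
A = Phi.T @ Phi / noise**2 + np.eye(n_features) / prior**2
mean_w = np.linalg.solve(A, Phi.T @ y) / noise**2

X_test = np.linspace(-3.0, 3.0, 50)[:, None]
Phi_test = np.sqrt(2.0 / n_features) * np.cos(X_test @ W + b)
f_mean = Phi_test @ mean_w                                            # posterior mean of f
f_var = np.einsum('ij,ji->i', Phi_test, np.linalg.solve(A, Phi_test.T))  # posterior variance of f
```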