Louis Tiao

PhD Candidate

University of Sydney

CSIRO Data61


Hi. My name is Louis. I am a budding machine learning researcher and PhD candidate at the University of Sydney, working with Edwin Bonilla and Fabio Ramos. My main research interests lie at the intersection of Bayesian deep learning, approximate inference, and probabilistic models with intractable likelihoods.

Until 2017, I was a software engineer at NICTA (now incorporated under CSIRO as Data61) in the inference systems group, working on scalable Bayesian machine learning. I now work at Data61 on a part-time basis when I am not teaching.

Prior to that, I studied computer science at the University of New South Wales, with a major emphasis on algorithm design and analysis, theoretical computer science, programming language theory, artificial intelligence, and machine learning, and a minor emphasis on mathematics and statistics. I undertook my final-year thesis under Aleksandar Ignjatovic and graduated with first-class honours in 2015.


Research Experience


Applied Scientist (Intern)

Amazon

Jun 2019 – Dec 2019 Berlin, Germany

In the summer and fall of 2019, I interned as an Applied Scientist at Amazon Berlin, conducting research in AutoML and Hyperparameter Optimization for AWS SageMaker's Automatic Model Tuning service.

I had the good fortune of working with Matthias Seeger and Cédric Archambeau, and together, we tackled the challenges of making Multi-fidelity Bayesian Optimization asynchronously parallel.

The research from my internship culminated in a paper and the release of our code as part of the open-source AutoGluon library.


PhD Candidate

University of Sydney

Jul 2017 – Present Sydney, Australia

Research Engineer

CSIRO Data61

Jul 2016 – Apr 2019 Sydney, Australia

Software Engineer

National ICT Australia (NICTA)

May 2015 – Jun 2016 Sydney, Australia

Research Intern

Commonwealth Scientific and Industrial Research Organisation (CSIRO)

Nov 2013 – Feb 2014 Sydney, Australia

Recent Publications

(2020). BORE: Bayesian Optimization by Density Ratio Estimation. In NeurIPS2020 Meta-Learn. Accepted as Contributed Talk (Awarded to Best 3 Papers).

Preprint Code Poster Slides Video Supplementary material

(2020). Variational Inference for Graph Convolutional Networks in the Absence of Graph Data and Adversarial Settings. In Neural Information Processing Systems (NeurIPS) 2020. Accepted as Spotlight Presentation (Awarded to Top 3% of Papers).

PDF Code Video

(2020). Model-based Asynchronous Hyperparameter and Neural Architecture Search. Preprint.

PDF Code Video

(2019). Variational Graph Convolutional Networks. In NeurIPS2019 Graph Representation Learning. Accepted as Outstanding Contribution Talk (Awarded to Best 3 Papers).

PDF Code Video Workshop Homepage

(2018). Cycle-Consistent Adversarial Learning as Approximate Bayesian Inference. In ICML2018 Theoretical Foundations and Applications of Deep Generative Models. Accepted as Contributed Talk.

Preprint PDF Code Project Poster Slides Workshop Homepage

Recent Posts

A Cheatsheet for Sparse Variational Gaussian Processes

A summary of notation, identities and derivations for the sparse variational Gaussian process (SVGP) framework.
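At the heart of that framework is the SVGP evidence lower bound; as a quick reminder (standard notation with inducing variables $\mathbf{u}$, following the usual Titsias/Hensman formulation — not excerpted from the post itself):

```latex
\mathcal{L}_{\mathrm{SVGP}}
  = \sum_{i=1}^{n} \mathbb{E}_{q(f_i)}\!\left[\log p(y_i \mid f_i)\right]
  - \mathrm{KL}\!\left[\,q(\mathbf{u}) \,\|\, p(\mathbf{u})\,\right]
```

where $q(\mathbf{u}) = \mathcal{N}(\mathbf{m}, \mathbf{S})$ is the variational distribution over the function values at the inducing inputs.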

Density Ratio Estimation for KL Divergence Minimization between Implicit Distributions

This post demonstrates how to approximate the KL divergence (in fact, any f-divergence) between implicit distributions, using density ratio estimation by probabilistic classification.
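The core trick can be sketched in a few lines. Here is a minimal, self-contained example using scikit-learn's logistic regression, with toy Gaussians standing in for the implicit distributions (my own illustration, not the code from the post):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 50_000

# Samples from two distributions p = N(0, 1) and q = N(1, 1);
# the true KL(p || q) is 0.5.
xp = rng.normal(0.0, 1.0, size=(n, 1))
xq = rng.normal(1.0, 1.0, size=(n, 1))

# Train a probabilistic classifier to tell p-samples (label 1)
# from q-samples (label 0).
X = np.vstack([xp, xq])
y = np.concatenate([np.ones(n), np.zeros(n)])
clf = LogisticRegression().fit(X, y)

# With balanced classes, the classifier's logit approximates
# log p(x)/q(x), so averaging it over samples from p estimates
# KL(p || q).
kl_estimate = clf.decision_function(xp).mean()
print(kl_estimate)  # ≈ 0.5, the true KL
```

The same recipe extends to any f-divergence by replacing the average of the logit with the appropriate function of the estimated density ratio.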

Building Probability Distributions with the TensorFlow Probability Bijector API

We illustrate how to build complicated probability distributions in a modular fashion using the Bijector API from TensorFlow Probability.

A Tutorial on Variational Autoencoders with a Concise Keras Implementation

An in-depth practical guide to variational autoencoders from a probabilistic perspective.

NumPy mgrid vs. meshgrid

The meshgrid function is useful for creating coordinate arrays to vectorize function evaluations over a grid. Experienced NumPy users will have noticed some discrepancy between meshgrid and mgrid, a function used just as often, for exactly the same purpose.
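In a nutshell, the two build the same coordinate arrays but return them in opposite index order (the toy grid below is my own example):

```python
import numpy as np

# meshgrid takes 1-D coordinate vectors and returns coordinate
# matrices, in Cartesian ('xy') order by default.
x, y = np.meshgrid(np.arange(3), np.arange(2))

# mgrid builds equivalent arrays from slice notation, but yields
# them in matrix ('ij') order, so the outputs come back "swapped"
# relative to meshgrid's default.
yy, xx = np.mgrid[0:2, 0:3]

assert (x == xx).all() and (y == yy).all()
```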

Recent & Upcoming Talks

Model-based Asynchronous Hyperparameter and Neural Architecture Search (by Matthias Seeger)

ECCV2020 Tutorial – From HPO to NAS: Automated Deep Learning

Outstanding Contribution Talk: Variational Graph Convolutional Networks (by Edwin Bonilla)

Outstanding Contribution Talk at NeurIPS2019 Workshop on Graph Representation Learning



AutoGluon is a library for asynchronous, distributed hyperparameter optimization (HPO) and neural architecture search (NAS) that implements numerous state-of-the-art methods. I was a core developer of the Gaussian process-based multi-fidelity searcher module.

Variational Bayes for Implicit Probabilistic Models

Expanding the scope and applicability of variational inference to encompass implicit probabilistic models.


Aboleth is a minimalistic TensorFlow framework for scalable Bayesian deep learning and Gaussian process approximation.


Determinant is a software service that makes predictions from sparse data, and learns what data it needs to optimise its performance.


Revrand is a full-featured Python library for Bayesian generalized linear models, with random basis kernels for large-scale Gaussian process approximations.


COMP9418: Advanced Topics in Statistical Machine Learning (UNSW Sydney)

The course has a primary focus on probabilistic machine learning methods, covering exact and approximate inference in directed and undirected probabilistic graphical models, continuous latent variable models, structured prediction models, and non-parametric models based on Gaussian processes.

Lab exercise on Gaussian Process Regression, running in JupyterLab.

This course places a major emphasis on maintaining a good balance between theory and practice. As the teaching assistant (TA) for the course, my primary responsibility was to create lab exercises that help students gain hands-on experience with these methods, applying them to real-world data using current tools and libraries. The labs were Python-based and relied heavily on the Python scientific computing and data analysis stack (NumPy, SciPy, Matplotlib, Seaborn, Pandas, IPython/Jupyter notebooks) and the popular machine learning libraries scikit-learn and TensorFlow.

Students were given the chance to experiment with a broad range of methods on various problems, such as Markov chain Monte Carlo (MCMC) for Bayesian logistic regression, probabilistic PCA (PPCA), factor analysis (FA) and independent component analysis (ICA) for dimensionality reduction, hidden Markov models (HMMs) for speech recognition, conditional random fields (CRFs) for named-entity recognition, and Gaussian processes (GPs) for regression and classification.
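As a small taste of the GP regression lab, a minimal sketch using scikit-learn (my own toy example, not the actual lab code):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(42)

# Noisy observations of a smooth latent function.
X = rng.uniform(0, 5, size=(20, 1))
y = np.sin(X).ravel() + 0.1 * rng.normal(size=20)

# alpha is the assumed observation-noise variance.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=0.1**2)
gp.fit(X, y)

# Posterior mean and standard deviation on a test grid; the std
# quantifies the model's uncertainty away from the training data.
X_test = np.linspace(0, 5, 100).reshape(-1, 1)
mean, std = gp.predict(X_test, return_std=True)
```

In the labs, students built on exactly this kind of scaffold, visualizing the posterior mean and credible intervals and exploring the effect of the kernel hyperparameters.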