Louis Tiao

Research Scientist

Meta - New York, US

About

Hi, I’m Louis. I’m a research scientist at Meta on the Adaptive Experimentation team within Central Applied Science (CAS), based in New York City. I work on Bayesian optimization, Gaussian processes, and sample-efficient methods for automated machine learning.

Education

PhD Computer Science

University of Sydney

BSc (Honours Class 1) Computer Science

University of New South Wales (UNSW) Sydney

Interests

Gaussian Processes · Bayesian Optimization · Automated Machine Learning (AutoML)

My Research

My research is in probabilistic machine learning, with particular focus on approximate Bayesian inference and Gaussian processes, and their applications to Bayesian optimization. More broadly, my interests extend to automated machine learning (AutoML), encompassing hyperparameter optimization and adaptive resource allocation techniques such as early stopping and scaling laws. Past work includes graph representation learning and deep generative models. Some of this work has appeared as Orals and Spotlights at NeurIPS and ICML.

Always happy to hear from people working on related problems — get in touch.

News

🎓 PhD thesis completed

Submitted my PhD thesis, “Probabilistic Machine Learning in the Age of Deep Learning”, at the University of Sydney.

📄 One paper accepted to ICML 2023

Our paper “Spherical Inducing Features for Orthogonally-Decoupled Gaussian Processes” was accepted to ICML 2023 as an Oral Presentation!

Featured Publications

Empirical Gaussian Processes

We study Empirical GPs, a principled framework for constructing flexible, data-driven Gaussian process priors. By estimating mean and covariance directly from a corpus of …

Jihao Andreas Lin

Ax: A Platform for Adaptive Experimentation

We present Ax, an open-source platform for adaptive experimentation built on BoTorch. Off the shelf, Ax achieves state-of-the-art performance across a wide range of synthetic and …

Miles Olson

Spherical Inducing Features for Orthogonally-Decoupled Gaussian Processes

We introduce spherical inter-domain inducing features that yield more flexible, data-dependent basis functions for orthogonally-decoupled GP approximations, narrowing the …

Louis Tiao

BORE: Bayesian Optimization by Density-Ratio Estimation

We reformulate the computation of the acquisition function in Bayesian optimization (BO) as a probabilistic classification problem, providing advantages in scalability, …

Louis Tiao
Recent Posts

Efficient Cholesky decomposition of low-rank updates

We give a short and practical guide to efficiently computing the Cholesky decomposition of matrices perturbed by low-rank updates.
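The idea behind the post, in brief: when a matrix A = LLᵀ is perturbed by a rank-one term vvᵀ, the new Cholesky factor can be obtained in O(n²) rather than refactorizing from scratch in O(n³). A minimal NumPy sketch of the standard rank-one update (function name illustrative, not the post's code):

```python
import numpy as np

def cholesky_rank1_update(L, v):
    """Given lower-triangular L with A = L @ L.T, return the Cholesky
    factor of A + v @ v.T in O(n^2), avoiding a full refactorization."""
    L, v = L.copy(), v.copy()
    n = v.shape[0]
    for k in range(n):
        r = np.hypot(L[k, k], v[k])            # updated diagonal entry
        c, s = r / L[k, k], v[k] / L[k, k]     # Givens-style rotation
        L[k, k] = r
        if k + 1 < n:
            L[k + 1:, k] = (L[k + 1:, k] + s * v[k + 1:]) / c
            v[k + 1:] = c * v[k + 1:] - s * L[k + 1:, k]
    return L

# Usage: update the factor of a random SPD matrix.
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = B @ B.T + 5 * np.eye(5)                    # symmetric positive definite
L = np.linalg.cholesky(A)
v = rng.standard_normal(5)
L_new = cholesky_rank1_update(L, v)
```

One can check that `L_new @ L_new.T` recovers `A + np.outer(v, v)` to numerical precision.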

Louis Tiao

A Primer on Pólya-gamma Random Variables - Part II: Bayesian Logistic Regression

We use one weird trick — Pólya-Gamma augmentation — to make exact inference in Bayesian logistic regression tractable.

Louis Tiao

An Illustrated Guide to the Knowledge Gradient Acquisition Function

We give a short illustrated reference guide to the Knowledge Gradient acquisition function with an implementation from scratch in TensorFlow Probability.

Louis Tiao

A Handbook for Sparse Variational Gaussian Processes

We summarize the notation, identities, and derivations underlying the sparse variational Gaussian process (SVGP) framework.

Louis Tiao

Density Ratio Estimation for KL Divergence Minimization between Implicit Distributions

We show how to approximate the KL divergence (in fact, any f-divergence) between implicit distributions using density ratio estimation by probabilistic classification.
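The core trick: for balanced samples, the logit of the Bayes-optimal classifier discriminating draws from p and q equals log p(x)/q(x). A minimal NumPy sketch (not the post's code) using two 1-D Gaussians, p = N(0, 1) and q = N(1, 1), for which the true log-ratio is 0.5 − x, so a fitted logistic regression should recover slope ≈ −1 and intercept ≈ 0.5:

```python
import numpy as np

rng = np.random.default_rng(42)

# Equal-sized samples from p = N(0, 1) (label 1) and q = N(1, 1) (label 0).
n = 5000
x = np.concatenate([rng.normal(0.0, 1.0, n), rng.normal(1.0, 1.0, n)])
y = np.concatenate([np.ones(n), np.zeros(n)])

# Fit 1-D logistic regression by gradient descent; with balanced classes,
# the learned logit w*x + b is an estimate of log p(x)/q(x).
w, b, lr = 0.0, 0.0, 0.1
for _ in range(2000):
    prob = 1.0 / (1.0 + np.exp(-(w * x + b)))
    w -= lr * np.mean((prob - y) * x)
    b -= lr * np.mean(prob - y)

density_ratio = lambda t: np.exp(w * t + b)    # estimate of p(t)/q(t)
print(w, b)
```

With the analytic log-ratio 0.5 − x in hand, the fitted (w, b) can be checked directly against (−1, 0.5).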

Louis Tiao

Building Probability Distributions with the TensorFlow Probability Bijector API

We illustrate how to build complicated probability distributions in a modular fashion using the Bijector API from TensorFlow Probability.

Louis Tiao
Selected Talks
Featured Projects

Ax

A platform for adaptive experimentation


GPflux

A TensorFlow/Keras framework for Deep Gaussian Processes

BORE

A framework for Bayesian Optimization by probabilistic classification

Contact

Connect

Drop me a line

Send a message

Find me on