Louis Tiao
Bayesian Optimization
Probabilistic Machine Learning in the Age of Deep Learning: New Perspectives for Gaussian Processes, Bayesian Optimization and Beyond (PhD Thesis)
This thesis explores the intersection of deep learning and probabilistic machine learning to enhance the capabilities of artificial intelligence. It addresses the limitations of Gaussian processes (GPs) in practical applications, particularly in comparison to neural networks (NNs), and proposes advancements such as improved approximations and a novel formulation of Bayesian optimization (BO) that seamlessly integrates deep learning methods. The contributions aim to enrich the interplay between deep learning and probabilistic ML, advancing the foundations of AI and fostering the development of more capable and reliable automated decision-making systems.
Louis Tiao
PDF
Preprint
Full Acknowledgements
Batch Bayesian Optimisation via Density-ratio Estimation with Guarantees
We extend Bayesian optimisation via density-ratio estimation to the batch setting and establish theoretical guarantees on its performance …
Rafael Oliveira, Louis Tiao, Fabio Ramos
PDF
Code
BORE: Bayesian Optimization by Density-Ratio Estimation
We reformulate the computation of the acquisition function in Bayesian optimization (BO) as a probabilistic classification problem, providing advantages in scalability, flexibility, and representational capacity, while casting aside the limitations of tractability constraints on the model.
Louis Tiao, Aaron Klein, Matthias Seeger, Edwin V. Bonilla, Cédric Archambeau, Fabio Ramos
PDF
Cite
Code
Poster
Slides
Video
Conference Proceeding
Supplementary material
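As a rough companion to the abstract above, here is a minimal sketch of the core idea under a toy setup: observations below the γ-quantile of the objective values are labelled "good", a probabilistic classifier is fit to those labels, and its predicted class probability serves as the acquisition function. The classifier choice (a random forest via scikit-learn), the toy objective, and all names are illustrative assumptions, not the paper's reference implementation.

```python
# Minimal sketch of Bayesian optimization by density-ratio estimation (BORE-style).
# NOT the reference implementation; library and modelling choices are illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def objective(x):
    # Toy 1-D objective to be minimized (assumed for illustration).
    return np.sin(3 * x) + 0.1 * x ** 2

# Initial design.
X = rng.uniform(-3, 3, size=(5, 1))
y = objective(X).ravel()

gamma = 0.25  # proportion of observations labelled as "good"

for step in range(20):
    # Label the best gamma-fraction of observations as class 1, the rest as class 0.
    tau = np.quantile(y, gamma)
    z = (y <= tau).astype(int)

    # A probabilistic classifier stands in for the relative density ratio:
    # its predicted class-1 probability plays the role of the EI-like acquisition.
    clf = RandomForestClassifier(n_estimators=100, random_state=step)
    clf.fit(X, z)

    # Maximize the acquisition over a set of random candidates (crude, for brevity).
    candidates = rng.uniform(-3, 3, size=(2000, 1))
    acq = clf.predict_proba(candidates)[:, 1]
    x_next = candidates[np.argmax(acq)].reshape(1, 1)

    # Evaluate the objective and update the data set.
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print("best value found:", y.min())
```

Because any probabilistic classifier can be dropped in here, e.g. a neural network or gradient-boosted trees, the approach sidesteps the tractability constraints on the surrogate model that the abstract refers to.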
An Illustrated Guide to the Knowledge Gradient Acquisition Function
A short illustrated reference guide to the Knowledge Gradient acquisition function with an implementation from scratch in TensorFlow Probability.
Louis Tiao
Last updated on Oct 22, 2022
7 min read
Technical
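The guide itself builds the Knowledge Gradient from scratch in TensorFlow Probability; purely as a rough sketch of the underlying Monte Carlo estimate, the snippet below substitutes scikit-learn's Gaussian process for brevity. The toy objective, the grid-based maximisation, and the fantasy count are assumptions for illustration, not the post's code.

```python
# Monte Carlo sketch of the one-step Knowledge Gradient (KG) acquisition function:
# KG(x) = E_y[ max_x' mu_new(x') ] - max_x' mu_old(x'),
# where the expectation is over a fantasised observation y at the candidate x.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(42)

def knowledge_gradient(gp, x_cand, X, y, grid, n_fantasies=16):
    best_old = gp.predict(grid).max()
    mu, std = gp.predict(x_cand.reshape(1, -1), return_std=True)
    fantasies = rng.normal(mu, std, size=n_fantasies)

    gains = []
    for y_f in fantasies:
        # Condition a copy of the model on the fantasised observation (x_cand, y_f)
        # without re-optimising the kernel hyperparameters.
        gp_f = GaussianProcessRegressor(kernel=gp.kernel_, optimizer=None, normalize_y=True)
        gp_f.fit(np.vstack([X, x_cand.reshape(1, -1)]), np.append(y, y_f))
        gains.append(gp_f.predict(grid).max() - best_old)
    return float(np.mean(gains))

# Toy 1-D maximisation problem.
f = lambda x: -np.sin(3 * x) - x ** 2 + 0.7 * x
X = rng.uniform(-1, 2, size=(6, 1))
y = f(X).ravel()
grid = np.linspace(-1, 2, 50).reshape(-1, 1)

gp = GaussianProcessRegressor(RBF() + WhiteKernel(1e-4), normalize_y=True).fit(X, y)
kg = np.array([knowledge_gradient(gp, x, X, y, grid) for x in grid])
print("next point suggested by KG:", grid[kg.argmax()])
```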
Model-based Asynchronous Hyperparameter and Neural Architecture Search
We introduce a model-based method for asynchronous multi-fidelity hyperparameter and neural architecture search that combines the strengths of asynchronous Hyperband and Gaussian process-based Bayesian optimization, achieving substantial speed-ups over current state-of-the-art methods on challenging benchmarks for tabular data, image classification, and language modeling.
Aaron Klein, Louis Tiao, Thibaut Lienart, Cédric Archambeau, Matthias Seeger
PDF
Cite
Code
Video
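For a concrete feel for the scheduling layer the abstract builds on, the sketch below implements only the asynchronous successive-halving rule behind asynchronous Hyperband: a freed worker either promotes a configuration ranked in the top 1/η of its rung or starts a fresh one at the lowest rung. In the paper the fresh configurations are proposed by a Gaussian-process surrogate; here they are simply assigned new ids and scored at random, an assumption made to keep the sketch self-contained, not the authors' method.

```python
# Sketch of the asynchronous successive-halving decision rule: when a worker becomes
# free, promote a paused configuration ranked in the top 1/eta of its rung,
# otherwise start a new configuration at the lowest rung.
import random
from collections import defaultdict

ETA = 3
RUNGS = [1, 3, 9, 27]  # resource levels, e.g. training epochs

results = defaultdict(dict)   # results[rung][config_id] = validation loss (lower is better)
paused = defaultdict(set)     # configs waiting at each rung
next_id = 0

def promotable(rung):
    """Return a paused config in the top 1/eta of its rung, if any."""
    scores = results[rung]
    if not scores:
        return None
    cutoff = max(1, len(scores) // ETA)
    top = sorted(scores, key=scores.get)[:cutoff]
    candidates = [c for c in top if c in paused[rung]]
    return candidates[0] if candidates else None

def suggest():
    """Called whenever a worker is free: returns (config_id, rung) to run next."""
    global next_id
    # Try to promote from the highest non-terminal rung downwards.
    for rung_idx in reversed(range(len(RUNGS) - 1)):
        config = promotable(rung_idx)
        if config is not None:
            paused[rung_idx].discard(config)
            return config, rung_idx + 1
    # Otherwise start a fresh configuration at the lowest rung
    # (in the paper this is where the model-based proposal would come in).
    next_id += 1
    return next_id, 0

def report(config, rung, loss):
    """Worker reports a result; the config pauses at this rung until promoted."""
    results[rung][config] = loss
    paused[rung].add(config)

# Toy sequential simulation with random "training" outcomes.
for _ in range(30):
    config, rung = suggest()
    report(config, rung, loss=random.random() / (rung + 1))
```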