Career Summary

I’m a research scientist at Meta on the Adaptive Experimentation team within Central Applied Science (CAS), based in New York City. My work focuses on Bayesian optimization, Gaussian processes, and sample-efficient methods for automated machine learning and deep learning applications. I completed my PhD at the University of Sydney, with research recognized at NeurIPS and ICML through Oral and Spotlight presentations.

I began my career as a research software engineer at National ICT Australia (NICTA) and later CSIRO’s Data61. During my doctoral studies I gained extensive industrial research experience through appointments at Amazon Development Centers in Berlin and Cambridge, UK, and at Secondmind Labs in Cambridge. My publication record reflects this breadth, with work spanning both academia and industry.

Experience

Research Scientist

Meta · New York, NY

At Meta’s Central Applied Science (CAS), on the Adaptive Experimentation (AE) team, I develop sample-efficient Bayesian optimization and AutoML methods (early stopping, data truncation, scaling-law modeling) within the open-source Ax and BoTorch frameworks, and apply them to hyperparameter tuning and capacity management for large-scale Ads ranking models. Co-first author of Ax: A Platform for Adaptive Experimentation (AutoML 2025).

Applied Scientist Intern

Amazon Web Services · Cambridge, UK

At AWS, I led an exploratory research project on hyperparameter optimization for large language models, focused on the scaling behavior of LLMs and the feasibility of extrapolating optimal hyperparameters from smaller models to larger ones. Reunited with Aaron Klein, Matthias Seeger, and Cédric Archambeau from my earlier AWS Berlin internship.

Doctoral Student Researcher

Secondmind · Cambridge, UK

At Secondmind (formerly Prowler.io), an AI research lab focused on Bayesian optimization and Gaussian processes, I contributed open-source software for efficient GP sampling and led research on integrating neural network features into GP approximations — work that led to Spherical Inducing Features for Orthogonally-Decoupled Gaussian Processes (ICML 2023, Oral). Worked closely with Vincent Dutordoir and Victor Picheny.

Applied Scientist Intern

Amazon Web Services · Berlin, Germany

At AWS, I contributed to Automatic Model Tuning in SageMaker, focusing on Bayesian optimization methods for AutoML. I led research on integrating multi-fidelity BO with asynchronous parallelism, resulting in a paper and open-source code in AutoGluon — work that later formed the basis of SyneTune. Worked with Matthias Seeger, Cédric Archambeau, and Aaron Klein.

Teaching Assistant

University of New South Wales (UNSW) · Sydney, Australia

Teaching assistant for COMP9418 — Advanced Topics in Statistical Machine Learning, a postgraduate course covering probabilistic graphical models, approximate inference, and Bayesian methods.

Software Engineer

CSIRO’s Data61 · Sydney, Australia

At CSIRO’s Data61 — Australia’s national AI research division — I worked on the Inference Systems Engineering team, building microservices and open-source libraries for large-scale Bayesian deep learning. A stint with the Graph Analytics Engineering team led to research on graph representation learning, resulting in a NeurIPS spotlight paper.

Software Engineer

National ICT Australia (NICTA) · Sydney, Australia

At NICTA, I worked on the Big Data Knowledge Discovery initiative within an interdisciplinary ML research team, developing and releasing open-source libraries for applying Bayesian ML at scale. NICTA later merged into CSIRO’s Data61.

Research Intern

Commonwealth Scientific and Industrial Research Organisation (CSIRO) · Sydney, Australia

As a summer vacation scholar with CSIRO’s Language and Social Computing team, I built a text classification system for automated sentiment analysis using contemporary ML and NLP methods.

Education

PhD Computer Science

University of Sydney

Thesis: Probabilistic Machine Learning in the Age of Deep Learning: New Perspectives for Gaussian Processes, Bayesian Optimization and Beyond. Supervised by Fabio Ramos and Edwin Bonilla.

BSc (Honours Class 1) Computer Science

University of New South Wales (UNSW) Sydney

Collaborators

I owe a great deal to the people below — colleagues and mentors who shaped how I approach research. They include my current Meta teammates on the Adaptive Experimentation team, and the researchers who hosted me at Amazon and Secondmind during my PhD.

  • Matthias Seeger — Principal Applied Scientist, AWS Berlin
  • Eytan Bakshy — Research Director, Meta CAS (Adaptive Experimentation)
  • Cédric Archambeau — formerly Principal Applied Scientist, AWS Berlin · now Director of Artificial Intelligence, Helsing
  • Max Balandat — Research Scientist Manager, Meta CAS (Adaptive Experimentation)
  • David Eriksson — Research Scientist Manager, Meta CAS (Adaptive Experimentation)
  • Victor Picheny — Director of Research, Secondmind
  • Nicolas Durrande — formerly Director of Research, Secondmind · now Research Lead, Shift Lab
  • Aaron Klein — formerly Applied Scientist, AWS Berlin · now Research Group Leader, ELLIS Institute Tübingen
  • Sam Daulton — Research Scientist, Meta CAS (Adaptive Experimentation)
  • Sebastian Ament — Research Scientist, Meta CAS (Adaptive Experimentation)
  • Vincent Dutordoir — formerly Research Scientist, Secondmind · now Google DeepMind

Skills

Languages: Python, C, Java, Bash / shell, R, MATLAB, Mathematica, Haskell, Prolog

Machine learning: PyTorch, TensorFlow / TF Probability, scikit-learn, GPyTorch, GPflow, JAX, HuggingFace transformers / datasets, NetworkX

Scientific computing & visualization: NumPy / SciPy / pandas, Matplotlib, Jupyter / IPython

Tools & infrastructure: Git, Docker, Kubernetes, Hydra, LaTeX, AWS, Slurm / HPC, Weights & Biases

Academic service

Conference reviewing: NeurIPS (2021–2026), ICML (2021–2026; Best Reviewer Award 2021, top 10%), ICLR (2022–2026), AISTATS (2025–2026), UAI (2026), and the inaugural AutoML Conference (2022). Workshop reviewing: ICML Workshop on Graph Representation Learning and Beyond (2020), NeurIPS 4th Workshop on Meta-Learning (2020), ICLR 2nd Workshop on Neural Architecture Search (2021).

Journal reviewing: TMLR (2022–); IEEE TPAMI (2019–).

Languages
  • English: Native
  • Chinese (Mandarin): Fluent