Project: Never-ending Automated Machine Learning: Surrogate model transfer


There is an infinite number of ways to design a machine learning system, and many careful decisions need to be made based on prior experience. The field of automated machine learning (AutoML) aims to make these decisions in a data-driven, objective, and automated way. This typically entails a search over a very large space of possible models.

Ironically, while AutoML systems do find good models for a given task, they don't learn anything in the process. If you give it the same task twice, it simply restarts from scratch. Ideally, an AutoML system should get more experienced with every task you give it, so that it finds better solutions, faster.

One way to achieve this is through surrogate model transfer. The search method considered here is Bayesian optimization: after some initial experiments, it builds a meta-model (called a surrogate model) that predicts which models are most interesting to try next. Bayesian optimization is used in AutoML systems such as auto-sklearn, and was also used to fine-tune the AlphaGo model.
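To make the surrogate-model idea concrete, here is a minimal sketch of the Bayesian optimization loop. The objective function, the inverse-distance surrogate, and the exploration bonus are all illustrative stand-ins: real systems use Gaussian processes or random forests as the surrogate, and principled acquisition functions such as expected improvement.

```python
import random

def evaluate(x):
    """Toy stand-in for training a model with hyperparameter x and scoring it."""
    return -(x - 6.3) ** 2  # unknown to the optimizer; optimum at x = 6.3

def surrogate_predict(x, observations):
    """Predict the score at x by inverse-distance weighting of past observations."""
    num = den = 0.0
    for xi, yi in observations:
        w = 1.0 / (abs(x - xi) + 1e-6)
        num += w * yi
        den += w
    return num / den

def acquisition(x, observations):
    """Predicted score plus a bonus for being far from points already tried."""
    exploration = min(abs(x - xi) for xi, _ in observations)
    return surrogate_predict(x, observations) + 2.0 * exploration

random.seed(0)
candidates = [i / 10 for i in range(101)]           # search space 0.0 .. 10.0
observations = [(x, evaluate(x)) for x in random.sample(candidates, 3)]

for _ in range(10):
    # The surrogate decides which experiment is most interesting to run next.
    x_next = max(candidates, key=lambda x: acquisition(x, observations))
    observations.append((x_next, evaluate(x_next)))  # run the expensive experiment

best_x, best_y = max(observations, key=lambda o: o[1])
print(f"best hyperparameter: {best_x}, score: {best_y}")
```

The key point for this project is that `surrogate_predict` encodes everything learned about a task; transferring that knowledge, rather than the evaluations themselves, is what surrogate model transfer is about.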


The idea for this thesis is to investigate how we can transfer such surrogate models (trained on specific tasks) to speed up AutoML on new tasks:

  • Given a distribution of hundreds of tasks (or more), build a surrogate model for each task
  • Given a new task, choose or combine surrogate models so that they will likely give good predictions
  • Leverage the predictions of these surrogate models to speed up the search for good models on the new task
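The three steps above could be sketched as follows. Weighting prior surrogates by how well they rank the new task's few observations is only one illustrative combination strategy, and the toy surrogates and observations below are entirely hypothetical:

```python
def rank_agreement(surrogate, observations):
    """Fraction of observation pairs whose ordering the surrogate gets right."""
    pairs = correct = 0
    for i in range(len(observations)):
        for j in range(i + 1, len(observations)):
            (xi, yi), (xj, yj) = observations[i], observations[j]
            if yi == yj:
                continue
            pairs += 1
            if (surrogate(xi) > surrogate(xj)) == (yi > yj):
                correct += 1
    return correct / pairs if pairs else 0.0

def combine(surrogates, observations):
    """Weight each prior-task surrogate by its rank agreement on the new task."""
    weights = [rank_agreement(s, observations) for s in surrogates]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]

    def ensemble(x):
        return sum(w * s(x) for w, s in zip(weights, surrogates))
    return ensemble

# Toy prior-task surrogates: each believes a different hyperparameter is best.
priors = [lambda x, c=c: -(x - c) ** 2 for c in (2.0, 5.0, 8.0)]

# A few cheap evaluations on the new task, whose true optimum is near x = 5.
new_obs = [(1.0, -16.0), (4.0, -1.0), (6.0, -1.0), (9.0, -16.0)]

model = combine(priors, new_obs)  # ensemble surrogate to guide further search
```

Using rank agreement rather than raw prediction error makes the weighting robust to tasks whose scores live on different scales, which matters when prior tasks are heterogeneous; note also that each weight is computed independently per prior task, so the scheme scales linearly in the number of prior tasks.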

One requirement is that your method should scale well (to hundreds of prior tasks).

One particular angle would be to cast this as a continual learning problem, i.e. to adapt surrogate models to each new task, hopefully without forgetting how to solve earlier tasks. There are, however, many other ways to approach this problem.

Joaquin Vanschoren