Project: NXP: Scalable Neural Architecture Search
Description
- Neural Architecture Search (NAS) is an attractive methodology for designing and optimizing accurate and efficient neural networks, but it is expensive for large-scale models and/or high-bandwidth datasets. Enabling NAS for a wide variety of domains requires exploring, improving, and inventing, e.g.:
- Zero-cost proxies to improve guidance in search space exploration
- Automatic efficient search space design
- Surrogate models to speed up candidate model evaluation (how to collect training data for them, how to design them, and how to exploit them; covering multiple objectives)
- Low-fidelity estimators and prioritization techniques (training effort schedulers such as ASHA, weight sharing, learning curve extrapolation, …)
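One of the prioritization techniques listed above, successive halving (the core of ASHA), can be sketched as follows. This is a minimal illustration, not NXP's or ASHA's actual implementation: `evaluate` is a hypothetical stand-in for training a candidate architecture under a limited epoch budget, here simulated with noise that shrinks as the budget grows.

```python
import random

def evaluate(candidate, budget):
    # Hypothetical low-fidelity evaluator: in a real NAS system this would
    # train `candidate` for `budget` epochs and return a validation score.
    # Here we simulate a hidden "true quality" plus budget-dependent noise.
    rng = random.Random((candidate, budget))
    quality = (candidate * 37 % 100) / 100
    noise = rng.uniform(-0.5, 0.5) / budget  # more budget -> less noise
    return quality + noise

def successive_halving(candidates, min_budget=1, eta=2, rounds=3):
    """Each round, keep the top 1/eta candidates and multiply the budget by eta."""
    budget = min_budget
    survivors = list(candidates)
    for _ in range(rounds):
        scored = sorted(survivors, key=lambda c: evaluate(c, budget), reverse=True)
        survivors = scored[: max(1, len(scored) // eta)]
        budget *= eta
    return survivors

# Start with 16 candidate architectures; after 3 halving rounds, 2 remain,
# and most of the total training budget was spent on promising candidates.
print(successive_halving(range(16)))
```

ASHA extends this idea asynchronously, promoting candidates between budget rungs as soon as enough results are available rather than waiting for a full round to finish.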
Contact the TU/e supervisor (Joaquin Vanschoren). Please note that final acceptance depends on the availability of supervision bandwidth and daily supervision at TU/e as well as at NXP.
Details
- Supervisor: Joaquin Vanschoren
- External location: NXP