
AutoML20: Demystifying NAS in Theory and Practice



AICamp

Deep learning offers the promise of bypassing manual feature engineering by learning representations jointly with statistical models in an end-to-end fashion. However, neural network architectures themselves are typically designed by experts in a painstaking, ad hoc fashion. Neural architecture search (NAS) presents a promising path for alleviating this pain by automatically identifying architectures that outperform hand-designed ones. In this talk we will present our recent GAEA framework, which provides principled and computationally efficient algorithms for NAS that yield state-of-the-art performance on a wide range of leading NAS benchmarks in computer vision. We will also briefly discuss practical infrastructure hurdles associated with large-scale NAS workflows, and how we tackle them with Determined AI’s open-source training platform.
Speaker: Liam Li, Determined AI
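To give a flavor of the geometry-aware idea behind GAEA-style NAS, below is a minimal sketch (not the authors' code) of an exponentiated-gradient update on simplex-constrained architecture mixture weights. The operation names, learning rate, and gradient values are illustrative assumptions.

```python
import numpy as np

def exponentiated_gradient_step(theta, grad, lr=0.1):
    """One multiplicative (exponentiated-gradient) update of architecture
    weights theta, given a gradient of the validation loss w.r.t. theta,
    followed by renormalization back onto the probability simplex."""
    theta = theta * np.exp(-lr * grad)
    return theta / theta.sum()

# Example: three candidate operations on one edge of a search cell
# (hypothetical: conv3x3, conv5x5, skip-connect).
theta = np.ones(3) / 3                  # start uniform over the candidates
grad = np.array([-0.2, 0.05, 0.1])      # hypothetical validation-loss gradient
theta = exponentiated_gradient_step(theta, grad)
print(theta)                            # weight shifts toward the first operation
```

Because the update is multiplicative and renormalized, the architecture weights stay on the simplex at every step, which is the kind of geometry-aware behavior the talk's framework exploits.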
