NeuralArTS: Structuring Neural Architecture Search with Type Theory (Student Abstract)
Abstract
Neural Architecture Search (NAS) algorithms automate the task of finding optimal deep learning architectures given an initial search space of possible operations. Developing these search spaces is usually a manual affair, and pre-optimized search spaces are more efficient than searching from scratch. In this paper, we present a new framework called Neural Architecture Type System (NeuralArTS) that categorizes the infinite set of network operations in a structured type system. We further demonstrate how NeuralArTS can be applied to convolutional layers and propose several future directions.
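One way to read the core idea is that two operations can share a type when they induce the same input-to-output shape transformation, making them interchangeable candidates in a search space. The sketch below illustrates this notion for 2D convolutions; the `ConvType` class and `same_type` check are illustrative assumptions, not the paper's actual formalism, and only use the standard convolution output-size arithmetic.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ConvType:
    """Hypothetical 'type' for a 2D convolution, characterized by how it
    transforms one spatial dimension (channels omitted for brevity)."""
    kernel: int
    stride: int
    padding: int
    dilation: int = 1

    def output_size(self, n: int) -> int:
        # Standard conv arithmetic: floor((n + 2p - d(k-1) - 1) / s) + 1
        return (n + 2 * self.padding
                - self.dilation * (self.kernel - 1) - 1) // self.stride + 1

def same_type(a: ConvType, b: ConvType, sizes=range(8, 65)) -> bool:
    """Treat two ops as type-equivalent if they map every tested input
    size to the same output size (a finite proxy for shape equivalence)."""
    return all(a.output_size(n) == b.output_size(n) for n in sizes)

# A 3x3 conv with padding 1 and a 5x5 conv with padding 2 are both
# shape-preserving, so they share a type under this notion and could be
# swapped during search; a strided conv does not.
k3 = ConvType(kernel=3, stride=1, padding=1)
k5 = ConvType(kernel=5, stride=1, padding=2)
k3_s2 = ConvType(kernel=3, stride=2, padding=1)
```

Under this reading, a NAS algorithm could enlarge or prune its search space by substituting any operation for another of the same type.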
Cite
Text
Wu et al. "NeuralArTS: Structuring Neural Architecture Search with Type Theory (Student Abstract)." AAAI Conference on Artificial Intelligence, 2022. doi:10.1609/AAAI.V36I11.21679
Markdown
[Wu et al. "NeuralArTS: Structuring Neural Architecture Search with Type Theory (Student Abstract)." AAAI Conference on Artificial Intelligence, 2022.](https://mlanthology.org/aaai/2022/wu2022aaai-neuralarts/) doi:10.1609/AAAI.V36I11.21679
BibTeX
@inproceedings{wu2022aaai-neuralarts,
title = {{NeuralArTS: Structuring Neural Architecture Search with Type Theory (Student Abstract)}},
author = {Wu, Robert and Saxena, Nayan and Jain, Rohan},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2022},
pages = {13085--13086},
doi = {10.1609/AAAI.V36I11.21679},
url = {https://mlanthology.org/aaai/2022/wu2022aaai-neuralarts/}
}