Lightweight Monocular Depth with a Novel Neural Architecture Search Method
Abstract
This paper presents a novel neural architecture search method, called LiDNAS, for generating lightweight monocular depth estimation models. Unlike previous neural architecture search (NAS) approaches, where finding optimized networks is computationally highly demanding, the proposed Assisted Tabu Search enables efficient architecture exploration. Moreover, we construct the search space on a pre-defined backbone network to balance layer diversity and search space size. The LiDNAS method outperforms the state-of-the-art NAS approach proposed for disparity and depth estimation in terms of search efficiency and output model performance. The LiDNAS optimized models achieve results superior to the compact depth estimation state of the art on NYU-Depth-v2, KITTI, and ScanNet, while being 7%-500% more compact in size, i.e., in the number of model parameters.
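The abstract mentions tabu search as the basis of the architecture exploration. As a rough illustration only, the sketch below shows the generic control flow of a tabu-search loop over a discrete per-block architecture encoding; it is not the paper's Assisted Tabu Search, and the search space, the `neighbors` generator, and the `evaluate` scoring function are hypothetical placeholders (a real search would train and validate each candidate depth network and penalize parameter count).

```python
# Minimal, illustrative sketch of tabu-search-style architecture exploration.
# NOT the paper's Assisted Tabu Search: the choices, neighbor generation, and
# evaluate() below are hypothetical placeholders showing the control flow only.
import random

CHOICES = [0, 1, 2, 3]   # hypothetical per-block operation codes
NUM_BLOCKS = 8           # hypothetical number of searchable blocks


def evaluate(arch):
    """Placeholder objective; a real search would score accuracy vs. model size."""
    return -sum((c - 1.5) ** 2 for c in arch)  # dummy score so the demo runs


def neighbors(arch, k=10):
    """Generate k candidates that differ from `arch` in a single block."""
    result = []
    for _ in range(k):
        cand = list(arch)
        i = random.randrange(NUM_BLOCKS)
        cand[i] = random.choice([c for c in CHOICES if c != cand[i]])
        result.append(tuple(cand))
    return result


def tabu_search(iterations=50, tabu_size=20):
    current = tuple(random.choice(CHOICES) for _ in range(NUM_BLOCKS))
    best, best_score = current, evaluate(current)
    tabu = [current]
    for _ in range(iterations):
        # Pick the best non-tabu neighbor; aspiration allows a tabu move
        # if it improves on the best score found so far.
        candidates = sorted(((evaluate(c), c) for c in neighbors(current)),
                            reverse=True)
        for score, cand in candidates:
            if cand not in tabu or score > best_score:
                current = cand
                if score > best_score:
                    best, best_score = cand, score
                break
        tabu.append(current)
        if len(tabu) > tabu_size:
            tabu.pop(0)  # forget the oldest visited candidate
    return best, best_score


if __name__ == "__main__":
    arch, score = tabu_search()
    print("best architecture code:", arch, "score:", round(score, 3))
```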
Cite
Text
Huynh et al. "Lightweight Monocular Depth with a Novel Neural Architecture Search Method." Winter Conference on Applications of Computer Vision, 2022.
Markdown
[Huynh et al. "Lightweight Monocular Depth with a Novel Neural Architecture Search Method." Winter Conference on Applications of Computer Vision, 2022.](https://mlanthology.org/wacv/2022/huynh2022wacv-lightweight/)
BibTeX
@inproceedings{huynh2022wacv-lightweight,
title = {{Lightweight Monocular Depth with a Novel Neural Architecture Search Method}},
author = {Huynh, Lam and Nguyen, Phong and Matas, Jiří and Rahtu, Esa and Heikkilä, Janne},
booktitle = {Winter Conference on Applications of Computer Vision},
year = {2022},
pages = {3643-3653},
url = {https://mlanthology.org/wacv/2022/huynh2022wacv-lightweight/}
}