Information-Theoretic Methods in Deep Neural Networks: Recent Advances and Emerging Opportunities
Abstract
We present a review of recent advances and emerging opportunities in analyzing deep neural networks (DNNs) with information-theoretic methods. We first discuss popular information-theoretic quantities and their estimators. We then introduce recent developments in information-theoretic learning principles (e.g., loss functions, regularizers, and objectives) and their parameterization with DNNs. Finally, we briefly review current uses of information-theoretic concepts in a few modern machine learning problems and list several emerging opportunities.
Cite
Text
Yu et al. "Information-Theoretic Methods in Deep Neural Networks: Recent Advances and Emerging Opportunities." International Joint Conference on Artificial Intelligence, 2021. doi:10.24963/IJCAI.2021/633
Markdown
[Yu et al. "Information-Theoretic Methods in Deep Neural Networks: Recent Advances and Emerging Opportunities." International Joint Conference on Artificial Intelligence, 2021.](https://mlanthology.org/ijcai/2021/yu2021ijcai-information/) doi:10.24963/IJCAI.2021/633
BibTeX
@inproceedings{yu2021ijcai-information,
title = {{Information-Theoretic Methods in Deep Neural Networks: Recent Advances and Emerging Opportunities}},
author = {Yu, Shujian and Giraldo, Luis G. Sánchez and Príncipe, José C.},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2021},
pages = {4669--4678},
doi = {10.24963/IJCAI.2021/633},
url = {https://mlanthology.org/ijcai/2021/yu2021ijcai-information/}
}