Fractional Adaptive Linear Units
Abstract
This work introduces Fractional Adaptive Linear Units (FALUs), a flexible generalization of adaptive activation functions. Leveraging principles from fractional calculus, FALUs define a diverse family of activation functions (AFs) that encompasses many traditional and state-of-the-art activation functions. This family includes the Sigmoid, Gaussian, ReLU, GELU, and Swish functions, as well as a large variety of smooth interpolations between them. Our technique requires only a small number of additional trainable parameters and no specialized optimization or initialization procedures. For this reason, FALUs present a seamless and rich automated solution to the problem of activation function optimization. Through experiments on a variety of conventional tasks and network architectures, we demonstrate the effectiveness of FALUs when compared to traditional and state-of-the-art AFs. To facilitate practical use of this work, we plan to make our code publicly available.
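The exact FALU formulation via fractional calculus is given in the paper itself; the fragment below is only a minimal, hypothetical sketch of the general idea the abstract describes: an activation function with a small number of trainable shape parameters that the network tunes jointly with its weights (here, a Swish with learnable slope, which recovers SiLU at beta = 1 and approaches ReLU as beta grows). The class name AdaptiveSwish and the parameter init_beta are illustrative assumptions, not the authors' released code.

import torch
import torch.nn as nn

class AdaptiveSwish(nn.Module):
    # Illustrative adaptive activation: NOT the FALU definition from the paper,
    # only a sketch of an activation family with one trainable parameter per
    # layer, learned alongside the network weights by standard backpropagation.
    def __init__(self, init_beta: float = 1.0):
        super().__init__()
        self.beta = nn.Parameter(torch.tensor(init_beta))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # beta = 1 recovers SiLU/Swish; beta -> infinity approaches ReLU;
        # beta -> 0 tends toward a scaled linear response (0.5 * x).
        return x * torch.sigmoid(self.beta * x)

# Usage: drop-in replacement for a fixed activation in any layer stack.
layer = nn.Sequential(nn.Linear(128, 128), AdaptiveSwish())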
Cite
Text
Zamora et al. "Fractional Adaptive Linear Units." AAAI Conference on Artificial Intelligence, 2022. doi:10.1609/AAAI.V36I8.20882
Markdown
[Zamora et al. "Fractional Adaptive Linear Units." AAAI Conference on Artificial Intelligence, 2022.](https://mlanthology.org/aaai/2022/zamora2022aaai-fractional/) doi:10.1609/AAAI.V36I8.20882
BibTeX
@inproceedings{zamora2022aaai-fractional,
title = {{Fractional Adaptive Linear Units}},
author = {Zamora, Julio and Rhodes, Anthony D. and Nachman, Lama},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2022},
pages = {8988-8996},
doi = {10.1609/AAAI.V36I8.20882},
url = {https://mlanthology.org/aaai/2022/zamora2022aaai-fractional/}
}