Adaptive Activation Functions Using Fractional Calculus
Abstract
We introduce a generalized methodology for the automatic selection of activation functions inside a neural network, taking advantage of concepts defined in fractional calculus. This methodology enables the neural network to define and optimize its own activation functions during training by treating the fractional order of the derivative of a given primitive activation function as an additional trainable hyper-parameter. Following this approach, the neurons in the network can adjust their activation functions, e.g. from MLP to RBF behavior, to best fit the input data and reduce the output error. The results show the benefits of this technique implemented on a ResNet18 topology, which outperforms the accuracy of a ResNet100 trained with CIFAR10 reported in the literature.
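To illustrate the idea of a trainable fractional derivative order, the sketch below shows one possible way such an activation could be implemented. It is not the authors' code: the class name, the choice of sigmoid as the primitive activation, and the use of a truncated Grünwald–Letnikov finite-difference approximation of the fractional derivative are all assumptions made for this example, as are the default values of the truncation length and step size.

```python
# Hypothetical sketch of an activation layer whose fractional derivative order
# `alpha` is learned jointly with the network weights (assumed PyTorch).
import torch
import torch.nn as nn


class FractionalSigmoid(nn.Module):
    """Sigmoid with a learnable fractional derivative order `alpha`.

    alpha = 0 recovers the sigmoid itself (MLP-like response); alpha = 1 gives
    its first derivative, a bell-shaped curve (RBF-like response); intermediate
    orders interpolate between the two. Defaults here are illustrative only.
    """

    def __init__(self, alpha_init: float = 0.5, n_terms: int = 16, h: float = 0.1):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(alpha_init))  # trainable order
        self.n_terms = n_terms  # truncation of the Grünwald-Letnikov series
        self.h = h              # step size of the finite-difference grid

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Grünwald-Letnikov weights w_k = (-1)^k * C(alpha, k), built with the
        # recursion w_k = w_{k-1} * (1 - (alpha + 1) / k), differentiable in alpha.
        weights = [torch.ones((), device=x.device, dtype=x.dtype)]
        for k in range(1, self.n_terms):
            weights.append(weights[-1] * (1.0 - (self.alpha + 1.0) / k))
        w = torch.stack(weights)  # shape: (n_terms,)

        # D^alpha f(x) ~= h^(-alpha) * sum_k w_k * f(x - k*h)
        k = torch.arange(self.n_terms, device=x.device, dtype=x.dtype)
        shifted = torch.sigmoid(x.unsqueeze(-1) - k * self.h)  # (..., n_terms)
        return (shifted * w).sum(dim=-1) / (self.h ** self.alpha)


if __name__ == "__main__":
    act = FractionalSigmoid()
    y = act(torch.randn(4, 8))
    y.sum().backward()  # gradients flow into alpha as well as the input
    print(y.shape, act.alpha.grad)
```

In this sketch `alpha` is just another parameter seen by the optimizer, so each layer (or each neuron, if `alpha` is made a vector) can drift toward the activation shape that best fits its inputs during training.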
Cite
Text
Zamora-Esquivel et al. "Adaptive Activation Functions Using Fractional Calculus." IEEE/CVF International Conference on Computer Vision Workshops, 2019. doi:10.1109/ICCVW.2019.00250

Markdown
[Zamora-Esquivel et al. "Adaptive Activation Functions Using Fractional Calculus." IEEE/CVF International Conference on Computer Vision Workshops, 2019.](https://mlanthology.org/iccvw/2019/zamoraesquivel2019iccvw-adaptive-a/) doi:10.1109/ICCVW.2019.00250

BibTeX
@inproceedings{zamoraesquivel2019iccvw-adaptive-a,
title = {{Adaptive Activation Functions Using Fractional Calculus}},
author = {Zamora-Esquivel, Julio and Vargas, Jesus Adan Cruz and Pérez, José Rodrigo Camacho and Lopez-Meyer, Paulo and Cordourier, Héctor A. and Tickoo, Omesh},
booktitle = {IEEE/CVF International Conference on Computer Vision Workshops},
year = {2019},
pages = {2006-2013},
doi = {10.1109/ICCVW.2019.00250},
url = {https://mlanthology.org/iccvw/2019/zamoraesquivel2019iccvw-adaptive-a/}
}