Gender and Smile Classification Using Deep Convolutional Neural Networks
Abstract
Facial gender and smile classification in unconstrained environments is challenging due to the large, unavoidable variations of face images. In this paper, we propose a deep model composed of GNet and SNet for these two tasks. We leverage multi-task learning and a general-to-specific fine-tuning scheme to enhance the performance of our model. Our strategies exploit the inherent correlation between face identity, smile, gender, and other face attributes to mitigate over-fitting on small training sets and improve classification performance. We also propose a task-aware face cropping scheme to extract attribute-specific regions. Experimental results on the ChaLearn 2016 FotW dataset for gender and smile classification demonstrate the effectiveness of our proposed methods.
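The multi-task idea sketched in the abstract can be illustrated with a toy model: a shared feature extractor feeding two task-specific classification heads, one for gender and one for smile. This is a minimal NumPy sketch with hypothetical shapes and random weights, not the authors' GNet/SNet architecture or training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def shared_features(x, W):
    # Toy "backbone": one linear layer + ReLU standing in for a deep CNN.
    return np.maximum(x @ W, 0.0)

def task_head(h, V):
    # Per-task linear classifier followed by a softmax over 2 classes.
    z = h @ V
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

x = rng.normal(size=(4, 8))           # 4 toy "face" feature vectors
W = rng.normal(size=(8, 16))          # shared backbone weights
V_gender = rng.normal(size=(16, 2))   # gender head (male / female)
V_smile = rng.normal(size=(16, 2))    # smile head (smiling / not smiling)

h = shared_features(x, W)             # features shared by both tasks
p_gender = task_head(h, V_gender)
p_smile = task_head(h, V_smile)
print(p_gender.shape, p_smile.shape)  # (4, 2) (4, 2)
```

Because both heads are trained from the same shared features, supervision for one attribute can regularize the representation used by the other, which is the intuition behind using correlated attributes to reduce over-fitting.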
Cite
Text
Zhang et al. "Gender and Smile Classification Using Deep Convolutional Neural Networks." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2016. doi:10.1109/CVPRW.2016.97
Markdown
[Zhang et al. "Gender and Smile Classification Using Deep Convolutional Neural Networks." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2016.](https://mlanthology.org/cvprw/2016/zhang2016cvprw-gender/) doi:10.1109/CVPRW.2016.97
BibTeX
@inproceedings{zhang2016cvprw-gender,
title = {{Gender and Smile Classification Using Deep Convolutional Neural Networks}},
author = {Zhang, Kaipeng and Tan, Lianzhi and Li, Zhifeng and Qiao, Yu},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2016},
  pages = {739--743},
doi = {10.1109/CVPRW.2016.97},
url = {https://mlanthology.org/cvprw/2016/zhang2016cvprw-gender/}
}