Live Face Verification with Multiple Instantialized Local Homographic Parameterization
Abstract
State-of-the-art live face verification methods can easily be attacked by a recorded facial expression sequence. This work directly addresses this issue by proposing a patch-wise motion-parameterization-based verification network. The method directly exploits the subtle motion differences between facial movements re-captured from a planar screen (e.g., a pad) and those of a real face; therefore, interactive facial expressions are no longer required. Furthermore, inspired by the fact that "a fake facial movement sequence must contain many patch-wise fake sequences", we embed our network into a multiple instance learning framework, which further improves the recall rate of the proposed technique. Extensive experimental results on several face benchmarks demonstrate the superior performance of our method.
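As a rough, hypothetical illustration of the intuition behind the abstract (not the authors' network), the sketch below fits a single global homography to patch-wise motion between two frames and aggregates per-patch planarity scores in a MIL-like fashion: motion re-captured from a planar screen is well explained by one homography, whereas patches on a real, non-planar face deviate from it. All function names and thresholds here are assumptions introduced for illustration.

```python
# Hedged sketch, not the authors' implementation: fit one homography to
# patch motions (DLT) and flag the sequence when most patch instances are
# consistent with a single plane (MIL-style aggregation over instances).
import numpy as np

def fit_homography(src, dst):
    """Estimate a 3x3 homography from N >= 4 point pairs via the DLT."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(A))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def reprojection_residuals(H, src, dst):
    """Per-point distance (in pixels) between H * src and dst."""
    ones = np.ones((len(src), 1))
    proj = np.hstack([src, ones]) @ H.T
    proj = proj[:, :2] / proj[:, 2:3]
    return np.linalg.norm(proj - dst, axis=1)

def mil_bag_score(patch_residuals, planar_thresh=1.0, ratio=0.5):
    """Treat each patch as an instance; suspect a replay attack when a
    large fraction of instances are consistent with one plane."""
    instance_is_planar = patch_residuals < planar_thresh
    return instance_is_planar.mean() >= ratio

# Toy usage: tracked patch centers in two consecutive frames.
rng = np.random.default_rng(0)
src = rng.uniform(0, 100, size=(32, 2))
dst = src + np.array([2.0, -1.0])   # pure in-plane shift -> "replay-like" motion
H = fit_homography(src, dst)
res = reprojection_residuals(H, src, dst)
print("replay suspected:", mil_bag_score(res))
```

In the paper itself, the patch-wise motion parameterization feeds a verification network rather than a fixed residual threshold; the sketch only illustrates why planar re-capture and real 3D facial motion are separable at the patch level.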
Cite
Text
Lin et al. "Live Face Verification with Multiple Instantialized Local Homographic Parameterization." International Joint Conference on Artificial Intelligence, 2018. doi:10.24963/IJCAI.2018/113
Markdown
[Lin et al. "Live Face Verification with Multiple Instantialized Local Homographic Parameterization." International Joint Conference on Artificial Intelligence, 2018.](https://mlanthology.org/ijcai/2018/lin2018ijcai-live/) doi:10.24963/IJCAI.2018/113
BibTeX
@inproceedings{lin2018ijcai-live,
title = {{Live Face Verification with Multiple Instantialized Local Homographic Parameterization}},
author = {Lin, Chen and Liao, Zhouyingcheng and Zhou, Peng and Hu, Jianguo and Ni, Bingbing},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2018},
pages = {814-820},
doi = {10.24963/IJCAI.2018/113},
url = {https://mlanthology.org/ijcai/2018/lin2018ijcai-live/}
}