Proper Loss Functions for Nonlinear Hawkes Processes
Abstract
Temporal point processes are a statistical framework for modelling the times at which events of interest occur. The Hawkes process is a well-studied instance of this framework that captures self-exciting behaviour, wherein the occurrence of one event increases the likelihood of future events. Such processes have been successfully applied to model phenomena ranging from earthquakes to behaviour in a social network. We propose a framework to design new loss functions to train linear and nonlinear Hawkes processes. This captures standard maximum likelihood as a special case, but allows for other losses that guarantee convex objective functions (for certain types of kernel), and admit simpler optimisation. We illustrate these points with three concrete examples: for linear Hawkes processes, we provide a least-squares style loss potentially admitting closed-form optimisation; for exponential Hawkes processes, we reduce training to a weighted logistic regression; and for sigmoidal Hawkes processes, we propose an asymmetric form of logistic regression.
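As context for the losses the abstract describes, the following is a minimal sketch (not the paper's proposed method) of the standard maximum-likelihood objective for a univariate Hawkes process with an exponential kernel, i.e. intensity λ(t) = μ + Σ_{t_i < t} α·exp(−β(t − t_i)). The parameter names (mu, alpha, beta) and the toy event sequence are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def hawkes_neg_log_likelihood(event_times, T, mu, alpha, beta):
    """Negative log-likelihood of events on [0, T] under a univariate
    Hawkes process with exponential kernel (illustrative sketch)."""
    event_times = np.asarray(event_times, dtype=float)
    log_intensity = 0.0
    for i, t in enumerate(event_times):
        past = event_times[:i]
        # Conditional intensity at the i-th event time.
        lam = mu + alpha * np.exp(-beta * (t - past)).sum()
        log_intensity += np.log(lam)
    # Compensator: integral of the intensity over [0, T], which is
    # available in closed form for the exponential kernel.
    compensator = mu * T + (alpha / beta) * (1.0 - np.exp(-beta * (T - event_times))).sum()
    return compensator - log_intensity

# Usage: evaluate the loss on a toy event sequence.
print(hawkes_neg_log_likelihood([0.5, 1.2, 1.3, 4.0], T=5.0, mu=0.4, alpha=0.6, beta=1.5))
```

Minimising this quantity in (mu, alpha, beta) is the standard maximum-likelihood training that the paper recovers as a special case of its loss framework.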
Cite
Text
Menon and Lee. "Proper Loss Functions for Nonlinear Hawkes Processes." AAAI Conference on Artificial Intelligence, 2018. doi:10.1609/AAAI.V32I1.11615
Markdown
[Menon and Lee. "Proper Loss Functions for Nonlinear Hawkes Processes." AAAI Conference on Artificial Intelligence, 2018.](https://mlanthology.org/aaai/2018/menon2018aaai-proper/) doi:10.1609/AAAI.V32I1.11615
BibTeX
@inproceedings{menon2018aaai-proper,
title = {{Proper Loss Functions for Nonlinear Hawkes Processes}},
author = {Menon, Aditya Krishna and Lee, Young},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2018},
pages = {3804--3811},
doi = {10.1609/AAAI.V32I1.11615},
url = {https://mlanthology.org/aaai/2018/menon2018aaai-proper/}
}