Optimal Attack Against Autoregressive Models by Manipulating the Environment

Abstract

We describe an optimal adversarial attack formulation against autoregressive time series forecasting using the Linear Quadratic Regulator (LQR). In this threat model, the environment evolves according to a dynamical system; an autoregressive model observes the current environment state and predicts its future values; an attacker has the ability to modify the environment state in order to manipulate future autoregressive forecasts. The attacker's goal is to force the autoregressive forecasts to track a target trajectory while minimizing its attack expenditure. In the white-box setting, where the attacker knows the environment and forecast models, we present the optimal attack using LQR for linear models and Model Predictive Control (MPC) for nonlinear models. In the black-box setting, we combine system identification and MPC. Experiments demonstrate the effectiveness of our attacks.
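To make the LQR-based white-box attack concrete, below is a minimal finite-horizon LQR sketch in NumPy. It assumes linear dynamics x_{t+1} = A x_t + B u_t and a quadratic cost trading off deviation of the state from the target trajectory (taken as the origin here for simplicity) against attack effort; the matrices A, B, Q, R, the horizon T, and all variable names are illustrative placeholders rather than the paper's exact formulation.

```python
# Minimal finite-horizon LQR sketch (NumPy only). The dynamics and cost weights
# below are illustrative assumptions, not the paper's exact attack model.
import numpy as np

def finite_horizon_lqr(A, B, Q, R, T):
    """Backward Riccati recursion; returns time-varying feedback gains K_t."""
    P = Q.copy()          # terminal cost weight P_T = Q
    gains = []
    for _ in range(T):
        # K_t = (R + B^T P B)^{-1} B^T P A
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        # Riccati update: P <- Q + A^T P (A - B K)
        P = Q + A.T @ P @ (A - B @ K)
        gains.append(K)
    return gains[::-1]    # reorder so gains[t] is the gain applied at time t

if __name__ == "__main__":
    # Toy 2-state environment the attacker nudges through a scalar input u_t.
    A = np.array([[1.0, 0.1],
                  [0.0, 1.0]])
    B = np.array([[0.0],
                  [0.1]])
    Q = np.eye(2)          # penalize deviation of the state from the target (origin)
    R = 0.01 * np.eye(1)   # penalize attack expenditure
    T = 50

    K = finite_horizon_lqr(A, B, Q, R, T)

    x = np.array([1.0, 0.0])   # initial environment state
    for t in range(T):
        u = -K[t] @ x          # attacker's perturbation at time t
        x = A @ x + B @ u
    print("final state after attack:", x)
```

In the paper's setting the quadratic state cost would be placed on the forecaster's predictions relative to the attacker's target trajectory, which leads to a tracking (time-varying) variant of the recursion above; the sketch only illustrates the regulator machinery.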

Cite

Text

Chen and Zhu. "Optimal Attack Against Autoregressive Models by Manipulating the Environment." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I04.5760

Markdown

[Chen and Zhu. "Optimal Attack Against Autoregressive Models by Manipulating the Environment." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/chen2020aaai-optimal/) doi:10.1609/AAAI.V34I04.5760

BibTeX

@inproceedings{chen2020aaai-optimal,
  title     = {{Optimal Attack Against Autoregressive Models by Manipulating the Environment}},
  author    = {Chen, Yiding and Zhu, Xiaojin},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2020},
  pages     = {3545--3552},
  doi       = {10.1609/AAAI.V34I04.5760},
  url       = {https://mlanthology.org/aaai/2020/chen2020aaai-optimal/}
}