A Short Review of Automatic Differentiation Pitfalls in Scientific Computing

Abstract

Automatic differentiation, also known as backpropagation, AD, autodiff, or algorithmic differentiation, is a popular technique for computing derivatives of computer programs. While AD has been successfully used in countless engineering, science, and machine learning applications, it can nevertheless sometimes produce surprising results. In this paper we categorize problematic usages of AD and illustrate each category with examples such as chaos, time-averages, discretizations, fixed-point loops, lookup tables, linear solvers, and probabilistic programs, in the hope that readers may more easily avoid or detect such pitfalls.
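One of the pitfall categories the abstract names, lookup tables, can be sketched with a toy forward-mode AD. This is a minimal illustration under assumptions, not code from the paper: the `Dual` class, the table resolution, and the test function are all invented for this sketch. Because a table lookup is piecewise constant in its input, AD propagates a derivative of exactly zero through it, silently corrupting the derivative of any program that uses it.

```python
import math

class Dual:
    """Forward-mode AD value: carries f(x) and df/dx together."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

# sin() replaced by a precomputed table, as a solver might do for speed.
TABLE = [math.sin(i / 100.0) for i in range(200)]

def table_sin(x):
    # Integer index truncation makes the output piecewise constant in x,
    # so the derivative seen by AD is exactly zero.
    return Dual(TABLE[int(x.val * 100)], 0.0)

def f(x):
    return x * table_sin(x)  # intended: f(x) = x * sin(x)

x = Dual(1.0, 1.0)                           # seed dx/dx = 1
ad = f(x).dot                                # AD reports sin(1) ~= 0.841
true = math.sin(1.0) + 1.0 * math.cos(1.0)   # true f'(1) ~= 1.382
```

The result is not an error or a NaN: AD returns a plausible-looking but wrong number, because the product rule picks up only the `x` factor's derivative while the table contributes zero. Replacing the table with an interpolating (and thus differentiable) approximation is one common remedy.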

Cite

Text

Hueckelheim et al. "A Short Review of Automatic Differentiation Pitfalls in Scientific Computing." ICML 2023 Workshops: Differentiable_Almost_Everything, 2023.

Markdown

[Hueckelheim et al. "A Short Review of Automatic Differentiation Pitfalls in Scientific Computing." ICML 2023 Workshops: Differentiable_Almost_Everything, 2023.](https://mlanthology.org/icmlw/2023/hueckelheim2023icmlw-short/)

BibTeX

@inproceedings{hueckelheim2023icmlw-short,
  title     = {{A Short Review of Automatic Differentiation Pitfalls in Scientific Computing}},
  author    = {Hueckelheim, Jan and Menon, Harshitha and Moses, William S. and Christianson, Bruce and Hovland, Paul and Hascoet, Laurent},
  booktitle = {ICML 2023 Workshops: Differentiable_Almost_Everything},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/hueckelheim2023icmlw-short/}
}