TorchOpt: An Efficient Library for Differentiable Optimization

Abstract

Differentiable optimization algorithms often involve expensive computations of various meta-gradients. To address this, we design and implement TorchOpt, a new PyTorch-based differentiable optimization library. TorchOpt provides an expressive and unified programming interface that simplifies the implementation of explicit, implicit, and zero-order gradients. Moreover, TorchOpt has a distributed execution runtime capable of parallelizing diverse operations linked to differentiable optimization tasks across CPU and GPU devices. Experimental results demonstrate that TorchOpt achieves a 5.2× training-time speedup in a cluster. TorchOpt is open-sourced at https://github.com/metaopt/torchopt and has become a PyTorch Ecosystem project.
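To make the "explicit gradient" setting concrete, the sketch below shows the underlying pattern in plain PyTorch (not TorchOpt's own API): one inner SGD step is kept on the autograd graph via `create_graph=True`, so an outer loss evaluated at the updated weight can be differentiated back to meta-parameters such as the initialization and the inner learning rate. The toy data and variable names are illustrative assumptions; TorchOpt's meta-optimizers automate and accelerate this pattern.

```python
import torch

# Meta-parameters (illustrative): the initial weight and the inner-loop
# learning rate, both of which we want meta-gradients for.
w = torch.tensor([1.0], requires_grad=True)
lr = torch.tensor(0.1, requires_grad=True)

# Toy regression data (assumed for the sketch).
x, y = torch.tensor([2.0]), torch.tensor([4.0])

# Inner step: one SGD update, kept differentiable with create_graph=True
# so the update itself stays on the autograd graph.
inner_loss = ((w * x - y) ** 2).mean()
(g,) = torch.autograd.grad(inner_loss, w, create_graph=True)
w_new = w - lr * g

# Outer (meta) loss evaluated at the updated weight; differentiating it
# w.r.t. (w, lr) yields explicit meta-gradients through the inner step.
outer_loss = ((w_new * x - y) ** 2).mean()
meta_grad_w, meta_grad_lr = torch.autograd.grad(outer_loss, (w, lr))
```

Implicit and zero-order gradients replace this unrolled graph with, respectively, the implicit function theorem at an inner fixed point and gradient-free perturbation estimates, which avoids storing the inner-loop computation graph.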

Cite

Text

Ren et al. "TorchOpt: An Efficient Library for Differentiable Optimization." Machine Learning Open Source Software, 2023.

Markdown

[Ren et al. "TorchOpt: An Efficient Library for Differentiable Optimization." Machine Learning Open Source Software, 2023.](https://mlanthology.org/mloss/2023/ren2023jmlr-torchopt/)

BibTeX

@article{ren2023jmlr-torchopt,
  title     = {{TorchOpt: An Efficient Library for Differentiable Optimization}},
  author    = {Ren, Jie and Feng, Xidong and Liu, Bo and Pan, Xuehai and Fu, Yao and Mai, Luo and Yang, Yaodong},
  journal   = {Machine Learning Open Source Software},
  year      = {2023},
  pages     = {1--14},
  volume    = {24},
  url       = {https://mlanthology.org/mloss/2023/ren2023jmlr-torchopt/}
}