Self-Driving Multimodal Studies at User Facilities

Abstract

Multimodal characterization is commonly required for understanding materials. User facilities possess the infrastructure to perform these measurements, albeit serially over days to months. In this paper, we describe a unified multimodal measurement of a single sample library at distant instruments, driven by a concert of distributed agents that use analysis from each modality to inform the direction of the other in real time. Powered by the Bluesky project at the National Synchrotron Light Source II, this experiment is a first for beamline science and provides a blueprint for future approaches to multimodal and multifidelity experiments at user facilities.
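The coordination pattern the abstract describes, in which agents at different instruments share analysis results and let each modality steer the other's next measurement, can be illustrated with a minimal sketch. This is not the authors' code or the Bluesky API; the agent class, the hinting rule, and the stand-in "measurement" are all hypothetical, invented only to show the feedback loop between two modalities over a shared sample library.

```python
# Hypothetical sketch: two instrument agents exchange analysis reports
# and prefer to measure points the other modality flagged as interesting.
import queue

class Agent:
    def __init__(self, name, points):
        self.name = name
        self.inbox = queue.Queue()     # analysis reports from the other agent
        self.unmeasured = set(points)  # positions on the shared sample library
        self.measured = {}

    def suggest(self):
        """Pick the next point, preferring hints from the other modality."""
        while not self.inbox.empty():
            report = self.inbox.get()
            hinted = report["interesting"] & self.unmeasured
            if hinted:
                return min(hinted)
        return min(self.unmeasured)    # fall back to a simple sweep

    def measure_and_report(self, point, other):
        """Measure one point and share the analysis with the other agent."""
        value = hash((self.name, point)) % 100  # stand-in for a real measurement
        self.unmeasured.discard(point)
        self.measured[point] = value
        if value > 50:                 # toy analysis: flag high-signal points
            other.inbox.put({"from": self.name, "interesting": {point}})

points = range(5)
xrd, xafs = Agent("XRD", points), Agent("XAFS", points)
for _ in points:
    for agent, other in ((xrd, xafs), (xafs, xrd)):
        agent.measure_and_report(agent.suggest(), other)
```

In this toy loop both agents cover the full library, but the order in which each visits points is reshaped in real time by the other's reports, which is the essence of the multimodal steering the paper demonstrates at facility scale.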

Cite

Text

Maffettone et al. "Self-Driving Multimodal Studies at User Facilities." NeurIPS 2022 Workshops: AI4Mat, 2022.

Markdown

[Maffettone et al. "Self-Driving Multimodal Studies at User Facilities." NeurIPS 2022 Workshops: AI4Mat, 2022.](https://mlanthology.org/neuripsw/2022/maffettone2022neuripsw-selfdriving/)

BibTeX

@inproceedings{maffettone2022neuripsw-selfdriving,
  title     = {{Self-Driving Multimodal Studies at User Facilities}},
  author    = {Maffettone, Phillip and Allan, Daniel and Campbell, Stuart Ian and Carbone, Matthew R and Caswell, Thomas and DeCost, Brian L and Gavrilov, Dmitri and Hanwell, Marcus and Joress, Howie and Lynch, Joshua and Ravel, Bruce and Wilkins, Stuart and Wlodek, Jakub and Olds, Daniel},
  booktitle = {NeurIPS 2022 Workshops: AI4Mat},
  year      = {2022},
  url       = {https://mlanthology.org/neuripsw/2022/maffettone2022neuripsw-selfdriving/}
}