ODIN: An OmniDirectional INdoor Dataset Capturing Activities of Daily Living from Multiple Synchronized Modalities

Abstract

We introduce ODIN (the OmniDirectional INdoor dataset), the first large-scale multi-modal dataset aimed at spurring research on human behaviour understanding with top-view omnidirectional cameras. Recorded in real-life indoor environments with varying levels of occlusion, the dataset contains images of participants performing various activities of daily living. Along with the omnidirectional images, additional synchronized data modalities are provided: (1) RGB, infrared, and depth images from multiple RGB-D cameras, (2) egocentric videos, (3) physiological signals and accelerometer readings from a smart bracelet, and (4) 3D scans of the recording environments. To the best of our knowledge, ODIN is also the first dataset to provide camera-frame 3D human pose estimates for omnidirectional images, obtained using our novel pipeline. The project is open-sourced and available at https://odin-dataset.github.io.
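
To make the composition of the synchronized modalities concrete, below is a minimal Python sketch of how the frames of one recording might be grouped per time step. The directory names, file extensions, and timestamp-based pairing are illustrative assumptions only, not the dataset's actual release layout or loader API; consult https://odin-dataset.github.io for the official structure.

from __future__ import annotations

from dataclasses import dataclass
from pathlib import Path


@dataclass
class OdinSample:
    """One synchronized time step of a recording (hypothetical grouping)."""
    omnidirectional: Path   # top-view omnidirectional image
    rgb: list[Path]         # RGB frames from the RGB-D cameras
    infrared: list[Path]    # infrared frames from the RGB-D cameras
    depth: list[Path]       # depth frames from the RGB-D cameras
    egocentric: Path        # egocentric video frame
    bracelet: Path          # physiological / accelerometer readings
    pose_3d: Path           # camera-frame 3D human pose estimate


def index_recording(root: Path) -> list[OdinSample]:
    """Pair files across modalities by a shared timestamp in the filename.

    All paths below are hypothetical placeholders for illustration.
    """
    samples = []
    for omni in sorted((root / "omnidirectional").glob("*.png")):
        ts = omni.stem  # assume filenames encode a common timestamp
        samples.append(
            OdinSample(
                omnidirectional=omni,
                rgb=sorted((root / "rgb").glob(f"*_{ts}.png")),
                infrared=sorted((root / "infrared").glob(f"*_{ts}.png")),
                depth=sorted((root / "depth").glob(f"*_{ts}.png")),
                egocentric=root / "egocentric" / f"{ts}.png",
                bracelet=root / "bracelet" / f"{ts}.csv",
                pose_3d=root / "poses" / f"{ts}.json",
            )
        )
    return samples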

Cite

Text

Ravi et al. "ODIN: An OmniDirectional INdoor Dataset Capturing Activities of Daily Living from Multiple Synchronized Modalities." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2023. doi:10.1109/CVPRW59228.2023.00690

Markdown

[Ravi et al. "ODIN: An OmniDirectional INdoor Dataset Capturing Activities of Daily Living from Multiple Synchronized Modalities." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2023.](https://mlanthology.org/cvprw/2023/ravi2023cvprw-odin/) doi:10.1109/CVPRW59228.2023.00690

BibTeX

@inproceedings{ravi2023cvprw-odin,
  title     = {{ODIN: An OmniDirectional INdoor Dataset Capturing Activities of Daily Living from Multiple Synchronized Modalities}},
  author    = {Ravi, Siddharth and Climent-Pérez, Pau and Morales, Théo and Huesca-Spairani, Carlo and Hashemifard, Kooshan and Flórez-Revuelta, Francisco},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
  year      = {2023},
  pages     = {6488--6497},
  doi       = {10.1109/CVPRW59228.2023.00690},
  url       = {https://mlanthology.org/cvprw/2023/ravi2023cvprw-odin/}
}