EgoPet: Egomotion and Interaction Data from an Animal's Perspective

Abstract

Animals perceive the world to plan their actions and interact with other agents to accomplish complex tasks, demonstrating capabilities that are still unmatched by AI systems. To advance our understanding and reduce the gap between the capabilities of animals and AI systems, we introduce EgoPet, a dataset of pet egomotion imagery with diverse examples of simultaneous egomotion and multi-agent interaction. Existing video datasets contain egomotion and interaction examples separately, but rarely both at the same time. In addition, EgoPet offers a radically distinct perspective from existing egocentric datasets of humans or vehicles. We define two in-domain benchmark tasks that capture animal behavior, and a third benchmark to assess the utility of EgoPet as a pretraining resource for robotic quadruped locomotion, showing that models trained on EgoPet outperform those trained on prior datasets.

Project page: www.amirbar.net/egopet

Cite

Text

Bar et al. "EgoPet: Egomotion and Interaction Data from an Animal's Perspective." Proceedings of the European Conference on Computer Vision (ECCV), 2024. doi:10.1007/978-3-031-72913-3_21

Markdown

[Bar et al. "EgoPet: Egomotion and Interaction Data from an Animal's Perspective." Proceedings of the European Conference on Computer Vision (ECCV), 2024.](https://mlanthology.org/eccv/2024/bar2024eccv-egopet/) doi:10.1007/978-3-031-72913-3_21

BibTeX

@inproceedings{bar2024eccv-egopet,
  title     = {{EgoPet: Egomotion and Interaction Data from an Animal's Perspective}},
  author    = {Bar, Amir and Bakhtiar, Arya and Tran, Danny L. and Loquercio, Antonio and Rajasegaran, Jathushan and LeCun, Yann and Globerson, Amir and Darrell, Trevor},
  booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
  year      = {2024},
  doi       = {10.1007/978-3-031-72913-3_21},
  url       = {https://mlanthology.org/eccv/2024/bar2024eccv-egopet/}
}