Local Grouping for Optical Flow

Abstract

Optical flow estimation requires spatial integration, which essentially poses a grouping question: which points belong to the same motion and which do not. Classical local approaches to optical flow, such as Lucas-Kanade, use isotropic neighborhoods and have considerable difficulty near motion boundaries. In this work we utilize image-based grouping to facilitate spatial- and scale-adaptive integration. We define soft spatial support using pairwise affinities computed with the intervening contour cue. We sample images at edges and corners, and iteratively estimate affine motion at the sample points. Figure-ground organization further improves grouping and flow estimation near boundaries. We show that affinity-based spatial integration enables reliable flow estimation and avoids erroneous motion propagation from and across object boundaries. We demonstrate our approach on the Middlebury flow dataset.
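The core idea of soft spatial support can be illustrated with a weighted variant of Lucas-Kanade: instead of an isotropic window, each pixel's brightness-constancy constraint is weighted by its affinity to the center point. The sketch below is a minimal illustration, not the paper's method; in particular, the affinity here is a simple intensity-similarity weight standing in for the intervening-contour affinities, and all function names and parameters are illustrative assumptions.

```python
import numpy as np

def weighted_lucas_kanade(I1, I2, x, y, win=9, sigma_i=0.5):
    """Estimate flow (u, v) at pixel (x, y) by weighted least squares on the
    brightness-constancy constraint Ix*u + Iy*v + It = 0.

    The per-pixel weights play the role of the paper's affinity-based soft
    support; here they are a placeholder intensity-similarity weight (an
    assumption), not the intervening-contour affinities used in the paper.
    """
    # Spatial and temporal gradients (simple finite differences).
    Iy_full, Ix_full = np.gradient(I1)   # np.gradient returns (d/drow, d/dcol)
    It_full = I2 - I1

    r = win // 2
    ys, xs = slice(y - r, y + r + 1), slice(x - r, x + r + 1)
    Ix = Ix_full[ys, xs].ravel()
    Iy = Iy_full[ys, xs].ravel()
    b = -It_full[ys, xs].ravel()

    # Soft support: pixels whose intensity resembles the center get more weight,
    # so constraints from across a (contrast) boundary are down-weighted.
    patch = I1[ys, xs].ravel()
    w = np.exp(-((patch - I1[y, x]) ** 2) / (2.0 * sigma_i ** 2))

    # Solve the 2x2 weighted normal equations  (A^T W A) uv = A^T W b.
    A = np.stack([Ix, Iy], axis=1)
    Aw = A * w[:, None]
    uv, *_ = np.linalg.lstsq(Aw.T @ A, Aw.T @ b, rcond=None)
    return uv  # array [u, v]
```

On a smooth synthetic pair with a small known translation, the recovered (u, v) closely matches the true displacement inside the window; near a strong intensity edge the weights suppress constraints from the other side, which is the qualitative effect the paper's affinity-based support achieves with far richer grouping cues.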

Cite

Text

Ren. "Local Grouping for Optical Flow." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2008. doi:10.1109/CVPR.2008.4587536

Markdown

[Ren. "Local Grouping for Optical Flow." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2008.](https://mlanthology.org/cvpr/2008/ren2008cvpr-local/) doi:10.1109/CVPR.2008.4587536

BibTeX

@inproceedings{ren2008cvpr-local,
  title     = {{Local Grouping for Optical Flow}},
  author    = {Ren, Xiaofeng},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year      = {2008},
  doi       = {10.1109/CVPR.2008.4587536},
  url       = {https://mlanthology.org/cvpr/2008/ren2008cvpr-local/}
}