Blurry-Edges: Photon-Limited Depth Estimation from Defocused Boundaries

Abstract

Extracting depth information from photon-limited, defocused images is challenging because depth from defocus (DfD) relies on accurate estimation of defocus blur, which is fundamentally sensitive to image noise. We present a novel approach to robustly measure object depth along defocused boundaries in photon-limited images. It is based on a new image patch representation, Blurry-Edges, that explicitly stores and visualizes a rich set of low-level patch information, including boundaries, color, and smoothness. We develop a deep neural network architecture that predicts the Blurry-Edges representation from a pair of differently defocused images, from which depth can be calculated using a closed-form DfD relation we derive. Experimental results on synthetic and real data show that our method achieves the highest depth estimation accuracy on photon-limited images compared to a broad range of state-of-the-art DfD methods.
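For intuition on why a pair of differently defocused images encodes depth, the sketch below implements the standard textbook thin-lens blur model, not the paper's own closed-form DfD relation (which is derived from the Blurry-Edges representation and is not reproduced here). All parameter values are illustrative assumptions.

```python
# Textbook thin-lens defocus model -- illustrative only; the paper derives
# its own closed-form DfD relation from the Blurry-Edges representation.

def image_distance(f, d):
    """Distance behind the lens at which a point at depth d focuses,
    from the thin-lens equation 1/f = 1/d + 1/d_i."""
    return 1.0 / (1.0 / f - 1.0 / d)

def blur_diameter(d, f, aperture, d_focus):
    """Diameter of the blur circle on the sensor for a point at depth d,
    given focal length f, aperture diameter, and focus distance d_focus
    (all in meters)."""
    v = image_distance(f, d_focus)   # lens-to-sensor distance
    d_i = image_distance(f, d)       # where the point actually focuses
    return aperture * abs(d_i - v) / d_i

# A point at the focus distance produces (ideally) zero blur; points
# farther from it blur more. Two images with different focus settings
# yield two blur measurements whose combination disambiguates depth --
# the classical two-shot DfD idea that noise makes hard to exploit.
```

Because a single blur measurement is ambiguous (points in front of and behind the focal plane can blur equally), classical DfD compares blur across two capture settings; the paper's contribution is making that comparison robust when photon noise corrupts the blur estimate.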

Cite

Text

Xu et al. "Blurry-Edges: Photon-Limited Depth Estimation from Defocused Boundaries." Conference on Computer Vision and Pattern Recognition, 2025. doi:10.1109/CVPR52734.2025.00049

Markdown

[Xu et al. "Blurry-Edges: Photon-Limited Depth Estimation from Defocused Boundaries." Conference on Computer Vision and Pattern Recognition, 2025.](https://mlanthology.org/cvpr/2025/xu2025cvpr-blurryedges/) doi:10.1109/CVPR52734.2025.00049

BibTeX

@inproceedings{xu2025cvpr-blurryedges,
  title     = {{Blurry-Edges: Photon-Limited Depth Estimation from Defocused Boundaries}},
  author    = {Xu, Wei and Wagner, Charles James and Luo, Junjie and Guo, Qi},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2025},
  pages     = {432--441},
  doi       = {10.1109/CVPR52734.2025.00049},
  url       = {https://mlanthology.org/cvpr/2025/xu2025cvpr-blurryedges/}
}