Maximum Response Filters for Texture Analysis
Abstract
Current texture analysis focuses either on gathering correlations between image patches and filters, or on explicitly modeling the dependencies between pixels. Both strategies are unable to cope directly with changes in scale or, more generally, in viewpoint and illumination. To accommodate these extra variations, texture segmentation methods analyze the texture over multiple scales, and classification algorithms include multiple models for a single texture class. We propose a filter-based texture model that allows for a more compact texture representation, independent of viewpoint and illumination. This is achieved by locally optimizing the filter responses through a predefined set of transformations of the filter support. Results are shown for both texture classification and texture segmentation experiments.
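The core idea of maximum-response filtering can be illustrated with a small sketch: convolve an image with a filter under several transformations of its support (here, rotations of an oriented derivative-of-Gaussian kernel) and keep only the strongest response at each pixel. This is a minimal NumPy illustration in the spirit of MR-style filter banks, not the paper's method; the kernel shape, sizes, and number of orientations are assumptions chosen for readability.

```python
import numpy as np

def oriented_kernel(size, sigma_x, sigma_y, theta):
    # Illustrative anisotropic derivative-of-Gaussian kernel rotated by theta.
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-(xr**2 / (2 * sigma_x**2) + yr**2 / (2 * sigma_y**2)))
    k = -xr / sigma_x**2 * g          # derivative along the rotated x-axis
    return k - k.mean()               # zero mean: flat regions respond ~0

def convolve2d_valid(img, kern):
    # Naive 'valid'-mode 2D filtering in pure NumPy (no dependencies).
    kh, kw = kern.shape
    out = np.empty((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kern)
    return out

def max_response(img, size=7, n_orient=6):
    # Collapse the orientation dimension: per pixel, keep only the maximum
    # absolute response over all rotated copies of the filter.
    thetas = [k * np.pi / n_orient for k in range(n_orient)]
    stack = np.stack([np.abs(convolve2d_valid(img, oriented_kernel(size, 3.0, 1.0, t)))
                      for t in thetas])
    return stack.max(axis=0)

# Synthetic vertically striped texture patch.
img = np.tile(np.sin(np.linspace(0, 6 * np.pi, 32)), (32, 1))
resp = max_response(img)
```

Because the maximum is taken over the transformed filter copies, a rotated version of the same texture yields a similar response map, which is what makes the resulting representation more compact than storing one model per orientation.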
Cite
Text
Caenen and Van Gool. "Maximum Response Filters for Texture Analysis." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2004. doi:10.1109/CVPR.2004.393

Markdown

[Caenen and Van Gool. "Maximum Response Filters for Texture Analysis." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2004.](https://mlanthology.org/cvpr/2004/caenen2004cvpr-maximum/) doi:10.1109/CVPR.2004.393

BibTeX
@inproceedings{caenen2004cvpr-maximum,
title = {{Maximum Response Filters for Texture Analysis}},
author = {Caenen, Geert and Van Gool, Luc},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
year = {2004},
pages = {58},
doi = {10.1109/CVPR.2004.393},
url = {https://mlanthology.org/cvpr/2004/caenen2004cvpr-maximum/}
}