Order Parameters for Minimax Entropy Distributions: When Does High Level Knowledge Help?
Abstract
Many problems in vision can be formulated as Bayesian inference. It is important to determine the accuracy of these inferences and how they depend on the problem domain. In recent work, Coughlan and Yuille showed that, for a restricted class of problems, the performance of Bayesian inference can be summarized by an order parameter K that depends on the probability distributions characterizing the problem domain. In this paper we generalize the theory of order parameters so that it applies to domains whose probability models are obtained by Minimax Entropy learning theory. By analyzing order parameters it is possible to determine whether a target can be detected using a general purpose "generic" model or whether a more specific "high-level" model is needed. At critical values of the order parameters the problem becomes unsolvable without the addition of extra prior knowledge.
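The abstract notes that the order parameter K is computed from the probability distributions characterizing the problem domain, with detection becoming impossible below a critical value. As a minimal illustrative sketch (not the paper's actual definition of K), quantities of this kind are built from Kullback-Leibler divergences between "on-target" and "off-target" response distributions; the toy distributions below are hypothetical:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) between two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical filter-response distributions: on-target vs. off-target (background).
p_on = [0.7, 0.2, 0.1]
p_off = [0.2, 0.3, 0.5]

# An order-parameter-style quantity: the more P_on and P_off differ,
# the easier detection is. In the order-parameter framework, detection
# becomes unsolvable when such a quantity drops below a critical value
# (here, loosely, when the divergence approaches zero).
K = kl_divergence(p_on, p_off)
print("K =", K, "-> detectable" if K > 0 else "-> undetectable")
```

When P_on and P_off coincide, the divergence vanishes and no inference procedure can distinguish target from background, which is the intuition behind the critical values discussed in the abstract.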
Cite

Text

Yuille et al. "Order Parameters for Minimax Entropy Distributions: When Does High Level Knowledge Help?" IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2000. doi:10.1109/CVPR.2000.855869

Markdown

[Yuille et al. "Order Parameters for Minimax Entropy Distributions: When Does High Level Knowledge Help?" IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2000.](https://mlanthology.org/cvpr/2000/yuille2000cvpr-order/) doi:10.1109/CVPR.2000.855869

BibTeX
@inproceedings{yuille2000cvpr-order,
title = {{Order Parameters for Minimax Entropy Distributions: When Does High Level Knowledge Help?}},
author = {Yuille, Alan L. and Coughlan, James M. and Zhu, Song Chun and Wu, Ying Nian},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
year = {2000},
pages = {1558-1565},
doi = {10.1109/CVPR.2000.855869},
url = {https://mlanthology.org/cvpr/2000/yuille2000cvpr-order/}
}