A Note on Mixtures of Experts for Multiclass Responses: Approximation Rate and Consistent Bayesian Inference

Abstract

We report that mixtures of m multinomial logistic regressions can be used to approximate a class of 'smooth' probability models for multiclass responses. When the log-odds have bounded second derivatives, the approximation rate is O(m^{-2/s}) in Hellinger distance or O(m^{-4/s}) in Kullback-Leibler divergence, where s = dim(x) is the dimension of the input space (i.e., the number of predictors). Given training data of size n, we also show that 'consistency' in multiclass regression and classification can be achieved, simultaneously for all classes, when posterior-based inference is performed in a Bayesian framework. Loosely speaking, such 'consistency' means that performance is often close to the best possible for large n. Consistency can be achieved either by taking m = m_n, or by taking m to be uniformly distributed on {1, ..., m_n} under the prior, where 1 ≺ m_n ≺ n^a in order as n grows, for some a ∈ (0, 1).
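To fix ideas, a mixture of m multinomial logistic experts models P(y | x) as a gate-weighted average of m softmax regressions. The sketch below is an illustrative NumPy implementation of that prediction rule only (parameter names and shapes are our own; it is not the authors' code and does no fitting or Bayesian inference):

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax along the given axis.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def moe_predict(x, gate_W, gate_b, expert_Ws, expert_bs):
    """Class probabilities P(y | x) under a mixture of m multinomial
    logistic experts (hypothetical parameterization for illustration).

    x         : (s,)       input vector, s = dim(x)
    gate_W    : (s, m)     softmax gating network over the m experts
    gate_b    : (m,)
    expert_Ws : (m, s, K)  each expert is a multinomial logistic
    expert_bs : (m, K)     regression over K classes
    """
    # Mixing weights g_j(x): softmax gate over the m experts.
    g = softmax(x @ gate_W + gate_b)
    # Per-expert class probabilities, shape (m, K).
    p = softmax(np.einsum('s,msk->mk', x, expert_Ws) + expert_bs)
    # Mixture prediction: sum_j g_j(x) * p_j(y | x), shape (K,).
    return g @ p
```

Since each expert's softmax output sums to 1 and the gate weights sum to 1, the returned vector is itself a probability distribution over the K classes; increasing m enlarges the family of 'smooth' models this mixture can approximate, at the rates stated above.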

Cite

Text

Ge and Jiang. "A Note on Mixtures of Experts for Multiclass Responses: Approximation Rate and Consistent Bayesian Inference." International Conference on Machine Learning, 2006. doi:10.1145/1143844.1143886

Markdown

[Ge and Jiang. "A Note on Mixtures of Experts for Multiclass Responses: Approximation Rate and Consistent Bayesian Inference." International Conference on Machine Learning, 2006.](https://mlanthology.org/icml/2006/ge2006icml-note/) doi:10.1145/1143844.1143886

BibTeX

@inproceedings{ge2006icml-note,
  title     = {{A Note on Mixtures of Experts for Multiclass Responses: Approximation Rate and Consistent Bayesian Inference}},
  author    = {Ge, Yang and Jiang, Wenxin},
  booktitle = {International Conference on Machine Learning},
  year      = {2006},
  pages     = {329--335},
  doi       = {10.1145/1143844.1143886},
  url       = {https://mlanthology.org/icml/2006/ge2006icml-note/}
}