Entropic Graph Regularization in Non-Parametric Semi-Supervised Classification
Abstract
We prove certain theoretical properties of a graph-regularized transductive learning objective based on minimizing a Kullback-Leibler (KL) divergence-based loss. These include showing that the iterative alternating minimization procedure used to minimize the objective converges to the correct solution, and deriving a test for convergence. We also propose a graph node ordering algorithm that is cache-cognizant and leads to a linear speedup in parallel computations; this ensures that the algorithm scales to large data sets. In empirical evaluations on the TIMIT and Switchboard I corpora, we show that this approach outperforms other state-of-the-art semi-supervised learning (SSL) approaches. In one instance, we solve a problem on a 120-million-node graph.
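For a concrete sense of the alternating minimization the abstract refers to, below is a minimal NumPy sketch of a measure-propagation-style update on a weighted graph: each node carries two distributions over labels, and the procedure alternates between a smoothing step over neighbors and a step that pulls labeled nodes toward their empirical label distributions. The function name `measure_propagation`, the hyperparameters `mu` and `nu`, and the exact closed-form updates are illustrative assumptions here, not the paper's verbatim equations; the paper derives the precise updates, convergence guarantees, and a convergence test.

```python
import numpy as np

def measure_propagation(W, R, labeled, mu=1.0, nu=1e-2, n_iters=50):
    """Hedged sketch of KL-divergence-based alternating minimization.

    W       : (n, n) symmetric nonnegative edge-weight matrix.
    R       : (n, k) rows hold label distributions for labeled nodes.
    labeled : indices of labeled nodes.
    """
    n, k = R.shape
    p = np.full((n, k), 1.0 / k)   # entropy-regularized "smoothed" measures
    q = np.full((n, k), 1.0 / k)   # measures coupled to the labels
    is_labeled = np.zeros(n)
    is_labeled[labeled] = 1.0
    row_deg = W.sum(axis=1)        # sum_j w_ij
    col_deg = W.sum(axis=0)        # sum_j w_ji
    for _ in range(n_iters):
        # p-step: p_i(y) ∝ exp( mu * Σ_j w_ij log q_j(y) / (nu + mu * deg_i) )
        log_p = (mu * (W @ np.log(q + 1e-300))) / (nu + mu * row_deg)[:, None]
        log_p -= log_p.max(axis=1, keepdims=True)   # numerical stability
        p = np.exp(log_p)
        p /= p.sum(axis=1, keepdims=True)
        # q-step: labeled evidence plus neighbor agreement, renormalized.
        num = R * is_labeled[:, None] + mu * (W.T @ p)
        q = num / (is_labeled + mu * col_deg + 1e-12)[:, None]
    return p, q

# Toy usage: a 4-node chain with the endpoints labeled.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
R = np.zeros((4, 2))
R[0, 0] = 1.0   # node 0 labeled class 0
R[3, 1] = 1.0   # node 3 labeled class 1
p, q = measure_propagation(W, R, labeled=[0, 3])
print(p.argmax(axis=1))  # labels propagate along the chain: [0, 0, 1, 1]
```

One design point this sketch reflects: because both steps have closed forms and each node's update reads only its graph neighbors, the per-iteration work parallelizes across nodes, which is what makes the cache-cognizant node ordering mentioned in the abstract pay off at scale.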
Cite
Text
Subramanya and Bilmes. "Entropic Graph Regularization in Non-Parametric Semi-Supervised Classification." Neural Information Processing Systems, 2009.
Markdown
[Subramanya and Bilmes. "Entropic Graph Regularization in Non-Parametric Semi-Supervised Classification." Neural Information Processing Systems, 2009.](https://mlanthology.org/neurips/2009/subramanya2009neurips-entropic/)
BibTeX
@inproceedings{subramanya2009neurips-entropic,
title = {{Entropic Graph Regularization in Non-Parametric Semi-Supervised Classification}},
author = {Subramanya, Amarnag and Bilmes, Jeff A.},
booktitle = {Neural Information Processing Systems},
year = {2009},
pages = {1803--1811},
url = {https://mlanthology.org/neurips/2009/subramanya2009neurips-entropic/}
}