Learning Large Logic Programs by Going Beyond Entailment
Cite

Text:
Cropper and Dumancic. "Learning Large Logic Programs by Going Beyond Entailment." International Joint Conference on Artificial Intelligence, 2020. doi:10.24963/IJCAI.2020/287

Markdown:
[Cropper and Dumancic. "Learning Large Logic Programs by Going Beyond Entailment." International Joint Conference on Artificial Intelligence, 2020.](https://mlanthology.org/ijcai/2020/cropper2020ijcai-learning/) doi:10.24963/IJCAI.2020/287

BibTeX:
@inproceedings{cropper2020ijcai-learning,
title = {{Learning Large Logic Programs by Going Beyond Entailment}},
author = {Cropper, Andrew and Dumancic, Sebastijan},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2020},
  pages = {2073--2079},
doi = {10.24963/IJCAI.2020/287},
url = {https://mlanthology.org/ijcai/2020/cropper2020ijcai-learning/}
}