Softassign Versus SoftMax: Benchmarks in Combinatorial Optimization
Abstract
A new technique, termed softassign, is applied for the first time to two classic combinatorial optimization problems: the traveling salesman problem and graph partitioning. Softassign, which has emerged from the recurrent neural network/statistical physics framework, enforces two-way (assignment) constraints without the use of penalty terms in the energy functions. Softassign can also be generalized from two-way winner-take-all constraints to the multiple membership constraints required for graph partitioning. The softassign technique is compared to the softmax (Potts glass). Within the statistical physics framework, softmax combined with a penalty term has been a widely used method for enforcing the two-way constraints common to many combinatorial optimization problems. The benchmarks present evidence that softassign has clear advantages in accuracy, speed, parallelizability, and algorithmic simplicity over softmax with a penalty term in optimization problems with two-way constraints.
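The core contrast in the abstract can be illustrated concretely. Softmax normalizes each row of a benefit matrix independently, so only one-way (row) constraints are satisfied; softassign exponentiates the matrix and then alternates row and column normalization (Sinkhorn balancing) until the matrix is approximately doubly stochastic, enforcing both constraint sets without a penalty term. The sketch below is illustrative, not the paper's exact algorithm; the function names, the choice of β, and the iteration count are assumptions.

```python
import numpy as np

def softmax(Q, beta):
    """Row-wise softmax: enforces only the one-way (row-sum) constraints."""
    # Subtract the row max for numerical stability; does not change the result.
    e = np.exp(beta * (Q - Q.max(axis=1, keepdims=True)))
    return e / e.sum(axis=1, keepdims=True)

def softassign(Q, beta, n_iters=50):
    """Softassign sketch: exponentiate, then Sinkhorn-balance rows and
    columns so the matrix approaches doubly stochastic form, enforcing
    the two-way assignment constraints without penalty terms."""
    M = np.exp(beta * (Q - Q.max()))  # strictly positive matrix
    for _ in range(n_iters):
        M = M / M.sum(axis=1, keepdims=True)  # normalize rows
        M = M / M.sum(axis=0, keepdims=True)  # normalize columns
    return M

# Example: a random 5x5 benefit matrix.
rng = np.random.default_rng(0)
Q = rng.standard_normal((5, 5))
M = softassign(Q, beta=5.0)
S = softmax(Q, beta=5.0)
```

As β increases (deterministic annealing), the doubly stochastic matrix produced by softassign approaches a hard permutation, i.e. a one-to-one assignment.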
Cite
Text
Gold, Steven and Rangarajan, Anand. "Softassign Versus SoftMax: Benchmarks in Combinatorial Optimization." Neural Information Processing Systems, 1995.
Markdown
[Gold and Rangarajan. "Softassign Versus SoftMax: Benchmarks in Combinatorial Optimization." Neural Information Processing Systems, 1995.](https://mlanthology.org/neurips/1995/gold1995neurips-softassign/)
BibTeX
@inproceedings{gold1995neurips-softassign,
title = {{Softassign Versus SoftMax: Benchmarks in Combinatorial Optimization}},
author = {Gold, Steven and Rangarajan, Anand},
booktitle = {Neural Information Processing Systems},
year = {1995},
pages = {626--632},
url = {https://mlanthology.org/neurips/1995/gold1995neurips-softassign/}
}