Learning from Less: Bayesian Neural Networks for Optimization Proxy Using Limited Labeled Data

Abstract

This work introduces a learning scheme that uses Bayesian Neural Networks (BNNs) to solve constrained optimization problems in a setting with limited labeled data and restricted model training time. We propose a Semi-Supervised BNN for this practical but complex regime, in which training proceeds in a sandwiched fashion, alternating between a supervised learning step (using labeled data) that minimizes cost and an unsupervised learning step (using unlabeled data) that enforces constraint feasibility. Both the supervised and unsupervised steps follow a Bayesian approach, with variational inference used for approximate Bayesian inference. We show that the proposed semi-supervised learning method outperforms conventional BNN and deep neural network (DNN) architectures on important non-convex constrained optimization problems from energy network operations, achieving a 50% reduction in mean square error (MSE) and halving the optimality and feasibility gaps, without requiring correction or projection steps.
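The sketch below is a minimal illustration, not the authors' implementation, of the alternating training scheme described in the abstract: a mean-field variational BNN trained with a supervised step on labeled (input, optimal-solution) pairs and an unsupervised step that penalizes constraint violation on unlabeled inputs. PyTorch, the class and function names (BayesianLinear, BNNProxy, constraint_violation), the box-constraint penalty, and all hyperparameters are assumptions chosen for illustration.

import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesianLinear(nn.Module):
    """Mean-field Gaussian variational layer (reparameterization trick). Illustrative only."""
    def __init__(self, in_features, out_features, prior_std=1.0):
        super().__init__()
        self.w_mu = nn.Parameter(torch.zeros(out_features, in_features))
        self.w_rho = nn.Parameter(torch.full((out_features, in_features), -3.0))
        self.b_mu = nn.Parameter(torch.zeros(out_features))
        self.b_rho = nn.Parameter(torch.full((out_features,), -3.0))
        self.prior_std = prior_std

    def forward(self, x):
        # Sample weights from the variational posterior q(w) = N(mu, softplus(rho)^2).
        w = self.w_mu + F.softplus(self.w_rho) * torch.randn_like(self.w_rho)
        b = self.b_mu + F.softplus(self.b_rho) * torch.randn_like(self.b_rho)
        return F.linear(x, w, b)

    def kl(self):
        # KL(q(w) || N(0, prior_std^2)) for a diagonal Gaussian posterior.
        def _kl(mu, std):
            return (torch.log(self.prior_std / std)
                    + (std ** 2 + mu ** 2) / (2 * self.prior_std ** 2) - 0.5).sum()
        return _kl(self.w_mu, F.softplus(self.w_rho)) + _kl(self.b_mu, F.softplus(self.b_rho))

class BNNProxy(nn.Module):
    """Small BNN optimization proxy mapping problem inputs to predicted solutions."""
    def __init__(self, n_in, n_out, hidden=64):
        super().__init__()
        self.l1 = BayesianLinear(n_in, hidden)
        self.l2 = BayesianLinear(hidden, n_out)

    def forward(self, x):
        return self.l2(torch.relu(self.l1(x)))

    def kl(self):
        return self.l1.kl() + self.l2.kl()

def constraint_violation(x, y_pred):
    # Placeholder feasibility penalty; the real constraints are problem-specific
    # (here: box constraints |y| <= 1, purely for illustration).
    return torch.relu(y_pred.abs() - 1.0).mean()

def train(model, labeled_loader, unlabeled_loader, epochs=10, kl_weight=1e-3):
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(epochs):
        # Supervised step: minimize cost-prediction error on the labeled data plus KL term.
        for x, y in labeled_loader:
            loss = F.mse_loss(model(x), y) + kl_weight * model.kl()
            opt.zero_grad(); loss.backward(); opt.step()
        # Unsupervised step: push predictions on unlabeled inputs toward feasibility plus KL term.
        for (x,) in unlabeled_loader:
            loss = constraint_violation(x, model(x)) + kl_weight * model.kl()
            opt.zero_grad(); loss.backward(); opt.step()
    return model

In this sketch the two steps share one optimizer and alternate every epoch; the paper's specific loss weighting, constraint encoding, and variational family may differ.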

Cite

Text

Pareek et al. "Learning from Less: Bayesian Neural Networks for Optimization Proxy Using Limited Labeled Data." NeurIPS 2024 Workshops: BDU, 2024.

Markdown

[Pareek et al. "Learning from Less: Bayesian Neural Networks for Optimization Proxy Using Limited Labeled Data." NeurIPS 2024 Workshops: BDU, 2024.](https://mlanthology.org/neuripsw/2024/pareek2024neuripsw-learning/)

BibTeX

@inproceedings{pareek2024neuripsw-learning,
  title     = {{Learning from Less: Bayesian Neural Networks for Optimization Proxy Using Limited Labeled Data}},
  author    = {Pareek, Parikshit and Sundar, Kaarthik and Deka, Deepjyoti and Misra, Sidhant},
  booktitle = {NeurIPS 2024 Workshops: BDU},
  year      = {2024},
  url       = {https://mlanthology.org/neuripsw/2024/pareek2024neuripsw-learning/}
}