Publication Type
Journal Article
Version
acceptedVersion
Publication Date
8-2025
Abstract
Stochastic integer programs (SIPs) are notoriously difficult to solve due to their high computational complexity. To solve two-stage SIPs efficiently, we propose a conditional variational autoencoder (CVAE) for scenario representation learning. A graph convolutional network (GCN) based VAE embeds scenarios into a low-dimensional latent space, conditioned on the deterministic context of each instance. With the latent representations of stochastic scenarios, we perform two auxiliary tasks, objective prediction and scenario contrast, which predict scenario objective values and the similarities between scenarios, respectively. These tasks further integrate objective information into the representations through gradient backpropagation. Experiments show that the learned scenario representations help reduce the number of scenarios in SIPs, yielding high-quality solutions within a short computational time. This advantage generalizes well to instances with larger sizes, more scenarios, and various distributions.
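The abstract describes a GCN-based conditional VAE with two auxiliary heads (objective prediction and scenario contrast). The following is a minimal PyTorch sketch of such an architecture, not the authors' implementation; all module names, dimensions, the mean-pooled readout, and the simple normalized-adjacency convolution are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GCNLayer(nn.Module):
    # One graph convolution: symmetrically normalized adjacency followed by a linear map.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # x: (N, in_dim) node features; adj: (N, N) adjacency with self-loops
        deg = adj.sum(-1).clamp(min=1.0)
        norm = deg.pow(-0.5)
        adj_norm = norm.unsqueeze(-1) * adj * norm.unsqueeze(-2)
        return F.relu(self.lin(adj_norm @ x))

class ScenarioCVAE(nn.Module):
    # Hypothetical sketch: encode a scenario graph into a latent code conditioned on the
    # instance's deterministic context; two auxiliary heads output a predicted objective
    # value and a projected embedding for contrastive similarity.
    def __init__(self, node_dim, ctx_dim, hid_dim=64, z_dim=16):
        super().__init__()
        self.gcn1 = GCNLayer(node_dim, hid_dim)
        self.gcn2 = GCNLayer(hid_dim, hid_dim)
        self.to_mu = nn.Linear(hid_dim + ctx_dim, z_dim)
        self.to_logvar = nn.Linear(hid_dim + ctx_dim, z_dim)
        self.decoder = nn.Sequential(
            nn.Linear(z_dim + ctx_dim, hid_dim), nn.ReLU(),
            nn.Linear(hid_dim, node_dim),        # reconstruct a pooled scenario feature vector
        )
        self.obj_head = nn.Sequential(           # auxiliary task 1: objective prediction
            nn.Linear(z_dim, hid_dim), nn.ReLU(), nn.Linear(hid_dim, 1)
        )
        self.proj_head = nn.Linear(z_dim, z_dim)  # auxiliary task 2: contrastive projection

    def forward(self, x, adj, ctx):
        h = self.gcn2(self.gcn1(x, adj), adj).mean(dim=0)      # mean-pooled scenario embedding
        h = torch.cat([h, ctx], dim=-1)                        # condition on deterministic context
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterization trick
        recon = self.decoder(torch.cat([z, ctx], dim=-1))
        return recon, mu, logvar, self.obj_head(z), self.proj_head(z)

In training, one would presumably combine the VAE reconstruction and KL terms with an objective-regression loss and a contrastive loss over the projected embeddings; the exact loss weighting is not specified in the abstract.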
Keywords
Conditional variational autoencoder, Contrastive learning, Semi-supervised learning, Stochastic integer programs
Discipline
Artificial Intelligence and Robotics | Theory and Algorithms
Research Areas
Intelligent Systems and Optimization
Publication
Neural Networks
Volume
188
First Page
1
Last Page
15
ISSN
0893-6080
Identifier
10.1016/j.neunet.2025.107446
Publisher
Elsevier
Citation
WU, Yaoxin; CAO, Zhiguang; SONG, Wen; and ZHANG, Yingqian.
Solving two-stage stochastic integer programs via representation learning. (2025). Neural Networks. 188, 1-15.
Available at: https://ink.library.smu.edu.sg/sis_research/10158
Copyright Owner and License
Authors
Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://doi.org/10.1016/j.neunet.2025.107446