
Abstract

Belief Propagation (BP) is a fundamental heuristic for solving Constraint Optimization Problems (COPs), yet its practical applicability is constrained by slow convergence and instability on loopy factor graphs. While Damped BP (DBP) improves convergence by using manually tuned damping factors, its reliance on labor-intensive hyperparameter optimization limits scalability. Deep Attentive BP (DABP) automates damping with recurrent neural networks (RNNs), but introduces significant memory overhead and a sequential computation bottleneck. To reduce memory usage and accelerate deep belief propagation, this paper introduces Fast Deep Belief Propagation (FDBP), a deep learning framework that improves COP solving through online self-supervised learning and graphics processing unit (GPU) acceleration. FDBP decouples the learning of damping factors from BP message passing, inferring all parameters for an entire BP iteration in a single step, and leverages mixed precision to further reduce GPU memory usage. This design substantially improves both the efficiency and scalability of BP optimization. Extensive evaluations on synthetic and real-world benchmarks highlight the superiority of FDBP, especially on large-scale instances where DABP fails due to memory constraints; moreover, FDBP achieves an average speedup of 2.87× over DABP under the same number of restarts. Because BP for COPs is a mathematically grounded, GPU-parallel message-passing framework that bridges applied mathematics, computing, and machine learning and is widely applicable across science and engineering, our work offers a promising step toward more efficient solutions to these problems.
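To make the damping mechanism described in the abstract concrete, below is a minimal PyTorch-style sketch of a damped BP update in which the damping factors for all messages of one iteration are inferred in a single forward pass, wrapped in mixed precision. This is an illustrative assumption, not the authors' implementation: the network `damping_net`, the feature tensor `edge_features`, and all shapes are hypothetical stand-ins for whatever FDBP actually uses.

```python
# Minimal sketch (assumption, not the authors' code) of the damped-BP update that
# FDBP-style methods parameterize. Two ideas from the abstract are illustrated:
#   1) damping factors for all messages of one BP iteration are inferred in a
#      single step (no per-message recurrent unrolling), and
#   2) mixed precision is used to reduce GPU memory.

import torch

torch.manual_seed(0)

n_edges, domain = 1024, 8                      # hypothetical factor-graph sizes
old_msgs = torch.randn(n_edges, domain)        # messages from the previous iteration
new_msgs = torch.randn(n_edges, domain)        # freshly computed (undamped) BP messages

# Stand-in for the network that maps per-edge features to damping factors;
# a single linear layer plays that role here purely for illustration.
edge_features = torch.randn(n_edges, 16)
damping_net = torch.nn.Sequential(torch.nn.Linear(16, 1), torch.nn.Sigmoid())

device = "cuda" if torch.cuda.is_available() else "cpu"
amp_dtype = torch.float16 if device == "cuda" else torch.bfloat16

old_msgs, new_msgs = old_msgs.to(device), new_msgs.to(device)
edge_features, damping_net = edge_features.to(device), damping_net.to(device)

# Mixed precision: autocast runs the network (and, on GPU, the large tensors)
# in half precision, one way to realize the memory savings the abstract mentions.
with torch.autocast(device_type=device, dtype=amp_dtype):
    lam = damping_net(edge_features)                    # (n_edges, 1) damping factors in (0, 1)
    damped = lam * old_msgs + (1.0 - lam) * new_msgs    # standard damped-BP convex mix

print(damped.shape, damped.dtype)
```

Because every message is updated by the same elementwise expression, the whole iteration is a single batched GPU operation rather than a sequential RNN rollout, which is the efficiency contrast with DABP that the abstract draws.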

Details

Title
Fast Deep Belief Propagation: An Efficient Learning-Based Algorithm for Solving Constraint Optimization Problems
Author
Kong, Shufeng 1; Chen, Feifan 2; Wang, Zijie 2; Liu, Caihua 3

School of Software Engineering, Sun Yat-sen University, Zhuhai 519000, China; [email protected] (S.K.)
Department of Computer Science, Cornell University, Ithaca, NY 14850, USA
Publication title
Mathematics
Volume
13
Issue
20
First page
3349
Number of pages
19
Publication year
2025
Publication date
2025
Publisher
MDPI AG
Place of publication
Basel
Country of publication
Switzerland
e-ISSN
2227-7390
Source type
Scholarly Journal
Language of publication
English
Document type
Journal Article
Publication history
Online publication date
2025-10-21
Milestone dates
2025-09-20 (Received); 2025-10-18 (Accepted)
First posting date
21 Oct 2025
ProQuest document ID
3265921024
Document URL
https://www.proquest.com/scholarly-journals/fast-deep-belief-propagation-efficient-learning/docview/3265921024/se-2?accountid=208611
Copyright
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Last updated
2025-10-28
Database
ProQuest One Academic