Combining Local Symmetry Exploitation and Reinforcement Learning for Optimised Probabilistic Inference - A Work In Progress

Hamid, Sagad; Braun, Tanya

Research article in digital collection | Preprint | Peer reviewed

Abstract

Efficient probabilistic inference by variable elimination in graphical models requires an optimal elimination order. However, finding an optimal order is a challenging combinatorial optimisation problem for models with a large number of random variables. Recently, a reinforcement learning approach has been proposed to find efficient contraction orders in tensor networks. Due to the duality between graphical models and tensor networks, we adapt this approach to probabilistic inference in graphical models. Furthermore, we incorporate structure exploitation into the process of finding an optimal order. Currently, the agent’s cost function is formulated in terms of intermediate result sizes, which are exponential in the number of indices (i.e., random variables). We show that leveraging specific structures during inference allows for compact encodings of intermediate results that can be significantly smaller. By using these compact encoding sizes in the cost function instead, we enable the agent to explore more efficient contraction orders. The structure we consider in this work is the presence of local symmetries (i.e., symmetries within a model’s factors).
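To make the cost notion concrete, the following is a minimal sketch (not the authors' implementation) of how the cost of an elimination order can be measured either by full intermediate table sizes or by a compact, symmetry-aware count. The binary domains, the use of a fully exchangeable factor as the local symmetry, and all function names are illustrative assumptions.

```python
# Minimal sketch: cost of an elimination order measured by intermediate
# result sizes. Binary domains and fully exchangeable intermediate factors
# are illustrative assumptions, not the paper's actual encoding.
from math import comb

def table_size(scope, domain_size=2):
    # Full factor table over the scope: exponential in the number of variables.
    return domain_size ** len(scope)

def exchangeable_size(scope, domain_size=2):
    # Distinct entries of a fully exchangeable factor: only the histogram of
    # assigned values matters, i.e. C(n + d - 1, d - 1) entries instead of d^n.
    n, d = len(scope), domain_size
    return comb(n + d - 1, d - 1)

def order_cost(scopes, order, size_fn=table_size):
    # Sum the sizes of the intermediate factors created while eliminating the
    # variables in 'order' from a list of factor scopes.
    scopes = [set(s) for s in scopes]
    cost = 0
    for var in order:
        involved = [s for s in scopes if var in s]
        merged = set().union(*involved) - {var}  # scope after summing out var
        cost += size_fn(merged)
        scopes = [s for s in scopes if var not in s] + [merged]
    return cost

# Eliminating the hub of a small star model A - {B, C, D}: the intermediate
# factor over {B, C, D} has 8 full-table entries but only 4 distinct ones if
# it is fully exchangeable, so the symmetry-aware cost of the order is lower.
factors = [{"A", "B"}, {"A", "C"}, {"A", "D"}]
print(order_cost(factors, ["A", "B", "C", "D"]))                     # full tables: 15
print(order_cost(factors, ["A", "B", "C", "D"], exchangeable_size))  # compact sizes: 10
```

In such a setting, the reinforcement learning agent would receive the compact encoding sizes rather than the full table sizes as its cost signal, which can make orders attractive that look expensive under the naive exponential measure.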

Details about the publication

Name of the repository: arXiv
Article number: 2503.08786
Status: Published
Release year: 2025
Language in which the publication is written: English
Link to the full text: https://arxiv.org/pdf/2503.08786
Keywords: variable elimination; reinforcement learning; probabilistic inference; tensor networks

Authors from the University of Münster

Braun, Tanya
Junior professorship for practical computer science - modern aspects of data processing / data science (Prof. Braun)
Hamid, Sagad
Junior professorship for practical computer science - modern aspects of data processing / data science (Prof. Braun)