
Exploring the Power of Graph Neural Networks in Solving Linear Optimization Problems

Chendi Qian · Didier Chételat · Christopher Morris

MR1 & MR2 - Number 145
Fri 3 May 8 a.m. PDT — 8:30 a.m. PDT


Recently, machine learning, particularly message-passing graph neural networks (MPNNs), has gained traction in enhancing exact optimization algorithms. For example, MPNNs speed up solving mixed-integer optimization problems by imitating computationally intensive heuristics such as strong branching, which entails solving multiple linear optimization problems (LPs). Despite this empirical success, the reasons behind MPNNs' effectiveness in emulating linear optimization remain largely unclear. Here, we show that MPNNs can simulate standard interior-point methods for LPs, explaining their practical success. Furthermore, we highlight how MPNNs can serve as a lightweight proxy for solving LPs, adapting to a given problem-instance distribution. Empirically, we show that MPNNs solve LP relaxations of standard combinatorial optimization problems close to optimality, often surpassing conventional solvers and competing approaches in solving time.
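To make the setting concrete, the following is a minimal sketch (not the paper's architecture) of the standard bipartite-graph encoding of an LP that MPNN-based approaches typically operate on: one node per variable, one node per constraint, and an edge for each nonzero coefficient of the constraint matrix. The fixed `tanh` update is a stand-in for the learned update maps a real MPNN would use.

```python
import numpy as np

def mpnn_round(A, b, c):
    """One simplistic message-passing round on the bipartite graph of the LP
    min c^T x  s.t.  A x <= b,  x >= 0  (illustrative, not a trained model)."""
    # Initial node features: variables carry their objective coefficient,
    # constraints carry their right-hand side.
    h_var = c.astype(float).copy()   # shape (n,)
    h_con = b.astype(float).copy()   # shape (m,)
    # Constraint -> variable messages, weighted by the coefficients A_ij.
    msg_to_var = A.T @ h_con         # shape (n,)
    # Variable -> constraint messages.
    msg_to_con = A @ h_var           # shape (m,)
    # Update with a fixed nonlinearity; a real MPNN learns these maps.
    h_var = np.tanh(h_var + msg_to_var)
    h_con = np.tanh(h_con + msg_to_con)
    return h_var, h_con

# Toy LP with two variables and two constraints.
A = np.array([[1.0, 2.0], [3.0, 0.0]])
b = np.array([4.0, 6.0])
c = np.array([1.0, 1.0])
h_var, h_con = mpnn_round(A, b, c)
print(h_var.shape, h_con.shape)  # (2,) (2,)
```

Stacking several such rounds and reading out the variable embeddings yields an approximate LP solution; the paper's theoretical result is that this message-passing scheme is expressive enough to simulate the iterations of a standard interior-point method.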
