

Poster

Stochastic Gradient Descent for Bézier Simplex Representation of Pareto Set in Multi-Objective Optimization

Yasunari Hikima · Jian Wu · Naonori Ueda · Akiyoshi Sannai · Naoki Hamada


Abstract: Multi-objective optimization aims to find a set of solutions that achieves the best trade-offs among multiple conflicting objective functions. While various multi-objective optimization algorithms have been proposed, most of them aim to find a finite set of solutions as an approximation of the Pareto set, which may not adequately capture the entire structure of the Pareto set, especially when the number of variables is large. To overcome this limitation, we propose a method to obtain a parametric hypersurface representing the entire Pareto set instead of a finite set of points. Since the Pareto set of an M-objective optimization problem typically forms an (M−1)-dimensional simplex, we use a Bézier simplex as a model to express the Pareto set. We then develop a stochastic gradient descent-based algorithm that updates the Bézier simplex model toward the Pareto set, introducing a preconditioning matrix to enhance convergence. Our convergence analysis demonstrates that the proposed algorithm outperforms naive stochastic gradient descent in terms of convergence rate. Furthermore, we validate the effectiveness of our method on various multi-objective optimization problem instances, including real-world problems.
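To make the modeling idea concrete, the sketch below evaluates a Bézier simplex — a weighted sum of control points under the multivariate Bernstein basis over the (M−1)-dimensional simplex — and fits its control points to sampled (parameter, point) pairs by plain stochastic gradient descent on squared error. This is an illustrative assumption-laden sketch, not the paper's algorithm: in particular it omits the preconditioning matrix, and all function names and hyperparameters here are invented for the example.

```python
import itertools
import math
import numpy as np

def multi_indices(m, degree):
    """All multi-indices d = (d_1, ..., d_m) with d_1 + ... + d_m = degree."""
    return [d for d in itertools.product(range(degree + 1), repeat=m)
            if sum(d) == degree]

def bernstein(t, d):
    """Multivariate Bernstein basis B_d(t) on the simplex (t >= 0, sum(t) = 1)."""
    degree = sum(d)
    coef = math.factorial(degree)
    for di in d:
        coef //= math.factorial(di)
    return coef * float(np.prod(np.power(t, d)))

def bezier_simplex(t, control):
    """Evaluate b(t) = sum_d B_d(t) p_d, with control: multi-index -> point."""
    return sum(bernstein(t, d) * p for d, p in control.items())

def sgd_fit(samples, m, degree, dim, lr=0.1, epochs=200, seed=0):
    """Fit control points to (t_n, x_n) samples with plain SGD on squared error.
    Sketch only: no preconditioning matrix, unlike the proposed algorithm."""
    rng = np.random.default_rng(seed)
    control = {d: rng.normal(size=dim) for d in multi_indices(m, degree)}
    for _ in range(epochs):
        for t, x in samples:
            residual = bezier_simplex(t, control) - x
            for d in control:  # gradient of ||b(t) - x||^2 w.r.t. p_d
                control[d] -= lr * 2.0 * bernstein(t, d) * residual
    return control
```

For M = 2 objectives the simplex degenerates to a Bézier curve, so the model interpolates its corner control points at the simplex vertices, e.g. `bezier_simplex(np.array([1.0, 0.0]), control)` returns the control point indexed by `(degree, 0)`.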
