

Lower-level Duality Based Reformulation and Majorization Minimization Algorithm for Hyperparameter Optimization

He Chen · Haochen Xu · Rujun Jiang · Anthony Man-Cho So

MR1 & MR2 - Number 166
Fri 3 May 8 a.m. PDT — 8:30 a.m. PDT


Hyperparameter tuning is an important task in machine learning that can be formulated as a bilevel program (BLP). However, most existing algorithms are not applicable to BLPs with non-smooth lower-level problems. To address this, we propose a single-level reformulation of the BLP based on lower-level duality that does not involve any implicit value function. To solve the reformulation, we propose a majorization minimization algorithm that majorizes the constraint in each iteration. Furthermore, we show that the subproblems of the proposed algorithm for several widely used hyperparameter tuning models can be reformulated into conic programs that can be efficiently solved by off-the-shelf solvers. We theoretically prove the convergence of the proposed algorithm and demonstrate its superiority through numerical experiments.
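The duality-based reformulation described in the abstract can be sketched in standard bilevel notation. The symbols below ($F$, $f$, $d$, $\lambda$, $w$, $u$) are illustrative assumptions chosen for this sketch, not necessarily the paper's own notation:

```latex
% Bilevel program (BLP): hyperparameters \lambda in the upper level,
% model parameters w in the (possibly non-smooth) lower level.
\begin{align*}
\min_{\lambda,\, w} \quad & F(\lambda, w)
  && \text{(upper level, e.g.\ validation loss)} \\
\text{s.t.} \quad & w \in \operatorname*{arg\,min}_{w'} f(\lambda, w')
  && \text{(lower level, e.g.\ training problem)}
\end{align*}

% Assuming the lower level is convex and strong duality holds, let
% d(\lambda, u) denote its Lagrangian dual function. Optimality of w
% is then equivalent to a zero duality gap, which yields a
% single-level reformulation with no implicit value function:
\begin{align*}
\min_{\lambda,\, w,\, u} \quad & F(\lambda, w) \\
\text{s.t.} \quad & f(\lambda, w) - d(\lambda, u) \le 0
\end{align*}
% A majorization minimization scheme replaces this (generally
% nonconvex) gap constraint with a convex majorizer at each iterate
% and solves the resulting subproblem, e.g.\ as a conic program.
```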
