Alternating Direction Method of Multipliers for Quantization

Tianjian Huang · Prajwal Singhania · Maziar Sanjabi · Pabitra Mitra · Meisam Razaviyayn

Keywords: [ Applications ] [ Privacy, Anonymity, and Security ] [ Theory ] [ Algorithms, Optimization and Computation Methods ] [ Nonconvex Optimization ]

[ Abstract ]
Tue 13 Apr 2 p.m. PDT — 4 p.m. PDT


Quantization of the parameters of machine learning models, such as deep neural networks, requires solving constrained optimization problems in which the constraint set is formed by the Cartesian product of many simple discrete sets. For such optimization problems, we study the performance of the Alternating Direction Method of Multipliers for Quantization (ADMM-Q) algorithm, a variant of the widely used ADMM method applied to our discrete optimization problem. We establish the convergence of the iterates of ADMM-Q to certain stationary points. To the best of our knowledge, this is the first analysis of an ADMM-type method for problems with discrete variables/constraints. Based on our theoretical insights, we develop several variants of ADMM-Q that can handle inexact update rules and that achieve improved performance via the use of "soft projection" and by injecting randomness into the algorithm. We empirically evaluate the efficacy of our proposed approaches.
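The splitting idea behind an ADMM-type quantization scheme can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, the toy quadratic objective, and the uniform quantization grid are all our own assumptions. The problem min f(x) s.t. x ∈ Q is rewritten as min f(x) s.t. x = z, z ∈ Q, and the iterations alternate a continuous x-update, a projection of z onto the discrete set, and a dual ascent step.

```python
import numpy as np

def nearest_grid_point(v, step=0.5):
    """Euclidean projection onto the discrete set Q = {k * step : k integer}.

    (Hypothetical choice of Q for illustration; in general Q is any
    Cartesian product of simple discrete sets, projected elementwise.)
    """
    return step * np.round(v / step)

def admm_quantize(A, b, rho=1.0, step=0.5, iters=100):
    """Sketch of an ADMM-type loop for min 0.5*||Ax - b||^2 s.t. x in Q.

    Splitting: min f(x) s.t. x = z, z in Q, with multiplier lam.
    """
    n = A.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    lam = np.zeros(n)  # dual variable for the constraint x = z
    AtA = A.T @ A
    Atb = A.T @ b
    # For this quadratic f, the x-update
    #   x = argmin f(x) + (rho/2) * ||x - z + lam/rho||^2
    # has the closed form (A^T A + rho*I)^{-1} (A^T b + rho*z - lam).
    M = np.linalg.inv(AtA + rho * np.eye(n))
    for _ in range(iters):
        x = M @ (Atb + rho * z - lam)                # continuous update
        z = nearest_grid_point(x + lam / rho, step)  # projection onto Q
        lam = lam + rho * (x - z)                    # dual ascent
    return x, z

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = nearest_grid_point(rng.standard_normal(5))
b = A @ x_true  # realizable target for the toy problem
x, z = admm_quantize(A, b)
print(z)  # z lies on the quantization grid by construction
```

The "soft projection" and randomness-injection variants mentioned in the abstract would modify the z-update (e.g., a relaxed projection or a perturbed projection point); the sketch above shows only the plain hard-projection iteration.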
