

DNNLasso: Scalable Graph Learning for Matrix-Variate Data

Meixia Lin · Yangjing Zhang

MR1 & MR2 - Number 146
Thu 2 May 8 a.m. PDT — 8:30 a.m. PDT


We consider the problem of jointly learning the row-wise and column-wise dependencies of matrix-variate observations, which are modelled separately by two precision matrices. Because the Kronecker-product precision matrices used in the common matrix-variate Gaussian graphical models have a complicated structure, a sparser Kronecker-sum structure, based on the Cartesian product of graphs, was proposed recently. However, existing methods for estimating Kronecker-sum structured precision matrices do not scale well to large-scale datasets. In this paper, we introduce DNNLasso, a diagonally non-negative graphical lasso model for estimating the Kronecker-sum structured precision matrix, which outperforms the state-of-the-art methods by a large margin in both accuracy and computational time.
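To see why the Kronecker sum is the sparser structure, recall that for a row precision matrix A (m×m) and a column precision matrix B (n×n), the Kronecker sum is A ⊕ B = A ⊗ I_n + I_m ⊗ B, whose nonzero pattern is the Cartesian product of the two graphs, whereas the Kronecker product A ⊗ B densifies every pair of edges. The following minimal NumPy sketch (not the authors' code; the function name and toy matrices are illustrative assumptions) compares the two on small tridiagonal precision matrices:

```python
import numpy as np

def kron_sum(A, B):
    """Kronecker sum A ⊕ B = A ⊗ I_n + I_m ⊗ B (illustrative helper)."""
    m, n = A.shape[0], B.shape[0]
    return np.kron(A, np.eye(n)) + np.kron(np.eye(m), B)

# Toy tridiagonal (hence sparse) precision matrices for rows and columns.
Theta = np.array([[2., -1., 0.],
                  [-1., 2., -1.],
                  [0., -1., 2.]])   # 3x3 row-wise precision
Psi = np.array([[2., -1.],
                [-1., 2.]])        # 2x2 column-wise precision

Omega_sum = kron_sum(Theta, Psi)   # 6x6 Kronecker-sum precision
Omega_prod = np.kron(Theta, Psi)   # 6x6 Kronecker-product precision

# The Kronecker sum keeps only "axis-aligned" dependencies, so it has
# fewer nonzeros than the Kronecker product on the same factors.
print(np.count_nonzero(Omega_sum), np.count_nonzero(Omega_prod))
```

The eigenvalues of A ⊕ B are all pairwise sums of the eigenvalues of A and B, so positive definiteness of both factors carries over to the joint precision matrix.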
