

Poster

Noisy Low-Rank Matrix Completion via Transformed L1 Regularization and its Theoretical Properties

Linlin Yu · Yang Luo


Abstract: This paper focuses on recovering an underlying matrix from its noisy partial entries, a problem commonly known as matrix completion. We investigate a non-convex regularization, referred to as transformed L1 (TL1), which interpolates between the rank and the nuclear norm of matrices through a hyper-parameter $a \in (0, \infty)$. While some literature adopts this regularization for matrix completion, it primarily addresses scenarios with uniformly missing entries and focuses on algorithmic advances. To fill this gap in the literature, we provide a comprehensive statistical analysis of the estimator from a TL1-regularized recovery model under a general sampling distribution. In particular, we show that when $a$ is sufficiently large, the matrix recovered by the TL1-based model enjoys a convergence rate, measured in the Frobenius norm, comparable to that of the nuclear-norm-based model, despite the challenges posed by the non-convexity of the TL1 regularization. When $a$ is small enough, we show that the rank of the estimated matrix remains of constant order when the true matrix is exactly low-rank. A trade-off between controlling the error and the rank is established through different choices of tuning parameters. The appealing practical performance of TL1 regularization is demonstrated through a simulation study encompassing various sampling mechanisms, as well as two real-world applications. Additionally, the role of the hyper-parameter $a$ in the TL1-based model is explored via experiments to offer guidance in practical scenarios.
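For intuition about the interpolation property described above, the following is a minimal sketch, assuming the commonly used transformed L1 penalty $\rho_a(x) = (a+1)|x| / (a + |x|)$ applied to the singular values of the matrix; the precise penalty and recovery model used in the paper are not spelled out in this abstract, so treat this as an illustration rather than the authors' implementation.

import numpy as np

def tl1_penalty(M, a):
    # TL1 penalty of a matrix: sum of rho_a(sigma_i) over its
    # singular values sigma_i, with rho_a(x) = (a+1)|x| / (a+|x|).
    sigma = np.linalg.svd(M, compute_uv=False)
    return np.sum((a + 1) * sigma / (a + sigma))

# Interpolation behavior on a synthetic rank-2 matrix:
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 50))

for a in (1e-3, 1.0, 1e3):
    print(f"a = {a:g}: TL1 = {tl1_penalty(M, a):.3f}")
# As a -> 0+, each nonzero singular value contributes ~1, so the
# penalty approaches rank(M) = 2; as a -> infinity, rho_a(x) -> |x|,
# so the penalty approaches the nuclear norm (sum of singular values).

The printout makes the trade-off in the abstract concrete: small $a$ pushes the penalty toward a rank surrogate, while large $a$ recovers nuclear-norm-like behavior.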
