Authors: Miao Xu, Rong Jin, Zhi-Hua Zhou
Where published: NeurIPS 2013
Abstract URL: http://papers.nips.cc/paper/4999-speedup-matrix-completion-with-side-information-application-to-multi-label-learning
In standard matrix completion theory, at least $O(n\ln^2 n)$ observed entries are required to perfectly recover a low-rank matrix $M$ of size $n\times n$, leading to a large number of observations when $n$ is large. In many real tasks, side information is often available in addition to the observed entries. In this work, we develop a novel theory of matrix completion that explicitly exploits the side information to reduce the required number of observed entries. We show that, under appropriate conditions, with the assistance of side information matrices, the number of observed entries needed for a perfect recovery of the matrix $M$ can be dramatically reduced to $O(\ln n)$. We demonstrate the effectiveness of the proposed approach for matrix completion in transductive incomplete multi-label learning.
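A back-of-the-envelope comparison may help convey the scale of the claimed improvement. The sketch below (an illustration, not code from the paper) evaluates $n\ln^2 n$ versus $\ln n$ for a few matrix sizes, ignoring the constants and the rank and side-information dimensions hidden by the asymptotic notation.

```python
import math

# Compare the two sample-complexity bounds quoted in the abstract, up to constants:
#   standard matrix completion:        O(n * ln(n)^2) observed entries
#   completion with side information:  O(ln(n)) observed entries
# Constants and rank/side-information factors are omitted (illustration only).
for n in (10**3, 10**4, 10**5, 10**6):
    standard = n * math.log(n) ** 2
    with_side_info = math.log(n)
    print(f"n = {n:>9,d}:  n*ln(n)^2 ~ {standard:16,.0f}   ln(n) ~ {with_side_info:5.1f}")
```

Even at $n = 10^6$, the standard bound grows into the hundreds of millions of entries while $\ln n$ stays below 14, which is the gap the side-information analysis is meant to exploit.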