Publication Type

Conference Proceeding Article

Version

acceptedVersion

Publication Date

7-2024

Abstract

We provide generalization bounds for matrix completion with Schatten $p$ quasi-norm constraints, which is equivalent to deep matrix factorization with Frobenius constraints. In the uniform sampling regime, the sample complexity scales like $\widetilde{O}(rn)$, where $n$ is the size of the matrix and $r$ is a constraint of the same order as the ground-truth rank in the isotropic case. In the distribution-free setting, the bounds scale as $\widetilde{O}\left(r^{1-\frac{p}{2}}n^{1+\frac{p}{2}}\right)$, which reduces to the familiar $\sqrt{r}n^{\frac{3}{2}}$ for $p=1$. Furthermore, we provide an analogue of the weighted trace norm for this setting, which brings the sample complexity down to $\widetilde{O}(nr)$ in all cases. We then present a non-linear model, Functionally Rescaled Matrix Completion (FRMC), which applies a single trainable function from $\mathbb{R}\rightarrow\mathbb{R}$ to each entry of a latent matrix, and prove that this adds only negligible terms to the overall sample complexity, whilst experiments demonstrate that this simple model improvement already leads to significant gains on real data. We also provide extensions of our results to various neural architectures, thereby providing the first comprehensive uniform convergence PAC analysis of neural network matrix completion.
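
For context, the equivalence stated above between Schatten $p$ quasi-norm constraints and deep matrix factorization with Frobenius constraints rests on a known variational identity, reproduced here for reference (it is a standard fact, not text quoted from this record): for integer depth $L$ and $p = \frac{2}{L}$,

$$\|X\|_{S_p}^{p} \;=\; \min_{W_1 W_2 \cdots W_L \,=\, X} \;\frac{1}{L} \sum_{i=1}^{L} \|W_i\|_F^{2},$$

so constraining the Frobenius norms of the $L$ factors of a linear network is the same as constraining the Schatten $\frac{2}{L}$ quasi-norm of the product matrix.

The FRMC model described in the abstract (a latent factorized matrix followed by a single trainable scalar function applied entrywise) can be sketched in a few lines of PyTorch. This is a minimal illustrative sketch, not the authors' implementation: the factor initialization, the choice of a small MLP as the trainable $\mathbb{R}\rightarrow\mathbb{R}$ function, and all hyperparameters below are hypothetical.

import torch
import torch.nn as nn

class FRMC(nn.Module):
    """Hypothetical sketch of Functionally Rescaled Matrix Completion:
    a depth-L factorization X = W_1 @ ... @ W_L (Frobenius-constrained
    factors correspond to a Schatten 2/L quasi-norm constraint on X),
    followed by one trainable scalar function R -> R, here a tiny MLP,
    applied to every entry of the latent matrix."""

    def __init__(self, n_rows, n_cols, rank, depth=3, hidden=16):
        super().__init__()
        dims = [n_rows] + [rank] * (depth - 1) + [n_cols]
        self.factors = nn.ParameterList(
            [nn.Parameter(0.1 * torch.randn(dims[i], dims[i + 1]))
             for i in range(depth)]
        )
        # A single function shared across all entries of the latent matrix.
        self.rescale = nn.Sequential(
            nn.Linear(1, hidden), nn.Tanh(), nn.Linear(hidden, 1)
        )

    def forward(self):
        latent = self.factors[0]
        for i in range(1, len(self.factors)):
            latent = latent @ self.factors[i]
        # Entrywise application of the trainable scalar function.
        return self.rescale(latent.reshape(-1, 1)).reshape(latent.shape)

# Toy usage: fit uniformly sampled entries of a synthetic low-rank matrix.
torch.manual_seed(0)
n, r = 60, 4
ground_truth = torch.randn(n, r) @ torch.randn(r, n)
mask = torch.rand(n, n) < 0.3  # observed entries
model = FRMC(n, n, rank=2 * r)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for step in range(1000):
    loss = ((model() - ground_truth)[mask] ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

Note that in this sketch the entrywise rescaling function adds only a constant number of parameters on top of the factorization, which is in line with the abstract's claim that the non-linearity contributes only negligible terms to the overall sample complexity.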

Keywords

Matrix Completion, Schatten p Quasi-Norms, Learning Theory

Discipline

Databases and Information Systems

Research Areas

Data Science and Engineering

Publication

Proceedings of the 41st International Conference on Machine Learning (ICML 2024), Vienna, Austria, July 21-27, 2024

Volume

235

First Page

26290

Last Page

26360

Publisher

PMLR

City or Country

Vienna

Additional URL

https://proceedings.mlr.press/v235/ledent24a.html
