Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

2-2023

Abstract

We study inductive matrix completion (matrix completion with side information) under an i.i.d. subgaussian noise assumption at a low noise regime, with uniform sampling of the entries. We obtain for the first time generalization bounds with the following three properties: (1) they scale like the standard deviation of the noise and in particular approach zero in the exact recovery case; (2) even in the presence of noise, they converge to zero as the sample size approaches infinity; and (3) for a fixed dimension of the side information, they have only a logarithmic dependence on the size of the matrix. Unlike many works on approximate recovery, we present results both for bounded Lipschitz losses and for the absolute loss, with the latter relying on Talagrand-type inequalities. The proofs create a bridge between two approaches to the theoretical analysis of matrix completion, since they combine techniques from both the exact recovery and the approximate recovery literatures.
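
As a concrete illustration of the setting described in the abstract, the following minimal sketch simulates the inductive matrix completion model: a ground-truth matrix generated from row and column side information, perturbed by i.i.d. Gaussian (hence subgaussian) noise with small standard deviation, with observed entries sampled uniformly at random. This sketch is not from the paper; all variable names and sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (hypothetical): an n1 x n2 target matrix with
# side-information dimensions d1, d2 much smaller than n1, n2.
n1, n2, d1, d2 = 500, 400, 10, 8

# Side-information feature matrices for the rows and the columns.
X = rng.standard_normal((n1, d1))
Y = rng.standard_normal((n2, d2))

# Low-dimensional core matrix; the ground truth is M = X @ A @ Y.T,
# the standard inductive matrix completion model.
A = rng.standard_normal((d1, d2))
M = X @ A @ Y.T

# i.i.d. Gaussian noise at a low noise regime; the bounds discussed
# in the abstract scale with this standard deviation sigma.
sigma = 0.01
Z = sigma * rng.standard_normal((n1, n2))

# Uniform sampling (without replacement) of m entries of the noisy matrix.
m = 5000
flat = rng.choice(n1 * n2, size=m, replace=False)
rows, cols = np.unravel_index(flat, (n1, n2))
observed = (M + Z)[rows, cols]  # learner sees (rows, cols, observed) plus X, Y
```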

Keywords

Recommender Systems, Matrix Completion, Learning Theory

Discipline

Artificial Intelligence and Robotics | Databases and Information Systems

Research Areas

Data Science and Engineering

Publication

Proceedings of the 37th AAAI Conference on Artificial Intelligence, Washington, DC, 2023 February 7-14

First Page

8447

Last Page

8455

Identifier

10.1609/aaai.v37i7.26018

Publisher

AAAI

City or Country

Washington, DC

Copyright Owner and License

Authors

Additional URL

https://doi.org/10.1609/aaai.v37i7.26018
