Publication Type

Conference Proceeding Article

Version

acceptedVersion

Publication Date

5-2019

Abstract

Factorization Machine (FM) is a general supervised learning framework for many AI applications, owing to its powerful capability for feature engineering. Despite being extensively studied, existing FM methods share several limitations. First, most existing FM methods adopt the squared loss in the modeling process, which is highly sensitive to noise and outliers in the training data. Second, some recent FM variants exploit the low-rank structure of the feature interaction matrix by relaxing the rank minimization problem into trace norm minimization, which does not always yield a tight approximation of the original problem. To address these issues, this paper proposes a new Robust Factorization Machine (RFM) scheme based on a doubly capped norms minimization approach, which employs a capped squared trace norm to achieve a tighter approximation of rank minimization and a capped ℓ1-norm loss to enhance the robustness of empirical loss minimization on noisy data. We develop an efficient algorithm for RFM with a rigorous convergence proof. Experiments on public real-world datasets show that our method significantly outperforms state-of-the-art FM methods.
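As context for the abstract, the sketch below illustrates the standard second-order FM prediction rule and a capped ℓ1-style loss of the kind described above. It is a minimal illustration under assumed conventions, not the authors' implementation: the function names, the cap threshold `epsilon`, and the NumPy formulation are assumptions introduced here.

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Second-order FM prediction:
    y_hat = w0 + <w, x> + sum_{i<j} <V[i], V[j]> * x_i * x_j,
    computed in O(k*d) time via the standard reformulation
    0.5 * sum_f [ (sum_i V[i,f] x_i)^2 - sum_i V[i,f]^2 x_i^2 ]."""
    linear = w0 + x @ w
    s = V.T @ x                      # shape (k,): factor-wise weighted sums
    s_sq = (V ** 2).T @ (x ** 2)     # shape (k,): factor-wise squared sums
    pairwise = 0.5 * np.sum(s ** 2 - s_sq)
    return linear + pairwise

def capped_l1_loss(y, y_hat, epsilon):
    """Capped l1 loss: min(|y - y_hat|, epsilon).
    Residuals beyond epsilon contribute only a constant,
    so outliers cannot dominate the objective.
    (epsilon is an illustrative hyperparameter, not from the paper.)"""
    return np.minimum(np.abs(y - y_hat), epsilon)

# Tiny usage example with random data.
rng = np.random.default_rng(0)
d, k = 10, 4
x = rng.normal(size=d)
w0, w, V = 0.1, rng.normal(size=d), rng.normal(size=(d, k))
print(capped_l1_loss(1.0, fm_predict(x, w0, w, V), epsilon=2.0))
```

The cap is what distinguishes this loss from the plain squared loss criticized in the abstract: past the threshold, a corrupted label adds no further gradient signal.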

Keywords

Factorization machines, Feature engineering, Feature interactions, Loss minimization, Modeling process, Rank minimization, Real-world datasets, State of the art

Discipline

Artificial Intelligence and Robotics | Databases and Information Systems | Software Engineering

Research Areas

Data Science and Engineering

Publication

Proceedings of the 2019 SIAM International Conference on Data Mining: Calgary, Canada, May 2-4

First Page

738

Last Page

746

ISBN

9781611975673

Identifier

10.1137/1.9781611975673.83

Publisher

SIAM

City or Country

Philadelphia, PA

Copyright Owner and License

Authors

Additional URL

https://doi.org/10.1137/1.9781611975673.83
