Publication Type

Conference Proceeding Article

Version

acceptedVersion

Publication Date

12-2022

Abstract

Vertical federated learning (VFL), where data features are distributed across multiple parties, is an important area in machine learning. However, the communication complexity of VFL is typically very high. In this paper, we propose a unified framework for communication-efficient VFL by constructing coresets in a distributed fashion. We study two important learning tasks in the VFL setting: regularized linear regression and $k$-means clustering, and apply our coreset framework to both problems. We theoretically show that using coresets can drastically reduce the communication complexity while nearly maintaining the solution quality. Numerical experiments are conducted to corroborate our theoretical findings.
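To make the coreset idea in the abstract concrete: a coreset is a small weighted subset of the data whose weighted objective value approximates the objective on the full dataset, so parties can communicate the subset instead of all points. The sketch below illustrates this for the $k$-means cost using plain uniform sampling with weights $n/m$; it is a minimal, self-contained illustration of the generic concept, not the paper's distributed construction, and all data and function names are made up for this example.

```python
import random

def kmeans_cost(points, centers, weights=None):
    # (Weighted) k-means cost: sum over points of (weight times)
    # the squared distance to the nearest center.
    total = 0.0
    for i, p in enumerate(points):
        d2 = min(sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers)
        total += d2 if weights is None else weights[i] * d2
    return total

def uniform_coreset(points, m, rng):
    # Uniform-sampling "coreset": m sampled points, each weighted n/m,
    # so the weighted cost is an unbiased estimate of the full cost.
    n = len(points)
    sample = rng.sample(points, m)
    return sample, [n / m] * m

rng = random.Random(0)
# Synthetic data: three well-separated 2-D clusters (for illustration only).
points = [(rng.gauss(c, 0.3), rng.gauss(c, 0.3))
          for c in (-3.0, 0.0, 3.0) for _ in range(500)]
centers = [(-3.0, -3.0), (0.0, 0.0), (3.0, 3.0)]

full_cost = kmeans_cost(points, centers)
core, w = uniform_coreset(points, m=150, rng=rng)
estimate = kmeans_cost(core, centers, w)
print(f"full cost: {full_cost:.1f}  coreset estimate: {estimate:.1f}")
```

Only the 150 weighted points need to be communicated to evaluate (approximately) the clustering cost, a 10x reduction here; the paper's framework achieves provable guarantees with more careful, importance-based sampling.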

Discipline

Databases and Information Systems

Research Areas

Data Science and Engineering; Intelligent Systems and Optimization

Publication

Proceedings of the 36th Conference on Neural Information Processing Systems (NeurIPS 2022), Hybrid Conference, November 28 - December 9, 2022

First Page

1

Last Page

32

Publisher

Neural Information Processing Systems Foundation

City or Country

New Orleans, USA

Additional URL

https://proceedings.neurips.cc/paper_files/paper/2022/hash/be7b70477c8fca697f14b1dbb1c086d1-Abstract-Conference.html
