Publication Type

Journal Article

Version

acceptedVersion

Publication Date

1-2025

Abstract

Federated Learning (FL) has emerged as a promising paradigm for collaborative model training across distributed clients while preserving data privacy. However, prevailing FL approaches aggregate the clients' local models into a global model through multi-round iterative parameter averaging, which biases the aggregated model towards certain clients when data distributions are heterogeneous across clients. Moreover, such approaches are restricted to supervised classification tasks and do not support unsupervised clustering. To address these limitations, we propose a novel one-shot FL approach called Federated Adaptive Resonance Theory (FedART), which leverages self-organizing Adaptive Resonance Theory (ART) models to learn category codes, where each code represents a cluster of similar data samples. In FedART, the clients learn to associate their private data with various local category codes. Under heterogeneity, the local codes across different clients represent heterogeneous data. In turn, a global model takes these local codes as inputs and aggregates them into global category codes, wherein heterogeneous client data is indirectly represented by distinctly encoded global codes, in contrast to the parameter averaging of existing approaches. This enables the learned global model to handle heterogeneous data. In addition, FedART employs a universal learning mechanism to support both federated classification and clustering tasks. Experiments on a range of federated classification and clustering tasks show that FedART consistently outperforms state-of-the-art FL methods on data that is heterogeneously distributed across clients.
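
For readers unfamiliar with the ART mechanics the abstract refers to, the following is a minimal sketch of the one-shot aggregation idea, assuming a Fuzzy-ART-style learner. The class, the hyperparameters (`rho`, `alpha`, `beta`), and the centre-of-hyperbox aggregation heuristic are illustrative assumptions for exposition, not the paper's exact algorithm.

```python
import numpy as np

class FuzzyART:
    """Minimal Fuzzy ART learner: each weight vector is a category code."""

    def __init__(self, rho=0.7, alpha=0.001, beta=1.0):
        self.rho, self.alpha, self.beta = rho, alpha, beta
        self.w = []  # category codes (complement-coded prototypes)

    @staticmethod
    def _complement_code(x):
        # Map x in [0,1]^d to [x, 1-x] so codes encode hyperboxes.
        return np.concatenate([x, 1.0 - x])

    def learn(self, x):
        i = self._complement_code(np.asarray(x, dtype=float))
        # Rank categories by the choice function T_j = |i ^ w_j| / (alpha + |w_j|).
        order = sorted(
            range(len(self.w)),
            key=lambda j: -(np.minimum(i, self.w[j]).sum()
                            / (self.alpha + self.w[j].sum())))
        for j in order:
            match = np.minimum(i, self.w[j]).sum() / i.sum()
            if match >= self.rho:  # resonance: refine the winning code
                self.w[j] = (self.beta * np.minimum(i, self.w[j])
                             + (1 - self.beta) * self.w[j])
                return j
        self.w.append(i)  # no resonance: commit a new category code
        return len(self.w) - 1

def fedart_aggregate(client_models, rho_global=0.6):
    """One-shot aggregation: the server feeds each client's category codes,
    decoded back to input space, into a single global ART model."""
    global_model = FuzzyART(rho=rho_global)
    for client in client_models:
        for code in client.w:
            d = len(code) // 2
            # A complement-coded weight [u, 1-v] encodes the hyperbox [u, v];
            # its centre serves as a representative point for global learning.
            centre = (code[:d] + (1.0 - code[d:])) / 2.0
            global_model.learn(centre)
    return global_model

rng = np.random.default_rng(0)
clients = []
for shift in (0.1, 0.8):  # two clients with heterogeneous data
    m = FuzzyART(rho=0.75)
    for _ in range(50):
        m.learn(np.clip(rng.normal(shift, 0.05, size=2), 0.0, 1.0))
    clients.append(m)
print(len(fedart_aggregate(clients).w), "global category codes")
```

Note how the heterogeneous clients' codes survive aggregation as distinct global codes rather than being averaged together, which is the behaviour the abstract contrasts with parameter-averaging FL.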

Keywords

Federated Learning, Machine Learning, Federated Clustering, Adaptive Resonance Theory

Discipline

Artificial Intelligence and Robotics | Databases and Information Systems

Research Areas

Information Systems and Management

Publication

Neural Networks

Volume

181

First Page

1

Last Page

13

ISSN

0893-6080

Identifier

10.1016/j.neunet.2024.106845

Publisher

Elsevier

Copyright Owner and License

Authors

Additional URL

https://doi.org/10.1016/j.neunet.2024.106845
