TY - JOUR
T1 - A two-stage federated learning method for personalization via selective collaboration
AU - Xu, Jiuyun
AU - Zhou, Liang
AU - Zhao, Yingzhi
AU - Li, Xiaowen
AU - Zhu, Kongshang
AU - Xu, Xiangrui
AU - Duan, Qiang
AU - Zhang, Ru Ru
N1 - Publisher Copyright:
© 2025 Elsevier B.V.
PY - 2025/2/15
Y1 - 2025/2/15
N2 - As an emerging distributed learning paradigm, federated learning has recently received much attention. Traditional federated learning aims to train a global model on decentralized datasets, but when data are unevenly distributed, a single global model may not adapt well to every client; indeed, some clients' local training may even outperform the global model. Against this background, clustering similar clients into the same group is a common approach. However, heterogeneity remains among clients within a group, and general clustering methods usually assume that each client belongs to exactly one class, whereas in real-world scenarios the complexity of data distributions makes it difficult to categorize a client into a single class. To solve these problems, we propose a two-stage federated learning method for personalization via selective collaboration (FedSC). Unlike previous clustering methods, we focus on independently excluding, for each client, those clients with significantly different distributions, breaking the restriction that a client can belong to only one category. We select for each client the collaborators most conducive to its local training objectives and build a collaborative group for it independently; each client then engages in federated learning only with its group members, avoiding negative knowledge transfer. Furthermore, FedSC performs finer-grained processing within each group, using an adaptive hierarchical fusion strategy for the group and local models instead of the traditional scheme of directly overriding local models. Extensive experiments show that our proposed method considerably improves model performance under different heterogeneity scenarios.
AB - As an emerging distributed learning paradigm, federated learning has recently received much attention. Traditional federated learning aims to train a global model on decentralized datasets, but when data are unevenly distributed, a single global model may not adapt well to every client; indeed, some clients' local training may even outperform the global model. Against this background, clustering similar clients into the same group is a common approach. However, heterogeneity remains among clients within a group, and general clustering methods usually assume that each client belongs to exactly one class, whereas in real-world scenarios the complexity of data distributions makes it difficult to categorize a client into a single class. To solve these problems, we propose a two-stage federated learning method for personalization via selective collaboration (FedSC). Unlike previous clustering methods, we focus on independently excluding, for each client, those clients with significantly different distributions, breaking the restriction that a client can belong to only one category. We select for each client the collaborators most conducive to its local training objectives and build a collaborative group for it independently; each client then engages in federated learning only with its group members, avoiding negative knowledge transfer. Furthermore, FedSC performs finer-grained processing within each group, using an adaptive hierarchical fusion strategy for the group and local models instead of the traditional scheme of directly overriding local models. Extensive experiments show that our proposed method considerably improves model performance under different heterogeneity scenarios.
UR - https://www.scopus.com/pages/publications/85214902956
U2 - 10.1016/j.comcom.2025.108053
DO - 10.1016/j.comcom.2025.108053
M3 - Article
AN - SCOPUS:85214902956
SN - 0140-3664
VL - 232
JO - Computer Communications
JF - Computer Communications
M1 - 108053
ER -