WANG Shuhong,HAN Bin,LI Chuan,et al.Layer Similarity Optimization for Federated Learning towards Heterogeneous Clients[J].Journal of Chengdu University of Information Technology,2026,41(01):55-62.[doi:10.16836/j.cnki.jcuit.2026.01.008]
- Title:
- Layer Similarity Optimization for Federated Learning towards Heterogeneous Clients
- Article ID:
- 2096-1618(2026)01-0055-08
- Keywords:
- federated learning; layer similarity; personalized modeling; data heterogeneity; dynamic threshold adjustment
- CLC number:
- TP389.1
- Document code:
- A
- Abstract:
- In practical applications of federated learning, non-IID data is prevalent, creating a critical challenge: the conflict between the personalization needs arising from client data heterogeneity and the requirement for global generalization. To address this issue, this paper proposes FedLaySim, a layer similarity-based federated learning optimization scheme tailored for heterogeneous clients. The scheme computes the cosine similarity between corresponding layer parameters of different client models on the server side and dynamically adjusts the aggregation threshold based on these similarities. To avoid insufficient model generalization, the scheme also incorporates fine-tuning strategies for low-similarity cases, aiming to generate personalized models that better match each client's data characteristics. Experiments conducted on three non-IID datasets, CIFAR-10, MNIST, and MedMNISTC, validate the effectiveness of FedLaySim. The results show that FedLaySim achieves accuracy close to or surpassing that of the FedPAC method across a variety of application scenarios, reaching up to 98.44%. Furthermore, integrating FedLaySim as a server-side optimization algorithm into the client-personalization federated learning algorithms FedBn and FedALA improves the average accuracy of both; the fusion of FedLaySim with FedBn, dubbed FedLayBn, achieves the highest accuracy in eight scenarios, reaching 99.82%.
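As a rough illustration of the aggregation idea described in the abstract, the sketch below (Python/PyTorch) computes layer-wise cosine similarity across client models and averages only the layers whose mean similarity clears a threshold, leaving low-similarity layers for client-side fine-tuning. The function names, the fixed threshold, and the treatment of low-similarity layers are assumptions for illustration, not the paper's implementation; float-valued parameter tensors and identical client architectures are also assumed.

```python
import torch
import torch.nn.functional as F


def layer_cosine_similarity(client_states, layer_name):
    """Pairwise cosine similarity of one layer's parameters across clients.

    client_states: list of state dicts (layer name -> float tensor), one per client.
    """
    vecs = torch.stack([state[layer_name].flatten().float() for state in client_states])
    vecs = F.normalize(vecs, dim=1)   # unit-normalize each client's layer vector
    return vecs @ vecs.T              # (num_clients, num_clients) similarity matrix


def aggregate_by_layer_similarity(client_states, threshold=0.5):
    """Average each layer only over clients whose mean similarity to the other
    clients reaches the threshold; layers with no sufficiently similar clients
    are returned for client-side fine-tuning (personalization)."""
    n = len(client_states)
    global_state, low_similarity_layers = {}, []
    for name in client_states[0]:
        sim = layer_cosine_similarity(client_states, name)
        mean_sim = (sim.sum(dim=1) - 1.0) / (n - 1)   # exclude self-similarity (diagonal = 1)
        keep = mean_sim >= threshold
        if keep.any():
            global_state[name] = torch.stack(
                [client_states[i][name] for i in range(n) if keep[i]]
            ).mean(dim=0)
        else:
            low_similarity_layers.append(name)        # left for local fine-tuning
    return global_state, low_similarity_layers
```

A dynamic schedule could, for example, tighten or relax the threshold each round based on the observed distribution of layer similarities; the fixed value used here is purely illustrative.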
References:
[1] McMahan B,Moore E,Ramage D,et al.Communication-efficient learning of deep networks from decentralized data[C].Artificial intelligence and statistics.PMLR,2017:1273-1282.
[2] Li T,Sahu A K,Talwalkar A,et al.Federated learning:Challenges,methods,and future directions[J].IEEE Signal Processing Magazine,2020,37(3):50-60.
[3] Wang Z,Xu H,Liu J,et al.Accelerating federated learning with cluster construction and hierarchical aggregation[J].IEEE Transactions on Mobile Computing,2023,22(7):3805-3822.
[4] Ye R,Xu M,Wang J,et al.Feddisco:Federated learning with discrepancy-aware collaboration[C].International Conference on Machine Learning.PMLR,2023:39879-39902.
[5] Pillutla K,Malik K,Mohamed A R,et al.Federated learning with partial model personalization[C].International Conference on Machine Learning.PMLR,2022:17716-17758.
[6] Sun Y H,Shi Y H,Li M,et al.Personalized federated learning algorithm based on cooperative game and knowledge distillation[J].Journal of Electronics & Information Technology,2023,45(10):3702-3709.
[7] Wang C,Chen D,Mei J P,et al.SemCKD:Semantic Calibration for Cross-Layer Knowledge Distillation[J].IEEE Transactions on Knowledge and Data Engineering,2023,35(6):6305-6319.
[8] Yang Z,Zhang Y,Zheng Y,et al.FedFed:Feature distillation against data heterogeneity in federated learning[J].Advances in Neural Information Processing Systems,2024,36.
[9] Wang H,Yurochkin M,Sun Y,et al.Federated learning with matched averaging[C].ICLR 2020,2020.
[10] Arivazhagan M G,Aggarwal V,Singh A K,et al.Federated learning with personalization layers[J].arXiv preprint arXiv:1912.00818,2019.
[11] Tan A Z,Yu H,Cui L,et al.Towards personalized federated learning[J].IEEE Transactions on Neural Networks and Learning Systems,2022,34(12):9587-9603.
[12] Li T,Sahu A K,Zaheer M,et al.Federated optimization in heterogeneous networks[J].Proceedings of Machine Learning and Systems,2020,2:429-450.
[13] Fallah A,Mokhtari A,Ozdaglar A.Personalized federated learning with theoretical guarantees:A model-agnostic meta-learning approach[C].Advances in Neural Information Processing Systems,2020:3557-3568.
[14] Zhang H,Li C,Dai W,et al.FedCR:personalized federated learning based on across-client common representation with conditional mutual information regularization[C].International Conference on Machine Learning.PMLR,2023:41314-41330.
[15] Chen H,Vikalo H.The Best of Both Worlds:Accurate Global and Personalized Models through Federated Learning with Data-Free Hyper-Knowledge Distillation[C].ICLR 2023,2023.
[16] Li X,Jiang M,Zhang X,et al.FedBN:Federated Learning on Non-IID Features via Local Batch Normalization[C].ICLR 2021,2021.
[17] Zhang J,Hua Y,Wang H,et al.Fedala:Adaptive local aggregation for personalized federated learning[C].Proceedings of the AAAI Conference on Artificial Intelligence.2023,37(9):11237-11244.
[18] Zhang X,Huang A,Fan L,et al.Probably approximately correct federated learning[J].arXiv preprint arXiv:2304.04641,2023.
Memo
Received: 2024-09-18
Funding: Sichuan Province International Science and Technology Innovation Cooperation / Hong Kong, Macao and Taiwan Science and Technology Innovation Cooperation Project (2021YFH0076)
Corresponding author: HAN Bin. E-mail: hanbin@cuit.edu.cn
