Friday Jun 13, 2025

FedEXD: Self-Propelled Federated Learning with Extraction-Based Knowledge Distillation in Heterogeneous Environments

Federated learning (FL) is a pivotal paradigm for decentralized model training while preserving data privacy. However, data heterogeneity among clients significantly degrades model performance and convergence efficiency. In response, we introduce FedEXD, a federated knowledge distillation mechanism that improves robustness and convergence in diverse client environments through a self-propelled learning architecture. FedEXD employs a novel density-ratio-based data extraction algorithm that leverages KLIEP to select representative data, enhancing global knowledge synthesis and local model adaptability while preserving privacy. Extensive evaluations on benchmark datasets show that FedEXD delivers substantial gains in efficiency and accuracy: a 1.51% accuracy improvement over state-of-the-art methods under strong heterogeneity, while cutting communication rounds by more than 46.3%. These findings underscore FedEXD’s potential to improve the generalizability of FL systems across complex, non-IID data distributions, offering a scalable solution for privacy-conscious, high-performance distributed learning.
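The abstract mentions KLIEP (Kullback-Leibler Importance Estimation Procedure) as the basis of FedEXD's density-ratio-based data extraction. The paper's own algorithm is not reproduced here, but the underlying KLIEP idea can be illustrated with a minimal sketch: model the ratio w(x) = p_num(x)/p_den(x) as a non-negative combination of Gaussian kernels, maximize the mean log-ratio on numerator samples, and constrain the ratio to average to 1 on denominator samples. All names, kernel choices, and optimizer settings below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def kliep_weights(x_num, x_den, sigma=1.0, lr=0.01, n_iter=500):
    """Minimal KLIEP sketch (illustrative, not the FedEXD algorithm):
    estimate importance weights w(x) ~ p_num(x)/p_den(x) as a
    non-negative mixture of Gaussian kernels centered on numerator
    samples, via projected gradient ascent."""
    centers = x_num  # standard KLIEP places kernel centers on numerator samples

    def phi(x):
        # Gaussian kernel design matrix, shape (len(x), len(centers))
        d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    phi_num, phi_den = phi(x_num), phi(x_den)
    alpha = np.ones(len(centers)) / len(centers)
    for _ in range(n_iter):
        # gradient of mean log(phi_num @ alpha) w.r.t. alpha
        grad = (phi_num / (phi_num @ alpha)[:, None]).mean(0)
        alpha = np.maximum(alpha + lr * grad, 0.0)   # project onto alpha >= 0
        alpha /= phi_den.mean(0) @ alpha             # enforce mean weight = 1 on x_den
    return phi_den @ alpha  # estimated importance weight per denominator sample
```

Samples with large estimated weights are those most representative of the target (numerator) distribution, which is the sense in which a density-ratio criterion can be used to select representative data.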


Yunfan Li, Xidian University; Jie Feng, Lei Liu, Xidian University; Bodong Shang, Eastern Institute for Advanced Study, Eastern Institute of Technology; Jing Lei, Xidian University; Qingqi Pei, Xidian University

