Journal article

AdaFML: Adaptive Federated Meta Learning With Multi-Objectives and Context-Awareness in Dynamic Heterogeneous Networks

Abstract

Recent advancements in Federated Learning (FL) have enabled the widespread deployment of distributed computing resources across connected devices, enhancing data processing capabilities and facilitating collaborative decision-making while maintaining user privacy. However, in Internet of Things (IoT) systems, device heterogeneity and unstable network connections present significant challenges to the effective and efficient execution of FL tasks in real-world environments. To address these challenges, we propose an Adaptive Federated Meta Learning Framework with Multi-Objectives and Context-Awareness (AdaFML). This framework pursues multiple objectives: improving the performance of the FL global model, optimizing time efficiency, and enabling local model adaptation in dynamic and heterogeneous environments. Specifically, AdaFML extracts contextual information from each device, including its data distribution and its computation and communication conditions, to train a multimodal model that optimizes the FL task and time cost estimation, enhancing global model performance and time efficiency. Moreover, AdaFML fine-tunes two critical meta-learning parameters: the mixture ratio between the local and global models and the selection weights for model aggregation. This enables adaptive local model updates across different devices while improving global model performance. Experimental results demonstrate that AdaFML boosts the effectiveness, efficiency, and adaptability of FL task execution in dynamic and heterogeneous environments.
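The two meta-learning parameters mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; the function names, the mixture ratio `alpha`, and the per-client selection weights are illustrative assumptions, with model parameters reduced to plain NumPy vectors.

```python
import numpy as np

def personalize(local_w, global_w, alpha):
    """Blend local and global model parameters.

    alpha is the (illustrative) mixture ratio: alpha = 1 keeps the
    purely local model, alpha = 0 adopts the global model outright.
    """
    return alpha * local_w + (1.0 - alpha) * global_w

def aggregate(client_ws, sel_weights):
    """Aggregate client models using normalized selection weights.

    Clients with larger weights contribute more to the new global model,
    mimicking weighted federated averaging.
    """
    w = np.asarray(sel_weights, dtype=float)
    w = w / w.sum()  # normalize so the weights sum to 1
    return sum(wi * cw for wi, cw in zip(w, client_ws))

# Toy usage: two clients, 3-parameter models.
clients = [np.array([1.0, 1.0, 1.0]), np.array([3.0, 3.0, 3.0])]
global_w = aggregate(clients, sel_weights=[1.0, 1.0])   # equal weights -> [2, 2, 2]
personal = personalize(clients[0], global_w, alpha=0.5)  # halfway blend -> [1.5, 1.5, 1.5]
```

In AdaFML these two knobs are tuned per device from the extracted context rather than fixed by hand; the sketch only shows where each parameter enters the update.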

Authors

Han Q; Wang X; Shen W; Shi Y

Journal

IEEE Transactions on Emerging Topics in Computational Intelligence, Vol. 9, No. 2, pp. 1428–1440

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Publication Date

January 1, 2025

DOI

10.1109/tetci.2025.3537940

ISSN

2471-285X
