
Strategic Optimization of Convergence and Energy in Federated Learning Systems
Federated Learning (FL) is a Machine Learning (ML) paradigm in which multiple devices collaboratively train a model without sharing their local data. By keeping raw data on local devices, this decentralized approach provides significant privacy benefits, enabling compliance with data protection regulations and safeguarding sensitive user information. Instead of transmitting raw data, FL sends model updates to a central aggregator, which combines them to improve the global model. However, this process can result in higher Carbon Dioxide (CO₂) emissions than traditional centralized ML systems, owing to the larger number of participating devices and communication rounds. This study evaluates the accuracy, convergence speed, energy efficiency, and environmental impact of FL models relative to centralized models, using the MNIST (Modified National Institute of Standards and Technology) and CIFAR-10 (Canadian Institute for Advanced Research, 10-class) datasets. Four models were tested: two FL models and two centralized models. The evaluation focused on accuracy, the number of training rounds to convergence, and total CO₂ emissions. To optimize both convergence and energy efficiency, a dynamic hill-climbing-based early stopping technique was introduced: after every 100 rounds, the improvement in model accuracy was assessed, and training was terminated early if further gains fell below a shrinking threshold, eliminating unnecessary computation and energy consumption. Results show that, under the tested conditions, FL models achieved accuracy competitive with or higher than centralized models, particularly on non-Independent and Identically Distributed (non-IID) data. For example, the federated MNIST model reached 98.79% accuracy with a significantly lower carbon footprint when early stopping was applied. Overall, the proposed optimization approach reduced CO₂ emissions by approximately 60% without substantial loss in accuracy.
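The model-update exchange described above is commonly realized as sample-weighted parameter averaging (FedAvg-style). The abstract does not name the aggregation rule used, so the following is a minimal illustrative sketch, assuming each client submits its model as a list of NumPy parameter arrays; the function name and weighting scheme are assumptions, not the paper's specification:

```python
import numpy as np

def aggregate(client_updates, client_sizes):
    """Combine client models into a global model by weighted averaging.

    client_updates: list of models, each a list of NumPy parameter arrays.
    client_sizes:   number of local training samples per client, used as
                    the averaging weight (FedAvg-style).
    """
    total = sum(client_sizes)
    # Group the corresponding parameter arrays of every client together.
    layers = zip(*client_updates)
    return [
        sum(w * (n / total) for w, n in zip(layer, client_sizes))
        for layer in layers
    ]
```

Weighting by local dataset size keeps clients with more data from being drowned out by clients that trained on only a few samples, which matters under the non-IID distributions studied here.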
By integrating privacy preservation, explicit regulatory relevance, and a practical dynamic optimization method, this research demonstrates that FL can deliver strong model performance while meeting modern requirements for data privacy and environmental sustainability.
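The dynamic early stopping described above can be sketched as follows. This is a hypothetical reconstruction from the abstract's description (check every 100 rounds, stop when the accuracy gain falls below a shrinking threshold); the function names, the initial threshold, and the decay factor are illustrative assumptions:

```python
def make_early_stopper(check_every=100, initial_threshold=0.01, decay=0.5):
    """Return a predicate that signals when FL training should stop early."""
    state = {"last_acc": 0.0, "threshold": initial_threshold}

    def should_stop(round_num, accuracy):
        # Only evaluate at checkpoint rounds (every `check_every` rounds).
        if round_num == 0 or round_num % check_every != 0:
            return False
        gain = accuracy - state["last_acc"]
        state["last_acc"] = accuracy
        stop = gain < state["threshold"]
        # Shrink the threshold so that, as training matures, progressively
        # smaller gains are still accepted before termination is triggered.
        state["threshold"] *= decay
        return stop

    return should_stop
```

For example, a run whose accuracy climbs 0.90 → 0.95 → 0.951 at rounds 100, 200, and 300 would continue through the first two checkpoints and stop at round 300, where the 0.001 gain falls below the decayed threshold. Cutting those remaining rounds is the mechanism behind the reported CO₂ savings.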