PUBLISHED PAPERS #12.06

Samir Aliyev.
FedND: Federated Newton Direction for Federated Learning Environment
Abstract. Federated Learning (FL) is a decentralized approach to machine learning that allows models to be trained across multiple clients without sharing raw data. This study investigates the performance of two federated optimization algorithms—Federated Newton Direction (FedND) and Federated Stochastic Gradient Descent (FedSGD)—under different client weighting schemes. Two weighting strategies were applied: the conventional sample-size-based weighting and a Fuzzy Inference System (FIS)-based approach. The results indicate that FedND consistently outperformed FedSGD in terms of accuracy, AUC, and F1 Score. Furthermore, FIS-based weighting provided additional improvements, particularly in accuracy and F1 Score, by addressing client heterogeneity more effectively. While FedND achieved superior overall performance, the study highlights that adaptive weighting methods can further enhance federated learning efficiency. These findings contribute to the development of more robust aggregation techniques for FL frameworks, particularly in heterogeneous environments.
Keywords: Federated Learning, Federated Stochastic Gradient, Second-Order methods, Newton Method, Fuzzy Inference System
DOI: https://doi.org/10.30546/MaCoSEP2025.1118
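For illustration only, here is a minimal sketch of the aggregation step that the abstract's two weighting schemes affect: sample-size-based weights versus externally supplied per-client scores (such as normalized outputs of a fuzzy inference system). The function names, the example scores, and the plain weighted average are assumptions made for this sketch; the paper's actual FedND update, which relies on Newton directions, is not reproduced here.

```python
# Illustrative sketch only: not the FedND algorithm from the paper.
# It contrasts conventional sample-size weighting with hypothetical
# FIS-style weights inside a generic federated aggregation step.
import numpy as np

def sample_size_weights(num_samples):
    """Conventional weights: proportional to each client's number of samples."""
    n = np.asarray(num_samples, dtype=float)
    return n / n.sum()

def normalized_score_weights(scores):
    """Hypothetical FIS-style weights: non-negative per-client scores, normalized."""
    s = np.asarray(scores, dtype=float)
    return s / s.sum()

def aggregate(client_updates, weights):
    """Weighted average of client updates (gradients or model deltas)."""
    return sum(w * u for w, u in zip(weights, client_updates))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    updates = [rng.normal(size=4) for _ in range(3)]   # one update vector per client
    sizes = [100, 400, 500]                            # samples held by each client
    fis_scores = [0.9, 0.3, 0.6]                       # assumed fuzzy-inference outputs

    print("sample-size weighting:", aggregate(updates, sample_size_weights(sizes)))
    print("FIS-based weighting:  ", aggregate(updates, normalized_score_weights(fis_scores)))
```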