Journal article · 2024 · Expert Systems with Applications

Fedstellar: A Platform for Decentralized Federated Learning


Quick facts

Year: 2024
Venue: Expert Systems with Applications
Identifier: martinezbeltran2024fedstellar

Suggested citation

Enrique Tomás Martínez Beltrán, Ángel Luis Perales Gómez, Chao Feng, Pedro Miguel Sánchez Sánchez, Sergio López Bernal, Gérôme Bovet, Manuel Gil Pérez, Gregorio Martínez Pérez, Alberto Huertas Celdrán (2024). Fedstellar: A Platform for Decentralized Federated Learning. Expert Systems with Applications.

Abstract

In 2016, Google proposed Federated Learning (FL) as a novel paradigm to train Machine Learning (ML) models across the participants of a federation while preserving data privacy. Since its birth, Centralized FL (CFL) has been the most used approach, where a central entity aggregates participants’ models to create a global one. However, CFL presents limitations such as communication bottlenecks, a single point of failure, and reliance on a central server. Decentralized Federated Learning (DFL) addresses these issues by enabling decentralized model aggregation and minimizing dependency on a central entity. Despite these advances, current platforms training DFL models struggle with key issues such as managing heterogeneous federation network topologies, adapting the FL process to virtualized or physical deployments, and using a limited number of metrics to evaluate different federation scenarios for efficient implementation. To overcome these challenges, this paper presents Fedstellar, a novel platform designed to train FL models in a decentralized, semi-decentralized, and centralized fashion across diverse federations of physical or virtualized devices. Fedstellar allows users to create federations by customizing parameters like the number and type of devices training FL models, the network topology connecting them, the machine and deep learning algorithms, or the datasets of each participant, among others. Additionally, it offers real-time monitoring of model and network performance. The Fedstellar implementation encompasses a web application with an interactive graphical interface, a controller for deploying federations of nodes using physical or virtual devices, and a core deployed on each device, which provides the logic needed to train, aggregate, and communicate in the network.
The effectiveness of the platform has been demonstrated in two scenarios: a physical deployment involving single-board devices such as Raspberry Pis for detecting cyberattacks and a virtualized deployment comparing various FL approaches in a controlled environment using the MNIST and CIFAR-10 datasets. In both scenarios, Fedstellar demonstrated consistent performance and adaptability, achieving F1-scores of 91%, 98%, and 91.2% using DFL for detecting cyberattacks and classifying MNIST and CIFAR-10, respectively, reducing training time by 32% compared to centralized approaches.
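The decentralized aggregation described in the abstract, where each node exchanges models with its neighbors in a federation topology rather than with a central server, can be illustrated with a minimal sketch. This is an illustrative toy, not Fedstellar's actual implementation or API; `local_train`, `dfl_round`, and the ring topology are hypothetical stand-ins.

```python
import random

# Toy sketch of decentralized federated averaging: each node trains
# locally, then replaces its model with the average of its own model
# and those of its direct neighbors in the topology.

def local_train(params, rnd):
    """Stand-in for a local training step (applies a small random update)."""
    return [p + 0.1 * rnd.uniform(-1, 1) for p in params]

def dfl_round(models, topology, rnd):
    """One DFL round: local training, then neighbor-wise model averaging."""
    trained = [local_train(m, rnd) for m in models]
    aggregated = []
    for node, neighbors in topology.items():
        group = [trained[node]] + [trained[j] for j in neighbors]
        # Parameter-wise mean over the node's own model and its neighbors'.
        aggregated.append([sum(vals) / len(group) for vals in zip(*group)])
    return aggregated

rnd = random.Random(0)
# Ring topology over 4 nodes: each node exchanges models with 2 neighbors.
topology = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
models = [[rnd.uniform(-1, 1) for _ in range(5)] for _ in topology]
for _ in range(10):
    models = dfl_round(models, topology, rnd)
```

Repeated rounds of this neighbor-wise averaging drive the nodes toward a shared model without any node ever acting as a central aggregator, which is the property that removes the single point of failure present in CFL.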

Authors

Enrique Tomás Martínez Beltrán, Ángel Luis Perales Gómez, Chao Feng, Pedro Miguel Sánchez Sánchez, Sergio López Bernal, Gérôme Bovet, Manuel Gil Pérez, Gregorio Martínez Pérez, Alberto Huertas Celdrán

Keywords

Decentralized Federated Learning · Deep learning · Collaborative training · Communication mechanisms

Related publications

Works with stronger overlap in topic, type, and tags.

Journal article · 2024 · Applied Intelligence

Analyzing the robustness of decentralized horizontal and vertical federated learning architectures in a non-IID scenario

Pedro Miguel Sánchez Sánchez, Alberto Huertas Celdrán, Enrique Tomás Martínez Beltrán, Daniel Demeter, Gérôme Bovet, Gregorio Martínez Pérez, Burkhard Stiller

Federated learning (FL) enables participants to collaboratively train machine and deep learning models while safeguarding data privacy. However, the FL paradigm still has drawbacks that affect its trustworthiness, as malicious participants...

Journal article · 2024 · Array

DART: A Solution for decentralized federated learning model robustness analysis

Chao Feng, Alberto Huertas Celdrán, Jan von der Assen, Enrique Tomás Martínez Beltrán, Gérôme Bovet, Burkhard Stiller

Federated Learning (FL) has emerged as a promising approach to address privacy concerns inherent in Machine Learning (ML) practices. However, conventional FL methods, particularly those following the Centralized FL (CFL) paradigm, utilize a...

Journal article · 2024 · Information Fusion

Data fusion in neuromarketing: Multimodal analysis of biosignals, lifecycle stages, current advances, datasets, trends, and challenges

Mario Quiles Pérez, Enrique Tomás Martínez Beltrán, Sergio López Bernal, Eduardo Horna Prat, Luis Montesano Del Campo, Lorenzo Fernández Maimó, Alberto Huertas Celdrán

The primary goal of any company is to increase its profits by improving both the quality of its products and how they are advertised. In this context, neuromarketing seeks to enhance the promotion of products and generate a greater acceptan...
