
A survey on federated learning: A perspective from multi-party computation

02.27.24 | Higher Education Press


Federated learning (FL) has emerged as a popular machine learning paradigm that allows multiple data owners to train models collaboratively without sharing their raw datasets. It holds potential for a wide spectrum of analytics applications on sensitive data. For example, federated learning has been applied to medical big data analysis, such as disease prediction and diagnosis, without revealing patients' private medical information to third-party services. It has also been exploited by banks and insurance companies to train accurate machine learning models for risk assessment or customer recommendation.

Federated learning enables collaborative model training without sharing raw datasets among data owners by decomposing the training procedure into local training and model aggregation. Each data owner performs local training on its own data partition and only communicates intermediate results (e.g., gradients) for model aggregation, which takes place either at a centralized server or among the data owners themselves. Federated learning that relies on a central server to coordinate model aggregation is called centralized FL, while model aggregation in a peer-to-peer manner is known as decentralized FL. Centralized FL imposes a high computation workload on the server, whereas decentralized FL involves excessive communication among peers. Consequently, semi-centralized FL has recently been proposed to balance computation and communication costs by conducting clustered or hierarchical model aggregation.
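To make the local-training/model-aggregation split concrete, the following is a minimal sketch (not taken from the survey) of centralized FL with FedAvg-style weighted averaging at the server; the function names, the logistic-regression objective, and the toy data are illustrative assumptions.

```python
import numpy as np

def local_training(weights, data, labels, lr=0.1, epochs=5):
    """One client's local step: logistic-regression SGD on its own partition."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-data @ w))          # sigmoid predictions
        grad = data.T @ (preds - labels) / len(labels)   # average gradient
        w -= lr * grad                                   # local update
    return w

def federated_averaging(global_w, client_datasets, rounds=10):
    """Centralized FL: the server averages client models, weighted by data size."""
    for _ in range(rounds):
        sizes, local_models = [], []
        for X, y in client_datasets:                     # each client trains locally
            local_models.append(local_training(global_w, X, y))
            sizes.append(len(y))
        total = sum(sizes)
        # Server-side aggregation: only model parameters are exchanged, never raw data.
        global_w = sum(n / total * w for n, w in zip(sizes, local_models))
    return global_w

# Toy usage: three clients holding synthetic partitions of a 5-feature problem.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(50, 5)), rng.integers(0, 2, 50).astype(float))
           for _ in range(3)]
model = federated_averaging(np.zeros(5), clients)
```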

We focus on federated learning with privacy guarantees. Note that exchanging intermediate results (e.g., gradients) rather than raw datasets may still leak privacy. Accordingly, extra techniques are necessary for secure communication and computation during federated learning. Of particular interest to us is multi-party computation, a generic and fundamental category of techniques that takes private inputs from multiple parties for aggregated computation without revealing the private data of any individual party. Common multi-party computation techniques include garbled circuits, secret sharing, homomorphic encryption, differential privacy, and so on. Recent years have witnessed a surge of efforts to enhance the privacy of federated learning via multi-party computation.
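As an illustration of how one of the multi-party computation techniques mentioned above (additive secret sharing) can protect gradients during aggregation, here is a minimal sketch; the field modulus, fixed-point scale, and function names are illustrative assumptions, not the survey's protocol.

```python
import numpy as np

PRIME = 2**61 - 1   # field modulus for additive sharing (illustrative choice)
SCALE = 10**6       # fixed-point scaling so real-valued gradients fit in the field

def share(vector, n_parties, rng):
    """Split an integer-encoded vector into n additive shares modulo PRIME."""
    encoded = np.mod(np.round(vector * SCALE).astype(np.int64), PRIME)
    shares = [rng.integers(0, PRIME, size=vector.shape, dtype=np.int64)
              for _ in range(n_parties - 1)]
    masked = np.mod(sum(shares), PRIME)
    last = np.mod(encoded - masked, PRIME)   # all shares sum to the encoded vector
    return shares + [last]

def reconstruct(sum_of_shares):
    """Decode the aggregate; only the sum is revealed, never any single input."""
    total = np.mod(sum_of_shares, PRIME)
    total = np.where(total > PRIME // 2, total - PRIME, total)  # back to signed values
    return total / SCALE

rng = np.random.default_rng(1)
gradients = [rng.normal(size=4) for _ in range(3)]   # one private gradient per data owner

# Each owner splits its gradient into shares; parties only ever see masked shares.
all_shares = [share(g, n_parties=3, rng=rng) for g in gradients]
# Party j adds up the j-th share of every gradient, then the masked sums are combined.
partial_sums = [np.mod(sum(owner[j] for owner in all_shares), PRIME) for j in range(3)]
aggregate = reconstruct(sum(partial_sums))            # equals the true gradient sum

assert np.allclose(aggregate, sum(gradients), atol=1e-5)
```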

Article Information

Journal: Frontiers of Computer Science
DOI: 10.1007/s11704-023-3282-7
Article Title: A survey on federated learning: a perspective from multi-party computation
Article Publication Date: 15-Feb-2024
Method of Research: Experimental study
Subject of Research: Not applicable

Contact Information

Rong Xie
Higher Education Press
xierong@hep.com.cn


How to Cite This Article

APA:
Higher Education Press. (2024, February 27). A survey on federated learning: A perspective from multi-party computation. Brightsurf News. https://www.brightsurf.com/news/19NWXDJ1/a-survey-on-federated-learning-a-perspective-from-multi-party-computation.html
MLA:
"A survey on federated learning: A perspective from multi-party computation." Brightsurf News, Feb. 27 2024, https://www.brightsurf.com/news/19NWXDJ1/a-survey-on-federated-learning-a-perspective-from-multi-party-computation.html.