
How to simultaneously achieve privacy-preservation, byzantine-resilience and verifiability in federated learning?

11.03.24 | Higher Education Press



Differentially Private Federated Learning (DPFL) is designed to strengthen privacy in Federated Learning (FL). Because DPFL operates in a distributed setting, it is exposed to potentially malicious adversaries. However, existing aggregation protocols for DPFL consider either corrupted clients (Byzantines) or a corrupted server, but not both. Because the combined threat model is considerably more complicated, such protocols cannot eliminate the effects of corrupted clients and a corrupted server when the two are present simultaneously.
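For readers unfamiliar with the underlying mechanism, the minimal sketch below illustrates the DP-SGD-style step that DPFL schemes typically build on: each client clips its local gradient and adds calibrated Gaussian noise before sending it to the server. This is an illustration only, not code from the paper; the function name and the `clip_norm` and `noise_multiplier` parameters are assumptions.

```python
import numpy as np

def dp_privatize_gradient(grad, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Clip a client's gradient and add Gaussian noise (generic DP-SGD step).

    Illustrative sketch of the mechanism DPFL builds on, not the
    implementation from the BVDFed paper.
    """
    rng = rng or np.random.default_rng()
    # Clip the gradient to bound each client's contribution (sensitivity).
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / (norm + 1e-12))
    # Add Gaussian noise scaled to the clipping bound.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=grad.shape)
    return clipped + noise
```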

To address these problems, a research team led by Shaojing FU published new research on 15 October 2024 in Frontiers of Computer Science, co-published by Higher Education Press and Springer Nature.

The team proposed BVDFed, a Byzantine-resilient and verifiable aggregation scheme for differentially private federated learning, built on three complementary components: 1) the Differentially Private Federated Averaging (DPFA) algorithm applies DP-SGD to FedAvg, yielding a lightweight and easily portable implementation; 2) DPLoss lets the aggregation server filter out Byzantine gradients by computing a Loss Score that indicates the trustworthiness of each DP gradient; 3) DPVeri provides a secure verification scheme that allows honest clients to verify the integrity of the aggregated result and resists collusion among up to a bounded number of participants.
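As a rough illustration of how a loss-based score can screen out untrustworthy gradients before averaging, the sketch below scores each DP gradient by the loss its tentative update produces on a small server-side reference batch, discards the worst-scoring gradients, and averages the rest FedAvg-style. This is only an approximation under stated assumptions: the paper's DPLoss defines its own Loss Score, DPVeri's cryptographic verification is not shown, and `loss_fn`, `reference_batch`, and `drop_fraction` are hypothetical parameters.

```python
import numpy as np

def byzantine_filtered_average(global_model, dp_grads, loss_fn, reference_batch,
                               drop_fraction=0.2):
    """Score DP gradients by reference loss and average the most trustworthy ones.

    Illustrative sketch only: BVDFed's DPLoss and DPVeri components are
    more involved than this toy filter.
    """
    # Lower loss on the reference batch after a tentative update => more trustworthy.
    scores = []
    for g in dp_grads:
        candidate = global_model - 0.1 * g          # tentative step, toy learning rate
        scores.append(loss_fn(candidate, reference_batch))
    scores = np.array(scores)

    # Keep the gradients whose tentative updates give the lowest reference loss.
    n_keep = max(1, int(len(dp_grads) * (1.0 - drop_fraction)))
    keep_idx = np.argsort(scores)[:n_keep]

    # FedAvg-style aggregation over the surviving gradients.
    return np.mean([dp_grads[i] for i in keep_idx], axis=0)
```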

The team also gives a comprehensive analysis of BVDFed and a series of practical performance evaluations. They prove that honest clients in BVDFed neither leak private information through the data they transmit nor fall victim to corrupted participants.

For future directions, one is to reduce the computational overhead each client incurs when verifying the integrity of the aggregation, as well as the communication overhead. Another is to account for dropped-out clients, which may otherwise lead to privacy leakage.

Article Information

Journal: Frontiers of Computer Science
DOI: 10.1007/s11704-023-3142-5
Method of Research: Experimental study
Subject of Research: Not applicable
Article Title: BVDFed: Byzantine-resilient and verifiable aggregation for differentially private federated learning
Article Publication Date: 15-Oct-2024

Contact Information

Rong Xie
Higher Education Press
xierong@hep.com.cn

Source

Higher Education Press

How to Cite This Article

APA:
Higher Education Press. (2024, November 3). How to simultaneously achieve privacy-preservation, byzantine-resilience and verifiability in federated learning?. Brightsurf News. https://www.brightsurf.com/news/LQ4GG2G8/how-to-simultaneously-achieve-privacy-preservation-byzantine-resilience-and-verifiability-in-federated-learning.html
MLA:
"How to simultaneously achieve privacy-preservation, byzantine-resilience and verifiability in federated learning?." Brightsurf News, Nov. 3 2024, https://www.brightsurf.com/news/LQ4GG2G8/how-to-simultaneously-achieve-privacy-preservation-byzantine-resilience-and-verifiability-in-federated-learning.html.