With Swanand Kadhe, O. Ozan Koyluoglu and Kannan Ramchandran

Federated learning is a popular paradigm for decentralized learning with heterogeneous learning agents. It brings the challenge of carrying out learning asynchronously, in a computation- and communication-efficient manner, in the presence of unreliable agents. We propose FastSecAgg, a secure-aggregation protocol for the federated learning setting that is robust to unreliable clients, whether they drop out or collude. We show theoretically that FastSecAgg is secure against the server colluding with any 10 of the clients in the honest-but-curious model, while simultaneously tolerating the dropout of a random 10 of the clients. These guarantees are achieved at a lower computation cost than existing schemes, at the same (order-wise) communication cost. The protocol is also secure against adaptive adversaries that can corrupt clients dynamically during the execution of the protocol.
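The dropout and collusion guarantees above come from the linearity of secret sharing: the server only ever sees shares, and any sufficiently large set of surviving shares reconstructs the *sum* of client updates, never an individual update. The toy sketch below illustrates this idea with plain Shamir sharing over a prime field; it is an illustration of the general secret-sharing-based aggregation principle, not the actual FastSecAgg construction, and the field modulus, thresholds, and update values are assumptions chosen for the demo.

```python
# Toy illustration of secret-sharing-based secure aggregation.
# (Illustrative only -- not the FastSecAgg scheme itself; plain
# Shamir sharing is used here for clarity.)
import random

P = 2_147_483_647  # prime field modulus (demo assumption)

def share(secret, threshold, n_shares):
    """Shamir-share `secret` with a random degree-`threshold` polynomial;
    any `threshold`+1 shares reconstruct it, fewer reveal nothing."""
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    total = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

# Five clients each hold a private (scalar) update; with threshold 2,
# any 2 colluding share-holders learn nothing about an individual update.
updates = [7, 11, 3, 9, 5]
n, t = 5, 2
all_shares = [share(u, t, n) for u in updates]

# Aggregation is linear: adding shares pointwise yields shares of the sum.
summed = [(x, sum(s[i][1] for s in all_shares) % P)
          for i, (x, _) in enumerate(all_shares[0])]

# Simulate 2 dropouts: any t+1 = 3 surviving points still recover the sum.
surviving = summed[:3]
print(reconstruct(surviving))  # recovers sum(updates) = 35
```

The key design point mirrored here is that dropouts only remove evaluation points, so reconstruction succeeds as long as enough shares survive, while collusion below the threshold reveals nothing about any single client's update.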

[Figure: High-level overview of the FastSecAgg protocol]
