Privacy Preserving Federated Learning: A Novel Approach for Combining Differential Privacy and Homomorphic Encryption
Abstract
Ensuring data security and privacy is a prominent concern in machine learning. The conventional approach of centralizing training data raises privacy concerns. Federated learning addresses this by avoiding the transfer of local data when training a global model, sharing only local model updates instead. Despite this, the risk of information leakage persists. Various attempts have been made to tackle this issue, but existing solutions impose an inevitable trade-off between accuracy, privacy, and computation time. In this paper, we address that challenge by combining differential privacy and homomorphic encryption. Our approach allows less noise to be added to the data by shuffling to anonymize it, not only at the client level but also at the parameter level. Hence, it improves the accuracy of the output models while offering strong privacy guarantees. Importantly, our method avoids complex homomorphic operations, thereby mitigating the computational overhead of HE. In this manner, the data remains protected from all participants in the learning process. Our findings demonstrate that, for an equivalent level of privacy, our method introduces less noise than the local DP method, resulting in increased accuracy after aggregation. However, the privacy amplification requires a substantial number of clients, which makes our approach more suitable for cross-device federated learning.
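To make the described pipeline concrete, the following is a minimal sketch, not the paper's implementation: the function names local_dp_update, shuffle_parameters, and aggregate, the Gaussian noise scale sigma, and the toy problem sizes are all illustrative assumptions. The homomorphic-encryption step is stubbed as plaintext summation, standing in for the single additive operation that an additive scheme such as Paillier would perform on ciphertexts.

```python
import numpy as np

rng = np.random.default_rng(0)


def local_dp_update(update, sigma):
    """Client side: perturb the local model update with Gaussian noise.

    sigma is a hypothetical noise scale; the point of the shuffling step
    is that it can be smaller than in pure local DP for the same privacy.
    """
    return update + rng.normal(0.0, sigma, size=update.shape)


def shuffle_parameters(updates):
    """Shuffler: permute each parameter coordinate across clients
    independently, anonymizing at the parameter level rather than only
    at the client level."""
    stacked = np.stack(updates)  # shape: (n_clients, n_params)
    # permuted(..., axis=0) shuffles every column independently,
    # breaking the link between a client and its full update vector.
    return rng.permuted(stacked, axis=0)


def aggregate(shuffled):
    """Server side: average the (conceptually encrypted) updates.

    With an additive HE scheme, this sum is the only homomorphic
    operation needed; here it is plain addition for brevity.
    """
    return shuffled.sum(axis=0) / shuffled.shape[0]


# Toy run: 100 clients, each holding a 5-parameter model update.
clients = [rng.normal(size=5) for _ in range(100)]
noisy = [local_dp_update(u, sigma=0.1) for u in clients]
global_update = aggregate(shuffle_parameters(noisy))
print(global_update)
```

Because each coordinate is permuted independently across clients, the server learns only the multiset of per-parameter contributions, which is the parameter-level anonymization the abstract refers to; the aggregate itself is unchanged, since summation commutes with the permutation.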