ABSTRACT

This chapter is dedicated to cross-silo private parameter aggregation. Machine learning and deep learning (ML/DL) have demonstrated promising results in a variety of application domains, especially when vast volumes of data are collected in a single location, such as a data center or a cloud service. Federated learning (FL) aims to preserve the quality of ML/DL models while avoiding the drawbacks of such centralized data collection. Participants in an FL task can range from a single smartphone or watch to a global corporation operating multiple data centers. It was originally believed that only a small amount of information about the original training data would be carried over into the model updates exchanged during FL interactions. The differential privacy framework is concerned with restricting the release of private information when the outcomes of computations or queries performed on a dataset are shared. Recently, many researchers have begun to employ differential privacy when training models in a federated setting.