Federated learning, in which training data is distributed among users and
never shared, has emerged as a popular approach to privacy-preserving machine
learning. Cryptographic techniques such as secure aggregation are used to
aggregate contributions, like a model update, from all users. A robust
technique for making such aggregates differentially private is to exploit
infinite divisibility of the Laplace distribution, namely, that a Laplace
distribution can be expressed as a sum of i.i.d. noise shares from a Gamma
distribution, one share added by each user.
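
As a rough illustration of this divisibility property (a minimal simulation sketch, not code from the paper), the snippet below has each of $n$ users add a noise share equal to the difference of two independent Gamma$(1/n, b)$ draws; summing the shares reproduces Laplace$(0, b)$ noise, since a Gamma$(1, b)$ variable is the sum of $n$ i.i.d. Gamma$(1/n, b)$ variables and a Laplace$(0, b)$ variable is the difference of two independent Gamma$(1, b)$ variables. All variable names and parameter choices here are illustrative.

```python
import numpy as np

# Sketch: each user adds a share = Gamma(1/n, b) - Gamma(1/n, b);
# the sum over n users is distributed as Laplace(0, b).
rng = np.random.default_rng(0)
n_users = 100          # number of users contributing noise shares
b = 1.0                # Laplace scale, e.g. sensitivity / epsilon
n_trials = 200_000     # number of simulated aggregations

shares = rng.gamma(1.0 / n_users, b, size=(n_trials, n_users)) \
       - rng.gamma(1.0 / n_users, b, size=(n_trials, n_users))
aggregate_noise = shares.sum(axis=1)

# Compare empirical moments with Laplace(0, b): mean 0, variance 2*b^2.
print("empirical mean:", aggregate_noise.mean())
print("empirical var :", aggregate_noise.var(), "expected:", 2 * b**2)
```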


However, Laplace noise is known to have suboptimal error in the low privacy
regime for $\varepsilon$-differential privacy, where $\varepsilon > 1$ is a
large constant. In this paper we present the first infinitely divisible noise
distribution for real-valued data that achieves $\varepsilon$-differential
privacy and has expected error that decreases exponentially with $\varepsilon$.
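
To see why the Laplace baseline is weak in this regime (an illustrative comparison under stated assumptions, not the paper's construction): for a sensitivity-1 query, the Laplace mechanism adds Laplace$(0, 1/\varepsilon)$ noise, whose expected absolute error is exactly $1/\varepsilon$, shrinking only inverse-linearly in $\varepsilon$. The $e^{-\varepsilon}$ values below are a generic exponential reference curve, not the paper's actual error bound.

```python
import numpy as np

# Expected absolute error of the Laplace mechanism (sensitivity 1) is 1/eps;
# exp(-eps) is printed only as a generic exponential-decay reference.
for eps in [1.0, 2.0, 4.0, 8.0]:
    laplace_error = 1.0 / eps        # E|X| for X ~ Laplace(0, 1/eps)
    exponential_ref = np.exp(-eps)   # reference curve, not the paper's bound
    print(f"eps={eps:4.1f}  Laplace error={laplace_error:.4f}  "
          f"exp(-eps) reference={exponential_ref:.4f}")
```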
