Robust Decentralized Differentially Private Stochastic Gradient Descent

MTA-SZTE Research Group on AI

Abstract

Stochastic gradient descent (SGD) is one of the most widely applied machine learning algorithms in unreliable, large-scale decentralized
environments. In this type of environment, data privacy is a fundamental concern. The most popular framework for investigating this topic is differential privacy. However, many important implementation
details and the performance of differentially private SGD variants have not
yet been completely addressed. Here, we analyze a set of distributed differentially private SGD implementations in a system where every private data record is stored separately by an autonomous node. The examined SGD methods apply only local computations, and all communication carries only information protected in a differentially private manner. A key middleware
service these implementations require is the single random walk service,
where a single random walk is maintained in the face of different failure
scenarios. First, we propose a robust implementation of the decentralized
single random walk service and then perform experiments to evaluate the
proposed random walk service as well as the private SGD implementations. Our
main conclusion here is that the proposed differentially private SGD
implementations can approximate the performance of their original noise-free
variants in faulty decentralized environments, provided the algorithm
parameters are set properly.

Keywords: decentralized differential privacy, stochastic gradient descent, machine learning, random walks

+: Corresponding author: Márk Jelasity

Journal of Wireless Mobile Networks, Ubiquitous Computing, and Dependable Applications (JoWUA)
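To make concrete what a differentially private SGD update on a single locally stored record can look like, here is a minimal sketch. It is not the paper's implementation: the helper name `dp_sgd_step`, the clipping bound `clip_C`, the noise scale `sigma`, and the learning rate are all illustrative assumptions, and it shows only the standard clip-then-add-Gaussian-noise pattern referred to by the abstract.

```python
import numpy as np

def dp_sgd_step(w, grad_fn, record, clip_C=1.0, sigma=1.0, lr=0.1, rng=None):
    # Hypothetical sketch of one noisy SGD update on one local record:
    # clip the gradient to L2 norm clip_C (bounding its sensitivity),
    # then add Gaussian noise scaled by clip_C before applying the update,
    # so the resulting model update depends only on protected information.
    rng = np.random.default_rng() if rng is None else rng
    g = np.asarray(grad_fn(w, record), dtype=float)
    norm = np.linalg.norm(g)
    if norm > clip_C:
        g = g * (clip_C / norm)          # gradient clipping
    noise = rng.normal(0.0, sigma * clip_C, size=g.shape)
    return w - lr * (g + noise)          # noisy gradient step

# Toy usage: squared-error gradient for a single (x, y) record.
grad = lambda w, xy: 2.0 * (w @ xy[0] - xy[1]) * xy[0]
w = dp_sgd_step(np.zeros(2), grad, (np.array([1.0, 2.0]), 3.0),
                rng=np.random.default_rng(0))
```

In a decentralized setting, a step like this would run at whichever node the random walk currently visits, using only that node's private record.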