Application Vulnerabilities in Risk Assessment and Management

Fabrizio Baiardi+, Federico Tonelli, and Lorenzo Isoni

Dipartimento di Informatica, Università di Pisa, Italy
{baiardi, tonelli, isoni}@di.unipi.it


Abstract

Stochastic gradient descent (SGD) is one of the most widely applied machine learning algorithms in unreliable, large-scale decentralized environments, where data privacy is a fundamental concern. The most popular way to study this concern is through the framework of differential privacy. However, many important implementation details, as well as the performance of differentially private SGD variants, have not yet been fully addressed. Here, we analyze a set of distributed differentially private SGD implementations in a system where every private data record is stored separately by an autonomous node. The examined SGD methods rely only on local computation, and their messages carry only information protected in a differentially private manner. A key middleware service these implementations require is a single random walk service, which maintains exactly one random walk in the face of various failure scenarios. We first propose a robust implementation of the decentralized single random walk service and then experimentally evaluate both the random walk service and the private SGD implementations. Our main conclusion is that the proposed differentially private SGD implementations can approximate the performance of their original noise-free variants in faulty decentralized environments, provided the algorithm parameters are set properly.
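To make the notion of a differentially private SGD update concrete, the following is a minimal sketch of one such step: the per-example gradient is clipped to a bounded L2 norm and perturbed with Gaussian noise before being applied. The logistic-loss setting, the clipping threshold, and the noise scale here are illustrative assumptions for exposition, not the exact protocol evaluated in the paper.

```python
import numpy as np

def dp_sgd_step(w, x, y, lr, clip, sigma, rng):
    """One differentially private SGD step for logistic regression.

    The per-example gradient is clipped to L2 norm `clip`, then Gaussian
    noise with standard deviation `sigma * clip` is added, in the style
    of the Gaussian mechanism.
    """
    margin = y * np.dot(w, x)
    grad = -y * x / (1.0 + np.exp(margin))   # gradient of log(1 + exp(-y w.x))
    norm = np.linalg.norm(grad)
    if norm > clip:                           # bound each example's influence
        grad = grad * (clip / norm)
    noise = rng.normal(0.0, sigma * clip, size=w.shape)
    return w - lr * (grad + noise)

# Toy run: two linearly separable points, label given by the first feature.
rng = np.random.default_rng(0)
w = np.zeros(2)
data = [(np.array([1.0, 0.2]), 1.0), (np.array([-1.0, -0.1]), -1.0)] * 200
for x, y in data:
    w = dp_sgd_step(w, x, y, lr=0.1, clip=1.0, sigma=0.1, rng=rng)
```

With moderate noise the learned model still separates the toy data, which mirrors the abstract's conclusion that, with properly set parameters, the private variants approximate their noise-free counterparts.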

Keywords: decentralized differential privacy, stochastic gradient descent, machine learning, random walks

 

+: Corresponding author: Fabrizio Baiardi
Dipartimento di Informatica, Università di Pisa, Largo B. Pontecorvo 3, 56127 Pisa, Italy,
Tel: +39 050 2212762

 

Journal of Wireless Mobile Networks, Ubiquitous Computing, and Dependable Applications (JoWUA)
Vol. 7, No. 2, pp. 41-59, June 2016