Performance evaluation of a differentially-private neural network for cloud computing
ND Hoefer, SAS Monroy - 2018 IEEE International Conference on Big Data (Big Data), 2018 - ieeexplore.ieee.org
Due to the large computational cost of data classification using deep learning, resource-limited devices, e.g., smartphones and PCs, offload their classification tasks to a cloud server, which offers extensive hardware resources. Unfortunately, since the cloud is an untrusted third party, users may be reluctant to share their private data with the cloud for data classification. Differential privacy has been proposed as a way of securely classifying data at the cloud using deep learning. In this approach, users conceal their data before uploading it to the cloud using a local obfuscation deep learning model, which is based on the data classification model hosted by the cloud. However, because the obfuscation model assumes that the pre-trained model at the cloud is static, it suffers significant performance degradation under realistic classification models that are constantly being updated. In this paper, we investigate the performance of differentially-private data classification under a dynamic pre-trained model and a constant obfuscation model. We find that the classification performance decreases as the pre-trained model evolves. We then investigate the classification performance under an obfuscation model that is updated alongside the pre-trained model. We find that, with modest computational effort, the obfuscation model can be updated to significantly improve the classification performance under a dynamic pre-trained model.
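The local-obfuscation workflow the abstract describes (the device runs an obfuscation model derived from the cloud's pre-trained classifier, perturbs the intermediate result for differential privacy, and uploads only the obfuscated representation) can be illustrated with a short sketch. This is not the paper's implementation: the split point, the Laplace noise calibration, and all names (make_pretrained_model, obfuscate, epsilon, clip) are illustrative assumptions.

```python
# Minimal sketch of split inference with a differentially-private obfuscation step.
# The device runs the first layers of the shared pre-trained model, clips and
# perturbs the intermediate features with Laplace noise, and uploads only the
# noisy features; the cloud runs the remaining layers. Illustrative only.

import torch
import torch.nn as nn

def make_pretrained_model():
    # Stand-in for the cloud's pre-trained classifier (hypothetical architecture).
    return nn.Sequential(
        nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),    # first layers: copied to the device
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),   # remaining layers: stay in the cloud
        nn.Flatten(),
        nn.Linear(32 * 28 * 28, 10),
    )

def obfuscate(x, local_layers, epsilon=1.0, clip=1.0):
    """Run the device-side layers and add calibrated Laplace noise to the output."""
    with torch.no_grad():
        features = local_layers(x)
        # Clipping bounds each activation, giving a rough per-entry sensitivity of 2*clip.
        features = torch.clamp(features, -clip, clip)
        scale = 2.0 * clip / epsilon   # illustrative Laplace calibration, not the paper's
        noise = torch.distributions.Laplace(0.0, scale).sample(features.shape)
        return features + noise

model = make_pretrained_model()
local_layers = model[:2]   # device-side obfuscation model (copy of the first layers)
cloud_layers = model[2:]   # cloud-side classifier

x = torch.randn(1, 1, 28, 28)                 # dummy input image
noisy_features = obfuscate(x, local_layers)   # uploaded instead of the raw data
logits = cloud_layers(noisy_features)         # cloud completes the classification
```

Keeping local_layers a verbatim copy of the first layers of the cloud model mirrors the dependence the paper studies: once the cloud retrains, a stale device-side copy degrades accuracy until the obfuscation model is refreshed alongside the pre-trained model.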