Distributed Outsourced Privacy‐Preserving Gradient Descent Methods among Multiple Parties
Z Tan, H Zhang, P Hu, R Gao - Security and Communication Networks, 2021 - Wiley Online Library
The Internet of Things (IoT) is one of the latest evolutions of the internet. Cloud computing is an important technique that meets the computational demands of large numbers of distributed IoT devices/sensors by employing various machine learning models. Gradient descent methods are widely used to find the optimal coefficients of a machine learning model in cloud computing. Commonly, the data are distributed among multiple data owners, whereas the target function is held by the model owner. The model owner can train its model over the data owners' data and provide predictions. However, the confidentiality of the dataset or of the target function may not be preserved during computation, so security threats and privacy risks arise. To address these data and model privacy concerns, we present two new outsourced privacy-preserving gradient descent (OPPGD) schemes over horizontally and vertically partitioned data among multiple parties, respectively. Compared to previously proposed solutions, our methods are more comprehensive and apply to a more general setting: data privacy and model privacy are preserved throughout the learning and prediction procedures. In addition, the performance evaluation demonstrates that our schemes help the model owner optimize its target function and provide exact predictions with high efficiency and accuracy.
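For intuition only, the following is a minimal Python sketch (not from the paper) of plain gradient descent over horizontally partitioned data: each data owner computes a local gradient on its own rows and the model owner aggregates them. The paper's OPPGD schemes additionally outsource this computation and protect both the data and the model, which this toy example does not attempt; all function names and parameters below are illustrative assumptions.

    import numpy as np

    # Plain (non-private) gradient descent for linear regression over horizontally
    # partitioned data: each data owner holds a disjoint subset of the rows.
    # This only illustrates the learning objective; the paper's OPPGD schemes add
    # outsourcing and privacy protection, which are not shown here.

    def local_gradient(X_i, y_i, w):
        """Gradient of the mean squared loss on one data owner's partition."""
        residual = X_i @ w - y_i
        return X_i.T @ residual / len(y_i)

    def train(partitions, w, lr=0.1, epochs=200):
        """Model owner iteratively aggregates sample-weighted per-owner gradients."""
        n_total = sum(len(y_i) for _, y_i in partitions)
        for _ in range(epochs):
            grad = sum(len(y_i) / n_total * local_gradient(X_i, y_i, w)
                       for X_i, y_i in partitions)
            w = w - lr * grad
        return w

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        w_true = np.array([2.0, -1.0, 0.5])
        X = rng.normal(size=(300, 3))
        y = X @ w_true + 0.01 * rng.normal(size=300)
        # Horizontal partition: three data owners, 100 samples each.
        partitions = [(X[i:i + 100], y[i:i + 100]) for i in range(0, 300, 100)]
        w = train(partitions, np.zeros(3))
        print("recovered coefficients:", np.round(w, 3))

In the vertically partitioned setting considered by the paper, the owners instead hold disjoint feature columns of the same records, so the gradient cannot be split by rows in this simple way and the aggregation step differs.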