SecDD: Efficient and Secure Method for Remotely Training Neural Networks (Student Abstract)
Abstract
We leverage what are typically considered the worst qualities of deep learning algorithms - high computational cost, the requirement for large amounts of data, lack of explainability, high dependence on hyper-parameter choice, overfitting, and vulnerability to adversarial perturbations - in order to create a method for the secure and efficient training of remotely deployed neural networks over unsecured channels.
Cite
Text
Sucholutsky and Schonlau. "SecDD: Efficient and Secure Method for Remotely Training Neural Networks (Student Abstract)." AAAI Conference on Artificial Intelligence, 2021. doi:10.1609/AAAI.V35I18.17945
Markdown
[Sucholutsky and Schonlau. "SecDD: Efficient and Secure Method for Remotely Training Neural Networks (Student Abstract)." AAAI Conference on Artificial Intelligence, 2021.](https://mlanthology.org/aaai/2021/sucholutsky2021aaai-secdd/) doi:10.1609/AAAI.V35I18.17945
BibTeX
@inproceedings{sucholutsky2021aaai-secdd,
title = {{SecDD: Efficient and Secure Method for Remotely Training Neural Networks (Student Abstract)}},
author = {Sucholutsky, Ilia and Schonlau, Matthias},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2021},
pages = {15897--15898},
doi = {10.1609/AAAI.V35I18.17945},
url = {https://mlanthology.org/aaai/2021/sucholutsky2021aaai-secdd/}
}