%0 Conference Proceedings
%T CRYPTOGRU: Low Latency Privacy-Preserving Text Analysis With GRU
%A Feng, Bo
%A Lou, Qian
%A Jiang, Lei
%A Fox, Geoffrey
%Y Moens, Marie-Francine
%Y Huang, Xuanjing
%Y Specia, Lucia
%Y Yih, Scott Wen-tau
%S Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
%D 2021
%8 November
%I Association for Computational Linguistics
%C Online and Punta Cana, Dominican Republic
%F feng-etal-2021-cryptogru
%X Homomorphic encryption (HE) and garbled circuits (GC) provide protection for users' privacy. However, simply mixing HE and GC in RNN models suffers from long inference latency due to slow activation functions. In this paper, we present CryptoGRU, a novel hybrid HE and GC gated recurrent unit (GRU) network for low-latency secure inference. CryptoGRU replaces the computationally expensive GC-based tanh with a fast GC-based ReLU, and then quantizes sigmoid and ReLU to smaller bit-lengths to accelerate activations in a GRU. We evaluate CryptoGRU with multiple GRU models trained on four public datasets. Experimental results show that CryptoGRU achieves top-notch accuracy and improves secure inference latency by up to 138× over one of the state-of-the-art secure networks on the Penn Treebank dataset.
%R 10.18653/v1/2021.emnlp-main.156
%U https://aclanthology.org/2021.emnlp-main.156
%U https://doi.org/10.18653/v1/2021.emnlp-main.156
%P 2052-2057