Michael Kayser


2020

Compressing Transformer-Based Semantic Parsing Models using Compositional Code Embeddings
Prafull Prakash | Saurabh Kumar Shashidhar | Wenlong Zhao | Subendhu Rongali | Haidar Khan | Michael Kayser
Findings of the Association for Computational Linguistics: EMNLP 2020

The current state-of-the-art task-oriented semantic parsing models use BERT or RoBERTa as pretrained encoders; these models have huge memory footprints. This poses a challenge to their deployment for voice assistants such as Amazon Alexa and Google Assistant on edge devices with limited memory budgets. We propose to learn compositional code embeddings to greatly reduce the sizes of BERT-base and RoBERTa-base. We also apply the technique to DistilBERT, ALBERT-base, and ALBERT-large, three already compressed BERT variants which attain similar state-of-the-art performance on semantic parsing with much smaller model sizes. We observe 95.15%–98.46% embedding compression rates and 20.47%–34.22% encoder compression rates, while preserving >97.5% of semantic parsing performance. We provide the recipe for training and analyze the trade-off between code embedding sizes and downstream performance.
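
For intuition, here is a minimal numpy sketch of the compositional-code idea: each vocabulary item stores M small integer codes instead of a dense vector, and its embedding is reconstructed as the sum of one vector from each of M shared codebooks. The sizes V, D, M, and K below are illustrative choices, not the paper's settings, and the random codebooks stand in for parameters that are learned end-to-end.

```python
import numpy as np

# Illustrative sizes (not the paper's exact configuration):
V, D = 30_000, 768   # vocabulary size and embedding dimension (BERT-base-like)
M, K = 32, 16        # M codebooks, each holding K candidate vectors

rng = np.random.default_rng(0)

# M codebooks of shape (K, D); in the paper these are learned, not random.
codebooks = rng.normal(size=(M, K, D)).astype(np.float32)

# Each word is represented by M discrete codes, one index per codebook.
codes = rng.integers(0, K, size=(V, M), dtype=np.int8)

def embed(word_id: int) -> np.ndarray:
    """Reconstruct a word embedding as the sum of one row per codebook."""
    return codebooks[np.arange(M), codes[word_id]].sum(axis=0)

# Rough storage comparison: a dense float32 table versus the codebooks
# plus ideally packed codes (log2(K) bits per code).
dense_bytes = V * D * 4
coded_bytes = M * K * D * 4 + V * M * (np.log2(K) / 8)
print(f"dense: {dense_bytes/1e6:.1f} MB, coded: {coded_bytes/1e6:.1f} MB")
```

With these toy numbers the coded form is roughly 2 MB versus 92 MB for the dense table, which is the arithmetic behind embedding compression rates in the 95–98% range the abstract reports.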

2018

The Alexa Meaning Representation Language
Thomas Kollar | Danielle Berry | Lauren Stuart | Karolina Owczarzak | Tagyoung Chung | Lambert Mathias | Michael Kayser | Bradford Snow | Spyros Matsoukas
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 3 (Industry Papers)

This paper introduces a meaning representation for spoken language understanding. The Alexa Meaning Representation Language (AMRL), unlike previous approaches, which factor spoken utterances into domains, provides a common representation for how people communicate in spoken language. AMRL is a rooted graph that links to a large-scale ontology and supports cross-domain queries, fine-grained types, complex utterances, and composition. A spoken language dataset containing ∼20k examples across eight domains has been collected for Alexa. A version of this meaning representation was released to developers at a trade show in 2016.
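
As a rough illustration of the rooted-graph idea (not the paper's actual AMRL notation; the node types and role names below are hypothetical, loosely schema.org-flavored stand-ins), an utterance like "play the beatles" might be represented as an action node whose arguments are typed entities connected by labeled roles:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    node_type: str                              # fine-grained ontology type
    properties: dict = field(default_factory=dict)  # literal-valued slots
    edges: dict = field(default_factory=dict)       # role -> child Node

# Hypothetical rooted graph for "play the beatles": the root is an action
# whose object is a typed entity that itself links to another entity.
root = Node("PlaybackAction", edges={
    "object": Node("MusicRecording", edges={
        "byArtist": Node("MusicGroup", properties={"name": "the beatles"}),
    }),
})

def show(node: Node, depth: int = 0) -> None:
    """Print the graph depth-first, one role per line."""
    pad = "  " * depth
    print(f"{pad}{node.node_type} {node.properties or ''}")
    for role, child in node.edges.items():
        print(f"{pad}  :{role}")
        show(child, depth + 2)

show(root)
```

Composition falls out naturally in such a scheme: any child node can itself be a full subgraph, which is what lets a single representation cover complex, cross-domain utterances.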

2015

Deep Neural Language Models for Machine Translation
Thang Luong | Michael Kayser | Christopher D. Manning
Proceedings of the Nineteenth Conference on Computational Natural Language Learning

2014

Faster Phrase-Based Decoding by Refining Feature State
Kenneth Heafield | Michael Kayser | Christopher D. Manning
Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)

2012

Unsupervised Morphology Rivals Supervised Morphology for Arabic MT
David Stallard | Jacob Devlin | Michael Kayser | Yoong Keok Lee | Regina Barzilay
Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)