%0 Conference Proceedings %T Compositional Task-Oriented Parsing as Abstractive Question Answering %A Zhao, Wenting %A Arkoudas, Konstantine %A Sun, Weiqi %A Cardie, Claire %Y Carpuat, Marine %Y de Marneffe, Marie-Catherine %Y Meza Ruiz, Ivan Vladimir %S Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies %D 2022 %8 July %I Association for Computational Linguistics %C Seattle, United States %F zhao-etal-2022-compositional %X Task-oriented parsing (TOP) aims to convert natural language into machine-readable representations of specific tasks, such as setting an alarm. A popular approach to TOP is to apply seq2seq models to generate linearized parse trees. A more recent line of work argues that pretrained seq2seq models are better at generating outputs that are themselves natural language, so they replace linearized parse trees with canonical natural-language paraphrases that can then be easily translated into parse trees, resulting in so-called naturalized parsers. In this work we continue to explore naturalized semantic parsing by presenting a general reduction of TOP to abstractive question answering that overcomes some limitations of canonical paraphrasing. Experimental results show that our QA-based technique outperforms state-of-the-art methods in full-data settings while achieving dramatic improvements in few-shot settings. %R 10.18653/v1/2022.naacl-main.328 %U https://aclanthology.org/2022.naacl-main.328 %U https://doi.org/10.18653/v1/2022.naacl-main.328 %P 4418-4427