%0 Conference Proceedings
%T Weakly Supervised Text-to-SQL Parsing through Question Decomposition
%A Wolfson, Tomer
%A Deutch, Daniel
%A Berant, Jonathan
%Y Carpuat, Marine
%Y de Marneffe, Marie-Catherine
%Y Meza Ruiz, Ivan Vladimir
%S Findings of the Association for Computational Linguistics: NAACL 2022
%D 2022
%8 July
%I Association for Computational Linguistics
%C Seattle, United States
%F wolfson-etal-2022-weakly
%X Text-to-SQL parsers are crucial in enabling non-experts to effortlessly query relational data. Training such parsers, by contrast, generally requires expertise in annotating natural language (NL) utterances with corresponding SQL queries. In this work, we propose a weak supervision approach for training text-to-SQL parsers. We take advantage of the recently proposed question meaning representation called QDMR, an intermediate between NL and formal query languages. Given questions, their QDMR structures (annotated by non-experts or automatically predicted), and the answers, we are able to automatically synthesize SQL queries that are used to train text-to-SQL models. We test our approach by experimenting on five benchmark datasets. Our results show that the weakly supervised models perform competitively with those trained on annotated NL-SQL data. Overall, we effectively train text-to-SQL parsers, while using zero SQL annotations.
%R 10.18653/v1/2022.findings-naacl.193
%U https://aclanthology.org/2022.findings-naacl.193
%U https://doi.org/10.18653/v1/2022.findings-naacl.193
%P 2528-2542