%0 Conference Proceedings
%T Structurally Diverse Sampling for Sample-Efficient Training and Comprehensive Evaluation
%A Gupta, Shivanshu
%A Singh, Sameer
%A Gardner, Matt
%Y Goldberg, Yoav
%Y Kozareva, Zornitsa
%Y Zhang, Yue
%S Findings of the Association for Computational Linguistics: EMNLP 2022
%D 2022
%8 December
%I Association for Computational Linguistics
%C Abu Dhabi, United Arab Emirates
%F gupta-etal-2022-structurally
%X A growing body of research has demonstrated the inability of NLP models to generalize compositionally and has tried to alleviate it through specialized architectures, training schemes, and data augmentation, among other approaches. In this work, we study a different approach: training on instances with diverse structures. We propose a model-agnostic algorithm for subsampling such sets of instances from a labeled instance pool with structured outputs. Evaluating on both compositional template splits and traditional IID splits of 5 semantic parsing datasets of varying complexity, we show that structurally diverse training using our algorithm leads to comparable or better generalization than prior algorithms in 9 out of 10 dataset-split type pairs. In general, we find structural diversity to consistently improve sample efficiency compared to random train sets. Moreover, we show that structurally diverse sampling yields comprehensive test sets that are a lot more challenging than IID test sets. Finally, we provide two explanations for improved generalization from diverse train sets: 1) improved coverage of output substructures, and 2) a reduction in spurious correlations between these substructures.
%R 10.18653/v1/2022.findings-emnlp.365
%U https://aclanthology.org/2022.findings-emnlp.365
%U https://doi.org/10.18653/v1/2022.findings-emnlp.365
%P 4966-4979