%0 Conference Proceedings %T CxLM: A Construction and Context-aware Language Model %A Tseng, Yu-Hsiang %A Shih, Cing-Fang %A Chen, Pin-Er %A Chou, Hsin-Yu %A Ku, Mao-Chang %A Hsieh, Shu-Kai %Y Calzolari, Nicoletta %Y Béchet, Frédéric %Y Blache, Philippe %Y Choukri, Khalid %Y Cieri, Christopher %Y Declerck, Thierry %Y Goggi, Sara %Y Isahara, Hitoshi %Y Maegaard, Bente %Y Mariani, Joseph %Y Mazo, Hélène %Y Odijk, Jan %Y Piperidis, Stelios %S Proceedings of the Thirteenth Language Resources and Evaluation Conference %D 2022 %8 June %I European Language Resources Association %C Marseille, France %F tseng-etal-2022-cxlm %X Constructions are direct form-meaning pairs with possible schematic slots. These slots are constrained simultaneously by the embedded construction itself and by the sentential context. We propose that this constraint can be described by a conditional probability distribution. However, as this conditional probability is inevitably complex, we use language models to capture the distribution. To that end, we build CxLM, a deep learning-based masked language model explicitly tuned to constructions' schematic slots. We first compile a construction dataset consisting of over ten thousand constructions in Taiwan Mandarin. Next, we conduct an experiment on the dataset to examine to what extent a pretrained masked language model is aware of the constructions. We then fine-tune the model specifically to perform a cloze task on the open slots. We find that the fine-tuned model predicts masked slots more accurately than the baselines and generates word samples that are both structurally and semantically plausible. Finally, we release CxLM and its dataset as publicly available resources, in the hope that they will serve as new quantitative tools for studying construction grammar. %U https://aclanthology.org/2022.lrec-1.683 %P 6361-6369