Character Jacobian: Modeling Chinese Character Meanings with Deep Learning Model

Yu-Hsiang Tseng, Shu-Kai Hsieh


Abstract
Compounding, a prevalent word-formation process, presents an interesting challenge for computational models: the relations between compounds and their constituents are often complicated. This is particularly so in Chinese morphology, where each character is almost simultaneously bound and free when treated as a morpheme. To model this word-formation process, we propose the Notch (NOnlinear Transformation of CHaracter embeddings) model and character Jacobians. The Notch model first learns the non-linear relations between constituents and words, and the character Jacobians further describe each character's role in each word. In a series of experiments, we show that the Notch model not only predicts the embeddings of real words from their constituents but also helps account for behavioral data on pseudowords. Moreover, we demonstrate that character Jacobians reflect the characters' meanings. Taken together, the Notch model and character Jacobians may provide a new perspective on studying the word-formation process and morphology with modern deep learning.
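The two ideas in the abstract can be illustrated with a toy sketch: a small nonlinear network maps two constituent-character embeddings to a compound-word embedding, and the "character Jacobian" is the derivative of that output with respect to one character's embedding. This is an assumption-laden illustration, not the paper's actual architecture: the embedding size, the single tanh hidden layer, and the random weights are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # toy embedding dimension (hypothetical; the paper's is larger)

# Hypothetical one-hidden-layer nonlinear map from two character
# embeddings to a word embedding (a Notch-like sketch, not the real model).
W1 = rng.normal(size=(8, 2 * d))
b1 = rng.normal(size=8)
W2 = rng.normal(size=(d, 8))
b2 = rng.normal(size=d)

def notch(c1, c2):
    """Predict a compound-word embedding from its two characters."""
    h = np.tanh(W1 @ np.concatenate([c1, c2]) + b1)
    return W2 @ h + b2

def char_jacobian(c1, c2, which=0):
    """Jacobian of the word embedding w.r.t. one character's embedding,
    computed analytically for the tanh MLP above."""
    x = np.concatenate([c1, c2])
    h = np.tanh(W1 @ x + b1)
    J_full = W2 @ (np.diag(1.0 - h**2) @ W1)      # shape (d, 2d)
    cols = slice(0, d) if which == 0 else slice(d, 2 * d)
    return J_full[:, cols]                         # shape (d, d)

c1, c2 = rng.normal(size=d), rng.normal(size=d)
J = char_jacobian(c1, c2, which=0)

# Sanity check: the analytic Jacobian matches finite differences.
eps = 1e-6
num = np.stack(
    [(notch(c1 + eps * np.eye(d)[i], c2) - notch(c1, c2)) / eps
     for i in range(d)],
    axis=1,
)
assert J.shape == (d, d)
assert np.allclose(J, num, atol=1e-4)
```

In this reading, each word contributes one Jacobian per constituent character, and comparing a character's Jacobians across the words it appears in is what lets them "describe the character's role in each word."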
Anthology ID: 2022.coling-1.14
Volume: Proceedings of the 29th International Conference on Computational Linguistics
Month: October
Year: 2022
Address: Gyeongju, Republic of Korea
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 152–162
URL: https://aclanthology.org/2022.coling-1.14
Cite (ACL): Yu-Hsiang Tseng and Shu-Kai Hsieh. 2022. Character Jacobian: Modeling Chinese Character Meanings with Deep Learning Model. In Proceedings of the 29th International Conference on Computational Linguistics, pages 152–162, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal): Character Jacobian: Modeling Chinese Character Meanings with Deep Learning Model (Tseng & Hsieh, COLING 2022)
PDF: https://aclanthology.org/2022.coling-1.14.pdf