Difficult for Whom? A Study of Japanese Lexical Complexity

Adam Nohejl, Akio Hayakawa, Yusuke Ide, Taro Watanabe


Abstract
The tasks of lexical complexity prediction (LCP) and complex word identification (CWI) commonly presuppose that difficult-to-understand words are shared by the target population. Meanwhile, personalization methods have also been proposed to adapt models to individual needs. We verify that a recent Japanese LCP dataset is representative of its target population by partially replicating its annotation. Through a second reannotation, we show that native Chinese speakers perceive complexity differently due to Sino-Japanese vocabulary. To explore the possibilities of personalization, we compare competitive baselines trained on group mean ratings and on individual ratings in terms of their performance for an individual. We show that a model trained on group means performs comparably to an individual model on the CWI task, while achieving good LCP performance for an individual is difficult. We also experiment with adapting a fine-tuned BERT model, which yields only marginal improvements across all settings.
Anthology ID:
2024.tsar-1.8
Volume:
Proceedings of the Third Workshop on Text Simplification, Accessibility and Readability (TSAR 2024)
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Matthew Shardlow, Horacio Saggion, Fernando Alva-Manchego, Marcos Zampieri, Kai North, Sanja Štajner, Regina Stodden
Venue:
TSAR
Publisher:
Association for Computational Linguistics
Pages:
69–81
URL:
https://aclanthology.org/2024.tsar-1.8
Cite (ACL):
Adam Nohejl, Akio Hayakawa, Yusuke Ide, and Taro Watanabe. 2024. Difficult for Whom? A Study of Japanese Lexical Complexity. In Proceedings of the Third Workshop on Text Simplification, Accessibility and Readability (TSAR 2024), pages 69–81, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Difficult for Whom? A Study of Japanese Lexical Complexity (Nohejl et al., TSAR 2024)
PDF:
https://aclanthology.org/2024.tsar-1.8.pdf