Alignment via Mutual Information

Shinjini Ghosh, Yoon Kim, Ramon Fernandez Astudillo, Tahira Naseem, Jacob Andreas


Abstract
Many language learning tasks require learners to infer correspondences between data in two modalities. Often, these alignments are many-to-many and context-sensitive. For example, translating into morphologically rich languages requires learning not just how words, but morphemes, should be translated; words and morphemes may have different meanings (or groundings) depending on the context in which they are used. We describe an information-theoretic approach to context-sensitive, many-to-many alignment. Our approach first trains a masked sequence model to place distributions over missing spans in (source, target) sequences. Next, it uses this model to compute pointwise mutual information between source and target spans conditional on context. Finally, it aligns spans with high mutual information. We apply this approach to two learning problems: character-based word translation (using alignments for joint morphological segmentation and lexicon learning) and visually grounded reference resolution (using alignments to jointly localize referents and learn word meanings). In both cases, our proposed approach outperforms both structured and neural baselines, showing that conditional mutual information offers an effective framework for formalizing alignment problems in general domains.
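The abstract's three-step recipe (score span pairs by pointwise mutual information conditioned on context, then keep high-scoring pairs) can be sketched in a few lines. Everything below is illustrative: the log-probabilities stand in for the paper's masked sequence model's span distributions, and the function names and toy lexicon are hypothetical, not the authors' implementation.

```python
import math
from itertools import product

def conditional_pmi(log_p_joint, log_p_src, log_p_tgt):
    """PMI(s; t | c) = log p(s, t | c) - log p(s | c) - log p(t | c)."""
    return log_p_joint - (log_p_src + log_p_tgt)

def align(src_spans, tgt_spans, log_p, threshold=0.0):
    """Keep span pairs whose conditional PMI exceeds a threshold.

    `log_p` maps (src_span, tgt_span), (src_span, None), and
    (None, tgt_span) to log-probabilities — stand-ins for the masked
    model's joint and marginal span distributions given the context.
    """
    pairs = []
    for s, t in product(src_spans, tgt_spans):
        pmi = conditional_pmi(log_p[(s, t)], log_p[(s, None)], log_p[(None, t)])
        if pmi > threshold:
            pairs.append((s, t, pmi))
    # Highest-PMI pairs first.
    return sorted(pairs, key=lambda x: -x[2])

# Toy numbers: "dog"/"perro" co-occur far above chance, "dog"/"gato" at chance.
log_p = {
    ("dog", "perro"): math.log(0.09),
    ("dog", None): math.log(0.10),
    (None, "perro"): math.log(0.10),
    ("dog", "gato"): math.log(0.01),
    (None, "gato"): math.log(0.10),
}
print(align(["dog"], ["perro", "gato"], log_p))
```

With these toy numbers, only ("dog", "perro") survives, with PMI = log(0.09 / (0.10 x 0.10)) = log 9. Conditioning on context is what makes the scheme many-to-many and context-sensitive: the same span pair can score differently in different sentences.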
Anthology ID:
2023.conll-1.32
Volume:
Proceedings of the 27th Conference on Computational Natural Language Learning (CoNLL)
Month:
December
Year:
2023
Address:
Singapore
Editors:
Jing Jiang, David Reitter, Shumin Deng
Venue:
CoNLL
Publisher:
Association for Computational Linguistics
Pages:
488–497
URL:
https://aclanthology.org/2023.conll-1.32
DOI:
10.18653/v1/2023.conll-1.32
Cite (ACL):
Shinjini Ghosh, Yoon Kim, Ramon Fernandez Astudillo, Tahira Naseem, and Jacob Andreas. 2023. Alignment via Mutual Information. In Proceedings of the 27th Conference on Computational Natural Language Learning (CoNLL), pages 488–497, Singapore. Association for Computational Linguistics.
Cite (Informal):
Alignment via Mutual Information (Ghosh et al., CoNLL 2023)
PDF:
https://aclanthology.org/2023.conll-1.32.pdf