Few-Shot Coreference Resolution with Semantic Difficulty Metrics and In-Context Learning

Nguyen Xuan Phuc, Dang Van Thin


Abstract
This paper presents our submission to the CRAC 2025 Shared Task on Multilingual Coreference Resolution in the LLM track. We propose a prompt-based few-shot coreference resolution system where the final inference is performed by Grok-3 using in-context learning. The core of our methodology is a difficulty-aware sample selection pipeline that leverages Gemini Flash 2.0 to compute semantic difficulty metrics, including mention dissimilarity and pronoun ambiguity. By identifying and selecting the most challenging training samples for each language, we construct highly informative prompts to guide Grok-3 in predicting coreference chains and reconstructing zero anaphora. Our approach secured 3rd place in the CRAC 2025 shared task.
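To illustrate the selection step the abstract describes, below is a minimal sketch of difficulty-aware few-shot example selection. In the paper the two metrics are computed with Gemini Flash 2.0; here they are stand-in fields, and the score ranges, equal weighting, and sample names are assumptions, not the authors' implementation.

```python
# Hypothetical sketch: pick the hardest training samples as in-context demonstrations.
# mention_dissimilarity and pronoun_ambiguity are assumed to be precomputed scores in [0, 1].
from dataclasses import dataclass


@dataclass
class Sample:
    doc_id: str
    text: str
    mention_dissimilarity: float  # higher = mentions of the same entity look less alike
    pronoun_ambiguity: float      # higher = pronouns have more plausible antecedents


def difficulty(s: Sample, w_mention: float = 0.5, w_pronoun: float = 0.5) -> float:
    """Combine the two semantic difficulty metrics into one score (assumed equal weighting)."""
    return w_mention * s.mention_dissimilarity + w_pronoun * s.pronoun_ambiguity


def select_few_shot(samples: list[Sample], k: int = 3) -> list[Sample]:
    """Return the k most difficult samples, to be formatted into the few-shot prompt."""
    return sorted(samples, key=difficulty, reverse=True)[:k]


if __name__ == "__main__":
    pool = [
        Sample("doc-001", "...", 0.82, 0.91),
        Sample("doc-002", "...", 0.40, 0.35),
        Sample("doc-003", "...", 0.77, 0.60),
    ]
    for s in select_few_shot(pool, k=2):
        print(s.doc_id, round(difficulty(s), 3))
```

In the paper, selection is done per language, so a pool like the one above would be built separately for each language's training data before constructing the prompt for Grok-3.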
Anthology ID:
2025.crac-1.13
Volume:
Proceedings of the Eighth Workshop on Computational Models of Reference, Anaphora and Coreference
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Maciej Ogrodniczuk, Michal Novak, Massimo Poesio, Sameer Pradhan, Vincent Ng
Venue:
CRAC
Publisher:
Association for Computational Linguistics
Pages:
149–153
URL:
https://aclanthology.org/2025.crac-1.13/
Cite (ACL):
Nguyen Xuan Phuc and Dang Van Thin. 2025. Few-Shot Coreference Resolution with Semantic Difficulty Metrics and In-Context Learning. In Proceedings of the Eighth Workshop on Computational Models of Reference, Anaphora and Coreference, pages 149–153, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Few-Shot Coreference Resolution with Semantic Difficulty Metrics and In-Context Learning (Phuc & Thin, CRAC 2025)
PDF:
https://aclanthology.org/2025.crac-1.13.pdf