Enhancing Pre-Trained Generative Language Models with Question Attended Span Extraction on Machine Reading Comprehension

Lin Ai, Zheng Hui, Zizhou Liu, Julia Hirschberg


Anthology ID:
2024.emnlp-main.560
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
10046–10063
URL:
https://aclanthology.org/2024.emnlp-main.560/
DOI:
10.18653/v1/2024.emnlp-main.560
Cite (ACL):
Lin Ai, Zheng Hui, Zizhou Liu, and Julia Hirschberg. 2024. Enhancing Pre-Trained Generative Language Models with Question Attended Span Extraction on Machine Reading Comprehension. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 10046–10063, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Enhancing Pre-Trained Generative Language Models with Question Attended Span Extraction on Machine Reading Comprehension (Ai et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.560.pdf
Software:
2024.emnlp-main.560.software.zip