Building OCR/NER Test Collections

Dawn Lawrie, James Mayfield, David Etter


Abstract
Named entity recognition (NER) identifies spans of text that contain names. Over the past two decades, many researchers have reported NER results on text produced by optical character recognition (OCR). Unfortunately, the test collections that support this research are annotated with named entities only after OCR has been run, so the collection must be re-annotated whenever the OCR output changes. Instead, by tying annotations to character locations on the page, a collection can be built that supports both OCR and NER research without requiring re-annotation when either improves. In this approach, named entities are annotated on the transcribed text, and the transcription alone suffices to evaluate OCR performance. For NER evaluation, the tagged OCR output is aligned to the transcriptions, producing modified versions of each file that are then scored. This paper presents a methodology for building such a test collection and releases a collection of Chinese OCR-NER data constructed using that methodology. The paper also provides performance baselines for current OCR and NER systems applied to the new collection.
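
The alignment step described in the abstract can be illustrated with a minimal sketch, assuming gold entity spans anchored to the transcription and using character-level alignment via Python's difflib.SequenceMatcher. The Span class and project_spans function are hypothetical names for illustration only, not the authors' released tooling.

# Minimal sketch (not the paper's actual implementation) of projecting
# transcription-anchored entity spans onto OCR output via a character
# alignment, so NER run on OCR text can be scored against gold annotations.
from dataclasses import dataclass
from difflib import SequenceMatcher
from typing import List, Optional

@dataclass
class Span:
    start: int   # character offset in the transcription
    end: int     # exclusive end offset
    label: str   # entity type, e.g. "PER", "LOC"

def project_spans(transcription: str, ocr_text: str,
                  gold_spans: List[Span]) -> List[Optional[Span]]:
    """Map gold spans from the transcription onto the OCR text.

    Builds a character-level alignment and returns, for each gold span,
    the corresponding span in the OCR text, or None if the span's
    characters were lost entirely by OCR.
    """
    # char_map[i] = index in ocr_text aligned to transcription[i], or None
    char_map: List[Optional[int]] = [None] * len(transcription)
    matcher = SequenceMatcher(a=transcription, b=ocr_text, autojunk=False)
    for a0, b0, size in matcher.get_matching_blocks():
        for k in range(size):
            char_map[a0 + k] = b0 + k

    projected: List[Optional[Span]] = []
    for span in gold_spans:
        aligned = [char_map[i] for i in range(span.start, span.end)
                   if char_map[i] is not None]
        if aligned:
            projected.append(Span(min(aligned), max(aligned) + 1, span.label))
        else:
            projected.append(None)  # entity text not recovered by OCR
    return projected

if __name__ == "__main__":
    gold = "Barack Obama visited Paris."
    ocr = "Barack 0bama visited Pari5."   # simulated OCR noise
    spans = [Span(0, 12, "PER"), Span(21, 26, "LOC")]
    print(project_spans(gold, ocr, spans))

Because the gold annotations are anchored to the transcription rather than to any particular OCR output, the same projection can be rerun whenever the OCR system improves, without re-annotation.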
Anthology ID:
2020.lrec-1.570
Volume:
Proceedings of the Twelfth Language Resources and Evaluation Conference
Month:
May
Year:
2020
Address:
Marseille, France
Editors:
Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Asuncion Moreno, Jan Odijk, Stelios Piperidis
Venue:
LREC
Publisher:
European Language Resources Association
Pages:
4639–4646
Language:
English
URL:
https://aclanthology.org/2020.lrec-1.570
Cite (ACL):
Dawn Lawrie, James Mayfield, and David Etter. 2020. Building OCR/NER Test Collections. In Proceedings of the Twelfth Language Resources and Evaluation Conference, pages 4639–4646, Marseille, France. European Language Resources Association.
Cite (Informal):
Building OCR/NER Test Collections (Lawrie et al., LREC 2020)
PDF:
https://aclanthology.org/2020.lrec-1.570.pdf