NNOSE: Nearest Neighbor Occupational Skill Extraction

Mike Zhang, Rob van der Goot, Min-Yen Kan, Barbara Plank


Abstract
The labor market is changing rapidly, prompting increased interest in the automatic extraction of occupational skills from text. With the advent of English benchmark job description datasets, there is a need for systems that handle their diversity well. We tackle the complexity of occupational skill datasets on three fronts: combining and leveraging multiple datasets for skill extraction, identifying rarely observed skills within a dataset, and overcoming the scarcity of skills across datasets. In particular, we investigate retrieval-augmentation of language models, employing an external datastore to retrieve similar skills in a dataset-unifying manner. Our proposed method, Nearest Neighbor Occupational Skill Extraction (NNOSE), effectively leverages multiple datasets by retrieving neighboring skills from other datasets in the datastore. This improves skill extraction without additional fine-tuning. Crucially, we observe a performance gain in predicting infrequent patterns, with substantial gains of up to 30% span-F1 in cross-dataset settings.
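The abstract describes NNOSE only at a high level; the paper itself specifies the datastore construction and interpolation. As a rough illustration of the general idea, the following is a minimal sketch of a kNN-augmented token tagger: token embeddings and skill labels pooled from several datasets form one shared datastore, and at inference the tagger's softmax is interpolated with a distribution over the labels of retrieved neighbors. All function names, the distance/weighting choices, and the interpolation weight below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def build_datastore(embedding_batches, label_batches):
    """Concatenate token embeddings and their skill-label ids from several
    datasets into one shared datastore of (key, value) pairs."""
    keys = np.vstack(embedding_batches)      # (N, d) contextual token representations
    values = np.concatenate(label_batches)   # (N,) label ids (e.g. O / Skill)
    return keys, values

def knn_label_distribution(query, keys, values, num_labels, k=8, temperature=1.0):
    """Turn the k nearest datastore entries into a label distribution,
    weighting each neighbor by softmax(-distance / temperature)."""
    dists = np.linalg.norm(keys - query, axis=1)   # L2 distance to every key
    nearest = np.argsort(dists)[:k]                # indices of the k closest keys
    weights = np.exp(-dists[nearest] / temperature)
    weights /= weights.sum()
    dist = np.zeros(num_labels)
    for idx, w in zip(nearest, weights):
        dist[values[idx]] += w                     # accumulate weight per retrieved label
    return dist

def interpolate(model_probs, knn_probs, lam=0.3):
    """Mix the tagger's own softmax with the retrieved kNN distribution;
    lam controls how much weight the datastore receives."""
    return lam * knn_probs + (1 - lam) * model_probs

# Toy usage: a 2-label problem (0 = O, 1 = Skill) with a 4-dimensional encoder.
rng = np.random.default_rng(0)
keys, values = build_datastore(
    [rng.normal(size=(50, 4)), rng.normal(loc=1.0, size=(50, 4))],  # two "datasets"
    [np.zeros(50, dtype=int), np.ones(50, dtype=int)],
)
query = rng.normal(loc=1.0, size=4)               # embedding of a token to tag
knn_probs = knn_label_distribution(query, keys, values, num_labels=2)
final = interpolate(np.array([0.6, 0.4]), knn_probs)
print(final, final.argmax())
```

Because the datastore is just a flat set of (embedding, label) pairs, entries from any number of datasets can be mixed without retraining the tagger, which is the dataset-unifying property the abstract emphasizes.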
Anthology ID:
2024.eacl-long.35
Volume:
Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
589–608
URL:
https://aclanthology.org/2024.eacl-long.35
Cite (ACL):
Mike Zhang, Rob van der Goot, Min-Yen Kan, and Barbara Plank. 2024. NNOSE: Nearest Neighbor Occupational Skill Extraction. In Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers), pages 589–608, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
NNOSE: Nearest Neighbor Occupational Skill Extraction (Zhang et al., EACL 2024)
PDF:
https://aclanthology.org/2024.eacl-long.35.pdf