SkillSpan: Hard and Soft Skill Extraction from English Job Postings

Mike Zhang, Kristian Jensen, Sif Sonniks, Barbara Plank


Abstract
Skill Extraction (SE) is an important and widely-studied task useful to gain insights into labor market dynamics. However, there is a lacuna of datasets and annotation guidelines; available datasets are few and contain crowd-sourced labels on the span-level or labels from a predefined skill inventory. To address this gap, we introduce SKILLSPAN, a novel SE dataset consisting of 14.5K sentences and over 12.5K annotated spans. We release its respective guidelines created over three different sources annotated for hard and soft skills by domain experts. We introduce a BERT baseline (Devlin et al., 2019). To improve upon this baseline, we experiment with language models that are optimized for long spans (Joshi et al., 2020; Beltagy et al., 2020), continuous pre-training on the job posting domain (Han and Eisenstein, 2019; Gururangan et al., 2020), and multi-task learning (Caruana, 1997). Our results show that the domain-adapted models significantly outperform their non-adapted counterparts, and single-task outperforms multi-task learning.
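The abstract frames skill extraction as span identification with a BERT baseline. A minimal sketch of such a token-classification setup is shown below; it is not the authors' code, and the checkpoint name and BIO label set (skill vs. knowledge spans) are illustrative assumptions. A model would still need to be fine-tuned on SkillSpan before its predictions are meaningful.

```python
# Minimal sketch (assumed setup, not the authors' implementation):
# skill-span extraction cast as BIO token classification with BERT.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Assumed label scheme: separate spans for (soft) skills and (hard) knowledge.
labels = ["O", "B-SKILL", "I-SKILL", "B-KNOWLEDGE", "I-KNOWLEDGE"]

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-cased", num_labels=len(labels)
)  # classification head is randomly initialized; fine-tune on SkillSpan first

sentence = "Strong communication skills and experience with Python are required."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits      # shape: (1, seq_len, num_labels)
pred_ids = logits.argmax(dim=-1)[0]      # one predicted label id per subword

for token, label_id in zip(
    tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]), pred_ids.tolist()
):
    print(f"{token}\t{labels[label_id]}")
```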
Anthology ID:
2022.naacl-main.366
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
4962–4984
URL:
https://aclanthology.org/2022.naacl-main.366
DOI:
10.18653/v1/2022.naacl-main.366
Cite (ACL):
Mike Zhang, Kristian Jensen, Sif Sonniks, and Barbara Plank. 2022. SkillSpan: Hard and Soft Skill Extraction from English Job Postings. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 4962–4984, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
SkillSpan: Hard and Soft Skill Extraction from English Job Postings (Zhang et al., NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-main.366.pdf
Video:
https://aclanthology.org/2022.naacl-main.366.mp4
Code
Kaleidophon/deep-significance (+ additional community code)
Data
SkillSpan