Jacob Solawetz


2024

Arcee’s MergeKit: A Toolkit for Merging Large Language Models
Charles Goddard | Shamane Siriwardhana | Malikeh Ehghaghi | Luke Meyers | Vladimir Karpukhin | Brian Benedict | Mark McQuade | Jacob Solawetz
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing: Industry Track

The rapid growth of open-source language models provides the opportunity to merge model checkpoints, combining their parameters to improve performance and versatility. Advances in transfer learning have led to numerous task-specific models, which model merging can integrate into powerful multitask models without additional training. MergeKit is an open-source library designed to support this process with an efficient and extensible framework suitable for any hardware. It has facilitated the merging of thousands of models, contributing to some of the world’s most powerful open-source model checkpoints. The library is accessible at: https://github.com/arcee-ai/mergekit.
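To make "combining their parameters" concrete, here is a toy sketch of linear (weighted-average) merging, one of the simplest strategies a toolkit like MergeKit supports. The checkpoint format (plain dicts of parameter lists) and the function name are simplifications for illustration, not MergeKit's actual API.

```python
# Toy illustration of linear model merging: average each parameter
# across checkpoints, weighted per model. Checkpoints must share one
# architecture so parameter names and shapes line up.

def merge_linear(checkpoints, weights):
    """Merge parameter dicts by weighted averaging, parameter by parameter."""
    total = sum(weights)
    merged = {}
    for name in checkpoints[0]:
        n = len(checkpoints[0][name])
        merged[name] = [
            sum(w * ckpt[name][i] for w, ckpt in zip(weights, checkpoints)) / total
            for i in range(n)
        ]
    return merged

# Two hypothetical task-specific checkpoints of the same architecture.
model_a = {"layer.weight": [1.0, 2.0, 3.0]}
model_b = {"layer.weight": [3.0, 4.0, 5.0]}

merged = merge_linear([model_a, model_b], weights=[0.5, 0.5])
# merged["layer.weight"] -> [2.0, 3.0, 4.0]
```

Because merging operates only on stored parameters, no gradient computation or training data is needed, which is why it can run on modest hardware.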

2021

LSOIE: A Large-Scale Dataset for Supervised Open Information Extraction
Jacob Solawetz | Stefan Larson
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume

Open Information Extraction (OIE) systems seek to compress the factual propositions of a sentence into a series of n-ary tuples. These tuples are useful for downstream tasks in natural language processing like knowledge base creation, textual entailment, and natural language understanding. However, current OIE datasets are limited in both size and diversity. We introduce a new dataset, LSOIE, by converting the QA-SRL 2.0 dataset into a large-scale OIE dataset. Our LSOIE dataset is 20 times larger than the next largest human-annotated OIE dataset. We construct and evaluate several benchmark OIE models on LSOIE, providing baselines for future improvements on the task. Our LSOIE data, models, and code are made publicly available.
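As a small illustration of the n-ary tuples an OIE system produces, the sketch below represents one extraction as a predicate plus an ordered argument list. The sentence, field names, and helper are hypothetical examples, not LSOIE's exact annotation schema.

```python
# An n-ary OIE extraction: one predicate plus its ordered arguments.
# With three or more arguments the tuple is "n-ary" rather than a
# simple (subject, relation, object) triple.

sentence = "Marie Curie won the Nobel Prize in 1911."

extraction = {
    "predicate": "won",
    "arguments": ["Marie Curie", "the Nobel Prize", "in 1911"],
}

def to_tuple(extraction):
    """Flatten an extraction into an (arg0, predicate, arg1, ...) tuple."""
    args = extraction["arguments"]
    return (args[0], extraction["predicate"], *args[1:])

print(to_tuple(extraction))
# ('Marie Curie', 'won', 'the Nobel Prize', 'in 1911')
```

Tuples in this shape feed naturally into downstream uses such as knowledge base population, where each tuple becomes a candidate fact.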