Alleviating Sequence Information Loss with Data Overlapping and Prime Batch Sizes

Noémien Kocher, Christian Scuito, Lorenzo Tarantino, Alexandros Lazaridis, Andreas Fischer, Claudiu Musat


Abstract
In sequence modeling tasks the token order matters, but this information can be partially lost due to the discretization of the sequence into data points. In this paper, we study the imbalance between the way certain token pairs are included in data points and others are not. We denote this token order imbalance (TOI) and we link the partial sequence information loss to a diminished performance of the system as a whole, both in text and speech processing tasks. We then provide a mechanism to leverage the full token order information—Alleviated TOI—by iteratively overlapping the token composition of data points. For recurrent networks, we use prime numbers for the batch size to avoid redundancies when building batches from overlapped data points. The proposed method achieved state-of-the-art performance in both text and speech related tasks.
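To make the overlapping idea concrete, the following is a minimal sketch (not the authors' released nkcr/overlap-ml code): it slices a token stream into fixed-length data points with a configurable shift so that consecutive data points overlap, and then groups them with a prime batch size. The function and parameter names (make_datapoints, seq_len, shift) are illustrative assumptions.

import numpy as np

def make_datapoints(tokens, seq_len, shift):
    """Slice a token stream into fixed-length data points.

    With shift < seq_len, consecutive data points overlap, so token
    pairs that would be split at a boundary in the standard
    non-overlapping scheme (shift == seq_len) still appear together
    in some data point.
    """
    points = []
    for start in range(0, len(tokens) - seq_len + 1, shift):
        points.append(tokens[start:start + seq_len])
    return np.array(points)

# Toy token stream 0..99, sequence length 10.
tokens = np.arange(100)

standard = make_datapoints(tokens, seq_len=10, shift=10)   # no overlap
overlapped = make_datapoints(tokens, seq_len=10, shift=5)  # 50% overlap

# Illustrative batching with a prime batch size (here 7): the paper's
# motivation is that a prime size avoids overlapped data points lining
# up into redundant batches when batches are built from them.
batch_size = 7
n_batches = len(overlapped) // batch_size
batches = overlapped[:n_batches * batch_size].reshape(n_batches, batch_size, -1)
print(standard.shape, overlapped.shape, batches.shape)  # (10, 10) (19, 10) (2, 7, 10)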
Anthology ID:
K19-1083
Volume:
Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Mohit Bansal, Aline Villavicencio
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
890–899
URL:
https://aclanthology.org/K19-1083
DOI:
10.18653/v1/K19-1083
Cite (ACL):
Noémien Kocher, Christian Scuito, Lorenzo Tarantino, Alexandros Lazaridis, Andreas Fischer, and Claudiu Musat. 2019. Alleviating Sequence Information Loss with Data Overlapping and Prime Batch Sizes. In Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), pages 890–899, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Alleviating Sequence Information Loss with Data Overlapping and Prime Batch Sizes (Kocher et al., CoNLL 2019)
PDF:
https://aclanthology.org/K19-1083.pdf
Code
 nkcr/overlap-ml
Data
IEMOCAP, WikiText-103, WikiText-2