Deepak Mittal
2021
The Effect of Pretraining on Extractive Summarization for Scientific Documents
Yash Gupta | Pawan Sasanka Ammanamanchi | Shikha Bordia | Arjun Manoharan | Deepak Mittal | Ramakanth Pasunuru | Manish Shrivastava | Maneesh Singh | Mohit Bansal | Preethi Jyothi
Proceedings of the Second Workshop on Scholarly Document Processing
Large pretrained models have seen enormous success in extractive summarization tasks. In this work, we investigate the influence of pretraining on a BERT-based extractive summarization system for scientific documents. We derive significant performance improvements using an intermediate pretraining step that leverages existing summarization datasets and report state-of-the-art results on a recently released scientific summarization dataset, SciTLDR. We systematically analyze the intermediate pretraining step by varying the size and domain of the pretraining corpus, changing the length of the input sequence in the target task and varying target tasks. We also investigate how intermediate pretraining interacts with contextualized word embeddings trained on different domains.
2014
An Error Analysis Tool for Natural Language Processing and Applied Machine Learning
Apoorv Agarwal | Ankit Agarwal | Deepak Mittal
Proceedings of COLING 2014, the 25th International Conference on Computational Linguistics: System Demonstrations