Overprotective Training Environments Fall Short at Testing Time: Let Models Contribute to Their Own Training

Alberto Testoni, Raffaella Bernardi


Anthology ID: 2020.clicit-1.49
Volume: Proceedings of the Seventh Italian Conference on Computational Linguistics (CLiC-it 2020)
Month: March
Year: 2020
Address: Bologna, Italy
Editors: Johanna Monti, Felice Dell'Orletta, Fabio Tamburini
Venue: CLiC-it
Publisher: CEUR Workshop Proceedings
Pages: 320–325
URL: https://aclanthology.org/2020.clicit-1.49/
Cite (ACL): Alberto Testoni and Raffaella Bernardi. 2020. Overprotective Training Environments Fall Short at Testing Time: Let Models Contribute to Their Own Training. In Proceedings of the Seventh Italian Conference on Computational Linguistics (CLiC-it 2020), pages 320–325, Bologna, Italy. CEUR Workshop Proceedings.
Cite (Informal): Overprotective Training Environments Fall Short at Testing Time: Let Models Contribute to Their Own Training (Testoni & Bernardi, CLiC-it 2020)
PDF: https://aclanthology.org/2020.clicit-1.49.pdf