@inproceedings{iter-etal-2020-pretraining,
    title     = "Pretraining with Contrastive Sentence Objectives Improves Discourse Performance of Language Models",
    author    = "Iter, Dan and Guu, Kelvin and Lansing, Larry and Jurafsky, Dan",
    editor    = "Jurafsky, Dan and Chai, Joyce and Schluter, Natalie and Tetreault, Joel",
    booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
    month     = jul,
    year      = "2020",
    address   = "Online",
    publisher = "Association for Computational Linguistics",
    url       = "https://aclanthology.org/2020.acl-main.439/",
    doi       = "10.18653/v1/2020.acl-main.439",
    pages     = "4859--4870",
}