%0 Conference Proceedings
%T Optimizing Differentiable Relaxations of Coreference Evaluation Metrics
%A Le, Phong
%A Titov, Ivan
%Y Levy, Roger
%Y Specia, Lucia
%S Proceedings of the 21st Conference on Computational Natural Language Learning (CoNLL 2017)
%D 2017
%8 August
%I Association for Computational Linguistics
%C Vancouver, Canada
%F le-titov-2017-optimizing
%X Coreference evaluation metrics are hard to optimize directly as they are non-differentiable functions, not easily decomposable into elementary decisions. Consequently, most approaches optimize objectives only indirectly related to the end goal, resulting in suboptimal performance. Instead, we propose a differentiable relaxation that lends itself to gradient-based optimisation, thus bypassing the need for reinforcement learning or heuristic modification of cross-entropy. We show that by modifying the training objective of a competitive neural coreference system, we obtain a substantial gain in performance. This suggests that our approach can be regarded as a viable alternative to using reinforcement learning or more computationally expensive imitation learning.
%R 10.18653/v1/K17-1039
%U https://aclanthology.org/K17-1039
%U https://doi.org/10.18653/v1/K17-1039
%P 390-399
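
The core idea summarized in the abstract, replacing a hard, non-differentiable decision-based score with a soft surrogate that gradients can flow through, can be illustrated with a generic toy sketch. The code below is NOT the relaxation proposed in the paper; it is a minimal PyTorch illustration under assumed names and shapes (n_mentions, scores, gold are all invented here), showing how hard antecedent choices (argmax) can be softened via softmax so that an expected-correctness objective becomes optimizable by gradient descent.

    # Generic sketch of a differentiable relaxation for antecedent decisions.
    # This is an illustrative toy, not the metric relaxation of Le & Titov (2017).
    import torch

    torch.manual_seed(0)
    n_mentions = 5

    # scores[i, j]: model score for mention i choosing antecedent j (j < i),
    # with j == i meaning "no antecedent / new entity". In a real system these
    # come from a neural coreference model; here they are a learnable parameter.
    scores = torch.randn(n_mentions, n_mentions, requires_grad=True)

    # gold[i, j] = 1 if linking mention i to j is a correct decision (toy data).
    gold = torch.eye(n_mentions)
    gold[2, 1] = 1.0  # pretend mention 2 corefers with mention 1
    gold[2, 2] = 0.0

    # Antecedents must precede the mention: mention i may only attach to j <= i.
    mask = torch.tril(torch.ones(n_mentions, n_mentions))

    optimizer = torch.optim.SGD([scores], lr=0.5)
    for step in range(100):
        optimizer.zero_grad()
        # Relaxation: a soft distribution over antecedents instead of a hard argmax.
        probs = torch.softmax(scores.masked_fill(mask == 0, float("-inf")), dim=1)
        # Expected per-mention link correctness: a differentiable surrogate for
        # the hard 0/1 score that an evaluation metric would assign.
        expected_correct = (probs * gold).sum(dim=1).mean()
        loss = 1.0 - expected_correct  # minimize loss = maximize expected correctness
        loss.backward()
        optimizer.step()

    print(f"expected link correctness: {expected_correct.item():.3f}")

Running the sketch drives the expected correctness toward 1.0, which is the point of the relaxation: once hard decisions are replaced by expectations under a softmax, an otherwise non-differentiable objective can be trained end to end with ordinary gradient descent rather than reinforcement learning.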