Learning Structural Kernels for Natural Language Processing

Daniel Beck, Trevor Cohn, Christian Hardmeier, Lucia Specia


Abstract
Structural kernels are a flexible learning paradigm that has been widely used in Natural Language Processing. However, the problem of model selection in kernel-based methods is usually overlooked. Previous approaches mostly rely on setting default values for kernel hyperparameters or using grid search, which is slow and coarse-grained. In contrast, Bayesian methods allow efficient model selection by maximizing the evidence on the training data through gradient-based methods. In this paper we show how to perform this in the context of structural kernels by using Gaussian Processes. Experimental results on tree kernels show that this procedure results in better prediction performance compared to hyperparameter optimization via grid search. The framework proposed in this paper can be adapted to other structures besides trees, e.g., strings and graphs, thereby extending the utility of kernel-based methods.
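The evidence-maximization procedure the abstract describes can be illustrated with a minimal sketch. The paper works with structural (tree) kernels, which scikit-learn does not provide, so an RBF kernel stands in here purely to show the mechanism: fitting a Gaussian Process maximizes the log marginal likelihood (the evidence) over kernel hyperparameters with a gradient-based optimizer, replacing grid search.

```python
# Hedged sketch: gradient-based evidence maximization for GP kernel
# hyperparameters. The paper uses tree kernels; RBF is a stand-in since
# scikit-learn ships no structural kernels. The data below is synthetic.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X).ravel() + 0.1 * rng.randn(40)

kernel = ConstantKernel(1.0) * RBF(length_scale=1.0)
gp = GaussianProcessRegressor(kernel=kernel, alpha=0.01, random_state=0)
# .fit() maximizes the log marginal likelihood over the kernel's
# hyperparameters (here: signal variance and length-scale) with L-BFGS,
# rather than evaluating a coarse grid of candidate values.
gp.fit(X, y)

print(gp.kernel_)                          # optimized hyperparameters
print(gp.log_marginal_likelihood_value_)   # evidence at the optimum
```

The same idea transfers to structural kernels once their hyperparameter gradients are available, which is the paper's contribution for trees.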
Anthology ID:
Q15-1033
Volume:
Transactions of the Association for Computational Linguistics, Volume 3
Year:
2015
Address:
Cambridge, MA
Editors:
Michael Collins, Lillian Lee
Venue:
TACL
Publisher:
MIT Press
Pages:
461–473
URL:
https://aclanthology.org/Q15-1033
DOI:
10.1162/tacl_a_00151
Cite (ACL):
Daniel Beck, Trevor Cohn, Christian Hardmeier, and Lucia Specia. 2015. Learning Structural Kernels for Natural Language Processing. Transactions of the Association for Computational Linguistics, 3:461–473.
Cite (Informal):
Learning Structural Kernels for Natural Language Processing (Beck et al., TACL 2015)
PDF:
https://aclanthology.org/Q15-1033.pdf
Video:
https://aclanthology.org/Q15-1033.mp4
Data
WMT 2014