Jeffrey Lidz


2017

Learning an Input Filter for Argument Structure Acquisition
Laurel Perkins | Naomi Feldman | Jeffrey Lidz
Proceedings of the 7th Workshop on Cognitive Modeling and Computational Linguistics (CMCL 2017)

How do children learn a verb’s argument structure when their input contains non-basic clauses that obscure verb transitivity? Here we present a new model that infers verb transitivity by learning to filter out non-basic clauses that were likely parsed in error. In simulations with child-directed speech, we show that this model accurately categorizes the majority of 50 frequent transitive, intransitive, and alternating verbs, and jointly learns appropriate parameters for filtering parsing errors. Our model is thus able to filter out problematic data for verb learning without knowing in advance which data need to be filtered.
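
To illustrate the general idea of jointly inferring verb transitivity while filtering likely parse errors, here is a minimal toy sketch in Python. It is not the model from the paper: the verb counts, the three fixed class object rates, the single global error rate, and the EM procedure are all illustrative assumptions introduced only for this example.

import numpy as np

# Illustrative counts: (clauses parsed WITH a direct object, clauses WITHOUT).
# These numbers are made up for the sketch, not data from the paper.
verb_counts = {
    "hit":   np.array([40, 10]),   # mostly appears with an object
    "sleep": np.array([8, 45]),    # mostly appears without an object
    "eat":   np.array([25, 22]),   # alternates
}

# Assumed true object rates per class: transitive, intransitive, alternating.
class_rates = np.array([0.95, 0.05, 0.5])
n_classes = len(class_rates)

def em(counts, n_iter=50):
    """Jointly estimate class posteriors per verb and a global parse-error rate eps."""
    eps = 0.2                                    # initial guess at the error rate
    prior = np.full(n_classes, 1.0 / n_classes)
    verbs = list(counts)
    data = np.stack([counts[v] for v in verbs])  # shape (V, 2)
    for _ in range(n_iter):
        # Effective object probability per class: with probability eps the parse
        # misreports whether an object was present.
        p_obj = class_rates * (1 - eps) + (1 - class_rates) * eps          # (C,)
        # E-step: posterior over classes for each verb (binomial likelihoods).
        log_lik = (data[:, None, 0] * np.log(p_obj)
                   + data[:, None, 1] * np.log(1 - p_obj))                  # (V, C)
        log_post = np.log(prior) + log_lik
        log_post -= log_post.max(axis=1, keepdims=True)
        post = np.exp(log_post)
        post /= post.sum(axis=1, keepdims=True)                             # (V, C)
        # Posterior probability that each observation was a parse error, per class.
        p_flip_obj = (1 - class_rates) * eps / p_obj
        p_flip_no = class_rates * eps / (1 - p_obj)
        # M-step: update the class prior and the error rate from expected error counts.
        expected_flips = (post * (data[:, None, 0] * p_flip_obj
                                  + data[:, None, 1] * p_flip_no)).sum()
        prior = post.mean(axis=0)
        eps = expected_flips / data.sum()
    return dict(zip(verbs, post)), eps

posteriors, eps = em(verb_counts)
for verb, p in posteriors.items():
    label = ["transitive", "intransitive", "alternating"][int(p.argmax())]
    print(f"{verb:6s} -> {label:12s} (estimated error rate {eps:.2f})")

The sketch captures only the joint-inference aspect the abstract describes: verbs are categorized by transitivity while the filtering (error-rate) parameter is learned from the same data, with no prior marking of which clauses are problematic.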