Don't Stop Me Now! Using Global Dynamic Oracles to Correct Training Biases of Transition-Based Dependency Parsers
Abstract
This paper formalizes a sound extension of dynamic oracles to global
training, in the framework of transition-based dependency parsers. By
dispensing with the pre-computation of references, this extension
broadens the range of training strategies that can be considered for such
parsers; we show this by revisiting two standard training
procedures, early-update and max-violation, to correct
some of their search-space sampling biases. Experimentally, on the
SPMRL treebanks, this correction increases the similarity between
the training and test distributions and yields performance improvements
of up to 0.7 UAS points, without any computational overhead.
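To illustrate the idea, here is a minimal, hypothetical sketch (not the paper's implementation) of how an update point can be chosen when the reference is re-derived by a dynamic oracle at every beam step instead of being pre-computed; `Hypothesis`, `expand`, and the oracle-cost convention (cost 0 meaning "still gold-reachable") are assumptions made for illustration only.

```python
# Hedged sketch: dynamic reference selection for early-update / max-violation.
from dataclasses import dataclass
from typing import Tuple


@dataclass
class Hypothesis:
    actions: Tuple[str, ...] = ()   # transition sequence built so far
    score: float = 0.0              # model score of this prefix
    cost: int = 0                   # dynamic-oracle cost; 0 = gold-reachable


def beam_train_step(expand, beam_size, n_steps, strategy="max_violation"):
    """Return (predicted, reference) prefixes for a perceptron-style update.

    `expand(hyp)` must yield successor hypotheses with updated score and
    dynamic-oracle cost; the reference is recomputed at every step as the
    best-scoring zero-cost candidate, so no gold sequence is fixed in advance.
    """
    beam = [Hypothesis()]
    violations = []                             # (score gap, predicted, reference)
    for _ in range(n_steps):
        candidates = [succ for hyp in beam for succ in expand(hyp)]
        if not candidates:
            break
        candidates.sort(key=lambda h: h.score, reverse=True)
        beam = candidates[:beam_size]
        reference = max((h for h in candidates if h.cost == 0),
                        key=lambda h: h.score, default=None)
        if reference is None:                   # gold tree no longer reachable
            break
        violations.append((beam[0].score - reference.score, beam[0], reference))
        if strategy == "early_update" and reference not in beam:
            return beam[0], reference           # stop at the first search error
    if not violations:
        return None, None
    if strategy == "early_update":              # no search error: final update
        _, predicted, reference = violations[-1]
    else:                                       # max-violation: largest score gap
        _, predicted, reference = max(violations, key=lambda v: v[0])
    return predicted, reference
```

In early-update mode the update is made at the first step where the dynamically chosen reference falls off the beam; in max-violation mode it is made at the step where the model most strongly prefers a wrong prefix over that reference.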
Domains
Computation and Language [cs.CL]

Origin
Publisher files allowed on an open archive