Annual meeting of the Association for Computational Linguistics

Latent Tree Learning with Differentiable Parsers: Shift-Reduce Parsing and Chart Parsing



Abstract

Latent tree learning models represent sentences by composing their words according to an induced parse tree, all based on a downstream task. These models often outperform baselines which use (externally provided) syntax trees to drive the composition order. This work contributes (a) a new latent tree learning model based on shift-reduce parsing, with competitive downstream performance and non-trivial induced trees, and (b) an analysis of the trees learned by our shift-reduce model and by a chart-based model.
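To make the composition idea concrete, below is a minimal sketch (not the authors' code) of shift-reduce composition over a binary tree: word vectors are pushed onto a stack (SHIFT) or the top two stack elements are merged by a learned composition function (REDUCE). In the paper's setting the SHIFT/REDUCE decisions themselves come from a differentiable parser trained on the downstream task, and the composition function is a Tree-LSTM-style cell; here the action sequence is given and the composer (`TreeComposer`, a linear-plus-tanh combiner) is an illustrative assumption, just to show how an induced tree drives composition order.

```python
import torch
import torch.nn as nn

SHIFT, REDUCE = 0, 1


class TreeComposer(nn.Module):
    """Composes a sentence bottom-up according to a shift-reduce action sequence."""

    def __init__(self, dim: int):
        super().__init__()
        # Simple composition function; a Tree-LSTM cell would be used in practice.
        self.compose = nn.Sequential(nn.Linear(2 * dim, dim), nn.Tanh())

    def forward(self, word_vecs: torch.Tensor, actions: list[int]) -> torch.Tensor:
        # word_vecs: (sentence_length, dim); actions: 2n-1 SHIFT/REDUCE operations.
        buffer = list(word_vecs)          # words waiting to be shifted
        stack: list[torch.Tensor] = []    # partially built constituents
        for a in actions:
            if a == SHIFT:
                stack.append(buffer.pop(0))
            else:  # REDUCE: merge the two topmost constituents into one node
                right, left = stack.pop(), stack.pop()
                stack.append(self.compose(torch.cat([left, right], dim=-1)))
        return stack[-1]                  # root vector = sentence representation


if __name__ == "__main__":
    dim, sent_len = 8, 4
    composer = TreeComposer(dim)
    words = torch.randn(sent_len, dim)
    # Actions encoding the right-branching tree (w1 (w2 (w3 w4))):
    acts = [SHIFT, SHIFT, SHIFT, SHIFT, REDUCE, REDUCE, REDUCE]
    sentence_vec = composer(words, acts)
    print(sentence_vec.shape)  # torch.Size([8])
```

A chart-based latent tree model differs in that, instead of committing to one action sequence, it scores every possible span split and composes a soft mixture over all binary trees, which keeps the parsing step differentiable end to end.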
