Workshop on the Relevance of Linguistic Structure in Neural Architectures for NLP, 2018

Latent Tree Learning with Differentiable Parsers: Shift-Reduce Parsing and Chart Parsing


Abstract

Latent tree learning models represent sentences by composing their words according to an induced parse tree, all based on a downstream task. These models often outperform baselines which use (externally provided) syntax trees to drive the composition order. This work contributes (a) a new latent tree learning model based on shift-reduce parsing, with competitive downstream performance and non-trivial induced trees, and (b) an analysis of the trees learned by our shift-reduce model and by a chart-based model.
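The shift-reduce mechanism underlying the model can be illustrated with a minimal sketch. In the actual model the SHIFT/REDUCE actions are predicted by a learned, differentiable policy and composition is a neural (e.g. TreeLSTM-style) cell over vectors; here a fixed action sequence and a bracketed-string composition stand in for both, purely for illustration.

```python
def compose(left, right):
    # Hypothetical stand-in for the neural composition function:
    # just build a bracketed string showing the induced structure.
    return f"({left} {right})"

def shift_reduce(words, actions):
    """Apply SHIFT/REDUCE actions to build a binary tree over `words`."""
    buffer = list(words)   # tokens waiting to be shifted
    stack = []             # partially built subtrees
    for action in actions:
        if action == "SHIFT":
            stack.append(buffer.pop(0))
        else:  # REDUCE: combine the top two subtrees into one
            right = stack.pop()
            left = stack.pop()
            stack.append(compose(left, right))
    assert len(stack) == 1 and not buffer, "actions must consume all input"
    return stack[0]

tree = shift_reduce(
    ["the", "cat", "sat"],
    ["SHIFT", "SHIFT", "SHIFT", "REDUCE", "REDUCE"],
)
print(tree)  # (the (cat sat))
```

A sentence of n words needs n SHIFTs and n-1 REDUCEs; different interleavings of those actions yield different binary trees, which is the space the latent tree model searches over.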
