BERT-based Cohesion Analysis of Japanese Texts

International Conference on Computational Linguistics


Abstract

The meaning of natural language text is supported by cohesion among various kinds of entities, including coreference relations, predicate-argument structures, and bridging anaphora relations. However, predicate-argument structures for nominal predicates and bridging anaphora relations have not been studied well, and their analysis remains very difficult. Recent advances in neural networks, in particular pre-trained language models such as BERT (Devlin et al., 2019), have significantly improved many natural language processing tasks, making it possible to study the analysis of cohesion across whole texts. In this study, we tackle an integrated analysis of cohesion in Japanese texts. Our results significantly outperformed existing studies on each task, with improvements of roughly 10 to 20 points for both zero anaphora resolution and coreference resolution. Furthermore, we showed that coreference resolution differs in nature from the other tasks and should be treated specially.
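The abstract describes an integrated, BERT-based analysis of cohesion phenomena (coreference, predicate-argument structure, bridging anaphora, zero anaphora). Below is a minimal sketch, not the authors' implementation, of how a pre-trained Japanese BERT encoder could be wired up to score candidate fillers for a predicate's omitted argument; the model name (cl-tohoku/bert-base-japanese), the example sentence, and the untrained dot-product scorer are illustrative assumptions only:

# Minimal sketch: score candidate argument fillers with a Japanese BERT encoder.
# NOT the authors' implementation; model name, example sentence, and the
# untrained dot-product scorer are assumptions for illustration.
# Note: cl-tohoku/bert-base-japanese also requires the fugashi and ipadic packages.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "cl-tohoku/bert-base-japanese"  # assumed publicly available Japanese BERT
tokenizer = AutoTokenizer.from_pretrained(model_name)
encoder = AutoModel.from_pretrained(model_name)

# "Taro bought bread. Then (he) ate (it)." -- the second clause omits both
# arguments (zero anaphora), which a cohesion analyzer must recover.
text = "太郎はパンを買った。それから食べた。"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    hidden = encoder(**inputs).last_hidden_state[0]  # (seq_len, hidden_size)

# Treat the token just before [SEP] as the target predicate (illustrative choice)
# and score every other token as a candidate filler of its omitted argument slot.
pred_idx = hidden.size(0) - 2
scores = hidden @ hidden[pred_idx]      # (seq_len,) similarity scores
scores[pred_idx] = float("-inf")        # a predicate cannot fill its own slot
best = int(scores.argmax())
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
print(f"candidate filler for '{tokens[pred_idx]}': '{tokens[best]}'")

A real system would replace the raw dot product with a trained scoring head and handle the cohesion tasks jointly rather than per predicate; the sketch only shows how contextual representations from BERT can be turned into antecedent scores.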
