Computer Speech and Language

A label-oriented loss function for learning sentence representations



Abstract

Neural network methods that leverage word embeddings obtained from unsupervised learning models have been widely adopted in many natural language processing (NLP) tasks, including sentiment analysis and sentence classification. Existing approaches that generate sentence representations for classification tasks generally rely on complex deep neural networks but relatively simple loss functions, such as the cross-entropy loss. These approaches cannot produce satisfactorily separable sentence representations because cross entropy may ignore the sentiment and semantic information carried by the labels. To extract useful information from the labels and improve the distinguishability of the resulting sentence representations, this paper proposes a label-oriented loss function. The proposed loss function uses the word embeddings of the labels to guide the production of meaningful sentence representations for downstream classification tasks. Evaluation experiments on several datasets show that the proposed loss function achieves competitive and even better classification results than existing end-to-end approaches.
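The abstract does not give the exact formulation, but the core idea of measuring sentence representations against the word embeddings of the label names can be illustrated with a minimal PyTorch sketch. Everything below (the function name, the temperature parameter, the use of cosine similarity followed by a softmax cross-entropy) is an assumed, simplified interpretation rather than the paper's actual loss.

import torch
import torch.nn.functional as F


def label_oriented_loss(sent_repr, label_embeddings, targets, temperature=0.1):
    """Hypothetical sketch: pull each sentence representation toward the
    word embedding of its gold label and away from the other labels.

    sent_repr:        (batch, dim)       sentence representations from any encoder
    label_embeddings: (num_labels, dim)  word embeddings of the label names
    targets:          (batch,)           gold label indices
    """
    # Cosine similarity between every sentence and every label embedding.
    sent = F.normalize(sent_repr, dim=-1)
    labels = F.normalize(label_embeddings, dim=-1)
    logits = sent @ labels.t() / temperature  # (batch, num_labels)
    # Softmax cross-entropy over label similarities encourages separable
    # representations aligned with the label semantics.
    return F.cross_entropy(logits, targets)


# Usage sketch with random tensors standing in for an encoder's output
# and for pretrained label-word embeddings (e.g. GloVe vectors):
batch, dim, num_labels = 8, 300, 5
sent_repr = torch.randn(batch, dim, requires_grad=True)
label_embeddings = torch.randn(num_labels, dim)
targets = torch.randint(0, num_labels, (batch,))
loss = label_oriented_loss(sent_repr, label_embeddings, targets)
loss.backward()

In this reading, the label embeddings act as class prototypes, so the loss shapes the representation space directly around the labels' semantics instead of relying on an arbitrary softmax layer alone; the actual paper may combine such a term with a standard classification loss or define it differently.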
