Incorporating Context-Relevant Knowledge into Convolutional Neural Networks for Short Text Classification



Abstract

Some text classification methods do not perform well on short texts because of data sparsity. Moreover, they do not fully exploit context-relevant knowledge. To address these problems, we propose a neural network that incorporates context-relevant knowledge into a convolutional neural network for short text classification. Our model consists of two modules. The first module uses two layers to extract concept and context features respectively, and then employs an attention layer to select the context-relevant concepts. The second module uses a convolutional neural network to extract high-level features from the word features and the context-relevant concept features. Experimental results on three datasets show that the proposed model outperforms state-of-the-art models.
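The abstract outlines a two-module architecture: an attention layer scores candidate concepts against the short text's context so that only context-relevant concepts are kept, and a CNN then extracts high-level features from the word and weighted concept features. The sketch below is a minimal PyTorch illustration of that idea, not the paper's implementation; the class name `ConceptAttentionCNN`, the mean-pooled context encoder, the single-layer attention scorer, and all layer sizes are assumptions made for illustration.

```python
# Minimal sketch of the two-module design described in the abstract.
# All names, sizes, and the exact attention formulation are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConceptAttentionCNN(nn.Module):
    def __init__(self, vocab_size, concept_vocab_size, emb_dim=100,
                 num_filters=100, kernel_sizes=(2, 3, 4), num_classes=2):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        self.concept_emb = nn.Embedding(concept_vocab_size, emb_dim)
        # Module 1: a context feature (here assumed to be the mean of the
        # word embeddings) and an attention layer that scores each concept
        # against the context to keep the context-relevant concepts.
        self.attn = nn.Linear(emb_dim * 2, 1)
        # Module 2: CNN over the concatenated word / concept sequence.
        self.convs = nn.ModuleList(
            nn.Conv1d(emb_dim, num_filters, k) for k in kernel_sizes)
        self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, words, concepts):
        # words: (batch, seq_len) word ids; concepts: (batch, num_concepts) concept ids
        w = self.word_emb(words)                       # (B, L, D)
        c = self.concept_emb(concepts)                 # (B, C, D)
        context = w.mean(dim=1, keepdim=True)          # (B, 1, D) context feature
        # Attention over concepts conditioned on the context.
        scores = self.attn(torch.cat([c, context.expand_as(c)], dim=-1))
        alpha = torch.softmax(scores, dim=1)           # (B, C, 1)
        c_weighted = alpha * c                         # context-relevant concept features
        # Concatenate word and weighted concept features along the sequence axis.
        x = torch.cat([w, c_weighted], dim=1).transpose(1, 2)   # (B, D, L + C)
        feats = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return self.fc(torch.cat(feats, dim=1))        # class logits
```

Usage under these assumptions: `ConceptAttentionCNN(vocab_size=10000, concept_vocab_size=3000)(word_ids, concept_ids)` returns one logit vector per short text, which can be trained with a standard cross-entropy loss.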
