International Journal of Electrical and Computer Engineering

Text classification based on gated recurrent unit combines with support vector machine


Abstract

As humanity produces ever larger amounts of unstructured text data and the volume of text on the Internet keeps growing, intelligent techniques are required to process it and extract different types of knowledge from it. The gated recurrent unit (GRU) and the support vector machine (SVM) have both been applied successfully to natural language processing (NLP) systems, with comparable, remarkable results. GRU networks perform well in sequential learning tasks and overcome the vanishing- and exploding-gradient problems of standard recurrent neural networks (RNNs) when capturing long-term dependencies. In this paper, we propose a text classification model that improves on this norm by using a linear support vector machine (SVM) in place of Softmax in the final output layer of a GRU model. Furthermore, the cross-entropy loss function is replaced with a margin-based loss function. Empirical results show that the proposed GRU-SVM model achieves comparatively better results than the baseline approaches BLSTM-C and DABN.
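The key modification the abstract describes, replacing the Softmax output layer and cross-entropy loss with a linear SVM layer trained on a margin-based (multiclass hinge) loss, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the GRU hidden states `h`, the weight matrix `W`, the labels, and the Weston-Watkins form of the hinge loss are all assumptions made here for demonstration.

```python
import numpy as np

def multiclass_hinge_loss(scores, y, margin=1.0):
    """Weston-Watkins multiclass hinge loss: a margin-based
    objective of the kind used in place of cross-entropy."""
    n = len(y)
    correct = scores[np.arange(n), y][:, None]       # score of the true class
    margins = np.maximum(0.0, scores - correct + margin)
    margins[np.arange(n), y] = 0.0                   # true class incurs no loss
    return margins.sum(axis=1).mean()

# Hypothetical GRU final hidden states (batch of 2, hidden size 3).
h = np.array([[0.5, -0.2, 0.1],
              [0.0,  0.3, -0.4]])

# Linear SVM layer (weights W, bias b) replacing the Softmax layer:
# raw class scores are used directly, with no probability normalization.
W = np.array([[ 1.0, -1.0],
              [ 0.5,  0.5],
              [-0.3,  0.2]])
b = np.zeros(2)

scores = h @ W + b
labels = np.array([0, 1])
loss = multiclass_hinge_loss(scores, labels)
```

At prediction time the class is simply `scores.argmax(axis=1)`; the margin-based loss only changes how the output layer is trained, not how it is used.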


