Mathematical Problems in Engineering: Theory, Methods and Applications
Hierarchical Self-Attention Hybrid Sparse Networks for Document Classification


Abstract

Document classification is a fundamental problem in natural language processing, and deep learning has demonstrated great success on this task. However, most existing models do not incorporate sentence structure as a semantic feature of the text in their architectures and pay little attention to the contextual importance of words and sentences. In this paper, we present a new model for document classification based on a sparse recurrent neural network and a self-attention mechanism. We also analyze three variants built on GRU and LSTM to evaluate the sparse model on different datasets. Extensive experiments demonstrate that our model obtains competitive performance and outperforms previous models.
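To make the hierarchical structure concrete, below is a minimal PyTorch sketch of a document classifier with GRU encoders and additive self-attention pooling at both the word and sentence levels, in the spirit of the abstract. All layer sizes, the attention formulation, and the class names are assumptions for illustration; the paper's actual "hybrid sparse" recurrent component is not specified here and is not reproduced.

```python
# Hedged sketch: hierarchical GRU + self-attention document classifier.
# The sparsity mechanism from the paper is NOT included; sizes are assumed.
import torch
import torch.nn as nn


class SelfAttentionPool(nn.Module):
    """Additive self-attention that pools a sequence into a single vector."""

    def __init__(self, dim: int):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.context = nn.Linear(dim, 1, bias=False)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq_len, dim) -> attention weights: (batch, seq_len, 1)
        scores = self.context(torch.tanh(self.proj(h)))
        weights = torch.softmax(scores, dim=1)
        return (weights * h).sum(dim=1)  # (batch, dim)


class HierarchicalAttentionClassifier(nn.Module):
    """Word-level GRU + attention builds sentence vectors; a sentence-level
    GRU + attention builds the document vector used for classification."""

    def __init__(self, vocab_size: int, embed_dim: int = 100,
                 hidden_dim: int = 64, num_classes: int = 5):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.word_gru = nn.GRU(embed_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        self.word_attn = SelfAttentionPool(2 * hidden_dim)
        self.sent_gru = nn.GRU(2 * hidden_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        self.sent_attn = SelfAttentionPool(2 * hidden_dim)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, docs: torch.Tensor) -> torch.Tensor:
        # docs: (batch, num_sentences, num_words) of padded token ids
        batch, num_sents, num_words = docs.shape
        words = self.embedding(docs.view(batch * num_sents, num_words))
        word_h, _ = self.word_gru(words)
        sent_vecs = self.word_attn(word_h).view(batch, num_sents, -1)
        sent_h, _ = self.sent_gru(sent_vecs)
        doc_vec = self.sent_attn(sent_h)
        return self.classifier(doc_vec)  # (batch, num_classes) logits


if __name__ == "__main__":
    model = HierarchicalAttentionClassifier(vocab_size=10_000)
    fake_docs = torch.randint(1, 10_000, (2, 4, 12))  # 2 docs, 4 sents, 12 words
    print(model(fake_docs).shape)  # torch.Size([2, 5])
```

The attention weights computed at each level make the "contextual importance" of individual words and sentences explicit, which is the property the abstract highlights; a sparse variant would replace the dense GRUs above.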
