Artificial Life and Robotics
Heterogeneous recurrent neural networks for natural language model



Abstract

Neural networks for language modeling are proposed and their performance is explored. The proposed network consists of two recurrent networks whose structures differ from each other. Both networks accept words as input, translate them into distributed representations, and produce the probability of each word occurring given the preceding sequence of input words. The performance of the proposed network is investigated by constructing language models and comparing it with a single recurrent neural network and a long short-term memory network.
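The abstract does not specify the architecture beyond "two recurrent networks of different structures" feeding one next-word distribution. The following is a minimal, untrained sketch of that idea under stated assumptions: the two sub-networks are a vanilla tanh RNN cell and a GRU-style gated cell, their hidden states are concatenated before a shared softmax output layer, and all sizes are toy values. None of these choices are taken from the paper.

```python
import math
import random

random.seed(0)

VOCAB = 5   # toy vocabulary size (assumption, not from the paper)
HIDDEN = 4  # hidden size shared by both sub-networks (assumption)

def rand_matrix(rows, cols):
    """Small random weight matrix (untrained)."""
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

def matvec(m, v):
    return [sum(w * x for w, x in zip(row, v)) for row in m]

def add(a, b):
    return [x + y for x, y in zip(a, b)]

def tanh_vec(v):
    return [math.tanh(x) for x in v]

def sigmoid_vec(v):
    return [1.0 / (1.0 + math.exp(-x)) for x in v]

def softmax(v):
    m = max(v)
    exps = [math.exp(x - m) for x in v]
    s = sum(exps)
    return [e / s for e in exps]

class VanillaRNNCell:
    """First sub-network: h_t = tanh(W x_t + U h_{t-1})."""
    def __init__(self):
        self.W = rand_matrix(HIDDEN, VOCAB)
        self.U = rand_matrix(HIDDEN, HIDDEN)

    def step(self, x, h):
        return tanh_vec(add(matvec(self.W, x), matvec(self.U, h)))

class GatedCell:
    """Second sub-network: a GRU-style gated cell, structurally
    different from the vanilla cell (hence 'heterogeneous')."""
    def __init__(self):
        self.Wz = rand_matrix(HIDDEN, VOCAB)
        self.Uz = rand_matrix(HIDDEN, HIDDEN)
        self.Wh = rand_matrix(HIDDEN, VOCAB)
        self.Uh = rand_matrix(HIDDEN, HIDDEN)

    def step(self, x, h):
        z = sigmoid_vec(add(matvec(self.Wz, x), matvec(self.Uz, h)))
        cand = tanh_vec(add(matvec(self.Wh, x), matvec(self.Uh, h)))
        return [zi * hi + (1.0 - zi) * ci for zi, hi, ci in zip(z, h, cand)]

class HeterogeneousLM:
    """Runs both recurrent sub-networks over the same word sequence and
    concatenates their final hidden states to predict the next-word
    distribution (the combination scheme is an assumption)."""
    def __init__(self):
        self.rnn = VanillaRNNCell()
        self.gru = GatedCell()
        self.out = rand_matrix(VOCAB, 2 * HIDDEN)  # shared output projection

    def next_word_probs(self, word_ids):
        h1 = [0.0] * HIDDEN
        h2 = [0.0] * HIDDEN
        for w in word_ids:
            x = [1.0 if i == w else 0.0 for i in range(VOCAB)]  # one-hot input
            h1 = self.rnn.step(x, h1)
            h2 = self.gru.step(x, h2)
        return softmax(matvec(self.out, h1 + h2))

lm = HeterogeneousLM()
probs = lm.next_word_probs([0, 2, 1])  # a valid probability distribution over VOCAB
```

A trained version would fit the weights by minimizing cross-entropy of the next word, and the paper evaluates against a single RNN and an LSTM baseline; the sketch above only illustrates the forward pass of the two-network structure.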
