International Joint Conference on Neural Networks

Words Are Not Temporal Sequences of Characters



Abstract

Language modeling is a valuable component of generative natural language processing (NLP) tasks, and benefits from explicit representations of the inherent hierarchies in language. We investigate a commonly used architecture that captures the concept that words are built from characters, and modify the word encoding mechanism to use a feed-forward neural network rather than a recurrent neural network (RNN). This feed-forward architecture yields increased performance and a reduction in the number of parameters compared with models that use common RNN implementations. We investigate whether word representations benefit from position-invariant features in the characters, and find that fixed-position representations are sufficient.
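To make the core idea concrete: a fixed-position feed-forward word encoder embeds each character, concatenates the embeddings in position order, and projects the result to a word vector, with no recurrence over the character sequence. The sketch below is an illustration only, not the authors' implementation; the module name, dimensions, padding scheme, and the toy character-id mapping are all assumptions.

```python
import torch
import torch.nn as nn

class FeedForwardWordEncoder(nn.Module):
    """Builds a word embedding from its characters at fixed positions.

    Each character is embedded, the embeddings for positions
    0..max_len-1 are concatenated (so position is encoded implicitly
    by layout), and a feed-forward network maps the result to a word
    vector. Unlike a character RNN, no recurrence is needed.
    """
    def __init__(self, n_chars, char_dim=16, max_len=12, word_dim=256, pad_id=0):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, char_dim, padding_idx=pad_id)
        self.proj = nn.Sequential(
            nn.Linear(max_len * char_dim, word_dim),
            nn.ReLU(),
            nn.Linear(word_dim, word_dim),
        )

    def forward(self, char_ids):
        # char_ids: (batch, max_len) int tensor, padded/truncated to max_len
        emb = self.char_emb(char_ids)        # (batch, max_len, char_dim)
        flat = emb.view(emb.size(0), -1)     # fixed-position concatenation
        return self.proj(flat)               # (batch, word_dim)

# Toy usage: encode the word "cat" with hypothetical character ids.
enc = FeedForwardWordEncoder(n_chars=128)
ids = torch.zeros(1, 12, dtype=torch.long)
for i, ch in enumerate("cat"):
    ids[0, i] = ord(ch) % 128
word_vec = enc(ids)  # shape: (1, 256)
```

Because the concatenation fixes each character to a slot, the network sees position directly rather than through position-invariant (shared) character features; the abstract's finding is that this fixed-position scheme is sufficient.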
