2017 IEEE 28th Annual International Symposium on Personal, Indoor, and Mobile Radio Communications

Activation functions of deep neural networks for polar decoding applications



Abstract

Among various deep neural network (DNN) components, this paper studies activation functions, especially for deep feed-forward networks, with application to the channel decoding problem of polar codes. In line with our previous study, this paper considers the ReLU (Rectified Linear Unit) and its variants as activation functions of the DNN. We devise a new ReLU variant, called Sloped ReLU, by varying the slope of the ReLU over the positive domain. This is motivated by an analogy, in terms of tree architectures, between the likelihood function in successive decoding of channel codes and the activation function in a DNN. Our numerical results show that polar decoding performance with the Sloped ReLU improves as the slope increases, up to a certain level. We believe that the idea of utilizing this analogy for determining the activation functions of a DNN can be applied to other decoding problems as well, which remains as future work.
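
The abstract does not spell out the functional form of the Sloped ReLU, but its description (a ReLU whose slope on the positive domain is varied) suggests the minimal sketch below; the function name sloped_relu and the slope parameter a are illustrative assumptions, not notation from the paper.

import numpy as np

# Sloped ReLU (assumed form): zero for negative inputs, a linear ramp
# with slope a for positive inputs; a = 1.0 recovers the standard ReLU.
def sloped_relu(x, a=1.0):
    return np.where(x > 0.0, a * x, 0.0)

# Example: the same pre-activations under two different slopes.
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sloped_relu(x, a=1.0))  # standard ReLU:        [0. 0. 0. 0.5 2.]
print(sloped_relu(x, a=2.0))  # steeper positive side: [0. 0. 0. 1.  4.]

Under this assumed form, the slope a is a single tunable hyperparameter, and setting a = 1 falls back to the standard ReLU baseline.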