IEEE Transactions on Neural Networks

Absolute Exponential Stability of Recurrent Neural Networks With Generalized Activation Function

Abstract

In this paper, recurrent neural networks (RNNs) with a generalized activation function class are proposed. In the proposed model, every component of the neuron's activation function belongs to a convex hull bounded by two odd symmetric piecewise linear functions that are convex or concave over the real space. All of these convex hulls compose the generalized activation function class. The novel activation function class not only describes the activation functions more flexibly and more specifically than other function classes, but also generalizes some traditional activation function classes. The absolute exponential stability (AEST) of the RNN with the generalized activation function class is studied in three steps. The first step is to demonstrate that the global exponential stability (GES) of the equilibrium point of the original RNN with a generalized activation function is equivalent to that of the RNNs under all vertex functions of the convex hull. The second step transforms the RNN under every vertex activation function into neural networks under an array of saturated linear activation functions. Because the GES of the equilibrium points of the three systems is equivalent, the subsequent stability analysis focuses on the GES of the equilibrium point of the RNN under an array of saturated linear activation functions. The last step is to study both the existence of the equilibrium point and the GES of the RNN under saturated linear activation functions using the theory of $M$-matrices. In the end, a two-neuron RNN with a generalized activation function is constructed to show the effectiveness of our results.
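The abstract does not give the underlying network model or the paper's exact stability conditions. As a rough illustration of the last step and of the two-neuron example, the following is a minimal Python sketch assuming a standard additive (Hopfield-type) RNN of the form dx/dt = -Dx + W g(x) + u with a saturated linear activation, together with an M-matrix-type sufficient check of the kind commonly used in this literature (here, D - |W| being a nonsingular M-matrix). The matrices D and W, the input u, and the helper names are hypothetical and are not taken from the paper.

```python
import numpy as np

def is_nonsingular_M_matrix(A, tol=1e-12):
    """Check the nonsingular M-matrix property via leading principal minors.

    A should be a Z-matrix (non-positive off-diagonal entries); it is a
    nonsingular M-matrix iff all leading principal minors are positive.
    """
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    off_diag = A - np.diag(np.diag(A))
    if np.any(off_diag > tol):          # not a Z-matrix
        return False
    return all(np.linalg.det(A[:k, :k]) > tol for k in range(1, n + 1))

def satlin(x):
    """Saturated linear activation: identity on [-1, 1], clipped outside."""
    return np.clip(x, -1.0, 1.0)

# Hypothetical two-neuron system (not the paper's numbers):
# dx/dt = -D x + W satlin(x) + u
D = np.diag([2.0, 3.0])
W = np.array([[0.5, -0.4],
              [0.3,  0.6]])
u = np.array([0.1, -0.2])

# Sufficient condition of the M-matrix flavor: D - |W| is a nonsingular
# M-matrix (satlin has Lipschitz constant 1, so no extra scaling is needed).
print("D - |W| is a nonsingular M-matrix:", is_nonsingular_M_matrix(D - np.abs(W)))

# Forward-Euler simulation from two initial states; trajectories should
# converge to the same equilibrium when the condition above holds.
dt, steps = 1e-3, 20000
for x in (np.array([5.0, -4.0]), np.array([-3.0, 2.0])):
    for _ in range(steps):
        x = x + dt * (-D @ x + W @ satlin(x) + u)
    print("final state:", np.round(x, 4))
```

When the check returns True, the simulated trajectories from different initial states settle at the same point, which is the kind of behavior the GES conclusion of the last step describes for the saturated linear case.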
