Neural Processing Letters

Global Output Convergence of a Class of Recurrent Delayed Neural Networks with Discontinuous Neuron Activations


Abstract

This paper studies the global output convergence of a class of recurrent delayed neural networks with time-varying inputs. We consider non-decreasing activations which may also have jump discontinuities, in order to model the ideal situation where the gain of the neuron amplifiers is very high and tends to infinity. In particular, we drop the assumptions of Lipschitz continuity and boundedness on the activation functions, which are usually required in most existing works. Because the activation functions may be discontinuous, we introduce a suitable notion of limit to study the convergence of the output of the recurrent delayed neural networks. Under suitable assumptions on the interconnection matrices and the time-varying inputs, we establish a sufficient condition for global output convergence of this class of neural networks. The convergence results are useful in solving some optimization problems and in the design of recurrent delayed neural networks with discontinuous neuron activations.
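The setting described above can be illustrated with a small numerical sketch: a two-neuron delayed recurrent network x'(t) = -x(t) + A f(x(t)) + B f(x(t - τ)) + u, where f = sign is a non-decreasing activation with a jump discontinuity at 0 (the infinite-gain amplifier limit). The matrices A, B, the input u, the delay τ, and the initial condition below are illustrative assumptions chosen so that the output y(t) = f(x(t)) settles after a transient; they are not taken from the paper, and a simple forward-Euler scheme stands in for a rigorous treatment of the discontinuous right-hand side.

```python
import numpy as np

def simulate(T=30.0, dt=0.01, tau=1.0):
    """Euler simulation of a delayed recurrent network
        x'(t) = -x(t) + A f(x(t)) + B f(x(t - tau)) + u
    with the discontinuous activation f = sign.
    All parameters are illustrative, not from the paper."""
    A = np.array([[0.10, -0.05],
                  [0.02,  0.10]])   # instantaneous interconnection matrix
    B = np.array([[0.05, 0.00],
                  [0.00, 0.05]])   # delayed interconnection matrix
    u = np.array([0.5, -0.3])      # constant external input
    d = int(round(tau / dt))       # delay expressed in time steps
    steps = int(round(T / dt))
    x0 = np.array([-1.0, 1.0])
    hist = [x0.copy()] * (d + 1)   # constant initial history on [-tau, 0]
    x = x0.copy()
    for _ in range(steps):
        xd = hist[-(d + 1)]        # delayed state x(t - tau)
        x = x + dt * (-x + A @ np.sign(x) + B @ np.sign(xd) + u)
        hist.append(x.copy())
    return x, np.sign(x)           # final state and final output f(x)

x_final, y_final = simulate()
```

With these values the state crosses zero during the transient (so the output actually switches), and then the state converges to the equilibrium consistent with the output pattern [1, -1]; the output f(x(t)) converges even though f itself is discontinuous, which is the kind of behavior the paper's notion of output convergence captures.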
