
Dynamical Analysis of Complex-valued Recurrent Neural Networks with Time-delays.



Abstract

In this thesis, complex-valued recurrent neural network models are presented and their dynamical behaviors, including global stability and periodic oscillation as well as multistability and multiperiodicity, are studied. Simulation results are given to illustrate the theoretical findings, and an application to complex-valued associative memory is provided.

As an extension of real-valued recurrent neural networks, complex-valued recurrent neural networks have complex-valued states, connection weights and activation functions. They can process complex-valued signals directly and, for some problems, offer considerably greater solving capability than their real-valued counterparts. In recent years, complex-valued recurrent neural networks have been widely applied and studied in machine learning, engineering optimization, image processing, pattern recognition, and other fields.

As is well known, the analysis and application of recurrent neural networks rely crucially on the equilibrium points or periodic orbits of the networks. This thesis focuses on the dynamical behaviors of complex-valued recurrent neural networks. The activation functions play an important role in these dynamical behaviors. In the complex domain, there are various types of activation functions whose properties differ from those of their real-domain counterparts, and this leads to different dynamical behaviors of the complex-valued recurrent neural networks. In this thesis we mainly consider two types of complex-valued activation functions, and for each type we use a different approach to analyze the dynamical behaviors of the corresponding neural networks.

We first analyze the global stability and periodicity of the complex-valued recurrent neural networks. By using the Lyapunov approach together with M-matrix and linear matrix inequality (LMI) techniques, we obtain conditions for the existence and uniqueness of the equilibrium point of the complex-valued recurrent neural networks, as well as sufficient conditions for the global stability and exponential periodicity of the networks. Next, we study a complex-valued recurrent neural network with one-step piecewise linear activation functions and obtain sufficient conditions for the multistability and multiperiodicity of the networks. Finally, we use a discretization technique to obtain the discrete-time analogue of the continuous-time complex-valued recurrent neural network models and study its dynamical behaviors by using discrete-time Lyapunov functions and difference methods.
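To make the setting concrete, the sketch below simulates a small complex-valued recurrent neural network with a single constant time delay using a forward Euler scheme. The model form, weight matrices, activation function and parameter values are illustrative assumptions about this class of networks, not the exact systems, activations or conditions analyzed in the thesis.

    import numpy as np

    # Minimal simulation sketch of a complex-valued recurrent neural network
    # with one constant time delay, integrated by forward Euler. Assumed model:
    #   dz/dt = -D z(t) + A f(z(t)) + B f(z(t - tau)) + u,   z(t) in C^n

    n = 2                        # number of neurons
    dt = 0.01                    # Euler step size
    tau = 0.5                    # constant time delay
    delay_steps = int(tau / dt)  # delay expressed in steps
    T = 20.0                     # simulation horizon

    D = np.diag([1.0, 1.2])                       # self-feedback (decay) rates
    A = np.array([[0.2 + 0.1j, -0.3j],
                  [0.1 + 0.0j, 0.25 - 0.2j]])     # instantaneous connection weights
    B = np.array([[-0.1 + 0.05j, 0.2 + 0.0j],
                  [0.15j, -0.2 + 0.1j]])          # delayed connection weights
    u = np.array([0.5 + 0.5j, -0.3 + 0.2j])       # constant external input

    def f(z):
        # A bounded activation applied to real and imaginary parts separately,
        # one common choice of complex-valued activation (an assumption here).
        return np.tanh(z.real) + 1j * np.tanh(z.imag)

    steps = int(T / dt)
    z = np.zeros((steps + delay_steps + 1, n), dtype=complex)
    z[: delay_steps + 1] = 0.1 + 0.1j             # constant initial history on [-tau, 0]

    for k in range(delay_steps, steps + delay_steps):
        z_now, z_del = z[k], z[k - delay_steps]
        dz = -D @ z_now + A @ f(z_now) + B @ f(z_del) + u
        z[k + 1] = z_now + dt * dz                # forward Euler step

    print("state at t = T:", z[-1])

Plotting the real and imaginary parts of the trajectory over time would yield the kind of simulation figures typically used to illustrate convergence to an equilibrium point or to a periodic orbit.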

Bibliographic record

  • Author: Hu, Jin.
  • Author's affiliation: The Chinese University of Hong Kong (Hong Kong).
  • Degree grantor: The Chinese University of Hong Kong (Hong Kong).
  • Subject: Automotive engineering.
  • Degree: Ph.D.
  • Year: 2013
  • Pages: 166 p.
  • Total pages: 166
  • Format: PDF
  • Language: eng
  • CLC classification:
  • Keywords:
