
Sequential Learning Algorithm of Neural Networks Systems for Time Series



Abstract

This article describes a new structure for RBF neural networks that uses regression weights in place of the constant weights normally used. These regression weights are assumed to be functions of the input variables, which reduces the number of hidden units the network requires. A new type of nonlinear activation function is proposed: the pseudo-Gaussian function. With it, the neural system gains flexibility, as each neuron possesses an activation field that need not be symmetric with respect to the centre or to the location of the neuron in the input space. In addition to this new structure, we propose a sequential learning algorithm that adapts the structure of the network: it can create new hidden units and also detect and remove inactive ones. We present conditions for increasing or decreasing the number of neurons, based respectively on the novelty of the data and on the overall behaviour of the neural system (for example, pruning the hidden units with the lowest relevance to the neural system using Orthogonal Least Squares (OLS) and other operators). The feasibility of the resulting algorithm, and the evolution and learning capability of the neural network, are demonstrated by predicting time series.
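The grow-and-prune loop of the sequential algorithm can be caricatured as follows. This is my own assumption-laden sketch in the style of RAN-type novelty criteria (symmetric Gaussians, a scalar input, an LMS update, and a simple relevance threshold instead of the paper's OLS-based pruning); the thresholds `dist_min`, `err_min`, and `prune_tol` are hypothetical parameters:

```python
import numpy as np

def sequential_step(x, y, centers, widths, weights,
                    dist_min=0.5, err_min=0.1, prune_tol=1e-3):
    """One sequential-learning step: allocate a new hidden unit when the
    input is novel AND poorly predicted; otherwise adapt the existing
    weights; finally prune units whose weight has become negligible."""
    if centers.size == 0:
        # First sample: seed the network with one unit.
        return np.array([x]), np.array([dist_min]), np.array([y])
    phi = np.exp(-((x - centers) ** 2) / (2.0 * widths ** 2))
    err = y - np.sum(weights * phi)
    if np.min(np.abs(x - centers)) > dist_min and abs(err) > err_min:
        # Grow: the sample is far from every centre and badly fitted.
        centers = np.append(centers, x)
        widths = np.append(widths, dist_min)
        weights = np.append(weights, err)
    else:
        # Adapt: small LMS correction of the existing weights.
        weights = weights + 0.1 * err * phi
    # Prune: drop units with negligible contribution.
    keep = np.abs(weights) > prune_tol
    return centers[keep], widths[keep], weights[keep]
```

Feeding the stream one sample at a time, the network starts empty, grows on novel samples, and only adapts on familiar ones, which is the evolving-structure behaviour the abstract describes.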
