
Analysis of using RLS in neural fuzzy systems


Abstract

In this study, we continue our analysis of the use of recursive least squares (RLS) in neural fuzzy systems. RLS algorithms can achieve strong learning performance for neural fuzzy networks, although our previous work showed that their advantages over backpropagation (BP) are not pronounced. The forgetting factor in RLS is normally introduced to account for the effects of changes in the premise part. In this study, we observe that a forgetting factor can still be beneficial even when the premise part is fixed; the idea is similar to the use of the Widrow-Hoff learning rule in the backpropagation algorithm. Our experiments show that a strong forgetting factor (a smaller value) lets the consequent part trace the error during the learning phase, but the testing error then becomes very large. When the system capacity is sufficient, a forgetting factor improves performance in both the learning and testing phases. Finally, we consider the initial value of the covariance matrix: a larger initial value raises the learning capacity but also intensifies the error-tracing phenomenon in the consequent part, while the opposite holds in a system with less learning capacity.
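As a minimal sketch of the technique the abstract discusses, the standard RLS update with a forgetting factor `lam` (λ) can be written as follows. The variable names and the choice of initial covariance `1000 * I` are illustrative, not taken from the paper; in a neural fuzzy network the regressor `x` would be built from the premise-part firing strengths, and `theta` holds the consequent parameters.

```python
import numpy as np

def rls_update(theta, P, x, d, lam=0.98):
    """One recursive-least-squares step with forgetting factor lam (0 < lam <= 1).

    theta: (n, 1) consequent-parameter estimate
    P:     (n, n) inverse-correlation (covariance) matrix
    x:     (n,)   regressor vector
    d:     scalar desired output
    """
    x = x.reshape(-1, 1)
    k = P @ x / (lam + float(x.T @ P @ x))  # gain vector
    e = d - float(theta.T @ x)              # a priori error
    theta = theta + k * e                   # parameter update
    P = (P - k @ x.T @ P) / lam             # covariance update
    return theta, P, e

# A large initial covariance (e.g. 1000 * I) corresponds to the
# "large initial value" case discussed in the abstract: the estimator
# adapts quickly at first, at the cost of more error tracing.
theta, P = np.zeros((2, 1)), np.eye(2) * 1000.0
```

A smaller `lam` discounts old samples faster, which is the "strong forgetting factor" regime the abstract describes: the consequent part follows the recent error closely during learning, but generalization (testing error) can suffer.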
