Development and evaluation of multilayer perceptron training algorithms.

Abstract

This dissertation evaluates several advanced neural network training algorithms and develops new ones. Through this research, we obtain detailed properties of the existing algorithms, develop testing methods, and build new algorithms. The algorithms we start from are output weight optimization-hidden weight optimization (OWO-HWO) and full conjugate gradient (FCG).

The tasks addressed are: (1) developing optimal learning factors for these algorithms, (2) developing tests for learning factors, (3) developing methods for evaluating training algorithms, (4) investigating error functions in order to build fusing methods, and (5) developing algorithms that alternate between different types of algorithms.

We develop the optimal learning factor (OLF) because some existing learning factors are based on empirical rules while others take a long time to compute. Our OLF serves as a standard against which other learning factor methods are compared, and our new testing methods make it possible to compare different types of learning factors.

From experiments, we find that FCG works better on random data and OWO-HWO works better on correlated data. From this observation we devise alternating algorithms; the adaptive alternating algorithm works slightly better than the fixed alternating algorithms.

From performance observations and examination of the algorithms, we find that gradient-based methods such as FCG are affected by input biases, whereas OWO-HWO is immune to input biases and variances when net control is applied.

Although we experiment with several fusing methods, the first and second fusing methods are found to have problems. The third fusing algorithm, based on the properties of OWO-HWO and FCG and using normalized, Karhunen-Loève transformed data, turns out to be successful.
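The abstract describes the optimal learning factor (OLF) only at a high level. As a rough illustration of the general idea (not the dissertation's exact derivation), the sketch below fits a parabola to the training error of a small MLP along the negative gradient direction and uses the parabola's minimizer as the learning factor for that step. The network, data, probe step, and helper names (mse, numerical_grad, optimal_learning_factor) are hypothetical and introduced only for this example.

```python
import numpy as np

# Hypothetical data and a one-hidden-layer MLP (4 inputs, 8 hidden units, 1 output).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
t = np.sin(X @ rng.normal(size=4))

N_IN, N_HID = 4, 8

def unpack(w):
    W1 = w[:N_IN * N_HID].reshape(N_IN, N_HID)
    W2 = w[N_IN * N_HID:]
    return W1, W2

def mse(w):
    W1, W2 = unpack(w)
    y = np.tanh(X @ W1) @ W2
    return np.mean((y - t) ** 2)

def numerical_grad(w, eps=1e-5):
    # Central-difference gradient; adequate for a toy network of this size.
    g = np.zeros_like(w)
    for i in range(w.size):
        d = np.zeros_like(w)
        d[i] = eps
        g[i] = (mse(w + d) - mse(w - d)) / (2 * eps)
    return g

def optimal_learning_factor(w, g, probe=1e-2):
    # Fit E(z) ~= a + b*z + c*z**2 along the descent direction -g from three
    # error evaluations, then return the parabola's minimizer.
    e0, e1, e2 = mse(w), mse(w - probe * g), mse(w - 2 * probe * g)
    c = (e2 - 2 * e1 + e0) / (2 * probe ** 2)
    b = (e1 - e0) / probe - c * probe
    return -b / (2 * c) if c > 0 else probe  # fall back to the probe step

w = 0.1 * rng.normal(size=N_IN * N_HID + N_HID)
for epoch in range(20):
    g = numerical_grad(w)
    z = optimal_learning_factor(w, g)
    w -= z * g
    print(f"epoch {epoch:2d}  z = {z:.4f}  MSE = {mse(w):.5f}")
```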

Bibliographic Details

  • Author

    Kim, Tae-Hoon

  • Author Affiliation

    The University of Texas at Arlington

  • Degree-Granting Institution: The University of Texas at Arlington
  • Subject: Engineering, Electronics and Electrical
  • Degree: Ph.D.
  • Year: 2001
  • Pages: 89 p.
  • Total Pages: 89
  • Format: PDF
  • Language: English (eng)
  • Chinese Library Classification: Radio electronics and telecommunications
  • Keywords
