International Joint Conference on Neural Networks

Unsupervised Pre-training on Improving the Performance of Neural Network in Regression



Abstract

The paper empirically analyses the predictive performance of an Artificial Neural Network (ANN) when a pre-training mechanism is applied. The pre-training used here is the same as the training of a Deep Belief Network, where the network is formed by successively stacking Restricted Boltzmann Machines one above the other. A set of experiments is performed to understand in which scenarios the pre-trained ANN performs better than a randomly initialised ANN. The results show that the pre-trained model outperformed the randomly initialised ANN in terms of generalisation error and the number of computational units required, and, most importantly, it was more robust to changes in hyperparameters such as the learning rate and the model architecture. The only cost is the additional time spent in the pre-training phase. Further, the knowledge learned during pre-training, which is stored as weights in the ANN, is analysed using Hinton diagrams. The analysis gives a clear picture of how pre-training learned some of the hidden characteristics of the data.
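
The abstract itself contains no code; the following NumPy snippet is a minimal sketch of the greedy layer-wise RBM pre-training it describes, assuming Bernoulli-Bernoulli RBMs trained with one step of contrastive divergence (CD-1). Names such as `RBM`, `cd1_step`, and `pretrain_stack` are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli-Bernoulli RBM trained with one step of contrastive divergence (CD-1)."""
    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0.0, 0.01, (n_visible, n_hidden))  # small random weights
        self.b_v = np.zeros(n_visible)   # visible bias
        self.b_h = np.zeros(n_hidden)    # hidden bias
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_step(self, v0):
        # Positive phase: hidden probabilities and a binary sample given the data.
        p_h0 = self.hidden_probs(v0)
        h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
        # Negative phase: one Gibbs step back down and up again.
        p_v1 = self.visible_probs(h0)
        p_h1 = self.hidden_probs(p_v1)
        # CD-1 gradient approximation, averaged over the batch.
        batch = v0.shape[0]
        self.W += self.lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / batch
        self.b_v += self.lr * (v0 - p_v1).mean(axis=0)
        self.b_h += self.lr * (p_h0 - p_h1).mean(axis=0)

def pretrain_stack(data, layer_sizes, epochs=10):
    """Greedy layer-wise pre-training: each RBM is trained on the hidden
    activations of the layer below, as in Deep Belief Network training."""
    rbms, x = [], data
    for n_hidden in layer_sizes:
        rbm = RBM(x.shape[1], n_hidden)
        for _ in range(epochs):
            rbm.cd1_step(x)          # full-batch CD-1 update, for simplicity
        rbms.append(rbm)
        x = rbm.hidden_probs(x)      # feed activations upward to the next RBM
    return rbms

# Toy usage: pre-train a 8-16-4 stack, then use the learned weights to
# initialise the regression ANN before supervised fine-tuning.
X = rng.random((200, 8))
rbms = pretrain_stack(X, [16, 4])
init_weights = [r.W for r in rbms]
```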
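The abstract also mentions inspecting the pre-trained weights with Hinton diagrams, where the area of each square encodes a weight's magnitude and its colour encodes the sign. A diagram of that kind can be drawn with matplotlib roughly as below; the `hinton` helper is an illustrative sketch modelled on the well-known matplotlib gallery example, not code from the paper.

```python
import matplotlib.pyplot as plt
import numpy as np

def hinton(matrix, ax=None):
    """Draw a Hinton diagram: square area ~ |weight|, white = positive, black = negative."""
    ax = ax if ax is not None else plt.gca()
    ax.patch.set_facecolor('gray')
    ax.set_aspect('equal', 'box')
    ax.xaxis.set_major_locator(plt.NullLocator())
    ax.yaxis.set_major_locator(plt.NullLocator())
    max_w = np.abs(matrix).max()
    for (x, y), w in np.ndenumerate(matrix):
        color = 'white' if w > 0 else 'black'
        size = np.sqrt(abs(w) / max_w)   # area proportional to |w|
        ax.add_patch(plt.Rectangle((x - size / 2, y - size / 2), size, size,
                                   facecolor=color, edgecolor=color))
    ax.autoscale_view()
    ax.invert_yaxis()

# Visualise the weights of the first pre-trained layer.
hinton(rbms[0].W)
plt.show()
```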
