IEEE Transactions on Neural Networks and Learning Systems

Feature Selection Using a Neural Framework With Controlled Redundancy


Abstract

We first present a feature selection method based on a multilayer perceptron (MLP) neural network, called feature selection MLP (FSMLP). We explain how FSMLP can select essential features and discard derogatory and indifferent ones. Such a method may still pick up some useful but dependent (e.g., correlated) features, not all of which may be needed. We then propose a general scheme for feature selection with “controlled redundancy” (CoR). The proposed scheme, named FSMLP-CoR, can select features with controlled redundancy for both classification and function approximation/prediction problems. We also propose a new, more effective training scheme named mFSMLP-CoR. The idea is general in nature and can be used with other learning schemes as well. We demonstrate the effectiveness of the algorithms on several data sets, including a synthetic one, and show that the selected features are adequate to solve the problem at hand. Here, we use a measure of linear dependency to control the redundancy; the use of nonlinear measures of dependency, such as mutual information, is straightforward. The proposed schemes have several advantages. They do not require explicit evaluation of feature subsets: feature selection is integrated into the design of the decision-making system, so the network can consider all features together and retain whatever is necessary. Our methods can account for possible subtle nonlinear interactions between features, as well as between the features, the tool, and the problem being solved. They can also control the level of redundancy in the selected features. Of the two learning schemes, mFSMLP-CoR not only improves the performance of the system but also significantly reduces the dependence of the network's behavior on the initialization of the connection weights.
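The abstract does not spell out the exact gate function or penalty terms, but the overall idea — multiplicative gates on the inputs of an MLP, trained jointly with the network so that unnecessary features are attenuated while a dependency-based penalty discourages keeping correlated features open together — can be sketched as follows. This is a minimal NumPy illustration under assumed choices (sigmoid gates, a mean-squared-error task loss, and absolute Pearson correlation as the linear dependency measure), not the exact FSMLP-CoR formulation; the function name train_gated_mlp and all hyperparameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_gated_mlp(X, y, hidden=8, epochs=2000, lr=0.05,
                    lam_gate=0.01, lam_red=0.05):
    """Gated MLP for regression: X is (n, d), y is (n,)."""
    n, d = X.shape
    # Absolute pairwise correlation as the (linear) dependency measure.
    C = np.abs(np.corrcoef(X, rowvar=False))
    np.fill_diagonal(C, 0.0)

    beta = np.zeros(d)                        # gate parameters; sigmoid(0) = 0.5
    W1 = rng.normal(0.0, 0.5, (d, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, hidden)
    b2 = 0.0

    for _ in range(epochs):
        g = sigmoid(beta)                     # gate values in (0, 1)
        Xg = X * g                            # gated inputs
        H = np.tanh(Xg @ W1 + b1)             # hidden layer
        yhat = H @ W2 + b2                    # network output

        err = yhat - y                        # gradient of 0.5 * (yhat - y)^2
        # Backpropagation through the MLP.
        dW2 = H.T @ err / n
        db2 = err.mean()
        dH = np.outer(err, W2) * (1.0 - H**2)
        dW1 = Xg.T @ dH / n
        db1 = dH.mean(axis=0)
        dXg = dH @ W1.T

        # Gate gradient: task term + sparsity pressure + redundancy penalty
        # lam_red * sum_{i<j} C_ij * g_i * g_j, which penalises keeping two
        # correlated features open at the same time.
        dg_task = (dXg * X).mean(axis=0)
        dg_red = lam_red * (C @ g)
        dbeta = (dg_task + lam_gate + dg_red) * g * (1.0 - g)

        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
        beta -= lr * dbeta

    return sigmoid(beta), (W1, b1, W2, b2)

# Toy example: x2 is a near-copy of x0 (redundant), x3 is pure noise.
X = rng.normal(size=(400, 4))
X[:, 2] = X[:, 0] + 0.01 * rng.normal(size=400)
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]
gates, _ = train_gated_mlp(X, y)
print("gate values:", np.round(gates, 2))     # ideally x3 and one of x0/x2 end up small
```

With lam_red > 0, pairs of highly correlated features compete through the C_ij g_i g_j term, so typically only one of them keeps a large gate value; setting lam_red = 0 in this sketch reduces it to plain gate-based feature selection without redundancy control.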
