International Joint Conference on Neural Networks

Robust estimation for subspace based classifiers



Abstract

The nearest subspace classifier (NSC) assumes that the samples of every class lie on a separate subspace, so a test sample can be classified by computing its distance to each of these subspaces. The sparse representation based classification (SRC) generalizes the NSC: it assumes that the samples of any class can lie on a union of subspaces, and the test sample is classified by calculating its distance to these subspaces. Both NSC and SRC hinge on the assumption that the distance between the test sample and the correct subspace is small and approximately Normally distributed. Based on this assumption, earlier studies proposed using an l2-norm distance measure. It is well known, however, that the l2-norm is sensitive to outliers (large deviations at a few locations). To make NSC and SRC robust and improve their performance, we propose to employ an l1-norm based distance measure instead. Experiments on benchmark classification problems, face recognition, and character recognition show that the proposed method indeed improves upon the basic versions of NSC and SRC; in fact, our proposed robust NSC and robust SRC yield even better results than the support vector machine and the neural network.
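
As one concrete reading of this setup, the sketch below implements a nearest subspace classifier whose distance to each class subspace is measured either with the usual l2 residual or with an l1 residual. The paper does not prescribe a solver for the l1 minimization; the iteratively reweighted least squares (IRLS) routine, the function names (l1_residual, nsc_predict), and the toy data at the end are assumptions added here purely for illustration.

import numpy as np

def l1_residual(A, y, n_iter=20, eps=1e-6):
    # Approximate min_x ||y - A x||_1 by iteratively reweighted least squares:
    # each pass solves a weighted l2 problem with row weights 1/sqrt(|r_i|).
    x = np.linalg.lstsq(A, y, rcond=None)[0]          # l2 warm start
    for _ in range(n_iter):
        r = y - A @ x
        w = 1.0 / np.sqrt(np.maximum(np.abs(r), eps)) # guard against division by zero
        x = np.linalg.lstsq(A * w[:, None], w * y, rcond=None)[0]
    return np.abs(y - A @ x).sum()                    # l1 distance to the class subspace

def nsc_predict(class_mats, y, robust=True):
    # class_mats: one matrix per class, training samples stacked as columns.
    # Assign y to the class whose subspace gives the smallest residual.
    dists = []
    for A in class_mats:
        if robust:
            dists.append(l1_residual(A, y))           # proposed l1-norm distance
        else:
            x = np.linalg.lstsq(A, y, rcond=None)[0]
            dists.append(np.linalg.norm(y - A @ x))   # classical l2-norm distance
    return int(np.argmin(dists))

# Toy usage: two classes whose samples span different rank-2 subspaces of R^10.
rng = np.random.default_rng(0)
class_mats = [rng.standard_normal((10, 2)) @ rng.standard_normal((2, 5)) for _ in range(2)]
y = class_mats[1] @ rng.standard_normal(5)            # lies on class 1's subspace
print(nsc_predict(class_mats, y))                     # expected output: 1

The same l1 residual could be dropped into an SRC-style classifier by replacing the per-class least-squares fit with a sparse coding step over all classes; the sketch above only covers the NSC case.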
