Journal: Pattern Recognition: The Journal of the Pattern Recognition Society

1D-LDA vs. 2D-LDA: When is vector-based linear discriminant analysis better than matrix-based?



Abstract

Recent advances have shown that algorithms with a (2D) matrix-based representation perform better than traditional (1D) vector-based ones. In particular, 2D-LDA has been widely reported to outperform 1D-LDA. However, is matrix-based linear discriminant analysis always superior, and when would 1D-LDA be better? In this paper, we investigate these questions and present a comprehensive comparison between 1D-LDA and 2D-LDA, both in theory and in experiments. We analyze the heteroscedastic problem in 2D-LDA and formulate mathematical equalities to explore the relationship between 1D-LDA and 2D-LDA; we then point out potential problems in 2D-LDA. It is shown that 2D-LDA eliminates the information contained in the covariance between different local geometric structures, such as the rows or the columns, which is useful for discriminant feature extraction, whereas 1D-LDA can preserve such information. Interestingly, this new finding indicates that 1D-LDA is able to attain a higher Fisher score than 2D-LDA in some extreme cases. Furthermore, sufficient conditions under which 2D-LDA would be Bayes optimal for the two-class classification problem are derived, and a comparison with 1D-LDA in this respect is also analyzed. This helps explain how 2D-LDA is expected to perform at its best, further reveals its relationship with 1D-LDA, and supports the other findings. After the theoretical analysis, comprehensive experimental results are reported by fairly and extensively comparing 1D-LDA with 2D-LDA. In contrast to the existing view that some 2D-LDA based algorithms perform better than 1D-LDA when the number of training samples per class is small or when the number of discriminant features used is small, we show that this is not always true: some standard 1D-LDA based algorithms can perform better in those cases on some challenging data sets. (c) 2007 Elsevier Ltd. All rights reserved.
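
To make the contrast in the abstract concrete, below is a minimal Python/NumPy sketch (not the authors' implementation) of the two families being compared: classical 1D-LDA, which vectorizes each image before computing the between- and within-class scatter matrices, and a right-sided 2D-LDA variant, which builds scatter matrices directly from the image matrices and therefore models only column-wise covariance, discarding the cross-covariance between rows and columns. The function names lda_1d and lda_2d and the toy data are illustrative assumptions, not part of the paper.

import numpy as np

def lda_1d(X, y, n_components):
    # Classical (1D) LDA: images are vectorized before computing scatter matrices.
    n, h, w = X.shape
    Xv = X.reshape(n, h * w)                       # vectorize each image
    mean = Xv.mean(axis=0)
    d = h * w
    Sw = np.zeros((d, d))                          # within-class scatter
    Sb = np.zeros((d, d))                          # between-class scatter
    for c in np.unique(y):
        Xc = Xv[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    # Generalized eigenproblem Sb v = lambda Sw v; pseudo-inverse since Sw is often singular.
    evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-evals.real)[:n_components]
    return evecs[:, order].real                    # columns are discriminant directions

def lda_2d(X, y, n_components):
    # Right-sided 2D-LDA: scatter matrices are built from image matrices directly,
    # so only column-wise covariance is modeled; row/column cross-covariance is lost.
    n, h, w = X.shape
    mean = X.mean(axis=0)
    Sw = np.zeros((w, w))
    Sb = np.zeros((w, w))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        for A in Xc:
            Sw += (A - mc).T @ (A - mc)
        Sb += len(Xc) * (mc - mean).T @ (mc - mean)
    evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-evals.real)[:n_components]
    return evecs[:, order].real                    # project an image A as A @ W

if __name__ == "__main__":
    # Toy two-class "image" data, purely for illustration.
    rng = np.random.default_rng(0)
    h, w, n_per_class = 8, 8, 20
    X0 = rng.normal(0.0, 1.0, (n_per_class, h, w))
    X1 = rng.normal(0.5, 1.0, (n_per_class, h, w))
    X = np.concatenate([X0, X1])
    y = np.array([0] * n_per_class + [1] * n_per_class)
    W1 = lda_1d(X, y, n_components=1)   # shape (h*w, 1): operates on the full vectorized covariance
    W2 = lda_2d(X, y, n_components=2)   # shape (w, 2): operates on column-wise scatter only
    print(W1.shape, W2.shape)

The sketch shows the structural point made in the abstract: 1D-LDA works with the full (h*w)-dimensional covariance of the vectorized images, while this 2D-LDA variant reduces everything to w-by-w scatter matrices, which is cheaper but cannot represent covariance across different rows.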

