Neural Processing Letters

A Fast Simplified Fuzzy ARTMAP Network



Abstract

We present an algorithmic variant of the simplified fuzzy ARTMAP (SFAM) network, whose structure resembles that of feed-forward networks. Its differences from Kasuba's model are discussed, and their performances are compared on two benchmarks. We show that our algorithm is much faster than Kasuba's algorithm, and that as the number of training samples increases, the speed difference grows enormously. The performances of the SFAM and the MLP (multilayer perceptron) are compared on three problems: the two benchmarks and the Farsi optical character recognition (OCR) problem. For training the MLP, two different variants of the backpropagation algorithm are used: the BPLRF algorithm (backpropagation with plummeting learning rate factor) for the benchmarks, and the BST algorithm (backpropagation with selective training) for the Farsi OCR problem. The results obtained on all three case studies with the MLP and the SFAM, embedded in their customized systems, show that the SFAM's convergence in fast-training mode is faster than that of the MLP, while online operation of the MLP is faster than that of the SFAM. On the benchmark problems the MLP has a much better recognition rate than the SFAM. On the Farsi OCR problem, the recognition error of the SFAM is higher than that of the MLP on ill-engineered datasets, but equal on well-engineered ones. The flexible configuration of the SFAM, i.e. its capability to grow the network in order to learn new patterns, as well as its simple parameter adjustment, remain unchallenged by the MLP.
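The abstract does not reproduce the SFAM equations, but the mechanisms it highlights (fast one-shot training and growing the network to learn new patterns) can be illustrated with the standard fuzzy ARTMAP formulation that SFAM simplifies: complement coding, the choice function T_j = |I ∧ w_j| / (α + |w_j|), a vigilance test, match tracking, and fast learning w_j ← I ∧ w_j. The sketch below is a minimal illustration of that generic scheme, not the specific variant proposed in the paper; all names and parameter defaults are assumptions.

```python
import numpy as np

class SimplifiedFuzzyARTMAP:
    """Minimal SFAM-style sketch (generic fuzzy ARTMAP rules, not the
    paper's exact variant): complement coding, choice function,
    vigilance test, match tracking, fast one-shot learning."""

    def __init__(self, rho=0.75, alpha=0.001, eps=0.001):
        self.rho_base = rho   # baseline vigilance
        self.alpha = alpha    # choice parameter
        self.eps = eps        # match-tracking increment
        self.w = []           # one weight vector per committed category
        self.labels = []      # class label attached to each category

    @staticmethod
    def _complement_code(a):
        # Input a in [0,1]^d becomes I = [a, 1 - a]; |I| is constant (= d)
        return np.concatenate([a, 1.0 - a])

    def train_one(self, a, label):
        I = self._complement_code(np.asarray(a, dtype=float))
        rho = self.rho_base
        # Rank committed categories by T_j = |I ^ w_j| / (alpha + |w_j|)
        order = sorted(
            range(len(self.w)),
            key=lambda j: -(np.minimum(I, self.w[j]).sum()
                            / (self.alpha + self.w[j].sum())))
        for j in order:
            match = np.minimum(I, self.w[j]).sum() / I.sum()
            if match < rho:
                continue                              # fails vigilance
            if self.labels[j] == label:
                self.w[j] = np.minimum(I, self.w[j])  # fast learning
                return j
            rho = match + self.eps   # match tracking: raise vigilance
        # No committed category both matches and predicts the right
        # label: recruit a new one -- this is how the network grows.
        self.w.append(I.copy())
        self.labels.append(label)
        return len(self.w) - 1

    def predict(self, a):
        I = self._complement_code(np.asarray(a, dtype=float))
        scores = [np.minimum(I, wj).sum() / (self.alpha + wj.sum())
                  for wj in self.w]
        return self.labels[int(np.argmax(scores))]
```

A short usage example: training one sample per class commits one category each, and nearby inputs are then classified by the winning category's label.

```python
net = SimplifiedFuzzyARTMAP(rho=0.6)
net.train_one([0.1, 0.2], 0)
net.train_one([0.9, 0.8], 1)
print(net.predict([0.15, 0.25]))  # → 0
```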
