International Conference on Natural Computation (ICNC '09)

A Novel Clustering Method Combining Heuristics and Information Theorem

Abstract

Many data mining tasks require the unsupervised partitioning of a data set into clusters. However, in many cases we have no prior knowledge about the clusters, such as their density or shape. This paper addresses two major issues associated with conventional competitive learning, namely, sensitivity to initialization and the difficulty of determining the number of clusters. Many methods exist for such clustering, but most of them assume hyper-ellipsoidal clusters. Moreover, many heuristically proposed competitive learning methods and their variants are somewhat ad hoc, lacking theoretical support. Under these considerations, we propose an algorithm named Entropy-guided Splitting Competitive Learning (ESCL) within an information-theoretic framework. Simulations show that minimization of the partition entropy can guide the competitive learning process so as to estimate the number and structure of the probable data generators.
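
The abstract names two ingredients, a competitive learning update and the partition entropy of a soft assignment, without giving the concrete ESCL splitting rule. The sketch below is a minimal, hypothetical NumPy illustration of just those two quantities; the function names (competitive_step, soft_memberships, partition_entropy), the softmax-based memberships, and all parameter values are assumptions for illustration, not the authors' algorithm.

```python
# Hypothetical sketch of the two quantities the abstract names: a
# winner-take-all competitive learning update and the partition entropy of a
# soft assignment matrix. This is NOT the authors' ESCL implementation.
import numpy as np

def competitive_step(x, prototypes, lr=0.05):
    """Move the winning prototype (nearest to x) a small step toward x."""
    k = np.argmin(((prototypes - x) ** 2).sum(axis=1))
    prototypes[k] += lr * (x - prototypes[k])
    return prototypes

def soft_memberships(X, prototypes, beta=2.0):
    """Softmax of negative squared distances: one membership row per point."""
    d2 = ((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=2)  # (N, K)
    logits = -beta * d2
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    u = np.exp(logits)
    return u / u.sum(axis=1, keepdims=True)

def partition_entropy(u, eps=1e-12):
    """Average row entropy of the memberships; lower means a crisper partition."""
    return -np.mean(np.sum(u * np.log(u + eps), axis=1))

# Tiny usage example on synthetic 2-D data with two prototypes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
prototypes = X[rng.choice(len(X), 2, replace=False)].copy()
for epoch in range(20):
    for x in X:
        prototypes = competitive_step(x, prototypes)
print(partition_entropy(soft_memberships(X, prototypes)))
```

In an entropy-guided scheme of the kind the abstract describes, such a partition entropy value would be monitored while prototypes are adapted or split; the specific decision rule is left to the paper itself.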
