IEEE Transactions on Neural Networks and Learning Systems

Learning With Mixed Hard/Soft Pointwise Constraints



Abstract

A learning paradigm is proposed and investigated, in which the classical framework of learning from examples is enhanced by the introduction of hard pointwise constraints, i.e., constraints imposed on a finite set of examples that cannot be violated. Such constraints arise, e.g., when requiring coherent decisions of classifiers acting on different views of the same pattern. The classical examples of supervised learning, which can be violated at the cost of some penalization (quantified by the choice of a suitable loss function), play the role of soft pointwise constraints. Constrained variational calculus is exploited to derive a representer theorem that describes the functional structure of the optimal solution to the proposed learning paradigm. It is shown that such an optimal solution can be represented in terms of a set of support constraints, which generalize the concept of support vectors and open the door to a novel learning paradigm, called support constraint machines. The general theory is applied to derive the representation of the optimal solution to the problem of learning from hard linear pointwise constraints combined with soft pointwise constraints induced by supervised examples. In some cases, closed-form optimal solutions are obtained.
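To make the setting concrete, the following is a hedged sketch of the kind of mixed hard/soft pointwise-constrained problem the abstract describes, written for a kernel-based hypothesis space. The reproducing-kernel space H_k, the regularizer, the loss V, and the constraint functions phi_j are illustrative assumptions, not the exact functional studied in the paper.

% Illustrative formulation only; notation is assumed, not taken from the paper.
\begin{aligned}
\min_{f \in \mathcal{H}_k} \quad & \sum_{i=1}^{\ell} V\bigl(y_i, f(x_i)\bigr) \;+\; \lambda\,\lVert f \rVert_{\mathcal{H}_k}^{2}
  && \text{(soft pointwise constraints: supervised pairs } (x_i, y_i) \text{)} \\
\text{subject to} \quad & \phi_j\bigl(f(z_j)\bigr) = 0, \qquad j = 1,\dots,m
  && \text{(hard pointwise constraints at points } z_j \text{)}
\end{aligned}

Under a formulation of this kind, a representer theorem in the spirit of the abstract would express the optimizer through coefficients attached both to the supervised points and to the hard-constraint points (the "support constraints"), e.g.

f^{\star}(x) \;=\; \sum_{i=1}^{\ell} \alpha_i\, k(x_i, x) \;+\; \sum_{j=1}^{m} \beta_j\, k(z_j, x).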
