Smooth ε-Insensitive Regression by Loss Symmetrization

Abstract

We describe a framework for solving regression problems by reduction to classification. Our reduction is based on symmetrization of the margin-based loss functions commonly used in boosting, namely the logistic loss and the exponential loss. The construction yields a smooth version of the ε-insensitive hinge loss used in support vector regression. A byproduct of this construction is a new, simple form of regularization for boosting-based classification and regression algorithms. We present two parametric families of batch learning algorithms for minimizing these losses: the first employs a log-additive update and builds on recent boosting algorithms, while the second uses a new form of additive update. We also describe and analyze online gradient descent (GD) and exponentiated gradient (EG) algorithms for the ε-insensitive logistic loss. Our regression framework also has implications for classification, yielding a new additive batch algorithm for the log-loss and exp-loss used in boosting.
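To make the central construction concrete, the sketch below is a minimal illustration (not the authors' code) of loss symmetrization as the abstract describes it: the margin-based loss is applied to the discrepancy between prediction and target in both directions, each shifted by ε, so that deviations of up to ε incur almost no penalty while the result stays smooth. The function names, the ε default, and the linear-model gradient step are illustrative assumptions; the paper's own log-additive, additive, and EG updates are not reproduced here.

```python
import numpy as np

def sym_log_loss(y, y_hat, eps=0.1):
    """Smooth eps-insensitive logistic loss: the logistic loss
    mirrored around the target with an eps offset. Nearly flat
    for |y_hat - y| <= eps, roughly linear outside."""
    d = y_hat - y
    # logaddexp(0, x) computes log(1 + e^x) stably
    return np.logaddexp(0.0, d - eps) + np.logaddexp(0.0, -d - eps)

def sym_exp_loss(y, y_hat, eps=0.1):
    """Symmetrized exponential loss: boosting's exp-loss mirrored
    around the target with the same eps offset."""
    d = y_hat - y
    return np.exp(d - eps) + np.exp(-d - eps)

def gd_step(w, X, y, eps=0.1, lr=0.01):
    """One plain gradient-descent step on the symmetrized logistic
    loss for a linear predictor y_hat = X @ w (illustrative only)."""
    d = X @ w - y
    # dL/dd = sigmoid(d - eps) - sigmoid(-d - eps)
    g = 1.0 / (1.0 + np.exp(-(d - eps))) - 1.0 / (1.0 + np.exp(d + eps))
    return w - lr * (X.T @ g) / len(y)
```

As ε → 0 the symmetrized logistic loss approaches a smooth absolute-error penalty, which is what makes it a differentiable stand-in for the ε-insensitive hinge loss of support vector regression.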
