Journal: IEEE Transactions on Neural Networks and Learning Systems

Smoothing Neural Network for Constrained Non-Lipschitz Optimization With Applications



Abstract

In this paper, a smoothing neural network (SNN) is proposed for a class of constrained non-Lipschitz optimization problems, where the objective function is the sum of a nonsmooth, nonconvex function and a non-Lipschitz function, and the feasible set is a closed convex subset of $\mathbb{R}^{n}$. Using smoothing approximation techniques, the proposed neural network is modeled by a differential equation, which can be implemented easily. Under a level-boundedness condition on the objective function over the feasible set, we prove the global existence and uniform boundedness of the solutions of the SNN for any initial point in the feasible set. Uniqueness of the solution of the SNN is established under a Lipschitz property of the smoothing functions. We show that any accumulation point of the solutions of the SNN is a stationary point of the optimization problem. Numerical results on image restoration, blind source separation, variable selection, and condition-number minimization are presented to illustrate the theoretical results and show the efficiency of the SNN. Comparisons with some existing algorithms show the advantages of the SNN.
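To make the abstract's two core ingredients concrete, the sketch below illustrates the general idea on a hypothetical instance: a non-Lipschitz term $\sum_i |x_i|^p$ with $0 < p < 1$ is replaced by the smooth surrogate $(x_i^2 + \mu^2)^{p/2}$, and a projected gradient-flow ODE over a box-shaped feasible set is integrated by forward Euler. The objective, smoothing function, feasible set, step sizes, and $\mu$-schedule here are all illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

# Illustrative (NOT the paper's exact model): minimize
#   f(x) = 0.5*||Ax - b||^2 + lam * sum_i |x_i|^p,  0 < p < 1  (non-Lipschitz at 0),
# over the closed convex set X = [0, 1]^n.
# Smoothing: |x|^p is approximated by (x^2 + mu^2)^(p/2), which -> |x|^p as mu -> 0.

def smoothed_grad(x, A, b, lam, p, mu):
    """Gradient of the smoothed objective f_mu (finite for mu > 0)."""
    data_grad = A.T @ (A @ x - b)
    reg_grad = lam * p * x * (x**2 + mu**2) ** (p / 2 - 1)
    return data_grad + reg_grad

def snn_flow(A, b, lam=0.1, p=0.5, mu0=1.0, h=0.01, steps=2000):
    """Forward-Euler integration of the projected gradient flow
       dx/dt = P_X(x - grad f_mu(x)) - x, with mu slowly driven toward 0."""
    n = A.shape[1]
    x = np.full(n, 0.5)  # initial point inside the feasible set
    for k in range(steps):
        mu = mu0 / (1.0 + 0.01 * k)              # shrink smoothing parameter
        g = smoothed_grad(x, A, b, lam, p, mu)
        proj = np.clip(x - g, 0.0, 1.0)          # projection onto X = [0,1]^n
        x = x + h * (proj - x)                   # Euler step; x stays in X
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[:3] = 0.8
b = A @ x_true
x_hat = snn_flow(A, b)
print(x_hat)
```

Because each Euler step is a convex combination of `x` and a point projected into $X$, the trajectory never leaves the feasible set, mirroring the invariance property the abstract proves for solutions starting in the feasible set.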
