Annual Computational Neuroscience Meeting (CNS'02); July 21-25, 2002; Chicago, IL, USA

Learning recurrent neural models with minimal complexity from neural tuning

Abstract

A learning algorithm for estimating the structure of nonlinear recurrent neural models from neural tuning data is presented. The proposed method combines support vector regression with additional constraints that result from a stability analysis of the dynamics of the fitted network model. The optimal solution is determined by a single convex optimization problem that can be solved with semidefinite programming techniques. The method successfully estimates the feed-forward and the recurrent connectivity structure of neural field models using only examples of stable stationary solutions of the neural dynamics as data. The class of neural models that can be learned is quite general. The only a priori assumptions are the translation invariance and the smoothness of the feed-forward and recurrent spatial connectivity profiles. The efficiency of the method is illustrated by comparing it with estimates based on radial basis function networks and support vector regression.
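The abstract gives the general recipe but no implementation details. The Python sketch below illustrates the overall shape of such a method: an epsilon-insensitive (SVR-style) fit of the stationarity condition of a rate dynamics, combined with a semidefinite stability constraint, posed as one convex program. Everything concrete here is an assumption for illustration, not the paper's formulation: the tanh nonlinearity, the spectral-norm stability condition, the cvxpy/SCS toolchain, the toy data, and all variable names are hypothetical.

```python
import numpy as np
import cvxpy as cp

# Toy problem sizes (illustrative assumptions, not from the paper).
n, m = 20, 30                  # n neurons, m example stationary states
rng = np.random.default_rng(0)

# Hypothetical data: stable stationary states x_k of a rate dynamics
#   tau * dx/dt = -x + W f(x) + h_k,   with f = tanh (an assumption).
X = rng.standard_normal((n, m))        # observed states x_k, one per column
H = rng.standard_normal((n, m))        # external inputs h_k
F = np.tanh(X)                         # f(x_k) for each example

eps = 0.05                     # epsilon-insensitive tube width (SVR)
C = 10.0                       # penalty on slack variables
delta = 0.05                   # stability margin

W = cp.Variable((n, n))                 # recurrent connectivity to estimate
xi = cp.Variable((n, m), nonneg=True)   # SVR slack variables

# Epsilon-insensitive fit of the stationarity condition x = W f(x) + h.
residual = X - W @ F - H
fit = [cp.abs(residual) <= eps + xi]

# Sufficient stability constraint: if sigma_max(W) < 1 and |f'| <= 1,
# the Jacobian -I + W diag(f'(x*)) is stable at every fixed point.
# The spectral-norm bound is SDP-representable, so the whole problem
# remains a single convex (semidefinite) program.
stab = [cp.sigma_max(W) <= 1.0 - delta]

# Flatness regularizer plus slack penalty, as in standard SVR.
prob = cp.Problem(cp.Minimize(cp.sum_squares(W) + C * cp.sum(xi)), fit + stab)
prob.solve(solver=cp.SCS)
print(prob.status, "sigma_max(W) =", np.linalg.svd(W.value, compute_uv=False).max())
```

The paper's translation-invariance assumption is not imposed above; under a periodic discretization it could be added without losing convexity by parameterizing W as a circulant matrix built from a single connectivity profile vector, so that only the profile is estimated.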
