SIAM Journal on Scientific Computing

NEARLY OPTIMAL PRECONDITIONED METHODS FOR HERMITIAN EIGENPROBLEMS UNDER LIMITED MEMORY. PART I: SEEKING ONE EIGENVALUE



Abstract

Large, sparse, Hermitian eigenvalue problems are still some of the most computationally challenging tasks. Despite the need for a robust, nearly optimal preconditioned iterative method that can operate under severe memory limitations, no such method has surfaced as a clear winner. In this research we approach the eigenproblem from the nonlinear perspective, which helps us develop two nearly optimal methods. The first extends the recent Jacobi–Davidson conjugate gradient (JDCG) method to JDQMR, improving robustness and efficiency. The second method, generalized-Davidson+1 (GD+1), utilizes the locally optimal conjugate gradient recurrence as a restarting technique to achieve almost optimal convergence. We describe both methods within a unifying framework and provide theoretical justification for their near optimality. A choice between the most efficient of the two can be made at runtime. Our extensive experiments confirm the robustness, the near optimality, and the efficiency of our multimethod over other state-of-the-art methods.
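The "locally optimal conjugate gradient recurrence" that GD+1 employs as a restarting technique can be illustrated in isolation. Below is a minimal single-vector sketch (not the authors' implementation, which adds preconditioning and thick restarting): at each step it performs Rayleigh–Ritz over the three-dimensional subspace spanned by the current iterate `x`, its eigenresidual `r`, and a CG-like direction `p` retained from the previous step. The function name and all parameters are illustrative assumptions.

```python
import numpy as np

def lo_min_eigpair(A, x0, tol=1e-8, maxit=200):
    """Single-vector locally optimal recurrence (LOBPCG-style sketch) for
    the smallest eigenpair of a dense Hermitian matrix A, unpreconditioned."""
    x = x0 / np.linalg.norm(x0)
    p = None  # previous search direction (none on the first step)
    for _ in range(maxit):
        theta = np.real(x.conj() @ (A @ x))  # Rayleigh quotient
        r = A @ x - theta * x                # eigenresidual
        if np.linalg.norm(r) < tol:
            break
        # Rayleigh-Ritz over the locally optimal subspace span{x, r, p}
        cols = [x, r] if p is None else [x, r, p]
        V, _ = np.linalg.qr(np.column_stack(cols))
        w, Y = np.linalg.eigh(V.conj().T @ A @ V)
        x_new = V @ Y[:, 0]                  # Ritz vector of smallest Ritz value
        # keep the component of the new iterate orthogonal to the old one
        # as the implicit CG-like direction for the next step
        p = x_new - x * (x.conj() @ x_new)
        p_norm = np.linalg.norm(p)
        if p_norm > 0:
            p /= p_norm
        x = x_new
    return np.real(x.conj() @ (A @ x)), x
```

This three-term recurrence is the kernel whose convergence the restarted Davidson-type methods in the paper aim to match; the full GD+1 method wraps it in a larger restarted subspace with a preconditioner applied to the residual.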
