[PDF] Steepest Descent and Conjugate Gradient Methods with Variable Preconditioning

By a mysterious writer
Last updated 27 September 2024
We analyze the conjugate gradient (CG) method with variable preconditioning for solving a linear system with a real symmetric positive definite (SPD) matrix of coefficients $A$. We assume that the preconditioner is SPD on each step, and that the condition number of the preconditioned system matrix is bounded above by a constant independent of the step number. We show that the CG method with variable preconditioning under this assumption may not give any improvement over the steepest descent (SD) method. We describe the basic theory of CG methods with variable preconditioning, with emphasis on "worst case" scenarios, and provide complete proofs of all facts not available in the literature. We give a new elegant geometric proof of the SD convergence rate bound. Our numerical experiments comparing the preconditioned SD and CG methods not only support and illustrate our theoretical findings, but also reveal two surprising and potentially practically important effects. First, we analyze variable preconditioning in the form of inner-outer iterations. In previous such tests, the unpreconditioned CG inner iterations are applied to an artificial system with some fixed preconditioner as a matrix of coefficients. We test a different scenario, where the unpreconditioned CG inner iterations solve linear systems with the original system matrix $A$. We demonstrate that the CG-SD inner-outer iterations perform as well as the CG-CG inner-outer iterations in these tests. Second, we compare the CG methods using a two-grid preconditioning with fixed and randomly chosen coarse grids, and observe that the fixed preconditioner method is twice as slow as the method with random preconditioning.
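The abstract contrasts preconditioned steepest descent with a flexible (variable-preconditioner) CG iteration. The NumPy sketch below is not the authors' code; it is a minimal illustration of the two outer iterations under the assumption that the preconditioner is supplied as a callback `precond(r, k)` returning $B_k^{-1} r$ for an SPD $B_k$ that may change on every step $k$. The Polak-Ribière-type formula for $\beta$ is a standard choice when the preconditioner varies; the function names, the test matrix, and the Jacobi preconditioner in the usage example are illustrative assumptions. For reference, the classical SD bound reads $\|e_{k+1}\|_A \le \frac{\kappa - 1}{\kappa + 1}\,\|e_k\|_A$, where $\kappa$ bounds the spectral condition number of $B_k^{-1} A$.

```python
import numpy as np

def preconditioned_sd(A, b, precond, x0, tol=1e-8, maxit=500):
    """Preconditioned steepest descent. `precond(r, k)` returns B_k^{-1} r,
    where B_k is an SPD preconditioner that may change on every step k."""
    x = x0.astype(float)
    r = b - A @ x
    nb = np.linalg.norm(b)
    for k in range(maxit):
        if np.linalg.norm(r) <= tol * nb:
            break
        z = precond(r, k)              # preconditioned residual, z = B_k^{-1} r
        Az = A @ z
        alpha = (r @ z) / (z @ Az)     # exact line search in the A-norm along z
        x += alpha * z
        r -= alpha * Az
    return x

def flexible_pcg(A, b, precond, x0, tol=1e-8, maxit=500):
    """Flexible PCG: a Polak-Ribiere-type beta preserves local optimality
    even when the preconditioner B_k varies from step to step."""
    x = x0.astype(float)
    r = b - A @ x
    nb = np.linalg.norm(b)
    z = precond(r, 0)
    p = z.copy()
    rz = r @ z
    for k in range(1, maxit + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) <= tol * nb:
            break
        z_new = precond(r_new, k)
        beta = (z_new @ (r_new - r)) / rz   # reduces to the usual formula if B_k is fixed
        p = z_new + beta * p
        r, rz = r_new, r_new @ z_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 200
    M = rng.standard_normal((n, n))
    A = M @ M.T + n * np.eye(n)            # SPD test matrix (illustrative)
    b = rng.standard_normal(n)
    jacobi = lambda r, k: r / np.diag(A)   # simple fixed SPD preconditioner (Jacobi)
    x_sd = preconditioned_sd(A, b, jacobi, np.zeros(n))
    x_cg = flexible_pcg(A, b, jacobi, np.zeros(n))
    print(np.linalg.norm(A @ x_sd - b), np.linalg.norm(A @ x_cg - b))
```

In the inner-outer scenario described in the abstract, `precond` would instead run a few unpreconditioned CG inner iterations on a system with $A$ itself (or with a fixed auxiliary matrix), which is exactly what makes the effective $B_k^{-1}$ change from one outer step to the next.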
Related:
[PDF] Comparison of steepest descent method and conjugate gradient
The Conjugate Gradient Method
Preconditioned Conjugate Gradient Descent (ILU)
Conjugate gradient method - Wikipedia
Descent carefully on a gradient! In machine learning, gradient
cg
Preconditioned steepest descent-like methods for symmetric
A link between the steepest descent method and fixed-point
Conjugate Gradient Method - an overview
Mathematics, Free Full-Text
(PDF) Algorithm 809: PREQN: Fortran 77 subroutines for
