Very large systems of linear
equations cannot be solved by standard dense
matrix techniques. However, approximate solutions
with a high degree of accuracy can often be
obtained by projection methods (or Krylov subspace
methods), which build increasingly accurate
approximations to the solution by iteration.
The original problem is thus solved by projecting
it onto a subspace of smaller dimension. The
projection involves the definition of a trial
solution and a weighting function space. By
selecting these spaces appropriately, several
important algorithms are obtained. Out of these,
we will describe the Jacobi and Gauss-Seidel
relaxation techniques, the method of steepest
descent, and Krylov subspace methods such
as GMRES and conjugate gradients.
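To make the iterative idea concrete, here is a minimal sketch of the Jacobi relaxation method, the simplest of the techniques listed above. Each sweep updates every component of the iterate from the previous one; the system matrix, right-hand side, and iteration count below are illustrative choices, not taken from the text.

```python
def jacobi(A, b, iterations=50):
    """Approximately solve A x = b by Jacobi relaxation.

    Each sweep computes, from the previous iterate x,
        x_new[i] = (b[i] - sum_{j != i} A[i][j] * x[j]) / A[i][i].
    Convergence is guaranteed when A is strictly diagonally dominant.
    """
    n = len(b)
    x = [0.0] * n                      # initial guess: zero vector
    for _ in range(iterations):
        x_new = [0.0] * n
        for i in range(n):
            # off-diagonal contribution using the *old* iterate
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x_new[i] = (b[i] - s) / A[i][i]
        x = x_new
    return x

# Example: a small diagonally dominant system with exact solution (1, 1, 1)
A = [[4.0, 1.0, 0.0],
     [1.0, 4.0, 1.0],
     [0.0, 1.0, 4.0]]
b = [5.0, 6.0, 5.0]
x = jacobi(A, b)
```

Gauss-Seidel differs only in that it uses the already-updated components of `x_new` within the same sweep, which typically speeds convergence.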