Suppose I have a symmetric $N \times N$ matrix $A$ with a one-dimensional nullspace $\mathcal{N}$, and $A$ is positive definite on $\mathcal{N}^\perp$. In my case $\mathcal{N}$ is the space of constant vectors (i.e. the span of the all-ones vector).
I have to solve the problem $Ax = b$ with $b \in \mathcal{R}(A)$, which has infinitely many solutions, and I am looking for the minimum-norm solution. The matrix $A$ is very large and sparse, so direct methods aren't really an option. The rank-deficient least squares algorithms I have seen also appear to be prohibitive.
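To make the setup concrete, here is a small toy instance (the names and the graph-Laplacian construction are just for illustration, not my actual matrix): a graph Laplacian is symmetric positive semi-definite with nullspace spanned by the all-ones vector, and the dense pseudoinverse gives the minimum-norm solution I am after as a reference.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
W = rng.random((n, n))
W = (W + W.T) / 2                # symmetric nonnegative weights
np.fill_diagonal(W, 0.0)
A = np.diag(W.sum(axis=1)) - W   # graph Laplacian: A @ ones == 0

# Pick b in R(A) by removing its component along the all-ones vector.
b = rng.standard_normal(n)
b -= b.mean()

# Reference minimum-norm solution (dense pseudoinverse; only feasible
# at this toy size, not for the large sparse case).
x_min = np.linalg.pinv(A) @ b
print(np.linalg.norm(A @ x_min - b), x_min.mean())  # residual ~0, mean ~0
```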
I was solving a non-singular version of this problem with Conjugate Gradient. Is there any way I can modify the algorithm to solve this particular problem?
EDIT: Boiling the problem down to its essence, the question is: if $A$ is positive semi-definite with exactly one zero eigenvalue, does CG work?
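To be concrete about the kind of modification I have in mind, here is a rough sketch (assuming SciPy; `solve_singular` and the projection step are my own guess at how to keep everything orthogonal to the nullspace, not something I have verified):

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def solve_singular(A, b):
    """A: sparse symmetric PSD matrix with nullspace span{ones}; b in R(A)."""
    n = b.shape[0]

    def project(v):
        # Remove the component along the all-ones vector.
        return v - v.mean()

    # Operator that applies A and then projects back onto R(A); in exact
    # arithmetic the projection is redundant, it only guards against drift.
    A_op = LinearOperator((n, n), matvec=lambda v: project(A @ v), dtype=float)
    b_proj = project(b)   # b should already satisfy this if b is in R(A)

    x, info = cg(A_op, b_proj)
    # Projecting x selects the zero-mean representative, i.e. the
    # minimum-norm solution among x + c * ones.
    return project(x), info
```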
Thanks.