Registered Member
I have a positive definite matrix A for which I have already computed the Cholesky decomposition A = LDL^T. For some vector x, I would like to compute S^{-1}x, where S is a square root of A. For now, I do

Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> es(A);
Eigen::MatrixXd Si(es.operatorInverseSqrt());
return Si * get_x();

Is this a stable way to do this computation? (I thought computing inverses was a bad thing in general.) Also, is there a way to reuse the already performed LDLT decomposition? I feel it should be possible, because that is what actually happens behind the scenes in LDLT::solve()!
Registered Member
I do not understand your last sentence. LDLT::solve() uses triangular solves (aka forward and back substitution). How does this help in computing the inverse square root?
Regarding stability, I believe that eigenvalue decompositions of symmetric matrices are well conditioned, so I do not expect any problems there. If your matrix is close to singular (i.e., its eigenvalues vary over many orders of magnitude), there may be an issue, but I do not think you can do much better than computing its eigenvalue decomposition.
Registered Member
Well, since LDL^T can be put in the form LL^T, I thought L would be a square root, but I might be mistaken.
Registered Member
I see. The L factor in the Cholesky decomposition A = LL^T is indeed sometimes called a square root; I had forgotten about that. However, the square root underlying operatorInverseSqrt() is a symmetric matrix X such that A = X^2. These are different. You should be able to use the LDLT decomposition to compute L^(-1), but not X^(-1).
Registered Member
Okay. So this leads me to two points:
- L and X have the same squared eigenvalues: L and L^T have the same eigenvalues, and LL^T = A = XX, qed.
- I only consider the principal square root, i.e. the unique square root of A which has positive eigenvalues.

Is that what is calculated by operatorInverseSqrt()? Can we still use the LDLT decomposition to do the calculation above? If so, how?
Registered Member
I think there may be something possible. If you compute the (left) polar decomposition of L, meaning that you write it as L = PU with P symmetric positive definite and U orthogonal, then P is precisely the square root of A: P^2 = P U U^T P^T = L L^T = A. Actually, I just had a look in Nick Higham's book "Functions of Matrices", and this is the method he recommends for computing the square root of symmetric positive definite matrices.

Unfortunately, Eigen does not have an implementation for computing the polar decomposition. It can be obtained from the SVD, but computing the SVD of a general matrix is more expensive than computing the eigenvalue decomposition of A. So this method will involve a fair bit of work.

operatorInverseSqrt() does indeed compute the principal square root. If A = V D V^(-1) is the eigendecomposition of A, then the diagonal entries of D are positive, so you can compute X = V D^(-1/2) V^(-1), which is the inverse square root of A. This is what operatorInverseSqrt() does.

I don't think that the Cholesky factor L and the square root X have the same eigenvalues. While L and L^T do have the same eigenvalues, the eigenvalues of L L^T are not simply the squared eigenvalues of L.
Registered Member
Thanks a lot for your answer! Maybe the polar decomposition would be a good feature request for Eigen...