Registered Member
|
Hi,
Can somebody advise me on the recommended way to invert a (positive-definite) symmetric matrix? The obvious way would be to use SelfAdjointEigenSolver and compute the inverse as V D^-1 V^T. However, the docs don't seem to state that the computed eigenvector matrix V is orthogonal, so it seems possible that this would be incorrect if there are repeated eigenvalues. Can anyone advise? (The docs for operatorInverseSqrt make me think it must actually be OK.) (To be clear, I want the inverse so I can print it out, not to use it to solve anything.) Many thanks for any help with this! Gavin. |
Registered Member
|
Even if you do not want to solve a linear system with your matrix,
you can still use any decomposition suitable for it: http://eigen.tuxfamily.org/dox-devel/gr ... gebra.html. Use the decomposition to solve A * Ai = I, where Ai is the inverse and I is the identity matrix. If your matrix is SPD, use the Cholesky decomposition.
|
Registered Member
|
Hi,
That makes sense, thanks. (I have been using LDLT rather than LLT because the information at [1] seemed to say it was more robust. Is there a reason to prefer LLT?) Best wishes, Gavin. [1] http://eigen.tuxfamily.org/dox/TopicLin ... tions.html |
Registered Member
|
LLT is faster than LDLT, but it requires the matrix to be positive definite; otherwise it may fail to produce the decomposition.
If you're not sure your matrix has that property, consider using LDLT (if the matrix is at least positive semi-definite) or PartialPivLU (more robust). |