Registered Member
Hi all,
In the algorithm described in https://kar.kent.ac.uk/2782/1/A_Kullback-Leibler.pdf, Section VI.C says that they decompose a covariance matrix P into a square unitary matrix U and a diagonal matrix D such that P = U D U^T holds, while I assume that they used a shorthand notation there.
They also reference how they do this: they cite Rao (C. R. Rao, Linear Statistical Inference and its Applications, 2nd ed., Wiley, 1973), and luckily they name the exact part they reference, Sec. 1c.3(ii). I have attached the referenced paragraph as an image. Unfortunately, I neither understand what exactly Rao is doing, nor what the "small variation" of Runnalls is. Rao seems to use the apostrophe for matrix transposition (this may be hard to read in the image). I assume that I should be able to do this with Eigen, as several other decompositions are already available there. Can anyone shed some light on this? Is Eigen capable of doing this transform?
Moderator
From a very quick look, this seems to be a generalized eigenvalue decomposition of a pair of symmetric matrices, with one of them even being positive definite. This is readily available in Eigen, see: http://eigen.tuxfamily.org/dox/classEig ... olver.html
Registered Member
Thank you for that quick answer!
As I understand the docs, Eigen produces a triangular matrix. However, the authors of the paper state in a footnote that "unitary" here only means determinant unity, and that the matrix is not triangular in general. So I don't see which decomposition in Eigen produces a non-triangular but unitary matrix U together with a diagonal matrix D of only positive elements. Do I need further computations on the triangular matrix to make it unitary? Also, the same unitary matrix U should later decompose a second covariance matrix P_2 with a different diagonal matrix D_2.

Best regards,
weaktransform
Moderator
No no, GeneralizedSelfAdjointEigenSolver generates a matrix of eigenvectors U which is unitary with respect to the metric B. That is, given two symmetric matrices A and B, it generates U and D such that:

A U = B U D, with U^T B U = I and U^T A U = D.

The triangular factor L is only used internally to cast this generalized problem to a standard one. This triangular factor corresponds to the matrix T in your book.
Registered Member
OK, using your equations as (1) and (2): by multiplying (1) with U^T from the left and then using (2), I obtain (3) U^T A U = D. Then I multiply (3) with U^T from the left and U from the right to get an expression for A, which I find quite confusing. I assume that I am doing something wrong here.

[EDIT] Additional information on what I want to achieve: reading more into the topic tells me that what I probably want is simultaneous diagonalization (Wikipedia entry), where the invertible matrix P should also be unitary in my case. I understand that covariance matrices are always normal matrices, and since they are positive definite (in my case) they have only positive eigenvalues. Positive definite matrices furthermore have an eigendecomposition P D P^-1 (Wikipedia here). This unitary matrix P (in Wikipedia's characterization no. 1) is in general complex. Runnalls doesn't mention in the abovementioned paper whether the matrix in question is real-valued or complex.
Moderator
Note that U^T U is not equal to the identity! It's U^T B U that is equal to I.
The result of GeneralizedSelfAdjointEigenSolver matches equation 3.9 with Lambda <-> D and R <-> U.
Registered Member
Thank you again for your answer! But this is exactly the point:
I don't see that the resulting matrix U from GeneralizedSelfAdjointEigenSolver is unitary. From the linked Runnalls PDF (above), the relevant part is excerpted here (with the footnote glued on). That is, I need a decomposition with a unitary matrix U (and therefore U^T U = I), while the D matrices are both different from I.
Moderator
That's an easy step: once you have the generalized eigenvalue decomposition of A and B, you can compute the SVD of the resulting eigenvector matrix U using, e.g., Eigen::JacobiSVD to get:

U = V * S * W^T

with both V and W unitary, and S diagonal. Then it is easy to see that:

B = V * inv(S)^2 * V^T
A = V * (inv(S)^2 * D) * V^T

If U is quite large (>100), then instead of JacobiSVD you can use BDCSVD from the devel branch.
Registered Member
Nice! Thank you very much!
I knew that this should be doable with Eigen. However, I didn't think of applying a second decomposition to the solution.