This forum has been archived. All content is frozen. Please use KDE Discuss instead.

How to calculate covariance matrix for a least squares

takeuchi (Registered Member):
Hello,
I am solving a linear least-squares problem, Ax = b, in my code as follows:
Code:
A.jacobiSvd(ComputeThinU | ComputeThinV).solve(b)

I need to calculate a covariance matrix, Inv(A.transpose() * A), for the solution, and I am trying several approaches:
Code:
1:  (A.transpose() * A).inverse();
2:  (A.transpose() * A).ldlt().solve(Matrix<long double,Dynamic,Dynamic>::Identity(N,N));  // N = (A.transpose()*A).rows()
3:  (A.transpose() * A).jacobiSvd(ComputeThinU | ComputeThinV).solve(Matrix<long double,Dynamic,Dynamic>::Identity(N,N));
4:  (A.transpose() * A).fullPivHouseholderQr().inverse();
5:  (A.transpose() * A).fullPivLu().solve(Matrix<long double,Dynamic,Dynamic>::Identity(N,N));

I get a solution only in cases 1 and 2; cases 3, 4, and 5 give erroneous results.
The tutorial page says that using inverse() (case 1) is not a good approach and that a decomposition should be used instead:
https://eigen.tuxfamily.org/dox-devel/group__TutorialLinearAlgebra.html
It also says that the decompositions in cases 3, 4, and 5 have the highest accuracy and work for all types of matrices, so I cannot understand why they fail in my case.
Could you please tell me why cases 3, 4, and 5 do not work, and whether there is a criterion for choosing a decomposition for my problem?

Thank you very much in advance!
ggael (Moderator):
Sorry for the late reply. In case this still bothers you, here is a self-contained example showing that all five methods produce the same result:

Code:
#include <iostream>
#include <Eigen/Dense>
using namespace Eigen;
using namespace std;

int main()
{
  MatrixXd A(10,5), I1, I2, I3, I4, I5;
  A.setRandom();
  MatrixXd C = A.transpose()*A;        // the normal matrix
  int N = C.rows();
  MatrixXd I = MatrixXd::Identity(N,N);

  I1 = C.inverse();
  I2 = C.ldlt().solve(I);
  I3 = C.jacobiSvd(ComputeThinU | ComputeThinV).solve(I);
  I4 = C.fullPivHouseholderQr().inverse();
  I5 = C.fullPivLu().solve(I);

  // residuals ||C*Ik - I|| / N should all be tiny
  cout << ((C*I1)-I).norm() / N << " "
       << ((C*I2)-I).norm() / N << " "
       << ((C*I3)-I).norm() / N << " "
       << ((C*I4)-I).norm() / N << " "
       << ((C*I5)-I).norm() / N << "\n";

  cout << I1 << "\n\n";
  cout << I2 << "\n\n";
  cout << I3 << "\n\n";
  cout << I4 << "\n\n";
  cout << I5 << "\n\n";
}


So to reproduce the problem we'll need your actual matrix entries.

