
Extracting Q matrix from SparseQR

alecjacobson
Extracting Q matrix from SparseQR

Sat Sep 14, 2013 2:34 pm
I'm trying to use the SparseQR class, but I'm having trouble extracting the Q matrix as a sparse matrix (as per the documentation).

Here's my small reproducible example:


Code: Select all
#include <Eigen/Sparse>
#include <cassert>
#include <vector>
int main(int argc,char * argv[])
{
  using namespace Eigen;
  using namespace std;
  vector<Triplet<double> > ijv;
  ijv.push_back(Triplet<double>(1,0,1));
  ijv.push_back(Triplet<double>(375,0,1));
  SparseMatrix<double> A(377,1);
  A.setFromTriplets(ijv.begin(),ijv.end());
  // Perform QR
  SparseQR<SparseMatrix<double>, COLAMDOrdering<int> > qr(A);
  assert(qr.info() == Success);
  // Extract Q matrix
  SparseMatrix<double> Q;
  Q = qr.matrixQ();
  return 0;
}


But upon running this I get the following error from the Q = qr.matrixQ() line:
Code: Select all
Assertion failed: (m_qr.m_Q.cols() == m_other.rows() && "Non conforming object sizes"), function evalTo, file /opt/local//include/eigen3/Eigen/src/SparseQR/SparseQR.h, line 581.


Am I calling this incorrectly? I would like to extract Q as an A.rows() by A.rows() sparse matrix. Is there another way to do it?
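For reference, one workaround sketch that goes through a dense intermediate, so it is only viable for small problems; it assumes matrixQ() supports products with dense matrices and that the size assertion above does not also fire on this path (not verified):

Code: Select all
// Hypothetical workaround: apply Q to a dense identity, then sparsify.
// Needs #include <Eigen/Dense> for MatrixXd; memory-heavy (dense m x m
// intermediate), so only reasonable for small A.rows().
MatrixXd I = MatrixXd::Identity(A.rows(), A.rows());
MatrixXd Qdense = qr.matrixQ() * I;                 // applies the Householder reflectors
SparseMatrix<double> Qsparse = Qdense.sparseView(); // drop the explicit zeros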
alecjacobson
It seems like it could be an issue with A being rectangular: replacing SparseMatrix<double> A(377,1); with SparseMatrix<double> A(377,377); makes the code above work.
alecjacobson
If I ignore the assertion (compile with -DNDEBUG), then it seems to work, in the sense that Q * R = A. But Q seems to be effectively dense! The first rank() rows of Q are numerically sparse, but the SparseMatrix seems to actually be storing all of the zeros.

So for the example above, the following:

Code: Select all
cout<<Q.nonZeros()<<endl;
Q.prune(0.0);
cout<<Q.nonZeros()<<endl;


Prints:

Code: Select all
1129
377
ggael
alecjacobson

Re: Extracting Q matrix from SparseQR

Sun Sep 22, 2013 12:40 pm
Thanks. This helps. However, it seems that extracting matrixQ still results in an effectively dense operation. I tracked it down to SparseQR_QProduct::evalTo in SparseQR/SparseQR.h, around line 563. There are two nested for loops with a dot product at the innermost level. This dot product (though sparse) seems effectively dense, making evalTo O(n^3), where n is the number of columns in Q, rather than what I had hoped would be more like O(nnz), where nnz is the number of non-zeros in Q.

Here's a concrete example (forgive the inline matrix entries, but it gives a real-world example):
Code: Select all
#include <Eigen/Sparse>
#include <cassert>
#include <vector>

// Matrix entries
const double IJV[508*3] = {2,1,1,374,1,1,2,2,1,374,2,1,3,3,1,373,3,1,2,4,-1,3,4,1,3,5,-1,87,5,1,5,6,-1,82,6,1,6,7,-1,105,7,1,7,8,-1,111,8,1,8,9,-1,117,9,1,9,10,-1,125,10,1,10,11,-1,204,11,1,11,12,-1,300,12,1,12,13,-1,345,13,1,13,14,-1,212,14,1,15,15,-1,252,15,1,17,16,-1,251,16,1,19,17,-1,330,17,1,21,18,-1,237,18,1,23,19,-1,333,19,1,24,20,-1,334,20,1,25,21,-1,239,21,1,27,22,-1,326,22,1,28,23,-1,286,23,1,29,24,-1,278,24,1,29,25,-1,280,25,1,30,26,-1,273,26,1,33,27,-1,131,27,1,35,28,-1,316,28,1,36,29,-1,360,29,1,39,30,-1,356,30,1,43,31,-1,315,31,1,44,32,-1,312,32,1,44,33,-1,314,33,1,45,34,-1,160,34,1,45,35,-1,208,35,1,46,36,-1,157,36,1,46,37,-1,158,37,1,47,38,-1,94,38,1,50,39,-1,151,39,1,64,40,-1,127,40,1,66,41,-1,108,41,1,66,42,-1,112,42,1,68,43,-1,69,43,1,68,44,-1,95,44,1,69,45,-1,81,45,1,71,46,-1,203,46,1,64,47,1,74,47,-1,74,48,-1,126,48,1,75,49,-1,86,49,1,77,50,-1,130,50,1,80,51,-1,85,51,1,70,52,1,81,52,-1,6,53,1,82,53,-1,5,54,1,83,54,-1,84,55,-1,96,55,1,5,56,1,85,56,-1,86,57,-1,96,57,1,4,58,1,87,58,-1,77,59,1,90,59,-1,90,60,-1,130,60,1,77,61,1,93,61,-1,93,62,-1,100,62,1,94,63,-1,99,63,1,84,64,1,95,64,-1,95,65,-1,96,65,1,95,66,-1,104,66,1,6,67,1,96,67,-1,87,68,1,97,68,-1,97,69,-1,98,69,1,83,70,1,98,70,-1,33,71,1,99,71,-1,100,72,-1,150,72,1,101,73,-1,145,73,1,101,74,1,102,74,-1,102,75,-1,104,75,1,104,76,-1,105,76,1,7,77,1,105,77,-1,107,78,-1,109,78,1,67,79,1,108,79,-1,108,80,-1,110,80,1,109,81,-1,145,81,1,110,82,-1,147,82,1,8,83,1,111,83,-1,112,84,-1,113,84,1,112,85,-1,147,85,1,8,86,1,113,86,-1,113,87,1,114,87,-1,114,88,-1,115,88,1,115,89,-1,117,89,1,9,90,1,117,90,-1,9,91,1,122,91,-1,10,92,1,125,92,-1,121,93,1,126,93,-1,124,94,1,126,94,-1,65,95,1,127,95,-1,121,96,1,127,96,-1,10,97,1,128,97,-1,129,98,-1,231,98,1,90,99,1,132,99,-1,132,100,1,133,100,-1,133,101,1,134,101,-1,141,102,-1,189,102,1,145,103,-1,146,103,1,111,104,1,146,104,-1,113,105,1,147,105,-1,148,106,-1,249,106,1,148,107,-1,250,107,1,124,108,1,149,108,-1,128,109,1,149,109,-1,131,110,1,150,110,-1,49,111,1,151,111,-1,155,112,-1,187,112,1,155,113,-1,307,113,1,158,114,-1,159,114,1,158,115,-1,207,115,1,159,116,-1,160,116,1,160,117,-1,309,117,1,160,118,-1,310,118,1,15,119,1,161,119,-1,30,120,1,185,120,-1,186,121,-1,187,121,1,186,122,-1,275,122,1,187,123,-1,274,123,1,186,124,1,188,124,-1,189,125,-1,206,125,1,197,126,-1,205,126,1,198,127,-1,305,127,1,199,128,-1,203,128,1,202,129,-1,296,129,1,203,130,-1,305,130,1,11,131,1,204,131,-1,205,132,-1,306,132,1,202,133,1,206,133,-1,33,134,1,207,134,-1,209,135,-1,309,135,1,129,136,1,210,136,-1,213,137,-1,318,137,1,214,138,-1,221,138,1,215,139,-1,217,139,1,216,140,-1,218,140,1,217,141,-1,223,141,1,214,142,1,218,142,-1,220,143,-1,227,143,1,221,144,-1,223,144,1,220,145,1,222,145,-1,223,146,-1,229,146,1,222,147,1,224,147,-1,25,148,1,225,148,-1,225,149,-1,228,149,1,226,150,-1,232,150,1,227,151,-1,229,151,1,226,152,1,228,152,-1,228,153,-1,238,153,1,229,154,-1,230,155,-1,234,156,-1,233,157,1,238,157,-1,24,158,1,239,158,-1,234,159,1,242,159,-1,244,160,-1,247,160,1,244,161,-1,248,161,1,245,162,-1,334,162,1,21,163,1,246,163,-1,21,164,1,247,164,-1,163,165,1,250,165,-1,250,166,-1,253,166,1,18,167,1,251,167,-1,254,168,-1,256,168,1,254,169,-1,336,169,1,252,170,1,255,170,-1,255,171,-1,256,171,1,252,172,1,256,172,-1,29,173,1,273,173,-1,274,174,-1,279,174,1,274,175,1,275,175,-1,276,176,-1,284,176,1,277,177,-1,283,177,1,277,178,-1,293,178,1,28,179,1,278,179,-1,276,180,1,279,180,-1,280,181,-1,281,181,1,281,182,-1,285,182,1,276,183,1,283,183,-1,282,184,1,284,184,-1,285,185,-1,286,185,1,27,186,1,286,186,-1,288,187,-1,291,187,1,289,188,-1,322,188,1,287,189,1,290,189,-1,288,190,1,290,190,-1,289,191,1,291,191,-1,213,192,1,292,192,-1,292,193,-1,321,193,1,276,194,1,293,194,-1,295,195,-1,297,195,1,283,196,1,296,196,-1,297,197,-1,298,197,1,298,198,-1,302,198,1,300,199,-1,301,199,1,301,200,-1,303,200,1,287,201,1,302,201,-1,292,202,1,303,202,-1,303,203,-1,346,203,1,204,204,1,305,204,-1,295,205,1,306,205,-1,277,206,1,307,206,-1,207,207,1,308,207,-1,309,208,-1,310,208,1,144,209,1,310,209,-1,209,210,1,311,210,-1,309,211,1,311,211,-1,43,212,1,312,212,-1,313,213,-1,361,213,1,314,214,-1,317,214,1,42,215,1,315,215,-1,315,216,-1,374,216,1,34,217,1,316,217,-1,316,218,1,317,218,-1,211,219,1,318,219,-1,319,220,-1,320,220,1,215,221,1,320,221,-1,319,222,1,321,222,-1,216,223,1,322,223,-1,322,224,1,323,224,-1,324,225,-1,325,225,1,216,226,1,325,226,-1,219,227,1,325,227,-1,26,228,1,326,228,-1,224,229,1,327,229,-1,231,230,1,328,230,-1,331,231,-1,234,232,1,332,232,-1,22,233,1,333,233,-1,55,234,1,340,234,-1,13,235,1,345,235,-1,13,236,1,347,236,-1,348,237,-1,361,237,1,340,238,1,350,239,1,354,240,1,355,240,-1,38,241,1,356,241,-1,48,242,1,357,242,-1,90,243,1,357,243,-1,359,244,-1,369,244,1,35,245,1,360,245,-1,356,246,1,363,246,-1,363,247,-1,372,247,1,40,248,1,365,248,-1,364,249,1,365,249,-1,365,250,-1,367,250,1,37,251,1,366,251,-1,367,252,-1,372,252,1,364,253,1,368,253,-1,369,254,-1,370,254,1,366,255,1,372,255,-1,37,256,1,373,256,-1,41,257,1,375,257,-1,};

int main(int argc,char * argv[])
{
  using namespace Eigen;
  using namespace std;
  vector<Triplet<double> > ijv;
  for(int c=0;c<508*3;c+=3)
  {
    ijv.push_back(Triplet<double>(IJV[c]-1,IJV[c+1]-1,IJV[c+2]));
  }
  SparseMatrix<double> A(375,257);
  A.setFromTriplets(ijv.begin(),ijv.end());
  // Perform QR
  SparseQR<SparseMatrix<double>, COLAMDOrdering<int> > qr(A);
  assert(qr.info() == Success);
  // Extract Q matrix
  SparseMatrix<double> Q;
  // Bottleneck seems to be SparseQR_QProduct::evalTo in SparseQR/SparseQR.h around line 563.
  Q = qr.matrixQ();
  return 0;
}
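In the meantime, applying Q lazily to vectors avoids materializing it at all, when that is an option; a minimal sketch, continuing the example above and assuming the dense-vector product overloads of matrixQ() behave as documented:

Code: Select all
// Apply Q and its adjoint on the fly instead of forming the sparse Q.
// Needs #include <Eigen/Dense> for VectorXd.
VectorXd b = VectorXd::Random(A.rows());
VectorXd Qb  = qr.matrixQ() * b;           // Q * b via the stored Householder reflectors
VectorXd Qtb = qr.matrixQ().adjoint() * b; // Q^T * b, same per-application cost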
ggael
You're right. I've updated the related bug entry so that we don't forget: http://eigen.tuxfamily.org/bz/show_bug.cgi?id=596, but I cannot say when this will be implemented.
alecjacobson
Any update on this sparse Q extraction problem? Still itching for a solution. It'd be nice not to rely on external libraries for sparse QR. Thanks.
Radek D.
Hi,

I also have a problem recovering sparse Q from the solver. In my case I can't even get the assignment to compile as the
SparseQR<SparseMatrix<double>, NaturalOrdering<int> >::QRMatrixType and SparseMatrix<double> are not compatible.

Code: Select all
SparseMatrix<double> Q = solver.matrixQ();

This line of code gives me the following error.
Code: Select all
error C2440: 'initializing': cannot convert from 'Eigen::SparseQRMatrixQReturnType<Eigen::SparseQR<Eigen::SparseMatrix<Scalar,0,int>,Eigen::NaturalOrdering<int>>>' to 'Eigen::SparseMatrix<Scalar,0,int>'
        with
        [
           Scalar=double
        ]
 main.cpp(288): note: No constructor could take the source type, or constructor overload resolution was ambiguous


I work on Windows and I use MSVC 14.0. Is there any workaround for this? I tried the following:

Code: Select all
SparseMatrix<double> I(solver.matrixQ().rows(), solver.matrixQ().rows());
I.setIdentity();
SparseMatrix<double> Q = I*solver.matrixQ(); // does not compile

This has a similar template incompatibility issue.

Code: Select all
SparseMatrix<double> O(solver.matrixQ().rows(), solver.matrixQ().cols());
solver.matrixQ().addTo(O);

This one compiles but takes forever and does not seem to finish (I tried letting it run overnight).

I really need to recover the Q matrix because I need to save it to a file for later use: my matrix is big (around 42000 rows and 20000 columns with a quite irregular sparsity pattern) and the factorization takes a long time.
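For completeness, a minimal sketch of a column-by-column workaround plus saving to disk. It sidesteps the assignment that fails to compile by only ever multiplying matrixQ() with dense vectors, but it still applies Q once per column, so it may be slow at 42000 rows. extractAndSaveQ and the 1e-14 threshold are made-up names/values, saveMarket comes from the unsupported SparseExtra module, and none of this has been tried with MSVC 14.0:

Code: Select all
#include <Eigen/Sparse>
#include <Eigen/Dense>
#include <unsupported/Eigen/SparseExtra> // saveMarket / loadMarket (MatrixMarket files)
#include <vector>
#include <string>
#include <cmath>

// Hypothetical helper: recover Q one column at a time by applying matrixQ()
// to dense unit vectors, keep only entries above a threshold, save to disk.
// 'solver' is an already-computed SparseQR, m is the row count of its input.
void extractAndSaveQ(
  const Eigen::SparseQR<Eigen::SparseMatrix<double>, Eigen::NaturalOrdering<int> >& solver,
  int m, const std::string& path)
{
  using namespace Eigen;
  std::vector<Triplet<double> > ijv;
  VectorXd e = VectorXd::Zero(m);
  for(int k = 0; k < m; ++k)
  {
    e(k) = 1.0;
    VectorXd qk = solver.matrixQ() * e; // k-th column of Q
    e(k) = 0.0;
    for(int i = 0; i < m; ++i)
    {
      if(std::abs(qk(i)) > 1e-14)       // drop (near-)zeros to keep Q sparse
        ijv.push_back(Triplet<double>(i, k, qk(i)));
    }
  }
  SparseMatrix<double> Q(m, m);
  Q.setFromTriplets(ijv.begin(), ijv.end());
  saveMarket(Q, path);                  // reload later with loadMarket
}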

