rif
Registered Member

Given a matrix A and a vector b, is this the fastest way to do an SVD-based linear solve for Ax = b?

Eigen::JacobiSVD<MatrixXd, Eigen::HouseholderQRPreconditioner> svd(A, Eigen::ComputeThinU | Eigen::ComputeThinV);
MatrixXd x = svd.solve(b);

It is so slow I cannot imagine I am doing this correctly. In particular, compared to another implementation we have in an internal library that wraps ATLAS, I am seeing much worse scaling with size, and by the time the matrix gets to 800 rows and 400 columns, the eigen3 version is over 20x slower, and it only gets worse from there. Yes, I do have optimizations turned on, and in general I've been pleased with the speed of eigen3's vector operations. Any thoughts or advice on this are appreciated. 
rif
Registered Member

I investigated further, and my current opinion is that it seems incredibly slow because it is incredibly slow. Looking at my copy of Demmel, he notes that Jacobi algorithms are very accurate, but are the slowest methods available for eigenvalue and SVD computations.
I instrumented the JacobiSVD.h file and, as expected, essentially all the time is spent in the Jacobi iterations, almost none in the preconditioner or anything else. I benchmarked a least-squares solve [which is what I need to do right now] using QR instead of SVD: on a random 2k by 1k matrix, the SVD approach took 167 times longer than the QR. For least squares this is fine, since the QR gives me plenty of accuracy for my purposes, but the SVD is often very useful in numerical linear algebra. I think Eigen really needs a faster SVD, or at the very least some big warnings around it in the documentation. 
ggael
Moderator

Yes, we are aware of that. In the past we had a port of JAMA's implementation and also the one from Numerical Recipes, but both were really too bad (in terms of accuracy) for our tastes, so we dropped them.

rif
Registered Member

Ah.
Out of curiosity, are there benchmarks or other records of this around? How accurate does it have to be? Would a good implementation of a modern fast SVD algorithm be accepted? In my opinion, not having a usable SVD is a major red flag. 
jitseniesen
Registered Member

The old SVD implementation that we used to have was removed because of issues with the quality of the code. I think the underlying algorithm was acceptable, even if it is not quite as accurate as two-sided Jacobi. With many algorithms, and especially with the SVD, there is quite some distance between the algorithm as written down in, for instance, a textbook and a good implementation.

rif
Registered Member

I'm wondering if it makes sense to at least update http://eigen.tuxfamily.org/dox/Tutorial ... astsquares, which starts by saying that SVD is the best way to solve least squares and then gives an example. Given that the QR is over 100x faster at moderate sizes, I would suggest this section could use a rewrite.
Are there any other surprises people know of in eigen3? Anything else besides the SVD that turns out to be very slow? I haven't tried it exhaustively, but this was a pretty big [and unpleasant] surprise. 
ggael
Moderator

You are right, the tutorial should warn about that. Note that this page starts with "don't miss our special page on this topic", pointing to the following table:
http://eigen.tuxfamily.org/dox/TopicLin ... tions.html, which gives a global overview of Eigen's decompositions. 
zoharl
Registered Member

I wonder if there is any news on this front. I am using JacobiSVD to compute the pseudo-inverse of a 1640x1640 matrix, and it takes 8 minutes.

ggael
Moderator

Yes, that's rather slow! There is a start in the devel branch, in unsupported/Eigen/SVD (a replacement for Eigen/SVD), with a BDCSVD that provides the same API and currently twice the performance. It still relies on JacobiSVD, which is why it is only twice as fast.

zoharl
Registered Member

Okay, so I guess I'll stick for now with JacobiSVD.
