
JacobiSVD U matrix not same as LAPACK/MATLAB

dtran11
Registered Member
Posts: 2 | Karma: 0
I am trying to use JacobiSVD to compute the U matrix. When I compare the results with LAPACK's dgesdd_ routine, the singular values are the same but the U matrices are different. Some entries differ only in sign, while others look completely different. Is there a more compatible SVD class I can use that will give me the same results as LAPACK or MATLAB?
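For reference, a minimal sketch of how the U matrix can be obtained from Eigen's JacobiSVD (the random test matrix here is only a placeholder, not the actual data):

#include <Eigen/Dense>
#include <iostream>

int main() {
    // Placeholder data; in practice this would be the same matrix
    // that is passed to LAPACK's dgesdd_.
    Eigen::MatrixXd A = Eigen::MatrixXd::Random(4, 3);

    // By default JacobiSVD only computes the singular values, so the
    // U and V factors have to be requested explicitly.
    Eigen::JacobiSVD<Eigen::MatrixXd> svd(A, Eigen::ComputeFullU | Eigen::ComputeFullV);

    std::cout << "singular values:\n" << svd.singularValues() << "\n";
    std::cout << "U:\n" << svd.matrixU() << "\n";
    std::cout << "V:\n" << svd.matrixV() << "\n";
}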

Thanks.
ggael
Moderator
Posts: 3447 | Karma: 19
The singular vectors are not unique, so there is indeed little chance you get exactly the same ones (unless you use exactly the same algorithm). Why would you want the same results as LAPACK?
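To illustrate the non-uniqueness: flipping the sign of any matched pair of columns in U and V gives another equally valid SVD of the same matrix, which is exactly the kind of difference you see between libraries. A small sketch:

#include <Eigen/Dense>
#include <iostream>

int main() {
    Eigen::MatrixXd A = Eigen::MatrixXd::Random(4, 4);

    Eigen::JacobiSVD<Eigen::MatrixXd> svd(A, Eigen::ComputeFullU | Eigen::ComputeFullV);
    Eigen::MatrixXd U = svd.matrixU();
    Eigen::MatrixXd V = svd.matrixV();
    Eigen::VectorXd S = svd.singularValues();

    // Flip the sign of the first left and right singular vectors together:
    // the product U * S * V^T is unchanged.
    U.col(0) *= -1.0;
    V.col(0) *= -1.0;

    std::cout << (U * S.asDiagonal() * V.transpose() - A).norm() << "\n"; // ~1e-15
}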
dtran11
Registered Member
Posts: 2 | Karma: 0
Thanks for your fast response. Currently our software runs on a PC (Windows) and on an embedded Linux system. The PC version uses LAPACK. It would be nice for the two platforms to be consistent. LAPACK's complicated build makes it undesirable on the embedded target, so we decided to try out Eigen. Maybe it is time to replace LAPACK in the PC version as well. Do you know how Eigen's SVD compares with LAPACK's in terms of performance?

Thanks.
ggael
Moderator
Posts: 3447 | Karma: 19
That depends on the size of your problems. If you are dealing with small problems, then I'd guess that Eigen's JacobiSVD is faster; otherwise the LAPACK version should be faster because it uses a Householder-based algorithm. On the other hand, the Jacobi algorithm is more accurate. It also depends on whether you are using the reference BLAS/LAPACK (slow) or an optimized implementation.
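For a specific workload, the simplest way to decide is to time JacobiSVD on representative matrix sizes and compare against the existing LAPACK path. A rough sketch (the numbers will depend heavily on compiler flags, hardware, and which BLAS/LAPACK the other side uses):

#include <Eigen/Dense>
#include <chrono>
#include <iostream>

// Time one full JacobiSVD (singular values plus full U and V) of an n-by-n random matrix.
double time_jacobi_svd(int n) {
    Eigen::MatrixXd A = Eigen::MatrixXd::Random(n, n);
    auto t0 = std::chrono::steady_clock::now();
    Eigen::JacobiSVD<Eigen::MatrixXd> svd(A, Eigen::ComputeFullU | Eigen::ComputeFullV);
    // Touch the result so the decomposition cannot be optimized away.
    volatile double sink = svd.singularValues().sum();
    (void)sink;
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double>(t1 - t0).count();
}

int main() {
    for (int n : {10, 50, 200, 500})
        std::cout << "n = " << n << ": " << time_jacobi_svd(n) << " s\n";
}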
fastmat
Registered Member
Posts: 1 | Karma: 0
The SVD decomposition result is not the same as MATLAB's. For example:
A =
1 3 2
3 1 2
5 2 2

1) With MATLAB, the result is:
U =
-0.4061 0.9129 0.0415
-0.4971 -0.1826 -0.8483
-0.7668 -0.3651 0.5280

S =
7.38648 0 0
0 2.44949 0
0 0 0.6632

V =
-0.7759 -0.5963 0.2059
-0.4399 0.7454 0.5010
-0.4522 0.2981 -0.8406

2) But with Eigen, the result is:
U =
0.40613 0.912871 -0.0415321
0.497123 -0.182574 0.848255
0.766764 -0.365148 -0.527958

S =
7.3865 0 0
0 2.4495 0
0 0 0.663236

V =
0.77592 -0.596285 -0.205895
0.439863 0.745356 -0.500964
0.452182 0.298142 0.84062

The difference is serious!

Thanks.
ggael
Moderator
Posts: 3447 | Karma: 19
Not that different (the opposite signs are irrelevant). You can check which one is more accurate by reconstructing the matrix:

(U * S.asDiagonal() * V.transpose() - A).norm()
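
A self-contained version of that check on the 3x3 example above could look like this (both the MATLAB and the Eigen factors should give a norm near machine precision):

#include <Eigen/Dense>
#include <iostream>

int main() {
    Eigen::MatrixXd A(3, 3);
    A << 1, 3, 2,
         3, 1, 2,
         5, 2, 2;

    Eigen::JacobiSVD<Eigen::MatrixXd> svd(A, Eigen::ComputeFullU | Eigen::ComputeFullV);
    Eigen::MatrixXd U = svd.matrixU();
    Eigen::MatrixXd V = svd.matrixV();
    Eigen::VectorXd S = svd.singularValues();

    // Reconstruction error: a value near machine epsilon means the
    // decomposition is accurate, regardless of the sign convention.
    std::cout << (U * S.asDiagonal() * V.transpose() - A).norm() << "\n";
}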

