This forum has been archived. All content is frozen. Please use KDE Discuss instead.

Different results from new and old versions of Eigen

jhhung (Registered Member):
Hi,
I've found that the result generated by "linearRegression" in the latest version of Eigen is probably not correct, unlike the one from the old version.

It may be the result of my own faulty usage, but I think it is worthwhile for you to check it out.

Using the example data given in the tutorial:

Vector3d points[5];
points[0] = Vector3d( 3.02, 6.89, -4.32 );
points[1] = Vector3d( 2.01, 5.39, -3.79 );
points[2] = Vector3d( 2.41, 6.01, -4.01 );
points[3] = Vector3d( 2.09, 5.55, -3.86 );
points[4] = Vector3d( 2.58, 6.32, -4.10 );

The correct result should be the one shown on the webpage: a = 0.495, b = -1.927, and c = -2.906 (checked with the R package).

However, the same function in the latest version returns a = -1.364, b = -5.55334, and c = -12.9623, whereas the older version still gives the right answer.

I tried several data sets and compared the results from the R package, Eigen 1.x, and Eigen 2.x: R and Eigen 1.x always agree with each other, but Eigen 2.x does not.
ggael (Moderator):
Indeed, the difference is that in Eigen 2, linearRegression calls fitHyperplane, which performs a total least-squares (TLS) fit instead of an ordinary least-squares (LS) fit. With a total LS fit, the returned plane equation is closer to the input points in the sense of the orthogonal Euclidean distance.

Actually, I think the linearRegression function is not really useful and should be removed. Indeed:

- either you have a functional dataset f(x_i) = f_i (with x_i in R^(n-1)) and you want to perform an ordinary LS fit, in which case you can directly call any linear solver (LU, LLT, QR, SVD);

- or you have an arbitrary dataset {x_i}, x_i in R^n, and there is no reason to favor one direction over another, so you want a total LS fit: you call fitHyperplane and work directly with the hyperplane equation. For example, you can interpret the hyperplane equation as a rotation plus a translation, apply that transformation to the input data, and then approximate your dataset as a linear functional dataset with the trivial plane equation a = 0, b = 0, c = 0.

Out of curiosity, could you give us the corresponding R code? I'd like to see the API of the R linear regression function...
jhhung (Registered Member):
Thanks for the reply; I will try a linear solver instead.
The R code is simply lm(formula = y ~ x + z + w).
bjacob (Registered Member):
ggael wrote: Actually, I think the linearRegression function is not really useful and should be removed. Indeed:

Good points, I agree...


Join us on Eigen's IRC channel: #eigen on irc.freenode.net
Have a serious interest in Eigen? Then join the mailing list!

