
Weighted Least Squares

waterTheOwl
Registered Member

Weighted Least Squares

Thu Apr 11, 2013 3:23 am
Hello,

For a project I'm working on, I need to solve many weighted least-squares problems. My first try was to implement the direct, straightforward calculation:

A.X = Y, with the weights contained in a diagonal matrix W

The weighted least-squares solution is X = inv(A'*W*A)*A'*W*Y (in Matlab notation: A' <=> transpose of A, inv(X) <=> inverse of X).

I read that this isn't a robust way to do it because of rounding errors, so I was wondering if you have any idea of a better way to do this calculation?

Here is the code I'm using :

Code:
//I'm fitting a surface z=a.x^2+b.y^2+c.x.y+d.x+e.y+f into 3D points stored in pointsTofit. The output is X=[a;b;c;d;e;f]
void Mesh_toolbox::fitSurfaceWithLS(Eigen::Matrix3Xd &pointsToFit, Eigen::VectorXd &X, std::vector<double> &weights)
{
     Eigen::MatrixXd A(pointsToFit.cols(),6);     //the equation to solve : A.x=Y
     Eigen::VectorXd Y=Eigen::VectorXd::Zero(pointsToFit.cols());
     Eigen::MatrixXd W=Eigen::MatrixXd::Zero(pointsToFit.cols(),pointsToFit.cols());

     for(int k=0;k<weights.size();++k)     
          W(k,k)=weights[k];

     for(int i=0;i<pointsToFit.cols();++i)
     {
          double x,y,z;
          x=pointsToFit(0,i);  y=pointsToFit(1,i);  z=pointsToFit(2,i);

          Y(i)=z;

          A(i,0)=x*x;
          A(i,1)=y*y;
          A(i,2)=x*y;
          A(i,3)=x;
          A(i,4)=y;
          A(i,5)=1;
     }

     X=(A.transpose()*W*A).inverse()*A.transpose()*W*Y;
}



I'd really appreciate your opinion on this.
ggael
Moderator

Re: Weighted Least Squares

Thu Apr 11, 2013 6:20 am
You can form the problem:

W * A * x = W * b

and then do a least-squares solve using either a QR or an SVD, e.g.:

x = (W*A).householderQr().solve(W*b);
waterTheOwl
Registered Member

Re: Weighted Least Squares

Fri Apr 12, 2013 1:20 am
Thanks! It's working perfectly.
Anders Sjögren
Registered Member

Re: Weighted Least Squares

Fri Apr 26, 2013 8:55 am
If I'm not mistaken,
x_hat = inv(A'*W*A)*A'*W*b
is the solution to the weighted LS problem
x_hat = argmin_x (A*x - b)'*W*(A*x - b).

However,
(A*x - b)'*W*(A*x - b) = (W^(1/2)*(A*x - b))'*(W^(1/2)*(A*x - b)) = ||W^(1/2)*(A*x - b)||_2^2 = ||W^(1/2)*A*x - W^(1/2)*b||_2^2

Thus the standard way of transforming the problem into an unweighted LS problem is to multiply A and b by W^(1/2) (i.e. the square root of W), not by W itself.
For more info see: http://en.wikipedia.org/wiki/Least_squa ... st_squares

Multiplying by W thus seems, in effect, to apply the square of the weights. Are you sure this is not what happens in the suggested solution?

Also, from an efficiency perspective, it seems wasteful to use a full dense matrix to represent the diagonal weight matrix. I haven't used it myself, but DiagonalMatrix seems promising: http://eigen.tuxfamily.org/dox/classEigen_1_1DiagonalMatrix.html
