Registered Member
Hello,
For a project I'm doing I need to solve a lot of weighted least-squares problems. My first try was to implement the direct, straightforward calculation: for A*x = Y with the weights contained in a diagonal matrix W, the weighted least-squares solution is x = inv(A'*W*A)*A'*W*Y (in Matlab notation: A' is the transpose of A, inv(X) is the inverse of X). I read that this isn't a robust way to do it because of rounding errors, so I wondered if you had any idea of a better way to do this calculation. Here is the code I'm using:
I'd really appreciate your opinion on this.
Moderator
You can form the problem:

W * A * x = W * b

and then do a LS solve using either a QR or SVD, e.g.:

x = (W*A).householderQr().solve(W*b);
Registered Member
If I'm not mistaken,

x_hat = inv(A'*W*A)*A'*W*b

is the solution to the weighted LS problem x_hat = argmin_x (A*x - b)'*W*(A*x - b). However,

(A*x - b)'*W*(A*x - b) = (W^(1/2)*(A*x - b))'*(W^(1/2)*(A*x - b)) = ||W^(1/2)*(A*x - b)||_2^2 = ||W^(1/2)*A*x - W^(1/2)*b||_2^2

Thus the standard way of transforming the problem into an unweighted LS problem is to multiply A and b by W^(1/2) (i.e. the square root of W), not by W itself. For more info see: http://en.wikipedia.org/wiki/Least_squa ... st_squares

Multiplying by W thus seems, in effect, to apply the square of the weights? Are you sure this is not what happens in the suggested solution?

Also, from an efficiency perspective, it seems wasteful to use a full dense matrix to represent the diagonal weight matrix. I haven't used it myself, but DiagonalMatrix seems promising: http://eigen.tuxfamily.org/dox/classEigen_1_1DiagonalMatrix.html