Sparse matrices

cn_cologne
Registered Member

Sparse matrices

Fri Jun 24, 2011 9:53 am
I think Eigen's dense matrix part is very good. Unfortunately, the current sparse matrix implementation is, in my view, missing some key features, such as accessing sub-matrices, swapping rows, etc. I think the sparse type should offer most of the functionality of the dense matrix type (Boost's uBLAS, for example, follows this principle). I do not see the point in forbidding direct write access via operator()(i,j) while granting it via .coeffRef(i,j). Another point is that a direct sparse LU decomposition without reordering is exactly the same as in the dense case, so the algorithm should work for both a dense and a sparse type.
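To illustrate the asymmetry I mean, here is a minimal sketch using Eigen's SparseMatrix (the indices and values are just placeholders):

Code: Select all
#include <Eigen/Sparse>

int main()
{
    Eigen::SparseMatrix<double> m(10, 10);

    m.coeffRef(2, 3) = 1.5;    // write access works here (inserts the entry if it is absent)
    double v = m.coeff(2, 3);  // read access works too
    // m(2, 3) = 1.5;          // but no writable operator()(i,j) is provided
    (void)v;
    return 0;
}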

best regards,
Carsten
ggael
Moderator

Re: Sparse matrices

Fri Jun 24, 2011 12:35 pm
The problem is that many operations that are doable on a dense representation would be extremely slow on a sparse one. Boost uBLAS is a good example of that: most of the operations it provides should not be used because they are damn slow. For instance, addressing a row of a column-major sparse matrix is a no-go; if you don't see why, have a look at how sparse matrices are represented. For the design of the Sparse module, I tried to expose only the operations which can be implemented efficiently. That's why there is no easy operator()(i,j), which would only be useful for debugging purposes. Then, the sparse and dense LU decomposition implementations are fundamentally different. Another fundamental difference is that without fill-in reordering, a direct sparse decomposition has little chance to complete without exceeding memory, and even if it does it would take a large amount of time to compute and to use for the actual solving...
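To make the storage point concrete, here is a minimal sketch assuming the default column-major SparseMatrix and its InnerIterator (the values are arbitrary):

Code: Select all
#include <Eigen/Sparse>
#include <iostream>

int main()
{
    // Column-major (the default) sparse matrix: nonzeros are stored per column.
    Eigen::SparseMatrix<double> m(5, 5);
    m.insert(0, 1) = 2.0;
    m.insert(3, 1) = -1.0;
    m.insert(4, 4) = 7.0;
    m.makeCompressed();

    // Efficient: iterate over the nonzeros of one column (one inner vector).
    for (Eigen::SparseMatrix<double>::InnerIterator it(m, 1); it; ++it)
        std::cout << "(" << it.row() << "," << it.col() << ") = " << it.value() << "\n";

    // "Addressing a row" of this column-major matrix would instead require
    // scanning the index array of every column, i.e. touching the whole matrix.
    return 0;
}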
cn_cologne
Registered Member

Re: Sparse matrices

Fri Jun 24, 2011 2:39 pm
@ggael:

I agree that certain sparse operations can be slow (depending on the underlying data), and uBLAS is a good example of that. On the other hand, in my case I need certain features that can be slow, such as sub-matrices; otherwise my code gets very clumsy. I would prefer a sparse representation that offers a lot of features and decent performance.

> That's why there is no easy operator()(i,j), which would only be useful for debugging purposes.

I don't get your point. You can already access the data via coeffRef(i,j), so why not allow it directly via operator()(i,j)?

> Another fundamental difference is that without fill-in reordering, a direct sparse decomposition has little chance to complete without exceeding memory, and even if it does it would take a large amount of time to compute and to use for the actual solving...

Well, I don't agree in general. There are cases where the fill-in is not severe at all. Anyway, if you want to implement a direct solver, you need to perform an LU decomposition at some stage, and at that point doing it with Eigen is a bit difficult.
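For the record, a minimal sketch of the kind of direct solve workflow I have in mind; this assumes a sparse LU solver class along the lines of Eigen::SparseLU with a COLAMD fill-in reordering (which later Eigen releases ship), so treat the exact names as an assumption:

Code: Select all
#include <Eigen/Dense>
#include <Eigen/Sparse>

int main()
{
    // Small, trivially invertible sparse system, just for illustration.
    Eigen::SparseMatrix<double> A(3, 3);
    A.insert(0, 0) = 4.0;
    A.insert(0, 2) = 1.0;
    A.insert(1, 1) = 2.0;
    A.insert(2, 2) = 1.0;
    A.makeCompressed();

    Eigen::VectorXd b(3);
    b << 5.0, 2.0, 1.0;

    Eigen::SparseLU<Eigen::SparseMatrix<double>, Eigen::COLAMDOrdering<int>> solver;
    solver.analyzePattern(A);          // fill-in reducing column reordering
    solver.factorize(A);               // numerical LU factorization
    Eigen::VectorXd x = solver.solve(b);
    return 0;
}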
ggael
Moderator

Re: Sparse matrices

Sat Jun 25, 2011 8:21 am
Could you be more specific about your use case, so that we have some concrete material to design proper and efficient solutions?

Also note that you can already address a set of inner vectors.
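For instance, a minimal sketch using innerVectors() on a (column-major) SparseMatrix, where the inner vectors are the columns; the matrix content is just an example:

Code: Select all
#include <Eigen/Sparse>
#include <iostream>

int main()
{
    // Column-major sparse matrix: each inner vector is a column.
    Eigen::SparseMatrix<double> m(6, 6);
    m.insert(0, 2) = 1.0;
    m.insert(3, 3) = 2.0;
    m.insert(5, 4) = 3.0;
    m.makeCompressed();

    // Copy a contiguous set of inner vectors (columns 2..4) into a new sparse matrix.
    Eigen::SparseMatrix<double> sub = m.innerVectors(2, 3);

    std::cout << sub.rows() << "x" << sub.cols()
              << " block, " << sub.nonZeros() << " nonzeros\n";  // prints "6x3 block, 3 nonzeros"
    return 0;
}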

