tonyhenz
Registered Member

Hi all eigen people,
Recently i have met a problem in using eigen library to sovle large sparse matrix. In my code, there is a integer number "N", then the final matrix A (Ax=b) is of dimension N^2 * N^2, in which only N^3 elements are nonzero, so i decide to use sparsematrix to store and solve this matrix equations. At the very beginning i have tested small sparse matrix, to say N up to 100, the BiCGSTAB is definitely OK. But when comes to my really problem, i need to calculate the problem with N up to for example 1000, I found the BiCGSTAB is quite slow, for example, the running program with N=481, and it takes more than 3 hours, so far not finished. So i am afraid there are some problem with large sparse matrix. ================================================= BiCGSTAB<SpMat> solver; VectorXcd pillp = solver.compute(A).solve(b); ================================================= However, i have already used the SparseLU instead, ================================================= SparseLU<SpMat,COLAMDOrdering<int> > solver(A); VectorXcd pillp = solver.solve(b); ================================================= it takes 1 hour to solve the problem with N=757. But if i set N=481 it will show some error, as ================================================================================= ../eigen3/Eigen/src/Core/PlainObjectBase.h:265: void Eigen::PlainObjectBase<Derived>::resize (typename Eigen::internal::traits<T>:$$ts<T>::Index) [with Derived = Eigen::Matrix<int, 0x00000000000000001, 1, 0, 0x00000000000000001, 1>]: Assertion `((SizeAtCompileTime == D$$e == Dynamic && (MaxSizeAtCompileTime==Dynamic  size<=MaxSizeAtCompileTime))  SizeAtCompileTime == size) && size>=0' failed. =================================================================================== when solve large N problem, to say N=1000, the SparseLU will not work also, the error output like STL bad_alloc, I think it might be due to the meomery issue. 
I have checked that the program always stops once memory usage reaches 24 GB, but my computer has 256 GB of memory. Best Regards, Tony
Dee33
Registered Member

Yes, this seems to be a memory allocation problem. Note that even if you have 256 GB on your system, the maximum memory a single matrix can address is restricted by the maximum value of the array index type (std::numeric_limits<int>::max() * sizeof(Scalar)).
You can try to address more memory by using long int or long long as the index type:

tonyhenz
Registered Member

Hi Dee33, Thanks. That is a very helpful suggestion. I will give it a try. Best regards, Tony