Registered Member
Hello,
I'm trying to extend Eigen to use MAGMA (or CUBLAS where possible) as a backend, in the same way it already does with MKL. I'm a bit stuck on Assign_MAGMA.h. I see the need for such a file with MKL because of all the special unary operators MKL provides, but I can't figure out how to write an Assign_MAGMA.h given that MAGMA does not have special definitions for unary operators. On the other hand, MAGMA does define basic types, e.g. magmaFloatComplex, magmaDoubleComplex, etc., but maybe this is not relevant for extending Assign.h?

TIA, best regards,
Giovanni
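On the type side, the usual bridge is a small mapping between Eigen's scalars and MAGMA's types. A minimal sketch, where magma_type and to_magma are hypothetical names used only for illustration (magmaFloatComplex and magmaDoubleComplex are MAGMA's actual typedefs):

    // Hypothetical scalar-type mapping between Eigen/std scalars and MAGMA types.
    #include <complex>
    #include <magma_v2.h>   // or magma.h on older MAGMA releases

    template<typename Scalar> struct magma_type;
    template<> struct magma_type<float>                 { typedef float              type; };
    template<> struct magma_type<double>                { typedef double             type; };
    template<> struct magma_type<std::complex<float> >  { typedef magmaFloatComplex  type; };
    template<> struct magma_type<std::complex<double> > { typedef magmaDoubleComplex type; };

    // Both complex layouts are two contiguous reals, so a pointer cast is the
    // usual bridge (Eigen's MKL path does the same for MKL_Complex8/MKL_Complex16).
    template<typename Scalar>
    typename magma_type<Scalar>::type* to_magma(Scalar* p)
    { return reinterpret_cast<typename magma_type<Scalar>::type*>(p); }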
Moderator
This Assign_MKL.h is for the VML, so it is not related to MAGMA. Also, if MAGMA exposes a LAPACK-compatible API, then it might be simpler to improve the current BLAS/LAPACKe interface, which is currently limited to MKL, so that it works with any BLAS/LAPACKe implementation.
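To make the "any BLAS/LAPACKe" point concrete: the idea is that the product kernels only need the standard Fortran BLAS symbols, which every implementation (MKL, OpenBLAS, etc.) exports, so the backend can be swapped at link time. A minimal sketch calling sgemm_ directly (the integer width may be 64-bit on ILP64 builds, so this is only illustrative):

    // Standard Fortran BLAS symbol; any conforming BLAS library provides it.
    extern "C" void sgemm_(const char* transa, const char* transb,
                           const int* m, const int* n, const int* k,
                           const float* alpha, const float* a, const int* lda,
                           const float* b, const int* ldb,
                           const float* beta, float* c, const int* ldc);

    // C = A*B with column-major m x k and k x n operands.
    void gemm_via_blas(int m, int n, int k,
                       const float* A, const float* B, float* C)
    {
        const float one = 1.0f, zero = 0.0f;
        sgemm_("N", "N", &m, &n, &k, &one, A, &m, B, &k, &zero, C, &m);
    }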
Moderator
BTW, are you willing to share your MAGMA support layer?
Registered Member
Hello ggael,
I'd be happy to share it and collaborate on GitHub; I will create a new project and upload my progress so far. I have mixed feelings as to whether the project should contain only the files relevant to the MAGMA backend or a full copy of Eigen. In the actual project I'm working on I use Eigen from the header files, so I don't need, and have not updated, the Eigen build-related files, i.e. the CMake or Makefiles. Although I'm a big enemy of copy-paste, in this case of extending Eigen to use MAGMA as a backend it kind of makes sense to me: by being independent, the copied *_MAGMA.h files have minimal impact on Eigen, and I can easily take new Eigen releases without each update turning into a painful merging exercise.

Another thing: I try to use CUBLAS directly where possible, but via the MAGMA APIs, e.g. I do that in the general matrix-matrix product. Also, the MKL and MAGMA code for achieving the same thing looks completely different, i.e. MAGMA requires a different initialization, a different memory-management API and different types, so it is really not feasible to interleave the two into a single macro.

One more note: I was contemplating whether to use CULA or MAGMA; after running similar benchmarks, MAGMA beats CULA performance-wise in all workloads.

Best regards,
Giovanni

PS: I will follow up on the GitHub project once I get it set up.
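To illustrate how different the two code paths are, here is a minimal sketch of a single-precision GEMM routed through MAGMA, with the explicit initialization, device allocation and host/device transfers that the MKL path never needs. It assumes the MAGMA 2.x-style API with explicit queues; older releases use slightly different signatures, so check the installed headers, and error checking is omitted:

    #include <magma_v2.h>

    // C = A*B for column-major m x k and k x n host matrices, offloaded via MAGMA.
    void gemm_via_magma(int m, int n, int k,
                        const float* A, const float* B, float* C)
    {
        magma_init();                          // one-time library initialization

        magma_device_t dev;
        magma_getdevice(&dev);
        magma_queue_t queue;
        magma_queue_create(dev, &queue);

        // MAGMA works on its own device buffers.
        float *dA, *dB, *dC;
        magma_smalloc(&dA, (size_t)m * k);
        magma_smalloc(&dB, (size_t)k * n);
        magma_smalloc(&dC, (size_t)m * n);

        // Host -> device copies of the column-major operands.
        magma_ssetmatrix(m, k, A, m, dA, m, queue);
        magma_ssetmatrix(k, n, B, k, dB, k, queue);

        // dC = 1*dA*dB + 0*dC on the GPU.
        magma_sgemm(MagmaNoTrans, MagmaNoTrans, m, n, k,
                    1.0f, dA, m, dB, k, 0.0f, dC, m, queue);

        // Device -> host copy of the result.
        magma_sgetmatrix(m, n, dC, m, C, m, queue);

        magma_free(dA); magma_free(dB); magma_free(dC);
        magma_queue_destroy(queue);
        magma_finalize();
    }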
Moderator
This all sounds great. Looking forward to trying it!
Registered Member
OK, details are in a separate thread: viewtopic.php?f=74&t=112065