Registered Member
|
All,
I've moved to having only fanless devices in my office. I love working in an environment without that kind of noise. However, that means I have relatively low-power devices too. I have quite a few boxes in the basement; I have a server rack, actually. Obviously, the future will see ever more powerful devices that can be made fanless, but I'm pretty sure my server rack will always be more powerful. Also, I have a gigabit Ethernet network between these devices, meaning that certain operations could be made faster by letting another host on the LAN participate in the processing. I already do this a bit: both Digikam and Amarok use a MySQL database on a server in the basement. It could do much more, though.
My suggestion is therefore to implement some generic functionality that helps applications offload processing to a different host on the LAN. These hosts should be able to come and go, and it should be possible to configure them centrally, not on a per-application basis.
There are several ways this could be done. One possibility is to seek inspiration in DVD::Rip, which uses a cluster to transcode. This relies on the user having an NFS infrastructure defined, and SSH to distribute the load. This is probably the simplest to implement, and it can work for relatively large volumes of distributed data, especially if the processing can occur on the NFS server, but it places a burden on the user to set everything up correctly. Another possibility is to implement a custom protocol (or indeed, use MPI), so that very little configuration needs to be done on participating hosts. If all data needs to be shipped back and forth, it would probably handle less data, though. Perhaps the multithreading framework could be extended? A third thing that could be supported is having all indexing processes and similar tasks, things that typically run in the background, run on a remote host. They might well be indexing a remote NFS host anyway.
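To make the DVD::Rip-style option concrete, here is a minimal sketch of distributing jobs round-robin over SSH. Everything specific is an assumption: the host names, the clip names, and `melt` standing in for the real transcode command; the clips are assumed to live on an NFS path mounted identically on every host, with passwordless SSH set up.

```python
# Sketch of DVD::Rip-style load distribution (hypothetical hosts/paths).
# Assumes clips sit on an NFS share visible at the same path on every
# worker, and that passwordless SSH to each worker is configured.
import itertools
import subprocess

def assign_jobs(clips, hosts):
    """Pair each clip with the next host in round-robin order."""
    rotation = itertools.cycle(hosts)
    return [(clip, next(rotation)) for clip in clips]

def launch(clip, host, runner=subprocess.Popen):
    # `melt` is a stand-in for the actual transcode command and options.
    cmd = ["ssh", host, "melt", clip]
    return runner(cmd)

if __name__ == "__main__":
    # Dry run: join the command into a string instead of executing it.
    for clip, host in assign_jobs(["a.dv", "b.dv", "c.dv", "d.dv"],
                                  ["host-a", "host-b", "host-c"]):
        print(launch(clip, host, runner=" ".join))
```

Swapping the dry-run `runner` for the default `subprocess.Popen` would actually launch the jobs; collecting the returned processes and waiting on them would complete the picture.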
I know having this kind of capability has been discussed for kdenlive, but I can imagine it could also be useful for, for example, Digikam and Brasero (which could do the same thing DVD::Rip does), and other applications that occasionally have some heavy processing. It could also open up more opportunities, like office environments sharing processing power. Oh, BTW, can you imagine a Beowulf cluster of KDE desktops? |
Registered Member
|
Certainly an interesting idea. I don't know how far it will go though.
Most desktop apps are supposed to be pretty lightweight, and what kind of background processing goes on, and what effort would be required to send it off to other servers, depends heavily on the application and the task being performed. I suspect implementing this feature may be more effort than it is worth (especially since most KDE desktop users only have one machine, or otherwise don't have servers readily available to use for desktop app background processing).
airdrik, proud to be a member of KDE forums since 2008-Dec.
|
Registered Member
|
Yeah, it might. I think the most reasonable way to start would be for the applications that would benefit most from it to look into their common needs. For me, those would be digikam and kdenlive.
That's true, but my thought was that if you have many computers, like in an office, you could have a single server providing computing power to many fanless desktops. I would have liked that; it would have made cubicles slightly more bearable... |
Registered Member
|
When looking at specific applications, I can definitely see kdenlive benefiting (if it doesn't already have this functionality) as offloading the rendering to a render farm seems like it should be a standard feature for a video processing application which aspires to anything more than casual use.
Digikam could also benefit when doing bulk image processing operations on a decently-sized library of images. Of course a tech-savvy user could probably turn this into a couple of scripts (using e.g. imagemagick) which could be distributed and kicked off on the processing server/cluster in a few different ways. Having that built into the app would certainly simplify that for the user.
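As a rough illustration of the "couple of scripts" idea for bulk image processing, the sketch below splits a file list into per-host chunks and runs ImageMagick on each chunk over SSH in parallel. The host names and file paths are hypothetical, and the files are assumed to be visible at the same NFS path on every worker.

```python
# Sketch (hypothetical hosts/paths): fan out ImageMagick jobs to several
# LAN hosts at once.  Each worker thread owns one SSH session per file;
# files are assumed reachable at the same NFS path on all hosts.
from concurrent.futures import ThreadPoolExecutor
import subprocess

def chunk(items, n):
    """Split items into n interleaved chunks, one per host."""
    return [items[i::n] for i in range(n)]

def convert_remote(host, files, runner=subprocess.call):
    # `mogrify -resize 50%` is just an example ImageMagick operation.
    for f in files:
        runner(["ssh", host, "mogrify", "-resize", "50%", f])

def fan_out(files, hosts, runner=subprocess.call):
    # One thread per host; the executor waits for all jobs on exit.
    with ThreadPoolExecutor(max_workers=len(hosts)) as pool:
        for host, part in zip(hosts, chunk(files, len(hosts))):
            pool.submit(convert_remote, host, part, runner)

if __name__ == "__main__":
    # Dry run: print each command instead of executing it.
    fan_out(["a.jpg", "b.jpg", "c.jpg"], ["host-a", "host-b"],
            runner=print)
```

This is exactly the kind of plumbing a tech-savvy user can script today, and what a built-in feature would hide behind a settings dialog.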
airdrik, proud to be a member of KDE forums since 2008-Dec.
|
Registered Member
|
Folks,
I tried this in my last video project, a 2-hour travel documentary. What I did was create proxy wrappers for ffmpeg and MLT, configure them in kdenlive, and have jobs like proxy generation run on another host. Technically, this is not super complex, but it took some effort to get both machines configured so that it works smoothly. In the end, the response time on my workstation running kdenlive was far better, especially after importing a set of video files and starting a batch of proxy generations. If there is interest, I might clean and polish the project, create a project on GitHub, and make it public. Kind regards Josef |
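A minimal sketch of such a wrapper, under assumptions not confirmed by the post: kdenlive is pointed at this script instead of the real ffmpeg binary, the worker host name is hypothetical, and project files are assumed to live on a share mounted at the same path on both machines.

```python
#!/usr/bin/env python3
# Hypothetical proxy wrapper: forwards the whole ffmpeg command line to
# a remote worker over SSH.  Only works if the input/output paths are
# valid on the worker too (e.g. via a shared NFS mount) and passwordless
# SSH is set up.
import subprocess
import sys

REMOTE_HOST = "render-box"   # assumption: placeholder worker hostname

def remote_ffmpeg(args, host=REMOTE_HOST, runner=subprocess.call):
    """Run ffmpeg on `host` with the arguments kdenlive handed us."""
    return runner(["ssh", host, "ffmpeg"] + list(args))

if __name__ == "__main__" and len(sys.argv) > 1:
    sys.exit(remote_ffmpeg(sys.argv[1:]))
```

The same pattern would apply to a wrapper for MLT's command-line tool; the interesting part of the real project is presumably the configuration glue around scripts like this.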
Registered Member
|
I'd love to have this documented! Sounds really useful. |
Registered Member
|
Yeah, I'd love to see how this is done!
|
Registered Member
|
Why? Because I am rendering a 4K, one-hour-long video on an i5-2400, and I have another 7-10 of them doing nothing.
Right now I start the render and go to bed. If there is a mistake, it takes another day. A week later I have a finished product and another one to start. It would be a great help if I could use more than one PC to render. |
Registered Member
|
At this point, I would recommend firing off a separate post for adding this functionality specifically to Kdenlive (and separate posts for other applications such as Digikam) as I feel like this general-purpose solution request isn't going anywhere.
(If it turns out that a few applications want similar functionality and end up sharing code that results in a general-purpose solution, then great.)
airdrik, proud to be a member of KDE forums since 2008-Dec.
|