Registered Member
|
I'm seeding and downloading hundreds of torrents. When I start ktorrent 2.0.2, it initiates loads of kio_http processes, which seem to consume a lot of memory; sometimes all of it, making my machine quite unresponsive. Is this normal?
|
Registered Member
|
Currently, I am seeding 347 and downloading 10 torrents. So does this mean that ktorrent initiates about 357 connections to the tracker at the same time AND by doing so also opens 357 kio_http processes?
Personally I think this is a bit too much. Can't the announces be queued in some way? |
Moderator
|
|
Registered Member
|
There really ought to be a limit on requests per second per tracker. My KTorrent 2.0.3 client may well have been a reason why one tracker got suspended, likely because its bandwidth limit was exceeded. Please fix this.
KTorrent 2.0.3 just kept bombarding the site with requests even though the tracker returned an HTML error page stating "This Account Has Been Suspended. Please contact the billing/support department ASAP!".
Thank you KTorrent developers!
_________________ "Thou shalt not steal." - STOP PIRACY NOW! |
Moderator
|
I think 2.0.2 waits some time before trying again after a couple of failed attempts. I guess if you have a couple of hundred torrents all using the same tracker, this could add up to quite a bit of bandwidth when you start them all at once. |
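(As an illustrative aside, the kind of wait-and-retry behaviour described above could look roughly like the sketch below. Every name here is invented for the example; this is not KTorrent's actual code.)

    // Hypothetical sketch of an exponential back-off between failed announces.
    #include <algorithm>

    class AnnounceBackoff
    {
    public:
        AnnounceBackoff() : failures(0) {}

        // Call after each failed announce; returns seconds to wait before retrying.
        int nextDelay()
        {
            ++failures;
            // 60 s, 120 s, 240 s, ... capped at one hour.
            return std::min(60 * (1 << std::min(failures - 1, 6)), 3600);
        }

        // Call after a successful announce.
        void reset() { failures = 0; }

    private:
        int failures;
    };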
Registered Member
|
IMO, ktorrent should NOT use kio for this.

What I'd like is something with a proper queue, possibly a priority for each item, and maybe even items that depend on others. Jobs should also be tieable to torrents, and the torrents should be able to show their state, e.g. "Announcing", "Checking Data", "Downloading", "Scraping", etc. That's one thing I've always thought ktorrent lacked: a solid "job queue" system, so ktorrent and individual torrents can queue up multiple actions and schedule them properly. For example, everything should wait for a "data check" (manual or otherwise), other data checks should probably wait for any running check, only N jobs of type "scrape/tracker query" should run at a time (configurable), and "disk allocate" jobs should only run one at a time (fully allocating space for torrent data avoids fragmentation but uses far more disk space up front, and IIRC also makes things work better with Samba and FAT32 — though it seems ktorrent isn't actually fully allocating on my machine any more; I thought it was for a while). In short: if something can take a while, make it a job and queue it.

It may be quite a task depending on ktorrent's structure, but it would allow much more flexible handling of "tasks" or "jobs". And allowing plugins to access the queue (add, remove, etc.) would be a huge bonus: the RSS plugin wouldn't have to roll its own background download handling, it would just tell ktorrent to start a "fetch URL" job. It may not make total sense to have a single queue; it might be easier and simpler to have a few different queues for different things, like downloading standalone HTTP files, data checking, etc. Though a single queue with multiple "slots" and item dependencies would achieve the same thing, and mean only one global queue to worry about. |
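(To make the "N slots per job type" idea above concrete, here is a rough sketch of such a queue. Every class and method name is made up for illustration; none of this is KTorrent code.)

    // Rough sketch of the proposed job queue with per-type concurrency slots.
    #include <list>
    #include <map>
    #include <string>

    class Job
    {
    public:
        virtual ~Job() {}
        virtual std::string type() const = 0;   // e.g. "scrape", "data-check"
        virtual void start() = 0;               // kicks off the actual work
    };

    class JobQueue
    {
    public:
        // Allow at most 'slots' concurrent jobs of the given type.
        void setSlots(const std::string & type, int slots) { maxSlots[type] = slots; }

        void enqueue(Job* job) { pending.push_back(job); schedule(); }

        // Called when a job finishes, so the next one of its type can start.
        void jobDone(Job* job) { running[job->type()]--; schedule(); }

    private:
        void schedule()
        {
            for (std::list<Job*>::iterator i = pending.begin(); i != pending.end(); )
            {
                const std::string t = (*i)->type();
                std::map<std::string, int>::const_iterator s = maxSlots.find(t);
                const int limit = (s == maxSlots.end()) ? 1 : s->second;  // default: 1 slot
                if (running[t] < limit)
                {
                    running[t]++;
                    (*i)->start();
                    i = pending.erase(i);
                }
                else
                    ++i;
            }
        }

        std::list<Job*> pending;                // jobs waiting for a free slot
        std::map<std::string, int> running;     // running jobs per type
        std::map<std::string, int> maxSlots;    // configured slots per type
    };

A torrent (or plugin) would then do something like queue.setSlots("scrape", 2) once and enqueue its jobs instead of firing them off directly.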
Moderator
|
Euhm, kio jobs can be scheduled, we just don't use the scheduling functionality at the moment. KIO is exactly what we need.
You are confusing two distinctly different things here. Tracker handling and data checking are completely different and do not interfere with each other, so I see no reason to have one block the other; putting them in the same queue doesn't make any sense. Yes, the data check is messy, but that will be dealt with in the future. And the RSS plugin does not roll its own background downloading, it uses KIO::Jobs to do this — the same damn KIO::Jobs we use to download torrent files and to do HTTP tracker requests. All we need is a bit of scheduling to prevent this kind of stuff. |
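(A hedged sketch of what handing tracker requests to KIO's scheduler could look like in KDE 3: KIO::get and KIO::Scheduler::scheduleJob are real KIO calls, but the Tracker class and the announceData/announceResult slots are invented for the example and are not KTorrent's actual code.)

    #include <kio/job.h>
    #include <kio/scheduler.h>

    // Assumed to be a method of a QObject subclass declaring the two slots below.
    void Tracker::doAnnounce(const KURL & announceUrl)
    {
        // reload = true to bypass the cache, no progress dialog
        KIO::TransferJob* job = KIO::get(announceUrl, true, false);

        // Hand the job to KIO's scheduler instead of letting it spawn a
        // kio_http slave immediately; the scheduler queues it and reuses slaves.
        KIO::Scheduler::scheduleJob(job);

        connect(job, SIGNAL(data(KIO::Job*, const QByteArray &)),
                this, SLOT(announceData(KIO::Job*, const QByteArray &)));
        connect(job, SIGNAL(result(KIO::Job*)),
                this, SLOT(announceResult(KIO::Job*)));
    }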
Registered Member
|
I don't believe I said they did. Personally I think something as complex as a p2p application needs a proper job system, but maybe that's just me. And KIO can only handle IO-related jobs, so basing all job queues off of it probably won't work. I want something where a user can easily configure the fine-grained handling of all the tasks the program needs to do. I would also hope to see something similar to other torrent programs, where you can see the other "jobs" that are active on a torrent right in the list view. But hey, it may be an overly complicated idea. I just prefer this sort of setup in programs, versus a string of interdependent signals and slots that can trigger each other in various ways and at various times. |
Moderator
|
My intention is to move to a threaded model where each torrent runs in its own thread, which makes things like preallocating and data checking easier (just do it in the torrent thread), and we can even check multiple torrents at the same time. The tracker handling does not interfere, it just adds a bunch of IP addresses to a list and that's it, so it can run at the same time as the data checking. Your idea is not overly complicated, I just prefer a different approach. |
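(A minimal sketch of what such a per-torrent thread could look like with Qt's QThread; the Torrent class and its methods are placeholders for illustration, not KTorrent's actual classes.)

    #include <qthread.h>

    class TorrentThread : public QThread
    {
    public:
        TorrentThread(Torrent* tc) : torrent(tc) {}

    protected:
        virtual void run()
        {
            // Long-running, disk-heavy work stays off the GUI thread:
            // preallocation and data checking happen here, so several
            // torrents could be checked in parallel without blocking the UI.
            torrent->preallocateDiskSpace();
            torrent->checkData();
            torrent->update();   // hypothetical up/download loop
        }

    private:
        Torrent* torrent;
    };

Each torrent would then get its own TorrentThread, started with the usual QThread::start().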
Registered Member
|
Not that that's a very good idea in the first place: checking several torrents at once slogs the disk, and ktorrent's checker seems to be slower than most. With my idea, things would run in background threads anyhow, which is fine. I just like the idea of a single flexible job system, where even a GUI to view and configure all jobs is simple. |