
[URGENT!!!] Memory issues and tracker bombing

J (Registered Member)
I'm seeding and downloading hundreds of torrents. When I start ktorrent 2.0.2, it initiates loads of kio_http processes, which seem to consume a lot of memory; sometimes all of it, making my machine quite unresponsive. Is this normal?

Last edited by J on Wed Oct 11, 2006 8:33 pm, edited 2 times in total.
George (Moderator)
Sun Sep 24, 2006 6:48 pm
These processes are for announcing to the tracker. So this is pretty normal.
J (Registered Member)
Mon Sep 25, 2006 3:39 am
Couldn't this be somehow limited?
George (Moderator)
Mon Sep 25, 2006 4:22 pm
I have to look into it; this is KDE code and I'm not quite sure how these processes get started and stopped. Tracker requests are usually 30 minutes apart, so there shouldn't be many of these running at the same time.
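For reference, each of those kio_http processes is just performing a standard BitTorrent HTTP announce: a single GET request along these lines (the host name and the values are placeholders, and the peer_id prefix is only an example; the parameter names are the standard ones from the HTTP tracker protocol):

Code:

    GET /announce?info_hash=%12%34...&peer_id=-KT2030-xxxxxxxxxxxx
        &port=6881&uploaded=0&downloaded=0&left=734003200
        &compact=1&event=started HTTP/1.0
    Host: tracker.example.org

The tracker replies with a bencoded peer list, so each announce is short-lived; the problem described in this thread is only that hundreds of them are started at once.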
J (Registered Member)
Mon Sep 25, 2006 5:35 pm
Well, this only happens on startup. When ktorrent has been running for some time, it usually has about 4 kio_http processes present at any moment.
George (Moderator)
Tue Sep 26, 2006 6:08 pm
If 100 torrents start at the same time, then they would all have to do an announce.
J (Registered Member)
Tue Sep 26, 2006 6:32 pm
Currently, I am seeding 347 torrents and downloading 10. So does this mean that ktorrent initiates about 357 connections to the tracker at the same time AND, by doing so, also opens 357 kio_http processes?

Personally I think this is a bit too much. Can't the announces be queued in some way?
George (Moderator)
Wed Sep 27, 2006 3:59 pm
We didn't know this was going to be a problem for extreme amounts of torrents.

I believe it is possible to queue these kio_http jobs. I will look into this.
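A minimal sketch of what that queueing might look like with the KDE 3 KIO scheduler, assuming the KIO::Scheduler::scheduleJob() API (this is illustrative only, not the actual KTorrent code; startAnnounce and the receiver's slot names are made up):

Code:

    #include <kurl.h>
    #include <kio/job.h>
    #include <kio/scheduler.h>

    // Instead of letting every announce start its own kio_http slave
    // immediately, hand the job to the scheduler so it is queued and
    // served by a limited pool of slaves.
    void startAnnounce(const KURL &announceUrl, QObject *receiver)
    {
        KIO::TransferJob *job = KIO::get(announceUrl, true, false);
        KIO::Scheduler::scheduleJob(job); // queue it instead of running it at once

        QObject::connect(job, SIGNAL(data(KIO::Job*, const QByteArray&)),
                         receiver, SLOT(announceData(KIO::Job*, const QByteArray&)));
        QObject::connect(job, SIGNAL(result(KIO::Job*)),
                         receiver, SLOT(announceResult(KIO::Job*)));
    }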
J (Registered Member)
Wed Oct 11, 2006 8:32 pm
There really ought to be a limit on requests per second per tracker. My KTorrent 2.0.3 client may well be one reason why a tracker was suspended, likely because its bandwidth limit was exceeded. Please fix this.

KTorrent 2.0.3 just kept bombing the site with requests even though the tracker returned an HTML error page stating "This Account Has Been Suspended. Please contact the billing/support department ASAP!". :(


Thank you KTorrent developers! :)
_________________
"Thou shalt not steal." - STOP PIRACY NOW!
George (Moderator)
Thu Oct 12, 2006 5:36 pm
J wrote: There really ought to be a limit on requests per second per tracker. My KTorrent 2.0.3 client may well be one reason why a tracker was suspended, likely because its bandwidth limit was exceeded. Please fix this.

KTorrent 2.0.3 just kept bombing the site with requests even though the tracker returned an HTML error page stating "This Account Has Been Suspended. Please contact the billing/support department ASAP!". :(


:-) I think 2.0.2 waits some time before trying again after a couple of failed attempts. I guess if you have a couple of hundred torrents all using the same tracker, this could be quite a bit of bandwidth when you start them all at once.
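The wait-and-retry behaviour George mentions is usually just a growing delay between attempts. A rough Qt-style sketch of the idea (AnnounceRetry and its members are hypothetical names, not KTorrent's actual tracker code):

Code:

    #include <qobject.h>
    #include <qtimer.h>

    // Retries a failed announce with an exponentially growing delay,
    // capped so that a dead or suspended tracker is only polled every
    // half hour instead of being hammered.
    class AnnounceRetry : public QObject
    {
        Q_OBJECT
    public:
        AnnounceRetry(QObject *parent = 0)
            : QObject(parent), delay(60) // start with one minute
        {
            connect(&timer, SIGNAL(timeout()), this, SLOT(retry()));
        }

        void announceFailed()
        {
            timer.start(delay * 1000, true); // single-shot timer
            delay = QMIN(delay * 2, 1800);   // double the delay, cap at 30 minutes
        }

        void announceSucceeded() { delay = 60; } // reset after a success

    signals:
        void doAnnounce(); // hook this up to the code that does the real request

    private slots:
        void retry() { emit doAnnounce(); }

    private:
        QTimer timer;
        int delay; // seconds
    };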
imported4-Tomasu (Registered Member)
Sun Oct 15, 2006 12:50 pm
IMO, ktorrent should NOT use kio for this.

Something with a nice queue, possibly a priority for each item, and maybe even items that depend on others. They should also be tied to torrents, and the torrents should be able to show the state, e.g. "Announcing", "Checking Data", "Downloading", "Scraping", etc.

That's one thing I've always thought ktorrent lacked: a solid "job queue" system, so ktorrent and individual torrents can queue up multiple actions and schedule them properly. For example:
- everything should wait for a "data check" (manual or otherwise), and other data checks should probably wait for any running check;
- N jobs of type "scrape/tracker query" should be run at a time (configurable);
- "disk allocate" jobs (fully allocating space for the torrent data, which fixes fragmentation but uses up way more disk space to start with, and iirc also makes things work better with samba and fat32) should only be done one at a time (though it seems ktorrent isn't actually fully allocating on my machine anymore; I thought it was for a while).
If it's something that can take some time, make it a job and queue it.

It may be quite a task depending on ktorrent's structure, but it would allow much more flexible handling of "tasks" or "jobs". And allowing plugins to access the queue (add, remove, etc.) would be a huge bonus (the rss plugin wouldn't have to roll its own background download handling; it would just have to tell ktorrent to start a "fetch url" job).

Now, it may not make total sense to have a single queue; it might be easier and simpler to make a few different queues for different things, like downloading standalone http files, data checking, etc. Though a single queue with multiple "slots" and item dependencies would achieve the same thing, and would mean only one global queue to worry about.
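In sketch form, the kind of queue being described here could be as simple as grouping jobs by type and capping how many of each type run at once. The following is purely illustrative (JobQueue, Job and the slot limits are made-up names and numbers, not KTorrent code):

Code:

    #include <qmap.h>
    #include <qvaluelist.h>

    enum JobType { DataCheck, TrackerRequest, DiskAlloc, HttpFetch };

    struct Job
    {
        Job(JobType t) : type(t) {}
        virtual ~Job() {}
        virtual void start() = 0; // implemented by the concrete job
        JobType type;
    };

    // One global queue; each job type gets a configurable number of "slots".
    class JobQueue
    {
    public:
        JobQueue()
        {
            limit[DataCheck] = 1;      // only one data check at a time
            limit[DiskAlloc] = 1;      // preallocation is disk-heavy too
            limit[TrackerRequest] = 5; // a handful of announces in parallel
            limit[HttpFetch] = 3;      // standalone downloads (rss plugin etc.)
        }

        void enqueue(Job *j) { waiting.append(j); schedule(); }

        void jobDone(Job *j) { running[j->type]--; delete j; schedule(); }

    private:
        void schedule()
        {
            QValueList<Job*>::Iterator it = waiting.begin();
            while (it != waiting.end())
            {
                Job *j = *it;
                if (running[j->type] < limit[j->type])
                {
                    running[j->type]++;
                    it = waiting.remove(it); // take it off the waiting list
                    j->start();              // and run it
                }
                else
                    ++it;
            }
        }

        QValueList<Job*> waiting;
        QMap<JobType, int> running;
        QMap<JobType, int> limit;
    };

Per-torrent dependencies ("wait for the data check") would then just be extra conditions inside schedule().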
George (Moderator)
Mon Oct 16, 2006 6:52 pm
Euhm, kio jobs can be scheduled; we just don't use the scheduling functionality at the moment. KIO is exactly what we need.

You are confusing two distinctly different things here. Tracker handling and data checking are completely different and do not interfere with each other, so I see no reason to have one block the other. Putting them in the same queue doesn't make any sense because of this.

Yes, the data check is messy, but that will be dealt with in the future.

And the RSS plugin does not roll its own background downloading; it uses KIO::Jobs to do this. The same damn KIO::Jobs we use to download torrent files and to do HTTP tracker requests.

All we need is a bit of scheduling to prevent this kind of stuff.
imported4-Tomasu (Registered Member)
Mon Oct 16, 2006 8:23 pm
George wrote: Tracker handling and data checking are completely different and do not interfere with each other.

I don't believe I said they did.

Personally, I think something as complex as a p2p application needs a proper job system, but maybe that's just me. And KIO can only handle IO-related jobs, so basing all job queues off of it probably won't work.

Something where a user can easily configure the minute handling of all tasks the program needs to do.

I would hope to see something similar to other torrent programs, where you can see the "jobs" that are active on a torrent in the list view.

But hey, it may be an overly complicated idea. I just prefer this sort of setup in programs, versus a string of interdependent signals and slots that can trigger each other in various ways and at various times.
George (Moderator)
Tue Oct 17, 2006 5:18 pm
Tomasu wrote: But hey, it may be an overly complicated idea. I just prefer this sort of setup in programs, versus a string of interdependent signals and slots that can trigger each other in various ways and at various times.


My intention is to move to a threaded model where each torrent runs in its own thread, which makes things like preallocation and data checking easier (just do it in the torrent thread), and we can even check multiple torrents at the same time.
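A per-torrent thread along those lines would, in Qt terms, look roughly like this (a sketch only; Torrent and its methods are hypothetical placeholders, not the real KTorrent classes):

Code:

    #include <qthread.h>

    // Stand-in for the real per-torrent object (hypothetical interface).
    class Torrent
    {
    public:
        virtual ~Torrent() {}
        virtual void preallocateFiles() = 0;
        virtual void checkData() = 0;
        virtual void update() = 0;       // peer/chunk bookkeeping
        virtual bool stopped() const = 0;
    };

    // Each torrent gets its own worker thread, so slow operations such
    // as preallocation and data checking never block the GUI or the
    // other torrents.
    class TorrentThread : public QThread
    {
    public:
        TorrentThread(Torrent *t) : torrent(t), checkRequested(false) {}

        void requestDataCheck() { checkRequested = true; }

    protected:
        virtual void run()
        {
            torrent->preallocateFiles();
            while (!torrent->stopped())
            {
                if (checkRequested)
                {
                    torrent->checkData();
                    checkRequested = false;
                }
                torrent->update();
                msleep(100);
            }
        }

    private:
        Torrent *torrent;
        volatile bool checkRequested;
    };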

The tracker handling does not interfere; it just adds a bunch of IP addresses to a list and that's it, so they can run at the same time as the data checking.

Your idea is not overly complicated, I just prefer a different approach.
imported4-Tomasu (Registered Member)
Tue Oct 17, 2006 11:46 pm
George wrote: and we can even check multiple torrents at the same time

Not that that's a very good idea ;) First off, checking slogs the disk, and ktorrent's checker seems to be slower than most.

George wrote: so they can run at the same time as the data checking.
With my idea, things would run in background threads anyhow ;)

George wrote: Your idea is not overly complicated, I just prefer a different approach.

Which is fine. I just like the idea of a single flexible job system, where even a GUI to view and configure all jobs is simple.

