
2+ GB virtual memory, 1+ GB resident memory usage

imported4-brendon
Registered Member
Posts
7
Karma
0
I've had to give up on using KTorrent (Qt: 4.3.4, KDE: 4.0.3, KTorrent: 3.0.1) because of the huge memory usage. I've got a long list of 400+ MB torrents that are queued up for downloading and a bunch seeding until they reach the limit. I played with the rules for the number seeding, the number downloading, speeds, etc., but couldn't find anything that really made a difference after a few hours of running.

The problem is twofold. The first part is massive memory usage, which, while not the end of the world, is bad. The second is that the program continuously accesses that memory, causing the machine to start thrashing. Consequently, when KTorrent is started it downloads at, say, 60 kB/s, but after a couple of hours it can only manage 1-2 kB/s due to all the disk access.

Plus, the machine is unusable :shock: due to everything else getting paged out all the time.

This morning I switched all the torrents to Azureus, which is using 1.3 GB of virtual memory but only 94 MB of real memory. The machine is completely usable and Azureus is maintaining 180 kB/s download and 90 kB/s upload (limited).

Is there anything I can do to help debug this?
George
Moderator
Posts
5421
Karma
1

Tue Jun 10, 2008 6:01 pm
How many torrents are you running simultaneously?
imported4-brendon
Registered Member
Posts
7
Karma
0

Tue Jun 10, 2008 9:36 pm
At the moment Azureus has 22 torrents open, most are seeding. With ktorrent I had far fewer torrents open, about 10.

The memory use didn't seem very sensitive to the number of torrents; in other words, once that number exceeded some threshold, memory usage would grow very large at a rate that did not seem to depend on the number of torrents. If, for example, a torrent ended and I removed it, the memory usage did not decline.
George
Moderator
Posts
5421
Karma
1

Wed Jun 11, 2008 8:12 am
Can you send me those torrents for testing purposes?
imported4-brendon
Registered Member
Posts
7
Karma
0

Wed Jun 11, 2008 4:42 pm
Sure, but before I do I'm trying to better characterize the problem by starting out fresh.

The memory growth definitely seems related to the amount downloaded, so recreating this will take some time. I'm not yet sure whether it is the downloading or the seeding that is creating the problem. If you can wait until I get a better handle on it, perhaps it will save you some time?
imported4-brendon
Registered Member
Posts
7
Karma
0

Wed Jun 11, 2008 9:44 pm
OK, so downloading 5 torrents of 3.5 GB each (all the Kubuntu DVDs) does not seem to result in much memory growth. Download speed is 600 kB/s, upload 30 kB/s. Memory use is about 200 MB virtual / 20 MB resident. These torrents have huge swarms, for the most part.

The problem seems to be with torrents that have very small swarms, which probably results in lots of requests for lots of different bits of the file, due to little or no seeding in the swarm.

I'm trying to create an easy-to-reproduce test case. I suppose grabbing a bunch of marginal torrents from The Pirate Bay may work :) but I'd like something a little more consistent.
imported4-brendon
Registered Member
Posts
7
Karma
0

Sat Jun 14, 2008 12:44 am
I've not been able to reliably recreate the problem. It seems to occur naturally when downloading and seeding a lot of moderate-sized torrents.

It has not happened since I've been trying to recreate it deliberately!

Some of the original torrents are now unseeded, so using them is not going to cut it as far as reproducing the problem...

I'm going to try grabbing the torrents for a whole season of a TV show to see if that recreates it.
imported4-brendon
Registered Member
Posts
7
Karma
0

Sun Jun 15, 2008 2:57 am
I admit that I'm stumped. I've now downloaded several GB of stuff I don't want without being able to recreate the memory growth/use that I was previously experiencing without fail!

Clearly a problem exists, but it must require some subtle set of circumstances to trigger it. Perhaps others will experience the problem and can add some characterization information.

Some observations: the memory usage in my last test was quite stable while downloading 26 torrents of about 350 MB each. The usage was about 320 MB virtual / 50 MB RSS. However, when I killed and restarted KTorrent, usage stabilized at 170 MB / 24 MB, which seems a little strange.

Another interesting trend was the network performance. Initial speeds were 110 kB/s down, 69 kB/s up. After 30 minutes these had stabilized at 225 kB/s down, 55 kB/s up. After that, both speeds trended downwards, until after 14 hours they were 77 kB/s down, 23 kB/s up and still falling...
George
Moderator
Posts
5421
Karma
1

Sun Jun 15, 2008 5:05 pm
My guess is that this is probably caused by the download behavior of some clients, resulting in lots of chunks being kept in memory. Normally, when no requests are received for a chunk within 5 seconds, it is unloaded.

So if some peer starts to act funny, it could cause KT to keep a lot of chunks in memory.

I will put in some limits on the number of chunks kept in memory.
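
For illustration, here is a minimal, self-contained C++ sketch of the policy George describes: chunks stay loaded only while peers keep requesting them, a chunk idle for 5 seconds is unloaded, and a hard cap bounds how many chunks can be resident. The class and member names (ChunkCache, request, evictIdle, etc.) are hypothetical, not KTorrent's actual code.

Code:
// Hypothetical sketch of the eviction policy described above, not KTorrent's
// actual implementation: a chunk stays in memory only while peers keep
// requesting it; a chunk idle for 5 seconds is unloaded, and a hard cap
// bounds how many chunks can be resident at once.
#include <chrono>
#include <cstddef>
#include <cstdint>
#include <unordered_map>
#include <vector>

class ChunkCache {
public:
    explicit ChunkCache(std::size_t maxChunks) : maxChunks_(maxChunks) {}

    // Called whenever a peer requests (part of) a chunk: load it if needed
    // and refresh its last-request time.
    std::vector<std::uint8_t>& request(std::uint32_t index) {
        const auto now = Clock::now();
        auto it = chunks_.find(index);
        if (it == chunks_.end()) {
            if (chunks_.size() >= maxChunks_)
                evictOldest();                        // enforce the hard cap
            it = chunks_.emplace(index, Entry{loadFromDisk(index), now}).first;
        }
        it->second.lastRequest = now;
        return it->second.data;
    }

    // Called periodically (e.g. once a second): unload chunks that nobody
    // has requested for the last 5 seconds.
    void evictIdle() {
        const auto now = Clock::now();
        for (auto it = chunks_.begin(); it != chunks_.end();) {
            if (now - it->second.lastRequest > std::chrono::seconds(5))
                it = chunks_.erase(it);
            else
                ++it;
        }
    }

private:
    using Clock = std::chrono::steady_clock;

    struct Entry {
        std::vector<std::uint8_t> data;
        Clock::time_point lastRequest;
    };

    // Placeholder: a real client would read the chunk from the torrent's
    // files on disk here.
    static std::vector<std::uint8_t> loadFromDisk(std::uint32_t /*index*/) {
        return std::vector<std::uint8_t>(256 * 1024); // e.g. a 256 KiB chunk
    }

    // Drop the least recently requested chunk to stay under the cap.
    void evictOldest() {
        auto oldest = chunks_.begin();
        for (auto it = chunks_.begin(); it != chunks_.end(); ++it)
            if (it->second.lastRequest < oldest->second.lastRequest)
                oldest = it;
        if (oldest != chunks_.end())
            chunks_.erase(oldest);
    }

    std::size_t maxChunks_;
    std::unordered_map<std::uint32_t, Entry> chunks_;
};

The cap is the key addition George mentions: the 5-second timeout alone does not bound memory if misbehaving peers keep touching many different chunks just often enough to keep them all loaded.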

