KTorrent crashes when resuming downloading

imported4-Anonymous (Registered Member)
KTorrent crashes when I resume downloading.
It seems to happen when loading the 'current_chunks' file.

From the log file:
...
Starting download
Loading 669 active chunk downloads
Loading chunk 2752
...
Loading chunk 3289

The current size of the 'current_chunks' file is about 1.3 GB.
The size of every chunk is 2 MB, and I'm wondering how it is
possible to have 669 incomplete chunks in memory.
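
Just to sanity-check the numbers (this is only my own back-of-the-envelope arithmetic, not anything taken from the KTorrent code), 669 incomplete chunks of 2 MB each account almost exactly for the size of that file:

#include <cstdio>

int main()
{
    // 669 incomplete chunks, 2 MB each (the figures from the log above)
    const unsigned long long chunkSize = 2ULL * 1024 * 1024;
    const unsigned long long chunks = 669;
    const unsigned long long bytes = chunks * chunkSize;
    std::printf("%llu bytes (~%.2f GB)\n",
                bytes, bytes / (1024.0 * 1024.0 * 1024.0));
    // prints: 1402994688 bytes (~1.31 GB), which matches the ~1.3 GB file
    return 0;
}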
imported4-Anonymous (Registered Member)
Sun Dec 11, 2005 5:50 pm
I forgot the version - it's SVN rev. 480845.
George (Moderator)
Mon Dec 12, 2005 8:11 pm
A backtrace would be handy, and you should also upgrade; 480485 is from 16/11, nearly a month old.
imported4-Anonymous (Registered Member)
Mon Dec 12, 2005 8:38 pm
George wrote: A backtrace would be handy, and you should also upgrade; 480485 is from 16/11, nearly a month old.

The revision number that I wrote is probably wrong, but the code is from 05 December.

The problem is that the number of currently downloading chunks keeps increasing, as I can see in the "Status" dialog.
Is there a maximum number of simultaneously downloading chunks?
George (Moderator)
Tue Dec 13, 2005 5:36 pm
There is no maximum, but since the number of open requests is limited per peer, the number of chunks should also be limited.

I had this happen to me at some point, but that was when I was experimenting with something; it certainly never got committed.

But you should at least upgrade to the latest, and see if it happens there.
imported4-Anonymous (Registered Member)
Tue Dec 13, 2005 9:54 pm
I built the SVN code from today.

Currently I'm downloading a torrent with 1634 pieces.
The size of every piece is 4 MB.
I'm connected to 10 peers.

30 minutes after starting the download I have the following chunk info:
Total: 1634 Currently downloading: 125 Downloaded: 128 Excluded: 0

As the number of downloaded chunks increases, the number currently
downloading increases too.

What is the limit for connections to a peer?
imported4-Anonymous (Registered Member)
Tue Dec 13, 2005 10:21 pm
30 minutes later I'm still connected to 10 peers, with a total download speed of about 335 KB/s.

Total: 1634 Currently downloading: 171 Downloaded: 250

And after a few hours KTorrent will crash - out of memory :cry:
imported4-Anonymous (Registered Member)
Wed Dec 14, 2005 6:18 am
8 hours later (connected to 9 peers):
Total: 1634 Currently downloading: 427 Downloaded: 819

And of course the memory used by KTorrent keeps increasing.
I'm not sure, but this problem may be similar to:
http://ktorrent.pwsp.net/forum/viewtopic.php?t=138
George (Moderator)
Wed Dec 14, 2005 5:23 pm
Can you send me the torrent? I have to check this out myself.

Also, how many peers were you connected to?
imported4-Anonymous (Registered Member)
Tue Dec 20, 2005 6:10 pm
George wrote: Can you send me the torrent? I have to check this out myself.

I don't think the problem is with a specific torrent file.

George wrote: Also, how many peers were you connected to?

I'm connected to no more than 10 peers.

I made a little modification to PeerDownloader::getMaximumOutstandingReqs:

Uint32 PeerDownloader::getMaximumOutstandingReqs() const
{
    // for each 10 KB/sec allow one with a minimum of 5
    return 5; // return 5 + peer->getDownloadRate() / (10 * 1024);
}

And now KTorrent works much better.
It seems the problem is in peer->getDownloadRate() / (10 * 1024).
I'm not very happy that KTorrent increases the number of downloading chunks for every 10 KB/s of download speed. What if I'm downloading at 1 MB/s? And what if the speed fluctuates between peers?
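
To illustrate what the original formula does (assuming peer->getDownloadRate() returns bytes per second, which is how I read the 10 * 1024 in the code, though I have not verified it), here is the maximum number of outstanding requests it would allow at a few speeds:

#include <cstdio>

// Same formula as the commented-out line above:
// 5 + rate / (10 * 1024), i.e. one extra request per 10 KB/s.
static unsigned maxOutstandingReqs(unsigned bytesPerSec)
{
    return 5 + bytesPerSec / (10 * 1024);
}

int main()
{
    const unsigned rates[] = { 10, 100, 335, 1024 }; // in KB/s
    for (unsigned i = 0; i < sizeof(rates) / sizeof(rates[0]); ++i)
        std::printf("%4u KB/s -> %3u outstanding requests\n",
                    rates[i], maxOutstandingReqs(rates[i] * 1024));
    // 10 KB/s -> 6, 100 KB/s -> 15, 335 KB/s -> 38, 1024 KB/s -> 107
    return 0;
}

So a single fast peer at 1 MB/s could have over a hundred requests in flight, and if those requests land in different chunks, each of them keeps another partial chunk around.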

I think options should be added for the maximum number of simultaneously downloading chunks, per torrent and in total (for all torrents).
This could optimize memory usage.
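
Something along these lines is what I have in mind; this is purely illustrative, and the names below (pickChunk, activeChunks, maxActiveChunks and so on) are made up by me and do not exist in the KTorrent sources:

#include <cstdio>
#include <set>

typedef unsigned int Uint32;

// Illustration of the suggested cap: prefer pieces from chunks that are
// already partially downloaded, and only open a brand-new chunk while the
// per-torrent limit has not been reached.
Uint32 pickChunk(const std::set<Uint32> & peerHasChunks, // chunks this peer offers
                 const std::set<Uint32> & activeChunks,  // chunks already partially downloaded
                 Uint32 maxActiveChunks,                 // the proposed per-torrent setting
                 Uint32 invalidChunk)                    // sentinel returned when nothing fits
{
    // First try to finish a chunk we have already started.
    for (std::set<Uint32>::const_iterator i = peerHasChunks.begin();
         i != peerHasChunks.end(); ++i)
    {
        if (activeChunks.count(*i))
            return *i;
    }

    // Otherwise only start a new chunk if we are below the limit.
    if (activeChunks.size() < maxActiveChunks && !peerHasChunks.empty())
        return *peerHasChunks.begin();

    return invalidChunk;
}

int main()
{
    std::set<Uint32> offered;  offered.insert(7);  offered.insert(42);
    std::set<Uint32> active;   active.insert(42);
    // Picks chunk 42 (already active) instead of starting chunk 7.
    std::printf("picked chunk %u\n", pickChunk(offered, active, 4, 0xFFFFFFFF));
    return 0;
}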
George (Moderator)
Wed Dec 21, 2005 11:33 am
Actually it's not per chunk, but per piece. A piece is 16 KB, and requests are sent per piece. The maximum number of outstanding piece requests is different for each peer, and it depends on the peer's speed.

That way we can send lots of requests to very fast peers, and we do not send lots of requests to slow peers, because they can't keep up anyway.

But there is indeed a problem with this: the number of downloading chunks keeps going up.
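
To put a rough number on it (simple arithmetic using the 4 MB chunks of the torrent above and the 16 KB pieces, not a trace of the actual downloader):

#include <cstdio>

int main()
{
    const unsigned chunkSize = 4 * 1024 * 1024; // 4 MB chunks in this torrent
    const unsigned pieceSize = 16 * 1024;       // 16 KB request pieces
    // A chunk only leaves the "currently downloading" state once all of
    // its pieces have arrived, so any chunk with at least one piece but
    // fewer than all of them counts against that number and stays in memory.
    std::printf("pieces per chunk: %u\n", chunkSize / pieceSize); // 256
    return 0;
}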

