
[solved] really big queue = performance problem

Posted: November 27th, 2017, 8:25 pm
by zion2k
Hello,

I added a lot of tasks to the queue (>10000). Since then my download speed has dropped to 1 MB/s. Normally I max out my 100 Mbit line.
It is an octa-core CPU with 8 GB of RAM. (Sysload: 1.00 | 1.07 | 1.11 | V=4074M R=992M)
The system seems to be idling.

Any ideas how I can increase the download speed?


---
Version: 2.3.1 [f1695ec]
Python Version: 2.7.13 (default, Jan 19 2017, 14:48:08) [GCC 6.3.0 20170118] [UTF-8]
OpenSSL: OpenSSL 1.1.0f 25 May 2017

Re: really big queue = performance problem

Posted: November 28th, 2017, 2:30 am
by safihre
SABnzbd is unfortunately not made for this. I think for these kinds of loads you should check NZBGet.
Sab should do fine with 1000 items, but above that the overhead of searching for the next article and other bookkeeping gets increasingly worse.

Re: really big queue = performance problem

Posted: November 28th, 2017, 12:00 pm
by zion2k
Thank you for the fast answer.
Is there a fast and safe way to delete my whole queue? The GUI is not really responsive, as you can imagine.

Re: really big queue = performance problem

Posted: November 28th, 2017, 12:15 pm
by safihre
You can delete the queue10.sab file from your admin folder for SABnzbd:
https://sabnzbd.org/wiki/advanced/directory-setup
Make sure to shut down SABnzbd first.
Then you will also have to delete all the files from your incomplete folder.
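In case it helps, here is a rough Python sketch of those manual steps. The two paths below (ADMIN_DIR, INCOMPLETE_DIR) are only examples and must be adjusted to your own directory setup from the wiki page above, and SABnzbd has to be shut down before you run it.

import glob
import os
import shutil

# Example paths only -- adjust to your own directory setup.
ADMIN_DIR = "/home/user/.sabnzbd/admin"
INCOMPLETE_DIR = "/home/user/Downloads/incomplete"

# Remove the queue file(s) from the admin folder.
for queue_file in glob.glob(os.path.join(ADMIN_DIR, "queue*.sab")):
    os.remove(queue_file)

# Remove everything from the incomplete folder.
for entry in os.listdir(INCOMPLETE_DIR):
    path = os.path.join(INCOMPLETE_DIR, entry)
    if os.path.isdir(path):
        shutil.rmtree(path)
    else:
        os.remove(path)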

Re: really big queue = performance problem

Posted: November 28th, 2017, 3:23 pm
by zion2k
Thank you for the help.
I think I will solve this with a script that adds 500 new items whenever the queue is empty.
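For anyone wanting to try the same, here is a minimal Python 3 sketch of such a refill script. It assumes the standard SABnzbd API calls mode=queue and mode=addlocalfile; the host/port, API key, backlog folder and batch size are placeholders you have to fill in yourself, and the "delete the .nzb after adding it" behaviour is just one possible way to avoid re-adding the same file.

import glob
import json
import os
import urllib.parse
import urllib.request

# Placeholders -- adjust to your own setup.
SAB_URL = "http://localhost:8080/sabnzbd/api"   # example host/port
API_KEY = "YOUR_API_KEY"                        # Config -> General in the web UI
BACKLOG_DIR = "/path/to/nzb-backlog"            # folder holding the waiting .nzb files
BATCH_SIZE = 500                                # items to add per refill

def api_call(params):
    """Call the SABnzbd API and return the parsed JSON response."""
    params = dict(params, apikey=API_KEY, output="json")
    url = SAB_URL + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as response:
        return json.loads(response.read().decode("utf-8"))

def queue_length():
    """Number of jobs currently in the queue."""
    return int(api_call({"mode": "queue"})["queue"]["noofslots"])

def refill():
    """Add up to BATCH_SIZE NZBs from the backlog folder once the queue is empty."""
    if queue_length() > 0:
        return
    for nzb in sorted(glob.glob(os.path.join(BACKLOG_DIR, "*.nzb")))[:BATCH_SIZE]:
        api_call({"mode": "addlocalfile", "name": nzb})
        os.remove(nzb)  # avoid adding the same NZB again on the next run

if __name__ == "__main__":
    refill()

Run it from cron (or any scheduler) every few minutes; it does nothing while the queue still has items.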