I am running Vista 64-bit SP1 with SABnzbd 0.47.
It requires a huge amount of memory... currently about 655 MB with 25% CPU time (X4 9650).
It does run: it can still process new NZB files and things do show up in the download dir...
but the web interface isn't showing up anymore, and that is annoying. I have switched back to version 0.45, but the same problem occurs there...
Maybe the web server should get more priority.
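(For anyone who wants to watch those memory and CPU numbers over time rather than eyeballing Task Manager, a rough Python sketch like the one below works. It is not part of SABnzbd; it assumes Python 3 plus the third-party psutil package, and that the process name starts with "SABnzbd".)
[code]
# Rough monitoring sketch, not part of SABnzbd. Assumes Python 3 and the
# third-party psutil package; the process-name match is an assumption too.
import psutil

def find_sabnzbd():
    # Return the first process whose name starts with "sabnzbd", if any.
    for p in psutil.process_iter(['name']):
        if (p.info['name'] or '').lower().startswith('sabnzbd'):
            return p
    return None

proc = find_sabnzbd()
if proc is None:
    print("SABnzbd process not found")
else:
    for _ in range(12):  # one sample every 5 seconds, for about a minute
        rss_mb = proc.memory_info().rss / (1024 * 1024)
        cpu = proc.cpu_percent(interval=5)  # % averaged over the 5 s wait
        print(f"memory: {rss_mb:.0f} MB   cpu: {cpu:.0f}%")
[/code]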
huge memory and http not working
Re: huge memory and http not working
Have you tried restarting it?
It should hardly make a dent in a system like yours.
Do you have an enormous queue?
Which web-browser do you use?
Re: huge memory and http not working
I figured the same thing.
Restarting doesn't matter...
The queue file in my cache dir is about 80 MB.
I use Firefox 3 as well as IE7.
The process takes 0% CPU time and is about 760 MB in size...
There is a little bit of HDD activity.
I am strongly thinking of moving SABnzbd back to Linux...
I never had these problems with queues larger than 500 GB on Linux.
Re: huge memory and http not working
Daniel304 wrote: I am strongly thinking of moving SABnzbd back to Linux... I never had these problems with queues larger than 500 GB on Linux.
Strange, I test it all the time on Vista 64-bit SP1. Not with very large queues, though; I have my Linux box for that.
Re: huge memory and http not working
I just restarted it in console mode, for the fun of it.
It is now 1050 MB in memory and it started to do the par checks.
I won't add anything to the queue, and I'll remove any possibly stale cache files when it is empty.
Re: huge memory and http not working
I might have found a few things that could explain it: the cache folder consisted of more than 120,000 files.
I cleaned everything and restarted in console mode, then added an NZB file that contains a 5 MB movie...
SABnzbd downloads at a rate of 5 MByte/s, so it has no time left to render the web page.
Re: huge memory and http not working
Daniel304 wrote: the cache folder consisted of more than 120,000 files... I cleaned everything and restarted in console mode, then added an NZB file that contains a 5 MB movie... SABnzbd downloads at a rate of 5 MByte/s, so it has no time left to render the web page.
We need to do something about cleaning up the cache; it's a weak point right now.
The performance issue I don't understand, though.
I can download up to 10MBytes/sec without any significant performance impact (on a very middle-of-the-road box).
The only exception is when a large par2 repair is needed, for obvious reasons.
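In the meantime, a standalone cleanup script can keep the cache folder from ballooning. The sketch below is not how SABnzbd manages its cache internally; the cache path and the 7-day cut-off are just placeholders, and it should only be run while SABnzbd is stopped and the queue is empty.
[code]
# Rough cleanup sketch, NOT how SABnzbd manages its own cache: it reports
# and optionally removes cache files older than a cut-off.
# The cache path and the 7-day threshold are assumptions -- adjust both,
# and only run this while SABnzbd is stopped and the queue is empty.
import os
import time

CACHE_DIR = r"C:\Users\you\AppData\Local\sabnzbd\cache"   # hypothetical path
MAX_AGE_DAYS = 7
DELETE = False   # set True to actually remove the stale files

cutoff = time.time() - MAX_AGE_DAYS * 86400
total, stale = 0, 0
for entry in os.scandir(CACHE_DIR):
    if not entry.is_file():
        continue
    total += 1
    if entry.stat().st_mtime < cutoff:
        stale += 1
        if DELETE:
            os.remove(entry.path)

print(f"{total} cache files, {stale} older than {MAX_AGE_DAYS} days"
      + (" (deleted)" if DELETE else ""))
[/code]
Leaving DELETE set to False only counts the files, which is a safe way to see whether the cache has piled up before removing anything.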
Re: huge memory and http not working
I am getting this exact same issue.
If SAB has been running for around 30 minutes it will always become unresponsive (even if the queue is empty). If I have a large queue stacked up it will happen within a minute or two (by large I mean over 10 GB queued).
If I end the process and start it back up, it will work great for a few minutes, depending on how much is queued.
I have since upgraded to 0.48 and still have the issue.
I completely uninstalled SAB, cleared the cache, settings, etc. and reset everything, and the issue still persists, so I am thinking that it is not SAB that is causing the problem.
Also, I even tried going back to 0.46 and that did not help.
I am running Server 2008 x64, so it may be an x64 issue?
Also, I am not sure if this is related, but whenever I start SAB a rundll32 will always start soon after. I know that rundll32 is used by a lot of stuff, but since I started getting this issue I noticed that a rundll32 starts whenever I start SAB. If I end the rundll32 it does not come back and there is no change to SAB.
Anyway, I hope this all helps.
Re: huge memory and http not working
64-bit is not the problem. I do most of my development and testing on Vista 64-bit.
We have no experience with Server 2008, but as far as I know it's very close to Vista.
Python (the programming language interpreter we use) and SABnzbd use a bunch of DLLs, so rundll32 is to be expected.
I don't know what's going on inside the Python runtime, so I don't really know what the expected DLL behaviour should be.
I have a friend who runs Server 2008; I'll contact him about his experience with it.
If you want, you can become a "Release Tester" and get access to the binaries of 0.5.0 Alpha.
Since 0.5.0 is very different from 0.4.x, it may help.
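If you want to check whether that rundll32 instance really is related to SABnzbd, a quick sketch along these lines lists every rundll32.exe together with its parent process (it assumes the third-party psutil package; nothing here is built into SABnzbd):
[code]
# Quick check, an assumption that psutil is available (not something
# SABnzbd provides): list every rundll32.exe and its parent process,
# to see whether the one that appears is actually started by SABnzbd.
import psutil

for p in psutil.process_iter(['pid', 'ppid', 'name']):
    if (p.info['name'] or '').lower() == 'rundll32.exe':
        try:
            parent = psutil.Process(p.info['ppid']).name()
        except psutil.NoSuchProcess:
            parent = '<gone>'
        print(f"rundll32 pid={p.info['pid']} parent={parent}")
[/code]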
Re: huge memory and http not working
Thanks for the quick reply.
I am down for alpha/beta testing.

Re: huge memory and http not working
I installed 0.48 as well; I cleaned all cache, log and other NZB dirs first.
I redid the complete config, all at very safe settings, like stop downloading when extracting, don't delete par files, and stuff like that.
Now I just make sure I do not add more than 2 DVDs at once...
For now this all works.
I tried adding a few NZBs as big as 25 MB, which translates to about 50 GB worth of DVDs.
Memory usage still grows fast when doing that...
Re: huge memory and http not working
shypike wrote: We need to do something about cleaning up the cache; it's a weak point right now. The performance issue I don't understand. I can download up to 10MBytes/sec without any significant performance impact (on a very middle-of-the-road box). The only exception is when a large par2 repair is needed, for obvious reasons.
Same problem here. 16,000 files in the cache (!) and my Centrino 1.6 GHz is just maxed out: 800 MB of RAM used by the SABnzbd process, 100% CPU, and I cannot load the web interface until about 20 minutes after starting the exe.
Windows has terrible performance with folders containing lots of files. I strongly recommend you give this cache cleanup a higher priority, because right now the program is just not usable for me. I've been patient - I've tried to use it for the past 2+ weeks to download a medium-sized queue, but I haven't even gotten 30% through it (200 GB).
I really hope 0.50 will fix these problems, and I'll be happy to try it out again if so.
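To put a rough number on that folder overhead, a tiny sketch like this just times a full listing of the cache directory; the path is a placeholder, so point it at your own cache folder:
[code]
# Tiny sketch to put a number on the "lots of files in one folder" cost:
# it only times a full listing of the cache directory.
# The path is an assumption -- point it at your own cache folder.
import os
import time

CACHE_DIR = r"C:\Users\you\AppData\Local\sabnzbd\cache"   # hypothetical path

start = time.perf_counter()
names = os.listdir(CACHE_DIR)
elapsed = time.perf_counter() - start
print(f"{len(names)} entries listed in {elapsed:.2f} s")
[/code]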