This isn't necessarily a support issue. I guess I'm looking more for confirmation of my thoughts, or maybe a work-around:
I have SABnzbd (just upgraded to the latest RC with the Plush skin) running on a Linux NAS (xbox1), and I had massive problems with memory leaks under 0.2.5 until I upgraded to a newer version quite a few months ago (so thanks for that). Anyway, it runs quite well, even with the limited hardware.
I recently downloaded an .nzb that was about 4.7 MB itself and referenced 750+ files totaling over 10 GB (I guess the uploader split the files too much).
When trying to download this data, SABnzbd's memory usage tripled, used up almost all the swap space, all but shut down the web interface, and made SSH and Samba access very slow.
Normally this isn't an issue, as most .nzbs are not like this one, but any ideas would be appreciated.
Memory usage
Re: Memory usage
This issue has been answered in:
http://forums.sabnzbd.org/index.php?topic=680.0
We do want to look at how to reduce peak usage when reading in the NZB,
but this requires quite a bit of rework.
The current method is very simple, but uses a lot of memory.
This is hardly an issue on any non-NAS system.
Last edited by shypike on June 20th, 2008, 11:05 am, edited 1 time in total.
Re: Memory usage
Thanks for the reply.
So this is likely due to the size of the NZB, not the number of files? If I were to manually select sections of the newzbin post and create my own .nzb files (say 4 to 8 of them), then manually par/unrar, SABnzbd should have an easier time downloading?
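(For what it's worth, the manual split is easy to script. Below is a hypothetical Python sketch, nothing to do with SABnzbd's own code; the helper name and the `parts` count are my own invention. It divides a large NZB's file entries into several smaller NZBs:)

```python
# Hypothetical helper: split one large NZB into several smaller ones
# so each can be queued separately. Illustration only, not SABnzbd code.
import xml.etree.ElementTree as ET

NZB_NS = "http://www.newzbin.com/DTD/2003/nzb"  # the standard NZB namespace

def split_nzb(path, parts=4):
    # Keep the output files in the default (unprefixed) NZB namespace.
    ET.register_namespace("", NZB_NS)
    tree = ET.parse(path)
    root = tree.getroot()
    files = list(root)                  # the <file> entries
    chunk = -(-len(files) // parts)     # ceiling division: files per part
    out_paths = []
    for i in range(parts):
        piece = files[i * chunk:(i + 1) * chunk]
        if not piece:
            break
        # New root with the same tag and attributes, holding one slice.
        new_root = ET.Element(root.tag, root.attrib)
        new_root.extend(piece)
        out = f"{path}.part{i + 1}.nzb"
        ET.ElementTree(new_root).write(out, xml_declaration=True,
                                       encoding="utf-8")
        out_paths.append(out)
    return out_paths
```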
Re: Memory usage
The number of files has an influence:
1) it makes the NZB file larger;
2) it needs some more memory for the bookkeeping.
Effect 1 is the worst, also because it causes a peak in memory usage.
What happens is:
1. read NZB file in memory
2. convert to XML structure
3. create bookkeeping
4. discard XML structure
5. discard NZB file
Steps 1 and 2 cause the memory peak; only step 3 has a lasting effect.
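The steps above can be sketched in Python. This is only an illustration of the two approaches (the simple whole-tree parse versus an incremental one that avoids the peak), not SABnzbd's actual parser; the function names are made up:

```python
# Illustration of peak vs. flat memory when turning an NZB into
# bookkeeping data. Not SABnzbd's real code.
import xml.etree.ElementTree as ET

NZB_NS = "{http://www.newzbin.com/DTD/2003/nzb}"  # standard NZB namespace

def bookkeeping_simple(path):
    # Steps 1-2: the whole file and the full XML tree are in memory
    # at the same time -- this is the peak.
    tree = ET.parse(path)
    # Step 3: extract only what is needed (subject + segment ids).
    jobs = [(f.get("subject"),
             [seg.text for seg in f.iter(NZB_NS + "segment")])
            for f in tree.iter(NZB_NS + "file")]
    # Steps 4-5: tree and file buffer become garbage here; only
    # the small `jobs` list persists.
    return jobs

def bookkeeping_streaming(path):
    # Incremental alternative: build the bookkeeping while parsing
    # and free each element as soon as it is consumed, so memory
    # stays roughly flat instead of peaking.
    jobs = []
    for event, elem in ET.iterparse(path):
        if elem.tag == NZB_NS + "file":
            jobs.append((elem.get("subject"),
                         [seg.text for seg in elem.iter(NZB_NS + "segment")]))
            elem.clear()  # drop the children we no longer need
    return jobs
```

Both produce the same bookkeeping; the streaming version trades a little code complexity for a much lower peak, which is what matters on a NAS.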