0.5.0 unpacking large jobs over and over again

Report & discuss bugs found in SABnzbd
Forum rules
Help us help you:
  • Are you using the latest stable version of SABnzbd? See the Downloads page.
  • Tell us what system you run SABnzbd on.
  • Adhere to the forum rules.
  • Do you experience problems during downloading?
    Check your connection in the Status and Interface settings window.
    Use Test Server in Config > Servers.
    We will probably ask you to do a test using only basic settings.
  • Do you experience problems during repair or unpacking?
    Enable +Debug logging in the Status and Interface settings window and share the relevant parts of the log here using [ code ] sections.
MiG
Jr. Member
Posts: 52
Joined: January 11th, 2010, 7:13 pm

0.5.0 unpacking large jobs over and over again

Post by MiG »

Version: 0.5.0 Final
OS: WinXP SP3
Install-type: Windows Installer
Firewall Software: WinXP SP2 Firewall
Are you using IPV6? no
Is the issue reproducible? yes

For some reason, some large jobs get unpacked over and over again. I'll be downloading a season of something and end up with multiple copies of every file, named episode.s01e03.avi, episode.s01e03(1).avi, episode.s01e03(2).avi, etc. I stopped one job when a 4GB download had grown to 80GB worth of files, and caught another one tonight. Oddly enough, small downloads (including those coming off the TV-new RSS feed) seem to unpack/delete just fine, but re-adding a problematic large job's NZB results in exactly the same behaviour. This seems to have started with 0.5.0; the previous version, 0.4.12, worked fine in this regard.
Last edited by MiG on February 27th, 2010, 1:34 pm, edited 1 time in total.
shypike
Administrator
Posts: 19774
Joined: January 18th, 2008, 12:49 pm

Re: 0.5.0 unpacking large jobs over and over again

Post by shypike »

More details required.
Do you use TVSort?
Please email the NZB to bugs at sabnzbd.org.
It could also be a malicious rar file. For example, hello.rar containing the file hello.rar will result in an endless loop.
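
A minimal sketch of that hazard (illustration only, not SABnzbd's actual unpack code): a naive "extract until no archives remain" loop never reaches a fixpoint when an archive yields a copy of itself, because auto-renaming keeps producing fresh rar files.

[code]
# Sketch only, not SABnzbd's source.
import glob
import os
import subprocess

def naive_recursive_unpack(workdir):
    done = set()
    while True:
        rars = [r for r in glob.glob(os.path.join(workdir, "*.rar"))
                if r not in done]
        if not rars:
            break  # no unprocessed archives left
        for rar in rars:
            # "x" extracts with paths; "-or" tells unrar to auto-rename
            # any extracted file whose name already exists on disk.
            subprocess.run(["unrar", "x", "-or", os.path.basename(rar)],
                           cwd=workdir, check=True)
            done.add(rar)
            # A self-containing hello.rar extracts a new hello(N).rar on
            # each pass, so the next glob() always finds fresh work.
[/code]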
MiG
Jr. Member
Posts: 52
Joined: January 11th, 2010, 7:13 pm

Re: 0.5.0 unpacking large jobs over and over again

Post by MiG »

shypike wrote: More details required.
Do you use TVSort?
Please email the NZB to bugs at sabnzbd.org.
It could also be a malicious rar file. For example, hello.rar containing the file hello.rar will result in an endless loop.
I'm not familiar with TVSort, so I assume it's not enabled by default. I did a full uninstall and reinstall last night, started a job (with +U) and left it running... unrar kept filling up my drive with new files until I had 97MB left, turning a 5-6GB job into a 101GB one with 16-17 duplicates of each file.

Anyway, the NZB has been mailed, and I've also included a directory dump to give you an idea of the results. Manually extracting the files from the downloaded rars works just fine, by the way; they don't seem to be damaged or to contain duplicate file names, so I'm not quite sure what's going on.
shypike
Administrator
Posts: 19774
Joined: January 18th, 2008, 12:49 pm

Re: 0.5.0 unpacking large jobs over and over again

Post by shypike »

I successfully downloaded your NZB,
but later I noticed that I had the "Ignore Samples: Delete after download" option on.
Maybe this makes the difference.

I will look further into this, but it may take a while.
MiG
Jr. Member
Posts: 52
Joined: January 11th, 2010, 7:13 pm

Re: 0.5.0 unpacking large jobs over and over again

Post by MiG »

shypike wrote: I successfully downloaded your NZB,
but later I noticed that I had the "Ignore Samples: Delete after download" option on.
Maybe this makes the difference.

I will look further into this, but it may take a while.
Thanks, let me know if you need more info on settings or debugging logs.
Samples are set to "Do not download", by the way. I can't imagine that causing all the other files to be extracted over and over again, but I'll give it a try just to be sure.
MiG
Jr. Member
Posts: 52
Joined: January 11th, 2010, 7:13 pm

Re: 0.5.0 unpacking large jobs over and over again

Post by MiG »

MiG wrote: Samples are set to "Do not download", by the way. I can't imagine that causing all the other files to be extracted over and over again, but I'll give it a try just to be sure.
No joy. It's currently on its second round of files, "(1)". I've also disabled "quick check" this time; the rest of my settings are attached in my mail to bugs@.
TheBreeze
Newbie
Posts: 1
Joined: March 5th, 2010, 8:21 am

Re: 0.5.0 unpacking large jobs over and over again

Post by TheBreeze »

I've got the same problem here on Windows 7. 0.5.0 was working fine for a while, then it hung while extracting a 700MB file. The job was over 25GB when I caught it. Guess I'll have to go back to using Usenet Explorer until this gets fixed.

BTW, first post here; I wanted to thank the developers for a great program! I've been using it for a while and will definitely be back when this gets resolved.
shypike
Administrator
Posts: 19774
Joined: January 18th, 2008, 12:49 pm

Re: 0.5.0 unpacking large jobs over and over again

Post by shypike »

I think I found the problem.
It's a very old error that only became potentially fatal after we changed the unrar flags.
Before, unrar would simply discard a file when one with the same name already existed.
Now we tell unrar to auto-rename such files.
A problem occurs when you have a file hello.rar and a file hello_too.rar that contains another hello.rar.
Before, you would miss some of the files; now there's a potential for an infinite loop.
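
To illustrate the flag change, a hedged sketch: assuming the old and new behaviours correspond to unrar's -o- switch (skip existing files) and its -or switch (rename automatically); the thread doesn't show the exact flags SABnzbd passes.

[code]
# Sketch only: assumed mapping onto unrar switches, not SABnzbd source.
import subprocess

job = "/path/to/job"  # hypothetical working directory

# Old behaviour: a file that already exists on disk is skipped (discarded).
subprocess.run(["unrar", "x", "-o-", "hello.rar"], cwd=job, check=True)

# New behaviour: a colliding file is extracted under a new name, e.g.
# hello(1).rar, which a recursive unpacker will then pick up again.
subprocess.run(["unrar", "x", "-or", "hello.rar"], cwd=job, check=True)
[/code]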

Still, this only leads to problems when you either use 'Unpack' instead of 'Unpack+Delete',
or when SABnzbd somehow fails to delete already processed rar files.
So normally you wouldn't have a problem, which is why it's hard to reproduce.

Anyway, I'm going to fix those errors and add a nesting limit to
the recursive unpacking, just in case.
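
A minimal sketch of what such a nesting limit could look like, assuming a hypothetical helper extract_one_level(); this is an illustration, not the actual 0.5.1 patch.

[code]
# Illustration only (not the 0.5.1 patch): cap the recursion depth so a
# self-containing archive can't drive the unpacker into an endless loop.
MAX_NESTING = 5  # assumed value; the real limit isn't given in the thread

def unpack_recursive(extract_one_level, job_dir, depth=0):
    # extract_one_level() is a hypothetical helper: it unpacks every rar
    # currently in job_dir and returns True if new rar files appeared.
    if depth >= MAX_NESTING:
        return  # bail out instead of chasing endlessly renamed archives
    if extract_one_level(job_dir):
        unpack_recursive(extract_one_level, job_dir, depth + 1)
[/code]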
MiG
Jr. Member
Posts: 52
Joined: January 11th, 2010, 7:13 pm

Re: 0.5.0 unpacking large jobs over and over again

Post by MiG »

That also explains the odd discrepancy between large and small jobs here: everything off the RSS feeds (i.e. single episodes) gets +D'ed, whereas larger, individual jobs use the default +U (because I'd like the included subs and other files to stay archived).
I can still reliably reproduce the error with the NZB I sent earlier; mail me if you need a beta tester.
shypike
Administrator
Posts: 19774
Joined: January 18th, 2008, 12:49 pm

Re: 0.5.0 unpacking large jobs over and over again

Post by shypike »

+U is a bit of an odd option to use for large jobs:
you end up with twice the amount of storage used.
Using +D will still get you all of your files.

Well, at least your problem is now 100% explained,
which is a relief too.
MiG
Jr. Member
Posts: 52
Joined: January 11th, 2010, 7:13 pm

Re: 0.5.0 unpacking large jobs over and over again

Post by MiG »

I've got a large local hard drive as a cache of sorts, and as mentioned there are several things I'd like to keep packed. So I don't mind doing a manual delete afterwards, as long as SABnzbd saves me from having to start the unpack manually after a long download; the non-SABnzbd way required both of those steps anyway. Just a pity it went (literally) loopy recently :)
shypike
Administrator
Posts: 19774
Joined: January 18th, 2008, 12:49 pm

Re: 0.5.0 unpacking large jobs over and over again

Post by shypike »

It's solved now and will be in 0.5.1 soon.
If you stay away from +U, there's very little chance of it happening again.