
URLGrabber Crashing and effect on SABnzbd

Posted: May 20th, 2020, 3:02 pm
by kkr
Hi, I have a situation where, according to the logs, URLGrabber is crashing due to something about the way an NZB is named. I have reproduced this in two separate instances: one running in Docker (on my Synology NAS) and another running on Windows 10. URLGrabber crashes in both, but for a different reason in each, and that affects how SABnzbd recovers. Both are attempting to grab the same NZB file, whose name is partly in Japanese characters, which accounts for the file name below. Apologies in advance: I had to modify the logs a bit because the forum thought I was trying to spam links. :)

In Docker, I get the following:

2020-05-20 12:35:25,182::ERROR::[urlgrabber:296] URLGRABBER CRASHED
Traceback (most recent call last):
File "/usr/share/sabnzbdplus/sabnzbd/urlgrabber. py", line 252, in run
f = open(path, 'wb')
IOError: [Errno 36] File name too long: "/config/admin/future/Uncensored.Leaked...RBD.nzb; filename*=utf-8''Uncensored.Leaked.%E3%80%90%E3%83%A2%E3%82%B6%E3%82%A4%E3%82%AF%E7%A0%B4%E5%A3%8A%E7%89%88%E3%80%91%E9%8E%96%E3%81%AB%E7%B9%8B%E3%81%8C%E3%82%8C%E3%81%9F%E8%8A%B1%E5%AB%81.%E7%A5%9E%E6%B3%A2%E5%A4%9A%E4%B8%80%E8%8A%B1.RBD.nzb"

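For what it's worth, the percent-encoded tail looks like the server's Content-Disposition "filename*=utf-8''..." parameter leaking into the saved file name. Decoding a short piece of it (Python 3 here, purely for illustration) shows the Japanese title:

from urllib.parse import unquote

# First few encoded characters from the log above, shortened for readability.
name = "Uncensored.Leaked.%E3%80%90%E3%83%A2%E3%82%B6%E3%82%A4%E3%82%AF"
print(unquote(name))  # -> Uncensored.Leaked.【モザイク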

In Windows 10, I get the following (URLs removed from the last three lines so I could post this):

2020-05-20 12:34:25,183::ERROR::[urlgrabber:296] URLGRABBER CRASHED
Traceback (most recent call last):
File "sabnzbd\urlgrabber. pyo", line 252, in run
IOError: [Errno 2] No such file or directory: "C:\\Users\\<USERNAME>\\AppData\\Local\\sabnzbd\\admin\\future\\Uncensored.Leaked...RBD.nzb; filename*=utf-8''Uncensored.Leaked.%E3%80%90%E3%83%A2%E3%82%B6%E3%82%A4%E3%82%AF%E7%A0%B4%E5%A3%8A%E7%89%88%E3%80%91%E9%8E%96%E3%81%AB%E7%B9%8B%E3%81%8C%E3%82%8C%E3%81%9F%E8%8A%B1%E5%AB%81.%E7%A5%9E%E6%B3%A2%E5%A4%9A%E4%B8%80%E8%8A%B1.RBD.nzb"
2020-05-20 12:36:42,107::INFO::[nzbstuff:1759] [N/A] Purging data for job Trying to fetch NZB from URL (keep_basic=False, del_files=1)
2020-05-20 12:36:42,108::INFO::[nzbqueue:418] [N/A] Removed job Trying to fetch NZB from URL
2020-05-20 12:36:42,108::INFO::[nzbqueue:260] Saving queue


You can see the different error in each, and SABnzbd handles them very differently. In Docker, it more or less breaks the queue screen: only this file is listed, the buttons to delete it are unresponsive, and the history is not displayed either. It just seems to be stuck in limbo. I have to shut down SABnzbd, remove the file from the "future" folder, and restart it; then it's back to normal sans the file in the queue (which is now empty).

Windows handles its error more gracefully. From the log, it knows to purge the job automatically, whereas in Docker it just hangs right after the error. On Windows the job simply shows up as an error on the queue screen with the full error message, and you can remove it from the queue easily, and that's it.

I suspect something in my Docker settings is causing the difference, and I was wondering if anyone could tell what it might be from the above. Both versions are 2.3.9. I would really love for my Docker instance to auto-purge the job as well and avoid this hanging issue.

As for this NZB not being handled at all, I guess that's a separate issue and isn't as important to me. NZBGet also fails on both Docker and Windows, the former saying the file name is too long and the latter reporting an invalid argument, but it does not crash the interface, which is really what I'm trying to prevent with my Docker SABnzbd.

Re: URLGrabber Crashing and effect on SABnzbd

Posted: May 20th, 2020, 5:18 pm
by safihre
In both cases they are IOErrors, which Python raises when it gets errors from the underlying OS.
It seems Win10 didn't mind the long filename itself, but the encoding still messed things up when it tried to continue after writing.
Could maybe be fixed by applying some sort of sanity check in urlgrabber.
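To sketch the idea (just an illustration, not actual SABnzbd code; the helper name is made up, and it's Python 3 syntax while 2.3.9 still runs on Python 2): strip any leaked header parameters, drop characters Windows rejects, and truncate the name to a safe byte length before opening the file.

import os
import re

MAX_NAME_BYTES = 200  # stay well under the usual 255-byte filesystem limit

def sanitize_nzb_name(raw_name):
    # Keep only the part before any leaked header parameters
    # such as "; filename*=utf-8''...".
    name = raw_name.split(';', 1)[0].strip()
    # Drop characters Windows does not allow in file names.
    name = re.sub(r'[<>:"/\\|?*]', '', name)
    # Truncate by bytes so multi-byte UTF-8 names also fit,
    # keeping the extension intact.
    base, ext = os.path.splitext(name)
    truncated = base.encode('utf-8')[:MAX_NAME_BYTES]
    # errors='ignore' drops a trailing half-cut multi-byte sequence.
    return truncated.decode('utf-8', errors='ignore') + ext

Run on the name from the logs above, this would keep just "Uncensored.Leaked...RBD.nzb", which both filesystems accept.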